• Corpus ID: 15976540

Capture and Retrieval of Life-Log

@inproceedings{Aizawa2004CaptureAR,
  title={Capture and Retrieval of Life-Log},
  author={Kiyoharu Aizawa and Shinya Kawasaki and Takayuki Ishikawa and T. Yamasaki},
  year={2004}
}
In wearable computing environments, digitization of personal experiences will be made possible by continuous recording with a wearable video camera. This could lead to an "automatic life-log application". However, it is evident that the resulting amount of video content will be enormous. Accordingly, to retrieve and browse desired scenes, this vast quantity of video data must be organized using structural information. We are developing a "context-based video retrieval system for life-log…

Experience retrieval in a ubiquitous home

TLDR
The system consists of a graphical user interface that can be used to retrieve video summaries interactively using simple queries and several algorithms are implemented to obtain compact summaries corresponding to the activity of each person.

Evaluation of video summarization for a large number of cameras in ubiquitous home

TLDR
A system for video summarization in a ubiquitous environment that consists of a graphical user interface that can be used to retrieve video summaries interactively using simple queries and several methods for extracting key frames from the resulting video sequences are presented.

Video Summarization for a Large Number of Cameras Using Floor Sensors in a Ubiquitous Environment

TLDR
A system for video summarization in a ubiquitous environment that consists of a graphical user interface that can be used to retrieve video summaries interactively using simple queries and several methods for extracting key frames from the resulting video sequences are presented.

3D video system for capturing unexpected moments in daily life

TLDR
A new free-viewpoint video system that generates 3D video from an arbitrary point of view using multiple cameras, employing a cinematographic camera control system and ARToolKit to control virtual cameras.

Building Mobile Social Network with Semantic Relation Using Bayesian Network-based Life-log Mining

TLDR
This work builds a novel mobile social network by mining semantic relations between users in life-logs using a Bayesian network, discusses how this network can be used in practical applications, and implements a proof-of-concept application.

Automatic Synthesis of Stitched Wide FOV Images for Reviewing FPV Videos Selecting and Grouping Images Based on Stitching Criteria

TLDR
This paper reports a technique to automatically synthesize stitched wide-FOV images from a first-person-view (FPV) video for review, by approximately reducing the number of image combinations (hypotheses) that must be evaluated during image selection, based on pixel consistency among a small number of local images.

Journalist robot: robot system making news articles from real world

TLDR
Experiments show the ability of the journalist robot system to find news-like phenomena, describe images with words, and characterize events with two values: "anomaly" and "relevance" to the user.

References


Capturing life-log and retrieval based on contexts

  • Tetsuro Hori, K. Aizawa
  • Computer Science
    2004 IEEE International Conference on Multimedia and Expo (ICME) (IEEE Cat. No.04TH8763)
  • 2004
TLDR
This work is developing a "context-based video retrieval system for life-log applications" that can capture not only video and audio but also various sensor data and provides functions that make efficient video browsing and retrieval possible by using data from these sensors, some databases and various document data.

Context-based video retrieval system for the life-log applications

TLDR
This paper attempts to develop a "context-based video retrieval system for life-log applications" that is capable of continuously capturing data not only from a wearable camera and a microphone, but also from various kinds of sensors such as a brain-wave analyzer, a GPS receiver, an acceleration sensor, and a gyro sensor to extract the user's contexts.

Wearable imaging system for summarizing personal experiences

  • Y. Sawahata, K. Aizawa
  • Computer Science
    2003 International Conference on Multimedia and Expo. ICME '03. Proceedings (Cat. No.03TH8698)
  • 2003
TLDR
This paper attempts to develop a "wearable imaging system" that is capable of constantly capturing data, not only from a wearable video camera, but also from various kinds of sensors, such as a GPS, an accelerometer and a gyro sensor.

Summarizing wearable video

TLDR
This paper proposes an approach to the automatic structuring and summarization of wearable video that proved very successful in real-world experiments, automatically extracting all the events that the subjects reported they had found interesting.

Novel Concept for Video Retrieval in Life Log Application

TLDR
A novel concept in video retrieval is demonstrated: integrating the content of video data with context from the various sensors in the Life Log system to detect interesting conversations.

Structuring personal activity records based on attention-analyzing videos from head mounted camera

TLDR
A method for analyzing video records of personal activities captured by a head-mounted camera is introduced; it uses the behaviors the user exhibits when paying attention to something to help retrieve the most important or relevant portions of the videos.

Unsupervised clustering of ambulatory audio and video

  • B. Clarkson, A. Pentland
  • Computer Science
    1999 IEEE International Conference on Acoustics, Speech, and Signal Processing. Proceedings. ICASSP99 (Cat. No.99CH36258)
  • 1999
TLDR
This system can, without any prior labeling of data, cluster the audio/visual data into events, such as passing through doors and crossing the street, and hierarchically cluster these events into scenes, yielding clusters that correlate with visiting the supermarket or walking down a busy street.

Interaction Corpus and Experience Sharing

TLDR
This work demonstrates an application of generating a video-based experience summary that is reconfigured automatically from the interaction corpus, a captured collection of human behaviors and interactions among humans and artifacts.

Wearable interfaces for a video diary: towards memory retrieval, exchange, and transportation

TLDR
Wearable interfaces for a computational memory aid useful in everyday life are discussed, and it is argued that these interfaces can be integrated into the video diary system.

StartleCam: a cybernetic wearable camera

TLDR
The implementation described here aims to capture events that are likely to get the user's attention and to be remembered, creating a "flashbulb" memory archive for the wearer.