• Corpus ID: 235694478

Egocentric Image Captioning for Privacy-Preserved Passive Dietary Intake Monitoring

  • Jianing Qiu, Frank P.-W. Lo, Xiao Gu, Modou Lamin Jobarteh, Wenyan Jia, Tom Baranowski, Matilda Steiner-Asiedu, Alex Kojo Anderson, Megan A. McCrory, Edward S. Sazonov, Mingui Sun, Gary S. Frost, Benny P. L. Lo
Camera-based passive dietary intake monitoring can continuously capture the eating episodes of a subject, recording rich visual information, such as the type and volume of food being consumed, as well as the subject's eating behaviours. However, there is currently no method that can incorporate these visual cues and provide a comprehensive context of dietary intake from passive recording (e.g., is the subject sharing food with others, what food is the subject eating, and how… 


Counting Bites and Recognizing Consumed Food from Videos for Passive Dietary Monitoring
Deep neural networks are used to perform bite counting and food item recognition in an end-to-end manner; the results show that recognizing consumed food items is indeed harder than recognizing visible ones, with a 25% drop in F1 score.
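The 25% figure above refers to the F1 metric. A minimal sketch of how F1 is computed (the precision and recall values here are hypothetical, not taken from the paper):

```python
def f1(precision: float, recall: float) -> float:
    """F1 score: the harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical numbers for illustration only.
visible_f1 = f1(precision=0.80, recall=0.80)   # F1 on visible food items
consumed_f1 = visible_f1 - 0.25                # a 25-point drop for consumed items
print(round(visible_f1, 2), round(consumed_f1, 2))
```

Note that a "25% drop" could be read as absolute points or as a relative reduction; the sketch above shows the absolute-points reading.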
Assessing Individual Dietary Intake in Food Sharing Scenarios with a 360 Camera and Deep Learning
  • Jianing Qiu, F. P. Lo, B. Lo
  • Computer Science
    2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN)
  • 2019
This paper proposes a novel vision-based approach for estimating individual dietary intake in food sharing scenarios, incorporating food detection, face recognition and hand tracking.
Point2Volume: A Vision-Based Dietary Assessment Approach Using View Synthesis
Compared to previous methods, this approach addresses several major challenges in vision-based dietary assessment, such as view occlusion and scale ambiguity, and outperforms prior work in accurate portion size estimation.
“Automatic Ingestion Monitor Version 2” – A Novel Wearable Device for Automatic Food Intake Detection and Passive Capture of Food Images
Results suggest that AIM-2 can provide accurate detection of food intake, reduce the number of images for analysis and alleviate the privacy concerns of the users.
Im2Calories: Towards an Automated Mobile Vision Food Diary
A system that can recognize the contents of a meal from a single image and then predict its nutritional content, such as calories, is presented, significantly outperforming previous work.
Learning Deep Representations for Video-Based Intake Gesture Detection
This study collects and labels 360-degree video data of eating occasions from 102 participants, and shows that deep learning architectures can be applied to video-based detection of intake gestures.
An Intelligent Food-Intake Monitoring System Using Wearable Sensors
This paper presents a wearable sensor platform that autonomously provides detailed information regarding a subject's dietary habits and demonstrates a detailed overview of the subject's food intake that is difficult to quantify from manually-acquired food records.
Auracle: Detecting Eating Episodes with an Ear-mounted Sensor
A wearable earpiece that automatically recognizes eating behavior in free-living conditions, using an off-the-shelf contact microphone placed behind the ear to sense, process, and classify audio data in real time.
Food Volume Estimation Based on Deep Learning View Synthesis from a Single Depth Map
A view synthesis approach based on deep learning is proposed to reconstruct 3D point clouds of food items and estimate their volume from a single depth image; it is evaluated by comparing the volume estimated from the synthesized 3D point cloud with the ground truth volume of the food items.
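The final step of such a pipeline, going from a reconstructed point cloud to a volume, can be illustrated with a naive voxel-counting baseline. This is not the paper's deep-learning method; `voxel_volume` is a hypothetical helper shown only to make the point-cloud-to-volume step concrete:

```python
import math

def voxel_volume(points, voxel_size):
    """Approximate the volume occupied by a dense 3D point cloud by
    counting the distinct voxels that contain at least one point.
    `points` is an iterable of (x, y, z) tuples; units follow `voxel_size`."""
    occupied = {tuple(math.floor(c / voxel_size) for c in p) for p in points}
    return len(occupied) * voxel_size ** 3

# Two points in neighbouring 1 cm voxels -> two voxels of 1e-6 m^3 each.
pts = [(0.005, 0.005, 0.005), (0.015, 0.005, 0.005)]
print(voxel_volume(pts, 0.01))  # ~2e-6 m^3
```

A voxel baseline like this only works when the cloud densely fills the object's interior, which is precisely why occlusion and sparse depth views make the problem hard.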
Food watch: detecting and characterizing eating episodes through feeding gestures
This work shows the potential of commercial smartwatches to detect short-duration eating episodes confounded by other activities of daily living, in order to truly capture all eating episodes in the field.