The analysis of people's nutrition habits is one of the most important mechanisms for thoroughly monitoring several medical conditions (e.g. diabetes, obesity) that affect a high percentage of the global population. Methods for automatically logging one's meals could not only make the process easier, but also remove the subjectivity introduced by the user's own reporting and interpretation. One solution adopted recently to ease the automatic construction of nutrition diaries is to ask individuals to take photos of their meals with their mobile phones. An alternative technique is visual lifelogging, in which a wearable camera automatically captures pictures from the user's point of view (the egocentric point of view), with the aim of analysing different patterns of the wearer's daily life and extracting highly relevant information such as nutritional habits. In this talk we will show how deep learning, applied to the food detection and food recognition problems, can help to automatically infer the user's eating patterns.
IOS Press, Inc.