Because keeping a food diary can help people develop healthy eating habits, food image recognition is in high demand to reduce the effort of food recording. Previous studies have tackled this challenging domain with datasets containing fixed numbers of samples and classes. In a real-world setting, however, it is impossible to include every food in the database, because the number of food classes is large and grows continually. In addition, inter-class similarity and intra-class diversity make recognition even harder. In this paper, we address these problems by using deep convolutional neural network features to build a personalized classifier that incrementally learns each user's data and adapts to that user's eating habits. As a result, we achieve state-of-the-art food image recognition accuracy after personalization with 300 food records per user.
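As a rough illustration of the approach described in the abstract, the sketch below shows one simple way a per-user classifier could incrementally learn from fixed CNN features: a nearest-class-mean classifier whose per-class means are updated online as the user logs new food records. This is only a minimal stand-in under those assumptions (random vectors replace real CNN embeddings, and the food labels are made up); it is not the authors' actual personalization algorithm, which additionally models the time-dependent and item-dependent distribution of the user's foods.

import numpy as np

class IncrementalNCMClassifier:
    """Per-user nearest-class-mean classifier over fixed CNN features.

    Illustrative only: class means are updated online, so new food
    classes can be added at any time without retraining the backbone.
    """

    def __init__(self):
        self.sums = {}    # food label -> running sum of feature vectors
        self.counts = {}  # food label -> number of records seen so far

    def update(self, feature, label):
        """Incorporate one (CNN feature, food label) record from the user."""
        f = np.asarray(feature, dtype=np.float64)
        if label not in self.sums:
            self.sums[label] = np.zeros_like(f)
            self.counts[label] = 0
        self.sums[label] += f
        self.counts[label] += 1

    def predict(self, feature, top_k=1):
        """Return the top_k labels whose class-mean feature is closest."""
        f = np.asarray(feature, dtype=np.float64)
        dists = {label: np.linalg.norm(f - s / self.counts[label])
                 for label, s in self.sums.items()}
        return sorted(dists, key=dists.get)[:top_k]

# Toy usage: random vectors stand in for embeddings from a frozen CNN,
# and the labels ("rice", "miso_soup") are hypothetical class names.
clf = IncrementalNCMClassifier()
rng = np.random.default_rng(0)
clf.update(rng.normal(size=128), "rice")
clf.update(rng.normal(size=128), "miso_soup")
print(clf.predict(rng.normal(size=128), top_k=2))

In a real food-logging setting, the same structure could be biased toward items the user records frequently or at the current time of day, which is the kind of personalization the paper targets.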
Qing YU
The University of Tokyo
Masashi ANZAWA
The University of Tokyo
Sosuke AMANO
The University of Tokyo, foo.log Inc.
Kiyoharu AIZAWA
The University of Tokyo
Qing YU, Masashi ANZAWA, Sosuke AMANO, Kiyoharu AIZAWA, "Personalized Food Image Classifier Considering Time-Dependent and Item-Dependent Food Distribution," IEICE TRANSACTIONS on Information and Systems, vol. E102-D, no. 11, pp. 2120-2126, November 2019, doi: 10.1587/transinf.2019PCP0005.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2019PCP0005/_p
@ARTICLE{e102-d_11_2120,
author={Qing YU and Masashi ANZAWA and Sosuke AMANO and Kiyoharu AIZAWA},
journal={IEICE TRANSACTIONS on Information and Systems},
title={Personalized Food Image Classifier Considering Time-Dependent and Item-Dependent Food Distribution},
year={2019},
volume={E102-D},
number={11},
pages={2120-2126},
abstract={Since the development of food diaries could enable people to develop healthy eating habits, food image recognition is in high demand to reduce the effort in food recording. Previous studies have worked on this challenging domain with datasets having fixed numbers of samples and classes. However, in the real-world setting, it is impossible to include all of the foods in the database because the number of classes of foods is large and increases continually. In addition to that, inter-class similarity and intra-class diversity also bring difficulties to the recognition. In this paper, we solve these problems by using deep convolutional neural network features to build a personalized classifier which incrementally learns the user's data and adapts to the user's eating habit. As a result, we achieved the state-of-the-art accuracy of food image recognition by the personalization of 300 food records per user.},
keywords={},
doi={10.1587/transinf.2019PCP0005},
ISSN={1745-1361},
month={November},}
TY - JOUR
TI - Personalized Food Image Classifier Considering Time-Dependent and Item-Dependent Food Distribution
T2 - IEICE TRANSACTIONS on Information and Systems
SP - 2120
EP - 2126
AU - Qing YU
AU - Masashi ANZAWA
AU - Sosuke AMANO
AU - Kiyoharu AIZAWA
PY - 2019
DO - 10.1587/transinf.2019PCP0005
JO - IEICE TRANSACTIONS on Information and Systems
SN - 1745-1361
VL - E102-D
IS - 11
JA - IEICE TRANSACTIONS on Information and Systems
Y1 - November 2019
AB - Since the development of food diaries could enable people to develop healthy eating habits, food image recognition is in high demand to reduce the effort in food recording. Previous studies have worked on this challenging domain with datasets having fixed numbers of samples and classes. However, in the real-world setting, it is impossible to include all of the foods in the database because the number of classes of foods is large and increases continually. In addition to that, inter-class similarity and intra-class diversity also bring difficulties to the recognition. In this paper, we solve these problems by using deep convolutional neural network features to build a personalized classifier which incrementally learns the user's data and adapts to the user's eating habit. As a result, we achieved the state-of-the-art accuracy of food image recognition by the personalization of 300 food records per user.
ER -