
Mixed deep learning and natural language processing method for fake-food image recognition and standardization to help automated dietary assessment

The present study tested the combination of an established and a validated food-choice research method (the 'fake food buffet') with a new food-matching technology to automate the data collection and analysis. The methodology combines fake-food image recognition using deep learning and food matching and standardization based on natural language processing. The former is specific because it uses a single deep learning network to perform both the segmentation and the classification at the pixel level of the image. To assess its performance, measures based on the standard pixel accuracy and Intersection over Union were applied. Food matching firstly describes each of the recognized food items in the image and then matches the food items with their compositional data, considering both their food names and their descriptors. The final accuracy of the deep learning model trained on fake-food images acquired by 124 study participants and providing fifty-five food classes was 92·18 %, while the food matching was performed with a classification accuracy of 93 %. The present findings are a step towards automating dietary assessment and food-choice research. The methodology outperforms other approaches in pixel accuracy, and since it is the first automatic solution for recognizing the images of fake foods, the results could be used as a baseline for possible future studies. As the approach enables a semi-automatic description of recognized food items (e.g. with respect to FoodEx2), these can be linked to any food composition database that applies the same classification and description system.
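
For readers unfamiliar with the evaluation measures named in the abstract, pixel accuracy and Intersection over Union (IoU) are standard image-segmentation metrics computed from a predicted label mask and a ground-truth mask, one class index per pixel. The Python sketch below is a minimal illustration of those standard definitions only; the function and variable names are hypothetical and it is not the authors' evaluation code.

import numpy as np

def segmentation_metrics(pred, truth, num_classes):
    # pred, truth: integer label masks of identical shape, one class index per pixel.
    # Confusion matrix: rows are ground-truth classes, columns are predicted classes.
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(cm, (truth.ravel(), pred.ravel()), 1)

    # Pixel accuracy: fraction of pixels whose predicted class matches the ground truth.
    pixel_accuracy = np.trace(cm) / cm.sum()

    # Per-class IoU: intersection / union, where union = truth pixels + predicted pixels - intersection.
    intersection = np.diag(cm)
    union = cm.sum(axis=0) + cm.sum(axis=1) - intersection
    iou = np.divide(intersection, union, out=np.full(num_classes, np.nan), where=union > 0)
    return pixel_accuracy, iou

# Toy example: a 4x4 image with three classes (0 = background/plate).
truth = np.array([[0, 0, 1, 1],
                  [0, 2, 1, 1],
                  [0, 2, 2, 1],
                  [0, 0, 2, 2]])
pred = truth.copy()
pred[1, 1] = 0                      # one misclassified pixel
acc, iou = segmentation_metrics(pred, truth, num_classes=3)
print(acc)                          # 0.9375 (15 of 16 pixels correct)
print(iou)                          # per-class IoU values

Per-class IoU values are usually averaged (mean IoU) and reported alongside overall pixel accuracy when segmentation performance is summarized.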

Bibliographic Details
Published in: Public health nutrition, 2019-05, Vol. 22 (7), p. 1193-1202
Main Authors: Mezgec, Simon; Eftimov, Tome; Bucher, Tamara; Koroušić Seljak, Barbara
Format: Article
Language: English
Online Access:Get full text
DOI: 10.1017/S1368980018000708
ISSN: 1368-9800
EISSN: 1475-2727
PMID: 29623869
Source: Cambridge University Press; PubMed Central
Subjects:
Algorithms
Artificial intelligence
Automation
Classification
Data collection
Deep Learning
Diet Records
Food
Food composition
Food Preferences
HOT TOPIC: ICT Assisted Dietary Data Collection and Analysis
Humans
Image acquisition
Image classification
Image processing
Image Processing, Computer-Assisted
Image segmentation
Information sources
International conferences
Knowledge management
Laboratories
Language
Machine learning
Matching
Meals
Model accuracy
Multimedia
Natural Language Processing
Nutrition Assessment
Object recognition
Pattern recognition
Pixels
Research Paper
Researchers
Standardization
Studies