I2N: image to nutrients, a sensor guided semi-automated tool for annotation of images for nutrition analysis of eating episodes.
Ghosh, Tonmoy; McCrory, Megan A; Marden, Tyson; Higgins, Janine; Anderson, Alex Kojo; Domfe, Christabel Ampong; Jia, Wenyan; Lo, Benny; Frost, Gary; Steiner-Asiedu, Matilda; Baranowski, Tom; Sun, Mingui; Sazonov, Edward.
Affiliation
  • Ghosh T; Department of Electrical and Computer Engineering, University of Alabama, Tuscaloosa, AL, United States.
  • McCrory MA; Department of Health Sciences, Boston University, Boston, MA, United States.
  • Marden T; Colorado Clinical and Translational Sciences Institute, University of Colorado, Denver, CO, United States.
  • Higgins J; Department of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States.
  • Anderson AK; Department of Nutritional Sciences, University of Georgia, Athens, GA, United States.
  • Domfe CA; Department of Nutritional Sciences, University of Georgia, Athens, GA, United States.
  • Jia W; Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA, United States.
  • Lo B; Department of Surgery and Cancer, Imperial College, London, United Kingdom.
  • Frost G; Department of Metabolism, Digestion and Reproduction, Imperial College, London, United Kingdom.
  • Steiner-Asiedu M; Department of Nutrition and Food Science, University of Ghana, Accra, Ghana.
  • Baranowski T; Children's Nutrition Research Center, Department of Pediatrics, Baylor College of Medicine, Houston, TX, United States.
  • Sun M; Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, United States.
  • Sazonov E; Department of Electrical and Computer Engineering, University of Alabama, Tuscaloosa, AL, United States.
Front Nutr; 10: 1191962, 2023.
Article in En | MEDLINE | ID: mdl-37575335
Introduction: Dietary assessment is important for understanding nutritional status. Traditional self-report methods of monitoring food intake, such as diet diaries, 24-hour dietary recall, and food frequency questionnaires, are subject to error and can be time-consuming for the user.

Methods: This paper presents a semi-automatic dietary assessment tool we developed, a desktop application called Image to Nutrients (I2N), to process sensor-detected eating events and the images captured during those events by a wearable sensor. I2N offers multiple food and nutrient databases (e.g., USDA-SR, FNDDS, and the USDA Global Branded Food Products Database) for annotating eating episodes and food items, and it estimates energy intake, nutritional content, and the amount consumed. I2N has three components: 1) sensor-guided image review, 2) annotation of food images for nutritional analysis, and 3) access to multiple food databases. Two studies were used to evaluate the feasibility and usefulness of I2N: 1) a US-based study with 30 participants and a total of 60 days of data, and 2) a Ghana-based study with 41 participants and a total of 41 days of data.

Results: Across the two studies, a total of 314 eating episodes were annotated using at least three food databases. Using I2N's sensor-guided image review, the number of images that needed to be reviewed was reduced by 93% and 85% in the two studies, respectively, compared to reviewing all the images.

Discussion: I2N is a unique tool that combines simultaneous viewing of food images, sensor-guided image review, and access to multiple databases in one application, making nutritional analysis of food images efficient. The tool is flexible, allowing nutritional analysis of images even when sensor signals are not available.
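As an illustration of the workflow the abstract describes, the minimal Python sketch below shows the two core steps: restricting image review to sensor-detected eating episodes, and scaling per-100 g database values by the annotated amounts to estimate an episode's nutrients. All names, fields, the two-minute matching margin, and the toy nutrient table are assumptions made for illustration; they are not taken from the I2N implementation or from an actual USDA database.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical sketch of I2N-style sensor-guided review and nutrient
# lookup. Names, fields, and values are illustrative assumptions,
# not the published implementation.

@dataclass
class EatingEpisode:
    start: datetime  # episode boundaries detected by the wearable sensor
    end: datetime

@dataclass
class Annotation:
    food: str    # description matched to a food database entry
    grams: float # estimated amount consumed

# Toy per-100 g nutrient values, standing in for a lookup against
# USDA-SR / FNDDS / the Global Branded Food Products Database.
FOOD_DB = {
    "white rice, cooked": {"kcal": 130.0, "protein_g": 2.7},
    "chicken breast, roasted": {"kcal": 165.0, "protein_g": 31.0},
}

def images_to_review(image_times, episodes, margin=timedelta(minutes=2)):
    """Keep only images that fall inside a sensor-detected eating
    episode (plus a small margin), mirroring the reported 85-93%
    reduction in images that must be reviewed."""
    return [t for t in image_times
            if any(ep.start - margin <= t <= ep.end + margin
                   for ep in episodes)]

def episode_nutrients(annotations):
    """Scale per-100 g database values by the annotated amount and sum."""
    totals = {}
    for a in annotations:
        for nutrient, per_100g in FOOD_DB[a.food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * a.grams / 100.0
    return totals

if __name__ == "__main__":
    episodes = [EatingEpisode(datetime(2023, 1, 1, 12, 0),
                              datetime(2023, 1, 1, 12, 20))]
    snaps = [datetime(2023, 1, 1, h, m)
             for h, m in [(9, 30), (12, 5), (12, 18), (15, 0)]]
    print(len(images_to_review(snaps, episodes)), "of", len(snaps),
          "images to review")  # 2 of 4

    meal = [Annotation("white rice, cooked", 180.0),
            Annotation("chicken breast, roasted", 120.0)]
    print(episode_nutrients(meal))  # {'kcal': 432.0, 'protein_g': 42.06}
```

In the actual tool, the episode boundaries come from the wearable sensor's eating detection and the nutrient values from whichever of the supported databases the annotator selects; here both are stubbed so the sketch is self-contained and runnable.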
Keywords

Full text: 1 Collections: 01-international Database: MEDLINE Study type: Qualitative_research Language: En Journal: Front Nutr Publication year: 2023 Document type: Article Country of affiliation: United States
