
Study offers improvements to food quality computer predictions

Sept. 24, 2024

By John Lovett
University of Arkansas System Division of Agriculture
Arkansas Agricultural Experiment Station

Fast facts

  • Human perceptions of food under different lighting conditions improve computer predictions
  • Machine vision-based prediction errors decreased by 20 percent
  • Lighting color temperature, brightness influence human perception of food quality

(693 words)


FAYETTEVILLE, Ark. — Have you ever stood in front of apples on display at the grocery store trying to pick out the best ones and wondered, “Is there an app for this?”

FOOD QUALITY PREDICTION — Dongyi Wang's study showed computer prediction of food quality improved when based on human perceptions under various lighting situations. (U of A System Division of Agriculture photo by Paden Johnson)

Current machine learning-based computer models used for predicting food quality cannot adapt to changing environmental conditions as consistently as humans can. Still, information compiled in an Arkansas Agricultural Experiment Station study may someday be used to develop that app, as well as give grocery stores insights on presenting foods in a more appealing manner and help optimize software designs for machine vision systems used in processing facilities.

The study led by Dongyi Wang, assistant professor of smart agriculture and food manufacturing in the biological and agricultural engineering department and the food science department, was recently published in the Journal of Food Engineering.

Even though human perception of food quality can be manipulated with illumination, the study showed that computers trained with data from human perceptions of food quality made more consistent food quality predictions under different lighting conditions.

“When studying the reliability of machine-learning models, the first thing you need to do is evaluate the human’s reliability,” Wang said. “But there are differences in human perception. What we are trying to do is train our machine-learning models to be more reliable and consistent.”

The study, supported by the National Science Foundation, showed that computer prediction errors can be decreased by about 20 percent using data from human perceptions of photos taken under different lighting conditions. This approach outperformed an established model trained on pictures without taking the variability of human perception into consideration.

Even though machine vision techniques have been widely studied and applied in the food engineering field, the study noted that most current algorithms are trained on “human-labeled ground truths or simple color information.” No previous studies had considered the effects of illumination variations on human perception, or how those biases can affect the training of machine vision models for food quality evaluations, the authors stated.

The researchers used lettuce to evaluate human perceptions under different lighting conditions, which were in turn used to train the computer model. Sensory evaluations were done at the experiment station’s Sensory Science Center. Han-Seok Seo, professor in the food science department and director of the Sensory Science Center, was a co-author of the study.

Out of 109 participants in a broad age range, 89 completed all nine sensory sessions of the human perception reliability phase of the study. None of the participants were color blind or had vision problems. On five consecutive days, the panelists evaluated 75 images of Romaine lettuce each day, grading the freshness of the lettuce on a scale of zero to 100.

The images of lettuce the sensory panel graded were of samples photographed over the course of eight days to capture different levels of browning. They were taken under various lighting brightnesses and color temperatures, ranging from a bluish “cool” tone to an orangey “warm” tone, to build a dataset of 675 images.
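
For readers who want a concrete picture of how such a dataset might be organized, the sketch below indexes each photo by its browning stage and lighting condition. It is illustrative only; the field names, units and example values are assumptions, not the study's actual files or code.

```python
# Illustrative sketch only -- not the study's code or file layout.
# Each photo is tagged with its browning stage and lighting condition so it
# can later be paired with the averaged 0-100 panel grade.
from dataclasses import dataclass

@dataclass
class LettuceImage:
    path: str                 # image file (hypothetical name)
    browning_day: int         # 1-8, day within the browning series
    color_temp_k: int         # lighting color temperature in kelvin (cool vs. warm)
    brightness: str           # lighting brightness level
    mean_panel_score: float   # averaged 0-100 freshness grade from the sensory panel

# Example record with made-up values:
sample = LettuceImage("lettuce_day3_warm_high_012.jpg", 3, 3000, "high", 62.4)
```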

Several well-established machine learning models were applied to evaluate the same images as the sensory panel, the study noted. Different neural network models took the sample images as inputs and were trained to predict the corresponding average human grades in order to better mimic human perception.
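
To give a sense of what training such a model could look like, here is a minimal sketch that fine-tunes a pretrained image network to regress the averaged zero-to-100 panel grade from each photo. It assumes a PyTorch workflow; the study's actual architectures, hyperparameters and data pipeline are not described in this release, and the LettuceDataset loader is hypothetical.

```python
# Minimal sketch assuming PyTorch -- not the study's implementation.
import torch
import torch.nn as nn
from torchvision import models

# Replace the classifier head with a single regression output (freshness score).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def train_one_epoch(model, loader):
    """One pass over (image, mean_panel_score) batches from a hypothetical
    LettuceDataset wrapped in a torch.utils.data.DataLoader."""
    model.train()
    for images, scores in loader:
        preds = model(images).squeeze(1)       # predicted 0-100 grades
        loss = loss_fn(preds, scores.float())  # compare to averaged human grades
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```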

As seen in other experiments at the Sensory Science Center, human perception of food quality can be manipulated with illumination. For example, warmer environmental colors can disguise lettuce browning, Wang explained.

Wang said the method to train machine vision-based computers using human perceptions under different lighting conditions could be applied to many things, from foods to jewelry.

Other co-authors of the study from the University of Arkansas included Shengfan Zhang, associate professor of industrial engineering in the College of Engineering; Swarna Sethu, former post-doctoral researcher in the biological and agricultural engineering department and now assistant professor of computer information sciences at Missouri Southern State University; and Victoria J. Hogan, program assistant in the food science department.

The study was supported by the National Science Foundation, grant numbers OIA-1946391 and 2300281. The authors also recognized graduate and senior undergraduate students Olivia Torres, Robert Blindauer and Yihong Feng for helping collect, analyze and grade samples.

To learn more about the Division of Agriculture research, visit the Arkansas Agricultural Experiment Station website. Follow us on X at @ArkAgResearch, subscribe to the Food, Farms and Forests podcast and sign up for our monthly newsletter, the Arkansas Agricultural Research Report. To learn more about the Division of Agriculture, visit uada.edu. Follow us on X at @AgInArk. To learn about extension programs in Arkansas, contact your local Cooperative Extension Service agent or visit uaex.uada.edu.

About the Division of Agriculture

The University of Arkansas System Division of Agriculture’s mission is to strengthen agriculture, communities, and families by connecting trusted research to the adoption of best practices. Through the Agricultural Experiment Station and the Cooperative Extension Service, the Division of Agriculture conducts research and extension work within the nation’s historic land grant education system.

The Division of Agriculture is one of 20 entities within the University of Arkansas System. It has offices in all 75 counties in Arkansas and faculty on five system campuses.

The University of Arkansas System Division of Agriculture offers all its Extension and Research programs and services without regard to race, color, sex, gender identity, sexual orientation, national origin, religion, age, disability, marital or veteran status, genetic information, or any other legally protected status, and is an Affirmative Action/Equal Opportunity Employer.

 

# # #

Media Contact: John Lovett
U of A System Division of Agriculture
Arkansas Agricultural Experiment Station
(479) 763-5929
jlovett@uada.edu
