The right taste and texture are critical to brand success while speed and accuracy are key in getting the right product to the right consumers.
“What does it taste like?” Whether it is asked of a sensory scientist or a consumer, this question is never simple to answer. The descriptive language that humans have used for centuries to describe and create foods and beverages can be difficult to articulate, and the same words can carry different meanings across cultures and individuals. Taste is highly subjective and varies between individuals, let alone cultures.
Historically, the world’s numerous languages have been bountiful with adjectives, and those adjectives were the only way to describe taste. Today, Analytical Flavor Systems, the New York-based company behind Gastrograph AI, has found a way to quantify sensory adjectives and consumer language to create an evolving, dynamic, and intelligent system for predicting consumer sensory perception and preference – one that is just as accurate as traditional methods, but faster and more efficient.
A new study demonstrated Gastrograph AI’s ability to save time, cost, and human resources with accurate predictions of consumer perception and preference. The study was the result of a partnership between Ajinomoto Co., Inc. (Ajinomoto), a leader in the digital transformation of R&D, and a major global market research firm in China.
The Changing Landscape
Central location testing (CLT) has always been the “go-to” research method to analyze how a new product and/or line extension will fare in the marketplace, or to provide the marketing team with creative inspiration for campaigns and positioning. But it remains largely subjective, majority-rules, and slow. The typical innovation cycle, from concept to launch, is long, with an average consumer focus group step in R&D lasting about three months. Factor in uncontrollable events such as a pandemic, supply chain disruptions, and pressure on test subjects to quantify the product, and consumer research is derailed and/or significantly delayed, creating mounting costs.
Using traditional methods such as CLT and in-home usage tests (iHUT), companies engage in a time-consuming and costly process to collect consumer insights and data with a limited shelf life. Frequently, CLT or iHUT data is used only once, for the specific project the recruitment was designed for. Despite this, CLT remains the most common method of collecting consumer insights on foods and beverages.
Gastrograph AI reimagined what consumer insights and market research can be with a convenient, fast and most importantly, predictive methodology. The dynamic artificial intelligence system, which is being actively used to predict the perception and preferences of targeted consumers across food and beverage categories around the world, is saving the large multinational user base time and resources – chiefly by eliminating the need for new data collection. And it has been proven a success.
Analytical Flavor Systems has been growing quietly behind the scenes and can now predict perception and preference for more than 16 countries and 100 product categories. Now, this new study provides proof that Gastrograph AI works far more efficiently than traditional methods. Gastrograph AI is uniquely able to leverage its proprietary database, the world’s largest sensory data set, drawn from a diverse range of products and consumers all over the world, to train the AI. Just 10–15 tasters are needed to review a product for the AI to predict how any other consumer demographic in the world will respond to it.
The purpose of this study was to validate Gastrograph AI’s ability to accurately predict human perception and preference across demographic parameters when compared to the results of conventional consumer sensory methods.
In this original blind study, Gastrograph AI predicted perception and preferences for 10 different Chinese consumer demographics from a non-representative group of 12 Japanese tasters in Tokyo, using proprietary perception translation algorithms. Gastrograph AI made all predictions available to Ajinomoto before any data was collected in China. For validation, the Ajinomoto team conducted a CLT with a major market research firm in China using a traditional survey methodology. The aim was to demonstrate predictive accuracy, repeatably, across these 10 Chinese demographic targets.
Using a screener, the traditional process collected overall consumer liking data for the product category from 242 consumers selected to participate in the test based on frequency of consumption and other measures. These selected individuals were drawn from the same population that Gastrograph AI made predictions for.
The 12 Japanese panelists in Tokyo tasted nine products with Gastrograph AI before any data collection was conducted by the independent market research firm in China. The panelists entered their impressions of the products into Gastrograph AI’s proprietary sensory interface. By contrast, the 242 respondents in China answered a series of product evaluation questions, with each session lasting 5–10 minutes per test product. Post-survey questions collected information on each respondent’s cultural and educational background. In total, the CLT in China took 3 months (12 weeks), while the Gastrograph AI predictions took less than 2 weeks from start to finish.
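The validation logic described above can be sketched in a few lines: compare the panel-derived predictions against the CLT’s observed mean liking scores and measure how well they agree. Note that all product scores below are invented for illustration, and Pearson correlation is used here only as a generic agreement metric; the study’s actual data and evaluation method are not public.

```python
# Hypothetical sketch of predicted-vs-observed validation.
# All numbers are invented; the real study data is not public.

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Mean liking per product on a 9-point hedonic scale: predicted from a
# small panel vs. observed in a large CLT (both columns invented).
predicted = [6.1, 5.4, 7.2, 4.8, 6.6, 5.9, 7.0, 5.2, 6.3]
observed  = [6.3, 5.1, 7.0, 5.0, 6.8, 6.0, 6.7, 5.5, 6.1]

r = pearson_r(predicted, observed)
print(f"Pearson r = {r:.3f}")
```

A high correlation between the two columns would indicate that the small-panel predictions track the large-sample ground truth, which is the claim the study set out to test.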
Collectively, this research is the first of its kind publicly released and is empirical proof that Gastrograph AI’s predictions are as accurate as Analytical Flavor Systems has claimed.
Ajinomoto is an early adopter of Gastrograph AI’s platform and expected Gastrograph AI predictions to be generally accurate for the overall Chinese population. However, the level of predictive accuracy on the much harder test of sub-demographics, such as Upper-Class Chinese Millennials, was a surprising result to the Ajinomoto team.
Hiroya Kawasaki, Ph.D., Associate General Manager of Ajinomoto’s Institute of Food Sciences and Technologies, states, “The accuracy and resolution of the perception translation model for predicting preferences exceeded our expectations. Gastrograph AI is able to reduce the time to get critical consumer sensory insights and is at least an order of magnitude faster than existing empirical methods.”
Opening New Doors
Using data entered by users, Gastrograph AI calculates its own representation of where each food and beverage lies in high-dimensional flavor space.
Jason Cohen, Founder and CEO of Analytical Flavor Systems, explains, “Our artificial intelligence works by learning the position of each product in infinite-dimensional Hilbert space and modeling each flavor, aroma, and texture as a topological subspace. The math is complex, but it makes consumer science and product development insights fast and easy.”
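The quoted description refers to proprietary mathematics, but the general idea of placing products in a shared attribute space and comparing them geometrically can be illustrated with a toy sketch. The attribute names, products, and intensity scores below are all invented, and cosine similarity stands in for whatever distance the real model uses.

```python
# Illustrative only: a toy "flavor space". Each product becomes a vector of
# tasted attribute intensities (0-5 scale); all names and values invented.
import math

ATTRIBUTES = ["sweet", "bitter", "roasted", "fruity", "astringent"]
products = {
    "cold_brew":   [1.0, 3.5, 4.0, 0.5, 2.5],
    "light_roast": [1.5, 2.0, 2.5, 3.0, 1.5],
    "green_tea":   [0.5, 2.5, 1.0, 1.0, 3.0],
}

def cosine_similarity(a, b):
    """Angle-based similarity between two products in attribute space."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

sim = cosine_similarity(products["cold_brew"], products["light_roast"])
print(f"cold brew vs. light roast similarity: {sim:.3f}")
```

Once products live in a common space like this, nearby products can be expected to be perceived similarly, which is the intuition behind predicting reactions to one product from data on its neighbors.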
The more Gastrograph AI is employed, the more predictive and precise it becomes. During the past 10 years, Analytical Flavor Systems has collected the largest sensory data set of perceptions and preferences of in-market products, with data on every ready-to-eat and ready-to-drink category available from more than 16 countries and 30 regions. With three standing panels in New York City and Shanghai, its data continues to grow across products and categories – a truly massive data set.
Considering that all data collected can be recycled and reused to make new predictions for an increasing number of demographics, the ability of Gastrograph AI to obtain fast and accurate consumer insights on in-market products is continuously improving. For example, data collected by panelists in Japan can be used to predict preferences in Coastal China, or for any target demographic covered by the database from countries as divergent as Brazil, Germany or Thailand. No new data collection would be necessary.
As another example, a collection of only 10 respondent reviews of a yogurt can be translated into multiple demographics, and it can also be used for predictions in different product categories such as beer or chips. This considerably reduces the time (and budget) spent on sensory data collection and opens new doors to creating better, more targeted food and beverage products.
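The data-reuse idea can be sketched under loud assumptions: here “perception translation” is reduced to a per-attribute linear offset supposedly learned from historical paired data, which is far simpler than the proprietary algorithms the article describes. Every name and number below is invented.

```python
# Toy perception-translation sketch. The offsets are invented; the real
# translation algorithms are proprietary and far more sophisticated.

# Mean attribute ratings from a Tokyo panel for one yogurt (0-5 scale).
tokyo_panel = {"sweet": 2.8, "sour": 3.1, "creamy": 3.6}

# Hypothetical per-attribute adjustments learned from historical paired
# data between Japanese panels and a Chinese target demographic.
learned_offsets = {"sweet": +0.4, "sour": -0.3, "creamy": +0.1}

def translate(ratings, offsets, lo=0.0, hi=5.0):
    """Apply learned offsets and clamp results to the rating scale."""
    return {
        attr: round(min(hi, max(lo, value + offsets.get(attr, 0.0))), 2)
        for attr, value in ratings.items()
    }

predicted_china = translate(tokyo_panel, learned_offsets)
print(predicted_china)  # {'sweet': 3.2, 'sour': 2.8, 'creamy': 3.7}
```

The point of the sketch is only that, once such a mapping exists, no new tasting session is needed to produce an estimate for the target demographic: the original panel data is reused.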
In normal times, the value of AI for predicting perceptions and preferences is its ability to develop products anywhere and predict how they will perform in any targeted market or demographic without requiring a stratified random sampling from consumer tests or central location tests.
During COVID-19 quarantines and lockdowns, when CLTs have not been safe or viable, Gastrograph AI has been the go-to method for keeping the innovation pipeline moving in a time of great consumer change.
The Future is Here
Gastrograph AI has proven that taste – the unique combination of flavor, aroma, and texture – is now quantifiable, eliminating the need to take the long road of surveying consumers to understand the market.
Just as Pantone has mastered the art and science of digitizing color, so too has Gastrograph AI mastered the digitizing of flavor, aroma and texture. This unique technology helps food and beverage CPG brands to produce better, more precisely calibrated products much faster, thereby condensing the product development cycle.
This breakthrough test arrives at a most auspicious time for developing foods and beverages that satisfy increasingly diverse taste demands. As one example, according to Edward Bergen, Global Food & Drink Analyst for market research intelligence firm Mintel, the taste of a perfect biscuit is the primary selection factor for 75% of consumers who eat biscuits. In the sweet market, he noted, “many brands [that responded to Mintel’s survey] have sought to just develop core flavors rather than something brand new. Another strategy is to intensify favorites.” Similar research has shown that flavor is the number one predictor of a product’s success or failure in the market, all else being equal.
This recent market data emphasizes the power of ensuring the taste is right. Cohen concludes, “We’re helping companies make targeted products that consumers love — more diverse products for a more diverse world.”