Figuring out whether the fries on your plate contain traces of trans fat, or if those celery sticks are truly pesticide-free, can be tricky, if not impossible. That’s why Isabel Hoffmann, along with mathematician Stephen Watson, set out to create TellSpec, a hand-held device that you simply point at a food item to identify what’s in it. Not only does the device warn you about chemicals, allergens and ingredients you’d rather avoid, it’ll also help you figure out food sensitivities and track your vitamin intake. The goal, the company says, is to help people make clean food choices by letting them “check their food as easily as they check their mail.”
“We want to promote healthier eating, alert those who have allergies and educate consumers by telling them exactly what’s in their food – beyond what the label says,” Hoffmann tells Gizmag.
The device combines a small Raman spectrometer, a cloud-based analysis algorithm and a simple smartphone app. Scanning a food item on the plate or in a shopping aisle is as simple as aiming TellSpec at it and pushing a button. The device beams a low-powered laser at the item and analyzes the light scattered back from it to identify the chemical makeup of the food.
This data is uploaded to the analysis engine, which processes the information, compares it to reference spectra, interprets the results with the help of a database, and downloads the results to the user’s smartphone. Hoffmann states that the device can successfully identify foods and their ingredients approximately 97.7 percent of the time after scanning the food’s surface.
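TellSpec hasn’t published the details of its cloud-side algorithm, but the matching step it describes – comparing an uploaded spectrum against a database of reference spectra – can be illustrated with a toy sketch. The food names, intensity vectors and cosine-similarity metric below are all illustrative assumptions, not the company’s actual method.

```python
# Hypothetical sketch of spectrum matching: compare an uploaded Raman
# spectrum against reference spectra and return the closest match.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length intensity vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify(scan, reference_db):
    """Return the reference food whose spectrum best matches the scan."""
    best_name, best_score = None, -1.0
    for name, spectrum in reference_db.items():
        score = cosine_similarity(scan, spectrum)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy reference database: intensity readings at a few fixed wavelengths.
reference_db = {
    "apple":  [0.9, 0.1, 0.4, 0.2],
    "peanut": [0.2, 0.8, 0.1, 0.7],
}

name, score = identify([0.85, 0.15, 0.45, 0.25], reference_db)
print(name)  # best match among the toy references
```

A real system would of course use far richer spectra and a trained classifier rather than a nearest-match lookup, but the upload-compare-return loop is the same shape.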
“Depending on how transparent the surface of the food is, the more accurate the scan will be,” explains Hoffmann. “Users must understand that these scans can only go so deep. To scan a Twinkie, the user could do two separate scans for a more accurate reading. One at the surface and then a second in the center of the Twinkie.”
The team scanned 3,000 food items to create the initial database, but the device can potentially identify an unlimited number of ingredients, according to Hoffmann. Its ability to make identifications is expected to improve rapidly as the number of TellSpec users grows and they add their own scans of different food items. Initially, the company plans to direct 82 beta food testers to start TellSpecing early next spring (Northern Hemisphere), as soon as the devices become available, increasing the breadth and depth of the food data.
“This is the crowd-sourced element of our clean food revolution,” Hoffmann tells us. “It is literally in the hands of the people. It is they who will truly participate actively in creating a global footprint of food data. The food database is an evolving number – the more people scan, the more the database grows and the more precise the scans become.”
We’ve seen devices like the iTube, which turns a smartphone into an allergen sensor, but TellSpec is designed to be a smarter device. It will do more than tell you if there’s monosodium glutamate (MSG) in that soup mix or if those chips are truly gluten-free. It can also give you the background story on little-known ingredients like tartrazine, a synthetic lemon-yellow dye commonly used as food coloring. For the calorie-conscious, TellSpec can break down the amount of sugars, fats and more per gram of a scanned food item, and it can help users stay within recommended limits for their intake of toxic substances like mercury.
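The per-gram breakdown described above amounts to simple arithmetic once the composition is known: scale each per-gram figure by the portion size and compare against a recommended limit. The composition and limit figures below are made up for the example, not TellSpec data.

```python
# Illustrative per-gram breakdown: scale a (hypothetical) composition by
# portion size and flag anything that exceeds an assumed daily limit.
PER_GRAM = {"sugar_g": 0.12, "fat_g": 0.05, "mercury_mg": 0.0}       # per gram of food
DAILY_LIMIT = {"sugar_g": 30.0, "fat_g": 70.0, "mercury_mg": 0.016}  # assumed limits

def totals(portion_g):
    """Nutrient totals for a portion of the scanned food, same units as PER_GRAM."""
    return {k: v * portion_g for k, v in PER_GRAM.items()}

def over_limit(portion_g):
    """Names of nutrients whose totals exceed the assumed daily limit."""
    return [k for k, v in totals(portion_g).items() if v > DAILY_LIMIT[k]]

print(totals(100)["sugar_g"])  # ~12 g of sugar in a 100 g portion
print(over_limit(300))         # a 300 g portion exceeds the sugar limit
```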
Plans are in the works to have the device calculate the volume of food a person consumes, too. While it doesn’t diagnose food sensitivities or allergies, users who are uncertain about food triggers can log symptoms after they’ve eaten something (like feeling uncomfortable after drinking a glass of milk) to get TellSpec’s suggestions.
“It will ask you how you feel,” Hoffmann explains. “If you tell TellSpec you feel bloated, it will suggest that it may be caused by lactose and that you should check with your doctor about the possibility of an allergy.”
Hoffmann hopes that the device will find use as a holistic health tool, as its food database grows larger and people use it in real time to register any disturbing symptoms.
“Eventually, the food data ‘bank’ will compile, across time, people’s historical food data and individual symptoms at a global level,” says Hoffmann. “These correlations between how people feel and what they really eat will eventually lead to a ‘TellSpecodedia’ of food data and personalized health information.”
All the TellSpec data will be open source, allowing anyone to use it to create their own health-based apps. For instance, a diabetes app that tracks blood sugar levels could use TellSpec data to track what sugars or carbohydrates the user consumes, and identify ingredients in their food that would also convert into sugar.
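The diabetes-app scenario can be sketched in a few lines. Since the public API had not been published at the time of writing, the record format, field names and ingredient list below are assumptions for illustration only.

```python
# Hypothetical third-party diabetes app built on open TellSpec scan data:
# sum the carbohydrates a user logs and flag sugar-converting ingredients.
SUGAR_CONVERTING = {"maltodextrin", "dextrose", "corn syrup", "white flour"}

def daily_carbs(scans):
    """Total carbohydrate grams across a day's logged scans."""
    return sum(s["carbs_g"] for s in scans)

def sugar_sources(scans):
    """Ingredients in the day's scans that convert into sugar."""
    found = set()
    for s in scans:
        found |= SUGAR_CONVERTING & set(s["ingredients"])
    return sorted(found)

# Assumed shape of a logged scan record.
scans = [
    {"food": "granola bar", "carbs_g": 29.0, "ingredients": ["oats", "corn syrup"]},
    {"food": "yogurt", "carbs_g": 17.0, "ingredients": ["milk", "dextrose"]},
]
print(daily_carbs(scans))    # 46.0
print(sugar_sources(scans))  # ['corn syrup', 'dextrose']
```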
“By sharing our API with the world, we want TellSpec to engage the crowd-sourcing power that a group and population bring to any problem,” says Hoffmann. “One or two brains on any food issue would not yield the answers that the world could definitely benefit from. We want this to be an open source for new applications and new fields of study that grow from a source of food data that has never been available before.”
TellSpec is currently under development, after raising three times its funding goal on Indiegogo. Shipping is slated to begin in August 2014. Its US$320 price tag includes one year of free analysis of food scans, with further analyses being made available through subscription plans.
Here’s a first look at the device.