Where is the latest place deep-learning could be put to good use?
Answer: your kitchen
That’s right: soon, a single snapshot from your smartphone’s camera could be enough to identify a dish’s ingredients and suggest similar recipes. A research team at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a food-recognition deep-learning algorithm that visually dissects photos of food. At the moment, the program is correct around 65 percent of the time, according to Gizmodo’s reporting.
The tool “thinks” by searching a database of more than a million recipes and matching the photo of the food in question to the closest results. The team hopes the tool could one day be used to teach people to cook, count calories and track users’ dietary habits so they ultimately make better choices.
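The matching step described above is, at its core, a nearest-neighbor search: the photo and each recipe are represented as vectors, and the recipes whose vectors lie closest to the photo's are returned. The sketch below illustrates that retrieval step only; the random embeddings, the database size, and the function names are placeholders I've invented for illustration — in the real system the vectors would come from trained neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_RECIPES = 1000   # stand-in for the million-plus recipe database
EMBED_DIM = 128      # assumed embedding size, for illustration only

# Placeholder vectors standing in for learned recipe/photo embeddings.
recipe_embeddings = rng.normal(size=(NUM_RECIPES, EMBED_DIM))
photo_embedding = rng.normal(size=EMBED_DIM)

def top_k_recipes(photo_vec, recipe_vecs, k=5):
    """Return indices of the k recipes most similar to the photo,
    ranked by cosine similarity (highest first)."""
    # Normalize so the dot product equals cosine similarity.
    photo_vec = photo_vec / np.linalg.norm(photo_vec)
    recipe_vecs = recipe_vecs / np.linalg.norm(recipe_vecs, axis=1,
                                               keepdims=True)
    scores = recipe_vecs @ photo_vec
    # Sort descending and keep the k best matches.
    return np.argsort(scores)[::-1][:k]

matches = top_k_recipes(photo_embedding, recipe_embeddings)
print(matches)  # indices of the closest recipes in the database
```

In practice a database of a million-plus recipes would use an approximate nearest-neighbor index rather than a brute-force scan like this, but the ranking idea is the same.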