Google Lens now supplies recipes and nutrition advice, starting with Uncle Ben’s products

Ever grocery shop without a list? If so, the newest collaboration involving Google Lens, Google’s AI-powered search and computer vision tool, might be just what the doctor ordered. Mountain View tech startup Innit and Mars Food have jointly announced that, starting today, Lens will reveal “dynamic content,” like recipes, ingredient lists, and nutrition advice from Innit’s connected platform, beginning with Uncle Ben’s foods.

Mars notes that with this integration, Uncle Ben’s becomes the first food brand to supply Lens users with information beyond basic web results.

Through Lens, you’ll get meal recommendations based on your tastes, dietary preferences, and allergies, along with a personalized score for products like Uncle Ben’s Ready Rice, Flavored Grains, Flavor Infusions, and beans. Additionally, you’ll see meals that can be built around the product, accompanied by step-by-step cooking instructions and guided videos.

“The … experience is designed to help families cut through the clutter to provide recommendations, inspiration, and information where and when they need it,” wrote an Innit spokesperson. “It’s been a big year for food already … and this marks a breakthrough for food tech, as we move toward the grocery store of the future.”

The new feature follows a Lens capability that highlights top meals at a restaurant and a partnership with Wescover that supplies information about art and design installations. Lens also recently gained the ability to split a bill or calculate a tip after a meal, to overlay videos atop real-world publications, like Bon Appetit, and to read signs and other text for people who can’t read or don’t understand the printed language.

Google Lens began as a feature exclusive to Pixel smartphones, but it quickly spread to Google Photos and now ships onboard flagship smartphones from companies like Sony and LG.

The growing list of things Lens can recognize spans over 1 billion products from Google Shopping, including furniture, clothing, books, movies, music albums, and video games. (That’s in addition to landmarks, points of interest, notable buildings, Wi-Fi network names and passwords, flowers, pets, beverages, and celebrities.) Lens can also surface stylistically similar outfits or home decor, and it can read words in signage and prompt you to take action on them. Perhaps most useful of all, it’s able to extract phone numbers, dates, and addresses from business cards and add them to your contacts list.

At its I/O keynote back in May, Google took the wraps off a real-time analysis mode for Lens that superimposes recognition dots over actionable elements in the live camera feed, a feature that launched first on the Pixel 3 and 3 XL. Lens not long ago came to Google image searches on the web, and more recently Google brought Lens to iOS through the Google app and launched a redesigned experience across Android and iOS.