People probably wouldn’t forget something at the store if they taped a grocery list to their eyeballs, not that I’m recommending that. But a new prototype from smart contact lens maker Mojo would do something pretty similar.
Today, Mojo announced a potential feature that would integrate Alexa Shopping Lists as an application on Mojo Lens, calling it the “first major third-party consumer application on a smart contact lens.” Take that, paper grocery lists.
A user would be able to access the Alexa Shopping List in their field of view, ask Alexa to add or remove items, and check off groceries as they grab them, all using only their eyes (sorry, hands). Their hands would be free to carry a basket or rub their irritated eyes while forgetting there are smart lenses in them.
If someone at home just finished the last of the milk, they could also remotely add it to the list, and it would appear in your Mojo Lens as you shook your head.
“At Amazon, we believe experiences can be made better with technology that is always there when you need it, yet you never have to think about it,” said Ramya Reguramalingam, GM, Alexa Shopping List.
“We’re excited that Mojo Vision’s Invisible Computing for Mojo Lens, paired with the demonstration of Alexa Shopping List as a use case, is showing the art of what’s possible for hands-free, discreet smart shopping experiences.”
To be clear, this is just an early test and won’t be available next week or anything. The Mojo smart contact lenses themselves are still in early development, so the company will have to figure that part out first. But demonstrations suggest Mojo hopes to deliver an eye-controlled user interface that augments everyday activities, like viewing trail information while out in nature or pulling up talking points during a presentation.
The idea seems to be to make it look like you know what you’re doing, without anyone noticing that you’re looking things up.
In any case, you probably won’t need a grocery list to remember to buy eye drops while wearing a smart contact lens.