It looks like Apple is moving toward AI gadgets, and if a recent Bloomberg report is any indication, many of them will have one thing in common: they will use "Visual Intelligence." If you're not familiar with Apple's branding, "Visual Intelligence" is the company's version of computer vision – an AI feature that gives gadgets "sight," so to speak.

According to Bloomberg, Apple wants Visual Intelligence to be a defining feature across a range of hardware, including a new generation of AirPods with cameras, Apple's first pair of smart glasses, and even an AI pendant that is strangely reminiscent of Humane's failed AI Pin. What might computer vision actually do in those gadgets? Well, apparently, exactly the same thing it does in other gadgets. Per Bloomberg:

“…the most basic applications might include picking up a plate of food and identifying objects and ingredients. More advanced uses include the device giving specific instructions to perform a task based on what it sees. This could mean advanced turn-by-turn directions, in which the device simply asks the user to cross a specific landmark rather than a certain number of feet. The technology could also remind users to do something when they walk up to a certain object or location.”

If you're at all familiar with computer vision and how it works in gadgets like smart glasses, you've probably read the above and felt a little disappointed. Computer vision is a defining feature of Meta's popular Ray-Ban Meta AI glasses, and it can be used for many things, like translating text on food menus, identifying objects in your environment, and giving you instructions for a recipe while cooking. While I will admit that the navigation use case would be new, it seems like Apple is on the same path as Meta and other companies squeezing computer vision capabilities into their hardware.

Whether Apple will have any more success in making computer vision—er, Visual Intelligence—work in AI gadgets is anyone's guess. While computer vision is arguably one of the more futuristic and innovative features of smart glasses, it is also one of the least reliable and, at times, least applicable to daily use. In my experience with the Ray-Ban Meta AI glasses, computer vision has a habit of getting things wrong (you can read my review of the Meta Ray-Ban Display for specific examples), making it harder to trust and, consequently, even harder to incorporate into your daily life. I still think the technology can be great for accessibility purposes, but that's not exactly what Apple is offering here.

Although it's likely that Apple is working toward some kind of breakthrough on the computer vision front that would make Visual Intelligence more reliable and useful, it has shown little progress so far. As Bloomberg notes, for example, the existing Visual Intelligence features inside iOS rely mostly on OpenAI's ChatGPT and, in the near future, Google's Gemini. In my experience, the models those companies offer are just as unreliable as the rest.

A lot can happen between now and when Apple finally decides to launch its own AI-focused hardware (later this year at the earliest), but for now, it appears AI gadgets are stuck on how and when computer vision can be used — or at least stuck on making those scenarios functional. Apple's approach to Visual Intelligence may seem a little more utilitarian than OpenAI's reported camera-equipped smart speaker, but that's a fairly low bar.
