Imagine silently mouthing something to yourself, and your AI assistant knows what you’re trying to say. It could work via your glasses, your earbuds, or your phone’s camera. Apple just purchased a company called Q.ai that’s trying to do exactly this. That sounds weird and sci-fi, and yet to me, as someone who’s been looking at smart glasses and wearables for a long time, it sounds very familiar, too.
The Vision Pro can already track face movements, but it can’t convert lip movements into speech.
Part of a new interface system for wearables and glasses?
I just wrote about how Apple is already showing signs of moving toward an ecosystem of connected AI wearables: pins, glasses, earbuds, watches, or some combination thereof. Any of these wearables could potentially use what Q.ai is developing. Headphones and glasses sound like the two most likely candidates, though, and with reports that the next generation of AirPods will have infrared cameras onboard, the pieces look even more ready to connect.