Google Debuts Smart Glasses Built for Real-Time Language Translation
A successor to Google Glass is in the works, and this time it’s focused on language translation.
Google’s smart glasses, unveiled today, are a prototype designed to transcribe, translate, and then display what someone is saying on the lenses, all in real time.
“What we’re working on is technology that enables us to break down language barriers,” Eddie Chun, Google director of product management, said in a video about the prototype.
The device essentially uses Google Translate to churn out translated subtitles. Google also showed that a deaf person could use the smart glasses to understand people who haven’t learned sign language.
The company debuted a video of the prototype at its virtual Google I/O developers conference, while talking up its attempts to explore augmented-reality devices.
Google CEO Sundar Pichai described the technology as a new frontier in computing. “These AR capabilities are already useful on phones, and the magic will really come alive when you can use them in the real world without the technology getting in the way,” he said in a blog post.
However, Google didn’t reveal much else about the product, including its name, the specs, or how the subtitles look on the lenses. Instead, the video of the prototype merely provides a “simulated” point of view of the experience.
Still, the device signals that Google has big plans for the smart glasses space. The original Google Glass was introduced back in 2012 with much fanfare, but interest eventually fizzled out due to a myriad of issues, including its high $1,500 price. The company has since pivoted to selling a version of Google Glass aimed at enterprises.