Meta (formerly Facebook) will reportedly let users of its next-generation Ray-Ban Stories smart glasses livestream video to viewers who can talk back to them.
Internal documents obtained by tech journalist Janko Roettgers revealed that the second-generation Ray-Ban Stories will not only allow users to stream video directly to Facebook and Instagram but will also allow viewers to whisper in their ear, reports The Verge.
“Users will be able to livestream directly to Facebook and Instagram with the device. There’s no word on support for other services at this point,” Roettgers wrote in a post on Lowpass.
“Live streamers will be able to directly communicate with their audience, with the glasses relaying comments via audio over the built-in headphones,” he added.
The current-generation Ray-Ban Stories can capture photos and short video clips but do not support livestreaming.
According to the report, the next version of the smart glasses will come with “improved battery life and better cameras”.
The smart glasses may also include adaptive volume control and support for additional audio services.
“To improve the overall audio experience, Meta is looking to bring adaptive volume control to its smart glasses. With this feature, the glasses will automatically monitor the ambient noise level, and increase playback volume in noisy surroundings,” Roettgers said.
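The report does not describe how the feature would be implemented, but the behaviour it outlines maps onto a simple control loop: sample the ambient noise level from the microphones and raise playback volume as the surroundings get louder. The Python sketch below is a hypothetical illustration of that idea only, not Meta's implementation; the thresholds, smoothing factor, and dB conversion are all assumptions.

```python
import math

# Hypothetical parameters -- Meta has not published how adaptive volume
# control works on the glasses; these values are assumptions.
QUIET_DB = 40.0   # ambient level treated as "quiet"
LOUD_DB = 80.0    # ambient level treated as "very noisy"
MIN_VOLUME = 0.3  # playback volume floor (0.0-1.0 scale)
MAX_VOLUME = 1.0  # playback volume ceiling
SMOOTHING = 0.2   # smoothing factor so the volume does not jump around

def ambient_level_db(mic_samples):
    """Rough loudness estimate (dB-like scale) from microphone samples in -1..1."""
    rms = math.sqrt(sum(s * s for s in mic_samples) / len(mic_samples))
    return 20 * math.log10(max(rms, 1e-9)) + 90  # crude shift toward an SPL-like range

def target_volume(ambient_db):
    """Map ambient noise linearly onto the allowed playback volume range."""
    t = (ambient_db - QUIET_DB) / (LOUD_DB - QUIET_DB)
    t = min(max(t, 0.0), 1.0)
    return MIN_VOLUME + t * (MAX_VOLUME - MIN_VOLUME)

def update_volume(current_volume, mic_samples):
    """One step of the control loop: nudge playback volume toward the target."""
    desired = target_volume(ambient_level_db(mic_samples))
    return current_volume + SMOOTHING * (desired - current_volume)
```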
The current-generation Ray-Ban Stories feature integrated stereo speakers and can be used as a Bluetooth headset. The glasses also offer a more direct integration with Spotify, letting users skip tracks and perform other actions simply by tapping the frame.
Meta is looking to bring similar features to other music services, but it is not confirmed which service will be next, according to Roettgers.
Meanwhile, Meta has launched a new all-in-one, multilingual, multimodal AI translation and transcription model that supports up to 100 languages, depending on the task.
Called ‘SeamlessM4T,’ the single model can perform speech-to-text, speech-to-speech, text-to-speech, and text-to-text translations.
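As a rough illustration of what "one model, many tasks" means in practice, the sketch below shows a text-to-text translation call, assuming the checkpoint published through the Hugging Face transformers integration of SeamlessM4T; the model ID, the availability of that integration, and the exact call pattern are assumptions and are not part of Meta's announcement.

```python
# Minimal text-to-text translation sketch using the Hugging Face transformers
# integration of SeamlessM4T. Model ID and API usage are assumptions.
from transformers import AutoProcessor, SeamlessM4TModel

model_id = "facebook/hf-seamless-m4t-medium"  # assumed checkpoint name
processor = AutoProcessor.from_pretrained(model_id)
model = SeamlessM4TModel.from_pretrained(model_id)

# English ("eng") to French ("fra") translation; generate_speech=False
# asks the same model for text output instead of synthesized audio.
inputs = processor(text="Smart glasses can now livestream.",
                   src_lang="eng", return_tensors="pt")
output_tokens = model.generate(**inputs, tgt_lang="fra", generate_speech=False)
print(processor.decode(output_tokens[0].tolist()[0], skip_special_tokens=True))
```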