Ray-Ban Meta smart glasses are getting a few upgrades. The company will add real-time information to the on-board assistant and begin testing new “multimodal” features that will allow the assistant to answer questions about the user’s environment.

Until now, Meta AI only had access to information up to 2022, so it couldn’t answer questions about current events. That is now changing, according to Meta CTO Andrew Bosworth, who says all Meta smart glasses in the United States will now be able to get real-time updates.

Meta is also beginning to test one of its assistant’s more interesting abilities, which it calls “multimodal AI.” First introduced during the Connect conference, the feature allows Meta AI to answer contextual questions about your surroundings. Unfortunately, it will likely be some time before most people with the smart glasses have access to the new multimodal features. Bosworth said the beta version will initially be available only to “a small number of people” in the US, with widespread access likely not coming until 2024.

Mark Zuckerberg shared several videos that illustrate the new multimodal features. In one, Zuckerberg asks the AI to look at a shirt he’s holding and suggest pants that might go with it. He also shared screenshots of Meta AI identifying a piece of fruit in a photo and translating the text of a meme.

Source: www.engadget.com