Meta Connect 2024 has just finished, and during the event Mark Zuckerberg and other Meta executives announced some very important news concerning XR and AI. Of all these announcements, I selected the 5 most important ones for me, and I want to briefly describe them to you, explaining also why they could be relevant for the future of XR. Are you interested?
(In the next few days, a more complete roundup of the event is coming, subscribe to my newsletter so as not to miss it!)
1. Orion AR glasses
We already knew that Meta was going to show the first version of its AR glasses, but seeing them revealed for real was pretty exciting. These glasses, codenamed Orion, have advanced features for the current state of the technology: more than 70° of FOV, a weight below 100 grams, and no connection wires. The glasses are not fully standalone: to be this small, they require a compute pack that you have to put in your pocket, but the good thing compared to other devices like the Vision Pro is that the connection is fully wireless.
Even if the glasses are small and lightweight, thanks to a lot of custom chips that Meta designed, they support hand tracking, eye tracking, and environment understanding. Some video shots revealed by Meta showed a full operating system running on the glasses, and even the possibility of meeting other people in multiplayer, with everyone represented by Codec Avatars. The Codec Avatars appeared a bit choppy in these videos, which means that there is still some work to do in squeezing performance out of these glasses.
Meta Orion uses the Neural Wristband as an input device. The wristband detects the electrical impulses that the brain sends to the muscles, so you just have to make micro gestures with your hands, even with your arms fully at rest, and you can interact with the glasses in a very comfortable way. The wristband already looks very lightweight and sleek: a very good form factor for such advanced technology.
These glasses are not going to be on sale because Meta had to use very expensive solutions to make them work: for instance, the lenses are not made of glass, but of silicon carbide, a highly refractive material that enables their wide field of view. Only internal Meta employees and some selected partners will have access to this device. But Meta said that it intends to keep working on this product so that in the future it can release similar glasses at a more consumer-friendly price. Rumors point to 2027, but Meta has promised nothing in this regard.
I think this was the most exciting news of the event. Meta prepared this presentation very well, and it was able not only to sell its vision of AR glasses but also to show that it can actually build them. Orion is not just a tech demo, it's a working product. Some famous people like Gary Vee or Jensen Huang have tried the glasses and said on camera that they were impressed by them. While Apple is selling the Vision Pro as the most advanced piece of technology ever created by Apple, Meta showed people that it is ready to go to the next generation, that it is able to deliver the next technological platform: lightweight wireless AR glasses. I'm pretty sure that anyone who loves technology will see in these hours that Meta is proving AR glasses are a reality. That's pretty good for creating awareness around XR.
I have to say that as an XR enthusiast, seeing a real pair of AR glasses showcased by a major tech company after 10 years of waiting was a bit emotional. It means that even if the road is still long, we are getting closer to our dream of widespread daily use of immersive realities.
2. Quest 3S
Meta has launched Quest 3S, the most affordable mixed-reality headset out there. This device has the same lenses and display as Quest 2, but the computational power and the mixed reality capabilities of Quest 3. Its price is $299.99 for the 128GB SKU and $399.99 for the 256GB SKU. Preorders are already open and shipping starts on October 15th. People buying this device get the Batman: Arkham Shadow game for free. I just preordered my unit and it should ship on October 17th.
Meta has also announced a bunch of new games coming to the headset, like Just Dance VR and titles from Triangle Factory, plus screen streaming from Windows 11 laptops to Quest, a YouTube co-watching experience, and Dolby Atmos support.
Zuck fired shots at Apple, defining Quest 3 and Quest 3S as “the best family of mixed reality devices out there, PERIOD”. It's fun to see him always trying to provoke Apple, and it's even funnier that Apple always ignores him…
We already knew everything about this device, so nothing about it surprised me. But I still think its launch was cool: Quest 3S is very cheap but powerful at the same time, so it can help a lot in increasing the adoption of XR. The holiday season is close, and the Meta Store is seeing pretty cool games launched for the platform, like Batman: Arkham Shadow, Metro Awakening, and Alien: Rogue Incursion, so I bet many parents will buy Quest 3S as a Christmas gift for their children. The big question is how well this headset will actually sell, because we have pretty high expectations for it.
3. Passthrough APIs
After so much pushing from all of us mixed reality developers, Meta has finally promised to release Passthrough APIs at the beginning of next year! I think it had no choice: after the community found a hack to access the camera frames, the company understood it had better release a proper way to do that instead of letting everyone do it in a hacky way.
This is the piece of news that made me the happiest. I have been pushing with all my heart for months to have this feature released, and finally Meta listened to me and all the other people who insisted on it. I think it's important that we devs have access to the camera frames of the device, because only with the contextual information we can get from them can we create experiences that are fully context-aware. Plus, we can give the camera feed to AI for analysis or use it to perform object tracking. This is the only way to have true mixed reality, which is not just about having virtual elements as a layer on top of physical reality, but about having the two realities blend together into a new and richer reality. I really can't wait to see what developers will be able to build with it.
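To give you an idea of why frame access matters: Meta has not published the actual Passthrough API yet, so any code is speculative, but once a camera frame is available as a plain image buffer, even very simple processing enables object tracking. Here is a minimal, self-contained sketch in Python (the function name and the assumption that a frame arrives as an RGB numpy array are mine, not Meta's):

```python
import numpy as np

def track_colored_object(frame, lower, upper):
    """Return the pixel centroid (x, y) of the region whose RGB values
    fall inside [lower, upper], or None if nothing matches.

    `frame` is assumed to be an (H, W, 3) uint8 RGB array -- the real
    format of Meta's Passthrough API frames is not public yet.
    """
    lo = np.array(lower, dtype=np.uint8)
    hi = np.array(upper, dtype=np.uint8)
    # Boolean mask of pixels whose three channels are all in range
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic "passthrough frame": a red square on a black background,
# standing in for a real object seen by the headset cameras.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[40:60, 70:90] = (200, 30, 30)

print(track_colored_object(frame, (150, 0, 0), (255, 80, 80)))  # → (79.5, 49.5)
```

In a real app you would of course run something smarter than a color threshold (an object detector, or a multimodal model as described above), but the pipeline is the same: grab the frame, analyze it, anchor virtual content to what you found.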
4. Meta AI and Ray-Ban Meta glasses
Meta has talked a lot about AI at this event; in particular, it announced the new Llama 3.2 model, which is multimodal. Ray-Ban Meta glasses are getting support for these new multimodal interactions with Meta AI, so it will soon be possible:
- To interact with the glasses more naturally, without repeating “Hey Meta” every time
- To let the glasses read text and scan QR codes
- To show the glasses some clothes and ask them if the clothes suit you or fit a particular style required for a party
- To ask the glasses to remember things: for instance, you could park your car in the parking lot of a shopping center and ask the glasses to remember where it is
- To have live translation while speaking with someone in another language like Spanish, French, or Italian. The glasses can hear people speaking in another language and output the English translation to your speakers. Then you can speak in English, and the companion app of the glasses can translate what you say into the language of the other person. This is great when you are visiting a foreign country, of course.
Meta is also partnering with Be My Eyes to use these AI capabilities of the glasses to help blind people.
More integration of Ray-Ban Meta glasses with AI was something I mentioned in my predictions for this Connect, and Meta did not disappoint me. It was just the natural evolution of today's technology. Meta also announced that Ray-Ban Meta glasses are now compatible with the Transitions lenses made by Luxottica, so these glasses can be used both outdoors as sunglasses and indoors as transparent frames (or as prescription glasses, if you need them). This means people can use Ray-Ban Meta everywhere, and not just outside their homes.
Zuck defined smart glasses as “a new AI device category” and I totally agree with him: these fashionable sunglasses may be the Trojan horse through which we start accessing the support of AI assistants in our everyday life. This is invaluable, but at the same time it opens up a can of worms when it comes to privacy: if this AI assistant is able to analyze what I have around me and can also remember things, it means that Meta AI is probably collecting information about me…
5. Hyperscapes
You were probably expecting a mention of the new Meta Avatars as the 5th feature, and I was in fact close to doing that, especially because of the AI editor that next year will let you create whatever avatar you want just by describing it. But I decided to do something different, so I selected Hyperscapes as the 5th main piece of news.
Mark Zuckerberg showed that Meta is working on a system that lets you scan the environment around you using a mobile app and then enter a high-fidelity reconstruction of this space yourself, or let other people you invite visit it, too. Hyperscapes are not a thing of the future: the mobile app to scan environments is already available in beta in the US.
I love the idea of Hyperscapes because for a while now I have thought that a social medium made of 3D spaces will become a reality in the future. The idea of Hyperscapes is exactly that: just as today I take a picture of something and share it on social media, in the future I will be able to scan a place I am in and let other people visit it in immersive reality. To me, it looks incredibly neat. And I'm a bit sorry for Varjo, which was working on something similar.
These have been my 5 main highlights of Meta Connect 2024. What were yours? And why? Let me know in the comments below or on my social media channels!
(Header image by Meta)
Disclaimer: this blog contains advertisement and affiliate links to sustain itself. If you click on an affiliate link, I’ll be very happy because I’ll earn a small commission on your purchase. You can find my boring full disclosure here.