Unity, MRTK, Needle Tools and 8th Wall are just some of the tools you'll need to develop!
The Meta Quest Pro is available now and we have already seen some very cool things being teased from developers leading up to its launch. Of course, in order to develop, you need an amazing set of tools. Our Head of R&D, Dilmer Valecillos, took a deep dive into some of the top development tools you can use with your Quest Pro headset to develop and launch your own XR experiences.
In the video, Dilmer gives us a comparison between the Quest Pro and the recently released Magic Leap 2 MR headset. He also gives us some perspective on color passthrough and what it will mean not only for the Meta Quest Pro, but for XR in general.
Some of the tools mentioned here are Unity, Unreal, MRTK 3, Needle Tools, 8th Wall, and Mozilla, along with a brief overview of how to use them for the Quest and deploy your builds.
All of these tools, along with others, will be essential for developing incredible XR experiences on the Quest Pro headset that you can bring into your workforce as a training platform, or use for social events and entertainment.
You can expect an even deeper dive into the Meta Quest Pro in upcoming posts here on our blog and on our YouTube page.
Create a roadmap to Industry 4.0 by adopting a sequence of technology upgrades while also removing outdated systems.
By Bobby Carlton
The rise of Industry 4.0 has created new opportunities for manufacturers to improve their efficiency and deliver new revenue streams. Through the use of advanced analytics, XR solutions and machine learning, companies are able to collect and analyze crucial data to improve their operations.
The benefits of implementing advanced analytics and machine learning are numerous, such as improving product quality and reducing production downtime. However, implementing these technologies at scale can be a challenge for some due to a lack of engagement and legacy operations.
The rapid emergence and evolution of Industry 4.0 has created a huge opportunity for companies to improve their efficiency. According to a report by Statista, the market for advanced analytics and machine learning is expected to reach $1 trillion by 2028.
The increasing interest in sensor networks is due to their ability to create feedback loops that improve the efficiency of manufacturing operations. This process can help identify “hidden factories”: bottleneck points or costly problems that seem minuscule but, over the long run, slow down production.
At the same time, you’re able to use the technology for predictive maintenance, explore what-if scenarios, and reduce operating costs.
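The predictive maintenance feedback loop described above can be sketched in a few lines of code. Below is a minimal, illustrative TypeScript example — all function names, baselines, and thresholds are hypothetical, not from any particular vendor — that flags a machine for maintenance when a rolling average of its sensor readings drifts above a learned baseline:

```typescript
// Minimal predictive-maintenance sketch: flag a machine when the rolling
// average of its sensor readings drifts too far above a learned baseline.
// All names and thresholds here are hypothetical, for illustration only.

function rollingAverage(readings: number[], window: number): number {
  const recent = readings.slice(-window);
  return recent.reduce((sum, r) => sum + r, 0) / recent.length;
}

function needsMaintenance(
  readings: number[], // e.g. vibration amplitude samples from a sensor
  baseline: number,   // normal operating level learned from history
  window = 10,        // how many recent samples to average
  tolerance = 0.2     // allowed drift: 20% above baseline
): boolean {
  if (readings.length < window) return false; // not enough data yet
  return rollingAverage(readings, window) > baseline * (1 + tolerance);
}

// A steady machine stays below the threshold; a drifting one trips it.
const steady = Array(20).fill(1.0);
const drifting = [...Array(10).fill(1.0), ...Array(10).fill(1.5)];
console.log(needsMaintenance(steady, 1.0));   // false
console.log(needsMaintenance(drifting, 1.0)); // true
```

Real deployments would replace the fixed threshold with a trained model, but the shape of the loop — collect, aggregate, compare, alert — is the same.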
Despite the widespread adoption of Industry 4.0, many companies fail to collect and analyze data quickly enough. This is because they tend to deploy the technology faster than their teams can absorb and use it.
The biggest barriers to the implementation of Industry 4.0 are the legacy mindsets of employees and operations practices. Despite the significant investment in new technology, these practices still prevent companies from fully embracing the potential of Industry 4.0.
One of the core challenges industries face in embracing the potential of Industry 4.0 is the lack of standardization. This is because there are many different ways of working that make it hard to identify the most effective ways to improve productivity and reduce risk.
Some employees will revert to the old ways of working when new innovations are introduced. This is because they don't trust the new technology such as robotics or XR, and are afraid to take on the new challenge. The value of implementing new technology is not realized until the old methods are phased out.
One of the biggest factors that prevents companies from fully embracing the potential of Industry 4.0 is the lack of preparation. Although many companies have the necessary resources to implement advanced software and sensors, they underestimate the training requirements of their workforce and lack a strategy to blend new workflow methods into traditional approaches.
A great example of this is introducing VR headsets into the workforce. You need to consider what the strategy is for doing this with the least amount of disruption and the least amount of employee alienation.
A plan should be developed to avoid the trap of buying technology that's not being used properly or doesn't deliver the desired results. You should consider a more measured approach to the transition process that involves addressing cultural norms and systems thinking.
This step will help companies develop a comprehensive plan that will guide their efforts in implementing Industry 4.0. It will also help them identify the most effective ways to improve their internal processes through technology.
One of the most important factors that employers should consider when implementing Industry 4.0 is identifying the most critical problems that will drive their transformation efforts. Unfortunately, many companies simply install networks made up of hundreds of sensors and then try to solve problems with a solution that wasn't designed for them.
Start small. For example, instead of creating a digital twin of an entire factory floor complete with interactivity, high-fidelity 3D art, and avatars, start by building out a single room or section. Take a basic LiDAR scan to create the virtual environment and use that as a foundation to introduce your employees to it. From there, you can scale up and build on those successes.
Take a moderated approach that engages employees in identifying the most critical issues driving the transformation effort; this exercise can help you pinpoint the areas where improvements can be made. Another important factor for employers to consider is putting processes in place that reduce the time it takes to implement the technology.
Before implementing Industry 4.0, it is important that employers adopt a sequence of technology upgrades while also removing outdated systems. With the help of advanced algorithms, sensors, and cloud platforms, workers can gain new insights. Unfortunately, once they encounter problems or outdated methods of working, they will revert to their old ways of doing business. So it needs to be a commitment.
To help the workforce adapt to the new technology, companies should introduce incremental and tightly scoped initiatives. Doing so will allow them to easily digest the changes and improve their performance. However, it is also important to remove outdated systems to prevent your teams from returning to their old ways of doing business.
New virtual tools for the workforce, avatars, and the Meta Quest Pro
Meta made some big announcements today that gave us a bigger and brighter view of their goals with XR technology. On top of new avatar technology, new tools for the virtual workspace, and their Horizon Worlds platform opening up to be WebXR accessible, the company also officially launched its much-anticipated Meta Quest Pro VR headset.
A huge social shift has taken place. A few years ago, the majority of time spent in VR was spent alone. Today, the vast majority of time spent in Meta Quest is in multiplayer experiences and the top apps in the Meta Quest Store are social.
"We’re seeing that many others also believe in this future. They’re living it, building new worlds, hanging out in VR, and exploring what’s possible," said Meta CEO, Mark Zuckerberg. "The VR ecosystem is thriving as social interaction powers the success of the most popular experiences, and now, we’re shipping our first high-end mixed reality headset, a key moment on the road to augmented reality."
Meta Quest Pro is the first in our new high-end line of devices, featuring the most advanced VR technology we’ve ever shipped. It’s built for collaboration, creativity, and getting things done.
Meta Quest Pro is packed with innovative new features including high-res sensors that enable full-color mixed reality. Thin pancake optics push the limits on state-of-the-art visuals while enabling a sleeker, more balanced headset design. Completely redesigned self-tracking controllers with upgraded haptic feedback help every gesture in VR feel second-nature. And with eye tracking and Natural Facial Expressions, your avatar can reflect your facial movements and make eye contact more naturally, so you can be authentically you while connecting with friends and colleagues. With all this and more, Meta Quest Pro expands what’s possible in VR.
Full Color Mixed Reality: Breakthrough high resolution mixed reality lets you engage with the virtual world while maintaining presence in your physical space in full color. Meta Quest Pro’s high-res outward-facing cameras have 4X the number of pixels as Meta Quest 2’s cameras, enabling the headset to recreate the environment around you in VR with greater fidelity. Meta Quest Pro features stereoscopic mixed reality passthrough, which combines multiple sensor views to create a natural view of the world in 3D. Compared to monoscopic passthrough solutions, this results in a higher quality and more comfortable customer experience with better depth perception and fewer visual distortions for both close-up and room scale mixed reality scenarios.
Next Generation Optics: With thin pancake lenses, increased pixel density, local dimming, and quantum dot technology, you will see the world in a whole new way. Whether you are connecting with friends, reading, multitasking, or playing games, you will immediately notice better visual clarity. Pancake optics also let us reduce the depth of the optical stack by 40% compared to Quest 2.
Self Tracking Controllers: With our first-ever self-tracking controllers that are more balanced to hold and provide improved haptic feedback, VR has never felt this intuitive and second nature – like the controllers are extensions of your hands in VR. The self-tracked design allows for a full 360-degree range of motion, while TruTouch Haptics include new localized and VCM haptics upgrades. Fine motor controls include precision pinch motion and joystick for increased control of every gesture, plus a stylus tip for sketching and whiteboarding. With built-in rechargeable batteries and an integrated charging solution with the Meta Quest Pro Charging Dock (included with headset purchase), your controllers will always be ready to go.
Premium Design: World class counter-balanced ergonomics in our sleekest design ever, for a more comfortable headset that easily provides a great fit so you can focus on the experience. Features our first-ever curved cell battery at the back to better conform to the shape of the head.
More Natural Avatar Expression: Bring your personality into virtual experiences with Natural Facial Expressions and eye tracking. Be more of yourself with avatars that mimic your facial expressions and eye contact, so you can feel like you are more present in meetings and gatherings – letting you connect with friends and colleagues almost as if you were physically there.
High Performance Hardware: With 256GB storage, 12GB RAM, 10 advanced VR/MR sensors, spatial audio and more, Meta Quest Pro has the hardware to deliver great VR experiences. It’s also the first device to use the new Snapdragon XR2+ processor which is optimized for VR to run at 50% more power than Quest 2 with better thermal dissipation, resulting in significantly better performance.
Meta Quest Pro will be available on Oct. 25 for $1499.99 USD. Pre-orders open today.
Additional Supporting Features:
Backwards Compatibility: Access the Meta Quest 2 app catalog to enjoy your favorite games, entertainment apps and more.
Multitasking Made Easy: You now have a new superpower. Pop open multiple resizable screens so you can organize tasks, work on new ideas, stream your feeds, or message with your friends – all while staying present in your physical space (like your office or desk) with mixed reality. Multitasking has never been easier.
Comes with Charging Dock: The companion Charging Dock is designed for effortless system charging with a rapid 45W adapter included to keep Meta Quest Pro and controllers ready whenever your creativity strikes.
Flexible Levels of Immersion: VR is no longer one size fits all. Meta Quest Pro offers adjustable levels of VR immersion from fully open peripheral vision, to partially blocked (enabled by light blockers included in-box) or fully immersed (accessory sold separately).
Unique Default Home Environment: Invite friends and family to your virtual personal space where you can enjoy an all-new Home environment designed especially for Meta Quest Pro.
Autodesk: At Connect last year we spoke about the potential for creative people to work in true 3D, and next year we’re going to have a great new option for that. Autodesk is updating their collaborative design review app to take advantage of the new possibilities unlocked by Meta Quest Pro. This will give architects and designers a new way of reviewing 3D models immersively.
Adobe: We’ve also been working with Adobe to help people get more done in VR. Next year, Adobe’s Substance 3D apps for professional 3D creators, designers, and artists are coming to Meta Quest Pro and Meta Quest 2, so anyone can model 3D objects and join collaborative reviews inside VR with our controllers. And Adobe will bring Adobe Acrobat to the Meta Quest Store, enabling PDF document viewing, editing, and collaboration – major advances for productivity in VR.
Microsoft: Meta is partnering with Microsoft to evolve how we work in VR using Meta Quest Pro and Quest 2. We announced:
Microsoft Teams immersive meeting experiences for Meta Quest: Connect, share, and collaborate in Teams immersive experiences.
Microsoft Windows 365 for Meta Quest: Stream the Windows experience on Quest Pro and Quest 2 devices, and access your personalized apps, content, and settings in VR.
Microsoft 365 app experiences for Meta Quest: Interact with 2D content from SharePoint or productivity apps like Word, Excel, PowerPoint and Outlook directly from Meta Quest Pro and Meta Quest 2.
Microsoft Teams/Workrooms integration: Join a Teams meeting from inside Workrooms.
Meta Avatars in Microsoft Teams: Use your Meta Avatar in Teams for whiteboarding, brainstorming, and meetups.
Microsoft Intune and Azure Active Directory support for Meta Quest: Enable enterprise security and management on Quest Pro and Quest 2 devices.
Accenture: Accenture is partnering with Meta and Microsoft in the coming year to develop new experiences to help companies leverage VR and transform the way they engage employees, interact with customers, or even create products and services in the metaverse. Over the last two years, Accenture has deployed 60,000 Quest 2 headsets to help onboard new joiners, while its “Nth Floor” virtual campus, co-created with Microsoft, has welcomed 150,000 people.
Quest for Business: Next year we’ll launch Quest for Business, a subscription bundle for Meta Quest Pro and Meta Quest 2 that includes a suite of features like device and application management, premium support and more. It will also unlock access to the Microsoft Windows 365 app and Microsoft Intune and Azure Active Directory. Quest for Business is currently in Beta and will be priced to suit the needs of different businesses, and more details will be shared over the coming year.
Meta Quest Pro was designed with productivity in mind, and will be a major upgrade for those who use VR as a tool for work—but hardware is only part of the equation. We’re equally focused on creating software that improves the way you work and collaborate, on both Meta Quest Pro and Meta Quest 2. Horizon Workrooms is our VR space for teams to connect and collaborate. At Connect we announced the latest set of productivity and collaboration features:
More expressive avatars: By utilizing Quest Pro’s inward-facing sensors that power eye tracking and Natural Facial Expressions features, we can now generate more authentic, natural avatars in real time. And because avatars can now show non-verbal cues like eye contact and facial expressions, meetings in Workrooms will give you a much greater sense of “being there” than traditional video calls.
Breakout groups: We’ve introduced the ability for teams to shift from a big group presentation into smaller discussions in the same room for more active brainstorming. With spatial audio this means you can hear your breakout group clearly and ambient discussions elsewhere in the room.
Sticky notes for whiteboard: Right now you can turn any free space in your physical environment into a virtual whiteboard. We’re introducing the ability to add sticky notes to the whiteboard for brainstorming and collaborating. And because it’s persistent, it will all still be there the next time anyone on your team logs in to that Workroom.
Personal Office environment with multi-screens: The solo Workrooms experience is also getting a major update. We’re shipping the ability to personalize your office in VR, with four types of personal offices and most importantly, the ability to spawn three massive virtual screens on your desk. These changes, combined with color passthrough, will turn your headset into a portable office and one day may even be able to replace your monitor.
Zoom integration: Workrooms is evolving all the time. Early in 2023, you’ll be able to join Workrooms via Zoom, enabling more options on how you choose to show up.
3D models: We’re also working on an option to review 3D models in Workrooms, which we believe will be game-changing for designers, architects and creatives.
Magic Room: At Connect, we also showed a sneak peek of a mixed reality experience we're building that lets any mix of people – some together in a physical room and some remote – collaborate in the same room together. Workrooms is already a great solution for fully remote teams, but the goal of this effort is to make collaboration easier and more productive for any team, including hybrid teams where some people are physically together and some are not. We're already experimenting with Magic Rooms at Meta, and we hope to make the experience widely available next year.
We’re seeing incredible progress across the Meta Quest ecosystem. It was only 2 years ago that we announced Meta Quest 2 at Connect. Since then, it’s become the first virtual reality device to break out into the mainstream with a whole ecosystem emerging around it. This is great news for developers, who are finding success on the platform.
Of the over 400 apps in the Meta Quest Store, roughly 1 in 3 are making revenue in the millions.
33 titles have made over $10M in gross revenue, up 11 more titles from our announcement in February, and the number of apps that have made over $5M in gross revenue has doubled since last year, now at 55.
The Walking Dead: Saints & Sinners has surpassed $50M in revenue on Quest alone, nearly double its revenue on all other platforms.
In its first 24 hours, Resident Evil 4 made $2M on the Quest Store.
It took only 24 hours for Zenith: The Last City to make its first $1M.
Blade & Sorcery: Nomad made its first $1M in just two days.
Bonelab from Stress Level Zero made its first $1M in less than an hour when it launched last month. Bonelab also now holds the record as the fastest-selling app in Quest history.
To date, over $1.5BN has been spent on games and apps in the Quest store.
And Meta Quest Store digital gift cards are now available, allowing people to gift apps, games, and experiences to friends and family, and enabling developers to take advantage of gifting opportunities and seasons. (Now available online in increments of $15, $25, and $50 at Walmart, GameStop, and Best Buy; coming soon to Amazon and Target.)
Since it launched last year, there are now over 2,000 apps in the App Lab.
The Quest 2 ecosystem in gaming is thriving. Developers are already making amazing games and the next few months are going to see some of the most exciting releases yet.
Marvel’s Iron Man VR is coming to Meta Quest 2 on November 3 from the talented teams at Camouflaj and Endeavor One, along with our partners at Sony Interactive Entertainment and Marvel Entertainment. Take to the skies in Tony Stark’s iconic Iron Man armor. Travel the world and face off against Ghost, a hacker with a dangerous drone army and a mysterious connection to Stark’s past. Packed full of deep-cut references for longtime Marvel fans, this is the closest you can get to inhabiting Iron Man’s armor, short of building your own.
POPULATION: ONE Sandbox: Multiplayer phenomenon, POPULATION: ONE, is evolving in a big way this December. With the debut of POPULATION: ONE Sandbox–a robust, user-generated game creator limited only by your imagination–you can take the action anywhere: a moon base, a Viking village, or a fight with swords in zero-g. If you're not a creator, no worries, you can discover and play through new or featured community games with your friends. The team at BigBox VR is excited to bring endless replayability to POPULATION: ONE.
Behemoth: The team that designed the brutal combat of The Walking Dead: Saints & Sinners has also been working in secret on a new project. Skydance Interactive revealed the game’s name, Behemoth, and showed off a glimpse of this grim new world via a new teaser video. Behemoth is being built from the ground up to showcase everything Skydance Interactive has learned about what it means to fight for your very survival in VR, and will release for Meta Quest 2 in late 2023.
Xbox Cloud Gaming (Beta): We're partnering with Microsoft to bring Xbox Cloud Gaming (Beta) to the Meta Quest Store in the future. You'll be able to hook up an Xbox controller to your headset and play console games from the Xbox Game Pass Ultimate library on a massive screen via Quest. We’re not announcing a launch date for Xbox Cloud Gaming on the Quest Store, but we’ll share more details as soon as possible.
Every month, millions of people turn to VR to get physically active using fitness apps like LES MILLS BODYCOMBAT, FitXR, and Supernatural. People stick with it too, with around 90% of paid subscriptions still active after the first month.
Meta Quest 2 Active Pack: On October 25, we’ll be releasing the Meta Quest 2 Active Pack to make exercising in VR more comfortable. Available for pre-order today, the Active Pack will contain adjustable knuckle straps and controller grips, plus a face interface you can easily wipe down after working out. And this is just the start of a wider Made For Meta accessory program we’re launching to bring more accessories to Quest 2, starting next year.
Meta Quest Move Updates: Early next year we’ll be rolling out Meta Quest Move integration, giving people access to more and better metrics in VR, displayed in real-time as part of the Move overlay. Next year we’ll also introduce a new way to share your fitness progress with selected friends, for additional support or a bit of friendly competition.
Fitness API Beta: This fall we’re sharing the Fitness API Beta for select developers. It lets people share both real-time and historical fitness data, unlocking things like custom stats or new levels based on progress.
Supernatural: At the end of October, Supernatural is launching knee strikes as an explosive new way to activate your core and lower body during workouts, while developing balance and coordination.
Gym Class - Basketball VR is making the move from App Lab to the Meta Quest Store later this fall.
Horizon Worlds on the Web: The metaverse should be accessible across all kinds of screens and devices, and you should be able to jump in from anywhere. We're developing Meta Horizon Worlds on the Web, so anyone will be able to access Meta Horizon Worlds from their mobile devices or computers. Eventually, people will also begin to see early bridges from Meta technologies to Meta Horizon Worlds – so for example, soon you will be able to take a video in Meta Horizon Worlds and easily make an Instagram Reel to share that video.
The Horizon Worlds toolkit is expanding beyond VR: You'll be able to use TypeScript, a powerful scripting language, to make more dynamic and interactive worlds. And we're making it so you'll be able to import tri-mesh items into Horizon Worlds, and build parts of your worlds using 3D content creation tools like Maya, Blender and Adobe Substance 3D.
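To give a sense of what event-driven world scripting looks like, here is a minimal TypeScript sketch. To be clear, the names below (`WorldObject`, `onPlayerEnter`, and so on) are hypothetical, invented for illustration — they are not the actual Horizon Worlds scripting API, which Meta has not fully detailed yet:

```typescript
// Hypothetical sketch of event-driven world scripting in TypeScript.
// WorldObject, onPlayerEnter, etc. are illustrative names only; they
// are NOT the actual Horizon Worlds API.

type Handler = (playerName: string) => void;

class WorldObject {
  private handlers: Handler[] = [];
  constructor(public name: string, public visible = true) {}

  // Register a callback to run when a player enters this object's trigger.
  onPlayerEnter(handler: Handler): void {
    this.handlers.push(handler);
  }

  // Simulate a player walking into the trigger volume.
  simulateEnter(playerName: string): void {
    this.handlers.forEach((h) => h(playerName));
  }
}

// A door that opens (becomes invisible) when any player approaches it.
const door = new WorldObject("castle-door");
door.onPlayerEnter((player) => {
  door.visible = false;
  console.log(`${player} opened the ${door.name}`);
});

door.simulateEnter("Alice"); // prints "Alice opened the castle-door"
```

The appeal of a typed scripting language here is exactly this kind of pattern: registering behaviors on world objects with compile-time checks, rather than wiring everything up by hand in a visual editor.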
NBCUniversal: Next year we’ll begin a multi-year collaboration with NBCUniversal, bringing iconic comedy and horror experiences into the metaverse. The Office, Blumhouse, Universal Monsters, Halloween Horror Nights, and much more will come to Meta Horizon Worlds, and you’ll be able to immerse yourselves in these worlds like never before via VR. Last but not least, the Peacock app is also coming to Meta Quest.
YouTube VR - New Social Innovations: In the future, if you’re hanging out with friends in Meta Horizon Home, you’ll be able to bring up YouTube and watch videos together, just as if you were watching together in-person. And we’re working with the YouTube team to make the experience even more flexible.
Earlier this year we launched cochlear implants, over-the-ear hearing aids, and wheelchairs for Avatars, but we have more improvements coming.
Avatar Store: The Avatar Store is launching in VR later this year. We’re working with partners across sports, entertainment, and more to ensure that you can find clothes that fit your personal style so you can express yourself in the metaverse.
Avatars in Facebook Messenger and WhatsApp: We’re bringing Avatars to video chat, starting in Messenger and WhatsApp. It’s going to add a whole new dimension to video chat — a third mode — between video on and video off. You can still express yourself and react, but not on camera.
Unity and Unreal Engine: We’re extending Meta Avatars SDK to iOS and Android in Unity, and rolling out the Avatars SDK in Unreal Engine soon.
Legs! We’re launching legs — full body avatars in VR. We're getting ready to bring legs to Horizon first, and we'll keep bringing them to more experiences over time as we improve our technology stack.
Ray-Ban Stories: We’re making good progress on two of the most powerful windows into the metaverse -- virtual reality for full immersion, and mixed reality for blending physical and virtual worlds. The next set of experiences that needs to get built out is augmented reality, where you see digital objects overlaid perfectly on the world around you. Last year, we partnered with EssilorLuxottica to introduce Ray-Ban Stories. Ray-Ban Stories are one of the best selling smart glasses in the world.
Ray-Ban Stories - Hands-Free Native Calling/SMS: Soon, you'll be able to call or text on Ray-Ban Stories hands-free with your existing phone number.
Ray-Ban Stories - Spotify Tap playback: We're also rolling out Spotify Tap playback to Ray-Ban Stories, which will allow people to listen to music from their glasses. You can tap and hold the side of your glasses to play Spotify. And if you want to hear something else, tap and hold again and Spotify will recommend something new.
Spark AR is now Meta Spark: Starting today, creators and developers can build interactive virtual objects in Meta Spark Studio and begin testing them in mixed reality with Meta Spark Player for Meta Quest devices via a beta program. These updates are the next steps in our journey to AR glasses and the early beginnings of the tools needed to build interactive AR experiences that will contribute to the augmented layer of the metaverse—and a first taste of what those experiences will be like hands-free.
Virtual objects: Virtual objects are the foundation for the types of visual AR experiences that AR Glasses will enable. They are interactive, world-facing and built in Meta Spark Studio. Today, these can be viewed and tested in Meta Spark Player on Meta Quest devices.
Meta Spark Player: The new Meta Spark Player for Meta Quest devices is an app for testing virtual objects on Meta Quest devices by leveraging passthrough.
We're introducing new products and features to help developers build immersive apps and successful businesses.
App to App Travel API: Later this year, we'll roll out App to App Travel, which will allow people to travel from one Quest Store app Destination to another, and enable developers to cross-promote between apps.
Presence Platform: At last year’s Connect, we introduced Presence Platform—our suite of machine perception and AI capabilities that can help developers build compelling mixed reality experiences. Today, we're introducing:
Movement SDK: Developers can bring avatars and characters to life with eye, face and three-point body tracking with Movement SDK. Developers can also use these capabilities to enhance gameplay or interactions in their apps.
Shared Spatial Anchors: Helps developers build experiences with a shared world-locked frame of reference, so multiple people can have a shared co-located experience to play or work together in VR while in the same physical space.
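The idea behind a shared world-locked frame of reference can be illustrated with a little coordinate math. The simplified TypeScript sketch below handles translation only (no rotation) and is not the actual SDK API: each headset knows the shared anchor's position in its own local coordinates, which is enough to map a point seen by one device into the other's frame:

```typescript
// Simplified illustration of a shared spatial anchor (translation only,
// no rotation). This is NOT the Shared Spatial Anchors API, just the
// coordinate idea behind it: both devices express the same physical
// anchor in their own local frames, which links the two frames together.

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });

// Map a point from device B's local frame into device A's local frame,
// given the shared anchor's position as each device sees it.
function mapBToA(pointInB: Vec3, anchorInA: Vec3, anchorInB: Vec3): Vec3 {
  // Express the point relative to the anchor, then re-express it in A's frame.
  return add(anchorInA, sub(pointInB, anchorInB));
}

// Device A sees the anchor at (2, 0, 0); device B sees it at (-1, 0, 3).
// A virtual object placed by B at (0, 1, 3) sits at (3, 1, 0) for A.
const inA = mapBToA({ x: 0, y: 1, z: 3 }, { x: 2, y: 0, z: 0 }, { x: -1, y: 0, z: 3 });
console.log(inA); // { x: 3, y: 1, z: 0 }
```

The real SDK also accounts for headset rotation and drift correction, but the core trick is the same: a shared landmark ties two independent coordinate systems together.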
Color Passthrough: With Quest Pro, developers can now build experiences that let people engage with the virtual world while maintaining presence in their physical space in full, rich color.
WebXR: We’re excited to share a couple updates coming soon that will enable more immersive and high-quality WebXR experiences:
Mixed Reality support: We're bringing Presence Platform capabilities like passthrough, scene understanding, and spatial anchors to developers through WebXR to enable new types of compelling experiences.
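For web developers, these capabilities surface through standard WebXR feature descriptors. The sketch below uses the real WebXR API shape (`navigator.xr.requestSession` with required and optional features); which exact feature strings Meta's browser will expose for scene understanding is an assumption on our part:

```typescript
// Sketch of requesting a mixed reality WebXR session. The requestSession
// call is the standard WebXR API; which feature strings Meta's browser
// will expose for scene understanding is an assumption here.

interface SessionOptions {
  requiredFeatures: string[];
  optionalFeatures: string[];
}

// Build the session descriptor: passthrough comes with 'immersive-ar',
// while anchors and hit-testing are requested as session features.
function mixedRealityOptions(): SessionOptions {
  return {
    requiredFeatures: ["local-floor"],
    optionalFeatures: ["anchors", "hit-test"],
  };
}

async function startMixedReality(): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar not supported on this device");
    return;
  }
  const session = await xr.requestSession("immersive-ar", mixedRealityOptions());
  session.addEventListener("end", () => console.log("session ended"));
}
```

Listing scene-understanding features as optional rather than required lets the same page degrade gracefully on headsets or browsers that don't support them yet.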
Project Flowerbed: We’re launching a developer preview of Project Flowerbed -- a new VR experience designed to showcase best practices for building great VR on the web, in the form of an immersive, meditative garden-building experience.
Meta XR Simulator: Early next year, we'll launch the Meta XR Simulator, which simulates our XR devices and features on an API level. This will allow for faster integration when developing Meta Quest apps and minimizes the need to take a headset on and off to test a change.
We showed off some of the future-facing research we’re conducting inside our labs, with a focus on next generation interfaces, bringing physical objects into the metaverse, and photo realistic avatars.
Next-gen Interface: In the long term, the most powerful way to interact with the metaverse will be through personalized AI combined with electromyography technology (EMG), which uses the neuromuscular signals through your wrist directly as input to give you an intuitive, almost frictionless interface. This will open up all sorts of avenues to help us do more. For example:
Project Aria: Carnegie Mellon University’s Navcog project started using Reality Labs at Meta’s Project Aria research glasses to 3D map the Pittsburgh International Airport, allowing them to build phone maps that enable the visually impaired to navigate more confidently indoors, where GPS signals often don’t reach.
EMG: Wearing an EMG wristband, Mark Zuckerberg demoed checking his messages and taking a photo with a few subtle hand gestures. In a second demo, a person plays an arcade game, communicating their intended actions to the computer with almost no hand movement.
3D Object Scanning: The ability to build 3D objects will play a key role in the metaverse, but doing that from scratch is difficult. The use of physical objects as templates is easier and faster. We’re researching different technologies that can help us scan an object and bring its digital twin into VR or AR.
Codec Avatars: Avatars in the metaverse will come in a variety of styles, including photorealistic avatars that can create a genuine sense of social presence.
Full Body: Last year, we showed early progress on full-body Codec Avatars. We’ve continued to develop that technology. It's now possible to change your virtual outfit.
Codec Avatars 2.0: We also showed our latest progress to make the facial expressions of our Codec Avatars truer to our physical forms — alongside Zuckerberg’s own 2.0 Codec Avatar.
Instant Avatars: Codec Avatars take a long time to generate, so we’re working on something much faster for people to use in the future. With Instant Codec Avatars, all you need for the scan is a phone and a few hours to generate the avatar.
This Meta SDK will bring hand interactions to your XR experiences.
Today I am super excited to share an announcement regarding a new SDK that I believe will be a huge addition for anyone who wants to work with VR or Passthrough on Oculus. Also, be sure to watch THIS HAND INTERACTION SDK VIDEO; trust me, it will be worth your time.
The Oculus Interaction SDK is a library of hand and controller interaction components that provides very realistic interactions and makes it easy to use prefabs when building games or apps for virtual reality, with or without Passthrough features.
The following sample scenes are available in the Interaction SDK:
👉 Basic Grab scene, which showcases a scene with the HandGrabInteractor.
👉 Complex Grab scene, which showcases a scene with the simpler GrabInteractor but with the addition of Physics, Transforms, and Constraints on objects.
👉 Basic Ray scene, which showcases ray interactions with Unity canvas.
👉 Basic Poke scene, which showcases UI interactions such as buttons, scrollable areas, and box-based proximity fields.
👉 Basic Pose Detection scene, which demonstrates pose detection for several common hand poses such as Thumbs Up, Thumbs Down, Rock, Paper, Scissors, and Stop.
Let me know if you have any questions after watching this announcement. I will personally be covering every single feature available in the SDK, and I am working closely with Oculus to make sure I have all the info I need for future videos.
This new announcement and video series also contains source code examples which you can access from GitHub today.
Thanks everyone, and enjoy 🙂 it’s time to play with XR!
By Bobby Carlton
When it comes to creating that perfect balance of realism and cartoony avatars that can be used across multiple VR platforms, Ready Player Me is the leader. Easy to make in only a few minutes, a Ready Player Me avatar can be personalized with clothing, fashion accessories, and even outfits from popular movies, and that could be a big deal for Enterprise adoption.
Last week, the company announced that it raised $56M in a round led by a16z to help grow its business and connect people to the metaverse in a more meaningful way. This is obviously a huge leap for avatar technology, but it also means a big step for the metaverse as more people and companies explore the potential of these virtual worlds.
Creating your own 3D avatar is incredibly simple. Absolutely no coding skills are needed to create one, which you can then import into platforms such as Spatial, Mozilla Hubs, VRChat, and others with ease by copying and pasting a code generated by the software.
You may think that avatars are something you would only use in socialVR platforms or in games, but there is a big push to bring this type of virtual representation into work environments. Ready Player Me has already positioned itself in Enterprise solutions by lining up dozens of partners to use its avatar technology for corporate training, team building, and even making avatar creation part of the onboarding steps for new employees.
As companies establish their digital twins in platforms like Mozilla Hubs, MeetinVR, Glue, Virbela, and others, avatars are how we represent ourselves as employees in VR, and it helps create a diverse workforce in both the real world and in the metaverse. Employees expect inclusion, culture, and heritage to be things that are represented at work.
Last year saw 24 companies adopting Ready Player Me avatars for employee representation in the metaverse, and with this new round of fundraising, the company looks to push that number even higher.
Timmu Tõke, CEO of Ready Player Me, believes that being able to represent your individual heritage in the metaverse, whether it’s for meeting up with friends for a concert or being part of a client meeting, is important for all of us.
The thought is that your skin tone, your hair, the shape of your eyes, and how you dress all make up who you are and are part of the story behind you the person, and you the employee.
In an interview with GamesBeat, Tõke talked about how his company will bring that representation and consistent identity across all experiences saying, “We’re doing cross-game answers for the metaverse, as we saw that people spend a lot of time in virtual worlds.” Tõke added, “The metaverse is not one app, or one game or one platform. It’s a network of thousands of different virtual worlds. So it makes sense for users to have an avatar to traverse across many different virtual worlds.”
“You have to build the network out for diversity as a developer tools company,” said Tõke in an interview with VRScout. “That’s where we spend most of our time.”
The metaverse is expanding each day, with more social experiences and more companies and industries uncovering its potential for everything from connecting with consumers through a metaverse portal to marketing, B2B, and employee training and recruitment, as well as improving things such as automation, robotics, infrastructure, warehouse management, and so much more.
Earlier in the year, Ready Player Me announced a partnership with the AR company 8th Wall that allows you to bring Ready Player Me avatars into any 8th Wall AR experience using A-Frame, which could bring more personalization into AR training initiatives such as on-the-fly training or reskilling. It could also have an impact on how companies approach marketing, recruitment, and onboarding.
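To give a rough idea of how that pairing works, here is a minimal A-Frame sketch (our own illustration, not from the announcement) that loads a Ready Player Me avatar into a scene. The avatar URL below is a placeholder you would swap for the .glb link the Ready Player Me creator generates for you; an 8th Wall project would layer its own camera and tracking components on top of a scene like this.

```html
<!-- Hypothetical sketch: display a Ready Player Me avatar (.glb) in an A-Frame scene.
     Replace YOUR_AVATAR_ID with the ID from your generated Ready Player Me avatar URL. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- gltf-model loads the avatar's .glb file exported by Ready Player Me -->
      <a-entity
        gltf-model="url(https://models.readyplayer.me/YOUR_AVATAR_ID.glb)"
        position="0 0 -2"
        rotation="0 180 0"></a-entity>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Because the avatar is just a glTF asset referenced by URL, the same model can be dropped into any platform that accepts .glb files, which is what makes the cross-platform pitch work.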
Tõke recognizes that we’re not totally there yet, but the metaverse is gaining a lot of momentum. “Based on our rapid growth rate (40% month on month), I think it is fair to say the VR industry is booming right now, and expanding quicker than many people realize. Like any new technology, however, its success largely depends on how quickly it is adopted by consumers, and in that respect we still have some way to go.”
There is no question about it: the COVID-19 pandemic changed everything for us, including our personal and professional lives. Though we have started to see some normalcy in our day-to-day lives, new concerns such as an evolving COVID virus and monkeypox are making people and businesses rethink how we socialize with our family, friends, and the people we work with.
However, a published research paper shows that people who used VR to hang out with friends on socialVR platforms, attend virtual concerts, play games, work in a VR environment, or “visit” other parts of the world were actually really happy!
They socialized with friends and family, and they connected with work colleagues. There is a sense of normalcy inside of VR that isn’t restricted by things such as travel limitations or worries about COVID or monkeypox.
As the COVID pandemic completely disrupted and changed our lives, we found ourselves unable to participate in the things that shaped our identity. You no longer had that foundation of autobiographical memory, which is the memory of your own history. It sounds a little sciency and complex, but your memories actually play a role in your current happiness, and without them your days kind of blend into each other, leaving you with a feeling of emptiness.
This also includes your work memories: collaborating with co-workers, taking part in training, and even having “water cooler” talk.
So yes, the quarantine has created a long-lasting impact, and it’s definitely making you feel pretty burned out, even in today’s world.
Italian researchers worked with 400 participants over a three-month period. Users were encouraged to view 360 photos and videos of other countries, visit virtual gardens and beaches, spend time with other VR users in platforms such as VRChat or Mozilla Hubs, work together, and even jump into VR to create a safe and comfortable personal bubble in which to reflect and be alone, a place referred to as the “Secret Garden.”
Participants were then interviewed, and researchers found them to be happier. They were much more engaged and felt better and more confident about their work.
Riva and his team told participants to use VR at any time during the day: morning, afternoon, before bed, or anytime they felt anxious due to the lack of social activities. There were no restrictions; they could use VR in the same way you’d text a friend or a co-worker.
The end results showed that people felt calm and connected during the height of the pandemic. They were happy. They had that sense of autobiographical memory that helped connect them with what makes them happy, and this approach still works today, even without a global pandemic.
With more and more businesses moving to remote work or a blended work environment, using a digital twin or a metaverse portal to connect workers helps create happy employees. It creates an autobiographical work memory, which is really important for team collaboration and team morale. VR helps get people out of video calls and into something more meaningful.
As we see more and more positive benefits of VR, we’re starting to see VR adoption take hold among consumers and businesses in many industries.
Others are experiencing true happiness from VR not because they are gaining from it monetarily, but because it gives them a way to spend time with friends and escape the confinement of their homes. The metaverse is opening new opportunities for people to connect throughout the day, and this includes work hours.
One VR user talked about why being in VR makes them happy, saying, “I work from home and being at home alone all day sucks. I wake up, work in my living room for 8 hours, and then that’s it. I’ve never left my home. I live in a rural place so I don’t have a downtown I can visit. Going to work is my social time!” The individual goes on to say, “But once I put on my headset, I’m able to leave my house and hang out with the people I work with or friends that I’ve made in virtual reality. I know I’m not actually leaving my house, but it feels like I do. I get to talk with people and socialize, and it makes me really happy to have that connection. It’s honestly saved me from going crazy. There have definitely been times when I’m really bummed out, but once I get into VR, I’m happy to see people I know.”
During an interview with CNBC, Skip Rizzo, Ph.D., Director of Medical VR at the Institute for Creative Technologies at the University of Southern California, talked about how an immersive VR simulation could be “emotionally evocative,” and explained that VR gives you the tools to develop your own relaxation strategies to cope with COVID-19 anxiety or any stressful situation. It gives you a portal to escapism.
Through Riva’s research, he and his team suggest the following VR user guide to help you stay happy.
Use VR two or more times a day, every week, for work meetings or to socialize. Jump into VR whenever you feel a little anxiety. Play a game or meet up with a friend. Riva also suggests having a “Secret Garden” as a safe haven to help reduce emotional or work stress.
Be creative with your VR space, whether it’s for socializing or for work. Doing things together in VR can help reconstruct that sense of community that makes us happy and helps reduce stress.
Use VR to reflect on your identity and future goals. Jump into your “Secret Garden” to think of your family and friends, or maybe just to clear your mind. Or use the space as a way to work through a work problem. And of course, be social!
The point is, VR can make you happy. It connects you with the people you love, people that make you laugh, and people you work with, and it allows you to escape reality, or even rejoin it.