By Bobby Carlton
Individuals who are deaf or who experience hearing loss can now read speech in real time as closed captions displayed on AR glasses. XRAI Glass has launched a suite of solutions that allow users to experience the world through AR, according to an article at Auganix.com.
The software, called XRAI Glass, converts audio into text that can be displayed on the user's AR glasses. It can also recognize individual speakers' voices and translate conversations in nine different languages.
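XRAI Glass hasn't published how its pipeline works internally, but the basic flow it describes, audio in, speaker-tagged caption out, can be sketched in a few lines. Everything below (the `CaptionEngine` name, the display format) is a hypothetical illustration, not the company's actual API:

```python
from dataclasses import dataclass, field

@dataclass
class CaptionEngine:
    """Toy sketch of a speaker-tagged captioning pipeline."""
    history: list = field(default_factory=list)  # stored (speaker, text) pairs

    def on_speech(self, speaker: str, text: str) -> str:
        """Called for each recognized utterance; returns the caption line
        that would be rendered on the glasses' display."""
        self.history.append((speaker, text))
        return f"[{speaker}] {text}"

engine = CaptionEngine()
line = engine.on_speech("Alice", "Can you grab milk tomorrow?")
print(line)  # → [Alice] Can you grab milk tomorrow?
```

In a real system the `on_speech` events would come from a speech-recognition model and an optional translation step; the history list is what would make the recall feature described below possible.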
To use the app, users need a set of Nreal Air AR glasses tethered to a mobile device. Through its partnership with Nreal, XRAI Glass is able to provide users with a device for viewing conversations. After pairing the app with their Nreal Air glasses, users can see the real world enhanced with digital captions.
On top of real-time audio transcription, the XRAI Glass app also includes the following features:
By saying "Hey XRAI," users can access an AI-powered personal assistant, much like Siri or Alexa, and ask questions such as what the weather is like in their area. The answer is then automatically displayed on the glasses, where only the wearer can see it.
You'll also be able to recall a conversation from the previous day by saying, "Hey XRAI, what was I asked to pick up yesterday?"
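XRAI hasn't documented how its wake-word matching or recall actually works, but the two features just described can be sketched as a toy wake-word parser plus a naive keyword search over stored transcript lines (all names and matching logic here are invented for illustration):

```python
WAKE_WORD = "hey xrai"

def parse_command(utterance: str):
    """Return the query portion of an utterance if it starts with the
    wake word, else None."""
    lowered = utterance.strip().lower()
    if lowered.startswith(WAKE_WORD):
        return utterance.strip()[len(WAKE_WORD):].lstrip(" ,")
    return None

def recall(history, keyword: str):
    """Naive keyword search over stored transcript lines, standing in
    for whatever retrieval the real assistant performs."""
    return [line for line in history if keyword.lower() in line.lower()]

history = ["Alice: please pick up milk tomorrow", "Bob: meeting at noon"]
query = parse_command("Hey XRAI, what was I asked to pick up yesterday?")
print(query)                       # → what was I asked to pick up yesterday?
print(recall(history, "pick up"))  # → ['Alice: please pick up milk tomorrow']
```

The real assistant presumably feeds the extracted query to a language model or search index; the point of the sketch is only that recall depends on the transcript history the glasses already store.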
With translation available in nine of the world's most spoken languages, English, Mandarin, French, German, Italian, Japanese, Korean, Portuguese, and Spanish, XRAI Glass can also transcribe and subtitle multilingual conversations. The company plans to roll out more languages in the near future.
In a previous announcement, the company revealed that its software would soon be able to detect variations in pitch, accent, and tone of voice, which play a huge part in how we communicate, and that this capability will have an impact on Web 3.0 experiences.
Dan Scarfe, the founder and CEO of XRAI Glass, said that the company was thrilled to announce the availability of its technology worldwide. He noted that the company's goal is to provide a solution that will help people with hearing loss connect with their communities. Through its partnership with various organizations, such as DeafKidz International, the company was able to test the product and learn from its users.
Due to the capabilities of XRAI Glass, the company has been able to help more people than it initially thought possible. For instance, people with neurodiversity who have difficulty processing sound and speech have also been able to benefit from the technology.
Another place the XRAI Glass and Nreal partnership could assist is in work environments with loud background noise where verbal communication is still important. Workers could use the glasses as a secondary layer of communication to make sure everyone is getting the correct information. The same information can also be accessed through an app on an Android device.
This is important as more and more industries are turning to technologies such as digital twinning, AI, AR, and VR to be more productive and efficient.
Through its software, XRAI Glass can record conversations so that users can easily recall past interactions. The company offers three subscription plans: Essential, Premium, and Ultimate. The Essential plan, which is free, provides users with a basic screen-duplicate mode and unlimited transcription.
The Premium plan, priced at £19.99 a month, includes 30 days of conversation history and unlimited transcription, along with additional features such as 3D support and translation into nine languages.
The Ultimate plan, which costs £49.99 a month, comes with everything in the Premium plan, plus unlimited conversation history, cloud-enhanced transcription, and a personal AI assistant.
For more information about XRAI Glass, check out the company's website.
By Bobby Carlton
During its annual Snapdragon Summit, Qualcomm unveiled its latest technology, the AR2 Gen 1. The new platform paves the way for a next generation of AR glasses that are lightweight, less bulky, and more stylish, and that could be used in our day-to-day lives and at work.
According to Hugo Swart, who leads Qualcomm's XR business, the AR2 Gen 1 platform is the first of its kind for developing thinner and lighter AR glasses that look more like the normal glasses we see today. He noted that these kinds of day-to-day wearable glasses are unlike devices such as the Meta Quest or HTC's VR headsets.
One of the biggest challenges designers face in creating wearable technology is power consumption. Through its multi-chip design, which delivers a 2.5x increase in AI performance, the company was able to reduce the platform's power consumption. This could allow manufacturers to create glasses that are both more accurate and more lightweight.
The AR2 Gen 1 platform from Qualcomm is designed to split the computational load across three processors distributed in the frames. This allows the company to provide a more efficient and powerful solution for developing AR glasses. It features an AR processor that handles features such as graphics and visual analytics, and it can support up to nine cameras for monitoring your surroundings.
Unfortunately, according to the company, the new AR2 Gen 1 chipset won't deliver the same level of performance as the current generation of virtual reality headsets. For instance, while it will give users more accurate scanning and depth sensing, it won't provide the same level of graphical detail.
To ensure that the next generation of AR glasses is a success, the company is relying on support from companion devices, such as smartphones and computers, to handle heavier graphics processing. The AR2 Gen 1 supports Wi-Fi 7, allowing the glasses to connect to a paired device at speeds of up to 5.8Gbps. This helps reduce latency and provides a more natural and responsive experience.
The AR2 Gen 1 platform will also support eye tracking for security features such as iris authentication. This could allow users to unlock their AR glasses using the devices' cameras, and the capability could serve other functions depending on how the glasses are used.
Before working on this next generation of augmented reality glasses, the company had already powered products such as the Nreal Light and Lenovo's A3. During a briefing with reporters, Swart noted that these earlier efforts have not been able to deliver the battery life the company wants from such devices.
Getting involved in the ecosystem allows tech companies to provide their customers with the best possible experience. This is also beneficial for Microsoft, as it allows the company to develop new products and expand its reach beyond the HoloLens program. Earlier in the year, Microsoft partnered with Qualcomm to use its chipsets in future products.
Another company that recently introduced an AR headset powered by the AR2 platform is San Francisco-based Niantic Labs. The company's Outdoor AR headset is designed to be light and portable, featuring a sleek, modern design, weighing only 0.5 pounds, and coming with some impressive specs.
Qualcomm also unveiled new S3 and S5 Gen 3 chipsets that will give users access to the latest audio technology, including spatial audio that tracks head movements and adaptive noise cancellation, features that games will also be able to take advantage of.
Although the company's next generation of AR glasses will have various features, it's not yet clear whether the technology can deliver the level of performance users expect. With that in mind, the innovation the company has brought to the table could still lead to a new era of technological change.
You can learn more about Qualcomm’s Snapdragon AR2 Gen 1 platform by clicking here.
By Bobby Carlton
Since launching its first product, the HaptX DK2, in January 2021, HaptX has been pushing the envelope in haptic technology and how it can improve the way XR is used as an enterprise solution. Now it's time for the public to get their hands on the company's next generation of gloves, the HaptX G1, a groundbreaking device designed for large-scale deployments across many industries.
The new HaptX Gloves G1 introduces features designed to meet the needs of its users, including wireless mobility, improved ergonomics, and multiuser collaboration.
“With HaptX Gloves G1, we’re making it possible for all organizations to leverage our lifelike haptics,” said Jake Rubin, founder and CEO of HaptX, in an official press release. “Touch is the cornerstone of the next generation of human-machine interface technologies, and the opportunities are endless.”
HaptX Gloves G1 leverages advances in materials science and the latest manufacturing techniques to deliver the first haptic gloves that fit like a conventional glove and deliver the precise tactile feedback needed for jobs that demand that kind of accuracy.
The flexible, soft materials used in the G1's production provide a level of comfort and dexterity not found in other products. To ensure a good fit, the G1 glove is available in multiple sizes: Medium, Large, and Extra Large.
Built into the glove are hundreds of actuators that expand and contract against specific parts of the glove to provide a realistic sense of touch when you interact with virtual objects. For example, if you were to hold a wrench in VR or AR, the actuators would press against the parts of your hand that would touch the tool, convincing you that you were actually holding a real wrench.
To do this, the G1 utilizes a wireless Airpack, a lightweight device that generates compressed air and controls its flow to drive the actuators' physical feedback. The Airpack can be worn on your body in backpack mode or placed on a table for standing or seated applications.
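HaptX's control firmware is proprietary, but the idea of per-region actuation, pressing harder where the virtual object presses harder, can be illustrated with a toy mapping from contact points to actuator inflation levels (the function name, region indexing, and 0-to-1 pressure scale are all assumptions for illustration):

```python
def actuation_levels(contacts, num_actuators=8):
    """Toy model of per-region haptic feedback: given (region_index,
    contact_force) pairs with forces nominally in 0..1, return an
    inflation level per actuator region. Untouched regions stay at 0."""
    levels = [0.0] * num_actuators
    for region, force in contacts:
        # clamp force into [0, 1] and keep the strongest contact per region
        levels[region] = max(levels[region], min(max(force, 0.0), 1.0))
    return levels

# a virtual wrench pressing on the palm (region 0) and two fingertips
print(actuation_levels([(0, 0.9), (3, 0.4), (4, 1.3)]))
# → [0.9, 0.0, 0.0, 0.4, 1.0, 0.0, 0.0, 0.0]
```

The real gloves use hundreds of pneumatic actuators rather than eight, and the Airpack converts these target levels into regulated air flow, but the contact-to-pressure mapping is the core idea.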
A single charge gives the Airpack three hours of use, making it ideal for military, educational, and enterprise applications.
The HaptX SDK provides developers with a variety of features that make it easier to create custom applications for any industry or workflow. One of these is its advanced feedback technology, which can be used to simulate the microscale textures of various surfaces. The G1 also comes with a variety of plugins for platforms such as Unreal Engine and Unity, as well as a C++ API.
According to Joe Michaels, HaptX's Chief Revenue Officer, many organizations resort to using game controllers when developing their metaverse strategies, even though controllers are ineffective at providing touch feedback. With the G1's real-time feedback, businesses can now rely on its capabilities to improve their operations.
To celebrate the G1's launch, the company is currently taking pre-orders through its website. For a limited time, customers can get a pair of the G1 for $5,495, and they can save money by purchasing a bundle of four sizes.
In addition to pre-ordering and a discounted bundle option, the company is also introducing a subscription program that provides its customers with a full service and support plan. The subscription includes the Airpack, the SDK, and a comprehensive maintenance and support package.
Subscriptions for the HaptX Gloves G1 start at $495 a month. Pre-ordering allows customers to put down a small deposit toward the cost of the gloves and then select their subscription options once the G1 is delivered.
The G1 is expected to ship in the third quarter of 2023. To learn more about the G1 and its subscription model, visit the HaptX website.
By Bobby Carlton
The German railway company Deutsche Bahn is building a digital twin of its railway network that will allow it to monitor and improve the performance of its 20,500 miles of track and its stations. Through an interconnected network of sensors and cameras, along with AI running in Nvidia Omniverse, the railway can analyze the collected data to identify the causes of operational issues and improve performance.
A digital twin can give you a quick overview of what's going wrong, and it can also help you prevent it. With the help of AI, you can learn how to fix issues and make the whole system work better. For instance, an AI can analyze a process, uncover design flaws, and identify their root cause. It can also schedule regular inspections and maintenance on certain parts of the machinery through predictive maintenance.
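The article doesn't detail the models Deutsche Bahn uses, but the predictive-maintenance idea, flag a component when its sensor readings drift from their historical baseline, can be sketched with a simple rolling statistic (the threshold and the vibration example are invented for illustration, not taken from DB's system):

```python
from statistics import mean, stdev

def needs_inspection(readings, threshold=3.0):
    """Flag a sensor stream when its latest reading sits more than
    `threshold` standard deviations from the historical mean; a crude
    stand-in for real predictive-maintenance models."""
    history, latest = readings[:-1], readings[-1]
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold

# axle-bearing vibration readings: stable for weeks, then a spike
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 4.2]
print(needs_inspection(vibration))  # → True
```

Production systems use far richer models, but the principle is the same: the twin supplies a continuous stream of readings, and maintenance is scheduled when the stream departs from its learned baseline rather than on a fixed calendar.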
“With NVIDIA technologies, we’re able to begin realizing the vision of a fully automated train network,” said Ruben Schilling, who leads the perception group at DB Netz, part of Deutsche Bahn, in an official Nvidia press release. “The envisioned future railway system improves the capacity, quality and efficiency of the network.”
That said, it’s important not to underestimate the real-time aspect of AI’s role in digital twinning for Industry 4.0. According to David Crawley, a professor at the University of Houston's College of Technology, the university collaborated with other organizations to develop a digital twin for use in its digital oilfield laboratory.
He noted that an oil rig worker in the South Pacific was able to use AR headgear to show an engineer how to fix a faulty part of the equipment without shutting down the operations.
According to Crawley, the use of AI in the metaverse allows people to engage in activities similar to what they actually do in the real world using AR, VR, or WebXR. For instance, a worker hundreds of miles away can use a device like a Magic Leap 2 headset to fix a pipe or identify a problem with a valve.
There's also a symbiotic relationship between AI and digital twins that exists in an industrial metaverse.
“AI is ultimately the analysis of data and the insights we draw from it,” said Lisa Seacat DeLuca, then a distinguished engineer and director of Emerging Solutions at IBM, during an interview with VentureBeat. “The digital twin provides a more effective way to monitor and share that data, which feeds into AI models. I boldly state that you can’t have AI without digital twins because they can bring users closer to their assets and enable them to draw better, more accurate and more useful insights.”
A digital twin can be built using the data collected by various IoT sensors and devices. Aside from providing more data points, the digital twin can also help improve the AI's performance by allowing it to run more effective simulations.
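As a minimal illustration of that relationship (a toy sketch, not Deutsche Bahn's actual stack), a twin mirrors an asset's state from sensor updates, and an AI or scheduling layer can then run what-if simulations against that mirrored state instead of against the live asset. The class, field names, and delay formula below are all invented for illustration:

```python
class TrackSegmentTwin:
    """Toy digital twin of a track segment, updated from IoT readings."""

    def __init__(self, segment_id):
        self.segment_id = segment_id
        self.state = {}

    def ingest(self, reading: dict):
        """Merge the latest sensor reading into the twin's state."""
        self.state.update(reading)

    def simulate_delay(self, extra_trains: int) -> float:
        """What-if: estimate delay minutes if extra trains were scheduled.
        The formula is purely illustrative."""
        load = self.state.get("trains_per_hour", 0) + extra_trains
        capacity = self.state.get("capacity_per_hour", 1)
        return max(0.0, (load - capacity) * 2.5)

twin = TrackSegmentTwin("segment-42")
twin.ingest({"trains_per_hour": 10, "capacity_per_hour": 12})
print(twin.simulate_delay(extra_trains=4))  # → 5.0
```

The payoff is that scheduling experiments like this one run against the twin's data, so the railway can test capacity changes without touching real timetables.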
Deutsche Bahn Chief Technology Innovation Officer Rolf Härd noted that the company can collect enough data to allow its AI to run more impactful simulations and make predictions that will help Deutsche Bahn be more efficient.
David Crawley explained how a digital twin can be used to perform predictive maintenance analyses on a train's components, noting that because of his knowledge of how these components work, he can use the digital twin to model maintenance scenarios.
Creating a digital twin at such a large scale can become a massive undertaking. You need a strategy and a roadmap for a custom-built 3D pipeline that connects the computer-aided design datasets built within your ecosystem with high-definition 3D maps and various simulation tools. In this case, Deutsche Bahn used the Universal Scene Description (USD) 3D framework in Nvidia Omniverse to connect and combine its data sources into a single shared virtual model.
Through digital twinning and data collected by IoT sensors, the team was able to identify areas where the organization could improve its operations. For instance, by analyzing a train's speed alongside weather conditions, Deutsche Bahn could pinpoint where to improve service to its customers.
By Bobby Carlton
During a call with investors, which was part of Meta's third-quarter earnings report, CEO Mark Zuckerberg and other executives were asked about the company's rising costs. During the quarter, the company's expenses increased by 19% to $22.1 billion.
Meta's stock price tanked by 24% in the third week of October 2022 after the company provided fourth-quarter guidance that was below analysts' expectations. During the third quarter, the company's revenue fell 4% to $27.7 billion, while its profit dropped 52% to $4.4 billion. A lot of this was because of its investment in developing its vision of the metaverse.
Meta's Reality Labs unit, which develops the augmented and virtual reality technology that will power the company's metaverse, has lost over $9 billion in 2022 so far. Dave Wehner, the company's chief financial officer, attributed the unit's revenue decline to lower Quest 2 sales.
“We do anticipate that Reality Labs operating losses in 2023 will grow significantly year-over-year,” Meta said in an official statement. “Beyond 2023, we expect to pace Reality Labs investments such that we can achieve our goal of growing overall company operating income in the long run.”
During the earnings call, Jefferies analyst Brent Thill said that he believes investors are thinking that Meta is focused on “too many experimental bets versus proven bets on the core”.
Zuckerberg was asked about the company's strategy of investing heavily in the metaverse rather than in its core business. He noted that there is a difference between experimental bets and products whose success is already proven, and said that the company is still focused on improving its various apps and is confident they will be successful. Some of those projects include improving its TikTok-like short-video service and its business messaging features.
While Zuckerberg didn't offer an estimate of how big these projects could become, he noted that the improvements the company is making are heading in the right direction. He also said that the metaverse is a long-term effort, and that he believes the various projects the company is developing will eventually be successful.
YouTuber Marques Brownlee took a deeper dive into Meta's thinking on this, and though he doesn't provide a solid answer as to why Meta is okay with this direction, he does give us some good insight into what he thinks Zuckerberg's vision for the technology is. As Brownlee notes in his video, "the metaverse might have a pretty terrible reputation already," but instead of just riffing off that, he asks us to take a step back and look at the metaverse objectively.
It's a pretty interesting video and I would recommend taking the 15 minutes to watch.
Back to Meta. Responding to a question, Zuckerberg said that it's hard to predict a company's future success given everything happening in the world today, and noted that there is no simple solution that will solve all of the company's problems.
Zuckerberg also noted that the company faces various challenges, such as the poor economy and the effects of Apple's 2021 iOS privacy update (which is addressed in Brownlee's video). The update limited Meta's ability to target ads to its users, which is a big part of how the company makes money.
Long-term investments into the metaverse are “going to provide greater returns over time,” Zuckerberg said. “I think we’re going to resolve each of these things over different periods of time, and I appreciate the patience, and I think that those who are patient and invest with us will end up being rewarded.”
One thing to mention: one of the main reasons the company is developing and heavily investing in its metaverse framework is to ensure that its platform will be ready for the shift Zuckerberg is predicting, so that the company will not be negatively affected by the decisions of its competitors.
The other main reason the company is building the metaverse is to be more innovative in developing software and hardware together. Zuckerberg noted that technology companies are more capable of developing new innovations when they build both the hardware and the software.
“A lot of this is just you can build new and innovative things when you control more of the stack yourself,” Zuckerberg said.
By Bobby Carlton
In the past year we have seen a lot of movement, with brands and industries shifting toward mission statements that embrace virtual tools and XR enterprise solutions. One company making that shift is tech giant Microsoft. To help it stay focused on making strategic steps toward that goal, the company has created an internal group called the Industrial Metaverse Core, which will explore immersive tech for workers in the industrial sector.
After all, using a digital twin is more sustainable, allows companies to explore scenarios without putting employees in danger or consuming actual products, has a proven ROI, and can give you supercharged KPIs and data.
Through the use of XR solutions, Microsoft aims to enhance the core capabilities of industrial work by developing software interfaces that can be used in various functions such as industrial robotics, automated warehouses, and control systems for electrical plants. The company also claims that a virtual world that's focused on factory environments could be used to monitor machines, automated warehouse environments, and help with overall workplace safety.
In addition, Microsoft's industrial offerings will also cover transportation networks. In 2018, the company acquired AI startup Bonsai, which it said would be integrated into its Azure public cloud. Gurdeep Pall, corporate vice president of Microsoft's autonomous systems division, noted that the service would be used on the company's platform.
Through the company's services, industrial engineers are able to combine AI and XR with their existing processes and equipment, and it can be done regardless of the engineer's experience in software development.
One company looking to take advantage of XR technology is Mercedes-Benz. The automobile company has partnered with Microsoft and is reportedly working on developing a new XR data platform that will allow the German carmaker to improve its vehicle production efficiency by connecting Microsoft’s Cloud with Mercedes-Benz’s newly-introduced MO360 Data Platform.
The platform will connect with the company's existing data infrastructure, helping Mercedes-Benz improve in three ways: vehicle-production efficiency, sustainability, and resilience.
“This new partnership between Microsoft and Mercedes-Benz will make our global production network more intelligent, sustainable and resilient in an era of increased geopolitical and macroeconomic challenges,” said Joerg Burzer, a member of the Board of Management of Mercedes-Benz Group AG, Production & Supply Chain Management. Burzer continued, “The ability to predict and prevent problems in production and logistics will become a key competitive advantage as we go all electric.”
Mercedes-Benz’s Chief Information Officer, Jan Brecht, provided additional advantages of MO360’s operability saying, “With the MO360 data platform, we democratize technology and data in manufacturing. As we are moving toward a 100% digital enterprise, data is becoming everyone’s business at Mercedes-Benz. Our colleagues on the shop floor have access to production and management-related real-time data. They are able to work with drill-down dashboards and make data-based decisions.”
Brecht says this will allow everyone in the organization to access and use real-time data and noted that the platform would allow employees to make better decisions and improve their efficiency.
Of course, we are only talking about the automobile industry here. Microsoft intends to use its Industrial Metaverse Core to explore how immersive technology and XR enterprise solutions can have a positive impact on all types of work.
According to Judson Althoff, Microsoft's chief commercial officer, organizations can use machine learning and artificial intelligence to analyze and enrich the data they collect. These capabilities can then be used to create digital twins of their operations.
He said that creating digital twins can help improve the efficiency and effectiveness of industrial processes by allowing workers to access and manage different parts of the operation.
Through digital twins, employees can also connect with a digital feedback loop. He said that by feeding the twins into experiences on handheld devices, workers can easily create their own app toolchains that allow them to interact with their work digitally.
“You can think of this as the model teaching the people and the people teaching the model for real-time digital feedback and enhanced learning,” Althoff said.
Although the industrial metaverse is still in its infancy, Althoff noted that hundreds of organizations are already using these capabilities in their operations.
Althoff indicated that sustainability could be one of the biggest benefits presented by the industrial metaverse.
“If you make anything or you move anything, you create a carbon footprint,” he said. “If we can simulate that infinitely in the cloud before you make it or before you move it, we can help you build better products more effectively, more efficiently, with lower carbon footprint, lower water utilization, more sustainably than ever before.”
Through its industrial metaverse capabilities, Microsoft has been able to help companies like Hellenic, one of the largest producers of Coca-Cola products in Europe. With over 55 facilities across the continent, Hellenic can produce over 90,000 bottles of Coca-Cola per hour on a single production line.
According to Althoff, the company was able to reduce its energy consumption by over 9% in just 12 weeks by implementing a sensor fabric and creating digital twins.
That is a pretty huge benefit.