The metaverse, the internet realm where animated avatars of our physical selves will someday conduct everything from shopping to gaming to travel, is heralded as the digital world's next big thing.
According to experts, it could take a decade or more for the essential technologies to catch up to the hype.
However, the healthcare industry is already implementing some of the key elements that will eventually make up the metaverse — virtual reality (VR), augmented reality (AR), mixed reality (MR), and artificial intelligence (AI) — as well as the software and hardware that will power their applications.
Numerous healthcare applications have been developed since Facebook — now Meta Platforms — purchased Oculus and its virtual reality headset technology in 2014 for $2 billion.
One of the most recent is a collaboration among the WHO Academy, Facebook Reality Labs, and Nexus Studios.
The WHO's R&D incubator created a mobile learning app for health workers fighting Covid-19 around the world.
One of the training courses uses smartphone-based augmented reality to demonstrate the correct techniques and sequence for putting on and removing personal protective equipment.
The app, which offers material in seven languages, is based on the needs of 22,000 health workers worldwide whom the WHO polled last year.
UConn Health, the University of Connecticut’s medical center in Farmington, Connecticut, uses Oculus technology to train orthopaedic surgery trainees.
Its educators have partnered with Precision, a Canadian medical software company that delivers VR training and instructional modules in orthopedics.
Residents wearing Oculus Quest headsets can see themselves performing a variety of surgical procedures in 3-D, including pinning a shattered bone.
Because the procedures are performed digitally, residents can make mistakes and receive feedback from instructors to improve their performance on the next attempt.
When Microsoft released its HoloLens AR smart glasses for commercial development in 2016, early adopters included Stryker, a medical technology company based in Kalamazoo, Michigan.
Stryker began using the AR device in 2017 to help hospitals and surgery centers improve how they design operating rooms.
Because operating rooms (ORs) are shared by surgical services ranging from general surgery to orthopedics and heart surgery, the lighting, equipment, and surgical tools vary with the procedure.
Stryker engineers are now able to create shared ORs using holograms, thanks to the HoloLens 2’s ability to transform OR design from 2D to 3D.
The MR experience visualizes all of the people, equipment, and room configurations without requiring physical objects or people to be present.
Zimmer Biomet, a medical device company based in Warsaw, Indiana, recently unveiled its OptiVu Mixed Reality Solutions platform, which includes three applications that use HoloLens devices: one that uses MR to manufacture surgical tools, another that collects and stores data to track patient progress before and after surgery, and a third that allows clinicians to share an MR experience with patients before a procedure.
A Zimmer Biomet spokeswoman said, “We are now employing the HoloLens in a pilot mode with a remote assist in the United States, EMEA, and Australia.”
According to the spokeswoman, the technology has been used for remote case coverage and training programs, and the company is building software applications for the HoloLens as part of data solutions focused on pre- and post-procedure care.
Microsoft’s futuristic holographic vision
In March, Microsoft unveiled Mesh, a mixed reality platform powered by its Azure cloud service that allows users in different physical locations to join 3-D holographic experiences on a variety of devices, including the HoloLens 2, VR headsets, smartphones, tablets, and PCs.
In a blog post, the company pictured avatars of medical students gathered around a holographic model of human anatomy, peeling back layers of muscle to see what lies beneath.
In June, Johns Hopkins neurosurgeons performed the institution’s first-ever AR surgeries on living patients, demonstrating real-world applications of AR medical technology.
In the first procedure, doctors inserted six screws into the patient's spine as part of a spinal fusion.
A separate team of surgeons removed a malignant growth from a patient’s spine two days later.
Both teams used Augmedics headsets, which have a see-through eye display that projects images of a patient's internal anatomy, such as bones and other tissue, based on CT scans.
Timothy Witham, M.D., director of the Johns Hopkins Neurosurgery Spinal Fusion Laboratory, described it as “having a GPS navigator in front of your eyes.”
The Gordon Center for Simulation and Innovation in Medical Education at the University of Miami’s Miller School of Medicine uses AR, VR, and MR to train emergency first-responders to treat trauma patients, such as those who have suffered a stroke, heart attack, or gunshot wound.
On Harvey, a lifelike mannequin that realistically replicates practically any cardiac condition, students practice life-saving cardiac procedures.
Using VR goggles, students can "see" the underlying anatomy overlaid on Harvey.
“We’re not restricted by physical objects in the digital realm,” said Barry Issenberg, M.D., professor of medicine and director of the Gordon Center.
Before the virtual technology curriculum was established, he said, students had to be physically on the scene and train on actual trauma patients.
“Now we can ensure that all learners, regardless of their geographic location, get the same virtual experience.”
The University of Southern California Institute for Creative Technologies (ICT) has been developing virtual reality, artificial intelligence, and other technologies to treat a variety of physical and mental health disorders since 1999.
“The technology was Stone Age when I first got involved,” said Albert “Skip” Rizzo, a psychologist and director of medical virtual reality at ICT, recalling his experimentation with an Apple IIe and a Game Boy handheld system.
He now works with VR and AR headsets from Oculus, HP, and Magic Leap.
Rizzo helped develop Bravemind, a virtual reality exposure therapy aimed at alleviating post-traumatic stress (PTS) in veterans of the Iraq and Afghanistan wars.
During exposure therapy, a patient faces his or her trauma memories through simulations of their experiences, aided by a qualified therapist.
Wearing a headset, the patient can be immersed in a range of virtual scenarios, including a Middle Eastern city and desert road landscapes.
Using a keyboard, “patients simulate individuals, insurgents, explosions, even odors, and vibrations,” according to Rizzo.
As an alternative to traditional talk therapy, patients can confront a scenario in a secure virtual world rather than relying solely on imagination.
More than a dozen Veterans Affairs hospitals now offer the evidence-based Bravemind therapy, which has been shown to significantly reduce PTS symptoms.
More randomized controlled trials are in the works.
The health-care industry remains a real-life proving ground while Big Tech, alongside software and hardware companies, academia, and other R&D partners, continues to build out the metaverse.
On the advisory firm’s website, Paulo Pinheiro, head of software at Cambridge, U.K.-based Sagentia Innovation, said, “While the metaverse is still in its infancy, it holds immense potential for the reform and improvement of health care. It’ll be intriguing to see how things play out.”