VR / AR in healthcare: technology trends


VR / AR technologies have traditionally been associated with the entertainment industry, with virtual and immersive environments built for gamers, product designers, and architects. However, rising healthcare spending and the need for advanced technologies to support the development of new therapies and diagnostics have fueled demand for this technology in the healthcare industry.

Below are the top technology trends impacting VR / AR in the healthcare industry, as identified by GlobalData.

AR as the next big computing platform

AR glasses of some form could ultimately replace the smartphone as the primary connected device that users carry with them. A first-mover advantage in AR technology would be a game-changer, which is why all the big tech companies are taking it so seriously. Apple, the world’s most profitable smartphone maker, is potentially the most vulnerable in the long run.

Mobile AR

Augmented reality software or applications could be installed on more than 3.5 billion smartphones by 2022, according to Digi-Capital. A 2019 survey by the same company of AR companies and businesses found that 78% of respondents identified mobile as the most important AR platform. The existing smartphone ecosystem, consumers’ familiarity with smartphones, and the improved computing capabilities of these devices are the main drivers of mobile AR platforms. Apple and Google are the main potential beneficiaries of the growth in mobile AR, given their strong smartphone ecosystems and well-established AR software development kits (SDKs).

Augmented reality cloud

The AR cloud refers to a real-time 3D virtual map overlaid on the real world. It allows multiple users and devices to share augmented reality experiences. In practice, the AR cloud promises persistent content that multiple users can access, individually or collectively. Real-time tagging of virtual content to physical locations will propel AR beyond device boundaries and make the AR experience more natural and intuitive. Google, Apple, Microsoft, Facebook, Amazon, Magic Leap, and Samsung are all big investors in the AR cloud.
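The idea of persistent, shared content can be sketched in a few lines of code. The snippet below is a minimal, hypothetical model of a "cloud anchor": virtual content pinned to a physical pose so that any device resolving the same anchor ID renders it in the same place. The class names, fields, and file name are illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass
from typing import Dict, Tuple
import uuid

# Hypothetical record for a persistent AR anchor: virtual content pinned
# to a physical pose (position + orientation) that any authorized device
# can resolve. Names and fields are illustrative, not a vendor API.
@dataclass
class CloudAnchor:
    anchor_id: str                                    # shared identifier resolved by all devices
    position: Tuple[float, float, float]              # metres in a common world frame
    orientation: Tuple[float, float, float, float]    # quaternion (x, y, z, w)
    payload: Dict[str, str]                           # e.g. {"model": "heart_ct_overlay.glb"}

class AnchorRegistry:
    """Toy in-memory stand-in for the cloud service that stores anchors."""

    def __init__(self) -> None:
        self._anchors: Dict[str, CloudAnchor] = {}

    def host(self, position, orientation, payload) -> str:
        """Device A pins content to a real-world pose and shares the ID."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = CloudAnchor(anchor_id, position, orientation, payload)
        return anchor_id

    def resolve(self, anchor_id: str) -> CloudAnchor:
        """Device B looks up the same anchor and renders the content in place."""
        return self._anchors[anchor_id]

# Usage: one headset hosts an overlay, a second headset resolves it.
registry = AnchorRegistry()
shared_id = registry.host((1.2, 0.0, -0.5), (0.0, 0.0, 0.0, 1.0),
                          {"model": "heart_ct_overlay.glb"})
print(registry.resolve(shared_id).payload)
```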

Conversation platforms

Smartphones today are usually equipped with virtual assistants built into the operating system. AR headsets and smart glasses are starting to incorporate similar capabilities. Microsoft’s Cortana in HoloLens 2, Amazon’s Alexa in Focals by North, and Google Assistant in Bose AR smart glasses are just a few of the early iterations of voice-enabled AR devices. AI-enabled and voice-activated virtual assistants enable hands-free operation of the device, which can be essential for some augmented reality applications.
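The hands-free pattern can be illustrated with a rough sketch: a transcribed utterance is matched to an action the headset performs without the user touching the device. The phrases and actions below are invented for the example; a real assistant would sit behind a speech-recognition service and a far richer intent model.

```python
# Minimal sketch of hands-free voice control: a transcribed utterance is
# matched to an action the headset can perform. Phrases and actions are
# illustrative, not taken from any particular assistant.
COMMANDS = {
    "show patient scan": lambda: print("Rendering CT overlay"),
    "next slide": lambda: print("Advancing surgical checklist"),
    "call radiology": lambda: print("Starting remote consult"),
}

def handle_utterance(text: str) -> None:
    action = COMMANDS.get(text.strip().lower())
    if action:
        action()          # executed without the user touching the device
    else:
        print(f"Unrecognised command: {text!r}")

handle_utterance("Show patient scan")
```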

Increasing degrees of freedom

One of the main issues with using VR and AR technologies is nausea among inexperienced users, caused by the mismatch between the movement the user’s eyes perceive and the stillness of the user’s body. To address this, VR headsets have moved from three degrees of freedom (rotational tracking of pitch, yaw, and roll via gyroscopes and accelerometers) to six, adding positional tracking so that the rendered view also follows the user’s movement through space and more closely matches what the body feels.
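The difference between the two tracking modes can be made concrete with a small sketch. Below, the same virtual point is transformed into the headset's view first with orientation only (3DoF) and then with orientation plus position (6DoF), so that leaning or stepping forward changes what the user sees. The maths is a generic view transform written for illustration, not any headset's runtime.

```python
import numpy as np

# Illustrative sketch of 3DoF vs 6DoF tracking. A 3DoF headset knows only
# its orientation (pitch, yaw, roll); a 6DoF headset also knows its
# position, so the rendered view shifts when the user leans or walks.

def euler_to_matrix(pitch: float, yaw: float, roll: float) -> np.ndarray:
    """Rotation matrix from Euler angles in radians (x-y-z convention)."""
    cx, sx = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cz, sz = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return rz @ ry @ rx

def view_of_point(point, orientation, position=None):
    """Transform a world-space point into the headset's view.

    With only `orientation` (3DoF) the point never shifts with body
    movement; with `position` as well (6DoF) leaning forward changes it.
    """
    r = euler_to_matrix(*orientation)
    offset = np.asarray(point, dtype=float)
    if position is not None:
        offset = offset - np.asarray(position, dtype=float)
    return r.T @ offset

point = (0.0, 0.0, -2.0)                                          # virtual object 2 m ahead
print(view_of_point(point, (0.0, 0.1, 0.0)))                      # 3DoF: head turn only
print(view_of_point(point, (0.0, 0.1, 0.0), (0.0, 0.0, -0.3)))    # 6DoF: head turn plus a step forward
```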

Haptic feedback

Since “tool simulators” are likely to be an important part of the future VR space, the user must be able to “feel” the tool they are holding for maximum fidelity. To this end, companies such as ImmersiveTouch have developed surgical tool simulators that provide haptic feedback.
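A common way to render the "feel" of a tool is penalty-based force feedback: when the virtual tool tip penetrates a surface, the haptic device pushes back with a force proportional to the penetration depth. The sketch below is a generic illustration of that idea under assumed stiffness values, not ImmersiveTouch's implementation.

```python
import numpy as np

# Generic penalty-based haptic rendering sketch (not any vendor's code):
# when the virtual tool tip penetrates a surface, push back along the
# surface normal with a spring force proportional to penetration depth.
STIFFNESS = 800.0   # N/m, a typical order of magnitude for haptic devices

def feedback_force(tip_position, surface_point, surface_normal):
    """Return the force vector the haptic device should apply."""
    normal = np.asarray(surface_normal, dtype=float)
    normal = normal / np.linalg.norm(normal)
    # Signed penetration depth of the tool tip below the surface plane.
    depth = np.dot(np.asarray(surface_point) - np.asarray(tip_position), normal)
    if depth <= 0.0:
        return np.zeros(3)              # tip is above the surface: no contact
    return STIFFNESS * depth * normal   # Hooke's law pushes the tip back out

# Tool tip 2 mm below a horizontal surface at y = 0 -> ~1.6 N upward force.
print(feedback_force((0.0, -0.002, 0.0), (0.0, 0.0, 0.0), (0.0, 1.0, 0.0)))
```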

Lower costs

As VR and AR mature as technologies, the cost of purchasing a new set of VR equipment is steadily decreasing, which should help adoption in the healthcare industry as the technology becomes more mainstream.

Finger tracking

As virtual reality and augmented reality mature, the level of detail at which the user can interact with the virtual “world” becomes more sophisticated. Valve, for example, has been testing its Knuckles (EV3) controllers, which track the position of each of the user’s fingers individually. Where users could previously only grab or drop a tool, they can now hold a tool with a few fingers, pinch it, and so on; the controllers can even detect which fingers are gripping more firmly than others.
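The kind of per-finger data such controllers expose can be sketched as a curl value (how bent the finger is) and a force value per finger, from which an application distinguishes a whole-hand grab from a pinch. The data layout, threshold, and gesture names below are illustrative assumptions, not the SteamVR / OpenVR API.

```python
from dataclasses import dataclass
from typing import Dict

# Illustrative per-finger input, not the actual SteamVR / OpenVR API.
# curl: 0.0 = fully extended, 1.0 = fully bent; force: 0.0..1.0 grip pressure.
@dataclass
class FingerState:
    curl: float
    force: float

def classify_grip(fingers: Dict[str, FingerState]) -> str:
    """Distinguish a whole-hand grab from a thumb-index pinch."""
    curled = [name for name, f in fingers.items() if f.curl > 0.6]
    if {"thumb", "index"} <= set(curled) and len(curled) == 2:
        return "pinch"
    if len(curled) >= 4:
        return "grab"
    return "open"

hand = {
    "thumb":  FingerState(curl=0.8, force=0.5),
    "index":  FingerState(curl=0.7, force=0.4),
    "middle": FingerState(curl=0.1, force=0.0),
    "ring":   FingerState(curl=0.1, force=0.0),
    "pinky":  FingerState(curl=0.1, force=0.0),
}
print(classify_grip(hand))                 # -> "pinch"
# Per-finger force also tells the application which fingers grip hardest.
firmest = max(hand.items(), key=lambda kv: kv[1].force)[0]
print(f"Firmest finger: {firmest}")
```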

5G

5G can improve many aspects of VR / AR in healthcare, from medical imaging to remote consultations and patient follow-up. Verizon recently partnered with Medivis at its 5G lab in New York, combining its 5G technology with Medivis’ team of surgeons, radiologists, and engineers to accelerate development of the Medivis platform and bring VR / AR to the forefront of healthcare.

Brain-computer interface

The brain-computer interface (BCI) involves devices that allow users to interact with computers solely through brain activity, typically measured by electroencephalography (EEG). Combining BCIs with VR / AR can provide additional communication channels by increasing the bandwidth of human-VR / AR interaction. This is done either explicitly, through active BCIs, or implicitly, using passive BCIs. Active BCIs allow users to issue commands to devices or enter text without physical movement, while passive BCIs monitor a user’s state and can be used to proactively adapt the VR / AR interface.
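The passive case can be illustrated with a rough sketch: estimate alpha-band (8-12 Hz) power from a short EEG window and use a simple threshold as a stand-in for a relaxation signal that a VR / AR interface could adapt to. The signal below is synthetic and the threshold arbitrary; real passive BCIs rely on calibrated classifiers over multi-channel EEG.

```python
import numpy as np

# Passive-BCI illustration with synthetic data: estimate alpha-band
# (8-12 Hz) EEG power and treat it as a crude relaxation signal that a
# VR / AR interface could adapt to. This is only a sketch.
FS = 250                        # sampling rate in Hz
t = np.arange(0, 2.0, 1 / FS)   # two seconds of one EEG channel

# Synthetic channel: 10 Hz alpha rhythm plus broadband noise.
rng = np.random.default_rng(0)
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

def band_power(signal: np.ndarray, fs: int, low: float, high: float) -> float:
    """Mean spectral power of `signal` between `low` and `high` Hz."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    band = (freqs >= low) & (freqs <= high)
    return float(spectrum[band].mean())

alpha = band_power(eeg, FS, 8.0, 12.0)
baseline = band_power(eeg, FS, 20.0, 40.0)

# Arbitrary rule: strong alpha relative to higher frequencies suggests a
# relaxed user, so the interface could, for example, reduce on-screen prompts.
if alpha > 5 * baseline:
    print("User appears relaxed: simplify the VR interface")
else:
    print("User appears engaged: keep the full interface")
```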

BCIs with VR / AR offer the possibility of immersive situations through induced illusions of an artificially perceived reality, which can be used not only in basic BCI research but also in therapeutic applications. In September 2019, Facebook acquired the neural interface startup CTRL-labs, which makes a wristband that reads electrical signals from the nerves in the wrist and translates them into computer input.

This is an edited excerpt from the Virtual / augmented reality in healthcare – Thematic research report produced by GlobalData Thematic Research.

