Healthcare wearables have been around for quite a while. In the late 1980s, the invention of digital hearing aids marked the beginning of the industry’s digitalization and gave rise to a whole new category of medical devices. Since then, wearables have taken on many forms and learned to perform an ever-wider range of tasks.
Today, powered by smart sensors, artificial intelligence, and machine learning technologies, wearable devices are being actively adopted for:
- Remote patient monitoring
- Early diagnostics
- Monitoring chronic conditions
- Enhancing everyday health and lifestyle patterns
- And even handling emergency health situations
The use of intelligent wearables in the healthcare sector has been growing. These devices are forecast to lead to industry cost savings of about $200 billion in the US in the coming years. The global intelligent wearables market is projected to reach $180 billion by 2025.
Overall, thanks to their mobility, autonomy, accuracy, and ease of use, AI-powered wearables can be wonderful aids for healthcare providers. In the era of COVID-19, when access to proper healthcare facilities has become all the more difficult, AI wearables are especially important.
Sounds too good to be true? Before we proceed, let’s pause for a reality check!
Are Wearables Really Capable of Using AI?
It’s important to separate the concept of Artificial General Intelligence (AGI) from the application of AI subcomponents.
AGI, aka strong AI, refers to a machine’s ability to perform intellectual tasks and make data-driven decisions the way a human being would. Even today’s most ambitious systems, such as self-driving cars, large robotics platforms, and industrial predictive maintenance solutions, only approximate this in narrow domains.
For healthcare wearables and IoT implementations, we’re dealing with weak (narrow) AI. It enables gadgets to handle specific reasoning or problem-solving tasks without attempting human-like general cognition. Speech recognition, visual perception, and even limited decision-making can all appear in wearable AI applications, but none of them approaches human-level performance.
So, don’t expect AI-powered wearables to perform full-fledged diagnostics or write a prescription based on your health conditions. These devices are unlikely to substitute doctors in the near future.
But despite that, even now, the combination of wearable technologies and AI can greatly assist doctors in their day-to-day activities and improve patients’ lives. Here’s how.
Remote Heart Monitoring with AI and Sound
AI elements have already been used in cardiology for about a decade. For example, AI was successfully integrated into the back end of cardiology imaging and diagnostic systems to optimize staff onboarding, workflows, and reporting.
Now, machine learning algorithms can be found in wearable heart monitors and smartphone apps, allowing clinicians to monitor heart conditions remotely.
Eko, a California-based startup that became famous for its digital stethoscopes, introduced an AI-powered platform that consists of advanced heart monitors, patient and provider software, and AI-powered reporting.
The platform helps identify early signs of structural heart disease and atrial fibrillation and shares actionable data with a patient’s clinician.
The AI algorithms analyze the patient’s heart condition through a single-lead ECG and heart sounds. Trained on large datasets of heart sound recordings, they use a deep neural network for sound analysis: it separates individual cardiac beats, picks up the tissue vibration caused by turbulent blood flow, and identifies murmurs that are hard for the human ear to detect.
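To give a feel for the signal-processing side, here is a toy beat-screening sketch in Python. It is not Eko’s model: it assumes beat onsets are already detected, and it uses a crude first-difference high-pass filter instead of a deep neural network, purely to show why murmur energy is separable from the low-pitched heart tones.

```python
import math

def high_freq_ratio(segment):
    """Ratio of first-difference energy to total energy.
    The first difference acts as a crude high-pass filter: turbulent
    murmur noise sits at higher frequencies than the low-pitched
    S1/S2 heart tones, so a high ratio is suspicious."""
    diff = [b - a for a, b in zip(segment, segment[1:])]
    e_hi = sum(d * d for d in diff)
    e_all = sum(s * s for s in segment) or 1e-12
    return e_hi / e_all

def screen_beats(samples, rate, beat_times, threshold=0.2):
    """Flag inter-beat segments whose high-frequency energy ratio
    exceeds the threshold. beat_times are beat onsets in seconds,
    assumed already produced by an upstream detector."""
    flagged = []
    for start, end in zip(beat_times, beat_times[1:]):
        seg = samples[int(start * rate):int(end * rate)]
        if len(seg) > 1 and high_freq_ratio(seg) > threshold:
            flagged.append((start, end))
    return flagged

# Two synthetic one-second "beats": a clean 50 Hz tone, then one with
# a 400 Hz component standing in for turbulent blood flow.
rate = 2000
clean = [math.sin(2 * math.pi * 50 * t / rate) for t in range(rate)]
noisy = [math.sin(2 * math.pi * 50 * t / rate)
         + 0.8 * math.sin(2 * math.pi * 400 * t / rate) for t in range(rate)]
samples = clean + noisy
print(screen_beats(samples, rate, [0.0, 1.0, 2.0]))  # → [(1.0, 2.0)]
```

Only the second beat gets flagged, because the high-frequency component dominates its energy budget, which is exactly the cue a trained sound model exploits at much finer granularity.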
The platform and all its devices have FDA clearance, and the algorithms are HIPAA-compliant. At the moment, Eko aims to expand its platform to more healthcare providers and launch an at-home monitoring program for cardiopulmonary patients.
Predicting Women’s Fertility with AI and Big Data
AI is a tool that can eliminate guesswork from almost any hard-to-forecast process, including... calculating women’s fertility peaks.
Ava, a health company based in Zurich and San Francisco, introduced a solution that analyzes multiple health parameters to precisely identify the opening and closing of a woman’s fertile window.
The solution itself consists of a smart sensor bracelet, a mobile app, and a robust backend with AI and machine learning algorithms. Bound together, they collect data on hormonal changes, ovulation cycle, body temperature, and other physiological parameters. The data is accumulated in a big database, processed, and analyzed using AI that delivers a report with insights and recommendations.
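As an illustration of the temperature component alone, here is the classic “three-over-six” basal body temperature rule in Python. Ava’s actual model fuses many more signals through trained algorithms; this sketch is purely didactic, and the sample readings are made up.

```python
def fertile_window(temps, cycle_start_day=1):
    """Toy fertile-window estimate from daily basal body temperature.
    The 'three-over-six' rule infers ovulation on the first of three
    consecutive days whose temperature exceeds the maximum of the
    previous six days. The window returned is the five days before the
    shift plus the shift day itself. Returns (start_day, end_day) or
    None if no sustained temperature rise is found."""
    for i in range(6, len(temps) - 2):
        baseline = max(temps[i - 6:i])
        if all(t > baseline for t in temps[i:i + 3]):
            shift_day = cycle_start_day + i
            return (shift_day - 5, shift_day)
    return None

# Hypothetical cycle: flat readings, then a sustained rise on day 14.
temps = [36.4, 36.5, 36.4, 36.3, 36.4, 36.5, 36.4,
         36.3, 36.4, 36.4, 36.5, 36.4, 36.4,
         36.7, 36.8, 36.8, 36.9, 36.8]
print(fertile_window(temps))  # → (9, 14)
```

A wearable’s advantage over this pencil-and-paper rule is that it samples continuously overnight and can correlate temperature with pulse, heart rate variability, and other signals, which is where the machine learning comes in.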
This technology has already been tested and validated in multiple clinical trials.
In addition, the Ava bracelet’s AI capabilities can be used for menstrual cycle analysis and pregnancy monitoring.
AI Wearables for the Blind and Visually Impaired
A couple of years ago, Sunu, a US-based healthcare startup, introduced the Sunu Band, a groundbreaking healthcare wearable. The band was designed to help blind or visually impaired people navigate their environment and avoid collisions with surrounding obstacles.
With the help of sonar and echolocation, the Sunu Band detects objects up to 16 feet (5.5 meters) away. Haptic vibration feedback then tells the user how close (or far away) they are from those objects.
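The core signal chain is simple enough to sketch: an ultrasonic pulse’s echo delay gives a distance, which is then mapped to vibration strength. A minimal illustration in Python, with the linear mapping being our own assumption rather than the device’s actual firmware behavior:

```python
SPEED_OF_SOUND = 343.0  # m/s in air
MAX_RANGE = 5.5         # the band's stated ~5.5 m (16 ft) detection range

def echo_to_distance(echo_delay_s):
    """Sonar round trip: the pulse travels to the obstacle and back,
    so distance is half the delay times the speed of sound."""
    return echo_delay_s * SPEED_OF_SOUND / 2

def haptic_intensity(distance_m):
    """Map distance to a vibration strength in [0, 1]:
    the closer the obstacle, the stronger the feedback."""
    if distance_m >= MAX_RANGE:
        return 0.0
    return round(1.0 - distance_m / MAX_RANGE, 3)

print(echo_to_distance(0.01))    # 10 ms echo → 1.715 m
print(haptic_intensity(2.75))    # mid-range obstacle → 0.5
print(haptic_intensity(6.0))     # out of range → 0.0
```

No machine learning is needed for this loop, which is precisely the point the article makes: the band itself is not AI-driven, but it proved the interaction model that AI-powered successors build on.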
The device became quite popular in the US market. And although it doesn’t contain AI components, it laid the foundation for a whole new category of healthcare wearables: wearables for the visually impaired. This category is evolving, and one of the first solutions in this class to adopt AI technology is... Google Glass!
Google Glass is Dead! Long Live Google Glass!
Yes, you read that right. Google Glass is still alive. The second edition has fully updated hardware and was redesigned primarily for business needs. The device is meant to help hands-on workers with glanceable, voice-activated assistance.
Also, with Envision’s AI software toolkit installed, Google Glass 2 becomes an excellent tool for blind and low-vision people, helping them understand their surroundings. It uses the built-in camera to capture objects and AI software components to recognize and describe them to the user.
Envision’s software is based on Optical Character Recognition (OCR) technology, which the company claims is among the fastest and most accurate currently available.
With the app, the Glass can read text from posters, menus, food packaging, display screens, QR codes and barcodes, and even some handwritten text. The device can also recognize faces, identify colors and object shapes, describe surrounding scenes, and read complex documents.
The application is multi-lingual (it supports 60 languages) and has an intuitive voice recognition user interface.
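The output stage of such an assistive app can be pictured as a prioritized narration step: whatever the vision model detects gets compressed into one short sentence for text-to-speech. A sketch under our own assumptions (the detection format and the text-first priority order are hypothetical, not Envision’s API):

```python
def describe_frame(detections):
    """Turn detections from a (hypothetical) on-device vision model
    into one short sentence for text-to-speech. An assistive app must
    prioritize what it reads out; here we assume: text first, then
    recognized people, then nearby objects."""
    texts = [d["value"] for d in detections if d["kind"] == "text"]
    faces = [d["value"] for d in detections if d["kind"] == "face"]
    objects = [d["value"] for d in detections if d["kind"] == "object"]
    parts = []
    if texts:
        parts.append("Text: " + " ".join(texts))
    if faces:
        parts.append("People: " + ", ".join(faces))
    if objects:
        parts.append("Around you: " + ", ".join(objects))
    return ". ".join(parts) or "Nothing recognized"

detections = [{"kind": "text", "value": "EXIT"},
              {"kind": "face", "value": "Anna"},
              {"kind": "object", "value": "a chair"}]
print(describe_frame(detections))
# → Text: EXIT. People: Anna. Around you: a chair
```

The hard part, of course, is producing those detections in the first place; the narration layer only decides what is worth a user’s limited listening time.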
AI that Helps Deaf People ‘Feel’ the Environment
Starkey Hearing Technologies, a healthcare company based in Minnesota, introduced Livio Edge AI, a multifunctional hearing aid equipped with gesture-recognition and natural language processing capabilities.
As the company reports, hearing-impaired people often lose part of their sense of space and balance. This puts them at greater risk of falling, which can cause serious injury.
To compensate for these weakened senses, the hearing aid uses 3D motion and gesture detection sensors to capture the surrounding world. The data is then analyzed by the AI-powered app, which makes smart, immediate adjustments to the sound.
This approach gives hearing-impaired people a much fuller perception of their environment.
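One concrete thing the motion sensors enable is fall detection, a feature Starkey advertises for this device. A toy version of the underlying signal logic, with made-up thresholds (the production model is trained on far richer features):

```python
import math

def detect_fall(readings, impact_g=2.5, rest_tol=0.2, rest_window=5):
    """Toy fall detector over 3-axis accelerometer readings (in g):
    look for a sharp impact spike followed by a stretch of lying still,
    i.e. magnitude near 1 g (gravity only). Returns the index of the
    impact sample, or None if no fall pattern is found."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in readings]
    for i, m in enumerate(mags):
        if m >= impact_g:
            after = mags[i + 1:i + 1 + rest_window]
            if len(after) == rest_window and all(abs(a - 1.0) <= rest_tol
                                                 for a in after):
                return i
    return None

# Hypothetical readings: gentle walking, a hard impact, then stillness.
walking = [(0.0, 0.0, 1.0), (0.2, 0.1, 1.0), (0.1, 0.0, 1.1)]
fall = walking + [(1.8, 1.5, 1.9)] + [(0.0, 0.0, 1.0)] * 5
print(detect_fall(fall))     # → 3
print(detect_fall(walking))  # → None
```

Requiring stillness after the spike is what keeps a dropped phone-style jolt from triggering an alert; trained models refine the same idea with orientation, context, and user history.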
And this hearing aid looks slick too.
AI to Simplify the Lives of Patients with Chronic Conditions
According to a study carried out by Twilio, COVID-19 has sped up digital transformation by 5.3 years! It has affected all industries, including healthcare. The pandemic has been a digital “accelerant” for healthcare developers: a wave of revolutionary telehealth solutions has reached the market, meeting little or no resistance from healthcare administrators.
Current Health is one such solution. It entered the market in April, after receiving FDA clearance. It is a single enterprise platform designed to capture patient health data at home. The platform’s engine is a set of AI-powered algorithms that analyze the collected data and quickly identify signs of health deterioration. The system notifies doctors about severe cases and allows them to intervene using built-in patient engagement and telemedicine tools.
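To make “identify health deterioration” concrete, here is a simplified early-warning score in Python, loosely modeled on NEWS-style vital-sign thresholds. Current Health’s production algorithms are proprietary; this is only a sketch of the general pattern of scoring vitals and alerting above a threshold.

```python
def warning_score(vitals):
    """Simplified early-warning score: each vital contributes 0-3
    points depending on how far it strays from the normal range.
    Thresholds loosely follow NEWS-style clinical scoring."""
    score = 0
    hr = vitals["heart_rate"]      # beats per minute
    if hr <= 40 or hr >= 131:
        score += 3
    elif 111 <= hr <= 130:
        score += 2
    elif hr <= 50 or 91 <= hr <= 110:
        score += 1
    spo2 = vitals["spo2"]          # blood oxygen saturation, %
    if spo2 <= 91:
        score += 3
    elif spo2 <= 93:
        score += 2
    elif spo2 <= 95:
        score += 1
    rr = vitals["resp_rate"]       # breaths per minute
    if rr <= 8 or rr >= 25:
        score += 3
    elif 21 <= rr <= 24:
        score += 2
    elif 9 <= rr <= 11:
        score += 1
    return score

def should_alert(vitals, threshold=5):
    """Notify the care team once the aggregate score crosses a threshold."""
    return warning_score(vitals) >= threshold

print(should_alert({"heart_rate": 70, "spo2": 98, "resp_rate": 16}))   # → False
print(should_alert({"heart_rate": 125, "spo2": 90, "resp_rate": 26}))  # → True
```

The value of a wearable platform is that this kind of check runs continuously on streamed vitals, so deterioration is caught between scheduled check-ins rather than at them.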
NHS hospitals at Dartford and Gravesham have partnered with Current Health to launch a pilot program for post-discharge monitoring of patients with chronic diseases. Participants received Wi-Fi-connected armbands to record their vitals, along with tablets running chatbots that deliver medication reminders, alerts, and updates on their health conditions, and let them live-chat with doctors whenever needed.
The results of the pilot were impressive: unnecessary home visits dropped by 22%, letting care teams dedicate more effort to patients who were genuinely at risk. The system also proved effective during the pandemic as a way to reduce the risk of spreading infection among vulnerable patient groups.
Bottom Line
A combination of wearable and AI components promises long-term positive growth for the healthcare industry.
The advancement of this medical device type is supported by concurrent technological progress in both software and hardware, and the increased market demand caused by the COVID-19 pandemic.
And this is just the beginning. The possible applications of AI-powered wearables hold immense potential, and they will continue to rock the industry.