Independent gatekeepers must regulate mental health apps


From individuals to employers, there is growing interest in using digital services to help people overcome mental health challenges. That’s a reasonable approach, given how difficult and expensive it can be to find and work with an in-person therapist. But digital therapeutics currently represent a kind of “Wild West,” and gatekeepers need to be far more methodical in how they evaluate these products.

Amid a strong backlash against technology, with congressional scrutiny on topics ranging from privacy to the moderation of online content, many tech companies are now asking people to trust them in new and more intimate ways as they move into health care. Some digital health insiders have called for change from within, but far less is said about how people outside the digital health industry can help ensure that users of these new technologies are protected from the risks they pose.

Without outside oversight, some businesses will take shortcuts. I saw the risks and dangers of this at the company I co-founded, Modern Health, which The Information recently reported on. Some startups seem willing to risk patient safety and provide inadequate care in the hope of getting rich quickly.


I think these problems can be solved, but not by relying on tech companies’ promises to police themselves. Here are some key solutions, rarely discussed, that don’t depend on startup self-regulation.

App stores must enforce standards

At the highest level, the review processes in the major app stores need to be strengthened. A large number of apps in the Google and Apple stores have dodged Food and Drug Administration regulations aimed at ensuring patient safety by listing themselves in the app stores’ “health and fitness” category, which does not require FDA review, rather than identifying themselves as “medical” apps, which the FDA oversees.


By avoiding more rigorous scrutiny, health and fitness apps can cause problems ranging from health privacy breaches to worse. A 2019 study of suicide prevention apps, for example, found that phone numbers for suicide helplines were often listed incorrectly, or not listed at all. Those apps had already been downloaded more than 2 million times. It is not difficult to imagine that lives could have been lost.

Apple’s and Google’s mobile business models allow these companies to earn substantial commissions on every app purchased, so these two giant vendors should also take responsibility for categorizing apps correctly. Startups will never volunteer for more rigorous app store review, so the gatekeepers, Apple and Google, need to do this work in addition to profiting from it.

Companies and HR managers face serious new risks

Oversight of digital health apps should also come from the human resources departments of companies that decide which digital health services to purchase and provide as a benefit to their employees. Many digital health apps, while aimed at individual consumers as end users, have business models built around companies large and small purchasing the apps for their employees. This market is currently estimated at $20 billion.

When I worked at Modern Health, HR managers bought the service for their employees, and I saw many of those employees become our patients. Early on, I was often the first person patients met, helping them connect to care; later, I built systems to direct patients to the right care. Many faced difficult challenges; some were suicidal. Because some workers have serious mental health issues, HR teams need to up their game and conduct rigorous reviews when selecting digital health services.

Until now, choosing health insurance has been a relatively low-risk decision, because the offline health care system is heavily regulated by the government to ensure safety. Medical treatments begin in research labs overseen by ethics committees, are tested in clinical trials, and are ultimately approved by the FDA. In the digital world, a new health platform can be cooked up in someone’s garage and brought online without any external scrutiny. Some of these apps use terms like “evidence-based,” but that label is often meaningless.

Making smart choices is critical, albeit difficult, for HR teams, because the old ways of doing due diligence don’t work. Many companies simply want to know whether a digital health service will be readily adopted and used by a large number of employees, but that is like hoping employees have to keep coming back to the doctor. How often employees use a service is not the same thing as their being in good health, and high usage is often driven by very light-touch interventions, such as sending users self-care articles or offering them meditation classes, that don’t address underlying health issues. In fact, these light but high-engagement offerings can get in the way of the real care people need to recover, and can even discourage them from seeking appropriate treatment.

While the evaluation methods of HR departments, where they exist, don’t work well for digital health, the scientific and medical communities have a solution: peer-reviewed research published in established scientific journals that tests the app in question. A careful read of these papers, which quality companies can provide to prospective buyers, offers some assurance that the service being purchased does what it claims.

Business buyers should contact physicians or other health experts, not ones affiliated with the app being evaluated, and ask for help interpreting the independent analyses. An app’s performance once it is deployed in the workplace should be judged the same way. The primary metric shouldn’t be how many people are using an app, but how well it works.

Just as physicians change their practices as medicine evolves, HR departments must weigh new research and respond accordingly when deciding whether to keep a digital health service or make a change.

Give patients a protected voice

It can be frightening for some people to post reviews of an app that essentially admit they have a serious mental health issue, one the app has improved, failed to improve, or made worse. The same goes for people with diabetes, cancer, or most other conditions.

But this feedback is absolutely necessary. That’s why Apple and Google, as app store owners, along with digital health app developers and HR teams, need to create better feedback mechanisms that protect user privacy while still giving users a voice.

Reviews and complaints must also remain permanent. Apple’s App Store currently allows tech companies to wipe all reviews whenever they release new versions of their apps. That can make sense when bug fixes or new features are added to simple apps, but when an app is plugged into real-world care, digital therapeutics companies shouldn’t be allowed to erase reports of serious medical failures just because they’ve changed the color of a few buttons.

It should be a bright red flag if an app has significantly fewer reviews than a similarly sized competitor.

We have to work to get it right

Health care, and especially mental health care, is hard to come by.

It’s hard to fault mission-driven digital therapeutics companies, especially when so many people are hurting. That said, it is precisely because of the need for better physical and mental health care that we, as a society, must work collectively to get it right. At the end of the day, the public and the tech industry are on the same side.

Most tech companies want to do the right thing, just as everyone hopes they do. Society will benefit tremendously from technological innovation, especially in health technology. But the stakes are rising, and these companies must act accordingly.

Erica Johnson is an advisor to several digital health and wellness companies and co-founder of Modern Health.
