Mental Health Apps Are Not Keeping Your Data Safe

With little regulation and sometimes outright deception, the possibility of discrimination and other “data harms” is high

Imagine calling a suicide prevention hotline in a crisis. Do you ask for their data collection policy? Do you assume that your data are protected and kept secure? Recent events may make you consider your answers more carefully.

Loris AI, a company that uses artificial intelligence to develop chatbot-based customer service products, used data from more than 100 million Crisis Text Line exchanges to, for example, help service agents understand customer sentiment. Loris AI has reportedly deleted the data it received from Crisis Text Line, although whether that deletion extends to the algorithms trained on those data is unclear.

We surveyed 132 studies that tested automation technologies, such as chatbots, in online mental health initiatives. In 85 percent of the studies, the researchers did not address, either in the study design or in reporting the results, how the technologies could be used in harmful ways. This was despite some of the technologies raising serious risks of harm.

In policy, most U.S. states give special protection to typical mental health information, but emerging forms of data concerning mental health appear only partially covered. Regulations such as the Health Insurance Portability and Accountability Act do not apply to direct-to-consumer health care products, including the technology that goes into AI-based mental health products.

Crunch enough data points about a person’s behavior, the theory goes, and signals of ill health or disability will emerge. Such sensitive data create new opportunities for discriminatory, biased and invasive decision-making about individuals and populations.
