How Crisis Text Line crossed the line in the public’s mind: Lock and Code S03E05

Credit to Author: Malwarebytes Labs | Date: Mon, 28 Feb 2022 16:55:48 +0000

Last month, Politico reported that Crisis Text Line, a national mental health support nonprofit whose volunteers help people through text-based chats, was sharing those chats with a for-profit company that Crisis Text Line spun off in an attempt to boost funding for itself. That for-profit venture, called Loris.AI, received “anonymized” conversational data from Crisis Text Line, which Loris.AI would use to hone its product—a customer support tool.

The thinking behind this application of data went a little something like this: Companies all over the world have trouble dealing with difficult customer support conversations. Crisis Text Line had trained an entire volunteer force to handle a broad range of difficult conversations. What if the lessons from those conversations could be gleaned from the data trails they left behind? What if the lessons could be taught to a product, which would in turn help customer support representatives deal with angry customers?

But that setup, once exposed by Politico, infuriated many members of the public. Some thought it was wrong to keep conversational data, period. Some thought it was wrong to allow outside researchers to study the data of texters and the volunteers who support them. And some were primarily upset with the application of this data to bolster a for-profit venture.

Today, to help us understand that anger and to dive into data privacy principles for crisis support services, we’re speaking with Courtney Brown, the former director of a suicide hotline network that was part of the broader National Suicide Prevention Lifeline.

Interestingly, during her time with her suicide hotline network, Brown consulted with Crisis Text Line on the evaluation of its volunteer training program during the organization’s first year.

For Brown, the problem with Crisis Text Line is clear: The use of the data was never shown to help anyone in any way that hadn’t already been discovered in prior suicide prevention research.

“[Crisis Text Line is] acting like there is a social good, that there must be—there must be a social good somewhere in here. But seriously, what is it? Tell me what it is. Maybe I’ll reevaluate it if you can tell me how using this data is different from using all of the other data that’s been collected about suicide prevention.”

Tune in to hear all this and more on this week’s Lock and Code podcast by Malwarebytes Labs.


You can also find us on Apple Podcasts, Spotify, and Google Podcasts, plus whatever podcast platform you prefer.


