It is already possible, today, to monitor social media for messages from people sharing their suicidal thoughts and feelings, to geolocate those individuals, and then to send rescue to their door within minutes of their first suicidal message. IT professionals know this; it is the mental health community that is just figuring it out. IT professionals are already developing this technology. When they “throw the switch” and begin to use it, mental health needs to be ready to respond effectively.
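To make the pipeline concrete — scan posts, flag risk language, route flagged posts to a responder — here is a deliberately minimal sketch. Everything in it is hypothetical: the phrase list, the post format, and the filter are illustrations, not any platform’s actual system, and a real deployment would require far more sophisticated language analysis and, above all, human review of every flag.

```python
# Hypothetical sketch of the "flag risk language" step.
# The phrase list and post format are invented for illustration only.

RISK_PHRASES = ["want to die", "kill myself", "end it all", "no reason to live"]

def flag_at_risk(posts):
    """Return posts whose text contains a known risk phrase (case-insensitive)."""
    flagged = []
    for post in posts:
        text = post["text"].lower()
        if any(phrase in text for phrase in RISK_PHRASES):
            flagged.append(post)
    return flagged

posts = [
    {"user": "@a", "text": "Great game last night!"},
    {"user": "@b", "text": "I just want to die. Nothing helps anymore."},
]

for post in flag_at_risk(posts):
    # In any real system, a trained human responder would review each flag
    # before anyone is contacted or located.
    print(post["user"])
```

Even this toy version makes the essay’s point: the detection step is trivially easy to build. What happens after the flag — who responds, and how — is the part that requires mental health expertise.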
This is an interesting development for several reasons. First, most mental health professionals avoid even a passing familiarity with social media, let alone an understanding of the culture and nuances of interacting on mainstream platforms such as Twitter or Tumblr. Many in my field are not familiar with research suggesting that suicidal people may be more honest about their risk on social media, disclosing suicidal thoughts there at higher rates than they do in person.
Right now there is a “street nurse” in Toronto who finds suicidal people on Twitter and connects them with local resources in real time (@RealTimeCrisis). But for many in the mental health industry, the thought of doing this work is anxiety-provoking and overwhelming. It is our community’s anxiety that is getting in the way of making a meaningful contribution at the intersection of mental health and social media.
Second, the IT industry has no “IRB” or mental health review process, and it is not going to create one. People who can develop code or technology will do so, whether or not they have thought through the mental health implications. It is up to the mental health community to become familiar with social media and IT, and to initiate relationships with the IT community. That is the only way we can increase the chance that technology and social media will be used thoughtfully and effectively in mental health matters.
Can you imagine an IT developer “throwing the switch” on a program that quickly identifies and geolocates suicidal people via their social media posts, only to reach people in need of help with no idea how to actually help them? While our mental health community is still debating the utility and security of encrypted email with patients (we have had that capability for years, patients want it, and yet few mental health providers use it), IT developers are rapidly expanding the capabilities and possibilities for communicating with people who are suicidal. It is past time for our mental health community to engage with IT and social media developers in meaningful ways about public health and safety.
Finally, this is interesting because IT and social media developers now have the tools to run social science experiments on a scale that was unimaginable only a few years ago, and they are already doing it. Facebook developers have demonstrated the ability to manipulate users’ moods: they performed an experiment on nearly 700,000 users with a speed and scale that is unimaginable to most social science and mental health researchers today.
Quite honestly, our mental health community isn’t sure how to react to this. Should we be impressed? Nervous? Indignant that we weren’t consulted and that our safety review methods weren’t used? Ashamed that we didn’t anticipate this and prepare? Intrigued by what might be possible, and by how to use it for the greatest good?
In the end, I think the only useful reaction is to replace fear with curiosity. Anxiety and avoidance are no substitute for thoughtful caution and curiosity. If our mental health community stands by our tradition of advocacy and compassion, then we have a duty to acknowledge and embrace the possibilities of social media for preventing suicide and improving mental health. Because if we don’t do it, then someone else who doesn’t understand mental health will.
When it comes to suicide prevention and mental health, technology is moving too fast for our field to wait until we are comfortable enough with social media to innovate. Instead, we must innovate until we are comfortable.