Content warning: This post discusses suicide and a forum about suicide.
Recently, I read a NYT piece titled “Where the Despairing Log On, and Learn Ways to Die” about a site with forum functionality where those experiencing suicidal thoughts can talk to each other. I personally don’t want to amplify the site based on its current design, so I will not link to it. I also don’t have the capacity to dig into the itty bitty details of this entire topic! For now, though, I do want to touch on the site’s design, for lack of a better term, and ways in which what’s there could be improved to make the site more useful and, frankly, ethical. My hope in sharing is that those who created the site, or who might use it, can advocate for some changes and evolve what’s there rapidly, as there does seem to be great harm caused in its current form (one could argue the same for Instagram or Facebook, but I digress).

While I did not spend more than 30 minutes on the site and don’t know how it’s changed over time, I did want to share my perspective based on just over a decade of experience working on the web.
- Have content locked behind account creation. Currently, you can see seemingly every forum thread without ever creating an account, which seems extremely unwise given the points below. Update while writing this post: they did implement this since the NYT piece came out, which gives me hope!
- Have an onboarding flow that helps teach people how to navigate the platform, what resources are available, etc. I highly recommend getting in touch with folks at Koko as they had an excellent onboarding process for their app back in the day that included light coaching around dos and don’ts for interacting on the app.
- Have numerous options to opt in and out of content. For example, someone should be able to hide threads that mention methods or to opt out of the forum functionality entirely if they want to.
- Have the option to select what you are there for at the start. This can help folks get what they want out of the space and help those in charge of the site make more informed decisions about why folks are there (rather than guessing).
- Don’t simply perpetuate the most popular content; instead, have some level of curation/amplification in place that’s in line with the values stated in the forum rules.
- Create long-form content on nuanced topics related to suicide that balances out the forum functionality, which acts as a true free-for-all, making it hard for folks to find what might be most helpful. For example, at my job, I’ll sometimes recap important conversations in our private LGBTQ+ group by anonymizing the discussion, getting reviews in the private channel, and sharing a condensed version publicly to both continue the convo and have a touchpoint for the future.
- Amplify some content over other content. In particular, de-emphasize methods and add more friction to reach conversations related to them. Currently, the content appears without any particular order, making it incredibly easy to land there.
- Lean into the community aspect of the site to evolve what’s there — ask folks what would be helpful for them rather than deciding for them! Tied to this, based on the article, it sounds like more community moderators could be advantageous.
- Promote and partner with groups like Emotional CPR or Project Lets (who recently ran a rad Anti-Carceral Approaches to Suicide training). This helps shift the conversation from individual choices to the collective system that might be leading someone to be suicidal. It also can create more “muscles” around navigating these conversations without leaning too far in either direction (fully robbing suicidal people of their rights or encouraging folks to go through with it).
While the above ideas will undoubtedly add friction to the experience, this feels like necessary friction to help more people headed to that site actually find a sense of community. My call is to lean into the community aspect and help the collective build muscles to have better, more nuanced conversations on their own terms. By allowing more people to interact in more ways on the site, I think it’ll be helpful to more people, if that’s what the creators are ultimately aiming to do (by help, I don’t mean “save” but rather help folks feel less alone, provide resources for recovery, etc.).
Side note: I wish we spent more time talking about the technical decisions behind platforms and exploring solutions in that space.