Design Principles to Combat Extremism on Reddit
There have been 56 school shootings in the U.S. this year and it is only May. We need urgent gun reform that limits ammunition sales to ranges and requires background checks everywhere guns are sold, such as conventions, not just licensed dealerships. But we also need tech platforms such as Reddit, Facebook, and Instagram to take responsibility for the rise in extremism their platforms have created and to find design solutions that decrease school shootings and other extremist attacks.
Today social platforms concentrate on image and text matching to keep violent content from being seen and spread. This means an AI stops people from uploading photos that have previously been identified as violent and deletes text that praises terrorist organizations. These methods often demand human moderators as well as AI, which takes a terrible toll on moderators’ mental health. They are also highly reactive, focused on mitigating the effects of extremism once it is already on the platform.
I believe there’s another way. It is possible to use information architecture to proactively design for healthy information environments on social platforms. The specific example I use in this article centers around Reddit and the topic of guns, but a lot of the general design principles of Branch Design, Information Checkpoints, and Community Centered Wellness hold true for a broad array of topics and platforms.
Healthy Branch Design
What is an information environment? Information is often stored as text on our phones and reconstructed into images and archetypes in our minds. This makes it hard to talk about the differences between information environments the way a person could compare and contrast, say, a jungle and an icy tundra.
The difficulty in understanding information environments as places that have a real, profound impact on us makes it doubly hard to understand the impact of an unhealthy information environment. A person can see a pile of trash outside their home and say “yeah, this is probably bad to be around, I should do something.” Information environments are built so subtly, people are often unaware that the information pathways they have developed are pushing them towards extremist thoughts.
I ran an experiment with my own Reddit feed to see how easy it was to fall into a negative information environment. My normal Reddit feed is filled with subreddits such as r/catswithjobs, r/squirrels, r/healthyfood, r/antiwork, r/twoXchromosomes, r/foxes, r/aww. I joined r/guns, and from there was suggested r/proguns, r/firearms, r/nwttyg (nobody wants to take your gun), and r/ar15. Once I liked one gun subreddit, it became increasingly easy for my entire feed to be filled with gun-related content.
When your information environment is suffused with only one or two major topics, those topics become the dominant influences in your world, shaping the schemas by which you analyze information. When you read mainly subreddits about guns, your world begins to be broken down into intruder and defender, big bad government and good individuals.
This holds true for other topic areas and other platforms. On TikTok, watching a handful of diet videos quickly fills your whole feed with unhealthy diet fads that can push young women toward anorexia and binge eating. Their world narrows to categories of skinny and fat. All social platforms use suggestion algorithms to get people to interact more, which means they see more ads, which in turn increases profit margins. But the simplicity of these algorithms creates a snowball effect that breeds extremism around any topic.
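The snowball dynamic can be seen in a toy model. The sketch below is a deliberately simplified, hypothetical recommender (not Reddit’s or TikTok’s actual system): each round it weights topics by how often they already appear in the feed, so early engagement with one topic compounds until that topic dominates.

```python
import random

# Hypothetical toy model of a naive engagement-driven suggestion algorithm.
# Topic names and weights are illustrative assumptions, not a real platform API.
TOPICS = ["cats", "food", "guns", "diet", "outdoors"]

def recommend(feed_history, n=5):
    """Weight each topic by how often it already appears in the feed history."""
    weights = [1 + feed_history.count(t) for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

feed = ["cats", "food", "guns"]
for _ in range(20):          # each round, prior engagement reinforces itself
    feed += recommend(feed)

# After a few rounds, one topic tends to crowd out the rest: the snowball effect.
dominant = max(TOPICS, key=feed.count)
print(dominant, feed.count(dominant) / len(feed))
```

Because the weights grow with every appearance, whichever topic gets an early lead keeps pulling further ahead; nothing in the loop ever pushes the feed back toward diversity.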
The image below shows, on the left, the “information pathways” that were suggested to me after I joined the subreddit r/guns. Some communities were on the sidebar; others were suggested to me as I scrolled through my feed. While r/guns, as a large subreddit, was heavily moderated, the smaller subreddits suggested on the basis of my interest in r/guns had members who were increasingly fearful of big government and open with racist and misogynistic remarks. On the right is a potential alternative information structure that subtly pushes people who join r/guns toward related but more prosocial subreddits. To build these healthier pathways, social platforms need to understand not simply that people come to r/guns, but why they do so.
Do they love the thrill of shooting? Help them navigate towards interests in airsoft, paintball, or archery. Do people love the rugged feeling a gun gives them? Help Redditors develop an interest in camping and the outdoors. Are people trying to feel safe in their communities? Guide them to subreddits where they can get involved with activism in their local communities in a productive way. And for Redditors that migrate towards gun-focused subreddits because they are lonely and stressed, guide them to subreddits that can help them work through trauma and develop healthy outlets to focus their feelings to stop fear developing into hate and violence.
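The intent-to-community mapping described above can be sketched in a few lines. This is a minimal illustration of the branch-design idea, assuming a hypothetical intent classifier already exists; the subreddit names and intent labels are examples, not output from any real Reddit system.

```python
# A minimal sketch of "healthy branch design": map the likely reasons a user
# joined a community to a mix of related but more prosocial alternatives.
# Intent labels and subreddit names below are illustrative assumptions.
INTENT_BRANCHES = {
    "thrill":   ["r/airsoft", "r/paintball", "r/archery"],
    "outdoors": ["r/camping", "r/hiking", "r/bushcraft"],
    "safety":   ["r/selfdefense", "r/CommunityActivism"],
    "stress":   ["r/mentalhealth", "r/KindVoice"],
}

def healthy_suggestions(joined, inferred_intents, per_intent=1):
    """Blend on-topic continuity with prosocial branches so no single
    topic fills the feed."""
    suggestions = []
    for intent in inferred_intents:
        suggestions += INTENT_BRANCHES.get(intent, [])[:per_intent]
    return [joined] + suggestions

print(healthy_suggestions("r/guns", ["thrill", "stress"]))
# → ['r/guns', 'r/airsoft', 'r/mentalhealth']
```

The design choice here is that the joined community is never hidden, only diluted: the user keeps their on-topic feed while the branches widen it.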
I am sure some will say, “But this is manipulation!” To this I say: yes, yes it is. But everything you see on social platforms has been designed by advertisers who want to sell their products, content creators who want to sell their brands, or people who want you to join their cause. Whether you acknowledge it or not, your information environment has been designed by people, and you are being manipulated. But in acknowledging how deeply information design can impact our thoughts, I think Reddit’s, Facebook’s, and Instagram’s designers can begin the work of building healthy information environments for their users. The first step is healthy branch design that leads r/guns users toward a multiplicity of gun-adjacent suggestions like camping, archery, and community development rather than filling their feed with guns alone. The second step is to develop information checkpoints.
Information Checkpoints
In 2014, Reddit closed down the subreddit r/gunsforsale, which enabled private individuals to bypass background checks by having gun sales fall under the category of “private transactions,” outside the authority of the federal Bureau of Alcohol, Tobacco, Firearms and Explosives. Even though this marketplace has been shut down, the r/guns subreddit still served as a means for the Buffalo supermarket shooter to get feedback on what gear to use. Gendron, who went by Jimbo-boiii on Reddit, asked other Redditors about the best ear muffs to use for his attack, along with questions about gun types and ammunition. These posts have since been deleted from Reddit but were captured by web archives and are depicted below.
Along with advice, people can post generically on the r/guns subreddit that they have guns to offload, and the deal is then made via direct message (DM) or off-site.
I understand why teenagers want to learn about guns — many grew up playing first-person shooter games, saw their fathers go range shooting, watched movies where the hero takes down the bad guys with their guns. But social platforms need to set up information checkpoints to stop people under 18 from gaining specific information that can help them plan mass shootings and connections that lead to the purchase of guns through the platform.
What this means is that while general information on guns should be open to all, the kinds of specific information that help someone access a gun should be restricted to people over 18, who should provide a photo of their ID as proof.
The image above shows, on the left, a current page on r/guns that walks first-time gun owners through the process of buying their guns online. On the right is a potential design revision of the page in which users would need to show they are over 18 by submitting a photo of their driver’s license or other government-issued ID.
There are other information checkpoints that should be utilized. Depicted below are a current page on r/guns and a possible revision in which the usernames of people on these forums would be un-clickable and randomly generated until a person proves they are over 18. This prevents users from direct messaging others on the r/guns subreddit for specific advice and gun purchases.
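The checkpoint logic described above can be sketched as a simple verification gate. This assumes a hypothetical platform data model (the `User` class and function names are illustrative, not any real Reddit API): unverified viewers are blocked from purchase guides and see only random, un-clickable pseudonyms, so no DM can be initiated.

```python
from dataclasses import dataclass
import secrets

# Hypothetical sketch of an "information checkpoint": age verification gates
# both purchase-related pages and the visibility of real usernames.
@dataclass
class User:
    username: str
    age_verified: bool = False   # set True only after e.g. a government ID check

def can_view_purchase_guide(viewer: User) -> bool:
    """Purchase guides are restricted until the viewer verifies they are 18+."""
    return viewer.age_verified

def visible_name(viewer: User, author: User) -> str:
    """Unverified viewers see a random pseudonym, so they cannot start a DM."""
    if viewer.age_verified:
        return author.username
    return "user-" + secrets.token_hex(4)

teen = User("jimbo")
adult = User("rangefan", age_verified=True)
print(can_view_purchase_guide(teen))        # → False
print(visible_name(adult, teen))            # → jimbo
print(visible_name(teen, adult))            # random pseudonym, e.g. user-3f9a1c02
```

Generating a fresh pseudonym per view, rather than a stable alias, is the stricter choice: even repeated encounters with the same author cannot be linked into a contactable identity.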
By making it harder for people under 18 to become part of the gun community, we allow their brains to mature and make it more likely that they have finished high school, and in turn less likely that they will target schools specifically. Many past shooters were the loner kid in high school and felt alienated. When they leave high school, all of a sudden they are no longer locked into the one hierarchy where they felt at the bottom. Those hurts and grudges matter less, and the hope is that anger, access, and knowledge will no longer converge at the same time.
Community Centered Wellness
r/guns has on its FAQ page a one-line statement with the telephone number of the Suicide Prevention Lifeline. However, suicide hotlines in this country are woefully understaffed, and their staff are often poorly trained. Many people who seek help are pushed to voicemail, put on hold for over thirty minutes, or given unhelpful advice. Suicide hotlines are also ill-suited to long-term mental health needs and to the kind of de-escalation needed to stop people from using guns to become killers. What is needed is a community-centered approach in which the gun community helps its own members. Below are four key design changes the r/guns community should implement to move toward community-centered wellness.
#1: Get Serious About Offering Mental Support
It’s not enough to just list a phone number for the suicide hotline. The r/guns community should link both to specific people that members can reach out to and to mental-health-related subreddits, and actively encourage members to join them. Depicted below is how the r/guns community’s FAQ page could look with these links.
#2: Make Your Ethos Clear
Many perpetrators of violence do so because they believe the communities they are within are supportive. Community-centered wellness in the r/guns community means making it explicit that they do not condone racism or misogyny, and that a part of gun ownership means taking on the responsibility of being safe with a gun.
At the top of their FAQ page, they should include a statement such as:
Guns are awesome to use for hunting, target shooting, and protection. This community does not condone using guns out of anger or for violence toward others or yourself. Being a gun owner means taking responsibility for your mental health, for your own sake and for others’.
I do not know what specific wording should be used, but this is something the r/guns subreddit mods should take on and get feedback from their entire community. To work, this needs to be developed from the users themselves.
#3: Mental Health Mondays
There are specific subreddits for mental health, but many people believe they do not need help. The r/guns subreddit and other gun communities across Reddit, Facebook, TikTok, etc. can do a lot for their members by making it part of their culture that to be a gun owner is to be a responsible gun owner. Weekly threads on topics at the intersection of mental health and gun use can go a long way toward instilling that mentality of responsible gun ownership in the community.
#4: Put Their Money Where Their Mouth Is
It is a refrain you see over and over again in the gun subreddits: it is the person’s mental health that is at fault, not the gun. If the people in the r/guns subreddit and related subreddits truly believe this, gun rights activists should be the number one proponents of Medicare for All. If everyone who suffers from depression, aggression, or any of a wide array of mental illnesses has adequate access to mental health services, then everyone who has a gun will also be properly equipped, in both firearm knowledge and mental state, to have one. On the community’s sidebar, they can add links to foundations that lobby for Medicare for All and to other organizations that fund mental health.
But What About The Money?
The common refrain is that extremism pays, and this is why big tech has not changed the structure of its platforms. But bills pushing for tech accountability are increasingly coming to the House floor, which makes it likely that big tech will take some action to change itself before government intervenes. Moreover, teens have become increasingly disenchanted and disengaged with social media, in part because of the negativity on it. Building healthier information environments that help people develop more prosocial, balanced information ecosystems might get users to scroll less each day, but it keeps those users on the platform in the long term.
It is time for social platforms to take more responsibility for developing the information environments where extremist mentalities are cultivated and connections built that give would-be perpetrators the communal permission and access to the tools that help them commit atrocities. Instead of apologies, we need more thoughtful, active design of the information structures that build our ideas of the world around us. The idea of healthy information environments is a new one, but in the coming years I believe it will be the determining factor in whether America splinters into increasingly extremist communities or whether we grow closer together.