By Micah Fyock, contributing writer

The prominence and accessibility of the internet make it easy for us to forget how young it is; however, tragic events like those in Charlottesville remind us that the actions we take today will set a foundation for how we view the internet from now on. In light of the violence committed by white supremacists, several companies are attempting to distance themselves from these groups as a statement against neo-Nazi and white-supremacist beliefs. Online dating services such as Tinder, Bumble and OkCupid have taken down white-supremacist profiles, and Spotify has even started removing what it considers to be white power music. Social media sites such as Reddit and Facebook confirmed they have banned pages with ties to far-right extremists.

According to BuzzFeed News, Apple Pay disabled services for websites that “sold sweaters with Nazi logos, T-shirts emblazoned with the phrase ‘White Pride,’ and a bumper sticker showing a car plowing into stick figure demonstrators.” This merchandise is horrendous, and kudos to Apple for taking a stand against it. However, I’m concerned that this approach doesn’t really deal with any of the issues at hand. What does silencing a group actually achieve? Does erasing extremists’ internet presence make them disappear?

The Electronic Frontier Foundation (EFF) is a nonprofit organization dedicated to defending civil liberties in the digital world. In reference to Google and GoDaddy refusing to maintain the domain name for a neo-Nazi website called the Daily Stormer, the EFF writes, “Domain name companies also have little claim to be publishers, or speakers in their own right, with respect to the contents of websites. Like the suppliers of ink or electrical power to a pamphleteer, the companies that sponsor domain name registrations have no direct connection to internet content.” The ink supplier doesn’t stop supplying simply because it disagrees with how the ink is used. It trusts that people will be able to see the misuse of the ink and form their own opinions. These companies are attempting to take a stand against white supremacy, when in reality they are demonstrating a lack of faith in the public’s ability to form its own opinion. Not only that, but they are giving white supremacists a reason to feel persecuted, which might make the extremist fires burn brighter than before.

This already muddy topic gets significantly harder to tackle when it comes to violence. Freedom of speech is intended to spark discussion and healthy debate; violent speech and hate speech intend only to harm people and do not listen to reason. The tightrope for managing violence on the internet is an incredibly thin one to walk. In an interview with CNET, a Reddit spokesperson said, “We are very clear in our site terms of service that posting content that incites violence will get users banned from Reddit.” This seemingly mundane statement says a lot about how we should tackle this issue. Reddit has been banning groups that specifically praise extremists’ violent acts; it isn’t just denying service to a group to avoid negative public opinion. In an interview with The Verge, Cloudflare CEO Matthew Prince said, “I don’t think this is as much of a free speech issue as a due process issue. If you participate in a system, you should be able to know up front what the rules of that system are.”

I agree with Prince that this is a due process issue, and companies should avoid turning it into a free speech issue. There is a difference between silencing extremists and refusing to tolerate their actions. Rules against violent and hateful speech need to be established and set in stone now. The internet is vulnerable to those who might use their power to sway public opinion. In this case the attempt comes from the side of good, but as opinions in our country grow even more polarized, these issues won’t be as black and white as the evils of racism.