Although we are focused on protecting and optimizing the operation of the Internet, Cloudflare is sometimes the target of complaints or criticism about the content of a very small percentage of the more than thirteen million websites that use our service. Our termination of services to the Daily Stormer website a year and a half ago drew significant attention to our approach to these issues and prompted a lot of thinking on our part.  

At the time, Matthew wrote that calls for service providers to reject some online content should start with a consideration of how the Internet works and how the services at issue up and down the stack interact with that content. He tasked Cloudflare’s policy team with engaging broadly to try to find an answer. With some time having passed, we want to take stock of what we’ve learned and where we stand in addressing problematic content online.

The aftermath of the Daily Stormer decision

The weeks immediately following the decision in August 2017 were filled with conversations. Matthew made sure the Cloudflare team accepted every single invitation to talk about these issues; we didn’t simply put out a press release or “no comment” anyone. Our senior leadership team spoke with the media and with our employees -- some of whom had received threats related both to Cloudflare’s provision of services to the Daily Stormer and to the termination of those services. On the policy side, we spoke with a broad range of ideologically diverse advocacy groups who reached out to alternately congratulate or chastise us for the decision.

As the time stretched into months, the conversations changed. We spoke with organizations who have made it their mission to fight hate and intolerance, with human rights organizations that depend on access to the Internet, with tech companies doing their best to moderate content, with academics who think about and research all aspects of content online, and with interested government and non-governmental organizations on two continents. In the end, we spoke with hundreds of different experts, groups, and entities about how different companies and different types of services address troubling content at different places in the Internet stack.  

Our overwhelming sense from these conversations is that the Internet, and the industry that has grown up around it, is at a crossroads. Policy makers and the public are rightly upset about misuse of the Internet.  We heard repeatedly that the world is moving away from the Internet as a neutral platform for people to express themselves and access information. Many governments and many of the constituents they represent appear to want the Internet cleaned up and stripped of troubling content through any technical means necessary, even if it means that innovation will be stifled and legitimate voices will be silenced. And companies large and small seem to be going along with it.

Moving forward

We’ve thought long and hard about what’s next both for us and the Internet in general. Although we share concerns about the exploitation of online tools, we are convinced that there are ways forward that do not shortchange the security, availability, and promise of the Internet.

We think the right solution will take us out of the clouds and into the weeds.  We have to figure out what core functions need to be protected to have the Internet we want, and we will have to get away from the idea that there’s a one-size-fits-all solution that will address the problems we see. If we really want to address risks online while maintaining the Internet as a forum for communication, commerce, and free expression, different kinds of services are going to have to deal with abuse differently.

The more we talked to people, the more we saw a fundamental split on the Internet between the services that substantively touch content and the infrastructure services that do not. It’s possible that, as a company that provides largely infrastructure services ourselves, we were looking for this distinction. But we believe the distinction is real and helps explain why different businesses make distinctly different choices. As we discuss in our blog posts on transparency this week, the approach to questions about abuse complaints will mean different things for different Cloudflare products. Although we are not at the point yet where Cloudflare’s products organize, analyze, or promote content, we are aware that this conclusion may have implications for us in the future.

Content curators

The Internet has revolutionized the way we communicate and access information. Because of the way the Internet works, everyone online has the opportunity to create and consume the equivalent of their own newspaper or television network. Almost any content you could want is available, if you can find it. That idea is at the heart of the divide between services that curate content -- like social media platforms and search engines -- and basic Internet infrastructure services.

Content curators make content-based decisions for a business purpose. For a search engine, that might mean algorithmically reviewing content to best match what is sought by the user. For a social media site, it might be a review of content to help predict what content the user will want to see next or what advertising might be most appealing.

For these types of online products, users understand and generally expect that the services will vary based on content. Different search engines yield different results; different social media platforms will promote different content for you to review. These services are the Internet’s equivalents of the very small circle of newspaper editors or television network executives of old, making decisions about what you see online based on what they think you’ll want to see.

The value in these content curator services depends on how well they analyze, use, and make judgments about content.  From a business perspective, that means that these services want the flexibility to include or exclude particular content from their platforms. For example, it makes perfect sense for a platform that advertises itself as building community to have rules that prevent the community from being disrupted with hate-filled messages and disturbing content.

We should expect content curator services to moderate content and should give them the flexibility to do so. If these services are transparent about what they allow and don’t allow, and how they make decisions about what to exclude, they can be held accountable the same way people hold other businesses to account. If people don’t like the judgments being made, they can take their business to a platform or service that’s a better fit.

Basic Internet infrastructure services

Basic Internet services, on the other hand, facilitate the business of other providers and website owners by providing infrastructure that enables access to the Internet.  These types of services -- which Matthew described in detail in the Daily Stormer blog post -- include telecommunications services, hosting services, domain name services such as registry and registrar services, and services to help optimize and secure Internet transmissions. The core expertise of these services is not content analysis, but providing the infrastructure needed for someone else to develop and analyze that content.

Because people expect these infrastructure services to be used to provide technical access to the Internet, the notion that such services might be used to monitor what you’re doing online or make decisions about what content you should be entitled to access feels like a misuse, or even an invasion of privacy.

Internet infrastructure is a lot like other kinds of physical infrastructure. At some basic level, we believe that everyone should be allowed to have housing, electricity, or telephone service, no matter what they plan to do with those services. Or that individuals should be able to send packages through FedEx or walk down the street wearing a backpack with a reasonable expectation they won’t be subject to unfounded search or monitoring. Much as we believe that the companies that provide these services should provide them to all, not just those with whom they agree, we continue to believe that basic Internet infrastructure services, which provide the building blocks for other people to create and access content online, should be provided in a content-neutral way.

Complicated companies

Developing different expectations for content curation services and infrastructure services is tougher than it seems. Behemoths best known for content curation services often provide infrastructure services as well. Alphabet, for example, provides content-neutral infrastructure services to millions of customers through Google Cloud and Google Domains, while also running one of the world’s largest content curation platforms in YouTube. And even if companies try to distinguish their infrastructure from their content curation services, their customers may not.

In a world where content needs to be on a large network to stay online, there are only a handful of companies that can satisfy that need. Reducing that handful to those -- like Cloudflare -- that fall solely into the infrastructure bucket makes the number almost impossibly small. That is why we want to do a better job talking about differences in expectations not by company, but by service.

And maybe we should also recognize that having only a small number of companies with robust enough networks to keep content online -- most of which do content curation -- is part of the problem. If the only way to be online is to be on a platform that curates content, you’re going to be rightly skeptical of that company’s right to take down content it doesn’t want on its site. That doesn’t mean that a business that depends on analyzing content has to stop doing it, but it does make it that much more important that we have neutral infrastructure. Without it, it might be impossible for an alternate platform to be built, and for certain voices to have a presence online at all.

The good news is that we’re not alone in our view of the fundamental difference between content curators and Internet infrastructure services. From the criticism we received for the Daily Stormer decision, to the commentary of Mike Masnick at Techdirt, to the academic analysis of Yale Law Professor Jack Balkin, to the call of the Global Commission on the Security of Cyberspace (GCSC) to protect the “public core” of the Internet, there’s an increasing awareness that not protecting neutral Internet infrastructure could undermine the Internet as we know it.

Thoughts on due process

In his blog post on the Daily Stormer decision, Matthew talked about the importance of due process, the idea that you should be able to know the rules a system will follow if you participate in that system. But what we’ve learned in our follow-up conversations is that due process has a different meaning for content curators.

There has been a clamor for companies like Facebook and Google to explain how they make decisions about what to show their users, what they take down, and how someone can challenge those decisions. Facebook has even developed an “Oversight Board for Content Decisions” -- dubbed Facebook’s supreme court -- that is empowered to oversee the decisions the company makes based on its terms of service. Given that this process is based on terms of service, which the company can change at will to accommodate business decisions, it mostly seems like a way to build confidence in the company’s decision-making process. Instituting an internal review process may make users feel that the decisions are less arbitrary, which may help the company keep people in its community.

That idea of entirely privatized due process may make sense for content curators, who make content decisions by necessity, but we don’t believe it makes sense for those that provide infrastructure services. When access to basic Internet services is on the line, due process has to mean rules set and adjudicated by external decision-makers.

Abuse on Internet infrastructure

Although we don’t believe it is appropriate for Cloudflare to decide what voices get to stay online by terminating basic Internet services because we think content is a problem, that’s far from the end of the story. Even for Internet infrastructure, there are other ways that problematic content online can be, and is, addressed.

Laws around the world provide mechanisms for addressing particular types of online content that governments decide are problematic. We can save for another day whether any particular law provides adequate due process and balances rights appropriately, but at a minimum, those who make these laws typically have a political legitimacy that infrastructure companies do not.

Tomorrow, we’ll talk about how we are operationalizing our view that it’s important to get into the weeds by considering how different laws apply to us on a service-by-service and function-by-function basis.