Q&A: MoveOn Board President Eli Pariser on his new book, The Filter Bubble

Westword: Can you start by just defining the "filter bubble"?

Eli Pariser: Well, increasingly we don't all see the same Internet. If you go to Google, it used to be the case that everyone would get the same search results for the same terms, but that's not true anymore. Different people get different search results depending on who Google thinks they are and what Google thinks they're interested in. It turns out this kind of technology is spreading very rapidly across the Internet and onto most major websites, and it means we're all surrounded by an array of personalizing code that creates a unique experience for each of us.

WW: So, what's the big danger of that?

EP: There are several dangers. Because it's invisible -- you don't see this editing happening -- you can get a distorted view of the world and not even know it. Most people don't know that Google is actually reflecting back to them its assumptions about their interests and political views. You end up with a slanted picture of what's going on.

Another danger is that important but not highly clickable stories are falling out of view. Most of these programs work by trying to get you to click more, so they look at which pages you're most likely to view. Stories about a civil war in Africa or homelessness aren't likely to get a lot of clicks, and therefore aren't showing up as much.

WW: Wait, so are you talking about Google specifically or on news sites as well?

EP: You know, when these filters are at play, Google News, for example, prioritizes stories about Apple over stories about Afghanistan, because that's what people click the most.

WW: But that's not on purpose --

EP: No, it's not on purpose -- it's not like a human decided Apple is more important than Afghanistan -- but the way the algorithm works, it shows more of what people are clicking on.

Another place where that same phenomenon shows up is on Facebook. You know, the "like" button has a very particular feel to it and it's easy to click "like" on "I ran a marathon," but it's hard to click it on "joblessness rates go up." So you see some kinds of information and you don't see other kinds.

The third problem is that it's easier than ever to see only people and arguments you already agree with. In some ways these sites are harnessing the power of confirmation bias -- the idea that we feel good about media that confirms what we already know. You can actually see people getting a little hit of pleasure when they're presented with information that validates what they believe. Conversely, when you're presented with information that challenges it, you feel cranky and uncomfortable. So, if all you're trying to do is maximize the number of people viewing your page, why would you ever show them something that makes them feel sad?

WW: Who does the problem fall on -- is it just the way social media has changed news, or is it more about how the newspapers are presenting it?

EP: I think, for all of its flaws, the system of journalistic ethics behind the newspaper industry has a built-in expectation that you give people information that doesn't just make them happy, but helps them be good citizens and make good decisions. Online, that's not how these algorithms are being developed. They don't have values embedded in them.

The new media institutions that are replacing the old ones don't have the same sense of ethics that the old ones did.

Thorin Klosowski