When we search for things on the Internet, a number of variables determine what we end up finding. Whether it's the configuration of our computers, our location, or our past search history, the information we get is filtered and reshaped to match our profile. When those same algorithms start determining the news we see, things get messy, and according to Eli Pariser's new book, The Filter Bubble, which he'll be reading from at Tattered Cover LoDo tonight, they start to get dangerous.
Westword: Can you start by just defining the "filter bubble?"
Eli Pariser: Well, increasingly we don't all see the same Internet. If you go to Google, it used to be the case that everyone would get the same search results for the same terms, but that's not true anymore. Different people get different search results depending on who Google thinks they are and what Google thinks they're interested in. It turns out this kind of technology is spreading very rapidly across the Internet and onto most major websites, and it means we're all surrounded by an array of personalizing code that creates a unique experience for each of us.
WW: So, what's the big danger of that?
EP: There are several dangers. Because it's invisible -- you don't see this editing happening -- you can get a distorted view of the world and not even know it. Most people don't know that Google is actually reflecting back to them its assumptions about their interests and political views. You end up with a slanted picture of what's going on.
Another danger is that important but not highly clickable stories fall out of view. The way most of these programs work is that they want you to click more, so they look at which pages you're most likely to view. Stories about a civil war in Africa or about homelessness aren't likely to get a lot of clicks, and therefore they don't show up as much.
WW: Wait, so are you talking about Google specifically or on news sites as well?
EP: You know, when these filters are at play, Google News, for example, prioritizes stories about Apple over stories about Afghanistan, because that's what people click the most.
WW: But that's not on purpose --
EP: No, it's not on purpose -- it's not like a human thinks Apple is more important than Afghanistan -- but the way the algorithm works, it shows more of what people are clicking on.
Another place where that same phenomenon shows up is on Facebook. You know, the "like" button has a very particular feel to it and it's easy to click "like" on "I ran a marathon," but it's hard to click it on "joblessness rates go up." So you see some kinds of information and you don't see other kinds.
The third problem is that it's easier than ever to see only people and arguments you already agree with. In some ways these sites are harnessing the power of confirmation bias -- the idea that we feel good about media that confirms what we already know. You can actually see people getting a little hit of pleasure when they're presented with information that validates what they believe. Conversely, when presented with information that challenges those beliefs, people feel cranky and uncomfortable. So, if all you're trying to do is maximize the number of people viewing your page, why would you ever show them something that makes them feel sad?
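The engagement logic Pariser describes can be illustrated with a toy sketch: a feed that ranks stories purely by a user's historical click rate for each topic, so low-click topics sink no matter how important they are. All titles, topics, and click rates below are invented for illustration; this is not any site's actual algorithm.

```python
# Toy illustration of click-maximizing ranking (all data hypothetical).
# Stories are scored only by the user's past click-through rate on their
# topic, so "important but unclickable" topics fall to the bottom.

stories = [
    {"title": "New iPhone rumors", "topic": "gadgets"},
    {"title": "Civil war escalates", "topic": "foreign-affairs"},
    {"title": "Celebrity scandal", "topic": "entertainment"},
    {"title": "Homelessness rises", "topic": "social-issues"},
]

# Per-user historical click-through rates by topic (made up).
user_ctr = {
    "gadgets": 0.31,
    "entertainment": 0.24,
    "foreign-affairs": 0.03,
    "social-issues": 0.02,
}

def rank_feed(stories, ctr):
    """Sort stories by predicted engagement, highest first."""
    return sorted(stories, key=lambda s: ctr.get(s["topic"], 0.0), reverse=True)

for story in rank_feed(stories, user_ctr):
    print(f'{user_ctr[story["topic"]]:.2f}  {story["title"]}')
```

Note that nothing in `rank_feed` encodes any judgment about importance -- the Afghanistan-vs-Apple skew emerges entirely from the click statistics.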
WW: Who does the problem fall on -- is it just the way social media has changed news, or is it more about how the newspapers are presenting it?
EP: I think, for all of its flaws, the system of journalistic ethics behind the newspaper industry has a built-in expectation that you give people information that doesn't just make them happy, but that helps them be good citizens and make good decisions. Online, that's not how these algorithms are being developed. They don't have values embedded in them.
The new media institutions that are replacing the old media institutions don't have the same sense of ethics that the old ones do.
WW: But there's this adage that you bring people into the site with the fluff and they stick around for the news.
EP: Right, but the most successful sites are bringing people in with those stories and then showing them more of those stories. As a short-term strategy, that works pretty well. There's a part of all of us that wants to read about celebrities and scandal, but there's also a part that wants to be knowledgeable about the world. If the media knows how to balance those things -- give us little treats while exposing us to important and complicated issues -- it works. But these algorithms can't do that.
WW: Is a linear path through the daily news better? Like the NBC Nightly News or something?
EP: I'm not nostalgic for that particularly. I think that having a small group of people deciding what to talk about and what to expose people to isn't the solution either, but in a way we've just moved from one small group to another. Now it's a group of engineers at Google and Facebook, and the choices they make about what information gets through and what doesn't really affect what people see. I'd like to see the Internet live up to the hope many of us had for it as a medium that would actually democratize information, instead of having a small group of people calling the shots.
WW: But we're kind of doing this to ourselves, right? We only follow people we like on Twitter, or RSS feeds we agree with or even our friends on Facebook.
EP: Yes, that's one level of the problem, and there's this additional level where we're not actually conscious of what we're doing to ourselves. It's one thing to follow a person on Twitter you agree with, but it's another to load up Yahoo News without knowing that you're being shown something targeted toward you -- not just ads, but content. I think that element of choice is really important, but right now it's missing. One of the things being done here is to make the web a passive experience -- you don't have to work so hard, the information comes to you.
WW: Did these algorithms start with advertising?
EP: Yeah, the advertising was the jumping off point, but I think increasingly people are realizing you can do it with ideas as well. Personally I think the product recommendations are really helpful, but when that same structure is used to recommend information or ideas it's a different order of importance. You need to be much better when you're doing that.
WW: And the fact that many sites aren't politically neutral anymore means that just keeps going.
EP: Right, you get people directed to sites that only have a sense of their own views. I experienced this on Facebook myself, where I went out of my way to get to know people who have different political views than my friends, and Facebook didn't display their posts because its algorithms didn't think I was interested in them. So these things can be happening and you aren't aware of it.
WW: Do you have a solution?
EP: Part of it is just educating ourselves about how these filters work and what information they draw from, because the biggest danger here is using these tools believing you're getting objective information when it's actually tailored. The other piece is pushing these companies to be more transparent, to give us more control, and to incorporate a civic ethos into what they're doing.
WW: More transparent like, Amazon or Netflix where it explicitly says, "you like this, so you might like this," or something different?
EP: Yeah, I think that would at least call attention to the fact there is editing going on.
WW: Are there big sites that use this -- the New York Times?
EP: Well, the Times actually just switched from the tabs on the side to the "recommended for you" design. That's exactly this. I haven't totally figured out how the Times algorithm is designed, but it seems like they are trying to build an ethic into it. They're definitely moving in that direction. Both the Times and the Washington Post have invested heavily in startups to build these sort of "personalized news of the future" programs.
WW: Does this work if you're not logged into Facebook, or if you clear your Internet cache and everything?
EP: Google will still track about 57 different variables even if you're logged out. If you take a new laptop out of the box and put it on a table, there are 57 things it can read to obtain information about you, from what kind of laptop it is, to what kind of software you have installed, how big the fonts are, how long you hover over links.
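The kind of logged-out tracking Pariser describes is often called browser fingerprinting: combining signals a site can read without any login into a stable identifier. The sketch below uses a handful of invented signal values (not the actual 57 variables, which aren't enumerated here) to show the basic mechanism.

```python
import hashlib

# A few of the kinds of signals a site can read without any login.
# These values are hypothetical -- the real list Pariser cites has
# roughly 57 variables.
signals = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)",
    "screen": "1440x900",
    "fonts_installed": "Arial,Helvetica,Times",
    "default_font_size": "16px",
    "timezone": "UTC-7",
}

# Concatenate the signals in a stable order and hash them into a short
# fingerprint that can recognize the same browser across visits, even
# with cookies cleared.
blob = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
fingerprint = hashlib.sha256(blob.encode()).hexdigest()[:16]
print(fingerprint)
```

The point of the stable sort order is determinism: the same browser produces the same fingerprint on every visit, which is what makes it usable as an identifier without cookies.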
WW: Wait a minute -- is the font size a way to determine age?
EP: You know, presumably, but I don't know what they do with font size -- I just know it's something you can read from a browser. Really, these algorithms are just statistical prediction engines trying to find correlations in the data. So if someone has a bigger font size, they tend to click on this link first; smaller, then this link -- nobody ever says this is a proxy for age.
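The "statistical prediction engine" Pariser describes can be sketched as nothing more than conditional click counts over observed signals: the system never reasons about what a font size means, it just tallies which link each signal bucket tends to click first. The logs below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical logs: (font_size_bucket, link_clicked_first).
logs = [
    ("large", "news"), ("large", "news"), ("large", "weather"),
    ("small", "tech"), ("small", "tech"), ("small", "news"),
]

# Tally clicks per signal bucket -- pure correlation, no interpretation
# of what "large font" actually says about the user.
counts = defaultdict(lambda: defaultdict(int))
for bucket, link in logs:
    counts[bucket][link] += 1

def predict_first_click(bucket):
    """Return the link most often clicked first by users in this bucket."""
    links = counts[bucket]
    return max(links, key=links.get)

print(predict_first_click("large"))
print(predict_first_click("small"))
```

This is why "nobody ever says this is a proxy for age": the model only stores the correlation between bucket and click, never a human-readable explanation of it.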