Wednesday, July 30, 2014

Popping the Filter Bubble
What is a Filter Bubble?
 

"We're all you know. You're beginning to believe the illusions we're spinning here. You're beginning to think that the tube is reality, and that your own lives are unreal. You do whatever the tube tells you! You dress like the tube, you eat like the tube, you raise your children like the tube, you even *think* like the tube! This is mass madness, you maniacs! In God's name, you people are the real thing! *WE* are the illusion!"-Network
   
     Sustainability relies a great deal upon understanding worldviews; to deal with the innumerable opinions and needs of people and effect actual change, one must understand why they do what they do. In the age of the internet there is no reason to think this would be difficult; the networks that connect us all should do nothing but facilitate interactivity and discourse about all manner of things. Unfortunately, that ideal is slowly becoming a distant fantasy, a byproduct of the fact that each person's internet experience is being tailored to them. Political and internet activist Eli Pariser calls this phenomenon the "filter bubble," noting that major browsers and websites are shifting to adapt their information flow to the interests of each user, rather than letting the user receive information as it comes. By examining which links a user selects or what information they "upvote," websites are becoming increasingly good at identifying exactly what each person wants from them.

      But while convenient, this phenomenon bodes poorly. Pariser's argument is that it will make our society increasingly divisive, drawing together close-knit groups that reject alternative ways of thinking and preventing cooperation and understanding across the globe. An NRA proponent, for instance, might never encounter arguments against gun ownership if this continues, and never bother to approach the issue from both sides. The particular issue doesn't matter; all sides need to be well informed to find middle ground, or they will constantly and unjustly demonize the opposition.

      Funnily enough, this phenomenon closely resembles the one the 1976 movie Network explored, in which the world outside the news channel was becoming increasingly irrelevant. Much of the film argues against this trend and urges people to spend less time absorbing the opinions of the "box," but the movie never arrives at a solution that still allows interaction with television. Since it is unlikely people will shun the internet, it is imperative for the survival of sustainability in the coming years that a solution be found; thankfully, unlike in Network, there are myriad possibilities for fixing the issue.

Solutions   


      Pariser's solution is for Google both to consider the public good when customizing searches and to open its search algorithms so everyone can understand how their results are individually filtered. The first sounds reasonable from a business standpoint alone, but the second is perhaps far-fetched; Google has declined to reveal such information because it is an integral part of what makes its product unique. Google could certainly give some vague idea of how it ranks results, but the nonspecificity of such a revelation would likely render it useless. Since everyone's results are customized differently, relevance would likely be a constantly changing factor.
      

      Even the first is not a perfect solution, though; there are significant issues with letting certain people determine what is relevant or important to each person. Pariser thinks there can be a good blend between important information and the "fun" results we may be more interested in, but there is a fine line between delivering information and politicizing a news feed. Saying certain things must appear may be just as bad as saying they shouldn't.
     

      As Levy notes, Google responded to Pariser by introducing Search, plus Your World, which could tailor results based upon friends' searches and interests or simply upon the available data. While the first is better than nothing and the second is democratizing, both still fail. If only friends' interests dictate the data, you can still easily be stuck in the filter bubbles of your social life; after all, we are more likely to be friends with those who share our beliefs. The second approach is good, but relevant and crucial information could easily slip past if people aren't careful. Furthermore, tailoring the internet has certain advantages, and it would be good to retain them. Thaler and Sunstein argue that while trying to shape someone's worldview may be somewhat immoral, offering them additional choices, so long as the options they originally wanted are still present, is perfectly fine and arguably justified as a public good.
     

      Stray has several suggestions for fixing these issues, and I think he correctly identifies that this problem needs to be solved from several angles. From arguing that the web needs curation of high-quality or socially relevant media to claiming that mapping web activity would identify biases, he touches upon the multi-dimensional nature of the issue. He suggests that websites offer settings to tailor information by categories such as importance, relevance, popularity, and diversity from one's own viewpoint. This is a nice idea and, assuming bias in selection is kept to a minimum, works well in theory; but considering that most people operate in "straight lines," rarely changing how they do something unless it's an emergency, I think proactive change is for the best. His argument is strongest when he says:

"Another possibility is to analyze social networks to look for alternate perspectives on whatever you're reading. If people in our personal social network all hold similar opinions, our filters could trawl for what people are reading outside of our home cluster, retrieving items which match our interests but aren’t on our social horizon.\

      The benefits of search customization could easily be paired with the diversity of other opinions; one of every five results, say, could come from a differing viewpoint. Alternatively, they could come from the pages of people who mostly disagree with your opinions but share a common interest in that one topic. Graells-Garrido et al. seem to believe that option is acceptable; their research indicates that a statistically significant portion of people are open to seeing diversity in their online experience.
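The "one of every five" idea above can be sketched in a few lines of code. This is purely illustrative: the two result pools, the 1-in-5 ratio, and the `diversify` function are all hypothetical assumptions for demonstration, not any real search engine's behavior or API.

```python
def diversify(personalized, diverse, every_n=5):
    """Interleave two result lists: every Nth slot is drawn from the
    'diverse' pool (opposing or outside viewpoints); all other slots
    keep the personalized ranking's order. Hypothetical sketch."""
    mixed = []
    p_iter = iter(personalized)
    d_iter = iter(diverse)
    slot = 1
    while True:
        source = d_iter if slot % every_n == 0 else p_iter
        item = next(source, None)
        if item is None:
            # One pool ran dry; append whatever remains from the other.
            mixed.extend(p_iter)
            mixed.extend(d_iter)
            return mixed
        mixed.append(item)
        slot += 1

# Illustrative usage: eight in-bubble results, two from outside it.
personalized = [f"in-bubble-{i}" for i in range(1, 9)]
outside = ["counterpoint-a", "counterpoint-b"]
print(diversify(personalized, outside))
```

The ratio could just as easily be a user-facing setting, in the spirit of Stray's suggestion that sites expose diversity controls directly.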

      That's not to say that the filter bubble can be popped in one fell swoop, but change can be slowly integrated into how we use the internet. For both our sake and the future of sustainability, it will remain important in the coming years to see how other sides view various arguments and to meaningfully interact with them in the digital world.

Citations


Ali T. Bubble 2. Flickr.com. 23 Aug, 2008. Photo. Retrieved from http://bit.ly/1xBG2tt

Az, d-221 books. Flickr.com. 16 Oct, 2010. Photo. Retrieved from http://bit.ly/1oNMpcr

Graells-Garrido et al. Data Portraits: Connecting People of Opposing Views. Cornell University Library. 19 Nov, 2013. Digital.

Internet Movie Database. Network. Photo. Retrieved from http://www.imdb.com/title/tt0074958/

Jairoagua. Networking. Flickr.com. 5 Jul, 2005. Photo. Retrieved from http://bit.ly/1o5Rz2R

Levy, S. Has Google Popped the Filter Bubble? Wired.com. 10 Jan, 2012. Digital. Retrieved from http://www.wired.com/2012/01/google-filter-bubble/

Pariser, E. "Beware online filter bubbles." Ted.com. Mar, 2011. Video. Retrieved from http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles

Stray, J. Are we stuck in filter bubbles? Here are five paths out. Nieman Journalism Laboratory, Harvard University. 11 Jul, 2012. Digital. Retrieved from http://bit.ly/1zxswdc

Thaler, R. and Sunstein, C. Nudge: Improving Decisions about Health, Wealth, and Happiness. Yale University Press, 8 Apr, 2008.
