In The Filter Bubble, Eli Pariser explores
how personalization on the Internet is surreptitiously altering what people
read and how they think. He weaves together strands of technology reporting,
media studies theory, political and journalistic history, psychological
research, and pop culture references into an intriguing argument. In their
attempt to personalize everything, Internet entities (Google, Facebook, etc.) create
a “filter bubble” that is stunting creativity and frustrating
democracy. In this book, Pariser attempts to push back against this trend of
hyper-personalization and “burst” the filter bubble. Eli Pariser is an author,
political and Internet activist, chief executive at Upworthy (a website devoted
to “meaningful” web content), president of MoveOn.org (a progressive public
policy activist group), and co-founder of Avaaz (a global civic organization).

Pariser begins by diagnosing the problem he calls “the filter bubble”. In a competitive
race for users’ clicks and advertisers’ dollars, Internet giants have begun
tailoring each user’s experience to that user’s online preferences and habits.
But because the filter bubble is solitary (it pulls us apart), invisible (we
don’t know what algorithms are working where), and involuntary (automatic
participation means you must opt out,
not in), the whole filtering process is largely unnoticed and therefore
particularly dangerous. The aggression and analytical ambition that fuel
start-up companies do not magically disappear when those companies become
Internet giants that “rule the world”. (181)
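
To make the mechanics concrete: as a purely illustrative sketch (my construction, not code from the book or from any of these companies), a relevance filter of this kind might in Python simply score items against a user’s click history and silently drop everything else:

```python
from collections import Counter

def personalize(stories, click_history, k=2):
    """Rank stories by overlap with topics the user has already clicked.

    stories: list of (title, set_of_topics) pairs
    click_history: topic strings from the user's past clicks
    Returns only the top-k stories; the rest are silently filtered out.
    """
    interests = Counter(click_history)
    def score(story):
        _, topics = story
        return sum(interests[t] for t in topics)
    return sorted(stories, key=score, reverse=True)[:k]

# The user never opted in and never sees what was removed.
history = ["tech", "tech", "sports", "tech"]
stories = [
    ("New phone released", {"tech"}),
    ("Local election results", {"politics"}),
    ("Championship recap", {"sports"}),
    ("Drought worsens", {"environment"}),
]
print(personalize(stories, history))
# -> the tech and sports stories; politics and the environment never surface
```

The sketch captures the “invisible” and “involuntary” qualities Pariser emphasizes: the filtering happens by default, and nothing tells the user that half of the stories were withheld.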

The filter bubble commodifies user data and activity for sale to the highest bidder—typically
advertisers who are keen on showing their products to interested parties. In
its infancy, the Internet was celebrated for its disintermediating potential,
but celebrants failed to predict how the absence of a middleman would affect
content and attention. For example, in the realm of journalism, it is now much
easier to go to an aggregated news site that collects stories from smaller
sources, all of which are relevant to you,
than to spend the effort clicking around those sources directly. However,
“while personalization is changing our experience of news, it’s also changing
the economics that determine what stories get produced.” (69) Thus, instead of
reporting on important or worthy news, news agencies have a vested interest in
publishing stories that are likely to garner lots of online attention.

Internet giants are offering us more convenience, but “in exchange for convenience, you
hand over some privacy and control to the machine.” (213) As technology
continues to develop, even non-Internet corporations are figuring out new ways
to pursue consumers. Whether creating their own media content or augmenting
your “reality” to highlight their products, these companies are coming after us
with more aggression and less transparency.

Pariser describes the delicate cognitive balance that has historically been the driving
force behind human creativity and ingenuity: our brains automatically “tread a
tightrope between learning too much from the past and incorporating too much
new information from the present.” (84) Thus, by measuring the novel and
unknown against the established and known, we can integrate useful new ideas into
the canon of human knowledge. Yet, the filter bubble removes the unknown from
our horizon, which lessens the impetus (and possibility) to learn and innovate.
“Left to their own devices, personalization filters serve up a kind of
invisible autopropaganda, indoctrinating us with our own ideas, amplifying our
desire for things that are familiar and leaving us oblivious to the dangers
lurking in the dark territory of the unknown.” (15) Any cognitive dissonance or
tension is flattened until a user’s Google self, Facebook self, Amazon self,
etc. all become the same and any chance for serendipity or identity
experimentation is nullified.
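
The narrowing Pariser describes is, at bottom, a feedback loop: the filter shows what it thinks you like, you can only click on what it shows, and your clicks retrain the filter. A small hypothetical simulation (my illustration, not the book’s) makes the dynamic visible:

```python
import random
from collections import Counter

random.seed(7)
topics = ["tech", "politics", "sports", "arts", "science"]
history = list(topics)  # the user starts with broad interests

for round_no in range(1, 13):
    weights = Counter(history)
    # Filter: pick three stories, biased toward already-clicked topics.
    shown = random.choices(topics, weights=[weights[t] for t in topics], k=3)
    # User: clicks one of the stories that were actually shown.
    history.append(random.choice(shown))
    if round_no in (1, 12):
        print(round_no, shown)
# Each click skews the weights further toward itself, so later rounds
# tend to repeat the same few topics: the autopropaganda loop in miniature.
```

Nothing in the loop is malicious; the narrowing simply falls out of the incentive to show “relevant” content.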

The dangers of the filter bubble extend to the political sphere as well. Because people are
shown only news on issues that are “relevant” to them, public issues that
should be of at least marginal interest to everyone are ignored. Because the
same symbol or event means different things to different people, a fragmented
public is harder to lead. “Democracy requires citizens to see things from one
another’s point of view, but instead we’re more and more enclosed in our own
bubbles. Democracy requires a reliance on shared facts; instead, we’re being
offered parallel but separate universes.” (5) Citizens must be willing and able
to see beyond their own narrow self-interests, but the filter bubble makes this
increasingly difficult.

Pariser concludes with a few possible solutions to the filter bubble dilemma. Consumers
can vary their online activity, break habitual website patronage, and
choose transparent sites like Twitter over Facebook, which is notoriously murky
about its privacy policies. Companies can do their part as well. By being more
transparent and giving users the option to surf through relevant or novel
material, they can break the over-personalization cycle. Furthermore,
government can be more responsible about holding companies accountable
(regarding users’ control of their privacy) instead of succumbing to the
companies’ deluge of lobbyists.

This book advances the understanding of Mass Communication in several ways. Primarily,
Pariser is to be commended for bringing to the public’s attention (this book is a
New York Times bestseller) an issue that is universally important. This affects all
of us[1], in
almost every area of our lives, and whether or not you agree with Pariser’s
diagnosis, the issue certainly warrants discussion. Furthermore, this book
encourages people to be more conscious of the media they consume, from the
mundane websites we check every day to the information that shapes how we vote. Few areas of
electronic activity are unaffected by Pariser’s argument.

Overall, The Filter Bubble is largely successful in communicating its
message, but there are some failings. Pariser neglects to acknowledge that most
of the entities he criticizes (Google, Facebook, etc.) do offer users the option to turn off
personalization and return unfiltered results. It’s simply that the default settings
of these services are geared toward personalization. True, most users may never
change (or even be aware of) these settings, but
the possibility bears mention.

Pariser’s argument also runs the risk
of turning on itself. Through personalization, Facebook and Google indirectly control our online
experiences, so Pariser’s solution is that they should expose us to content we
don’t want. This would, in fact, put them in direct control. But who decides what is important for everyone? Do
we really want Google and Facebook determining what is important for us?
Furthermore, what incentives would
companies have to sell their products online if they aren’t going to reach
relevant customers? Pariser decries Cass Sunstein’s Republic.com for advocating a naïve “fairness doctrine” (where
companies willingly sacrifice profits for a more Marxist Internet structure),
but then advocates a similar set of solutions. Will companies ever spend
resources on something that doesn’t contribute—and might even be detrimental—to
their bottom line?

It is difficult to tell if Pariser
resorts to hyperbole in order to scare the reader into agreeing with him. According
to Pariser, RFID chips, ambient intelligence, DNA, and behavioral data make it
possible to “run statistical regression analysis on an entire society.” (199)
Whether this dystopian vision of a techno-society is a legitimate threat or
not remains to be seen, but at least Pariser’s experience in political and
Internet activism makes it credible enough to consider.

Pariser’s argument is strengthened by
some brilliant ideas. The notion of a falsifiability
algorithm for Amazon—one that tries to disprove its conception of you with
random genre suggestions—is simple and (more importantly) practical, since it
would actually give the company a more accurate and dynamic picture of who you
are. Other suggestions, like “a slider bar running from ‘only stuff I like’ to
‘stuff other people like that I’ll probably hate’” (235), are less likely to
be implemented but are noble suggestions nonetheless. His general approach to thinking about
personal information “as a form of property” (240), instead of merely
sacrificing it for a little convenience, is a step in the right
direction. Pariser’s Filter Bubble
moves us toward balancing the current asymmetries of knowledge, data, and
therefore power.
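
Both ideas are concrete enough to sketch in code. The following is a hypothetical illustration of the two suggestions as the review describes them (not actual Amazon code; the names and parameters are my own inventions): a novelty slider that blends familiar picks with unfamiliar ones, plus an occasional random out-of-profile probe meant to falsify the system’s current model of the user.

```python
import random

def recommend(catalog, liked_genres, novelty=0.0, probe_rate=0.1, k=4):
    """Blend familiar and novel suggestions.

    catalog: list of (title, genre) pairs
    liked_genres: genres the system currently believes the user likes
    novelty: slider from 0.0 ("only stuff I like") toward 1.0
             ("stuff other people like that I'll probably hate")
    probe_rate: chance of swapping in a random out-of-profile item,
                a falsification test of the user model
    """
    familiar = [c for c in catalog if c[1] in liked_genres]
    novel = [c for c in catalog if c[1] not in liked_genres]
    n_novel = round(novelty * k)
    picks = (random.sample(familiar, min(k - n_novel, len(familiar))) +
             random.sample(novel, min(n_novel, len(novel))))
    # Falsifiability probe: if the user clicks this random suggestion,
    # the system's conception of the user is disproved and must adapt.
    if picks and novel and random.random() < probe_rate:
        picks[-1] = random.choice(novel)
    return picks

catalog = [("Dune", "sci-fi"), ("Emma", "classic"), ("Dracula", "horror"),
           ("Neuromancer", "sci-fi"), ("Walden", "nature"), ("It", "horror")]
print(recommend(catalog, {"sci-fi"}, novelty=0.5))
```

Setting novelty to 0.0 reproduces today’s behavior; anything above it deliberately reintroduces the serendipity that, on Pariser’s account, the filter bubble removes.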

Pariser’s TED talk on the subject can be found here: http://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles.html

[1] Well, all of us in the First World. But as our technology spreads, the problems
of the filter bubble are flowing into Third World societies as well. Moreover,
sites like Facebook and Google are global in scale.