A Web for One: the danger of aggressive personalization
Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sects and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life – John Dewey
The technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them – Eric Schmidt, Google CEO
On December 4, 2009, Google published a post on its official blog headlined “Personalized Search for everyone”. For some, this is the day the world changed – and not in a good way.
Eli Pariser and others picked up on the profound significance of this. Personalized search means that, based on 57 different inputs (“signals”), Google will deliver custom search results to you. Put another way, the search results presented to you will be unique to your profile. Compared with others running the same search, your results may include or exclude different links, depending on how your signals and profile differ from theirs. In a sense, you will see a web customized for you – a web constructed specifically for you, a single individual, that may be different for everyone else. You might think this is a good thing. Is it?
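To make the idea concrete, here is a minimal sketch of signal-based re-ranking. This is not Google's actual algorithm – the signal names, weights, and users below are invented for illustration – but it shows how the same query can return differently ordered results for two people once per-user signals enter the scoring.

```python
# Illustrative sketch only (NOT Google's real ranking algorithm):
# re-rank a shared result set using a per-user profile of "signals",
# so two users issuing the same query see different orderings.

# Hypothetical per-user topic affinities; names and values are invented.
PROFILES = {
    "alice": {"tech": 0.9, "politics": 0.1},
    "bob":   {"tech": 0.2, "politics": 0.8},
}

# Candidate results, each with a topic tag and a base relevance score.
RESULTS = [
    {"url": "example.com/gadget-review", "topic": "tech",     "base": 0.6},
    {"url": "example.com/election-news", "topic": "politics", "base": 0.6},
]

def personalized_rank(user, results):
    """Boost each result's base score by the user's affinity for its topic,
    then sort descending - identical queries, personalized orderings."""
    profile = PROFILES[user]
    scored = [(r["base"] * (1 + profile.get(r["topic"], 0)), r["url"])
              for r in results]
    return [url for _, url in sorted(scored, reverse=True)]

print(personalized_rank("alice", RESULTS))  # tech story ranked first
print(personalized_rank("bob", RESULTS))    # politics story ranked first
```

Note that both users start from the same candidate results with identical base scores; the divergence comes entirely from the profile multiplier – which is exactly the "You Loop" concern discussed below.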
To make an analogy: it is similar to the multiverse theory in physics, but applied to information. What Google will present to you is a customized universe of information tailored just for you.
The profundity of this should be obvious. You will be caught in a “You Loop”. What will be “erased” is the diversity of ideas, opinions, thoughts, and hard information that the algorithm decides is not relevant to you. And it’s not just Google that is doing aggressive personalization.
What happens to public debate when we rely on information from the Internet but come to realize that we no longer get objective results from search engines, and instead receive a universe of information that has been “synthesized” specifically for us by an algorithm? What is the nature of that algorithm, and how will it steer public opinion in politics, society, and consumer culture?
Watch this TED presentation by Eli Pariser on filter bubbles.
Read some related articles –
In the articles below, treat the terms “Choice Architecture” and “Personalization” as interchangeable, and consider the power of this new search-engine strategy.