Google CEO Eric Schmidt enthuses that "the product I've always wanted to build is Google code that will guess what I'm trying to type." Google Instant, which guesses what you're searching for as you type and was rolled out in the fall of 2010, is just the start. Schmidt believes that what customers want is for Google to "tell them what they should be doing next".
Personalisation could easily have a hand not only in who goes on a date with whom but in where they go and what they talk about. The algorithms that orchestrate our ads are starting to orchestrate our lives.
Three dynamics of the filter bubble:
First, you’re alone in it. A cable channel that caters to a narrow interest (say, golf) has other viewers with whom you share a frame of reference. But you’re the only person in your bubble.
Second, the filter bubble is invisible. Most viewers of conservative or liberal news sources know that they’re going to a station curated to serve a particular political viewpoint. But Google’s agenda is opaque. Google doesn’t tell you who it thinks you are or why it’s showing you the results you’re seeing. You don’t know if its assumptions about you are right or wrong – and you might not even know it’s making assumptions about you in the first place.
Finally, you don’t choose to enter the bubble. When you turn on Fox News or read The Nation, you’re making a decision about what kind of filter to use to make sense of the world. It’s an active process, and like putting on a pair of tinted glasses, you can guess how the editors’ leanings shape your perception. You don’t make the same kind of choice with personalised filters. They come to you – and because they drive up profits for the Web sites that use them, they’ll become harder and harder to avoid.
It was nearly impossible to guess how the algorithms would shape the experience of any given user. There were simply too many variables and inputs to track. So while Google can look at overall clicks, it’s much harder to say how it’s working for any one person.
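The point can be made concrete with a toy sketch. This is not Google's algorithm – the signal names, weights, and scoring function below are all invented – but it shows how the same ranking code, fed different per-user weights, produces a different front page for each person, even though each individual ranking is deterministic:

```python
def personalised_score(doc, weights):
    # Weighted sum of document signals -- a stand-in for a real ranking
    # function, which would combine far more signals than these two.
    return sum(weights[k] * v for k, v in doc["signals"].items())

def rank(docs, weights):
    return sorted(docs, key=lambda d: personalised_score(d, weights),
                  reverse=True)

docs = [
    {"id": "breaking-news", "signals": {"recency": 0.9, "popularity": 0.2}},
    {"id": "viral-story",   "signals": {"recency": 0.2, "popularity": 0.9}},
    {"id": "local-report",  "signals": {"recency": 0.5, "popularity": 0.5}},
]

# Two users whose inferred interests weight the same signals differently.
user_a = {"recency": 1.0, "popularity": 0.1}  # follows current events
user_b = {"recency": 0.1, "popularity": 1.0}  # clicks what's trending

print([d["id"] for d in rank(docs, user_a)])  # breaking-news ranked first
print([d["id"] for d in rank(docs, user_b)])  # viral-story ranked first
```

With only two signals and two users the divergence is easy to see; with thousands of signals learned per user, predicting any one person's results from the outside becomes effectively impossible, while aggregate click statistics remain easy to measure.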
Our bodies are programmed to consume fat and sugars because they’re rare in nature … In the same way, we’re biologically programmed to be attentive to things that stimulate: content that is gross, violent, or sexual, and gossip that is humiliating, embarrassing, or offensive. If we’re not careful, we’re going to develop the psychological equivalent of obesity. We’ll find ourselves consuming content that is least beneficial for ourselves or society as a whole. [danah boyd, speech at the 2009 Web 2.0 Expo]
By definition, a world constructed from the familiar is a world in which there’s nothing to learn. If personalisation is too acute, it could prevent us from coming into contact with the mind-blowing, preconception-shattering experiences and ideas that change how we think about the world and ourselves.
It’s easy to push the “Like” button and increase the visibility of a friend’s post about finishing a marathon or an instructional article about how to make onion soup. It’s harder to push the “Like” button on an article titled “Darfur sees bloodiest month in two years”. In a personalised world, important but complex or unpleasant issues – the rising prison population, for example, or homelessness – are less likely to come to our attention at all.
Your identity shapes your media. There’s just one flaw in this logic. Media also shape identity.
If a self-fulfilling prophecy is a false definition of the world that through one’s actions becomes true, we’re now on the verge of self-fulfilling identities, in which the internet’s distorted picture of us becomes who we really are.
We’ve seen the pendulum swing from the anonymity of the early Internet to the one-identity view currently in vogue; the future may look like something in between.
“Why would I have done x if I weren’t a person who does x? Therefore I must be a person who does x.” Each click you take in this loop is another act of self-justification.
A computer can be made blind to race and gender in ways that humans usually can’t. But that’s only if the relevant algorithms are designed with care and acuteness. Otherwise, they’re likely to simply reflect the social mores of the culture they’re processing – a regression to the social norm.
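A minimal sketch of why "blindness" alone isn't enough. The data below is entirely invented for illustration: past decisions are skewed against one group, and group membership happens to correlate perfectly with postcode. A model that never sees the group column, and simply learns approval rates per postcode, reproduces the disparity anyway:

```python
# Invented historical decisions: biased by group, and group
# correlates perfectly with postcode in this toy data.
history = [
    # (postcode, group, approved)
    ("A", "x", True),  ("A", "x", True),  ("A", "x", True),  ("A", "x", False),
    ("B", "y", False), ("B", "y", False), ("B", "y", False), ("B", "y", True),
]

def postcode_approval_rate(rows, postcode):
    outcomes = [approved for pc, _group, approved in rows if pc == postcode]
    return sum(outcomes) / len(outcomes)

def blind_model(postcode):
    # "Group-blind": decides purely from postcode history,
    # never looking at the group column at all.
    return postcode_approval_rate(history, postcode) >= 0.5

print(blind_model("A"))  # True  -- group x's postcode is approved
print(blind_model("B"))  # False -- group y's postcode is rejected
```

Because the proxy feature carries the group information, removing the protected attribute changes nothing; designing with "care and acuteness" means auditing for exactly this kind of correlated signal.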
Part of what’s troubling about this world is that companies aren’t required to explain on what basis they’re making these decisions. And as a result, you can get judged without knowing it and without being able to appeal.
The statistical models that make up the filter bubble write off the outliers. But in human life it’s the outliers who make things interesting and give us inspiration. And it’s the outliers who are the first signs of change.
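How a model "writes off" outliers can be shown with a hypothetical popularity-based recommender (the item names and threshold are invented). Anything clicked fewer times than a minimum-support cutoff is never eligible for recommendation, no matter how important it might be:

```python
from collections import Counter

clicks = (["mainstream-a"] * 50 + ["mainstream-b"] * 40 +
          ["fringe-idea"] * 3)  # the outlier: rarely clicked

def recommend(click_log, k=2, min_support=5):
    # Recommend the k most-clicked items, dropping anything below the
    # support threshold -- outliers are filtered out by design.
    counts = Counter(click_log)
    eligible = [item for item, n in counts.most_common() if n >= min_support]
    return eligible[:k]

print(recommend(clicks))        # only the mainstream items survive
print(recommend(clicks, k=10))  # even with room, fringe-idea never appears
```

The cutoff exists for good statistical reasons (rare items are noisy), but its side effect is exactly the one described above: the early signals of change are precisely the items the filter discards.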
Big companies represent new loci of power. And while their multinational character makes them resistant to some forms of regulation, they can also offer one-stop shopping for governments seeking to influence information flows.
“Consciously or unconsciously, deliberately or inadvertently, societies choose structures for technologies that influence how people are going to work, communicate, travel, consume and so forth over a very long time” (Winner 1980). This isn’t to say that today’s designers have malevolent impulses, of course – or even that they’re always explicitly trying to shape society in certain ways. It’s just to say that they can – in fact, they can’t help but shape the worlds they build.
Technodeterminism is alluring and convenient for newly powerful entrepreneurs because it absolves them of responsibility for what they do.
While the Internet offers access to a dazzling array of sources and options, in the filter bubble we’ll miss many of them. While the Internet can give us new opportunities to grow and experiment with our identities, the economics of personalisation push toward a static conception of personhood. While the Internet has the potential to decentralise knowledge and control, in practice it’s concentrating control over what we see and what opportunities we’re offered in the hands of fewer people than ever before.
We live in an increasingly algorithmic society, where our public functions, from police databases to energy grids to schools, run on code. We need to recognise that societal values about justice, freedom, and opportunity are embedded in how code is written and what it solves for. Once we understand that, we can begin to figure out which variables we care about and imagine how we might solve for something different.