
The troubling future of Internet search: Data customization is giving rise to a private information universe at the expense of a free and fair flow of information, says the former executive director of MoveOn.org.

Someday soon, Google hopes to make the search box obsolete. Searching will happen automatically.

"When I walk down the street, I want my smartphone to be doing searches constantly--'did you know?' 'did you know?' 'did you know?' 'did you know?' In other words, your phone should figure out what you would like to be searching for before you do," says Google CEO Eric Schmidt.

This vision is well on the way to being realized. In 2009, Google began customizing its search results for all users. If you tend to use Google from a home or work computer or a smartphone--i.e., an IP address that can be traced back to a single user (you)--the search results you see incorporate data about what the system has learned about you and your preferences. The Google algorithm of 2011 not only answers questions, but it also seeks to divine your intent in asking and give results based, in part, on how it perceives you.

This shift speaks to a broader phenomenon. Increasingly, the Internet is the portal through which we view and gather information about the larger world. Every time we seek out some new bit of information, we leave a digital trail that reveals a lot about us, our interests, our politics, our level of education, our dietary preferences, our movie likes and dislikes, and even our dating interests or history. That data can help companies like Google deliver you search engine results in line with what it knows about you.

Other companies can use this data to design Web advertisements with special appeal. That customization changes the way we experience and search the Web. It alters the answers we receive when we ask questions. I call this the "filter bubble" and argue that it's more dangerous than most of us realize.

In some cases, letting algorithms make decisions about what we see and what opportunities we're offered gives us fairer results. A computer can be made blind to race and gender in ways that humans usually can't. But that's only if the relevant algorithms are designed with care and acuteness. Otherwise, they're likely to simply reflect the social mores of the culture they're processing--a regression to the social norm.

The use of personal data to provide a customized search experience empowers the holders of data, particularly personal data, but not necessarily the seekers of it. Marketers are already exploring the gray area between what can be predicted and what predictions are fair. According to Charlie Stryker, a financial services executive who's an old hand in the behavioral targeting industry, the U.S. Army has had terrific success using social-graph data to recruit for the military--after all, if six of your Facebook buddies have enlisted, it's likely that you would consider doing so, too. Drawing inferences based on people like you or people linked to you is pretty good business.

And it's not just the Army. Banks, too, are beginning to use social data to decide to whom to offer loans. If your friends don't pay on time, it's likely that you'll be a deadbeat, too. "A decision is going to be made on creditworthiness based on the creditworthiness of your friends," says Stryker.
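To see how little it takes to build the kind of inference Stryker describes, consider a minimal, purely hypothetical sketch in Python. The names, the friend list, and the scoring rule below are invented for illustration; no real lender's model is this simple, and none is disclosed here.

# Illustrative sketch only: a naive "social graph" credit signal of the kind
# Stryker describes. All names and figures are hypothetical.

def friend_default_rate(applicant, defaults, friends):
    """Estimate an applicant's default risk as the share of their
    friends who have previously defaulted on a loan."""
    linked = friends.get(applicant, [])
    if not linked:
        return None  # no social signal available
    defaulted = sum(1 for f in linked if f in defaults)
    return defaulted / len(linked)

# Hypothetical data: who has defaulted, and who is linked to whom.
defaults = {"bob", "carol"}
friends = {"alice": ["bob", "carol", "dave", "erin"]}

rate = friend_default_rate("alice", defaults, friends)
print(f"Predicted default risk for alice: {rate:.0%}")  # prints 50%

The point of the toy example is structural: the applicant's own payment history never enters the calculation. She is judged entirely by the company she keeps, which is exactly the unfairness the next paragraphs take up.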

If it seems unfair for banks to discriminate against you because your high-school buddy is bad at paying his bills or because you like something that a lot of loan defaulters like, well, it is. And it points to a basic problem with induction, the logical method by which algorithms use data to make predictions. When you model the weather and predict there's a 70% chance of rain, it doesn't affect the rain clouds. It either rains or it doesn't. But when you predict that, because my friends are untrustworthy, there's a 70% chance that I'll default on my loan, there are consequences if you get me wrong. You're discriminating.

One of the best critiques of algorithmic prediction comes, remarkably, from the late nineteenth-century Russian novelist Fyodor Dostoevsky, whose Notes from Underground was a passionate critique of the Utopian scientific rationalism of the day. Dostoevsky looked at the regimented, ordered human life that science promised and predicted a banal future. "All human actions," the novel's unnamed narrator grumbles, "will then, of course, be tabulated according to these laws, mathematically, like tables of logarithms up to 108,000, and entered in an index ... in which everything will be so clearly calculated and explained that there will be no more incidents or adventures in the world."

The world often follows predictable rules and falls into predictable patterns: Tides rise and fall, eclipses approach and pass; even the weather is more and more predictable. But when this way of thinking is applied to human behavior, it can be dangerous, for the simple reason that our best moments are often the most unpredictable ones. An entirely predictable life isn't worth living. But algorithmic induction can lead to a kind of information determinism, in which our past clickstreams entirely decide our future. If we don't erase our Web histories, in other words, we may be doomed to repeat them.

Exploding the Bubble

Eric Schmidt's idea, a search engine that knows what we're going to ask before we do, sounds great at first. We want the act of searching to get better and more efficient. But we don't want to be taken advantage of, to be pigeon-holed, stereotyped, or discriminated against based on the way a computer program views us at any particular moment. The question becomes, how do you strike the right balance?

In 1973, the Department of Health, Education, and Welfare under Nixon recommended that regulation center on what it called Fair Information Practices:

* You should know who has your personal data, what data they have, and how it's used.

* You should be able to prevent information collected about you for one purpose from being used for others.

* You should be able to correct inaccurate information about you.

* Your data should be secure.

Nearly forty years later, the principles are still basically right, and we're still waiting for them to be enforced. We can't wait much longer: In a society with an increasing number of knowledge workers, our personal data and "personal brand" are worth more than they ever have been. A bigger step would be to put in place an agency to oversee the use of personal information. The European Union and most other industrial nations have this kind of oversight, but the United States has lagged behind, scattering responsibility for protecting personal information among the Federal Trade Commission, the Commerce Department, and other agencies. As we enter the second decade of the twenty-first century, it's past time to take this concern seriously.

None of this is easy: Private data is a moving target, and balancing the interests of consumers and citizens against those of the companies that hold the data will take a lot of fine-tuning. At worst, new laws could be more onerous than the practices they seek to prevent. But that's an argument for doing this right and doing it soon, before the companies that profit from private information have even greater incentives to try to block such rules from passing.


Eli Pariser is the board president and former executive director of the 5-million-member organization MoveOn.org. This essay is excerpted from his book, The Filter Bubble: What the Internet Is Hiding From You. Reprinted by arrangement with The Penguin Press, a member of Penguin Group (USA), Inc. Copyright © 2011 by Eli Pariser.
