If you don’t, I encourage you to take a close look at all of the privacy, security, and search settings of the web applications and search engines you use—because that’s exactly what they do.
For those who are unaware, many environments include algorithms that record the fact that you like "A" and infer from it that you'll also like "B". Based on that accumulated data, they feed you more and more of what they perceive to be your interests, to the exclusion of other perfectly valid, useful information. This is what Eli Pariser calls a "filter bubble".
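To make the mechanism concrete, here is a deliberately over-simplified sketch, in Python, of the "you liked A, so here's more of A" logic Pariser describes. The function names and the click-counting scheme are my own illustration, not any platform's actual code.

```python
from collections import Counter

def record_click(profile: Counter, topic: str) -> None:
    """Each click on a topic nudges the profile toward that topic."""
    profile[topic] += 1

def rank_feed(profile: Counter, candidates: list[str]) -> list[str]:
    """Items matching the inferred interests float to the top;
    everything else sinks, even if it is perfectly valid and useful."""
    return sorted(candidates, key=lambda topic: profile[topic], reverse=True)

# A reader who clicks mostly on "A" ...
profile = Counter()
for _ in range(5):
    record_click(profile, "A")
record_click(profile, "B")

# ... is shown "A" first, and rarely stumbles across "C" at all.
print(rank_feed(profile, ["C", "B", "A"]))  # ['A', 'B', 'C']
```

The point of the toy example is the feedback loop: every click sharpens the ranking, and the ranking shapes what you see, and therefore what you click on next.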
I bring it to your attention because it would seem to mean that we are less likely to stumble across material we're not looking for, and those chance encounters, to my way of thinking, serve up some of life's most profound learning experiences.
Don't get me wrong: I want to see content providers compensated for what they provide, and advertisers able to reach their audiences. But I don't believe in someone else deciding what I should see, hear, and read about, and certainly not without that intention being explained up front and prominently.
In his TED talk, Pariser quotes Eric Schmidt, Google’s CEO, as saying, “The power of individual targeting—the technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
Yes, every platform obviously needs to determine what content it chooses to present, but when you show me one thing and the next person something different, that's when I begin to worry. In the interest of transparency, I'd like to know, first, that you're doing it and, second, what assumptions you're making in deciding what to show me.
From Springer Link: Breaking the filter bubble: democracy and design…
From the NYT, a conversation that touches on the issue: Are We Becoming Cyborgs?…
From the WSJ, the Holman Jenkins article from which the Eric Schmidt quote was taken…
From the Internet Policy Review: Should we worry about filter bubbles?
Thoughts?