As a Mashable reader, you’re probably well aware that Facebook’s News Feed and Google’s search results adjust based on your behavior and demonstrated preferences. But are these and the web’s other personalization algorithms making us less informed as a society? That’s the argument Eli Pariser, the former executive director of MoveOn.org, made in a TED talk on Thursday in Long Beach, California.
Pariser started his talk by noting a trend he saw on Facebook. Over time, he said, the conservative friends he had started following to ensure a diverse set of viewpoints (Pariser describes himself as politically “progressive”) gradually disappeared from his feed. As he would soon discover, that was because he clicked far more frequently on the links posted by his more liberal friends.
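To make the mechanism concrete, here is a minimal, purely hypothetical sketch of the kind of engagement-based filtering Pariser is describing. It is not Facebook’s actual algorithm; the friend names, the click history, and the two-click threshold are all invented for illustration. The effect, though, is the one he noticed: sources you rarely click on quietly fall out of your feed.

```python
from collections import defaultdict

# Hypothetical click history: which friends' links this user has clicked.
click_history = ["liberal_friend_a", "liberal_friend_a", "liberal_friend_b",
                 "conservative_friend_x"]

clicks = defaultdict(int)
for friend in click_history:
    clicks[friend] += 1

def personalized_feed(posts, min_clicks=2):
    """Keep only posts from friends whose links were clicked at least min_clicks times."""
    return [post for post in posts if clicks[post["author"]] >= min_clicks]

posts = [
    {"author": "liberal_friend_a", "link": "coverage of the protests"},
    {"author": "conservative_friend_x", "link": "an op-ed"},
]

# Only the frequently clicked friend's post survives; the other disappears without notice.
print(personalized_feed(posts))
```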
This “invisible algorithmic editing of the web,” as Pariser describes it, “moves us to a world where the Internet shows us what it thinks we need to see, but not what we should see.” Beyond Facebook, Pariser notes the wide divergence in the Google search results his friends get for topics like Egypt: one friend sees news about the recent protests and Lara Logan, while another sees results about travel and vacations.
In turn, Pariser believes we’re collectively creating what he calls a personal “filter bubble,” which is also the title of his book on the subject due out in May. And while he stops short of arguing that the trend toward personalization must end, he says the likes of Facebook and Google need to “have a sense of civic responsibility, and let us know what’s getting filtered … [and offer] controls to let us decide what gets through and what doesn’t.”
What he’d like to see is an information world that “gives us a bit of Justin Bieber and a bit of Afghanistan,” marked by controls that let us filter content by its relevance, its importance, how comfortable it is (some topics are difficult to read about or discuss), how challenging it is, and its point of view (with an option to see “alternative” ones).
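For a sense of what such controls might look like under the hood, here is a toy sketch in that spirit. The scoring dimensions (relevance, importance, challenge, viewpoint diversity), the sample items, and the weights are all hypothetical, not anything Pariser or the platforms have specified; the point is simply that the reader, not the platform, sets the dials.

```python
def rank_items(items, weights):
    """Sort items by a user-weighted sum of hypothetical filtering dimensions."""
    dimensions = ("relevance", "importance", "challenge", "viewpoint_diversity")
    def score(item):
        return sum(weights.get(dim, 0.0) * item.get(dim, 0.0) for dim in dimensions)
    return sorted(items, key=score, reverse=True)

items = [
    {"title": "Justin Bieber tour dates", "relevance": 0.9, "importance": 0.2,
     "challenge": 0.1, "viewpoint_diversity": 0.1},
    {"title": "Afghanistan policy analysis", "relevance": 0.3, "importance": 0.9,
     "challenge": 0.8, "viewpoint_diversity": 0.7},
]

# A reader who dials up "importance" and "challenge" sees the harder story first.
for item in rank_items(items, {"relevance": 0.2, "importance": 0.5, "challenge": 0.3}):
    print(item["title"])
```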
Of course, much of that cuts against the history of the Internet, which has been marked by its ability to connect like-minded people, for good and for ill. In other words, Google and Facebook could build such controls and even bake more human editing into their algorithms, but do people even want them?
What do you think? Is the personalization of the Web making us less informed? Do the companies driving innovation on the web have a civic responsibility to give us a fuller world view? Sound off in the comments.