Monday, November 7, 2011

Filters: our Shared Responsibility

A final look, I promise, at "The Filter Bubble", by Eli Pariser. (Part 1, Part 2)


There's a huge question about whether unbiased content is really a meaningful thing to talk about. In the same way that Pariser skewers the myth of disintermediation, the idea that there could be an unmediated set of search results for a given term, an unbiased newspaper front page, or a listing of TV shows that everyone in the world would agree is value-neutral probably needs skewering too.

If you've ever written a web app of any complexity, you'll know that getting everyone to agree on what should appear as the result of a search can be difficult, even for the simplest of things. Actually, especially for the simplest of things. When you factor in the different ways in which the list can be ordered, well, the idea that there is some neutral way that Bing (thought I'd give Google a break) can be expected to index and organize the world's information starts to look a little ... lacking in rigour. It's already well-filtered before it gets to you. It had better be. That's the value of a search engine. Looked at this way, any further filtering in the form of personalization that you apply to most web content is merely an extra lettuce leaf placed on top of a huge pile of lettuce leaves (I'm trying that phrase out, see how it goes).

Kickin' it old school

As an aside, there's one particular reason above all others why "The Filter Bubble" endears itself to me: cameo appearances by great writers. To be honest, sometimes in books like this you can get a little bit lost in a 25-page chapter dealing with some subtle point about the problem of induction: good meaty stuff, very worthy, but if I'm tired or distracted I may just drift a wee bit. But someone who pays homage to our shared literary canon by bringing in some great author as regularly as Pariser does snaps me out of it and wins me over every time.

It's also an effective pedagogical technique, if quite a stretch, to introduce the likes of Dostoyevsky at the tail end of a critique of algorithmic prediction techniques. But you have to admit, in "Notes from Underground", the great man pretty much nailed it: "All human actions will be tabulated according to these laws, mathematically, like tables of logarithms up to 108,000 and entered in an index...". Clustered, hopefully. Also making guest appearances in "The Filter Bubble" are Asimov (not so strange I suppose), Kafka, and Nabokov. Perhaps inspired by such exalted company Pariser rises on occasion to some tasty penmanship himself: at one stage he opines that personalization offers "a return to a Ptolemaic universe in which the sun and everything else revolves around us." Bravo!

I love this: there are plenty of books written about technology which barely acknowledge writers and thinkers of the past. Why would they? All this new technology is so unprecedented, future-shocky and paradigm-shifty that how can anyone over the age of 30, let alone some dead old Russian guy, have anything to say about how people interact with machines?

Sometimes, though, Pariser overstretches, and even then there's at least a good story to be had. In a chapter called "The Adderall Society", we're given six pages on the strange story of Russian defector Yuri Nosenko and the mishandling of his case by the CIA, simply to make the point that people can get their view of the world distorted quite easily. It's interesting stuff, but ... what does this have to do with me moving my Yahoo News widgets around again?

The LEGO Turing Test

A place where personalized filtering would be a boon, and I would happily pay for it, is if, for example, YouTube could work out that the intelligence interfacing with it from a particular machine was a child. These days my kids wake up very early, charge downstairs and use my Motorola Xoom to search for Lego Star Wars videos. Love it. I'm happy they're using magic technology that I couldn't have dreamed of when I was their age to light up their pleasure cells like a Christmas tree. It makes them happy, and they're learning to use the tools their world is rapidly filling up with. But it'd be nice to think that YouTube will keep a protective, avuncular eye on the little tykes, and stop them from being served up some video nasties by accident.

At the risk of infantilizing adult aficionados of Lego Star Wars (although I would argue it's too late for that), it shouldn't be too hard to make a short-term inference that the person watching all these kids' videos is a kid, and to tailor the site's content accordingly: "The Babysitter Filter".
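Just to show how modest an inference this would be, here's a minimal sketch of what a "Babysitter Filter" might look like. Everything in it is hypothetical: the keyword list, the threshold, and the `age_restricted` flag are my own illustrative guesses, not anything YouTube actually does.

```python
# Hypothetical "Babysitter Filter": guess from a short run of recent
# searches whether the viewer is probably a child, and if so, drop
# anything flagged as age-restricted. Keywords and threshold are
# illustrative assumptions, not a real service's heuristic.

KID_KEYWORDS = {"lego", "star wars", "cartoon", "nursery rhyme"}

def looks_like_a_kid(recent_queries, threshold=0.6):
    """True if most of the recent queries match kid-oriented keywords."""
    if not recent_queries:
        return False
    hits = sum(
        1 for q in recent_queries
        if any(k in q.lower() for k in KID_KEYWORDS)
    )
    return hits / len(recent_queries) >= threshold

def filter_results(results, recent_queries):
    """Remove age-restricted results when the viewer looks like a kid."""
    if looks_like_a_kid(recent_queries):
        return [r for r in results if not r.get("age_restricted")]
    return results
```

A real system would of course use far richer signals than query text, but the point stands: a morning's worth of "lego star wars" searches is not a hard classification problem.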

So, what's the problem again?

Before I finish, it's worth going over again what exactly the problems are with personalization, in a nutshell. In the shallow end, "there's less room for the chance encounters that bring insight and learning." In the deep end, excessive personalization is no less than a threat to democracy itself, as people exclusively inhabit their own bubbles, rarely if ever exposed to differing opinions, counter-arguments or dissent, the sine qua non of an informed worldview.

If that all sounds too crude and obvious, Pariser predicts that, following the example of China, there will be a rise in second-order censorship: the manipulation of curation, context, and the flow of information and attention, all assisted by the filters we gratefully sign up to use. Direct censorship is sooo 1989.

I read all this, and I'm not sure what to think. By what law of the universe should Google, Amazon, Bing, or Yahoo search results show the exact same information to each and every one of us? Speaking to friends about it, I find they're often surprised to find out that results are personalized, but then we usually end up muddled about whether this is good or bad. There's definitely a whiff of a 'good old days' argument in this book, for instance when Pariser reminisces about the time "when Yahoo was king, and the online terrain felt like an unmapped continent", but at the same time he openly acknowledges that the web is no different to other media insofar as it's growing up fast. It's just that, as he says, it was supposed to be different. Was it really, though? And if it was, have we really lost it so soon?

You can always still go to Wikipedia's home page if you want a more-or-less randomised inventory of interesting historical and scientific facts and (what seems - but it would, wouldn't it - to be) neutral news stories. Same with Twitter. Just hang out on the front page and watch the world go by. And really, the people who are interested in the world will seek out a diversity of opinion, the same way they always have. Those that aren't won't. It's up to you to find out if your service of choice is likely to be preventing you from seeing the big wide world, same way it's up to you not to be too much of a dick online, and up to you not to get ripped off by phishers and scammers. There's no magic formula: it just helps to be skeptical when things seem too good to be true.