Foucault’s Robot [the digital creep factor, continued] [post 2/100]

One of the things I’ve been thinking about a lot over the past year is why certain interactions with algorithms feel so creepy – not the annoying stuff, like the “you’ve-just-bought-this-thing-so-why-don’t-you-buy-another-one” advertising algos that are all the rage at the moment, but the things that really freak people out.

Example (true story): a few years ago, after months of seeing various doctors for tests that came back inconclusive, a friend of mine was diagnosed with MS. He finally got a diagnosis not because of some breakthrough on his doctors’ part, but because he asked them whether it was possible he had it – and he asked because ads for MS treatments had started showing up in his Gmail interface. In other words, based on the correspondence that was keeping his family and friends in the loop, Google guessed what was wrong before the medical community did. The consensus on this story is that it’s seriously creepy. But why? I could argue that it was actually quite useful, since it facilitated a diagnosis when the lack of one had been genuinely stressful. Yet even as I type that, I’m thinking: still creepy.
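
As an aside, the mechanics behind something like this needn’t be exotic. Here’s a toy sketch of contextual ad matching in Python – emphatically not Google’s actual system, just an illustration of how simple keyword matching over correspondence could surface condition-specific ads with no medical ‘knowledge’ involved. Every keyword and threshold below is made up for the example:

    # A toy sketch of contextual ad matching. This is NOT how Google's real
    # system works -- just an illustration of how keyword matching over email
    # text could surface condition-specific ads with no medical "knowledge"
    # involved. Every keyword and threshold here is invented.

    AD_KEYWORDS = {
        "ms_treatment_ads": {"numbness", "fatigue", "mri", "lesions", "neurologist"},
    }

    def match_ads(email_text: str) -> list[str]:
        words = set(email_text.lower().split())
        # An ad category "fires" once enough of its keywords appear in the text.
        return [ad for ad, keys in AD_KEYWORDS.items() if len(words & keys) >= 3]

    email = "Saw the neurologist today; the MRI showed new lesions and the fatigue is worse."
    print(match_ads(email))  # -> ['ms_treatment_ads']: a pattern match, not a diagnosis

The unsettling part is that nothing in there understands anything; the ‘guess’ is just word statistics landing uncomfortably close to home.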

Part of it is simply discomfort with the idea that someone we don’t know could know so much about us. But there’s something deeper going on here too, and it’s about who – or what – the ‘someone’ is that’s communicating with you. When you read a book or a magazine, or look at a painting at the Tate Modern, or listen to BBC Radio, there’s a good chance that (at least for now) what you’re reading/seeing/hearing was made by another person. That helps us interpret what’s going on – no matter how weird that book or painting might be, we can engage with it, theorise about what it means, construct a meaning for ourselves. If a normally mild-mannered presenter says something really out of character, we can infer that she didn’t mean it that way (unless of course she did). I’d suggest that we feel comfortable doing this because we share a ground-level context with the author/artist/maker. Human beings have human motivations, so we can more or less figure out what’s going on.

But when a thing – a blog post, a news article, a Twitter bot, an advertising algorithm, a coffee maker that decides for itself when to brew – is made by a bit of code, those ground rules no longer apply. Unless you happen to be familiar enough with the algorithm to know exactly what it’s doing, you can’t understand the machine’s ‘motivations’. That makes interpreting meaning difficult, and I’d argue it’s what makes some of these interactions feel so creepy.
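
To make that concrete, here’s a deliberately crude sketch of how such a coffee maker might ‘decide’ anything at all – every signal, name and threshold is invented for illustration. From the outside the behaviour looks intentional; inside, there’s nothing to interpret beyond a couple of if-statements, and unless you’ve read the code, you can’t know that:

    # A deliberately crude sketch of a "smart" coffee maker's decision rule.
    # Every signal and threshold is invented for illustration; the point is
    # that behaviour which looks intentional from the outside can be nothing
    # more than a couple of opaque if-statements.

    def should_brew(alarm_fired: bool, bedroom_motion: bool, hour: int) -> bool:
        """Decide whether to start brewing, based on a few household signals."""
        if alarm_fired and 5 <= hour <= 10:
            return True    # the phone alarm went off at a plausible waking hour
        if bedroom_motion and 6 <= hour <= 11:
            return True    # movement in the bedroom, early-ish in the morning
        return False       # otherwise, stay quiet

    # Pretend it's 7am and the owner's phone alarm has just gone off.
    if should_brew(alarm_fired=True, bedroom_motion=False, hour=7):
        print("Brewing -- the machine 'decided' you were awake.")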

This all might sound like pseudo-intellectual* hand-wringing, but I think we need to bear it in mind as we design more and more objects, systems, networks and so forth that touch human life without an actual human touch – in some cases actively replacing it. Making sense of a world that’s increasingly algo-driven is going to take some getting used to, and we should be careful not to underestimate the emotional and cognitive impact of the shift: from thousands of years of always having a person at the other end (even if that person died 500 years ago, or lived halfway round the world), to a future where, for most of us, what’s at the other end is largely unknowable.

It’s one thing for the average person not to understand the mechanics of the stock market, or of autopilot technology. It’s quite another not to understand how your thermostat works, or your lights, or your (god help you) internet refrigerator. The closer technology gets to our bodies and homes, and the more deeply interwoven it becomes, the more opportunity it brings – both to improve our lives and to warp them. It’s our job as designers, strategists and technologists to do our best to make it the former and not the latter.

*Lengthy discourses on Deconstructionism, epistemology, and the works of Foucault and Barthes have been intentionally avoided. You are welcome.