I’d like to revisit the topics of personalisation and etiquette in digital services. Over the last two days I’ve posted a couple of things about beacons, their potential and also their pitfalls, and I’ve had feedback across a range of channels that tells me further discussion is warranted about opting in, preferences, user control and all those lovely things that apply not only to beacons but to pretty much any experience that provides content in some kind of automated fashion. Here are a few very basic truths:
- A user opting into an app, or checking the ‘Agree’ box on a EULA, does not make it OK for the entity at the other end to bombard them with whatever kind of nonsense the business thinks is a good idea. Even if the EULA makes it legal, it’s terrible etiquette, and an excellent way to alienate users.
- Asking a user to fill in some kind of lengthy and involved questionnaire to assess tastes is a terrible experience.
- Pushing content of any kind to people and not giving them the chance to tell you whether they like it or whether it’s appropriate is just plain rude.
Let’s take an old-school, real-world experience to illustrate this:
About 14 years ago, I was living in Chicago and worked just down the road from Marshall Field’s, the lovely but sadly now-defunct department store. I used to shop quite a bit at a certain designer’s counter, and over time the woman who worked in that department, Jennie, got to know me. When new collections came in, Jennie would sometimes put aside a piece or two that she thought I’d like, and she was usually right. Eventually she asked for my number, so that she could call me when things came in that she thought I’d want to have a look at, and I gave it to her. She’d call every few months and describe a few items, and I’d either pop round to try them on, or thank her but politely decline. I grew quite fond of Jennie, and I’m absolutely certain I spent several thousand extra dollars at that counter because of her. It worked because she did it right – she treated me with respect, and used unimpeachable etiquette. Jennie was what most recommendation engines want to be: effective and appreciated. Here’s why, with the digital counterpart for each behaviour in brackets.
She didn’t:
- Call me every day, or even every time new stock came in – she called only when she was relatively sure I’d like something. [Irrelevant content; content dictated by the business rather than the customer]
- Come to my office and throw clothes at me while I was trying to work, or to my house while I was trying to read. [Intrusive advertising]
- Stand on the pavement and try to force coupons on everyone who passed by. [The whole last round of, and an alarming amount of the new round of, Bluetooth/beacon advertising]
- Try to make me buy more of what I’d just bought. [A huge proportion of current online advertising, which pushes you precisely the thing you’ve just bought]

She did:
- Always learn from feedback – both what I did and what I said. [Ongoing iterative feedback loops, empowering the customer/user to tune their profile]
- Use creativity in her recommendations, coming up with some things that I probably wouldn’t have chosen for myself but which ended up being some of my favourites. [Recommendations that broaden horizons, instead of collapsing them inward]
So how do we make recommendations and all their content cousins more like Jennie? As with most of the current crop of design challenges, this all comes down to respect, expectation and (relinquishing) control. Treating people with respect means understanding that they have ideas and opinions, and giving them the opportunity to voice those rather than trying to shout them down. The basis of any great experience or relationship, be it digital or interpersonal, is one of respect and clear expectations – saying what you mean, doing what you say, being mindful of the other and responsive to their feelings and needs. And when people have the means to participate and feed back, when they understand what to expect and what’s expected of them, they feel happier and more in control of their lives.

Looking at this list of human interactions, it seems obvious – so why do we continually make such messes of the same kinds of interactions in the digital world? Because it’s difficult, and costly, to design these things well, and too often we don’t see the human cost of designing them badly.
*The title is a reference to a story from 2002 about personalisation gone wrong in the TV world.