One of the top-level recommendations of the FTC privacy report was greater transparency about the data practices of companies and technologies. The report pointed to mobile apps as especially needing better transparency. Indeed, a previous FTC staff report on mobile apps for kids found that hardly any of the apps that were studied offered full privacy disclosures.
But as a computer scientist, I tend to look beyond written privacy policies, seeking disclosure opportunities elsewhere, in the design of the technology itself. Rather than treating disclosure as something that sits alongside the technology–a link you can click–I look for ways to make technologies whose privacy attributes are clear to the user. The goal is to make transparency an integral part of the user experience.
One of my favorite examples is the little red light next to my laptop’s built-in camera. The light comes on whenever the camera is active and capturing video. Users understand this naturally, without having to read a manual, and they can tell at any moment whether they are being observed. Could companies have gotten the same level of real-world transparency by putting statements into the privacy policies of programs on my laptop? I doubt it.
Of course, the red light can’t be the only control on use of the laptop camera. The light tells me that the camera is active, but it doesn’t tell me which applications have access to the video stream. The developers had more work to do to make sure that programs didn’t get camera access when they shouldn’t have it, and that programs couldn’t eavesdrop on the video stream in ways that would surprise the user. Still, the red light lends valuable transparency, without interfering with the user’s flow of activity.
One of the reasons the red light works well is that it fits with user expectations about the camera. The design of the laptop makes the camera’s presence and location obvious–which makes sense, because the user wants to look at the camera. The status of the camera is most sensitive when the user is in front of the camera and can see the light. And a red “on air” light is a standard feature on video cameras, so the user will know how to interpret the light when it comes on.
These same factors make the red light a poor fit for other areas where we want transparency. We can’t secure access to my mobile phone’s address book by putting a red light next to the address book (whatever that means). If we want the same kind of natural transparency elsewhere, we can’t just follow a cookie-cutter approach–we have to think about the user’s expectations in each instance. This is what great user experience (UX) designers do already. We just need to apply it to privacy.
There’s one more thing that great UX designers know: you can’t just build a technology and then bolt on a great UX at the end. You have to have UX in mind through the entire design process, looking at your design through the user’s eyes. What will the user expect this feature to do? How can we make this option clear to the user? Why does that function exist at all? A great UX isn’t just a skin, it’s a way of aligning the user’s expectations and the underlying technology with each other.
The same is true for privacy. Explaining things to the user will only get you so far. To align your practices seamlessly with your users’ expectations, you will need to shape both your user interface and your underlying practices. This doesn’t mean that you need to refrain entirely from collecting and using users’ information–but it does mean that you shouldn’t collect and use data in ways that surprise your users. If you can do this successfully–if users find that your products don’t give them unpleasant privacy surprises–then you can build the kind of trust that wins loyal customers.
[Note: Looking for a joke or Easter Egg in this post? There's not one in the text.]