Five Privacy Considerations to Build Trust in Your Product
Sometimes life has a funny way of coming full circle. Back in 2016, I published a blog post after seeing a play called “Privacy.” I enjoyed the play, but it also expanded my mental horizons on privacy to the point that I had to sit down and write that post.
The play wove in a great deal of real-time audience interaction focused on each audience member’s personal data. The clever conceit was that, at times, the play stopped to show how much supposedly private information could be dug up on individual audience members. They pulled up photos of us buying our tickets and images of our homes.
For me, the experience brought home how easily companies can access private information and breach the trust of users who hand over data without expecting it to be used in other contexts.
I was thrilled to have Professor Ari Ezra Waldman – an internationally recognized thought leader on privacy and online safety and author of “Privacy as Trust” – as a guest on the Georgian Impact podcast because he had been a consultant on this play. I knew he’d be able to give us some excellent tips on how to think about privacy as we design technology products.
Here are five highlights from the discussion to consider as you build products with privacy and trust in mind:
1. Privacy and Intimacy
Modern technology is changing the notion of privacy. In an intimate relationship, such as with a friend, doctor, or lawyer, we don’t need NDAs or Terms & Conditions. We may share the most sensitive of information, but we do so in an environment of intimacy and trust.
This trust is built upon expectations: there are personal-space expectations of ‘do not invade’, for example. These expectations can be thought of as ‘non-interference’. NDAs and similar instruments try to reflect these social expectations in written form.
The same privacy requirements can also be applied to technology. At the moment, tech companies take our data to serve their own interests. Ari argued that the law should treat big tech companies the same way it treats other fiduciaries, like doctors.
The same expectations of data protection and respect we have with doctors and lawyers could be applied to technology vendors.
2. Contextual Privacy
Context is an important aspect of privacy. Ari makes the case in his book that privacy should be an expectation if we share information within a particular context. For example, if we attend an Alcoholics Anonymous (AA) meeting and share intimate details, we expect privacy and there are rules on non-disclosure outside of the group context.
However, privacy is not secrecy. There is a significant difference between posting a public image and disclosing your HIV status to ten of your closest friends. Ultimately, privacy maps to the expectations of trust within a given context.
Technology makes this more complicated. What if you post something on Facebook but only make it accessible to 10 friends? Is this the same as intimate sharing offline? It may well be, but technology complicates the landscape. The disclosure can feed into Facebook’s algorithm, which may then surface the information, or related information, to others. In a real-world case, an LGBTQ person came out to a small group of friends in a closed group. The disclosure influenced the Facebook algorithm behind her father’s feed, effectively notifying him of her status.
Privacy cases can be lost on the mistaken idea that disclosing to one discloses to all. Think carefully about the privacy expectations attached to the data you collect, as in the sketch below.
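To make that concrete, here is a minimal sketch, with entirely hypothetical names (`Disclosure`, `recommend`, and the example users are mine, not from the podcast), of one way a product could attach an audience scope to each disclosure and force downstream features, like a recommender, to respect it:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Disclosure:
    """A piece of user content tagged with the audience it was shared with."""
    author_id: str
    content: str
    audience: frozenset  # user IDs the author explicitly shared with

def visible_to(disclosure: Disclosure, viewer_id: str) -> bool:
    """Contextual check: content shared with a closed group must never
    surface, directly or via recommendations, outside that group."""
    return viewer_id == disclosure.author_id or viewer_id in disclosure.audience

def recommend(candidates: list[Disclosure], viewer_id: str) -> list[Disclosure]:
    # The recommender filters on the original audience scope, so a
    # closed-group post cannot leak into a relative's feed.
    return [d for d in candidates if visible_to(d, viewer_id)]

# Example: a post shared with three close friends stays within that group.
post = Disclosure("alice", "personal news", frozenset({"f1", "f2", "f3"}))
assert recommend([post], "f1") == [post]      # in the audience: visible
assert recommend([post], "alices_dad") == []  # outside it: never recommended
```

The design choice here is that the audience scope travels with the data itself, so every feature that touches the content inherits the original expectation of trust rather than re-deciding it.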
3. Diversity by Design
How does all of this influence app design, and what’s the best way to include privacy in the design process?
Ari told us that any company that collects personal data must connect safety, function, and app design to maintain the trust of its users. The design should reflect the privacy policies and be built to Privacy by Design (PbD) principles.
This can only be achieved by structuring design teams that reflect the diversity of our societies. Design currently happens at the engineering level, where team members are predominantly white, male engineers. Privacy issues then trickle through this filter and reach a privacy professional only at a late stage.
Lawyers, sociologists, anthropologists, and anyone who understands human behavior and Human-Computer Interaction (HCI) should be in the design meeting from the outset.
Even if this slows down the design process, it will ultimately be worth it, because building this way from day one avoids issues later on. Design patterns can manipulate users, and social psychology has been used adversely in technology design. We share data for a reward without understanding the risks associated with that choice; just look at what’s happening with FaceApp.
Now is the time to bring people from diverse backgrounds, like sociologists and lawyers, into the design room.
4. The Vendor as Data Fiduciary
Traditionally, a fiduciary relationship is based on a ‘special relationship.’ We hand over control to a doctor or lawyer, and within this relationship we disclose information to allow treatment, advice, and so on. This makes us vulnerable within that relationship.
These elements are true of other data collectors, too, including technology vendors. The vendor is the expert: Google, for example, is the search expert, matching results to input data. Individual consumers are placed in a vulnerable position when they use these platforms. The platform holds the power; it knows a lot about us, and we know little about it.
Companies that collect our data should do no harm. To redress this imbalance, technology companies should act with a duty of care, loyalty, and confidentiality.
5. The Perfect T&Cs and the Problem of “Over-choice”
When it comes down to it, we shouldn’t rely on long legalese documents for privacy policies and terms of service. We should operate under general rules that reflect general actions. Reading these T&Cs is a burden on the user, as is toggling privacy preferences across multiple mobile apps. This is the problem of over-choice.
The law needs to change so that this burden is placed on companies, not individuals. To do this, we need to build in better defaults, build in data minimization, and place limits on third-party use of data.
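As one illustration of those last two points, here is a hypothetical sketch (the schema and field names are mine, not something Ari prescribed) of privacy-protective defaults and data minimization enforced in code rather than left to user toggles:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Protective values by default: the user opts *in* to sharing,
    # rather than hunting through toggles to opt out.
    share_with_third_parties: bool = False
    personalized_ads: bool = False
    retain_location_history: bool = False

# Data minimization: collect only the fields the feature actually needs.
REQUIRED_FIELDS = {"user_id", "email"}  # e.g., just enough for account recovery

def minimize(raw_profile: dict) -> dict:
    """Drop everything the signup flow did not strictly require."""
    return {k: v for k, v in raw_profile.items() if k in REQUIRED_FIELDS}

profile = minimize({
    "user_id": "u123",
    "email": "a@example.com",
    "birthday": "1990-01-01",  # not needed: discarded, never stored
    "contacts": [],            # not needed: discarded, never stored
})
assert set(profile) == {"user_id", "email"}
```

The point of the sketch is that defaults and minimization shift the burden onto the company at the code level: nothing extra is collected or shared unless the product makes a deliberate, user-approved choice.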
Listen to the full podcast episode to find out more, including:
- More on how trust, intimacy, and privacy interlock
- Where context comes into the privacy equation
- What harm social media companies are doing to us as individuals
- Who should be involved in creating the best technology design team
- How “Queer Dating Apps Are Unsafe by Design”
Who is Ari Ezra Waldman?
Professor Ari Ezra Waldman is an internationally recognized thought leader on privacy and online safety. In March 2018, he published the book “Privacy as Trust: Information Privacy for an Information Age,” and, more recently, he wrote an op-ed as part of The New York Times Privacy Project. Ari is a Professor of Law and the Director of the Innovation Center for Law and Technology at New York Law School (NYLS). He also holds a Ph.D. in sociology. His research focuses on the influence of law on technology.