Do you trust the companies you share your data with? Do you even know which companies are accessing your data? Do you know how it’s being used? Do you have visibility of the impact these activities have on your life and the lives of others?
Based on my experience, the answer to most of these questions, although somewhat nuanced, is no.
In this article I’ll articulate a succinct view of the current state of data sharing and trust. I’ll also share some insights from >X’s work on the front line helping organisations operationalise strong data ethics frameworks, bring proactive data protection practices to life, and embed Data Trust by Design at the forefront of the customer and brand experience. Sharing openly will be an ongoing trend for us throughout 2019.
You should read this if you’re grappling with the nuance and complexity of data sharing and trust. It’ll help you start building your business case. It’ll give you ideas to help get practical work started.
Let’s get started.
Trust is at an all-time low
A variety of measures, including Edelman’s Trust Barometer, show trust at an all-time low globally.
Facebook’s deceitful design patterns, the behaviours of Australian financial service providers exposed by the Hayne Royal Commission, and scandals such as Google admitting it had tracked people’s locations even when Location History was switched off all give people validated reasons to distrust.
With all of this happening, individuals, organisations and regulators are only now beginning to grapple with the real world consequences of how data is being used.
This has forced industry to start asking hard questions about the ethical use of data. What is right and what is wrong?
We’re amidst an evolving regulatory landscape
Data protection regulation globally is strengthening. The EU is leading the charge, while California, Brazil, Japan and a number of other jurisdictions are evolving their approaches.
Yet regulation is just one piece of the puzzle. It isn’t a ‘solution’ per se. It’s my view that organisations need to take a stance. Those that do have an opportunity to lead from the front.
People’s attitudes and behaviours are beginning to shift
“The findings, instead, support a new explanation: a majority of Americans are resigned to giving up their data — and that is why many appear to be engaging in tradeoffs. Resignation occurs when a person believes an undesirable outcome is inevitable and feels powerless to stop it. Rather than feeling able to make choices, Americans believe it is futile to manage what companies can learn about them.” The Tradeoff Fallacy, 2015
This has long been a common theory. In fact, the number of times this rationale has been cited during meetings I’ve attended is hard to comprehend. People have cognitive dissonance; they say they value privacy and security, but share their data at the drop of a hat for free WiFi.
This theory is now being challenged. People’s behaviours are getting closer to their attitudes. Globally, people are beginning to take action to protect their privacy. They’re installing adblockers. They’re using VPNs. They’re refusing to ‘consent’, which, as you may know, is one of the most poorly understood concepts in data protection today. You can read more about our perspective here.
Millions use Microsoft’s GDPR privacy tools to control their data
Transparency seems to be valued more than ever
“95% of people surveyed wanted companies to give options to opt out of certain types of information collected about them, how it can be used and/or what can be shared with others” Consumer Policy Research Centre July, 2018
The view that “I should be in control of how my information is used” is becoming far more popular. People expect to know what they’re signing up to. Attitudinal surveys, like IBM’s Trust in the Cognitive Era, have shown this for years.
The tick and forget model we’ve all participated in — the model partly responsible for this state of widespread distrust — is being directly challenged.
In fact, you can read more about this dynamic in our Designing for Trust Playbook.
A clear(er) relationship between trust and data sharing is emerging
This understanding, that people value transparency and are now willing to take action, is beginning to drive evolved organisational behaviour. AXA’s ‘Give Data Back’ initiative is a good example of this. The Icelandic Government’s initiative is another.
However, the current data sharing model lacks clarity. Value is highly subjective in most cases. As a result, who defines what’s ‘appropriate’?
It’s really hard to assess the value of sharing (as an individual)
Most organisations process data for a reason. They want to deliver value to their customers. Even if they’re doing a good job of this behind the scenes, the entire market communicates this poorly.
As referenced above, much of the difficulty resides in people’s inability to assess the value exchange. This comes down to:
A lack of understanding (what am I granting access to? What is the consequence? What are my rights?)
Inappropriate agreements given the form factor and context of use (pre-ticked boxes and convoluted agreements that read at a postgraduate level, even though people are engaging in the transaction on their way to work), and
Value proposals that lack purpose and specificity (What is a free service? What am I really signing up to? What is the pathway and time to value i.e. the outcome I want to achieve?)
For organisations, making progress isn’t simple.
The reality is, there’s a gap in the market…
Many of the publicly accessible data sources that reference the relationship between trust and data sharing are qualitative and focused on the attitudinal research dimension.
Our empirical assessment, after years focused on this emerging market sector, is that most organisations are still in the thinking and experimenting phase. This makes it very difficult to place appropriate confidence in such data sources.
Data across our client projects over the last 12 months, however, has showcased significant promise. Although these are small sample sizes, they provide insight into the types of behaviours we might be able to expect when new data sharing patterns emerge.
Comprehension of agreements increases by 60%
Time to comprehension drops by a factor of 10, and
People are up to 8x more willing to share data
But the real question is…
Where do we go next?
We’re observing an increased propensity within organisations to start taking real action. This is because all organisations we speak to want to be the most trusted and valued organisation they can be. They want deeper, more meaningful relationships with their customers. They want access to data only the customer can give.
Data Ethics initiatives, stronger data protection programmes and an intense focus on trustworthy customer experience design are part of making this goal a reality. Yet the biggest challenge organisations face relates to attitudes and behaviours internally. It’s all about the people.
To overcome this very real barrier to progress, I’d suggest the following:
Select something to focus on. Make it tight. Ensure there are real customer and business drivers. Onboarding processes are a great place to start because that’s where terrible agreements that no one reads or understands are first executed. It’s also where customers often start actively sharing their data
Frame some hypotheses and conduct an experiment. You can use the tools and approaches in our playbook to help. Use your existing process as the control and document the qualitative and quantitative impact the new approach has
Socialise the data. Share the approach. Get people bought into what you’re trying to achieve, and then
Build a plan to make this real.
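The experiment in step two can be sketched in code. Below is a minimal Python example of comparing a control onboarding flow against a redesigned one; every metric name and number here is a hypothetical illustration (not >X client data), and real experiments would of course need proper sample sizes and significance testing.

```python
# Sketch: quantify the relative impact of a redesigned onboarding flow
# against the existing flow (the control). All figures are hypothetical.

def summarise_experiment(control, variant):
    """Return the ratio of variant to control for each shared metric.

    A ratio > 1 means the variant increased the metric; < 1 means it
    decreased it (e.g. time to comprehension getting faster).
    """
    return {metric: round(variant[metric] / control[metric], 2)
            for metric in control}

# Hypothetical measurements from an onboarding experiment:
control = {
    "comprehension_score": 0.40,   # fraction of agreement-quiz answers correct
    "seconds_to_comprehend": 300,  # time to correctly summarise the agreement
    "share_rate": 0.05,            # fraction of users opting in to data sharing
}
variant = {
    "comprehension_score": 0.64,
    "seconds_to_comprehend": 30,
    "share_rate": 0.40,
}

print(summarise_experiment(control, variant))
# → {'comprehension_score': 1.6, 'seconds_to_comprehend': 0.1, 'share_rate': 8.0}
```

Documenting the experiment this way, alongside the qualitative findings, gives you the kind of evidence that makes step three (socialising the data) far easier.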
Earning Trust by Design isn’t something that can happen transactionally. It’s not something that can be achieved overnight. It needs to be a cross-organisational effort. It needs to be backed by an ethical decision-making framework that is documented, contestable and easily accessible. People need the right incentives. They also need the right tools and approaches at their disposal.
At >X we tackle these challenges head on daily. It doesn’t mean we have all the answers. But it does mean we’ve made plenty of the moves you’re likely to make. We’ve already learned and optimised our approach.