Data privacy and the challenge of “risk empathy”

Protecting your customers’ data sits at the intersection of three organisational competencies that companies already struggle with. Together, these struggles amount to a lack of what we might call risk empathy.

Risk empathy is the ability to feel the risks that others face. It’s clear that this is a struggle for many organisations. British Airways (BA) may well be the first high-profile test case for enforcement of the EU’s General Data Protection Regulation (GDPR) but it won’t be the last.

While you might say GDPR only relates to citizens in the EU, the essence of BA’s record fine is a failure to protect personal information and payment data. GDPR certainly mounts the most aggressive regulatory response to these types of breaches, but similar protections are already in place in other jurisdictions – including Australia – and the trajectory of regulation is clear.

In any case, you shouldn’t think “this won’t happen to us”. The core organisational deficiencies that undermine risk empathy are familiar to many organisations. If you want to improve your ability to feel risk empathy, you have to solve the following organisational challenges:

  1. Organisations are traditionally not good at managing the risks relating to the conduct of others. Take, for example, the issues raised during the Banking Royal Commission around conduct risk and our ability to guarantee all parties are working in the interests of customers.
  2. Data management itself has also been a challenge for many organisations. Data governance discussions too often begin with the creeping realisation that “we thought that was somebody else’s problem…”
  3. Organisations are not good at managing risks they don’t own – or rather, risks that aren’t explicitly represented in the executive accountabilities that would ensure risk mitigations are properly designed and funded.

Data breaches are right at the intersection of these challenges.

In the case of British Airways, data was stolen. A third party had to commit a criminal act – but it’s BA that must pay the price.

This is different to the Cambridge Analytica case, where the problem was primarily that Facebook and its partners configured features under their control to share data beyond the consent obtained from users.

BA’s fine is GDPR’s way of pricing this type of risk. It is a call to consider the risk to the individual who is actually the subject of the data.

Managing “data subject” risk is different to managing other risks an organisation might face. If a risk impacts your organisation, you might choose to accept it, or cultural deficiencies might mean the risk is never even raised. There is also a personal risk involved – if the risk event occurs, somebody in the executive team will likely be held to account.

But when data subject risks are discussed you have to remember that the data subject – your customer – isn’t in the room. Somebody has to speak for them with a voice that is strong enough to get your organisation to act differently.