Dangerous Data?

We at HARBR believe in a world where data supports and enriches everyone’s lives. We’re calling for a data revolution: all data, one platform, effortlessly accessible.

But no revolution happens without setbacks.

There have been previous media storms about the loss or misuse of personal data. But the recent controversy around Facebook (FB) and Cambridge Analytica (CA) has cut particularly deeply into public consciousness. FB is ubiquitous in the modern world, so its potential links to political manipulation were a perfect trigger for media interest.

According to The New York Times and The Observer in the UK, CA gained the agreement of 270,000 FB users to share data from their FB profiles for ‘academic research.’ What the users did not realise was that this was in fact political research, and that it included not only their own data but also that of those connected to them on the site – increasing the harvest of FB profiles to over 70 million US-based members. The FB data was then used to create ‘psychographic profiles’ to better target ads at potential voters for Donald Trump. There have been subsequent allegations that CA used similar techniques to target potential ‘leave’ voters in the UK’s ‘Brexit’ referendum of 2016.

The public concerns around this controversy are multiple: from FB’s apparent lack of oversight, to CA’s alleged breaches of data privacy, to the broader issue of whether it is legitimate to use such data – even with the owners’ agreement – for ‘political’ marketing.

To us, the starting point for thinking about how societies handle data is to recognise its power.

Like any innovation in human history, ‘big’ data has the capacity to be used in dubious ways, but also for great good. Data can be used to tap into our fears, but it can also be used to better meet our needs and to improve our quality of life, our health and the wider environment. While the storm continues around FB and CA, data-driven projects such as Google’s collaboration with London’s Moorfields Eye Hospital quietly continue to make the world a better place.

Our values are the foundation of our approach to these questions. We believe that revolutionising data should not come at the expense of transparency or appropriate controls. Data owners need to have a clear and auditable line of sight on how their data is being used. And if a data owner does not like what is being done with their data, they need to be able to say ‘no’.

In a nutshell – ingenuity and integrity should go hand in hand. Because clever solutions that cut ethical corners are not really clever solutions.

Martin Yong