
Brainmatters | The Value of Data

02/12/2015

Google Street View, which records nice pictures of the street but also unprotected Wi-Fi traffic in the neighborhood. TomTom, which shares information with the police about stretches of road where many motorists tend to speed, so that the police can position their speed cameras more effectively. ING, which wants to analyze customers' payment data and sell those analyses to third parties, so that these commercial parties can target their marketing more precisely. The tax authorities, which use SMS parking services to trace tax fraud. These are examples from everyday practice in which personal data is used in a way that makes many people's hair stand on end. That is when the so-called 'creepy line' has been crossed.

Big Data is a potential goldmine: bits and bytes are the oil of the 21st century. By analyzing large data sets sensibly we can optimize healthcare, organize manufacturing processes far more efficiently, solve major logistical problems, and understand complex social issues. It is for good reason that value creation is high on the agenda of our Data Science Center in Eindhoven.

Still, if we want to cash in on that value, literally and figuratively, we will have to pay heed to a totally different set of values: those of a moral and ethical nature. For regardless of legal limits, the success of Big Data will depend strongly on the respect that businesses and authorities can muster for people's autonomy and personal privacy.

Ethical discussion around Big Data deserves a place in every boardroom

Granted, these moral and ethical limits are anything but clearly delineated, shift constantly as technology progresses, and are drawn in different places by different people. Yet that is no excuse to throw in the towel (in the vein of Scott McNealy of Sun Microsystems: "You have zero privacy anyway. Get over it.").

On the contrary, I think that an ethical discussion around the use of Big Data deserves to be on the agenda in every boardroom and every council chamber. Not once, but continuously. And not in order to avoid legal pitfalls, or to reduce potential reputational damage, but simply because it is morally right.

Wijnand IJsselsteijn | professor of Cognition and Affect in Human-Technology Interaction
