As your digital footprint grows, do corporate data protection attitudes need to change?

Europe is big on data privacy. European regulators are trying to keep on top of data legislation, but as we move towards, or rather deeper into, the ‘internet of things’, the data held on us is going to be increasingly personal, and, opt in or not, we’re not going to like it.

It’s personal. It won’t be my traditional ‘marketing profile’ that companies access – facts or assumptions about my age, income, family status, interests, hobbies and readership. They will access browsing habits, social connections and smartphone location trails – they can do this now – and then driving habits, films watched, rooms in the house used, blood pressure, exercise regime and genomic information. Information that can be captured, stored and accessed – my ‘digital footprint’, as some call it. The big challenge is not any one company’s data privacy code, but the code that covers the aggregation of data sourced from a myriad of companies into one complex web of data – your digital footprint. This ever-growing record of me and my habits can be gathered by different companies into an intensely private portfolio which I almost certainly don’t want everyone to see.

The personal nature of this data will drive increasingly personal interventions from companies trying to get their message across, which will provoke a more substantive public reaction than we have seen so far. Imagine spending a few hours looking online for information on tobacco brands, prices and availability at home and abroad. You could be looking for a gift for a friend, researching a report for a school project or just browsing for fun. But to a data miner tracking your click stream, this digital trail could be read as a telltale signal of an unhealthy habit – a data-based prediction that could make its way to a health insurer, customs official or potential employer. This might seem far-fetched, but without regulation and control it will happen.

Corporate investment in ‘big data’ is already huge. Companies are investing massive amounts in strategies that rely on access to deep, broad digital footprints. Google bought Nest (in-home and in-office technologies). Facebook acquired WhatsApp. Wearable health technologies are taking off, as their domination of the 2015 CES exhibition in Las Vegas showed. Nissan, BMW and others have invested in socially enabled smart cars that provide immediate and comparative data to drivers and collect data from them in return. Companies see financial value in developing products, services, pricing and promotions through a deep understanding of their consumers, built on our digital footprints.

Why don’t we opt out? We have a choice. We can decline to opt in. Or can we? Companies are clever at bundling ‘opt in’ choices with other terms and conditions, so that consumers opt in without really wanting to. They trade data privacy for ‘more convenience’. Disney, for example, provides an RFID bracelet on entry to its leisure parks which it can use to track park behaviour. The benefit for consumers is that they can jump the queues. Disney in turn collects information about your park journey, experiences, visits and potentially purchases, so that it can ‘optimise’ the experience next time. Consumers sign the terms and conditions for Google, and Google uses their search criteria and browsing behaviour. We get annoyed occasionally but are generally pretty apathetic – look at the way people react to changes in Facebook’s privacy settings when they are announced, yet carry on using the service anyway. It’s all rather deceptive.

At the moment we put up with it, but for how long? Each of us draws the line in a different place, but for most there will be a line.

So is regulation the answer? Dr Alex Pentland – a social scientist, director of the Human Dynamics Lab at MIT and an academic adviser to the World Economic Forum’s initiatives on big and personal data – believes that regulations on data collection make sense, as long as they are flexible and not a “sledgehammer that risks damaging the public good.” He talks about what he calls “a new deal on data”, with three basic components: you, the consumer, have the right to possess your data, to control how it is used, and to destroy or distribute it as you see fit. The aim is to put consumers, not corporates, in control of their data portfolio.

And it seems to work in the real world. The ‘new deal on data’ is being tested in safe harbours such as Trento, Italy. When consumers in Trento who have chosen to opt in interact with the government or participating companies such as Telecom Italia, they are told what data is being collected and what it is being used for, and they have the chance to opt out. They get notification of, and control over, the data generated about them, and it is shared securely in an auditable way. We haven’t seen direct results from the project, but Dr Pentland says that people are sharing more (not less) personal data, because they feel more in control and know what’s going on. They are also more engaged with communications and services derived from the data they have deliberately shared, giving them the perception of ‘just for me’ campaigns.

Maybe these results give us a glimpse of the future. Every long-term relationship in the real world is founded on trust. Companies need to abandon the trick-and-trap of complex terms and conditions, where we all have to click “I Agree”. If consumer data strategies become totally transparent, then consumers may share more with you, not less, and they may be more interested in your communications as a result. The lesson here is to build data trust and never abuse it – this requires a change in corporate attitude towards what is fair and useful, and away from ‘what am I allowed to get away with’. I wonder if this is just a pipedream?

Neil Woodcock is Chairman and CEO of The Customer Framework and an expert on Smart Data. Contact him at neil.woodcock@thecustomerframework.com or follow him on Twitter at @ndwoodcock.