Collision course
Big data opens up a world of possibilities for organisations wishing to exploit the vast array of information available to them. Many commentators believe that exploitation of big data will add hundreds of billions of dollars to the economy.
Big data is the neat label for a set of technologies which are coming of age. New analytic tools allow rapid appraisal of real-time transactions and facilitate deeper analysis of multiple databases. For example, big data might build a customer profile based on details of their web browsing habits, prior orders, phone calls, location, social media, electoral roll information and other private and public details.
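As an illustration only, profile building of this kind amounts to joining records about one person from several sources. The source and field names below are hypothetical, not drawn from any real system:

```python
# Sketch: building a single customer profile by merging records from
# several hypothetical data sources, remembering which source
# contributed each field. All names are illustrative.

def build_profile(customer_id, sources):
    """Merge all records for one customer into a single profile dict."""
    profile = {"customer_id": customer_id}
    provenance = {}
    for source_name, records in sources.items():
        record = records.get(customer_id, {})
        for field, value in record.items():
            profile[field] = value
            provenance[field] = source_name  # where the data point came from
    profile["_provenance"] = provenance
    return profile

sources = {
    "web_analytics":  {42: {"last_pages_viewed": ["/offers", "/checkout"]}},
    "orders":         {42: {"prior_orders": 7}},
    "electoral_roll": {42: {"registered_address": "1 High St"}},
}
profile = build_profile(42, sources)
```

Keeping the per-field provenance alongside the merged profile is what later makes it possible to ask the regulatory question this article turns on: for what purpose was each data point originally collected?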
The financial gains come from better analytical information supporting a wide range of businesses. For example, analysts believe big data could radically reduce the time to bring drugs to market by focussing on patient data; reduce wastage in government; or decrease the instances of fraudulent transactions.
This raises some interesting issues, particularly because the purposes for which information might have been given originally might be radically different to those for which it is deployed by analytics software. Indeed, big data makes possible analysis regardless of the data source, geography or purpose for which it is gathered. It is this repurposing that is at the heart of the difference between the big data world view and that of the regulators.
Judging big data risk
It is surprising how many non-lawyers assume that information available to them can be used wholly without constraint. People often believe they can freely use data unless access is controlled or any conditions of access are made clear. Yet almost all data has some degree of legal constraint attached to it, such as copyright and database rights, data privacy, regulatory controls in sectors such as financial services, data integrity requirements, particularly in the US under Sarbanes-Oxley, and so forth. Big data may encounter regulatory barriers because data analytic tools can reach into data and use it in new and unexpected ways. In my own law firm, our knowledge search engines crawl every database we have, pulling up presentations, legal precedents, emails and other information. We have had to suppress some results, so that search returns omit data which is highly confidential or reveals sensitive employee information.
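A minimal sketch of that kind of result suppression, assuming each result carries classification labels (the labels and filter below are illustrative, not our actual system):

```python
# Sketch: filtering search results so that documents classified as
# highly confidential, or as containing sensitive employee data,
# never reach the results page. Label names are illustrative.

SUPPRESSED_LABELS = {"highly_confidential", "sensitive_employee_data"}

def visible_results(results):
    """Return only results whose classification permits display."""
    return [
        r for r in results
        if not (set(r.get("labels", ())) & SUPPRESSED_LABELS)
    ]

hits = [
    {"title": "Client pitch deck",  "labels": ["public"]},
    {"title": "HR grievance file",  "labels": ["sensitive_employee_data"]},
]
shown = visible_results(hits)  # only the pitch deck survives the filter
```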
The unfortunate thing is that most people in business routinely misjudge the risk of data misuse. I know because my colleagues equally routinely help businesses tackle breaches of data law. This misjudgement comes because businesses have been subjected to a gradual but persistent tightening of regulation and they do not realise data risk is increasing.
In 1984, when the UK introduced its first Data Protection Act, the legislation was novel and the regulator was fairly weak. There were no real investigative powers. Transgressions would result in a stern letter or, in the worst case, a fine of up to £2,000. The late 1990s and early 2000s saw a sea change in data law with the European Data Protection Directive, Basel II, MiFID and Sarbanes-Oxley. The law now expected organisations to implement systems and processes to protect data. During the last five years or so, the regulatory approach has become tougher still. In the UK, for example, where the government loss of child benefits data was headline news and banks routinely dumped customer information in branch car park bins, we have seen stringent enforcement action. Banks and building societies have been fined in the millions of pounds by the FSA and have had to give public undertakings on their future conduct to the Information Commissioner's Office. Since 2011, the ICO has had powers to fine up to £500,000 and has issued its first six-figure fines in recent months. The most punitive laws are yet to come. The draft Data Protection Regulation from Brussels (a law which will apply across Europe) proposes fines of 2% of global turnover for the worst data privacy transgressions.
Thus, most organisations have a culture of data sharing which has come to be at odds with tough modern regulatory requirements for security, data integrity and access controls. Increasingly the regulators must be feared.
The broader perspective
The risk of abusing data is not just a matter of incurring the regulators' wrath. Employees have a right to expect a degree of trust from their bosses. To what extent could and should employers be deploying detailed and intrusive monitoring and analytics on their staff? This kind of activity does have a legal impact under data privacy laws (and the UK regulator has published detailed guidance on the subject). But it is also an issue of employee trust in which unions and employee representative bodies will take a strong interest, and one where employees may lose confidence in their organisation if the issue is not handled sensitively.
Customer trust is also extremely important, and again the law places parameters around how organisations may build up profiles of their customers, particularly the information which must be given to individuals about how their data will be used and when consents are required. But paramount for businesses is the sensitivity consumers, particularly the over-30s, show about how the details they give are used. Breaches of the law or unexpectedly intrusive marketing techniques can generate highly negative PR and damage brand and reputation.
What organisations should do
The risks arising from big data deployments are manageable, and indeed many customer or employee analytics projects go through regulatory impact assessments. The methodology for impact assessments is particularly mature in the data protection environment where CRM, employee monitoring and other activities need assessing, often on a global scale. Recommendations often include the giving of information to data donors/owners, consents, data security, access restrictions and formal policies and procedures.
Key issues including personal data usage, confidentiality and IPR ownership are important. But organisations dealing with financial data, government protectively marked documents or operating in a regulated field will have particular concerns too. International businesses must deal with more complexity since the law is by no means uniform across the world.
Without a proper assessment and risk management, intended big data usage can be restricted or made impossible. This can be a particular issue when data usage is repurposed, and indeed there are well-known historic examples where companies got this wrong and were required to stop their CRM activity, such as British Gas using its statutory customer data to sell other services and Boo.com's sale of its customer database. Well-advised organisations are more cautious about repurposing nowadays.
Key to any risk assessment will be who gathered the data, what information was given to the data donor and what permission the data donor gave. The countries in which the gatherer, donor and database are located also matter.
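Those questions can be captured as a simple provenance record attached to each dataset. The field names and example values below are illustrative, not a prescribed standard:

```python
from dataclasses import dataclass

# Sketch: recording the provenance facts a risk assessment turns on.
# All field names and values are illustrative.

@dataclass
class DataProvenance:
    gatherer: str          # who collected the data
    notice_given: str      # information provided to the data donor
    permission: str        # e.g. "marketing_consent", "contract_only"
    gatherer_country: str  # jurisdictions matter: the law is not
    donor_country: str     # uniform across the world
    database_country: str

record = DataProvenance(
    gatherer="retail_subsidiary",
    notice_given="privacy_policy_v3",
    permission="marketing_consent",
    gatherer_country="UK",
    donor_country="UK",
    database_country="IE",
)
```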
Personal data is the most obvious area for concern in the EU. It is simple to design privacy policies which allow for broad usage of personal data; these will be based on the provision of information and, sometimes, on consent. However, plain language will need to be used, and weak forms of consent, such as requiring employees to agree to opt-outs, can be insufficient for big data purposes. Historic data may not be capable of complete mining, and suitable data flagging will be needed to signpost the level of permission attaching to particular data. Using cookies to garner data is now subject to very bureaucratic EU laws which have yet to mature in clarity.
Metatagging and data encryption may also be necessary to isolate data which may only be used for selected data mining. Privacy, copyright and confidential information policies may be needed to ensure gathered data has the right permissions associated with it and is not accidentally re-utilised in breach of third party rights.
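In code terms, the flagging and metatagging idea reduces to checking a permission tag before any record enters a mining job. The tag names and purposes below are assumptions for illustration only:

```python
# Sketch: gating data mining on per-record permission metatags.
# Tag names and purposes are illustrative.

PERMITTED_PURPOSES = {
    "analytics_opt_in": {"profiling", "trend_analysis"},
    "service_only":     set(),   # no secondary mining allowed
}

def mineable(records, purpose):
    """Yield only records whose permission tag covers this purpose."""
    for rec in records:
        allowed = PERMITTED_PURPOSES.get(rec.get("permission_tag"), set())
        if purpose in allowed:
            yield rec

data = [
    {"id": 1, "permission_tag": "analytics_opt_in"},
    {"id": 2, "permission_tag": "service_only"},
]
usable = list(mineable(data, "profiling"))  # only record 1 qualifies
```

The point of the default empty set is that data with no recognised tag is excluded from mining, which mirrors the cautious approach to repurposing the article recommends.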
Getting it right first time
Big data is not a step ahead of the regulators. There are mature laws across the globe designed to ensure security, integrity, privacy and protection of data. However, big data presents new problems since information might be accessed and analysed across the organisation without regard to the origins of the data and constraints on use.
Neither technologists nor lawyers can solve these issues in isolation. But they don’t need to. They just need to work together. There are already audit and assessment tools available. Methodologies appropriate for data security, CRM, bring your own device, FSA compliance and other areas of technology regulatory compliance are helpful in the context of big data. The best compliance programmes can support big data solutions based on an understanding of the technologies, the legal regime and business objectives.