Big data is a UK success story: the Science and Technology Committee has published a report noting that 58,000 jobs could be created and £216bn contributed to the UK economy (2.3% of GDP) over a five-year period.
Report: The big data dilemma
But the Committee warn that existing data is nowhere near fully exploited – companies are analysing just 12% of their data, and if 'data-phobe' businesses made good use of their data they could increase UK productivity by 3%. The Government can also do more to make its databases 'open', sharing them with businesses and across Government departments to improve existing public services and develop new ones.
A big data revolution will need action, the Committee warn, on digital skills and infrastructure, and also on people being able to give their informed consent for how their personal data is used.
While personal data is only a small proportion of big data, the Committee warn that, given the scale and pace of data gathering and sharing, distrust arising from concerns about privacy and security is often well-founded, and must be resolved by industry and Government if the full value of big data is to be realised.
Nicola Blackwood MP, Chair of the Committee, said:
"We are living in the data age. 'Big Data' is driving a revolution in the speed and extent of the data applications that are shaping all aspects of our economy and our day-to-day lives. The use of 'big data' is already bringing big benefits. Exploited further, big data will be transformative, unlocking new life-saving research and creating unimagined opportunities for innovation. The Government has a role in this, in sharing and opening up its own data.
"But big data is also raising legitimate concerns about privacy and the way personal data is being used, and sometimes re-used in ways which re-identify previously anonymised data. There is often well-founded distrust about this and about privacy, which must be resolved by industry and Government.
"A 'Council of Data Ethics' should be created to address these consent and trust issues head on. And the Government must signal that it is serious about protecting people's privacy by making the identifying of individuals by de-anonymising data a criminal offence."
The Committee warn that the digital skills gap is approaching crisis levels, and that this not only has economic implications but also puts the quality and security of data at risk. 'Big data' skills are not being strategically addressed.
The Government should commit to a continuing substantial role in developing 'data analytics' skills in businesses; increasing big data skills training for staff in Government departments; and promoting more extensively the application of big data at local government level. And the Government must address the wider context of its policies on apprenticeships and immigration control which affect the availability of people with big data skills.
Government should make more datasets available
The Committee warn that there is more to do to break down Government departmental data silos, to bring data together to further improve public services, and to improve data quality. The Government should make more datasets available, both to decision-makers in Government and to external users, and establish a framework for auditing the quality of data within Government departments and for identifying data-sharing opportunities across them.
The failure of the 'care.data' initiative for sharing patients' health data shows that patients' consent cannot be taken for granted. The Government cannot afford a second failure from a re-launched scheme, the Committee warn. The Government should take careful account of the lessons from a similar, successful scheme in Scotland, and, to help bring patients onside and to streamline healthcare across different NHS providers (hospitals, GPs, pharmacists and paramedics), it should give patients easy online access to their own health records.
Data protection and consent
Businesses and governments that communicate most effectively with the public will gain most: by using simple, layered 'privacy notices', they can give citizens greater control over their data transactions and allow consumers to decide exactly how far they are willing to trust each data-holder. If informed, freely-given consent is the bedrock of a trusting relationship between a consumer and a data-holder, then it must always be part of that deal that consent freely given can also be freely withdrawn.
Nicola Blackwood MP said:
"Seeking to balance the potential benefits of big data and people's justified privacy concerns will not be straightforward. A debate is needed at this critical juncture, now that the new EU data protection regulation has been agreed.
"The Government must contribute to that debate by clarifying its interpretation of the effect of the EU Regulation on the re-use and de-anonymisation of personal data, and introduce changes to the Data Protection Act 1998 as soon as possible to strike a transparent and appropriate balance between the benefits and the privacy concerns."
The EU agreed a General Data Protection Regulation in December 2015. It will now require changes within the next two years to the UK's Data Protection Act 1998. The new EU Regulation appears to leave it open for data to be re-used, and anonymised data potentially de-anonymised, if "legitimate interests" or "public interest" considerations in the EU Regulation are invoked.
This is an issue that urgently needs to be addressed as big data becomes increasingly a part of our lives. The Committee urge the Government to introduce as soon as possible a criminal penalty for serious data protection breaches, and warn that the Government should not regard the two-year implementation period of the recently agreed EU data protection regulation as a reason for delaying this.
Nicola Blackwood MP, Chair of the Committee, said:
"The UK is a world leader in big data research across many disciplines and in our tech sector. But urgent action on the digital skills crisis is needed if the country is to take full advantage of its well-placed position in this sector."
Current UK data protections – set out in the Data Protection Act 1998 – cannot simply be left unchanged until the EU's new General Data Protection Regulation is implemented, the Committee argue. Two areas need to be addressed before then: the Committee urge the Government to introduce a criminal penalty for serious data protection breaches, and to roll out the Kitemark developed by the Information Commissioner to identify good data practice.
The Data Protection Act will have to be revised to accommodate the new EU rules, which come into force within the next two years. The new EU rules rely on greater fines, rather than criminal penalties, to deter data protection breaches.
The Committee say that the Government should establish a 'Council of Data Ethics' within the Alan Turing Institute as a means of addressing the growing legal and ethical challenges associated with balancing privacy, anonymisation, security and public benefit. Establishing such a Council, with appropriate terms of reference, will provide the clarity, stability and direction which has so far been lacking from the European debate on data protection issues.