Published by Alexander Hitchcock on 19 October 2016
In February 2017, Reform held a conference exploring the challenges and opportunities of Big Data in government. The conference explored public trust and attitudes to Big Data, and the use of Big Data in healthcare and criminal justice.
Both private and public sector institutions face issues in securing the public’s trust in the use of their data, as was revealed by a report commissioned by the Royal Statistical Society. Despite people’s low levels of trust, their actual behaviour seems to signal something different. Private companies, which attract higher levels of distrust than public institutions over how they use people’s data, nonetheless seem to meet less opposition when it comes to actually using people’s ‘digital footprint’. Amazon, Google and Facebook, for example, regularly use people’s data to inform their algorithms. Is this simply because, in the era of lengthy terms and conditions, it has become impossible to give informed consent?
The rise of Big Data and analytics has changed the terms of debate surrounding consent, data ownership, privacy and trust. Some of the principles underpinning the UK’s data protection law – such as purpose specification and data minimisation – conflict with the requirements of Big Data analytics. Those analytics presuppose large amounts of data and, according to the Royal Society, “finding purposes for that data in ways not originally anticipated”. Maximising the benefits of Big Data means building in flexibility and adaptation. There is a fundamental tension between the legislation in place and how Big Data analytics operate.
Research shows that people express higher levels of support for data sharing and linkage if the benefits and processes are made clear and proper safeguards are in place. Government should focus on this and embark on an open conversation with the public about what Big Data means in practice.
These tensions and questions need to be addressed if Government is to harness the full potential offered by Big Data analytics.
Since its earliest days, the NHS has embraced technology, with patients reaping the benefits. Cutting-edge research into drugs and diagnostic techniques has seen patients receive world-class care. With Big Data, however, healthcare has entered a new era.
Data-driven innovation is currently being used to improve population health and support patient-centred care, health system management and research. The 100,000 Genomes Project has fuelled the discovery of more than 1,800 disease genes. Today’s researchers can find a gene suspected of causing an inherited disease in a matter of days. The project has given families a diagnosis for their children’s mystery conditions. The hope is that mapping genomes will lead to a powerful form of preventive, personalized and pre-emptive medicine as clinicians tailor care to an individual’s DNA.
Equally exciting are computer programmes capable of digesting and analysing information for medical diagnostics. IBM’s Watson can read 40 million documents in 15 seconds. Watson has been taught how to understand and accumulate medical knowledge relating to oncology. Its creators believe it is better at diagnosing lung cancer than humans.
The Five Year Forward View aimed to digitise all health records by 2020. This timescale is optimistic but bringing together different health systems that share information would create the world’s largest health database. It could provide a golden opportunity for researchers to understand patterns of population health and disease.
If the NHS is to fully embrace the potential Big Data has to offer it must focus on protecting patient data and communicating change to the public. Patients have an expectation that their information is private and may not appreciate that it is possible to share data whilst protecting confidentiality.
Her Majesty’s Inspectorate of Constabulary recently found that, 40 per cent of the time, police community support officers (PCSOs) are not in the places where they are most needed. How do they know? They use Big Data. By analysing police records from 14 different forces over seven years, and matching these with more than 1,200 variables from the corresponding ONS Output Areas, the project created a model able to explain more than 80 per cent of the variance in crime. This was matched with PCSO locations to evaluate whether they could be deployed to greater effect.
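The underlying idea can be illustrated with a minimal sketch: regress area-level crime counts on area-level variables and report the share of variance explained (R²). This is not the HMIC model itself — the data below are synthetic stand-ins for the police records and ONS Output Area variables described above.

```python
# Illustrative sketch only: an ordinary-least-squares model of area-level
# crime, reporting explained variance (R^2). Synthetic data, not ONS data.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for Output Area variables (the real model drew on 1,200+).
n_areas, n_vars = 500, 10
X = rng.normal(size=(n_areas, n_vars))
true_weights = rng.normal(size=n_vars)
crime = X @ true_weights + rng.normal(scale=0.5, size=n_areas)  # noisy signal

# Fit OLS via least squares (design matrix with an intercept column).
A = np.column_stack([np.ones(n_areas), X])
coef, *_ = np.linalg.lstsq(A, crime, rcond=None)
predicted = A @ coef

# R^2: the proportion of variance in crime the area variables explain.
ss_res = np.sum((crime - predicted) ** 2)
ss_tot = np.sum((crime - crime.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"Explained variance (R^2): {r2:.2f}")
```

An R² above 0.8, as reported for the HMIC project, would mean the area-level variables account for most of the geographic variation in crime — which is what makes the model useful for deciding where PCSOs should be.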
‘Predictive policing’ has been implemented widely, especially in the US. More independent analyses need to be undertaken, but early results indicate that it has sped up crime reduction. So far, the focus has largely been on property crime, but data analytics have also been used to identify the most likely victims of cybercrime, and where and when women are at high risk of domestic abuse.
In addition to more efficient resource allocation, predictive policing has the potential to allow for more autonomous working practices in the police. With access to data on where crimes are most likely to occur, officers can make informed decisions on how to deploy their efforts.
However, not all responses to predictive policing have been positive. Given that certain crime types are more likely to be reported than others, one fear is that prevention efforts based on Big Data will only focus on these kinds of crimes. But this is no reason to scale back. Firstly, this is a dilemma police must already grapple with – how do you prevent, or even respond to, crimes that you are given no timely information about? Secondly, making traditional policing more efficient will, if anything, free up more resources for strategies addressing this issue.
As the face of public authority, policing has high-stakes consequences if it isn’t working well. Given the recent changes to police funding, it is as important as ever that police forces operate in the smartest way possible.