The data management minefield

The public sector’s data management is looking increasingly like Swiss cheese – full of holes. The Information Commissioner’s report out last month highlighted that public sector organisations still aren’t taking the Data Protection Act (DPA) seriously enough. It’s hardly a new policy, nine years on, and in a time of extreme data sensitivity, with identity theft and fraud rife, you’d think the importance would have sunk in.
But it seems there has been problem after problem. Reports of data breaches are all too common. In a recent case, Newcastle City Council’s computer systems allowed the download of 54,000 customer credit card records. And this flaw wasn’t caused by an errant employee with a grudge or a hacker on a mission – the information was simply placed on an open server instead of a secure network, a completely preventable and inexcusable error.
More recently there was the major breach of the NHS MTAS recruitment system, where personal details of job applicants were left visible for everyone to see. These are just two examples in a long list of public sector data security failures.

Batten down the hatches
There is no denying that ensuring data security in a public sector organisation is a nightmarish task – and one that is getting harder as more services go online and more data is stored. But that doesn’t mean that, with greater focus, individual organisations can’t batten down the hatches and adhere closely to the standards and regulations set out in the Data Protection Act, preventing massive data security breaches and the resulting large fines and damage to reputation.
This growing volume of data, coupled with mounting pressure to share services (and often data too) across the public sector, is simply part of the ‘joined up government’ ethos. Asking consumers to provide the same information time and time again to different government agencies causes mass duplication of effort and is confusing for the customer – registering your details for council tax, for example, then having to register again for the electoral register. Keeping all these data sources bang up to date is a nigh on impossible task. Hence the need for joined up government. But although sharing data makes sense for most public sector organisations, it means securing data is akin to holding water in a sieve – when multiple agencies have access to it, the chances of a data breach increase significantly. Regardless of the difficulties, sharing is a must, and organisations have to do it carefully, and within the confines of the law.
Yet recent reports saw Home Secretary Jacqui Smith flouting the Data Protection Act 1998 by letting the Met use automatic number plate recognition (ANPR) data for crime fighting purposes. You could argue that a bending of the rules is justifiable when it concerns combating crime. However, this highlights two problems where adherence to the DPA is concerned. One is confusion over the public sector’s interpretation of the DPA and how it applies to them. The other is technology glitches that cause data security breaches.

Knowledge and education
The solution to the first problem – the confusion – is a more thorough approach to knowledge and education: some organisations simply don’t know where they stand and what applies to them. The Information Commissioner’s Guide will help organisations feel their way through the requirements and comply with the DPA, so that no inadvertent breaches occur.
Solving the second issue is more challenging. There are myriad public sector technology projects, and a good percentage of these are bound to have problems that can ultimately cause lapses in data security. Poorly designed or managed systems, unsecured data, poor password procedures and ubiquitous access are some of the key culprits. Simply assuming that protective processes are in place is not enough, and brushing any weaknesses under the carpet is not an option. An oversight of this nature recently stung UK bank Nationwide, which was fined nearly a million pounds over a stolen laptop that was not reported missing until almost three weeks after the theft. This punishment serves as a stark reminder to organisations in all sectors to be ultra careful about how they deal with data security breaches, and to have stringent processes in place to ensure that data is as secure as possible. Processes need to be checked, approved and audited time and time again.

Building partnerships
A major consideration is the fact that almost all of these huge technology projects are implemented through outsourced suppliers. That’s not to say the blame lies solely with them – any supplier relationship should be a partnership – but organisations need to choose a partner that can supply the consultancy and services needed. If dealing with sensitive data, organisations need to be given strategic advice about the data security policies they implement, and there needs to be crystal clear agreement about where the responsibility for security lies. This needs to be carefully set out in the contract.
Another cause of security breaches is lapses in quality assurance and testing. Often a system is fully functioning until a change is made, such as the application of a security patch or the integration of a new piece of software. This type of activity can trigger defects in the system, causing downtime or the exposure of data.
Data owners, whether they operate in the private or public sector, have a responsibility to ensure that information is safe and secure. Public sector organisations are more accountable than ever to the public, and the public expects to be able to trust them. If public sector organisations fail in their data protection responsibilities, it can damage their reputations and the public’s confidence in their ability to do a good job. Ultimately it is in their interest to get it right.

Paul Bentham is the public sector expert and partner in the technology and outsourcing group at Addleshaw Goddard.
