Radical or back-to-basics?

Many talk about information flow; the challenge is that it doesn’t flow unless you design it that way, says Nigel Tozer, business development director, EMEA, CommVault

You may have issues with your data system and, with budgets tight, you might be forgiven for opting to build a new one with just a few improvements – using, for instance, a different product that performs broadly the same task, but without the issues of the prior system. This can be flawed thinking, though, as the new product will undoubtedly have different challenges that you can only hope are not as bad as the last. Changes like these can only ever deliver incremental improvement; they never provide something fundamentally different.
You could argue that this is a case against doing anything at all, but of course it isn’t. Replacing a physical server with a virtual one, for example, leaves you with much the same server functionality, yet the operational benefits and efficiencies come in droves. Transformational changes like this cost money, but they save far more through better utilisation, or open up possibilities that simply were not available before. Unfortunately, talk to anyone with responsibility for Information and Communications Technology (ICT) in the public sector today and you will get a worried frown – where will the money come from, even for cost savings? Where will the cuts in spending and staff fall?

Finding the savings
With the spectre of cuts almost everywhere, only the IT projects with the strongest justification will get the nod – and these will often be the ones that also deliver assured savings. This leads us back to where we began: incremental changes yield only incremental savings. Big changes to infrastructure and different approaches, however, can yield huge savings and deliver services once thought too expensive. A few key methods and initiatives are effective on their own but even more worthwhile in combination, such as data management and shared services.
The first of these is to implement a data and information management strategy. This may sound very basic to some, or a massive job to others, but very few see data as a river running through the organisation; it is usually seen as “piles of stuff” scattered around, categorised by owner or function and treated accordingly. This scattered approach often suits the data management vendors, who are happy to sell lots of products to deal with these piles of data all over the place.
Stepping back and taking the “river” view allows you to plot where data should be throughout its life, automating its travel and protection along the way. Combined backup, disaster recovery and archive strategies can significantly reduce the amount of tier-one storage required, as well as massively cut the cost of Disaster Recovery (DR) without necessarily pooling resources. Another advantage of handing control of data management over to a software layer is that it breaks the link between features and hardware, allowing a free choice of disk, with a number of knock-on effects. Firstly, disk prices drop when you are not tied to a specific vendor, and innovations such as drives that spin down can be introduced to support other key initiatives, such as carbon reduction, without continually pumping more disk into tier-one arrays. Secondly, a single point of control drastically reduces integration and reporting headaches, with far less time spent on customisation or what should be simple configuration changes.
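The “river” view described above can be pictured as a simple age-based tiering policy. The sketch below is illustrative only: the tier names, age thresholds and `target_tier` function are assumptions for the purpose of the example, not features of any particular product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical tiering rules: items older than a threshold move down a tier.
POLICY = [
    (timedelta(days=30), "tier-1"),   # recent, active data on fast disk
    (timedelta(days=365), "tier-2"),  # older data on cheaper disk
    (timedelta.max, "archive"),       # everything else goes to the archive
]

@dataclass
class DataItem:
    name: str
    last_accessed: datetime

def target_tier(item: DataItem, now: datetime) -> str:
    """Return the tier an item belongs in, given how long it has been idle."""
    age = now - item.last_accessed
    for threshold, tier in POLICY:
        if age < threshold:
            return tier
    return "archive"
```

In a real deployment the software layer would apply such rules continuously, moving data between tiers and the archive without manual intervention.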

Avoiding black holes
The other key change that comes from treating information as something that flows is the opportunity to dip into one place and see everything that runs past. Right now, many organisations in the public sector struggle with freedom of information (FOI) requests, or can see the benefit of something like the Local Government Classification Scheme but cannot find a cost-effective way to implement it. Even those that introduce enterprise search technology often find gaps in coverage and end up with two or more search tools – one for the compliance archive and one for live data, for example. Combining and correlating results from two or more systems can be a tiresome task and, worst of all, there will still be huge “black holes” where key information can hide. Having one place to search live data, archive and backup provides real benefits, and once in place it is possible to design automated searches, so that data required on a regular basis can be pre-searched and collated ready for use.
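The idea of pre-searching recurring queries across live data, archive and backup can be sketched as follows. The source names, documents and `search` function are made-up examples standing in for whatever unified search interface is actually in place.

```python
# Toy corpus: one document list per data source. In practice these would be
# the live estate, the compliance archive and the backup store.
SOURCES = {
    "live":    ["budget report 2024", "planning appeal notes"],
    "archive": ["budget report 2019", "old planning appeal"],
    "backup":  ["budget spreadsheet backup"],
}

def search(source: str, term: str) -> list[str]:
    """Case-insensitive substring match over one source's documents."""
    return [doc for doc in SOURCES[source] if term.lower() in doc.lower()]

def pre_search(term: str) -> dict[str, list[str]]:
    """Run the same query across every source in one pass, so regularly
    requested information is collated before anyone asks for it."""
    return {source: search(source, term) for source in SOURCES}
```

Scheduling `pre_search` for common FOI topics would leave the collated results ready for use, with no black holes between the separate tools.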
The biggest excuse held up by ICT executives for not having a true data and information strategy is the lack of a mandate for unilaterally applying data management policies – guidelines certainly exist, but these are often seen as minimum standards. Better to have forward-thinking ICT executives who stay ahead of the curve than those who wait for regulation and rely on expensive third-party eDiscovery services, when the issues could have been dealt with at the root cause for much less.

Key initiatives
A great example of this is e-mail archiving and the related search tools. For years, IT managers and directors alike said there was no need for compliance-style archiving, which was predominantly designed for US regulation that did not apply to the business world in Europe, let alone government bodies. Time has proven that those who did implement such an archive made big savings: the dreaded Outlook PST files that clogged up backup jobs disappeared, and employees who could find e-mail and attachments more quickly, free of inbox space limits, became much more productive. Not only does this kind of archive help with backup and staff productivity, it also improves the reliability of the e-mail servers and reduces the need for so much tier-one disk for e-mail. If not done correctly, though, these archives simply add to the “piles of data” mentioned earlier.
The second key initiative, positively encouraged by central government, is shared services. Economies of scale make perfect sense, and a pooled budget used centrally can achieve levels of service that are out of reach of more modest purchasing power. There are many challenges to overcome – well documented in terms of partners, personnel, decision making, legal and budget issues – but once these are resolved the dividends kick in. The initial decision of exactly what to share is often the easiest: common core services such as HR, and net-new projects with a lead authority, are favourites. It could even be as simple as providing rack space and power for DR on a mutual basis, although that is unlikely to deliver a significant financial benefit.
Combining data and information management into a shared service, rather than attaching it to a group of processes such as a common media or HR department, may seem alien, but it is a fundamental shift that sets cost savings in stone for many years. Commercial entities have traditionally made more use of hosted services than public sector bodies, and this is increasing with the advent of smarter, more customisable and more scalable technologies delivered via a cloud model. Government directives on data security often preclude cloud use for public bodies, but there is nothing to stop groups of such bodies creating their own “cloud” of IT services – or of data and information management specifically, such as backup, DR, archive and eDiscovery.

Overcoming barriers
In the past, the barriers to sharing IT were technological, especially for data protection and eDiscovery. One party being able to access or change protection levels for another party’s privileged or secure information is not an option; nor is running a search against documents containing personal data without the authority to do so. You would not want a call centre operative, for example, able to run searches that pulled up child welfare information.
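The scoping described above amounts to filtering search by role, with every query audited. A minimal sketch, in which the roles, categories, documents and `scoped_search` function are all invented for illustration:

```python
# Which data categories each role may search. A call centre operative
# cannot see welfare records at all; legal can see both categories.
ROLE_SCOPE = {
    "legal":       {"contracts", "welfare"},
    "call_centre": {"contracts"},
}

# Toy documents, each tagged with a category.
DOCUMENTS = [
    ("contracts", "supplier agreement 2024"),
    ("welfare",   "child welfare case notes"),
]

audit_log: list[tuple[str, str]] = []  # (role, query) pairs: who searched what

def scoped_search(role: str, term: str) -> list[str]:
    """Search only the categories a role is allowed to see, and audit it."""
    audit_log.append((role, term))
    allowed = ROLE_SCOPE.get(role, set())
    return [doc for category, doc in DOCUMENTS
            if category in allowed and term.lower() in doc.lower()]
```

The out-of-scope query still lands in the audit trail, so there is a record of the attempt even though it returned nothing.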
Thankfully, choosing the right technology means these challenges simply are not there any more, which paves the way for shared data and information services that are transformational. Not only can control be central, it can also be confidently delegated or made self-service, with full auditing of who did what, and when, for extra peace of mind. Alongside this level of management, data protection functions can be merged into fault-tolerant, load-balanced processes that reduce the server and media footprint just as virtualisation does for the production server estate. Choosing the right system means that archive can be rolled into the same infrastructure for even greater savings. Deduplication is one of the hottest technologies in the data protection space and can be distributed effectively via the data management software layer – off-site DR and remote protection suddenly become viable and automated from one place, without expensive hardware lock-in or restore penalties.
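Deduplication works by identifying repeated chunks of data by content hash and storing each unique chunk only once. The sketch below shows the principle with fixed-size chunks; it is not any vendor’s implementation, and the tiny chunk size is chosen purely for readability (real systems chunk in kilobytes and often use variable boundaries).

```python
import hashlib

CHUNK_SIZE = 4  # deliberately tiny for illustration

def dedup_store(data: bytes, store: dict[str, bytes]) -> list[str]:
    """Split data into fixed-size chunks, store each unique chunk once
    (keyed by its SHA-256 digest) and return the list of chunk keys."""
    keys = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)  # written only if not already present
        keys.append(key)
    return keys

def restore(keys: list[str], store: dict[str, bytes]) -> bytes:
    """Reassemble the original data from the stored chunks."""
    return b"".join(store[k] for k in keys)
```

Because only unique chunks cross the wire, the same mechanism is what makes off-site DR and remote protection affordable over modest links.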
While all of this is transformational from an ICT perspective, beyond the budget savings and the ability to increase the quality of service with less of everything, simply gaining access to the right information can be the biggest benefit. Legal, HR or regulatory searches become quicker and easier without the black holes; they can even be spread throughout the user base, with secure role-based limits on scope. Data and information management seems anything but radical until you really do get back to basics – which is when fundamental improvements can be achieved.

For more information
Web: www.commvault.com