
INFORMATION PRESERVATION

01 May 10



A staff member once told me that just as he was leaving his previous organisation, management had come to the realisation that the grounds, properties and infrastructure they owned and maintained weren’t their most important assets. No, that was, in fact, their data. This realisation has spread across business and industry with companies now heavily, if not totally, reliant on the quality and availability of their data. This has never been more important than over the past 12 months during some of the toughest economic times most can remember. The ability to extract and report on customer information to help retain and drive business during this time has been critical to the sustainability of a number of businesses.

Businesses can no longer rely on tools such as Access to provide enterprise-scale database storage and reporting, and gone are the days of copying and pasting data into an Excel spreadsheet! Within the latest versions of these products, applications such as the Office suite become reporting tools in their own right, hooking directly into the database and offering far greater functionality.

By migrating Access databases to a true relational database, reports can be scheduled and delivered to users or customers in the format of their choice and at whatever time they choose. Users and customers can even be given the opportunity to implement desired changes themselves. This is a huge leap in database capabilities and can be done through tools that come with the database engine, or via associated third-party tools.
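The idea of extracting a report straight from a relational database for scheduled delivery can be sketched in a few lines. This is a minimal illustration using SQLite from Python's standard library; the table and column names ("orders", "customer", "amount") are hypothetical, and a production system would connect to its own RDBMS and hand the export to a scheduler such as cron or SQL Server Agent.

```python
import csv
import sqlite3

def export_report(db_path: str, out_path: str) -> int:
    """Run a summary query and write the result as CSV for delivery.

    Returns the number of data rows written. Schema is hypothetical.
    """
    conn = sqlite3.connect(db_path)
    try:
        rows = conn.execute(
            "SELECT customer, SUM(amount) AS total "
            "FROM orders GROUP BY customer ORDER BY total DESC"
        ).fetchall()
    finally:
        conn.close()
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["customer", "total"])
        writer.writerows(rows)
    return len(rows)
```

The same pattern scales up: swap the connection for the production database driver and the CSV writer for whatever output format the recipient chooses.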

With so many systems and applications reliant on a database as their ‘backend’, there is an ever-more critical need to have a standardised set of data for reporting purposes. Companies will often have separate customer relationship management, enterprise resource planning, financial and sales tracking systems, all of which contain a base set of data for their customers. With the tools available within these database products, the ability to extract, transform and load a standardised and normalised set of base data has become a much easier process. By de-duplicating this baseline data, errors in processing are reduced and more accurate and timely outcomes can be achieved, meaning better and quicker results – and happier customers.
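The normalise-then-de-duplicate step above can be illustrated with a toy sketch. The field names ("name", "email") and the choice of e-mail address as the matching key are assumptions for the example; real ETL tooling ships with the major database products and supports far richer matching rules.

```python
def normalise(record: dict) -> dict:
    """Bring a customer record from any source system into one canonical form."""
    return {
        "name": " ".join(record["name"].split()).title(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records: list[dict]) -> list[dict]:
    """Collapse records from multiple systems into one row per customer."""
    seen: dict[str, dict] = {}
    for rec in map(normalise, records):
        # The normalised e-mail address acts as the matching key;
        # the first record seen for each key wins.
        seen.setdefault(rec["email"], rec)
    return list(seen.values())
```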

While the reporting capabilities have improved, so have the high-availability and disaster recovery options. The ability to take copies of customer data off-site no longer relies on writing backups to tape and having a courier move them to a different location. Current database technologies mean real-time copies of databases can be spread across multiple servers in one location, using functionality such as real application clustering (RAC) or database clustering, or across disparate locations using geo-clustering. Copies of databases can also be transferred in real time to off-site locations using tools such as Data Guard, database mirroring or log shipping. With the prevalence of virtualisation technologies and changes in product licensing, these options become easy to implement and cost-effective.
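The principle behind log shipping is simple enough to sketch: periodically copy the primary's transaction-log backups to a standby location, where they are restored in sequence. The sketch below only does the copy step, and the directory layout and `.trn` file extension are assumptions; real implementations such as Oracle Data Guard or SQL Server log shipping are built into the database engine.

```python
import shutil
from pathlib import Path

def ship_logs(primary_dir: str, standby_dir: str) -> list[str]:
    """Copy any transaction-log backups not yet present on the standby.

    Returns the names of the files shipped on this pass.
    """
    src, dst = Path(primary_dir), Path(standby_dir)
    dst.mkdir(parents=True, exist_ok=True)
    shipped = []
    # Ship in name order so the standby can replay logs in sequence.
    for log in sorted(src.glob("*.trn")):
        target = dst / log.name
        if not target.exists():
            shutil.copy2(log, target)
            shipped.append(log.name)
    return shipped
```

Run on a schedule, each pass ships only the new log backups, keeping the standby a short, bounded interval behind the primary.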

With reliance on data being so high, it is critical that you maintain its security, ensure it can be used in the way it is needed, and that it can be recovered in times of crisis. You need to ensure there are well-documented processes and procedures for management, maintenance and backup of the databases. Here, businesses need to understand two key areas: the recovery point objective (RPO) – how much data they can afford to lose – and the recovery time objective (RTO) – how long they can afford to have their systems unavailable prior to recovery. These will drive backup and recovery strategies that match requirements.
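The arithmetic behind these two objectives is worth making concrete. In the worst case a failure strikes just before the next backup, so the backup interval must not exceed the RPO, and the full restore must fit within the RTO. A minimal check, with illustrative figures only:

```python
def meets_rpo(backup_interval_min: int, rpo_min: int) -> bool:
    """True if the backup schedule satisfies the recovery point objective.

    Worst case, a failure happens just before the next backup runs,
    losing one full interval of data - so the interval between log
    backups must not exceed the RPO.
    """
    return backup_interval_min <= rpo_min

def meets_rto(restore_min: int, rto_min: int) -> bool:
    """True if the measured restore time satisfies the recovery time objective.

    The full restore, including any log replay, must complete within
    the window the business can tolerate being offline.
    """
    return restore_min <= rto_min
```

For example, an RPO of 15 minutes means transaction-log backups at least every 15 minutes, regardless of how often full backups run.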

While new and improved tools provide great opportunities and leaps forward in functionality, the base tasks of securing and maintaining the database and the data remain fundamental to a great user experience. No matter what flavour of relational database management system you choose to run, it is imperative that processes and procedures are put in place to ensure its availability. Often the focus is placed on the delivery of the data and it’s only when it becomes unavailable, normally at the end of the month, that users and organisations realise how important the core database administration function is. Don’t wait to find yourself in this situation!
