Avoid Downtime and Security Issues With These 5 Data Protection Best Practices

In today’s ultra-competitive 24x7 business environment, downtime is not an option. Everyone—from employees to customers to business partners—expects systems and data to be available all the time. And they expect that data to be secure as well.

This presents a huge challenge for senior IT and business executives, who are charged with ensuring data protection and availability while providing a strategic vision for risk mitigation. IT environments have grown more complex in many ways: endpoints now stretch well past the boundaries of corporate firewalls, and many organizations operate hybrid computing environments that include a variety of cloud-based services.

But challenging does not mean impossible. By following several best practices, an organization can create an infrastructure that safeguards valuable data resources while making sure users can access the information they need—when they need it. Here are five steps companies can take to protect their data:

1. Determine what the cost of system downtime would be to the organization. Studies show this cost is far from trivial: research by the Ponemon Institute, for example, put the average cost per minute of unplanned data center downtime at $7,900, and that figure is only likely to climb each year. Quantifying how much downtime can cost the organization helps managers build a business case for a more reliable environment.
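The arithmetic behind such a business case is simple enough to sketch. The $7,900-per-minute figure below comes from the Ponemon research cited above; the availability target is a hypothetical assumption for illustration.

```python
# Rough downtime cost estimate. COST_PER_MINUTE is the Ponemon Institute
# average cited above; the availability level passed in is an assumption.
COST_PER_MINUTE = 7_900          # USD per minute of unplanned downtime
MINUTES_PER_YEAR = 365 * 24 * 60

def annual_downtime_cost(availability: float,
                         cost_per_minute: float = COST_PER_MINUTE) -> float:
    """Estimate yearly downtime cost for a given availability (e.g. 0.999)."""
    downtime_minutes = MINUTES_PER_YEAR * (1 - availability)
    return downtime_minutes * cost_per_minute

# "Three nines" (99.9%) still allows roughly 526 minutes of downtime a year,
# which at the average rate works out to several million dollars.
print(f"${annual_downtime_cost(0.999):,.0f}")
```

Even a back-of-the-envelope calculation like this can make the reliability conversation concrete for executives.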

2. Determine the risks to the organization if it loses data. Anyone who has followed the major data breaches of the past few years knows how significant an impact these attacks can have on organizations. It’s not just a matter of the actual records lost or stolen by hackers or other cyber criminals—the cost in terms of damaged reputation, brand and potentially lost business can be huge. Then there are the regulatory compliance issues and possible fines. Again, quantifying what you’re dealing with in terms of risk is important for building a solution.

3. Assess where the organization stands with data protection, in terms of its people, business processes and procedures. What types of policies and procedures are in place to ensure data integrity and security? Do the company's business processes protect critical data by backing it up, or do they put it at risk? Are employees receiving proper training so they know what steps to take to safeguard data? These are just a few of the questions managers should be asking when assessing the organization's current data protection posture. This assessment will provide a baseline for making improvements.

4. Assess which technologies are in place for data protection and how well they are working. A huge part of the data protection effort has to do with technology: everything from passwords to firewalls to encryption to backup and recovery. A comprehensive assessment will reveal whether any key technological pieces are missing, or whether some of the tools in place need to be upgraded. This includes evaluating mobile devices and ensuring that proper authentication and access mechanisms are in place.
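To make "proper authentication mechanisms" concrete, here is a minimal sketch of one such control: storing passwords as salted, slow hashes rather than in plain text, using Python's standard library. The iteration count and parameter choices are illustrative assumptions, not a vetted security configuration.

```python
# Minimal sketch of salted password hashing with PBKDF2 (Python stdlib).
# ITERATIONS is an assumed work factor; tune it to your own hardware
# and current security guidance.
import hashlib
import hmac
import os

ITERATIONS = 600_000

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived key) for storage; never store the raw password."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key

def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
    """Re-derive the key from the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_key)
```

An assessment that finds passwords stored in plain text, or hashed without a salt and work factor, has found one of the "missing pieces" this step is meant to surface.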

5. Implement the technologies needed to advance the organization to the next level of data protection. Based on the information gleaned from the technology assessment, deploy whichever hardware and software tools and services are needed to further strengthen data protection. But remember that along with the technology, people are an essential component of the data protection program. Make sure everyone understands that these technologies must be used properly in order to be effective.

Data protection is a fundamental business issue, not just an IT issue. Given the potential economic impact of downtime, data breaches and other risks—as well as the loss of customer confidence and brand reputation—senior executives need to make this a high priority.