Defence in depth is the only way to pre-empt the next vulnerability

Those responsible for security must plan for the fact that new vulnerabilities, which may compromise their data, will keep being discovered. The best way to mitigate that risk is to adopt a defence in depth strategy.

IT systems and software are designed to be secure, but with millions of lines of code and multiple third-party libraries, vulnerabilities will inevitably sneak in. System complexity is, without doubt, the enemy of security. Much software has evolved from, or must integrate with, earlier systems not designed with today’s threats in mind.

This leads to unforeseen risks, as we saw with the recent Venom vulnerability, and previously with Heartbleed and Poodle, where long-term vulnerabilities were revealed. Meanwhile, malware writers have become more sophisticated and dedicate more time to finding these ways in.

But if your job is to keep data secure, this unfortunately can’t be an excuse. You know vulnerabilities will be discovered in the systems you have selected, and you need to build a security strategy which plans for that. This is even more important if you move to the cloud, as you are at the mercy of the provider to upgrade and patch when vulnerabilities are revealed.

The solution is defence in depth.

You cannot rely on a single system or security product. You must have a cohesive strategy which uses layers of security to keep the bad stuff out and the good stuff safe. Comprehensive security is not about a single silver bullet; it is about an arsenal.

Start by stopping anything getting in. This means the usual firewalls and anti-virus, but these can be flawed, and malware can sometimes be hidden in legitimate files. They should be supplemented with deep content inspection technologies, which can scan legitimate files for anything suspicious hidden within them and then remove it. Other technologies, such as sandboxing and whitelisting, have a part to play as well.
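As a hedged illustration of the deep content inspection idea, the sketch below scans a file’s raw bytes for executable signatures hidden inside an otherwise legitimate document. The signature list and matching logic are simplified assumptions for illustration only, not how any vendor’s inspection engine actually works:

```python
# Toy deep content inspection: flag executable signatures embedded inside a
# file's bytes. Real engines parse each file format properly; this simplified
# sketch only illustrates the layered-scanning idea.
SUSPICIOUS_SIGNATURES = {
    b"MZ": "Windows executable header",
    b"\x7fELF": "Linux executable header",
}

def scan_for_embedded_content(data: bytes) -> list:
    """Return (offset, description) pairs for signatures found past offset 0."""
    findings = []
    for signature, description in SUSPICIOUS_SIGNATURES.items():
        offset = data.find(signature, 1)  # skip offset 0: the file's own header
        while offset != -1:
            findings.append((offset, description))
            offset = data.find(signature, offset + 1)
    return sorted(findings)

document = b"Normal document text..." + b"MZ\x90\x00" + b"...more text"
print(scan_for_embedded_content(document))  # [(23, 'Windows executable header')]
```

A real product would go further, stripping or sanitising the embedded content rather than merely reporting it, but the principle of looking inside legitimate files is the same.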

Most organisations will not necessarily want to buy everything – it just isn’t practical. Finding the solutions which offer the biggest ‘bang for the buck’ is extremely important. Purchasing solutions is also only half the challenge: they need to be configured and run, and the operational costs and skills requirements can put a different slant on ROI and the prioritisation of solutions.

Next, monitor what is going on inside the business. Deep Packet Inspection tools can be deployed to watch over the corporate network for anomalous behaviour that might be indicative of a malware infection. Again, this is part of the defence in depth strategy, but the costs and skill sets can be prohibitively high for all but the largest organisations.

Finally, there is the absolute requirement to ensure your most critical information can’t get out. Information is the lifeblood of the business, and should it fall into the wrong hands, either maliciously or inadvertently, the consequences can be dire.

Modern data loss prevention (DLP) solutions can be set up to monitor anything going in or out of your network and stop any critical information that shouldn’t be leaving, including financial information, intellectual property, and customer or personnel records.

This sounds dramatic, but needn’t hinder business – new Adaptive DLP technology offers an alternative. Gone are the days when such solutions could only block anything suspicious until the IT department had checked and approved it. Solutions now are far more intelligent and context aware, offering the ability to redact (remove) the critical information which breaks policy while leaving the rest to continue unhindered.
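A minimal sketch of that redaction idea follows. The card-number pattern and the policy are deliberately simplified illustrative assumptions, not how any particular DLP product works:

```python
import re

# Simplified stand-in for one DLP policy rule: a card-like number pattern.
# Real adaptive DLP uses far richer content and context analysis.
CARD_PATTERN = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def redact(message: str, replacement: str = "[REDACTED]") -> str:
    """Remove card-like numbers that break policy; the rest passes unhindered."""
    return CARD_PATTERN.sub(replacement, message)

print(redact("Invoice attached; card 4111 1111 1111 1111, exp 09/27."))
# The card number is redacted while the surrounding message is left intact.
```

The point of the sketch is the behaviour, not the pattern: the message still flows, and only the policy-breaking fragment is stripped out, rather than the whole communication being quarantined.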

Managing your workforce is always an issue. The usual well-worn advice around training and policy applies, but technology must also play a role. The remote workforce is often cited as a security concern, as it means staff working on private devices or insecure networks, which are easier to compromise.

This needn’t be the case. Staff can be provided with logins to work on the company network remotely, rather than taking important documents off it. If the information is on your network you have much more control over it. And the aforementioned Adaptive DLP approach can ensure staff can’t take sensitive information home, even if they try to send it to their personal email account or copy it onto a USB stick. Context is everything, not just the content.

This links to the fact that most threats come from inside your organisation, often from honest mistakes. Providing systems which ensure staff can work securely, without feeling they are under constant scrutiny, is paramount.

Selecting the right software and hardware for your different layers of security is an important part of defence in depth. For example, the US Navy recently raised security concerns around its IBM servers after the company’s server line was purchased by Lenovo.

This is an extreme example, but it is important to trust that the products you buy are designed securely and not likely to have backdoors – some products have gone through rigorous independent testing to prove this.

And finally, of course, you need to keep the infrastructure up to date, patching vulnerabilities in the OS and in applications as soon as they are discovered to reduce the risk of attack. You can never guarantee you are vulnerability-free, but if you have layers of security and keep them all up to date, one vulnerability shouldn’t put you at risk.

Unfortunately for the CIO, the cyber-attacker only has to get it right once, whereas the CIO needs to get it right all the time. But preventing a problem from occurring in the first place is much easier than trying to fix it after it has occurred.

Dr Guy Bunker is SVP Products at Clearswift


Type: White Paper
