Sunday, 24 September 2017
The good shepherd model for cybersecurity
Stuart Clarke, CTO Cybersecurity, Nuix


In 2017, nearly all organisations store or process their customers' private information electronically to some extent, and with that comes a duty of care to protect it. Credit card numbers and other personal details fetch a high price on the black market and, unfortunately, many organisations do a very poor job of keeping them out of the hands of cybercriminals.

Regulators in many countries are now levying considerable penalties against organisations that fail to protect people’s private data. When the European Union’s General Data Protection Regulation (GDPR) comes into effect in May 2018, organisations face fines of up to €20m or 4 per cent of annual turnover for exposures of European citizens’ private data. They must also disclose breaches within 72 hours of discovering them.

The bad news is that breaches are inevitable. Security researchers believe determined attackers can infiltrate any perimeter security system. Even so, the majority of data exposures stem from internal causes: malicious insiders, lost or stolen devices, accidental misuse, or simple errors by IT and security administrators.

Minimising access, minimising damage

If you can’t prevent hackers or insiders from getting into your organisation’s network, it’s vital to minimise the damage they can cause.

Preparing for attack requires knowing where data is stored, what it is worth and how it is protected. Organisations may choose to use information governance technologies to monitor their data effectively and to reduce the cost and extent of cybersecurity breaches. These technologies provide visibility of unstructured data, so you can understand where high-value and high-risk private information is stored.
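As an illustration of what such visibility tooling does at its simplest, the Python sketch below scans free text for credit-card-like digit runs and validates them with the standard Luhn checksum. The regular expression and pattern set are minimal assumptions for illustration, not a substitute for a full information governance product:

```python
import re

# Candidate pattern: 13-16 digit runs, optionally separated by spaces or dashes
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    # Double every second digit from the right; subtract 9 if the result exceeds 9
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_card_numbers(text: str) -> list:
    """Return Luhn-valid card-like numbers found in a block of text."""
    hits = []
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits
```

Running a sweep like this across file shares and mailboxes is how governance tools surface where high-risk data actually lives, rather than where policy assumes it lives.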

Strictly limiting access to private information, for malicious and inept insiders and external hackers alike, minimises the risk that this high-risk data will be exposed.

Defending through deletion

Most organisations just don’t understand what data they’ve got. They store large volumes of data that has no business value—it’s duplicated, trivial, no longer used, past its retention period, or potentially risky. Many industries and jurisdictions have strict compliance rules around how long organisations must retain data. However, once that retention period is over, the risks and costs of keeping data greatly outweigh any residual value.

Erasing this low-value data, according to predefined and legally sanctioned rules, reduces risks and minimises the volume of data that could be compromised. This, in turn, reduces the scope of post-breach investigations, right-to-erasure requests, and subject access requests.
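A minimal sketch of what "predefined rules" might mean in practice is below; the retention periods keyed by file extension are purely illustrative assumptions (real policies come from legal and records-management teams, and would key on content classification, not extensions):

```python
import time
from pathlib import Path

# Illustrative retention rules in days, keyed by file extension.
# A real policy engine would classify content, not file names.
RETENTION_DAYS = {".log": 90, ".tmp": 7, ".bak": 30}

def expired_files(root, now=None):
    """List files under `root` whose age exceeds their retention period."""
    now = now or time.time()
    expired = []
    for path in Path(root).rglob("*"):
        days = RETENTION_DAYS.get(path.suffix)
        if days is None or not path.is_file():
            continue  # no rule for this type, or not a regular file
        age_days = (now - path.stat().st_mtime) / 86400
        if age_days > days:
            expired.append(path)
    return expired
```

Listing rather than deleting in the first pass is deliberate: defensible disposal needs a review step and an audit trail before anything is erased.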

In the longer term, information governance analysis can help you understand why this content is created or becomes low-value in the first place.

Data herding

Organisations often have intellectual property and company records such as contracts stored inappropriately in file shares or email attachments. Both records managers and end users struggle to find the time to ensure records are always filed correctly. Information governance technology can locate these records in the wild—often across dozens of storage systems and thousands of shares—and move them to controlled repositories with appropriate security, access controls, and retention rules. This makes it much harder for anyone to gain unauthorised access.
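As a rough sketch of the "herding" step, assuming filename patterns stand in for real content classification, moving record-like files from an open share into a locked-down repository might look like this:

```python
import shutil
from pathlib import Path

# Illustrative patterns for records found "in the wild"; a real deployment
# would classify by content, not file name.
RECORD_PATTERNS = ("*contract*.pdf", "*agreement*.docx")

def herd_records(share, repository):
    """Move record-like files from an open share into a controlled repository."""
    repo = Path(repository)
    repo.mkdir(parents=True, exist_ok=True)
    moved = []
    for pattern in RECORD_PATTERNS:
        for path in Path(share).rglob(pattern):
            dest = repo / path.name
            shutil.move(str(path), dest)
            dest.chmod(0o600)  # owner-only access in the controlled store
            moved.append(dest)
    return moved
```

The essential point is the pairing: relocation is only useful if the destination enforces tighter permissions, access controls, and retention rules than the share the record came from.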

Data security

Employees are known to make "convenience copies" of data to work from home or as test data for a new application. They may come across data that was generated for one purpose, such as legal discovery, and use it to fulfil other needs without understanding the privacy implications of doing so. And even if they dispose of this data correctly, it may still be retained in backups or archives.

By monitoring, in real time, access to this data and the locations it is copied or moved to, and by conducting periodic sweeps of email, file shares, and other unprotected systems, you can quickly locate and remediate unprotected private data. Understanding where this high-risk data is stored also means you don't need to spend time and effort protecting data that doesn't need it.

Access controls

It's simple: ensure that the only people with access to high-risk data are those who need it for their day-to-day work. With such a policy in place, the risk is minimised simply because fewer people have access. Disgruntled employees are a common cause of data loss incidents, and a strict policy of cancelling credentials as soon as someone leaves the organisation can prevent this. It can also be beneficial to investigate a dismissed employee's recent activity, including emails and any indication that they have mishandled personal information.
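A minimal sketch of the offboarding step is below, using a hypothetical in-memory account directory in place of a real identity provider (Active Directory, Okta, and the like expose equivalent disable operations through their APIs):

```python
from datetime import datetime, timezone

# Hypothetical in-memory account directory, standing in for an identity provider.
accounts = {}

def provision(user):
    """Create an active account for a new joiner."""
    accounts[user] = {"active": True, "disabled_at": None}

def offboard(user):
    """Disable credentials immediately and record when, for later review."""
    account = accounts.get(user)
    if account is None:
        raise KeyError("unknown user: " + user)
    account["active"] = False
    account["disabled_at"] = datetime.now(timezone.utc)
    return account

def can_log_in(user):
    """Authentication check: only active accounts may log in."""
    account = accounts.get(user)
    return bool(account and account["active"])
```

Recording the disable timestamp, rather than deleting the account outright, preserves the audit trail needed to investigate the leaver's recent activity.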

Regulation has made it clear that organisations need to change their view on the handling of personal data. Privacy must be built into systems from the ground up so that, by design, they protect consumer data from mishandling. Organisations can no longer live in denial, believing that breaches only happen to other people; they must focus on minimising both the opportunity for breaches and the damage breaches cause. Only with this approach can they be prepared for future attacks.

By Stuart Clarke, CTO Cybersecurity at Nuix
