There have been a couple good white papers recently dealing with security issues. One focuses on big data analytics to predict security risks while the other looks at recovering from a data breach or loss. Consider them a before and after look at protecting your data.
It seems only logical to start by preventing security risks rather than closing the barn door after the cows have escaped. Platfora sponsored a white paper called "Big Data Analytics on Security Data."
The article starts with the eye-opening proclamation that data breaches are no longer just a security problem. They are a major revenue concern: the average cost of a data breach for corporations is $3.5 million, according to figures from the Ponemon Institute, which conducts independent research on privacy, data protection and information security policy. The average cost of a single lost or stolen record totals $145, up 15 percent from 2013, Ponemon says.
The major challenges to security are bring your own device (BYOD) policies, massive volumes of data and "siloed" IT and security products. The Platfora report quotes James Comey, FBI director, who says, "There are now only two types of companies left in the United States: those that have been hacked and those that don't know they've been hacked."
A typical data breach lasts eight months, or 243 days to be precise, according to Platfora. It outlines the steps by which cybercriminals succeed:
- Reconnaissance – Cybercriminals gather information available online to create targeted campaigns designed to trick people into sharing user credentials.
- Weaponization – Many adversaries are well funded, which means they have the resources to craft a custom attack vector and payload.
- Delivery – Cybercriminals will iterate and test different variations of the attack payload until they eventually circumvent security solutions.
- Exploit/Install – Exploiting vulnerabilities and establishing a foothold on an end-user machine, network, or data center is another tactic.
- Command and Control – Malicious actors establish the command-and-control (C2) communication channel that enables delivery of future attack payloads.
- Exfiltration – Exfiltration is the act of slowly and steadily siphoning high-value information, such as millions of credit card numbers.
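The stages above can be sketched as a simple tagging pass over security events. This is a hypothetical illustration only; the event types, keywords, and log format are assumptions, not details from the Platfora paper:

```python
# Minimal sketch: map security log event types to kill-chain stages.
# The stage names follow the list above; the indicator keywords are
# illustrative assumptions.

KILL_CHAIN = [
    ("reconnaissance", ["port_scan", "dns_enum", "phishing_probe"]),
    ("weaponization", ["payload_build"]),
    ("delivery", ["malicious_attachment", "drive_by_download"]),
    ("exploit_install", ["exploit_attempt", "new_service_installed"]),
    ("command_and_control", ["beacon", "c2_callback"]),
    ("exfiltration", ["large_outbound_transfer"]),
]

def tag_stage(event_type: str) -> str:
    """Return the kill-chain stage for an event type, or 'unknown'."""
    for stage, indicators in KILL_CHAIN:
        if event_type in indicators:
            return stage
    return "unknown"

print(tag_stage("beacon"))        # command_and_control
print(tag_stage("port_scan"))     # reconnaissance
```

In practice, a real security analytics platform would correlate many such tagged events over time to reconstruct an attack sequence, rather than classifying single events in isolation.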
Platfora says security analysts are going to need a security analytics platform that is built for big data. Not surprisingly, Platfora has just rolled out a big data security analytics platform. As I've said in the past, just because a company is trying to sell you something doesn't mean the information isn't valid.
On its company blog, Platfora makes these recommendations for a security analytics solution. It needs to:
- Identify the Sequence of an Attack – You must be able to easily access and analyze data surrounding the incident, below the waterline, to identify anomalies and patterns that are not "normal."
- Ask Many Questions and Get Answers Rapidly – To get answers, you must be able to ask as many questions as needed and receive fast responses. You also have to be able to quickly pivot your investigation and next round of questions based on the responses and data from your initial set of questions.
- Derive Insights on Petabytes of Data – In a typical organization, the security events and audit logs from IT, user, and business applications amount to several terabytes (TB) of data. To detect anomalies and patterns, you need a baseline of at least six months of data.
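The baselining idea in the last recommendation can be sketched very simply: establish what "normal" looks like over a historical window, then flag deviations. This is a minimal illustration, not Platfora's method; the three-sigma threshold and the use of daily event counts are assumptions:

```python
# Minimal sketch of anomaly detection against a historical baseline:
# flag days whose event counts fall more than `sigma` standard
# deviations from the baseline mean.
from statistics import mean, stdev

def find_anomalies(baseline, recent, sigma=3.0):
    """Return indices of recent daily counts outside mean +/- sigma*stdev."""
    mu = mean(baseline)
    sd = stdev(baseline)
    return [i for i, count in enumerate(recent)
            if abs(count - mu) > sigma * sd]

# A stable baseline of daily security-event counts, then a spike.
baseline = [100, 98, 103, 101, 99, 102, 100, 97]
recent = [101, 99, 180, 100]
print(find_anomalies(baseline, recent))  # [2] -- the spike on day index 2
```

A six-month baseline, as the recommendation suggests, matters because short windows make ordinary weekly or seasonal variation look anomalous.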
What can you do to recover after a security breach? A white paper sponsored by Carbonite seeks to help you with that issue and stresses the best recovery happens because of good preparation.
One major step should be taken before your data is ever breached. Carbonite suggests having response time built into your service-level agreement with the company doing your data backup. As Carbonite points out, the response time needs to indicate how quickly a problem will be acknowledged and then approximately how long it will take to resolve the issue.
As the company points out, “Obviously, the time to completion is dictated by the amount of data needing to be restored and the technology in place but expectation should be clearly set as to how long it will take before the restore process is initiated.”