COMPUTING
By Perry Hutton, Regional Director – Africa at Fortinet
Bringing the issue of security into the Big Data discussion often produces two divergent schools of thought among IT professionals: on one side, categorical denial that Big Data should be treated any differently from existing network infrastructure; on the other, a tendency to over-engineer the solution relative to the actual (or perceived) value of the data involved.
IDC defines Big Data in terms of four facets, each of which gives rise to challenges but also to opportunities:
- Volume: The amount of data is moving from terabytes to zettabytes (1 zettabyte is 10²¹ bytes, or 1,000,000,000 terabytes) and beyond
- Velocity: The speed of data (in and out), from static one-time datasets to ongoing streaming data
- Variety: The range of data types and sources – structured, semi-structured, unstructured or raw
- Value: The importance of the data in context
Yet, while Big Data presents new security challenges, the starting point for resolving them remains the same as for any other data security strategy: determine data confidentiality levels, identify and classify the most sensitive data, decide where critical data is to be located, and establish secure access models for both the data and the analysis.
Plan around the Big Data lifecycle
Defending Big Data properly means defining specific security requirements around the Big Data lifecycle. Typically, this begins with securing the collection of data, followed by securing access to it. As with most security policies, assessing the threats to an organisation’s Big Data is never finished; it is an ongoing effort centred on ensuring the integrity of data both at rest and during analysis.
Performance is a key consideration when securing the collected data and the networks that carry it. Firewalls and other network security devices, such as those providing encryption, must offer sufficiently high performance to handle the increased throughput, connection counts and application traffic. In a Big Data environment, policy creation and enforcement are more critical than usual because of the larger volumes of data and the number of people who will require access to it.
The sheer amount of data also proportionately increases the need to prevent data leakage. Data Loss Prevention (DLP) technologies should be employed to ensure that information is not leaked to unauthorised parties. Internal intrusion detection and data integrity systems must be used to detect advanced targeted attacks that have bypassed traditional protection mechanisms, for example through anomaly detection in the collection and aggregation layers. Packet data, flow data, sessions and transactions should all be inspected.
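As a purely illustrative sketch of the pattern matching that sits at the heart of many DLP checks, the example below scans outbound records for strings that look like payment card numbers before they leave the aggregation layer. The regular expression and sample records are assumptions for the sake of the example, not a production rule set.

```python
# Minimal DLP-style sketch: block records that appear to contain card numbers.
# The pattern and sample records are illustrative only, not a production rule set.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")  # crude match for 13-16 digit card-like strings

def leaks_sensitive_data(record: str) -> bool:
    """Return True if the outbound record matches a sensitive-data pattern."""
    return bool(CARD_PATTERN.search(record))

outbound = [
    '{"customer": "A-1001", "note": "renewal due"}',
    '{"customer": "A-1002", "note": "card 4111 1111 1111 1111 on file"}',
]

for record in outbound:
    if leaks_sensitive_data(record):
        print("BLOCKED:", record)
```

A real DLP deployment combines many such rules with exact-match fingerprints and contextual checks, but the principle of inspecting data before it leaves the environment is the same.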
Because Big Data involves information residing across a wide area and drawn from multiple sources, organisations also need the ability to protect data wherever it exists. In this regard, virtualised security appliances providing a complete range of security functionality must be positioned at key locations throughout the public, private and hybrid cloud architectures frequently found in Big Data environments. Resources must be connected in a secure manner, and data in transit from the sources to the Big Data storage must also be secured, typically through an IPSec tunnel.
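IPSec itself is configured on gateways and network devices rather than in application code, but the principle of protecting data in transit can be illustrated at the application layer. The sketch below is a minimal example using TLS (not IPSec) via Python’s standard ssl module; the host name, port and certificate path are hypothetical.

```python
# Minimal sketch of protecting data in transit at the application layer.
# This uses TLS, not IPSec (which operates at the network layer); it simply
# illustrates the same principle. Host, port and CA bundle path are hypothetical.
import socket
import ssl

context = ssl.create_default_context(cafile="/etc/pki/bigdata-ca.pem")  # hypothetical CA bundle

with socket.create_connection(("ingest.example.internal", 6514)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="ingest.example.internal") as tls_sock:
        tls_sock.sendall(b'{"sensor": "branch-42", "reading": 17.3}\n')  # sample record
```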
Leveraging Big Data with the right tools
While Big Data presents challenges, it also offers opportunities. With the right tools, vast amounts of information can be analysed, allowing an organisation to understand and benchmark normal activity. If the organisation then monitors for users who stray from that norm, it can proactively get ahead of potential data and system breaches.
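As a rough sketch of what benchmarking “normal” activity can look like in practice, the example below builds a per-user baseline of daily record accesses and flags anyone who strays far above it. The user names, figures and threshold are hypothetical.

```python
# Rough sketch: per-user behavioural baseline with a simple deviation check.
# User names, access counts and the sigma multiplier are hypothetical.
from statistics import mean, stdev

def flag_outliers(daily_counts, today, sigmas=3.0):
    """Return users whose activity today strays far above their own baseline."""
    flagged = []
    for user, history in daily_counts.items():
        baseline, spread = mean(history), stdev(history)
        if spread > 0 and today.get(user, 0) > baseline + sigmas * spread:
            flagged.append(user)
    return flagged

# Records accessed per day over the past week, per user (illustrative data).
history = {"analyst_a": [120, 130, 110, 125, 118], "analyst_b": [300, 290, 310, 305, 295]}
today = {"analyst_a": 122, "analyst_b": 4800}

print(flag_outliers(history, today))  # -> ['analyst_b']
```

Real behavioural analytics would consider far richer signals than a single count, but even this simple baseline-and-deviation idea shows how Big Data itself can be turned into a security asset.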
This effort is aided by competent IT staff and the efficient deployment of the appropriate security tools. These tools include dedicated logging, analysis and reporting appliances that can securely aggregate log data from security devices and other syslog-compatible sources. These appliances also analyse, report on and archive security events, network traffic, Web content and messaging data. Policy compliance can then be measured and easily customised reports produced.
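To give a concrete, if deliberately simplified, flavour of how log data can be aggregated from syslog-compatible devices, the sketch below listens for syslog messages over UDP and appends them to a central archive file. A dedicated logging appliance does far more (parsing, correlation, reporting and secure archiving); the port and archive path here are assumptions.

```python
# Deliberately simplified sketch of central syslog aggregation over UDP.
# A real logging appliance also parses, correlates, reports and archives securely.
import socketserver

ARCHIVE = "/var/log/bigdata/central.log"  # hypothetical archive location

class SyslogHandler(socketserver.BaseRequestHandler):
    def handle(self):
        message = self.request[0].decode("utf-8", errors="replace").strip()
        with open(ARCHIVE, "a") as archive:
            archive.write(f"{self.client_address[0]} {message}\n")  # record source and raw event

if __name__ == "__main__":
    # Port 514 is the conventional syslog port; binding to it usually requires privileges.
    with socketserver.UDPServer(("0.0.0.0", 514), SyslogHandler) as server:
        server.serve_forever()
```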
The difficulty of capturing, managing and processing information quickly in Big Data environments will continue to make security an afterthought in many firms. As portable storage and bandwidth continue to grow, so will the mobility of these larger datasets, resulting in breaches and the disclosure of sensitive data. Threats will likely come from intruders manipulating Big Data in such a way that business analytics and business intelligence tools generate false results, leading to management decisions that profit the intruders.
Even small changes in Big Data can have a big impact on results. Organisations must therefore not ignore the need to secure Big Data assets, whether for security reasons, business intelligence or otherwise. They must address Big Data’s main needs in terms of authentication, authorisation, role-based access control, auditing, monitoring, and backup and recovery. Going forward, Big Data analytics involving behavioural benchmarking and monitoring will also become increasingly crucial in addressing next-generation information security challenges.
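As a final, hedged illustration of role-based access control applied to Big Data assets, the sketch below maps roles to the datasets and actions they may use, refuses anything else and logs every decision for auditing. The role names, datasets and logging destination are hypothetical examples, not a recommended schema.

```python
# Minimal role-based access control sketch with an audit trail.
# Roles, datasets and actions here are hypothetical examples.
import logging

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("bigdata.audit")

# Which actions each role may perform on each dataset.
POLICY = {
    ("data_scientist", "clickstream"): {"read", "analyse"},
    ("data_engineer", "clickstream"): {"read", "write"},
    ("auditor", "access_logs"): {"read"},
}

def authorise(role: str, dataset: str, action: str) -> bool:
    """Allow the action only if the policy grants it, and audit every decision."""
    allowed = action in POLICY.get((role, dataset), set())
    audit.info("role=%s dataset=%s action=%s allowed=%s", role, dataset, action, allowed)
    return allowed

print(authorise("data_scientist", "clickstream", "read"))   # True
print(authorise("data_scientist", "access_logs", "read"))   # False - not in this role's policy
```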