Automation, the Skills Shortage and Cybersecurity

We’ve all heard of the wider IT skills shortage, but the lack of security skills in the industry is even more critical. According to the report The Life and Times of Cybersecurity Professionals, IT workers with specialist cyber security skills are approached with a new job offer at least once a week. In fact, 45 per cent of organisations claim to be severely lacking in this specific area of talent.

This high demand provides an opportunity for IT contractors to cash in. But, if the opportunity is so lucrative, where are these talented recruits hiding?

Unlike the machines competing in the Cyber Grand Challenge, cyber security professionals have a much more demanding role. The process of manually finding and countering hacks and other cyber threats can be incredibly labour-intensive. Professionals in this field work long hours, often having to search millions of lines of code to fix vulnerabilities.

This process requires patience, resources and, most importantly, knowledge and experience. Unfortunately, the cyber security talent pool simply isn’t wide enough to meet these needs.

In Curo Talent’s survey of IT contractors, IT Talent Acquisition: The Candidate’s View, 31 per cent of IT contractors identified salary and day rate as the biggest attraction when looking for a new IT job, another indication that finding professionals with cyber security skills could be costly.

A further 23 per cent agreed that an interesting job opportunity was what attracted them to a role. However, research by the Information Systems Security Association states that 70 per cent of IT workers believe the lack of cyber security professionals is negatively impacting their existing work. Due to an absence of support, much of their valuable time is spent fixing emergency IT issues and training junior employees, instead of advancing their own cyber security skillset.

Lack of professional development is also causing IT professionals to lag behind security threats, widening the gap between IT threats and the people capable of combatting them.

Cyber security challenges are continually evolving as hackers change their tactics, techniques and technologies. IT contractors have an advantage here, as working for a variety of organisations and sectors exposes them to a wider range of threats. That being said, continuous education for permanent IT security staff is vital.

Unfortunately, re-education is not a practical solution. Not only are there too few workers operating in this sector, but the rate at which threats are developing is simply too fast for them to keep their heads above bug-ridden waters. It is a unique challenge, but automation could be the answer.

A Threat, or a Blessing?

Since robotic automation was introduced to the automotive production line in the 1960s, the threat that automation poses to manual jobs has been widely discussed. However, the machine economy now comprises much more than mechanical muscle. Advances in artificial intelligence and automated software are now threatening more functional and intelligence-driven roles, including those in IT.

However, IT professionals already know this.

Aside from the skills shortage, the rise of automation is one of the most talked-about topics in the realm of IT. Estimates suggest that up to 80 per cent of jobs in the sector could be at risk due to an increase in automated technology and the potential of artificial intelligence (AI). But, in cyber security, which is so severely lacking in talent, is this technology really such a bad thing?

Ultimately, it depends on how you look at it. It is becoming increasingly difficult for human operators to manage all aspects of cyber security — particularly in areas that generate massive amounts of data. Let’s face it, even the AI sceptics among us are likely to use some form of automated technology.

Automating Analysis

Security automation is already widely used to take routine security decisions off workers’ hands. Automated software can be trained to detect specific threats instantly, for example by flagging threatening e-mail attachments and scanning inbound messages for malicious URLs. Once a threat is identified, the software can also act to remove it.
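
As a rough sketch of the kind of rule such software automates, the Python below scans an e-mail body for URLs and checks each domain against a blocklist. The blocklist entries, the sample message and the helper functions are invented for illustration; a real product would pull indicators from threat-intelligence feeds and act on far richer signals.

```python
import re

# Hypothetical blocklist; real tools populate this from threat-intelligence feeds.
MALICIOUS_DOMAINS = {"malware-update.example", "free-gift-cards.example"}

URL_PATTERN = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)


def extract_domains(message_body: str) -> set:
    """Return the domain portion of every URL found in the message."""
    return {match.group(1).lower() for match in URL_PATTERN.finditer(message_body)}


def is_suspicious(message_body: str) -> bool:
    """Flag the message if any embedded URL points at a blocklisted domain."""
    return bool(extract_domains(message_body) & MALICIOUS_DOMAINS)


if __name__ == "__main__":
    sample = "Your invoice is ready: http://malware-update.example/invoice.exe"
    action = "quarantine" if is_suspicious(sample) else "deliver"
    print(f"Decision for sample message: {action}")
```

The value is not the rule itself but the fact that it runs on every message, every time, without tiring.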

Unlike humans, this type of automation has no limit on the amount of information it can process, so it can scan and monitor a system continuously. IT workers can identify only a small fraction of these system threats in their own security analysis, and they certainly cannot perform that analysis around the clock.

Even the most advanced and capable IT teams cannot analyse the wealth of data generated today as accurately as automation can. However, with the support of automation, IT workers can dedicate more time to investigating anomalies and unusual or more serious threats — and perhaps get one step ahead of the bad guys.

Security automation is particularly advantageous for managing low-risk threats, but it cannot be solely responsible for the security efforts of an entire organisation. For more complex security problems, experienced security specialists are still required to analyse the severity of a threat, particularly when it comes to new types of attack that the software is unfamiliar with.

Data Deception

The advantages of implementing automation for security testing are obvious, but the growth of machine learning also provides an opportunity for technology to proactively bolster an organisation’s cyber security, rather than just support it. An example of this would be data deception.

Cyber security specialists have long used deceptive tools to manipulate attackers trying to breach a system. For example, by creating false servers containing incorrect data, IT workers can trick hackers into probing these assets and revealing their tactics. This method, sometimes referred to as the honeypot technique, is often used by security firms to gather intelligence on new hacking techniques.

To be effective, honeypot techniques must be implemented on a grand scale. The real network is littered with many small pieces of information to entice hackers: seemingly valuable but fake data such as customer details, login credentials and intellectual property. Rather than leading hackers to data they can ransom or sell, this instigates a confusing process that directs them away from any real, valuable data.

A single honeypot is nowhere near as effective as a system of several. And while configuring an individual honeypot can be relatively straightforward, usually little more than one simple algorithm, continually monitoring this maze of deception is a long-winded process for IT workers, which makes manual data deception impractical.
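
To give a sense of how lightweight a single decoy can be, here is a minimal honeypot sketch: a fake network service that accepts connections, returns a fabricated banner and logs who knocked. The port, banner and log file are assumptions made for the example; deception platforms deploy and correlate many such decoys automatically, which is exactly the monitoring burden described above.

```python
import socket
from datetime import datetime, timezone

DECOY_PORT = 2222                          # assumed port for the fake service
FAKE_BANNER = b"SSH-2.0-OpenSSH_7.4\r\n"   # fabricated banner to look plausible
LOG_FILE = "honeypot.log"


def run_decoy() -> None:
    """Listen on the decoy port and log every connection attempt."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", DECOY_PORT))
        server.listen()
        while True:
            conn, (address, port) = server.accept()
            with conn:
                conn.sendall(FAKE_BANNER)                  # serve the decoy banner
                stamp = datetime.now(timezone.utc).isoformat()
                with open(LOG_FILE, "a") as log:           # record the probe
                    log.write(f"{stamp} connection from {address}:{port}\n")


if __name__ == "__main__":
    run_decoy()
```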

Data deception technologies are beginning to take this responsibility away from workers. Automated products will routinely devise methods of deceiving hackers, using machine learning techniques to change and adapt over time.

Machine learning is a branch of artificial intelligence that enables technology to learn and develop through experience, reducing the need for manual programming. For example, if a system had previously experienced a specific type of cyber threat, it would develop methodologies to deal with similar attacks more efficiently in the future. It’s widely believed that machine learning tools could enable systems to spot and stop the next WannaCry attack, for instance, much faster than legacy tools.
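
As a toy illustration of that learning loop, and assuming scikit-learn with synthetic data standing in for real security telemetry, the sketch below trains a classifier on examples of past malicious and benign activity and then scores events it has never seen. The features and labels are generated, not drawn from any real incident data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labelled telemetry (e.g. connection or e-mail features
# labelled malicious/benign); a real system would train on historical incidents.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)                      # learn from past threats

predictions = model.predict(X_test)              # score previously unseen events
print(f"Accuracy on held-out events: {accuracy_score(y_test, predictions):.2f}")
```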

Reducing the need for human intervention means that IT workers can dedicate more time to strengthening their own security efforts. However, as this method enters the mainstream, hackers are developing equivalent automation to overcome this technology.

In the Wrong Hands

From a defence perspective, cyber security professionals already use an increasing amount of automation and machine learning. However, this technology can be dangerous when used with malicious intent. According to a Cylance study of information-security professionals, 62 per cent believe that hackers will begin to weaponise artificial intelligence in 2018.

But, aren’t hackers already using this technology?

Artificial intelligence is already being used maliciously to mine large amounts of public-domain data, for example by scouring social media and other sites for telephone numbers, e-mail addresses and personal information, all of which can be used to hack a person’s accounts. On a slightly more advanced level, this data is also used to send personalised phishing e-mails: messages disguised as coming from a trustworthy sender in order to obtain sensitive information.
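
To show how little effort that harvesting takes, and why defenders should assume it happens at scale, the sketch below pulls e-mail addresses out of a block of public text with a single regular expression. The sample text and addresses are made up.

```python
import re

# Deliberately simple pattern; automated harvesting tools are not much cleverer,
# which is why any address published in public should be assumed to be phishable.
EMAIL_PATTERN = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

public_text = """
Contact our press office at press@example.com or the
sales team (sales@example.co.uk) for more information.
"""

harvested = sorted(set(EMAIL_PATTERN.findall(public_text)))
print(harvested)   # ['press@example.com', 'sales@example.co.uk']
```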

Malware creation is another task that could be handed over to automation in the future. For cyber criminals it is currently a long-winded, manual process, requiring several scripts to create the viruses, trojans, password scrapers and other tools that aid an attack.

In a 2017 paper, Generating Adversarial Malware Examples for Black-Box Attacks, the authors described how they built a generative adversarial network (GAN)-based algorithm capable of automatically creating malware samples. A crucial finding of the paper was that the generated malware was able to bypass automated security systems, highlighting a serious flaw in machine learning-based defence.

Another research project, from New York University, highlighted how machine learning-based defence systems may not be capable of defending against data poisoning. Because such a system learns and develops from the data it processes, hackers can poison it with incorrect, manipulated data, rendering the defence strategy ineffective.
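
A minimal sketch of that poisoning risk, again assuming scikit-learn and synthetic data: an ‘attacker’ relabels a fraction of the malicious training samples as benign before the detector is trained, skewing it toward missing attacks. The model, proportions and data are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)


def accuracy_after_poisoning(flip_fraction: float) -> float:
    """Relabel a fraction of malicious training samples as benign, then train."""
    rng = np.random.default_rng(1)
    poisoned = y_train.copy()
    malicious = np.flatnonzero(poisoned == 1)
    flipped = rng.choice(malicious, size=int(flip_fraction * len(malicious)), replace=False)
    poisoned[flipped] = 0                        # the poisoning step: malicious -> "benign"
    model = LogisticRegression(max_iter=1000).fit(X_train, poisoned)
    return model.score(X_test, y_test)           # accuracy on clean test data


for fraction in (0.0, 0.2, 0.4):
    print(f"labels flipped: {fraction:.0%}  accuracy: {accuracy_after_poisoning(fraction):.2f}")
```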

There’s no doubt that technologies like machine learning, automation and artificial intelligence will be the cornerstones of cyber defence strategies, and they are vital for keeping up with the enemy. Adversaries are working just as hard to overcome these tools with the same technology. The difference will be the use of human intelligence to bolster defence.

Despite common misconceptions, these tools cannot completely replace humans. Instead, this technology should be used to automate the long-winded and repetitive tasks that currently fill the workflows of IT teams, such as testing, basic threat analysis and data deception tactics.

There is a severe shortage of advanced cyber security skills in today’s IT talent pool, and automation will not completely bridge that gap. However, embracing this technology will give the existing IT workforce greater opportunities to develop its skills, which is, at least, a step in the right direction.
