Darktrace launches “Cyber AI Analyst” as skills shortage intensifies

Darktrace has built and launched a “Cyber AI Analyst” that can automatically generate written reports outlining the path of complex cyber attacks.

Industry researchers say the technology could alleviate the cyber skills shortage by freeing up security workers to spend more time improving defences.

The number of businesses reporting a “problematic” shortage of security workers has risen to more than 50 per cent in the last two years, and it is feared that in the UK, Brexit could further shrink the talent pool.

Darktrace has offered products which can respond to certain kinds of incidents autonomously since 2016. But the AI analyst has been designed to collect an array of data from a network to identify each part of a multi-stage attack.

It often takes trained threat analysts several hours to produce a report that outlines a security incident and a series of possible responses. Darktrace claims that in beta tests with clients, the AI analyst dramatically reduced the time the process takes.

According to Dave Palmer, one of the company’s technology directors, the analyst has been designed to augment rather than replace workers. “Quite a lot of work that goes on in expert security teams is not about risk management,” he told NS Tech. “It’s about understanding what’s going on and telling other people about it.”

“We can pull together the picture of what’s happening, and it’s immediately documented,” Palmer added. “If every time that happens you’ve saved a couple of hours, then it’s a really nice augmentation. The human being is still making the risk decision, but has been augmented to achieve that by having a lot of the [investigative work carried out by the machine].”

Darktrace is one of a number of security vendors developing automated analysis technology. The company’s engineers spent three years building the product, which will be offered to customers for no extra charge. Engineers used a range of machine learning techniques, including unsupervised and supervised deep learning, to train the system on work carried out by more than 100 analysts.

“The need for skilled analysts is increasing and outstripping the supply, even with all of the educational programmes available,” Jonathan Care, a research director at Gartner, told NS Tech. “Tools like the Darktrace AI analyst add power to the elbow; they’re a force multiplier for our scarce resource of human analysts.”

While Care said there is “a lot of hype in the AI space in general”, he added that the ability to monitor systems around the clock and focus human skills “where they are most needed” is “extremely valuable”. “The proof of the pudding with an AI system is: how does it perform in my environment? How does it perform with the data I can feed it?”

In July, researchers in Australia tricked a machine learning system developed by Cylance, another security vendor, into thinking that a piece of malware was legitimate software.

Palmer said the threat of AI being manipulated was “a constant line of investigation, inquiry and technical work” at Darktrace and that the company’s researchers were always trying to identify flaws in their system. While he claimed that the sheer number of algorithms Darktrace uses reduces the risk of manipulation, he said he wouldn’t describe any system as a silver bullet: “You need defence in depth.”