Microsoft is bringing the technology behind ChatGPT to the cybersecurity industry with an AI assistant designed to help IT professionals defend against attacks.
Security Copilot is a virtual assistant that helps IT staff analyze and mitigate security threats facing their organization. “With Security Copilot, defenders can respond to security issues in minutes instead of hours or days,” the company said.
The program is essentially an analysis tool built on OpenAI’s new GPT-4 language model, which can write expert-level responses and generate computer code.
Like ChatGPT, Security Copilot operates through a prompt bar. In the demo, Microsoft showed that you can request a summary of a new vulnerability, submit a suspected malicious file for analysis, or ask about recent security issues on your internal network.
In turn, Security Copilot can pull data from Microsoft’s other security products, including the company’s threat intelligence, to respond appropriately.
In another example, the program was smart enough to trace the source of an attack, identifying which device was infected along with the affected domain and system processes. An IT security analyst can also use the tool to scan corporate emails and logins for patterns that match suspected threats.
Another powerful feature is the “promptbook,” a collection of text prompts that can automate functions in Security Copilot. In the demo, Microsoft showed one such prompt that let Security Copilot reverse-engineer a malicious script in seconds and generate a report highlighting various characteristics of the attack.
The results promise to streamline cybersecurity work, freeing people to focus on more pressing tasks. At a time when data breaches and ransomware attacks are rampant, the demo was enough to impress research firm Forrester. “This is the first time that a product is poised to make real improvements to detection and response with AI,” Forrester senior analyst Allie Mellen said in a statement.
That said, Microsoft admits that Copilot doesn’t always respond correctly. In one example, the program referred to Windows 9, an operating system that doesn’t exist, in an answer about the scope of a security risk. However, users can flag incorrect answers. “As we continue to learn from these interactions, we’re fine-tuning the responses to create more consistent, relevant and useful answers,” the company added.
Microsoft maintains that all information entered into the program remains private. The company also plans to expand Security Copilot to allow integration with third-party security products. However, it will take some time before Microsoft releases the program to the broader cybersecurity industry. Security Copilot is currently in preview; expect more details in the coming months.