More than a third of code made by GitHub Copilot is unsafe, research shows

36% of AI-generated code has security flaws


Updated on February 21, 2024


Microsoft is working hard on Copilot, and even though it’s a revolutionary tool, it’s not without its flaws.

According to the latest reports, code that is generated by GitHub Copilot might not be as safe as you think.

42% of applications have long-term security flaws

According to Help Net Security, 42% of applications and 71% of organizations suffer from security flaws that haven't been addressed in more than a year.

To make matters worse, 46% of organizations have critical security debt that can put both businesses and users at risk.

As for the code itself, 63% of apps have flaws in their own code, while 70% of the third-party libraries they use contain security flaws.

Despite these alarming numbers, there is some good news on the horizon: according to the research, the number of critical flaws has dropped by 50% since 2016.

AI is also a major contributor, and many developers use it daily. However, 36% of the code generated by GitHub Copilot contains security flaws, which is concerning.

It's worth mentioning that 64% of applications have the resources to fix their security flaws within a year, yet most developers, despite having the capacity to do so, leave those flaws unaddressed.

Out of all the security flaws found, only 3% are considered critical, so the situation isn't quite as bleak as it seems.

Hopefully, developers will use AI to address both long-standing and emerging issues more efficiently.

Microsoft is already using AI to combat cyberattacks, and it seems that other developers will have to follow suit.


Milan Stanojevic

Windows Troubleshooting Expert

Milan has been enthusiastic about technology ever since his childhood days, and this led him to take interest in all PC-related technologies. He’s a PC enthusiast and he spends most of his time learning about computers and technology.

Before joining WindowsReport, he worked as a front-end web developer. Now, he’s one of the Troubleshooting experts in our worldwide team, specializing in Windows errors & software issues.
