Vulnerabilities, AI Compete for Software Developers' Attention

Published: 23/11/2024   Category: security




This year, the majority of developers adopted AI assistants to help with coding and improve code output, but they are also introducing more vulnerabilities that take longer to remediate.



Less than two years after the general release of ChatGPT, most software developers have adopted AI assistants for programming. That's boosting efficiency, but at the same time, it's led to a higher cadence of software development that has made maintaining security more difficult.
Developers are on track to download more than 6.6 trillion software components in 2024, including a 70% increase in downloads of JavaScript components and an 87% increase in Python modules, according to the annual State of the Software Supply Chain report from Sonatype. At the same time, the mean time to remediate vulnerabilities in those open source projects has grown significantly over the past seven years, from about 25 days in 2017 to more than 300 days in 2024.
One likely reason: The advent of AI is driving speedier development cycles, making security more difficult, says Brian Fox, chief technology officer of Sonatype. The majority of developers now use AI tools in their development process, according to a recent Stack Overflow survey, with 62% of coders saying they used an AI assistant, up from 44% last year.
"AI has quickly become a powerful tool for speeding up the coding process, but the pace of security has not progressed as quickly, and it's creating a gap that is leading to lower-quality, less-secure code," he says. "We're headed in the right direction, but the true benefit of AI will come when developers don't have to sacrifice quality or security for speed."
Security researchers have warned that AI code generation could result in more vulnerabilities and novel attacks. For instance, a group of researchers demonstrated the ability to poison the large language models (LLMs) used for code generation with maliciously exploitable code at the USENIX Security Symposium in August. In March, researchers with an LLM security vendor showed that attackers could use AI hallucinations as a way to direct developers and their applications to malicious packages.
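The hallucinated-package risk described above suggests a simple defensive habit: treat any dependency name an AI assistant proposes as untrusted until it matches a reviewed list, such as an internal lockfile. The sketch below is illustrative only; the allowlist contents and the misspelled package name are hypothetical.

```python
# Illustrative sketch: guard against "hallucinated" dependency names by
# accepting only packages that appear in a reviewed allowlist.
# The allowlist and package names below are hypothetical examples.

APPROVED_PACKAGES = {"requests", "numpy", "flask"}  # reviewed internal list

def vet_suggested_dependencies(suggested):
    """Split AI-suggested dependency names into approved and unvetted lists."""
    approved = [p for p in suggested if p.lower() in APPROVED_PACKAGES]
    unvetted = [p for p in suggested if p.lower() not in APPROVED_PACKAGES]
    return approved, unvetted

# A subtly misspelled name ("reqeusts-pro") is exactly the kind of package
# an attacker could register after an LLM hallucinates it.
ok, suspect = vet_suggested_dependencies(["requests", "reqeusts-pro"])
```

In practice, this kind of gate would sit in a CI step or a pre-commit hook, so an unvetted name blocks the build until a human reviews it.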
Developers also have growing concerns over the potential for AI assistants to suggest or propagate vulnerable code. While the majority of developers (56%) expect AI assistants to provide usable code, only 23% expect that code to be secure, and a larger group (40%) don't believe AI assistants provide secure code at all, according to research by software development firm JetBrains and the University of California at Irvine, published in June.
Many developers remain nonplussed by the speed of change wrought by AI coding tools, and there is likely more to come, says Jimmy Rabon, senior product manager with Black Duck Software, a software-integrity tools provider.
"We haven't seen the long-term effects of adding something that can code at the level of a junior- or intermediate-level developer and at massive scale," he says. "My expectation is that we will see more intermediate mistakes — the basic mistakes that you would make as a junior- or intermediate-level developer — and [issues with] understanding the context of where some of the data flows."
While AI assistants are now used by the majority of developers overall, adoption in business environments is even higher: more than 90% of developers used AI assistants, according to Black Duck's 2024 Global State of DevSecOps survey. "AI as a tool for developers is well-entrenched and will never go away," Rabon says.
Yet many developers don't have the experience to judge whether code provided by an AI assistant is safe. Entry-level developers, for example, are more trusting of AI-produced code than their professional counterparts, with 49% trusting the accuracy of AI-generated code versus 42% of more experienced developers, according to Stack Overflow's annual developer survey.
In addition, AI tools will affect the education of developers and could make it harder for entry-level developers to gain the skills needed to advance in their careers, experts say. Reliance on AI to complete simple programming projects could reduce the need for new or entry-level developers, who typically tackle simpler coding tasks, removing a training path, Sonatype's Fox says.
"The development community is aging, and the introduction of AI poses potential risks to younger generations," he says. "If AI can handle the tasks previously assigned to budding developers, how will they gain the experience needed to replace older developers exiting the industry?"
Until the companies behind AI assistants create training datasets that contain secure code suggestions, or put in place guardrails to protect against vulnerable and malicious code generation, companies will have to deploy automated software security tools to check the work of any coding assistant.
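One way to "check the work" of a coding assistant, as described above, is an automated scan before generated code is merged. The sketch below is a deliberately minimal illustration; the pattern names are invented, and a real pipeline would rely on a proper static-analysis (SAST) tool rather than regexes.

```python
import re

# Hypothetical minimal guardrail: flag a few well-known risky Python calls
# in AI-generated code before it is merged. Pattern names are invented for
# this example; real pipelines use dedicated static-analysis tools.
RISKY_PATTERNS = {
    "eval_call": re.compile(r"\beval\s*\("),
    "os_system": re.compile(r"\bos\.system\s*\("),
    "yaml_unsafe_load": re.compile(r"\byaml\.load\s*\("),
}

def scan_snippet(source: str) -> list[str]:
    """Return the names of risky patterns found in a code snippet."""
    return [name for name, pat in RISKY_PATTERNS.items() if pat.search(source)]

findings = scan_snippet("result = eval(user_input)")
```

Even a crude check like this catches the "basic mistakes" of a junior-level coder that Rabon describes; the point is that the check is automatic, not that the rules are sophisticated.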
The good news is that, between the additional security checks and the fast evolution of code-generation assistants, the security of software and applications could eventually become much stronger, says Black Duck's Rabon.
"There are certain basic security flaws that I think will disappear," he says. "If you asked an AI system to generate code, why should it ever [suggest an insecure function?] ... I don't think that we've had enough time to really see the dramatic effects of [such capabilities] or prove them out."
