AI coding is 'inescapable' and here to stay, says GitLab
Getting strong FOMO vibes from devs – though how ML is actually used among engineers may surprise you
Almost a quarter of organizations are already using AI to augment human software development, and more than two-thirds plan to do so, according to a survey from GitLab.
"If there was one inescapable takeaway from the survey data, it’s that AI in software development is here to stay," the code-hosting biz said in its 2023 Global DevSecOps report, where "global" means 38 percent of the 1,001 survey respondents hailed from the US, 37 percent hailed from India, and 63 percent identified as male.
"The vast majority (83 percent) of respondents agreed that it is essential to implement AI in their software development processes to avoid falling behind, and this was consistent regardless of respondents’ functional area (development, operations, and security), job level, or years of experience."
GitLab's report suggests AI adoption has gone well so far, with 90 percent of those using machine-learning tools today saying they feel confident using them in their daily work. Others characterized their organizations' efforts to integrate AI into the software development lifecycle as "very" or "extremely" successful.
The use of AI among respondents, however, is not all robo-generated coding. The top use cases were: natural-language chatbots in documentation (41 percent), automated test generation (41 percent), summaries of code changes (39 percent), tracking machine learning model experiments (38 percent), suggestions for who can review code changes (37 percent), and summaries of issue comments (37 percent).
Only then did code generation and code suggestions (36 percent) come up. So generating code isn't the primary use of the technology, which perhaps isn't surprising given that devs only spend 25 percent of their time writing code, according to the report.
The rest of their time consists of improving existing code (17 percent), meetings and administrative tasks (17 percent), trying to understand what code does (14 percent), testing (11 percent), maintenance (9 percent), and identifying and mitigating security flaws (7 percent).
About a third of survey respondents were "very" or "extremely" concerned about AI in the software development lifecycle, with about half of those (48 percent) worrying that AI-generated code isn't subject to copyright protection and about 39 percent fretting that generated code may introduce security vulnerabilities.
These DevSecOps practitioners also wondered whether AI will take their jobs. "More than half (57 percent) of respondents said they think AI will replace their role within the next five years," the report says.
Then there were some who anticipate that AI will generate work for them: "40 percent of security professionals said they were concerned that AI-powered code generation will add more to their plate, compared to just 29 percent of respondents overall."
One area where most seemed to agree was training: 81 percent of respondents said they need more training to use AI in the workplace, and 87 percent said organizations will have to train employees to adapt to the new regime.
The report concludes by noting that respondents with more AI experience were less likely to associate AI with productivity gains and faster cycle times.
"AI may be able to generate code more quickly than a human developer, but a human team member needs to verify that the AI-generated code is free of errors, security vulnerabilities, or copyright issues before it goes to production," the report concludes.
DevSecOps today, ChatBotChaperones tomorrow. ®