Developers Embrace AI Coding Tools Amid Trust and Security Concerns, Survey Reveals

January 12, 2026
  • A majority of software developers are integrating AI coding tools into daily workflows despite concerns about the accuracy and reliability of AI-generated code.

  • Security concerns emerge as a large share of developers use personal accounts for AI tools—35% overall, rising to 52% among ChatGPT users and 63% among Perplexity users—potentially exposing confidential information.

  • Independent research finds that AI-generated code contains roughly 1.7 times more bugs and major issues than human-written code.

  • Experts emphasize that developers need practical strategies and foundational knowledge to leverage AI effectively while strengthening security and verification, reflecting a broader shift in where software engineering delivers value.

  • Additional challenges include the use of personal AI accounts for work and the ongoing toil of correcting AI-generated code, both of which complicate organizational code-verification practices.

  • AI usage is strongest in prototyping (88%) and internal production software (83%), with substantial use in customer-facing apps (73%); the most common assistants are GitHub Copilot (75%) and ChatGPT (74%).

  • AI introduction creates a verification debt, moving value from rapid generation to deployment confidence and exposing a trust gap between AI output and deployment readiness.

  • Verification is near-universal: 95% of developers check AI output, and 59% describe that verification as a moderate or substantial effort. 38% say reviewing AI-generated code takes more time than reviewing human-written code, and 61% note that AI output often looks correct but isn't.

  • The Sonar State of Code Developer Survey shows 72% of developers use AI coding tools daily or multiple times a day.

  • Trust remains low: 96% of developers say they do not fully trust AI-generated code to be functionally correct.

  • Despite distrust, AI-generated code now accounts for about 42% of developers’ code, up from 6% in 2023, and is expected to rise to around 65% by 2027.

Summary based on 2 sources

