What the 2025 DORA Report Reveals About Trust, AI, and Database Governance
November 20, 2025

Google Cloud’s 2025 DORA State of AI-Assisted Software Development Report confirms a defining shift in modern engineering. AI is now part of how software gets built, tested, and shipped. Ninety percent of developers report using AI tools to write, refactor, or optimize code. Teams are moving faster than ever, but faster is not always safer.
If AI can write code, review pull requests, and modify database schemas, who is governing the change? How do we know those changes are accurate, compliant, and secure?
The DORA findings reveal that the next stage of DevOps maturity is not just automation. It is governance, observability, and trust. Liquibase helps enterprises strengthen all three at the database layer.
AI Is the New Developer and a New Risk Surface
The 2025 DORA Report highlights that developers increasingly rely on AI copilots for code generation and database scripting. While this delivers unprecedented speed, it also introduces an invisible layer of risk.
Traditional pipelines were designed for human review, but AI-generated changes can merge and deploy without anyone validating their impact. When those changes reach the database, they can alter schemas, expose sensitive data, or introduce compliance violations that go undetected until production.
Liquibase sees this pattern across industries. Organizations automate application deployments but still manage database change through manual scripts or tribal knowledge. That disconnect leaves the data layer, the most critical and regulated part of modern architecture, exposed to risk.
This imbalance creates what we call the Velocity Gap, the divide between how quickly teams can deploy and how safely they can change.
What the 2025 DORA Report Really Says
The 2025 DORA State of AI-Assisted Software Development Report outlines three key truths that every enterprise should act on:
- AI adoption is nearly universal, but governance lags behind. Most organizations have not extended DevOps controls to AI-generated code or schema updates.
- Trust requires visibility. Developers say AI increases productivity, yet more than sixty percent have discovered AI-related errors after deployment. Without traceability, quality and compliance suffer.
- Governance defines maturity. DORA’s new AI Capabilities Model emphasizes version control, transparency, and measurable risk management. These principles align directly with Liquibase’s Database DevOps framework.
Velocity without governance is not progress. It is risk moving faster.
Why Database Governance Matters More Than Ever
AI-assisted development speeds up delivery, but it also increases the likelihood of unverified or inconsistent database changes. A single unchecked modification to a schema can ripple across analytics, compliance, and AI workloads.
Liquibase Secure brings structure to that risk. It enforces policy checks before deployment, detects unauthorized changes through drift detection, and creates a tamper-evident audit trail for every update. Each change is recorded with full metadata: who made it, when it happened, what changed, and why.
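That metadata trail maps naturally onto Liquibase's changelog format, where every change is a versioned, attributable changeset. As a hedged sketch (the changeset id, author, table, and ticket reference below are hypothetical, not from the report), a tracked change might look like:

```yaml
# Hypothetical Liquibase changelog entry (YAML format).
# Each changeset records who (author), what (changes), and why (comment);
# Liquibase records when it was deployed in its tracking table.
databaseChangeLog:
  - changeSet:
      id: add-customer-email-index   # hypothetical id
      author: jsmith                 # hypothetical author
      comment: "Speed up support-dashboard lookups (ticket reference illustrative)"
      changes:
        - createIndex:
            tableName: customer
            indexName: idx_customer_email
            columns:
              - column:
                  name: email
      rollback:
        - dropIndex:
            tableName: customer
            indexName: idx_customer_email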
The outcome is a complete chain of custody that turns the database into a trusted foundation for innovation rather than a blind spot for risk.
Aligning with the DORA Vision for AI Governance
The DORA Report calls for stronger alignment between automation, observability, and compliance. Liquibase Secure delivers that alignment through what we call a Governed Database Pipeline, a fully automated workflow that validates, audits, and secures every database change before it reaches production. It brings the same rigor to database delivery that DevOps brought to application code and is purpose-built for the AI era.
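In practice, a governed pipeline like this can be wired into an ordinary CI workflow so that policy checks gate deployment. The sketch below uses GitHub Actions syntax as an assumption; the changelog path and secret names are placeholders, `checks run` requires a licensed Liquibase edition, and which check severities fail the build is configured per check rather than shown here:

```yaml
# Hypothetical CI job: validate database changes, then deploy them.
name: governed-database-pipeline
on: [push]
jobs:
  database-change:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run policy checks before any change is applied
        run: >
          liquibase checks run
          --changelog-file=db/changelog.yaml
      - name: Apply governed changes
        run: >
          liquibase update
          --changelog-file=db/changelog.yaml
          --url=${{ secrets.DB_URL }}
          --username=${{ secrets.DB_USER }}
          --password=${{ secrets.DB_PASS }}
```

The design point is ordering: validation runs first and fails the build on violations, so an unreviewed or AI-generated change never reaches the deploy step.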
The table below shows how Liquibase Secure maps directly to the 2025 DORA AI Capabilities Model, turning governance principles into measurable outcomes at the database layer.
Together, these capabilities create a continuous governance framework. Every change is checked, verified, and auditable across build, test, and runtime environments. This ensures governance scales with velocity rather than slowing it down.
Data Integrity: The Foundation of AI Trust
The 2025 DORA Report makes it clear that AI systems are only as reliable as the data beneath them. Liquibase research shows that 78 percent of organizations struggle with AI-driven data challenges rooted in ungoverned database change.
When data types, naming conventions, or validation rules drift across environments, the models trained on that data inherit those flaws. Over time, bias, inconsistency, and error compound.
Liquibase Secure prevents this by governing schema evolution across more than sixty database platforms. It enforces consistent standards, captures lineage, and ensures data integrity from development through production. That consistency is essential for compliance with frameworks such as the EU AI Act, NIST AI RMF, and emerging global AI governance standards.
Trust in AI begins with trust in the data. Trust in the data begins at the schema.
Compliance as a Catalyst for Innovation
The 2025 DORA Report highlights that teams combining governance and automation release faster and recover from incidents sooner. Governance does not slow innovation; it enables it.
Organizations that embed Liquibase Secure into their CI/CD pipelines are proving that compliance can drive velocity. By integrating policy enforcement, drift detection, and audit readiness directly into database delivery, these teams have reduced production incidents, shortened release cycles, and strengthened accountability across development and operations.
Governance becomes more than a safeguard. It becomes a competitive advantage.
Preparing for AI Governance at Scale
As AI becomes a core part of software delivery, governance frameworks must evolve with it. Liquibase Secure enables that evolution through three essential pillars:
- Control: Policy enforcement that prevents unreviewed or AI-generated changes from reaching production.
- Transparency: Schema-level lineage that provides auditability and model explainability.
- Trust: Continuous drift detection that confirms every change is authorized and compliant.
This approach allows organizations to meet the standards outlined in the DORA AI Capabilities Model while building a foundation for scalable, responsible AI adoption.
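As one hedged illustration of the drift-detection pillar, Liquibase's `diff` command can compare a governed reference database against a live target on a schedule. The workflow below uses GitHub Actions cron syntax as an assumption, and the secret names and connection details are placeholders:

```yaml
# Hypothetical nightly drift check: compare the expected (reference) schema
# against production and surface any unauthorized changes.
name: nightly-drift-check
on:
  schedule:
    - cron: "0 2 * * *"   # 02:00 UTC nightly
jobs:
  drift:
    runs-on: ubuntu-latest
    steps:
      - name: Compare reference schema against production
        run: >
          liquibase diff
          --reference-url=${{ secrets.REF_DB_URL }}
          --url=${{ secrets.PROD_DB_URL }}
          --username=${{ secrets.DB_USER }}
          --password=${{ secrets.DB_PASS }}
```

A non-empty diff (unexpected tables, columns, or indexes) is the signal that a change bypassed the governed pipeline and should be investigated.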
The Path Forward: From AI Code to Confidence
The 2025 DORA State of AI-Assisted Software Development Report signals more than a trend. It marks a turning point for how software, and especially data, will be governed in the AI era.
Liquibase Secure helps organizations bridge the gap between speed and safety. By governing change at the database layer, it ensures every update is verifiable, compliant, and trusted.
AI may be the new developer. Governance is the new reliability. The future of software belongs to teams that master both.
Build trust in every change.
See how Liquibase Secure helps enterprises align governance, compliance, and velocity.




