CSV vs CSA in 2025: What Changes and What Doesn’t

Computer software assurance is the FDA’s risk-based way to build confidence in software that supports production and quality activities.
If you have lived in the classic CSV world, this post explains CSA vs CSV, where CSA applies, what stays the same, and how to roll it out in 90 days.
We also cover FDA CSA guidance, 21 CFR Part 11, and the EU’s 2025 Annex 11 revision with the new Annex 22 on AI, plus a checklist and matrix.
What Is Computer Software Assurance, and How Does It Relate to CSV?
Computer software assurance is a risk-based approach for software used in production or the quality system. It helps you choose the right mix of activities to show a system is fit for its intended use and to capture only the objective evidence you need. In short, CSA complements CSV by shifting focus from paperwork volume to risk and outcomes.
In practice, you still identify intended use, assess impact on patient safety and product quality, then select activities that prove the software does what you need. Traditional scripted testing stays in the toolbox, but unscripted testing, exploratory testing, and ad-hoc tests gain a proper place when risk is low and speed matters.
Where CSA Applies, and Where It Does Not
CSA targets software used in production and quality system processes under Part 820. It does not change requirements for software that is itself a medical device function. If your software is SaMD or part of device functionality, follow the applicable device rules and use CSV as defined in your SOPs.
If you use electronic signatures or electronic records, 21 CFR Part 11 still applies. CSA does not remove Part 11. Instead, CSA helps you right-size the assurance activities and records that support Part 11 compliance.
Predicate rules remain your baseline. Use CSA to choose efficient activities that still satisfy the rules tied to each process.

CSV vs CSA
The quick comparison below shows how CSA vs CSV differ. Use it to align your validation plan with risk while keeping inspectors happy.
| Dimension | Classic CSV | CSA |
| --- | --- | --- |
| Primary focus | Documentation-heavy verification and testing | Risk-based approach with only the evidence needed |
| Scope | Any GxP system per local SOP | Production and quality system software in the Part 820 context |
| Test approach | Mostly scripted testing | Mix of scripted, unscripted, exploratory, and challenge testing based on risk |
| Evidence | Emphasis on volume and completeness | Emphasis on fitness for intended use and sufficiency of objective evidence |
| Speed of change | Slower, broad re-validation | Faster iteration with targeted assurance after change |
| Audits/inspections | Traceability and big document sets | Traceability plus risk rationale and right-sized records |
| Applies to device software functions | Yes, via CSV when in scope | Not intended for device functions; follow device requirements |
| Relationship to Part 11 | Often tightly coupled | 21 CFR Part 11 still applies when using e-records and e-signatures |
Risk-Based Approach in Practice
Start with intended use, rank risk to patient safety, product quality, and data integrity, then select activities that match that risk. The output is a clear rationale and a compact set of records that show control.
A simple path:
- Define intended use in one sentence.
- Identify hazards and failure modes that could affect product quality or patient safety.
- Classify each feature as critical or noncritical.
- Pick assurance activities for each feature.
- Capture objective evidence to the level needed.
- Keep traceability so an inspector can follow your logic.
For a high-risk feature, you may keep detailed scripted testing with pre-approved steps and full screenshots. For a low-risk configuration field, a brief exploratory testing note with results may be enough. Document the why, not just the what.
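The selection logic above can be sketched in code. This is a hypothetical illustration of a risk-to-activity matrix, not a format from any guidance; the risk categories, activity names, and evidence expectations are assumptions you would replace with your own SOP's definitions.

```python
# Hypothetical sketch: map a feature's risk level to assurance
# activities and an evidence expectation. Categories and mappings
# are illustrative, not taken from FDA or EU guidance.
from dataclasses import dataclass

ACTIVITY_MATRIX = {
    "high": {
        "activities": ["scripted testing", "challenge testing"],
        "evidence": "pre-approved scripts, step results, screenshots",
    },
    "medium": {
        "activities": ["scripted testing", "unscripted testing"],
        "evidence": "test summary with pass/fail per requirement",
    },
    "low": {
        "activities": ["exploratory testing"],
        "evidence": "brief test note with scope and results",
    },
}

@dataclass
class Feature:
    name: str
    intended_use: str
    risk: str  # "high" | "medium" | "low"

def select_assurance(feature: Feature) -> dict:
    """Return the assurance plan and risk rationale for one feature."""
    plan = ACTIVITY_MATRIX[feature.risk]
    return {"feature": feature.name, "rationale": f"risk={feature.risk}", **plan}

batch_calc = Feature("batch yield calculation", "compute batch yield for release", "high")
print(select_assurance(batch_calc)["activities"])
```

The point of the sketch is the traceable rationale: each plan carries the risk level that justified it, which is exactly what an inspector will ask to follow.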
Testing Under CSA: Scripted vs Unscripted Testing and Exploratory Testing
Under CSA you can blend methods. The goal is confidence in fitness for use with the least burden that still covers risk.
- Use scripted testing for high-risk functions, complex workflows, or regulated calculations.
- Use unscripted testing and exploratory testing for low-risk features, user interface behavior, and negative checks where creativity catches more issues in less time.
- Apply challenge testing to stress the system or probe known weak spots.
- Consider test automation in GxP for repeatable, high-value checks, and maintain versioned scripts as part of your validation package.
Keep records proportional to risk. A short test note can be valid objective evidence if it clearly states scope, environment, who tested, results, and the tie to intended use.
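A minimal test-note record that covers the fields listed above could look like the following. The field names and the completeness check are assumptions for illustration, not a mandated structure.

```python
# Illustrative sketch of a short test note as objective evidence.
# Fields mirror the text: scope, environment, tester, results, and
# the tie to intended use. Field names are assumptions.
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class TestNote:
    scope: str
    environment: str
    tester: str
    result: str            # "pass" | "fail"
    intended_use_ref: str  # ID of the requirement or intended-use statement
    executed_on: date

    def is_complete(self) -> bool:
        # Objective evidence should leave no field blank.
        return all(str(v).strip() for v in asdict(self).values())

note = TestNote(
    scope="Exploratory check of lot-number field validation",
    environment="Test tenant, client v7.2",
    tester="J. Rivera",
    result="pass",
    intended_use_ref="REQ-014",
    executed_on=date(2025, 3, 12),
)
print(note.is_complete())
```

Even a lightweight record like this supports traceability because `intended_use_ref` links the note back to the requirement it covers.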
Lifecycle and Ongoing Assurance: Change Control, Periodic Review, Deviation and CAPA, and Maintaining the Validated State
CSA does not stop at release. Keep assurance alive through routine change control, targeted regression, and periodic review that checks logs, training, and supplier updates. Record deviation and CAPA when results differ from expected. The system stays in a validated state because you measure the right risks at the right time.
Cloud and Suppliers: SaaS Validation, Vendor Qualification, SLAs, and Infrastructure Controls
Most teams now run core GxP workflows on SaaS. With CSA, you still need vendor qualification, proof of shared controls, and clear SLAs that cover uptime, change notices, backups, and audit trail access. Keep a concise supplier file with certificates, SOC reports, pen tests, and release notes tied to your risk register.
Treat cloud services like any other component. Define infrastructure controls, version your environments, and rehearse recovery. For major supplier changes, run a focused assurance activity that proves continued fitness for use.
Data Integrity Under CSA: Audit Trail Review and Electronic Signatures Part 11
Data integrity does not get lighter with CSA. It gets clearer. Plan audit trail review by risk, review the right events at the right time, and use exception reports where possible. The UK MHRA GxP Data Integrity guidance supports risk-based audit trail review and allows exception-based reports when validated.
In clinical contexts, the EMA’s 2023 guideline on computerised systems emphasizes proactive audit trail review for critical GxP data. Even if you work in manufacturing, the principles help you design better reviews and training. Keep electronic signatures Part 11 controls as-is, then scale your review based on data risk.
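Exception-based review can be sketched simply: rather than reading every audit trail event, flag only the event types your risk assessment marks as critical. The event names and rule set below are hypothetical; a real filter would be validated and driven by your risk register.

```python
# Hedged sketch of exception-based audit trail review. Event names
# and the critical-event set are hypothetical examples.
CRITICAL_EVENTS = {"record_deleted", "result_modified", "signature_removed"}

def exceptions(audit_trail: list[dict]) -> list[dict]:
    """Return only the events that require human review."""
    return [e for e in audit_trail if e["event"] in CRITICAL_EVENTS]

trail = [
    {"event": "login", "user": "a.chen"},
    {"event": "result_modified", "user": "a.chen", "record": "LOT-0042"},
    {"event": "report_printed", "user": "m.ortiz"},
]
flagged = exceptions(trail)
print(len(flagged))  # one critical event left for the reviewer
```

The reviewer then documents findings only for the flagged events, which keeps review effort proportional to data risk.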
EU Horizon 2025: Annex 11 Revision and New Annex 22 on Artificial Intelligence
In July 2025 the European Commission and PIC/S opened consultation on updates to Chapter 4 and Annex 11, plus a new Annex 22 on AI in GMP. Drafts emphasize continuous oversight of AI, strong change control, performance monitoring, and human review. If you operate in the EU, track these drafts and prepare to extend your supplier and lifecycle checks to cover AI models.
The CSA Workflow You Can Use Today
The five-step loop below fits both new systems and changes. It maps cleanly to SOPs and plays well in audits.
- Identify intended use.
- Assess risk to patient safety, product quality, and data integrity.
- Choose assurance activities by risk, blending scripted vs unscripted testing.
- Capture just-enough objective evidence.
- Maintain traceability and keep the validated state through change control.
90-Day CSA Rollout Checklist
This plan gets a pharma QA team from talk to practice in three months without breaking delivery. Each block targets quick wins and inspection readiness.
Days 1–15: Align on Scope and Risk
- Build a system inventory and flag items used in production and quality system processes.
- For each flagged system, write a one-line intended use and a high-level risk note.
- Map current CSV deliverables to CSA options. Identify low-risk features where unscripted or exploratory testing fits.
Days 16–30: Procedures and Templates
- Update the Validation Master Plan to explain the risk-based approach and evidence sufficiency.
- Add a CSA SOP that defines risk categories, test method selection, and minimum records.
- Refresh your requirements traceability template to track risk, method, and objective evidence for each requirement.
Days 31–45: Pilot on a Real Change
- Pick one change with measurable business value.
- Do a concise impact assessment and select assurance activities. Include a mix of exploratory and targeted scripted checks.
- Capture only the records needed to show fitness for intended use. Close with a short test report.
Days 46–60: Close Gaps for Cloud and Suppliers
- Review vendor qualification and supplier files for SaaS validation and hosting.
- Add quality clauses to SLAs covering audit trail access, backups, and change notifications.
- Define how you will access logs for periodic review and investigations.
Days 61–75: Data Integrity Routines
- Write a risk-based audit trail review procedure with frequency by risk and example exception reports.
- Train reviewers on what to look for and how to document findings. Align with MHRA guidance and EMA expectations for proactive reviews.
Days 76–90: Scale and Prepare for Inspection
- Roll CSA to two more systems.
- Add a one-page CSA rationale to each package.
- Create an “inspector pack” with your SOPs, risk model, a sample RTM, and two completed change examples.
- For EU teams, track Annex 11 and Annex 22 updates and keep a follow-up list for when drafts finalize.
FAQs
Is CSA replacing CSV?
No. CSA is an FDA draft guidance for production and quality system software. It complements CSV by focusing on risk and the sufficiency of objective evidence.
Does CSA change 21 CFR Part 11?
No. Part 11 still applies when you use electronic records and electronic signatures. CSA helps you decide how much evidence you need to support compliance.
When should I use unscripted testing?
Use it for low-risk features where exploratory testing can quickly show fitness for use. Keep concise notes as objective evidence.
Does CSA apply to device software functions?
CSA is aimed at production and quality system software, not software that is itself a device function. Follow device software rules for SaMD and embedded functions.
What will change in the EU in 2025?
A consultation is underway to revise Annex 11 and add Annex 22 on AI. Expect stronger controls for AI oversight, change control, performance monitoring, and human review.