The Sarbanes-Oxley Act of 2002 was written in response to Enron and WorldCom — accounting frauds that destroyed billions in shareholder value. Section 404 requires public companies to establish and maintain internal controls over financial reporting and to have those controls assessed annually by external auditors.

Twenty-four years later, machine learning models are making financial decisions, generating reports, and processing data that flows directly into SEC filings. And most organizations have no idea how to make these ML pipelines SOX-compliant. The audit trail requirements alone disqualify the majority of current AI implementations.

SOX Section 404 and AI: The Scope Problem

SOX Section 404(a) requires management to assess the effectiveness of internal controls over financial reporting. Section 404(b) requires the external auditor to attest to and report on that assessment. The PCAOB's Auditing Standard No. 5 (now codified as AS 2201) provides the framework auditors use.

The key question is: does your ML model affect financial reporting?

If a machine learning model performs any of the following, it falls within SOX scope:

1. Scores or classifies transactions in ways that affect recorded amounts (fraud scoring, for example).
2. Determines or influences revenue recognition treatment.
3. Generates reports or figures that feed the financial statements.
4. Processes data that flows, directly or indirectly, into SEC filings.

If the model's output — directly or through a chain of downstream processes — touches a number in a 10-K or 10-Q filing, it's in scope. And in 2026, an increasing number of these models are not statistical regressions maintained by internal teams. They're API calls to third-party AI providers processing proprietary financial data.

The Audit Trail Gap

SOX compliance fundamentally requires an audit trail — the ability to trace any number in a financial statement back to its source, through every transformation, to the original transaction or data point. This is the "assertion" framework that auditors rely on: completeness, accuracy, existence, valuation, rights and obligations.

For traditional systems, audit trails are straightforward. Database transactions are logged. ETL transformations are recorded. Spreadsheet changes are tracked. An auditor can follow the chain from the balance sheet to the source document.
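
In data terms, that chain is a set of linked records: each reported figure points to the records it was derived from, and the auditor walks the links back to source documents. A minimal sketch of that idea (the record shape, identifiers, and amounts are illustrative assumptions, not any particular system's schema):

```python
from dataclasses import dataclass, field

@dataclass
class LedgerItem:
    """One node in the audit trail: a reported figure, an intermediate
    transformation, or a source transaction."""
    item_id: str
    description: str
    amount: float
    derived_from: list = field(default_factory=list)  # ids of upstream items

def trace(item_id: str, items: dict, depth: int = 0):
    """Walk a reported figure back through every transformation to its sources."""
    item = items[item_id]
    print("  " * depth + f"{item.item_id}: {item.description} = {item.amount:,.2f}")
    for upstream in item.derived_from:
        trace(upstream, items, depth + 1)

items = {i.item_id: i for i in [
    LedgerItem("bs-ar", "Balance sheet: accounts receivable", 1337.50, ["je-104"]),
    LedgerItem("je-104", "Journal entry: invoice batch 2026-01", 1337.50, ["inv-001", "inv-002"]),
    LedgerItem("inv-001", "Invoice 001 (source document)", 1250.00),
    LedgerItem("inv-002", "Invoice 002 (source document)", 87.50),
]}

trace("bs-ar", items)
```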

For ML models, the audit trail breaks down at several points:

Input data provenance. When financial data is sent to an AI API for processing, what happens to it? Is it logged on the provider's side? Is it used for training? Can you prove it wasn't modified in transit or during processing?

Model opacity. A neural network's decision process is not interpretable in the way an auditor requires. "The model assigned a probability of 0.73 to this transaction being fraudulent" does not constitute an auditable control. The auditor needs to understand why and be able to reproduce the result.

Data lifecycle. After the model processes financial data, where does that data go? Is it cached? Retained for model improvement? SOX requires that you control access to financial data at every stage. If your data is sitting in a third-party's GPU memory after inference, you've lost control.

Change management. SOX controls require documented, approved changes to systems that affect financial reporting. When your AI provider updates their model (which they do regularly, often without notice), that's a change to a system in your SOX scope — and you may not even know it happened.
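
Of the gaps above, input data provenance is the most mechanical to start closing: fingerprint exactly what leaves your boundary before it is sent, and log it alongside whatever metadata the provider returns. A rough sketch, assuming a generic HTTP-based AI API; the endpoint, payload shape, and model identifier below are placeholders, not any provider's actual interface:

```python
import hashlib
import json
from datetime import datetime, timezone

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def build_provenance_record(payload: dict, endpoint: str) -> dict:
    """Fingerprint the exact bytes sent to the AI provider before transmission,
    so the request can later be matched against logs and downstream figures."""
    body = json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()
    return {
        "sent_at": datetime.now(timezone.utc).isoformat(),
        "endpoint": endpoint,               # placeholder, not a real provider URL
        "request_sha256": sha256_hex(body),
        "request_bytes": len(body),
    }

def record_response(provenance: dict, response_body: bytes, reported_model: str) -> dict:
    """Attach the response fingerprint and whatever model identifier the provider
    reports, so silent model changes are at least detectable after the fact."""
    provenance.update({
        "received_at": datetime.now(timezone.utc).isoformat(),
        "response_sha256": sha256_hex(response_body),
        "reported_model": reported_model,
    })
    return provenance

# Usage: wrap every outbound AI call and append the record to write-once storage.
prov = build_provenance_record(
    {"task": "classify", "transactions": [{"id": "txn-001", "amount": 1250.0}]},
    "https://api.example-ai-provider.com/v1/score",    # hypothetical endpoint
)
prov = record_response(prov, b'{"results": []}', reported_model="example-model-2026-01")
print(json.dumps(prov, indent=2))
```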

What Auditors Actually Want

Having worked with Big Four audit teams on AI compliance, I've seen a consistent pattern. Auditors want three things:

1. Deterministic reproducibility. Given the same inputs, the system should produce the same outputs. Most ML models can satisfy this at inference time (fixed weights, fixed seeds, deterministic decoding), but the auditor needs evidence, not assurance; a minimal reproducibility check is sketched after this list.

2. Complete data lifecycle documentation. From the moment financial data enters the AI system to the moment it's destroyed, every step must be documented and verifiable. This includes proving that data was not retained by third-party processors.

3. Control evidence, not control descriptions. A policy document saying "we review model outputs quarterly" is a control description. Evidence is the dated, signed review with specific findings and remediation actions. For data destruction, evidence means proof — not a log entry saying "deletion initiated."
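
A minimal sketch of what evidence for point 1 could look like: run the model twice on identical inputs and record matching input and output fingerprints as a dated artifact. The model function, version string, and field names here are assumptions for illustration, not a required format:

```python
import hashlib
import json
from datetime import datetime, timezone

def canonical_hash(obj) -> str:
    """SHA-256 over a canonical JSON encoding, so the same logical data
    always produces the same fingerprint."""
    payload = json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(payload).hexdigest()

def reproducibility_evidence(model_fn, inputs, model_version: str) -> dict:
    """Run the model twice on identical inputs and record the matching output
    hashes as control evidence, not just an assertion of determinism."""
    first = model_fn(inputs)
    second = model_fn(inputs)
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "input_hash": canonical_hash(inputs),
        "output_hash_run_1": canonical_hash(first),
        "output_hash_run_2": canonical_hash(second),
    }
    record["reproducible"] = record["output_hash_run_1"] == record["output_hash_run_2"]
    return record

# Toy stand-in for a fixed-weight scoring model (hypothetical).
def score_transactions(batch):
    return [{"id": t["id"], "score": round(0.1 * t["amount"] % 1, 4)} for t in batch]

if __name__ == "__main__":
    batch = [{"id": "txn-001", "amount": 1250.00}, {"id": "txn-002", "amount": 87.50}]
    print(json.dumps(reproducibility_evidence(score_transactions, batch, "v1.3.0"), indent=2))
```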

Destruction Proofs and Attestation Ledgers

The technology that satisfies these requirements exists. Cryptographic destruction proofs — generated within Trusted Execution Environments — provide exactly the evidence auditors need.

For each inference operation involving financial data, a destruction proof records what entered the environment (as a cryptographic fingerprint), the attested environment that processed it, when processing occurred, and verification that the data was erased afterward.

These proofs are recorded on an attestation ledger — an append-only, tamper-evident log that serves as the audit trail. When a PwC or Deloitte auditor asks "can you prove this financial data was processed in a controlled environment and then destroyed?" — you give them the ledger.

This is what Ardyn's sovereignty event model provides. Each financial data processing operation is a discrete, auditable event with cryptographic proof of data handling and destruction. The ledger satisfies SOX retention requirements (typically seven years) while the underlying financial data is provably destroyed after processing.
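
As a rough illustration of the append-only, tamper-evident structure (the field names, proof contents, and hash-chaining scheme here are assumptions for the sketch, not Ardyn's actual format):

```python
import hashlib
import json
from datetime import datetime, timezone

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class AttestationLedger:
    """Append-only log where each entry commits to the previous one,
    so any after-the-fact edit breaks the hash chain."""

    def __init__(self):
        self.entries = []

    def append(self, destruction_proof: dict) -> dict:
        prev_hash = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        body = {
            "sequence": len(self.entries),
            "recorded_at": datetime.now(timezone.utc).isoformat(),
            "proof": destruction_proof,
            "prev_hash": prev_hash,
        }
        entry_hash = sha256_hex(json.dumps(body, sort_keys=True).encode())
        entry = {**body, "entry_hash": entry_hash}
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any tampered entry or broken link fails."""
        prev_hash = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "entry_hash"}
            if body["prev_hash"] != prev_hash:
                return False
            if sha256_hex(json.dumps(body, sort_keys=True).encode()) != entry["entry_hash"]:
                return False
            prev_hash = entry["entry_hash"]
        return True

ledger = AttestationLedger()
ledger.append({
    "input_sha256": "9f2c...",                    # fingerprint of the financial data processed
    "enclave_attestation": "mrenclave:ab12...",   # placeholder TEE measurement
    "processed_at": "2026-01-15T10:30:00Z",
    "destroyed_at": "2026-01-15T10:30:02Z",
})
print(ledger.verify())  # True until any entry is altered
```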

Practical Steps for SOX-Regulated Companies

Inventory your ML models. Identify every model that touches data flowing into financial statements. Include third-party APIs — these are frequently overlooked.

Classify by SOX impact. Not every model needs the same level of control. A model that directly determines revenue recognition needs stronger controls than one that generates internal analytics.

Require destruction proofs from AI vendors. If a third party processes your financial data through AI, they should provide cryptographic evidence of data destruction. Include this in vendor assessments and procurement requirements.

Build the attestation ledger into your control framework. Work with your external auditor to establish destruction proofs as acceptable control evidence. The earlier you start this conversation, the smoother your next SOX audit will be.

Document model change management. Establish controls around AI model updates — including third-party model changes — that match your existing SOX change management framework.
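
A minimal sketch tying the inventory, classification, and change-management steps together: a registry of in-scope models with an impact level and an approved version, plus a check that flags unapproved version changes for review. The names, fields, and impact levels are illustrative assumptions, not a prescribed framework.

```python
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelRecord:
    """One entry in the ML model inventory."""
    name: str
    owner: str
    sox_impact: str          # e.g. "direct" (feeds reported figures) or "indirect"
    approved_version: str    # version signed off through change management
    third_party: bool

@dataclass
class ModelInventory:
    models: dict = field(default_factory=dict)
    change_events: list = field(default_factory=list)

    def register(self, record: ModelRecord):
        self.models[record.name] = record

    def in_scope(self):
        """Models whose output can reach the financial statements."""
        return [m for m in self.models.values() if m.sox_impact in ("direct", "indirect")]

    def check_version(self, name: str, observed_version: str) -> bool:
        """Flag any observed version that differs from the approved baseline,
        so third-party model updates enter the change-management workflow."""
        record = self.models[name]
        if observed_version == record.approved_version:
            return True
        self.change_events.append({
            "model": name,
            "approved": record.approved_version,
            "observed": observed_version,
            "detected_at": datetime.now(timezone.utc).isoformat(),
            "status": "pending_review",
        })
        return False

inventory = ModelInventory()
inventory.register(ModelRecord("revenue-recognition-classifier", "finance-eng",
                               "direct", "v2.4.1", third_party=False))
inventory.register(ModelRecord("vendor-doc-extraction-api", "ap-automation",
                               "indirect", "example-model-2026-01", third_party=True))

inventory.check_version("vendor-doc-extraction-api", "example-model-2026-02")  # unannounced update
print([m.name for m in inventory.in_scope()])
print(json.dumps(inventory.change_events, indent=2))
```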

Addressing the intersection of AI and SOX compliance is not optional. If ML models touch your financial reporting, they're in scope. The organizations that build cryptographic proof into their AI pipelines now will save millions in audit costs and avoid significant regulatory risk over the next decade.


See how Ardyn's attestation ledger satisfies SOX audit requirements at ardyn.ai.