James Barclay, CrabNebula

The Cyber Resilience Act extends the CE mark to software, bringing 100 years of product liability, conformity assessment, and manufacturing accountability to the codebase. The industry's response has been to automate compliance with AI. This talk examines why that instinct gets the model backwards.
Starting from 100 years of manufacturing history (Ford's assembly line, Toyota's Kaizen philosophy, and the principle that quality is built in at every step rather than inspected in at the end), the talk reframes CRA compliance as an engineering discipline. It then turns to current threat intelligence: MuddyWater's RustyWater implant (Trellix ARC, March 2026) shows that nation-state adversaries have already made the engineering call, choosing Rust for their production pipelines because the line needs to be deterministic and hard to inspect.


The argument is not a critique of AI. It is a case for placing AI correctly in the architecture. The
CRA’s conformity assessment modules require human sign-off at every stage. An AI system
making those decisions autonomously may also trigger high-risk classification under the EU AI
Act, creating two compliance problems where there was one. The talk lays out what the correct
pipeline looks like: deterministic, modular, Rust-based, with AI advising at every stage and a
human signing at every gate.


The session closes with Fleet, a working compliance pipeline built on these principles: SBOM
generation as a build output, continuous vulnerability monitoring four levels into the transitive
dependency tree, and an ENISA reporting pipeline ready before September 2026.


Open Source Security Foundation
OWASP Foundation
Open Regulatory Compliance Working Group (orcwg.org)