DRM3

DRM3 Labs Corp

Provenance for AI.

Know what went in. Prove what came out.

Data with receipts.

[Image: volcano with lava flows and obsidian crystals. Raw data refined into structured intelligence.]

Lava flows without permission, transforms everything it contacts, and hardens into obsidian: dense, permanent, carrying the full chemical record of everything it passed through.

This is data. Not as metaphor. As behavior.

Every piece of obsidian already contains what happened. Read from it directly. No excavation required.

The Problem

Every physical product on earth has a supply chain. Data does not.

Nobody can tell you where a piece of AI output came from, what went into it, or who touched it on the way. Agents are transacting, generating, transforming, and distributing content at machine speed. The infrastructure beneath them produces no record of what happened.

Every system logs. The problem is that logs live somewhere else. They can be rewritten, deleted, or fabricated after the fact. A log entry says what someone recorded. It does not prove what actually occurred. In DRM3, the data is packaged with its receipts. You expose it only at the points of egress you choose.

When AI generates analysis from government data, financial filings, and news feeds, who can prove what went in? When the output shapes policy, investment, or public understanding, who is accountable for the inputs? Today, nobody. The chain of evidence does not exist because data has no supply chain.

Data does not have a trust problem. It has a provenance problem. Trust is the symptom. Missing receipts are the cause.

The Primitive

Attested provenance. Every bit. Every call. Every row.

Not sampled. Not approximated. Not reconstructed after the fact. Every data interaction, every inference call, every row accessed, every transformation applied carries a cryptographically signed receipt from the moment it occurs. The receipt is not attached to the data. It is inseparable from it.

For AI-generated content, this produces something that has not existed before: full attested provenance of every input that produced every output. The model version. The data sources. The access grants. The timestamp. The signing key. All of it, immutable from the instant of generation, verifiable by any downstream party without trusting anyone in the chain.
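As a concrete sketch, such a receipt can be modeled as a canonically serialized record whose hash is what gets signed. The field names and identifiers below are illustrative, not the DRM3 wire format:

```python
import hashlib
import json

# Hypothetical attestation record for one AI generation; field names and
# identifiers are illustrative, not the DRM3 wire format.
attestation = {
    "model_version": "example-model-2025.1",             # which model ran
    "sources": ["sec:filing:0000320193", "fred:UNRATE"], # every input that went in
    "grants": ["license:open-data-v1"],                  # access grants in force
    "timestamp": "2025-06-01T12:00:00Z",                 # instant of generation
    "signing_key": "ed25519:pubkey-placeholder",         # key that signs the claim
}

# Canonical serialization (sorted keys, no whitespace) so every downstream
# verifier hashes, and checks the signature over, exactly the same bytes.
canonical = json.dumps(attestation, sort_keys=True, separators=(",", ":")).encode()
digest = hashlib.sha256(canonical).hexdigest()
```

Canonicalization is the step that makes "verifiable by any downstream party" possible: two parties who serialize the same record differently would sign and check different bytes.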

TCP/IP moved bits reliably.
Ethereum moved value trustlessly.
DRM3 moves data with receipts.

How It Works

Cryptographic attestation. Applied to every piece of data you already use.

Every data operation in DRM3 produces an attestation. The party that performed the work states what was done, what went in, what came out, and why, and signs it with Ed25519. The attestation is the claim and the proof in one act. Every attestation chains to the ones before and after it.
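A minimal sketch of that chaining, with HMAC-SHA256 standing in for Ed25519 so the example runs on the Python standard library alone; the function and field names are hypothetical:

```python
import hashlib
import hmac
import json

# HMAC-SHA256 stands in for Ed25519 signing in this sketch; DRM3 signs with
# Ed25519. The key below is a demo value, not a real secret.
SIGNING_KEY = b"demo-key-not-a-real-secret"

def attest(action, inputs, outputs, prev_id):
    """Build a signed attestation that chains to the one before it."""
    claim = {
        "action": action,    # what was done
        "inputs": inputs,    # what went in
        "outputs": outputs,  # what came out
        "prev": prev_id,     # id of the previous attestation in the chain
    }
    body = json.dumps(claim, sort_keys=True).encode()
    claim["sig"] = hmac.new(SIGNING_KEY, body, hashlib.sha256).hexdigest()
    claim["id"] = hashlib.sha256(body).hexdigest()
    return claim

def verify(claim):
    """Recompute the signature over the claimed fields: claim and proof in one act."""
    body = json.dumps({k: claim[k] for k in ("action", "inputs", "outputs", "prev")},
                      sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claim["sig"], expected)

fetch = attest("fetch", ["https://example.gov/feed"], ["article-raw"], None)
analysis = attest("analyze", ["article-raw"], ["article-structured"], fetch["id"])
assert verify(fetch) and verify(analysis)
assert analysis["prev"] == fetch["id"]   # the chain link is explicit
```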

When an article is fetched for World News RAG, the fetch is attested. When AI analyzes that article, the analysis is attested, and its attestation points to the fetch attestation beneath it. When a batch completes, every attestation in the batch rolls into a Merkle tree and the root is signed. The chain is unbroken. Every link is independently verifiable. That is one product. The same protocol runs across every product.
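The batch rollup can be sketched as a plain Merkle tree over attestation bytes; the pairing and odd-node padding rules here are illustrative choices, not the protocol's:

```python
import hashlib

def merkle_root(leaves):
    """Fold a batch of attestation bytes into one root hash."""
    if not leaves:
        raise ValueError("empty batch")
    level = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate the last node on odd levels
            level.append(level[-1])
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

batch = [b"fetch-attestation", b"analysis-attestation", b"dedupe-attestation"]
root = merkle_root(batch)
# Signing `root` once covers every attestation in the batch: change any one
# attestation and the root changes.
```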

Today, DRM3 Labs operates the signing infrastructure: 34 keys derived from a single root, attesting data from 23 government and public sources, 6,700+ curated news feeds processing millions of articles per year, and 8,881 scanned domains. The provenance protocol is designed to support third-party signers, and the library will be open for others to attest their own data with the same guarantees. Open Signals, the open data ingestor, has every source's license and terms of service publicly documented. The signing keys are published and verifiable. The attestation receipts travel with the data.

You know what you are getting. You know where it came from. You know what touched it on the way. And you can prove all of that to anyone else without trusting us or anyone in the chain. That is what attested provenance means. That is what DRM3 builds.

Selective granularity

Fine-grained internally. Coarser-grained at the boundary. The chain is the same. The granularity is the choice.

Row-level for audits, compliance, and forensic analysis. Session-level — Merkle-rooted — for pipeline runs and bulk processing. Service-level for external consumers who need one verifiable root without internal detail.
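The granularities can be read off one tree: row-level verification uses a Merkle inclusion proof, while a service-level consumer keeps only the signed root. A standard-library sketch with hypothetical helper names:

```python
import hashlib

def _h(data):
    return hashlib.sha256(data).digest()

def build_levels(rows):
    """Hash every row, then hash pairs upward; levels[-1][0] is the root."""
    levels = [[_h(r) for r in rows]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:                 # duplicate the last node on odd levels
            lvl = lvl + [lvl[-1]]
        levels.append([_h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

def prove(levels, index):
    """Collect sibling hashes from leaf to root: a row-level proof."""
    proof = []
    for lvl in levels[:-1]:
        lvl = lvl + [lvl[-1]] if len(lvl) % 2 else lvl
        sibling = index ^ 1
        proof.append((lvl[sibling], sibling < index))  # (hash, sibling-is-left)
        index //= 2
    return proof

def verify(row, proof, root):
    """A consumer needs only the row, its proof, and the signed root."""
    node = _h(row)
    for sibling, is_left in proof:
        node = _h(sibling + node) if is_left else _h(node + sibling)
    return node == root

rows = [b"row-0", b"row-1", b"row-2", b"row-3"]
levels = build_levels(rows)
root = levels[-1][0]                     # service-level: publish just this
proof = prove(levels, 2)                 # row-level: prove row 2 is in the tree
assert verify(b"row-2", proof, root)
```

The same tree serves both ends of the spectrum: an auditor asks for proofs, an external consumer stores one 32-byte root.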

Four non-negotiables

Sovereignty

You know what you have. Where it came from, what its terms are, what you can do with it.

Privacy

Protection is architectural, not contractual. Sharing happens on the owner's terms, provably.

Equitability

Attested records of who contributed what, so value tracks contribution.

Transparency

A cryptographically verifiable record from origin to consumption. Not a report. A property.

Sovereignty without transparency is an unverifiable claim. Privacy without sovereignty is borrowed protection. The protocol enforces all four simultaneously or it enforces none.

Running Now

This is not a whitepaper. It is already running.

Open Data Ingest

23 government and public data sources: SEC filings, FRED economic indicators, Congress.gov legislation, FDIC bank data, Census, USGS, Federal Register. Fetched continuously. Every row signed at ingest.

Article Intelligence

6,700+ curated RSS feeds across 12 active pipelines. Millions of articles per year. Scraped, deduplicated, AI-analyzed for 60+ structured fields. Every article carries a receipt from fetch through analysis.

DNS & Infrastructure Scanning

8,881 domains under continuous scan across 40+ categories. DNS records, TLS certificates, WHOIS, HTTP probes, DNSSEC validation. Full sweep every 42 hours. Every scan signed.

Provenance

Every fetch, every analysis, every transformation signed with Ed25519. 34 signing keys, 30 of them active. One root. Every key independently verifiable. All 34 anchored on Base.

Because DRM3 works with decentralized AI and open data, we can show what we are doing while we are doing it. The protocol is hardened by the products running on it. This is the start, not the finish.

raw data →signed→ processed →signed→ analyzed →signed→ obsidian

Walk the chain from any output back to the raw source.
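Walking the chain is a pointer traversal: each attestation names its predecessor until the raw source terminates the chain. A sketch over a hypothetical in-memory store:

```python
# Hypothetical in-memory attestation store keyed by attestation id; each
# record names its predecessor via `prev`, so any output walks back to its
# raw source by following pointers.
attestations = {
    "a3": {"action": "generate", "prev": "a2"},
    "a2": {"action": "analyze", "prev": "a1"},
    "a1": {"action": "fetch", "prev": None},  # raw source: the chain terminates
}

def walk(store, start):
    """Follow prev pointers from any output back to the raw source."""
    chain, node = [], start
    while node is not None:
        chain.append(store[node]["action"])
        node = store[node]["prev"]
    return chain

assert walk(attestations, "a3") == ["generate", "analyze", "fetch"]
```

In the real protocol each hop would also be signature-checked before it is trusted; the traversal itself is this simple.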

The Evolving Protocol Manifest

Open data examples of the protocol in motion.

Open Signals fetches from 23 government and public sources. Connor scans 8,881 domains. World News RAG processes 6,700+ feeds. These are not standalone products. They are substrate APIs. Their outputs aggregate into composite data layers, get harmonized across sources, then synthesized by AI into structured intelligence.

That intelligence feeds content generation: daily digests, market reports, agent outputs, analytical summaries. Each layer consumes the one below it. Each layer signs what it produces. The provenance chain runs from the raw API call through aggregation, harmonization, synthesis, and generation to the final output.

This is what the protocol looks like when it runs on open data. The same architecture works on private data, proprietary feeds, and enterprise sources. The open examples exist because they can be shown. The protocol does not require openness. It requires signing.

substrate

Raw APIs. Government data, news feeds, DNS records. Each row signed at the moment of collection.

aggregate

Composite APIs. Multiple sources combined into unified data layers. Deduplicated, normalized, receipted.

harmonize

Cross-source reconciliation. Entities resolved, conflicts surfaced, timelines aligned. Signed.

synthesize

AI analysis. 60+ structured fields. Sentiment, entities, political positioning, financial signals. Every analysis attested.

generate

Content, reports, digests, agent outputs. Built from synthesized intelligence. The provenance of every input travels with the output.

The Moment

The EU AI Act enforcement deadline is August 2026. Articles 12, 14, and 17 require transparency, human oversight, and documented governance for AI systems. Most deployed AI infrastructure does not qualify.

But this is not just a European regulatory question. Data governance in the AI age is a structural problem. When AI shapes policy analysis, financial forecasting, and public understanding, the question of what data went in, and whether anyone can prove it, becomes a question of democratic accountability.

Open data without provenance is a promise. Open data with cryptographic attestation is evidence. The difference is whether transparency can survive the speed and scale of AI.

Provenance as an architectural property, not a reporting burden.

See how DRM3 maps to Articles 12, 14, and 17

For decentralized AI builders

Pistachio.

DRM3 is provider-agnostic. Pistachio is our reference implementation for direct Morpheus access — a consumer node that removes the broker layer. Run it locally or operate one as a gateway.


Data with receipts.

Everything else is entropy.
