The road to fully offline AI runs through here.

Four months ago we set out to build AI that runs on your hardware, with your data, under your control. No cloud. No subscriptions. No one training on your ideas.

We had six inference nodes online at one point — two Blackwell workstation cards, three Strix Halo unified-memory boxes, and a smart NAS that runs compute alongside storage. Two more Strix Halos are racked and waiting; a three-card RDNA4 build is on deck. Then we rebuilt the Beacon — the control plane, the brain — from scratch on a clean box, because the first one had grown too tangled to trust. Now the lab is being migrated over deliberately, every node planned out before it gets a wire.

// where we are — week of April 20

Beacon — the control plane, the brain — runs clean on a single workstation-class box. Two production inference lanes are live alongside it: one tuned for code generation, one for long-document reasoning. Both run entirely on local hardware. Memory is local. Logs are local. Nothing leaves the building.

What's still moving. A third inference node is being brought online — kernel work in progress. The autonomous build loop — the part where the lab builds itself — ran for nineteen hours straight, then stopped in the way we knew it eventually would. Fix is small and known: better-trained planning model, tighter target ownership, templates instead of free-form code. Bringing it back online deliberately, like everything else.

Next. Weekly Build Logs from here. Plan written down before each cable goes in.

// build log
Week of April 21

Shipped this week. The website you're reading. The vision finally written in our own voice instead of borrowed pitch language. Lab section restructured around weekly Build Logs and a snapshot of where the lab actually is — not where the marketing says it should be.

What broke and what we did about it. Documentation grew faster than our ability to navigate it. We collapsed sixteen working files into one entry-point index pointing at sectioned reference files. Every doc has a parent, every claim has a source. The discipline is the moat.
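A rule like "every doc has a parent" is easy to enforce mechanically. A minimal sketch, assuming each reference file declares its parent with a `Parent:` line and the entry-point index is the only root — the function name and file layout here are illustrative, not our actual tooling:

```shell
# Hypothetical orphan check for a doc tree like the one described above.
# Assumes every reference file carries a "Parent:" line; only the
# entry-point index is allowed to have none.

check_parents() {
  local docs_dir="$1" index="$2" orphans=0 f
  for f in "$docs_dir"/*.md; do
    # The index is the root; everything else must point at a parent.
    [ "$(basename "$f")" = "$index" ] && continue
    if ! grep -q '^Parent:' "$f"; then
      echo "ORPHAN: $f has no Parent: line"
      orphans=$((orphans + 1))
    fi
  done
  return "$orphans"   # non-zero exit = orphan count; fails loud in a check step
}
```

Run it against the docs directory in any pre-commit or build step; a non-zero exit blocks the merge, which is the discipline doing its job.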

Next. Bring the autonomous build loop back online with the fixes already known. Plan written down before the cable goes in.

Week of April 14

Shipped this week. A documentation discipline. Sixteen scattered notes collapsed into one entry-point index pointing at sectioned reference files, each capped at single-sitting read length. Not glamorous. The work product now stays organized at the speed it's being made.

What broke and what we did about it. Our first autonomous build loop ran on cron jobs that produced no output. We caught it because every step is supposed to leave a receipt, and the receipts kept coming back empty. We switched to systemd timers with health-check guards that refuse to start unless the upstream service is reachable. Fail loud, not silent. Receipts catch what good intentions miss.
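The guard pattern can be sketched in a few lines. This is a hedged illustration, not our production unit — the script path, host, and port are placeholders:

```shell
# Hypothetical health-check guard for a systemd timer's service unit.
# Wired in as:
#   ExecCondition=/usr/local/bin/upstream-health-check
# When ExecCondition exits non-zero, systemd skips the run and logs the
# skip -- a visible refusal instead of a silent no-op.

upstream_reachable() {
  # Bash /dev/tcp probe: succeeds only if something accepts the connection.
  local host="$1" port="$2"
  timeout 3 bash -c ">/dev/tcp/$host/$port" 2>/dev/null
}

# Example wiring inside the guard script (placeholder host/port):
#   upstream_reachable "${UPSTREAM_HOST:-127.0.0.1}" "${UPSTREAM_PORT:-8080}" || {
#     echo "health-check: upstream unreachable; refusing to start" >&2
#     exit 1
#   }
```

The point is where the failure lands: a cron job that can't reach upstream just runs and produces nothing, while a skipped `ExecCondition` leaves a line in the journal every time.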

What we learned. The first version of any build loop will lie to you about whether it ran. Build the receipt format before you build the loop.
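One way to make "receipt format first" concrete, as a sketch — the wrapper name and receipt layout here are ours for illustration, not the lab's actual format:

```shell
# Hypothetical receipt-first wrapper: the receipt exists before the loop does.
# Every step records what ran, where its output went, when, and how it
# exited -- so an empty receipt is itself a signal.

RECEIPT_DIR="${RECEIPT_DIR:-./receipts}"

run_with_receipt() {
  local stamp out rc
  stamp="$(date -u +%Y%m%dT%H%M%SZ)-$$"
  mkdir -p "$RECEIPT_DIR"
  out="$RECEIPT_DIR/$stamp.out"
  # Capture all output; record the exit code without aborting the caller.
  if "$@" > "$out" 2>&1; then rc=0; else rc=$?; fi
  # The receipt: timestamp, command, output path, exit code.
  printf '%s\t%s\t%s\trc=%d\n' "$stamp" "$*" "$out" "$rc" \
    >> "$RECEIPT_DIR/receipts.log"
  # Fail loud: exit 0 with no output is suspicious, not a success.
  if [ "$rc" -eq 0 ] && [ ! -s "$out" ]; then
    echo "WARN: '$*' exited 0 but produced no output" >&2
  fi
  return "$rc"
}
```

A loop built on a wrapper like this can lie about a lot of things, but not about whether it left a receipt.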

Week of April 7

Shipped this week. The control plane went up in twenty-eight hours. Most of the work units passed independent verification on the first run. Some failed for the same root cause — file-write permissions in a sandboxed runner — and waited there until we built around them instead of through them. A few were deferred for later. Every one has a receipt: the command run, the output captured, the timestamp.

What broke and what we did about it. The plan has been rewritten more than once because the first few versions were wrong. Building in public means showing the rewrites, not hiding them.

What's still moving. A third inference node is blocked on a kernel issue. Voice pipeline still in design. Closeout pipeline writes some sessions but not all — partial fix landed, hardening this week.

// archive — March 2026 (5 entries)
2026-03-30
Foundation plan v1.4.1 — high-quorum approval
Sent the governance foundation plan through eleven frontier models, six rounds, over a week. Landed at strong majority approval. Plan on paper; execution phase pending.
2026-03-25
Factory Phase 5 — pilot passed, ceiling hit, restart from governance up
Tried to unlock agent autonomy in the factory. Narrow pilot passed. Full phase ran into a trust ceiling — too many ways for a silent failure to ship. Twelve amendments later, archived the factory plan and rebuilt from the governance layer. Learned more from the stop than from the start.
2026-03-13
Multi-model benchmarking on Blackwell
Brought up the second inference box to start comparing models head-to-head across different hardware. We tested models from multiple origins — because the point of multi-model arbitration is that you never trust a single one. Datacenter cards get native kernels; workstation cards fall back to generic paths. Still faster than most cloud APIs.
2026-03-09
Lab online. VLANs up. Beacon deployed.
Replaced the consumer router with a virtualization host and a proper firewall stack. Full VLAN segmentation went live. Deployed the Beacon as the distributed control plane. The network went from a flat consumer setup to a segmented lab architecture in one session.
2026-03-06
First public post. Lab coming online.
First post from @directive4AI. Electrical rewiring done. Equipment racked. The lab is coming up. Everything that follows builds on this foundation.
// archive — February 2026 (1 entry)
2026-02
Cross-country move. Temporary lab in Texas.
Moved the entire operation from San Antonio, TX to Northern Michigan. Built a temporary lab in Texas before we left so work didn't stop during the 45-day transition. Rewired the electrical in the new house. The company didn't go dark — we relocated.
Don't rent your intelligence.
Dutch · Capt, USAF (Ret.)
20 years, 5 months, 10 days
Founder — Directive4