Lacuna Labs
Lacuna Labs tri-tone bar lockup.

Mind the gaps.

“I think that it’s extraordinarily important that we in computer science keep fun in computing. When it started out, it was an awful lot of fun. Of course, the paying customers got shafted every now and then, and after a while we began to take their complaints seriously. We began to feel as if we really were responsible for the successful, error-free perfect use of these machines. I don’t think we are. I think we’re responsible for stretching them, setting them off in new directions, and keeping fun in the house. I hope the field of computer science never loses its sense of fun. Above all, I hope we don’t become missionaries. Don’t feel as if you’re Bible salesmen. The world has too many of those already. What you know about computing other people will learn. Don’t feel as if the key to successful computing is only in your hands. What’s in your hands, I think and hope, is intelligence: the ability to see the machine as more than when you were first led up to it, that you can make it more.”

― Alan J. Perlis

About

Lacuna Labs is three engineers, two decades each on the floor of the hyperscale era. The next stretch is software that knows what it doesn’t know — small composable pieces, honest interfaces, a pager that goes off when it matters. We answer our own email.

Ask the studio

Answers from the studio. Hedged when uncertain. Rate-limited.

Why us

One substrate. Many tools. The same graph engine, the same audit log, the same refusal model sit under The Curator, Lacuna, Baobab, and what comes next. One thing built well, then reused. The operator’s graph is the type.

Honest refusal beats confident guessing. When the system isn’t sure, it says so — with provenance. Your correction is canonical. The graph gets smarter because it learned from you, not because it argued with you.
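The refusal-with-provenance idea can be sketched in a few lines. This is a toy illustration, not Lacuna Labs’ actual API: the `Graph`, `Answer`, and threshold names are all hypothetical, chosen only to show the shape of the behavior — answers below a confidence threshold become refusals, and an operator correction is stored as canonical.

```python
from dataclasses import dataclass

@dataclass
class Answer:
    claim: str
    confidence: float   # graded 0.0-1.0, never implied
    provenance: list    # where the claim came from

class Graph:
    """Toy knowledge store: refuses below a confidence threshold,
    treats operator corrections as canonical. Illustrative only."""
    REFUSAL_THRESHOLD = 0.7

    def __init__(self):
        self.facts = {}   # key -> Answer

    def ask(self, key):
        ans = self.facts.get(key)
        if ans is None or ans.confidence < self.REFUSAL_THRESHOLD:
            # Honest refusal, carrying whatever provenance exists.
            return Answer("I don't know",
                          ans.confidence if ans else 0.0,
                          ans.provenance if ans else [])
        return ans

    def correct(self, key, claim, source):
        # The operator's correction is canonical: full confidence,
        # source recorded as provenance.
        self.facts[key] = Answer(claim, 1.0, [source])

g = Graph()
g.facts["maker"] = Answer("Meissen", 0.4, ["glaze heuristic"])
print(g.ask("maker").claim)   # refuses: 0.4 is below the threshold
g.correct("maker", "Meissen", "operator")
print(g.ask("maker").claim)   # now answers, with operator provenance
```

The point of the sketch is the asymmetry: the system never argues its low-confidence guess past the operator; the correction simply becomes the record.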

SLOs on day one. Latency budget, cost meter, append-only audit, refusal-first permissions, the on-call pager wired before launch. Day-one ops posture, not a bolt-on we’ll get to later.
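“Append-only audit” can be made concrete with a hash-chained log: each entry commits to the one before it, so silent edits are detectable. A minimal sketch, not the shipped format — the class and field names here are assumptions for illustration.

```python
import hashlib
import json

class AuditLog:
    """Append-only audit sketch: each entry chains the SHA-256 of the
    previous entry, so any tampering breaks verification."""
    def __init__(self):
        self.entries = []

    def append(self, event: dict):
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        body = json.dumps(event, sort_keys=True)
        h = hashlib.sha256((prev + body).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev, "hash": h})

    def verify(self) -> bool:
        # Recompute the chain from the start; any edited entry or
        # broken link fails the check.
        prev = "genesis"
        for e in self.entries:
            body = json.dumps(e["event"], sort_keys=True)
            expected = hashlib.sha256((prev + body).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append({"actor": "daemon", "action": "read", "path": "/etc/hosts"})
log.append({"actor": "operator", "action": "grant"})
print(log.verify())   # True: chain intact
```

Hash-chaining is the cheap end of the design space; the same posture scales up to signed, externally anchored logs without changing the append-only contract.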

What we make

The Curator private beta · v2.5

A collection partner for someone who works with objects of value. Drop a photo, get help understanding it. Period, region, maker when knowable. Confidence-graded. Lists straight to Etsy and eBay when you’re ready.

Try the beta
Lacuna alpha · v0.3

A resident operator daemon for a single-user POSIX host. UNIX-shaped: small mind, large toolkit, refusal-first permissions, append-only audit. For the people who live in their terminal.

Read the spec
Baobab private beta · v0

A pan-African and Black-diaspora portal that fuses news, markets, and federated community under an ambient-intelligent layer. The substrate is open and decentralized; the lens is ours.

Go to Baobab
Gastronomy Graph roadmap · 2026 Q4

The world of food — ingredients, techniques, lineages, regional schools — as a real graph, not a recipe blog. Same substrate discipline as The Curator.

Get the early note

How we got here

Two decades each, on the floor for the hyperscale era. We built harnesses, test suites, cluster orchestrators, deployment instruments, and the products that auto-deployed on top — running fleets at the millions-of-machines scale. Stubby. Chubby. Borg. Kubernetes since day one. cgroups. SRE on-call from the time the book was being written.

AI is the new substrate. The old discipline still applies: small composable pieces, honest interfaces, append-only audit, refusal when the system isn’t sure, and a pager that goes off when it matters. Tools we’d use ourselves; tools we’d be willing to put our names on.

Three engineers. One inbox. A real thing.

We build AI we’re willing to put our names on. If that sounds like the kind of tool you’d use, send a sentence.

alfred@lacunalabs.ai

Read on

Library: Architecture, visual canon, research.
Team: Small on purpose.
Contact: Beta access, repo access, direct email.