Just Computers
Cloud as a concept hasn’t failed. Cloud as an industry has been captured by extraction machines pretending to be utilities.
The Builder
You’ve been building computers from parts since before you were in IT. Back when you were doing computer graphics, you needed the machine to exist before you could make the image. You built it.
Your first real infrastructure job is at MIT’s Department of Materials Science and Engineering, fall of 1999. They need every computer system audited for Y2K. You touch all of it. Then ATG, then six years working in a small R&D facility at Mitsubishi Electric Research Labs. Racking, cabling, watching machines boot. It’s honest work. You know what you’re paying for because you carried it through the door.
Then you’re at Cartera Commerce, 2012 or so. The company runs on NTT’s cloud. It’s terrible. Provisioning takes hours. It’s colo with a web portal bolted on and somebody else’s logo. Then your CTO opens his laptop and shows you his AWS account, and you can have a VM in minutes. Actually minutes. No procurement cycle, no purchase order, no three-month wait for a vendor to ship hardware to a cage.
That was the promise. A kid in a garage gets the same infrastructure as Goldman Sachs. No capital expenditure. Just a credit card and an API.
The Overlay
After Cartera you never go back to a physical data center. At Pega Systems you work in a secure NOC that runs its compute on AWS. The networking that makes PegaCloud work is provided by a small company you haven’t joined yet. That’s how you become aware of each other.
You follow your manager to Cengage Learning, kickstart a migration from big data centers in Ohio to AWS — data centers you never visit — and you bring that same small networking company into the Cengage stack. You’re cloud-only now, and you keep running into the same problem: the cloud has no network isolation worth a damn.
In 2016 you join the company that’s been solving it. They built the first network device in the AWS Marketplace. This was before VPCs existed in any meaningful way. Everyone’s compute running side by side on shared infrastructure. Their customers needed instances to talk privately on hardware next to a stranger’s workload. So they built an encrypted overlay network. A virtual appliance that created the isolation the cloud vendor hadn’t built yet, or hadn’t bothered to.
Genuine innovation. First-mover stuff. The kind of work where you’re explaining the problem before you can explain the product.
You spend the next decade there. You learn something that most people never see: the cloud is just computers running in a data center with some APIs.
That’s it. That’s the trillion-dollar insight that nobody wants to say out loud because it deflates the market cap. CPU. RAM. Disk. Maybe GPU. Network to get there. Everything else is abstraction someone is charging you for.
The Extraction
Then AWS builds VPC. Azure builds VNET. They take that exact concept — network isolation in shared infrastructure — and turn it into a profit center. NAT gateways. VPC endpoints at $7.20 a month each. Data transfer between availability zones. Elastic IPs you forgot to release. Interface endpoints so your Lambda can talk to S3. Death by a thousand line items.
None of it is visible until the bill shows up. The pricing pages are technically public but designed to be incomprehensible. You need a spreadsheet to estimate what a simple three-tier app will cost in network fees alone.
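That spreadsheet is simple enough to sketch. The rates below are illustrative, taken from public us-east-1 list pricing at time of writing (using the 720-hour month that the $7.20 endpoint figure implies); check the current price list before trusting any number here.

```python
# Hedged sketch: estimating monthly network line items for a small
# three-tier app. Rates are illustrative us-east-1 list prices, not
# a quote; the workload numbers at the bottom are made up.

HOURS_PER_MONTH = 720  # the 720-hour month the $7.20/endpoint figure implies

def nat_gateway(gb_processed: float) -> float:
    # $0.045 per hour plus $0.045 per GB of data processed
    return 0.045 * HOURS_PER_MONTH + 0.045 * gb_processed

def interface_endpoints(count: int) -> float:
    # $0.01 per hour per endpoint per AZ -- the $7.20/month line item
    return 0.01 * HOURS_PER_MONTH * count

def cross_az_transfer(gb: float) -> float:
    # $0.01 per GB charged in each direction between availability zones
    return 0.02 * gb

bill = (
    nat_gateway(gb_processed=500)      # app tier reaching the internet
    + interface_endpoints(count=4)     # S3, DynamoDB, logs, secrets
    + cross_az_transfer(gb=1000)       # chatty replication between AZs
)
print(f"network fees alone: ${bill:,.2f}/month")
```

Over a hundred dollars a month before a single instance-hour is billed, and every term hides behind a different pricing page.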
That’s not an accident. Opacity is the product.
The Landlord Class
The playing field didn’t get leveled. It got a new landlord class.
Goldman Sachs gets volume discounts, dedicated account teams, custom pricing, and reserved capacity. The kid in the garage gets the public rate card and a $73,000 surprise bill because he left a NAT gateway running.
AWS figured out the model first: wait for an open source project to get traction — Redis, Elasticsearch, Kafka, PostgreSQL — then offer it as a managed service. Strip the community’s name off it. Capture the revenue. Contribute nothing back. The whole “open source is a business model” era ended the day they launched ElastiCache.
Microsoft does it with better PR. They bought GitHub, they love Linux now, they sponsor projects. But Azure’s managed services catalog is the same strip-mine operation in a friendlier wrapper. And VS Code is free right up until Copilot isn’t.
Oracle does it without the pretense. Larry Ellison built a company around making it more expensive to leave than to stay, and now he’s applying that to cloud and bolting AI onto the upsell.
The pattern is always the same. Commoditize the complement. Capture the value at the layer you control. Make switching costs unbearable.
The Lock-In
You start on the free tier. You build on their proprietary services because the docs are right there and the SDKs are easy. By the time you have revenue you’re spending 40% of it on AWS and your entire architecture is welded to DynamoDB and Lambda. You didn’t get liberated. You got a landlord who can raise rent whenever he wants.
The real tell is that the biggest cloud cost optimization strategy for mature companies is leaving the cloud. 37signals did it publicly, saving over a million a year by moving Basecamp onto their own hardware. The thing that was supposed to free you from data centers is so expensive at scale that running your own hardware becomes the escape hatch.
Meanwhile “FinOps” is a job title. The fact that you need a specialist to understand your infrastructure bill tells you something went wrong with the original promise.
Bait and switch, dressed up in YAML and infrastructure-as-code so it feels technical instead of predatory.
The Counter-Argument
You don’t miss data centers. Not for a minute. The carrying, the cabling, the 2 AM drive because a disk died. Cloud compute is fine. Great, even. Cloudflare gets it right — push to the edge, flat pricing, no egress fees. Hetzner gets it right — honest Linux VMs, no 47-service menu. Linode gets it right.
Cloud as a concept hasn’t failed. Cloud as an industry has been captured by extraction machines pretending to be utilities.
So you build the counter-argument.
Local-first compute. SQLite, not DynamoDB. Coordination through shared substrate, not proprietary services. Cloudflare for what actually needs to be at the edge and nothing else. The Mac on your desk as the unit of compute instead of a Lambda function metered to the millisecond.
A small operator shouldn’t need to navigate a 200-service catalog and a FinOps consultant to run workloads and publish state. The computer on your desk is powerful enough. It always was. The cloud vendors just spent fifteen years convincing everyone otherwise.
All you need is CPU. RAM. Some disk. Maybe GPU. And the network to get there.
That’s Cube Commons.
April 8, 2026. Roxbury. Just computers.
Barton Nicholls is the founder of Cube Commons, Inc., a Massachusetts Public Benefit Corporation. He has 27 years of infrastructure experience and doesn’t miss data centers.