Why 500,000 Developers Are Ditching AWS for This Basement-Born Startup

Will Smith
11 Min Read

Runpod Turns Basement Mining Rigs and a Reddit Post into $120 Million AI Cloud Powerhouse

From crypto hobby to $120 million in ARR

Runpod’s founders remember the moment their side project stopped being a hobby and started threatening their day jobs.

“We were sitting in our basements in New Jersey, staring at these mining rigs our wives had already signed off on,” co-founder Zhen Lu told TechCrunch. “We knew we had to make them useful, or we’d be in trouble.”

By Friday, January 16, that bet had paid off. The AI cloud startup says it has hit an annual recurring revenue run rate of $120 million, less than four years after launch and after spending its first two years without traditional venture funding.

The company’s origin story already reads like startup lore: a pivot from Ethereum mining to GPU cloud infrastructure, a couple of Reddit posts asking for beta testers, and a bootstrapped climb to more than 500,000 developers using its platform, including OpenAI and Perplexity.

“Runpod is a reminder that in this AI cycle, not every infrastructure winner will be a nine-figure VC science project,” said one early investor, who asked not to be named discussing internal metrics. “Sometimes it’s two frustrated engineers and a Reddit thread at the right moment.”

From “boring” crypto rigs to AI backbone

The story starts in late 2021, when Lu and co-founder Pardeep Singh were still developers at Comcast. Between them, they had sunk roughly $50,000 into Ethereum mining rigs, only to watch returns slide and “The Merge” loom over the whole idea.

“Mining got boring after a couple months,” Lu recalled. “But we couldn’t just walk upstairs and say, ‘Hey, remember that money? Yeah, it’s gone.’”

Rather than take the loss, they reconfigured the rigs into GPU servers just as early machine-learning projects were moving out of research labs and into real products. This was pre-ChatGPT and even before DALL·E 2, but both founders were already feeling the pain of working with GPUs.

“The experience of building on GPUs was hot garbage,” Lu said. “We weren’t trying to start a company. We were trying to fix our own workflow.”

That frustration became Runpod, an AI app hosting platform that now spans everything from consumer RTX 4090s to Nvidia H100s and AMD MI300X accelerators. The company pitches instances that spin up in under a minute and clusters that form in about 37 seconds, with per-second billing and up to 3,200 Gbps of bandwidth between nodes. For teams whose language-model and image-model training bogs down on slower, more expensive clouds, that combination has made Runpod a favored training ground.
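The per-second billing claim is easy to quantify. A minimal sketch, using a hypothetical $2.49/hour GPU rate rather than Runpod's published price list, shows why it matters for short jobs compared with clouds that round usage up to the hour:

```python
# Compare per-second billing with hour-rounded billing for a short GPU job.
# The $2.49/GPU-hour rate is a hypothetical figure for illustration only,
# not Runpod's actual pricing.
import math

HOURLY_RATE = 2.49  # USD per GPU-hour (hypothetical)

def cost_per_second(seconds: int) -> float:
    """Bill exactly the seconds used, prorated from the hourly rate."""
    return round(seconds * HOURLY_RATE / 3600, 4)

def cost_hour_rounded(seconds: int) -> float:
    """Bill in full-hour increments, rounding any partial hour up."""
    return round(math.ceil(seconds / 3600) * HOURLY_RATE, 4)

# A 12-minute fine-tuning run (720 seconds):
job_seconds = 720
print(cost_per_second(job_seconds))    # pay for 12 minutes: $0.498
print(cost_hour_rounded(job_seconds))  # pay for a full hour: $2.49
```

For bursty experimentation, where a developer might launch dozens of short runs a day, that roughly 5x gap on every sub-hour job compounds quickly.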

A simple Reddit post and a waiting market

By early 2022, Lu and Singh had a working platform. What they didn’t have was a go-to-market plan.

“As first-time founders, we didn’t really know how to market or how to do anything,” Lu admitted. “So I’m like, alright, let’s just post on Reddit.”

They dropped a straightforward offer into a few AI-focused subreddits: free access to fast GPUs in exchange for feedback. No polished brand, no launch press, just a link and a pitch.

It was enough. Beta users turned into paying customers. Within nine months, the pair had quit Comcast and pushed Runpod to $1 million in revenue, still funded entirely by customers instead of venture capital.

For developers who had been stuck on waitlists for GPU instances on the big clouds, the appeal was straightforward. Runpod offered transparent pricing that users say often came in 40% to 60% below AWS or Google Cloud on comparable hardware, and a workflow tuned for people who live in terminals: containers, APIs, and real-time GPU dashboards.

“Runpod felt like it was built by people who actually spend all day in a terminal,” said one early user, an AI researcher who now leads infrastructure at a Bay Area startup. “I didn’t have to learn twelve proprietary services just to launch a training job.”

Bootstrapped until the VCs came knocking

Basement servers, though, were never going to cut it for long. About six months into their Reddit-driven growth, larger customers started to hesitate.

“They were like, ‘Hey, I want to run real production stuff, but not on servers in people’s basements,’” Lu said.

Instead of immediately raising a seed round, Lu and Singh pieced together revenue-share deals with established data centers, trying to add capacity as quickly as demand grew, but without taking on debt.

For nearly two years, there was no free tier and no venture money. New regions and new features had to justify themselves on basic economics.

“If we don’t have the GPUs, users go somewhere else,” Singh told TechCrunch. “We were always three steps ahead or we were dead.”

Reddit and Discord kept the funnel full. Then, when ChatGPT exploded in late 2022, demand for training and hosting AI apps spiked across the industry. Runpod was already sitting on a growing catalog of container templates for PyTorch, TensorFlow, ComfyUI and other popular tools, which meant new users could get started in minutes instead of wrestling with drivers and dependencies.

By May 2024, the founders finally decided to bring in outside capital. Runpod closed a $20 million seed round led by Dell Technologies Capital and Intel Capital. Angels included former GitHub CEO Nat Friedman and Hugging Face co-founder Julien Chaumond, who first discovered Runpod as a user and later reached out through the support channel.

500,000 developers and hyperscalers in the rearview

Runpod now says roughly 500,000 developers rely on its infrastructure, from solo builders to Fortune 500 teams spending millions a year. Across 31 regions, its platform hosts workloads from customers including Replit, Cursor, Wix, Zillow, and, notably, OpenAI and Perplexity.

At that scale, Runpod sits squarely in the blast radius of hyperscalers like Amazon Web Services, Microsoft Azure and Google Cloud, along with GPU-focused players such as CoreWeave and Core Scientific.

Runpod’s pitch is that its edge is as much about culture as it is about hardware. Where the big clouds lean on multi-year enterprise contracts and sprawling menus of products, Runpod sells speed, pricing and what it calls “developer empathy”: sub-minute provisioning, serverless endpoints, and tight Python SDKs instead of a labyrinth of dashboards.

“This is a dev-first platform,” Lu said. “We don’t think coding goes away. It just changes. Programmers become AI agent builders and operators. Our goal is to be what this next generation of developers grows up on.”

Over time, the company has had to meet a different set of expectations as well. To calm enterprise security teams, Runpod now points to ISO 27001 and SOC 2 certifications, Tier 3+ data centers, encryption, and role-based access controls—an attempt to prove a business that started in basements can support sensitive data and mission-critical models.

A different blueprint for AI infrastructure

The bigger question now is whether Runpod is a playbook others can follow or a one-off story.

On paper, its path looks very different from many of today’s AI infrastructure darlings. The company built a community first, funded itself with revenue, and was led from the start by technical founders who were deeply skeptical of debt and premature scaling.

At the same time, Runpod clearly benefited from timing. It had a product in market when GPUs were scarce, when developers were fed up with big-cloud friction, and when generative AI demand suddenly exploded.

“Runpod proves there’s room in the AI stack for lean, product-obsessed companies that don’t start on a 10-figure valuation slide,” said an AI infrastructure analyst at a major research firm, who asked not to be named because her employer has relationships with several clouds. “But we still don’t know how defensible these economics are once the giants decide to really squeeze on price.”

For now, the founders are preparing a Series A, betting that $120 million in ARR will command a premium in a crowded, hype-heavy market. They are also racing to secure long-term hardware deals in a world where Nvidia accelerators remain scarce and geopolitics hangs over the supply of advanced chips.

If the next generation of AI developers does end up “growing up” on Runpod, the company’s journey from bored crypto miners to a backbone of the AI boom may look inevitable in hindsight.

The tougher question is what happens next. In a market obsessed with scale and capital, Runpod was born out of a single Reddit post and a cluster of basement rigs. The real test will be whether it can stay scrappy while turning into the kind of cloud provider it was originally built to route around.

At AwazLive, I focus on translating complex ideas into compelling stories that help audiences understand where technology is heading next. Always exploring, always curious, always chasing the next big shift in the tech world.