How Did You Validate Your SaaS Idea? 10 Real Founder Stories from 2026

Ten founders share exactly how they validated their SaaS idea before building — channels, ad spends, conversion thresholds, and what went wrong.

The "how did you validate your SaaS?" question shows up on Reddit and Indie Hackers every other week. The answers in the threads are usually too short to learn from. The case-study posts that go long are usually written by content marketers who never ran the test.

We pulled ten stories where the founder went on the record about the test itself: the channel, the spend, the conversions, the kill criterion, the call. Five are well-known and have been re-cited to death. Five are less obvious. None of these stories is invented and none of the numbers is rounded for drama.

The pattern across all ten: the survivors built an artifact (page, video, spreadsheet, deposit ask) that produced auditable signal from strangers — not friends, not warm Twitter replies. The founders who killed an idea early did so because their pre-committed kill number didn't move. The founders who shipped did so because something painful (a Stripe charge, a paid newsletter sponsorship that converted, a phone deposit) said yes.

Each story below is roughly 250 words: who, the test, the numbers, the call, what they did next, and a one-line "what to copy".

1. Joel Gascoigne — Buffer (two-page test, then pricing page)

Buffer is the canonical pre-sale-on-a-landing-page test. Joel Gascoigne wrote it up on the Buffer blog at the seven-week mark in late 2010. The artifact started as a two-page site: page one was the pitch, page two admitted the product wasn't ready and asked for an email. Once signups started arriving, Joel inserted a pricing page in between, a "Please choose your plan" screen with three prices, so a visitor had to click a plan before reaching the email-capture page.

The numbers Joel put in the post: 120 email signups across 7 weeks, of whom about half showed up on launch day, first paying customer 4 days after launch, and the company self-funded into product-market fit. His own framing: "I didn't get a billion signups, in fact in a long 7 week period I only got 120 signups. But I spoke with a lot of those people during that time."

The non-obvious part is the page he added last. The pricing page was the kill criterion: if nobody clicked a plan, the pitch was hollow. Buffer is now $20M+ ARR and one of the most-cited bootstrapped success stories of the decade.

What to copy: put a price on the page before you ship. Email signups without a price are curiosity; clicks on a plan are intent. The pricing page is the test, not the pitch.
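
What "clicks on a plan" looks like as an auditable number is the only technical piece here. A minimal sketch, not Buffer's actual code, assuming three plan buttons carrying a data-plan attribute and a /track endpoint you control:

```ts
// Minimal sketch (not Buffer's actual code): record which plan a visitor
// clicks before they reach the email-capture page, so intent becomes a
// number you can count later. The /track endpoint, the data-plan attribute
// and the /almost-ready page are assumptions for the example.
document.querySelectorAll<HTMLButtonElement>("[data-plan]").forEach((button) => {
  button.addEventListener("click", () => {
    const payload = JSON.stringify({
      event: "plan_click",            // the kill-criterion metric
      plan: button.dataset.plan,      // e.g. "free" | "standard" | "pro"
      ts: Date.now(),
    });
    // sendBeacon survives the navigation to the next page
    navigator.sendBeacon("/track", payload);
    window.location.href = `/almost-ready?plan=${button.dataset.plan}`;
  });
});
```

The point is that the kill metric lands in a log you can count afterwards, not in your memory of how the launch felt.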

2. Drew Houston — Dropbox (demo video on Hacker News)

Drew Houston filmed a 4-minute screencast of a product that didn't exist on April 5, 2007 and posted it to Hacker News with the title "My YC app: Dropbox — Throw away your USB drive". That post got Dropbox noticed and into Y Combinator, and Sequoia led a $1.2M seed shortly after. The same play run again, a second video tailored to the Digg audience and seeded there in 2008, took the beta waitlist from roughly 5,000 to 75,000 signups in a single day. Drew himself reposted the story on X in March 2026: "My (perhaps questionable) strategy for getting into @ycombinator: 1) Post a Dropbox demo on Hacker News 2) Pray @paulg @jesslivingston see it. It actually worked."

The video was a Wizard-of-Oz demo. The "syncing" footage was real, but the back-end at the time was held together with hand-built scripts and one demo machine. The validation wasn't "can we build this?" — Drew already knew he could. It was "do enough people care about file-sync as a problem to make it worth a multi-year build?" The waitlist answered that in 24 hours.

What to copy: if your product is visually demonstrable, a demo video on a high-signal channel is the cheapest way to find out whether anyone cares. The kill criterion was implicit: if the HN post had flopped, the product wasn't worth Drew's next year. It didn't, so he started.

3. Pieter Levels — Nomad List (public Google Sheet)

Pieter Levels' Nomad List origin story is one of the most-copied indie validation moves of the past decade. He made a public Google Sheet listing cities with cost-of-living and internet-speed data, posted it on Twitter, and watched what happened. The tweet got three retweets — modest. But strangers started adding cities and editing fields. The sheet went viral on Reddit and Hacker News within a week.

The validation wasn't downloads. It was contributions. People putting effort into a public artifact for free is a stronger signal than people clicking a button. He turned the spreadsheet into a website inside a month, and Automattic started buying $5K/month sponsorship slots.

Pieter has validated and killed over 70 ideas in his career, keeping the five that now generate >$3M/year combined. His framing: a high kill rate isn't failure, it's the cost of finding the surviving 10%.

What to copy: if your idea is data-shaped (lists, comparisons, directories), the spreadsheet test is free and self-selecting. People who edit a public sheet are people who'd pay for the polished version. Don't build the website until the sheet has 100+ unsolicited edits.

4. Foti Panagiotakopoulos — GrowthMentor (paid-ads smoke test)

Foti documented his validation test for GrowthMentor in unusual detail. The test ran in 2018 before he committed to a year-long build. Artifact: a one-page pitch for a mentorship-marketplace concept. Channel: Google Ads, 14 days. Spend: €418. Result: 16.89% landing-page conversion rate, €0.94 CPC.

The honest caveat is in his own write-up: "since there was no paywall in front, it did not prove that users were willing to pay for our service." The smoke test answered demand-for-the-promise; it didn't answer demand-for-the-price. He treated the email captures as a starting point, not an end point — followed up with calls, then ran a second pricing test, then started building. GrowthMentor is now an established marketplace with thousands of mentors.

What to copy: the €418 / 14-day frame is a reasonable floor for a paid-ads validation test. Below €150 is too thin to read; above €600 is scaling, not validating. And: a high CVR on email capture doesn't validate willingness to pay. Stack a pricing test on top.
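
For readers who want the arithmetic spelled out, here is the same readout as a back-of-envelope sketch. The spend, CPC and CVR are Foti's published figures; the conversion count is back-derived from them rather than quoted from his write-up, and the 5% kill threshold is an assumption you would pre-commit yourself:

```ts
// Back-of-envelope sizing for a paid-ads smoke test. Spend, CPC and CVR
// match Foti's published figures; the 75 conversions are back-derived from
// those figures, and the 5% kill threshold is an assumed pre-commitment.
function smokeTestReadout(spendEur: number, cpcEur: number, conversions: number) {
  const clicks = spendEur / cpcEur;   // paid visitors landing on the page
  const cvr = conversions / clicks;   // email-capture conversion rate
  return { clicks: Math.round(clicks), cvr };
}

const KILL_THRESHOLD_CVR = 0.05;      // assumed: kill the idea below 5% email CVR

const readout = smokeTestReadout(418, 0.94, 75); // ~445 clicks, ~16.9% CVR
console.log(readout, readout.cvr >= KILL_THRESHOLD_CVR ? "keep going" : "kill it");
```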

5. Justin Welsh — solopreneur course pre-sale (10 minutes, $40.93)

Justin Welsh wrote up a single-paragraph case in his newsletter in 2024: he spent $40.93 on paid promotion and 10 minutes of work to generate $7,500 in pre-sales for a product that didn't exist yet. The audience was his existing 500K+ LinkedIn following plus newsletter list, not cold traffic.

The structure: a sharply-priced offer with a deadline, an email to the list, a paid boost, a Stripe Payment Link as the entire purchase flow. Buyers paid before the product was finished. The pre-sale revenue funded the build, the buyers became the beta cohort, and the launch had real customers on day one.

We treat the exact $40.93 → $7,500 ratio as illustrative — it's a single self-reported data point on a warm audience. The ratio doesn't transfer to a cold launch. What transfers is the structure: warm list + sharp offer + deadline + Stripe Payment Link is the highest signal-per-euro test in the validation toolbox when you have an audience.

What to copy: if you have any list at all (1,000+ engaged subscribers), pre-sell the product before you build it. Stripe Payment Link is a 12-minute setup. Refund cleanly if you don't ship. The pre-sale is both validation and capital.
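
The Payment Link setup really is small. A hedged sketch with Stripe's Node SDK, where the product name, the $150 price and the delivery promise are placeholders rather than Justin's actual offer:

```ts
// Hedged sketch of a pre-sale checkout using Stripe's Payment Links API
// (Node SDK). The product name, the $150 price and the delivery promise
// are placeholders, not Justin's actual offer.
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

async function createPresaleLink(): Promise<string> {
  // One-off price created inline with its product
  const price = await stripe.prices.create({
    currency: "usd",
    unit_amount: 15000, // $150.00, placeholder pre-sale price
    product_data: { name: "Course pre-sale (ships in March)" },
  });

  // Shareable checkout URL: this link is the entire purchase flow
  const link = await stripe.paymentLinks.create({
    line_items: [{ price: price.id, quantity: 1 }],
  });

  return link.url; // paste into the email to your list
}

createPresaleLink().then(console.log);
```

The returned URL is the whole purchase flow; if you don't ship, refunds go out through the same Stripe dashboard.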

6. Rob Walling — Drip (17 calls, 11 yeses, then a landing page)

Rob Walling validated Drip — the email-marketing tool he later sold to Leadpages for a life-changing exit — by calling 17 founder friends before writing a line of code. He pitched the concept (an email capture widget with built-in lead nurturing) and asked the same question every time: "would you pay for this?" Eleven said yes.

Combined with his own need for the product, that was enough. He put up a landing page at getdrip.com, A/B-tested several value propositions, kept the winner, and drove traffic via blogging, podcasts, and Facebook ads. By launch day he had 3,400 email addresses on the list. First paying customers came roughly six months later — Rob gave Drip away free to the early beta cohort, then converted them to paid.

The 17-calls-11-yeses move is the often-skipped step. It's tiny, slow, and unglamorous next to a paid-ad campaign. But it's the highest-density signal: you hear the language prospects use to describe the problem, which becomes the headline copy that converts later.

What to copy: before any landing page, call 15–20 people in your ICP. Not to pitch — to ask whether the problem is real and whether they'd pay if it were solved. If you can't get 10 yeses out of 20 conversations, the offer needs work, not the page.

7. Damon Chen — Testimonial.to (Product Hunt + lifetime deal)

Damon Chen's Testimonial.to journey is the cleanest example we've seen of validation-via-lifetime-deal. Testimonial was his fifth product after four failed attempts. He launched on Product Hunt in December 2020 with a single offer: $199 lifetime access. Result: 30 deals, $6K in revenue in the first two weeks. He shared the numbers in an Indie Hackers AMA.

The lifetime deal does two jobs at once: it gets real money before you've fully built (validation), and it bootstraps a small audience that becomes the user-research pool (depth). Damon used the early customer feedback to add the features that mattered, ignored the rest, and grew the product to $100K ARR in 9 months as a solo founder, then to ~$840K ARR with one hire. Building in public on Twitter drove 80–90% of his subsequent customers.

The trade-off worth flagging: lifetime deals cap recurring revenue. Damon priced his at $199 — high enough to be meaningful signal, low enough to convert. Any LTD below $50 is mostly noise; above $300 starts to filter for serious buyers.

What to copy: if you can ship a usable v1, a Product Hunt launch with a $99–$199 lifetime deal is one of the highest-signal opening moves available. Cap the LTD slot count (50 or 100 max) so you don't poison the long-term subscription business.

8. Marc Lou — ShipFast (audience-first, then 48h product launch)

Marc Lou's ShipFast launch on September 1, 2023 generated $6,000 in 48 hours and $40K by end of month one. The boilerplate is now reportedly $133K/month. The validation work didn't happen at launch — it happened in the two years before, while Marc was building 27 small products in public on Twitter and growing his following from ~1,000 to 40,000 indie hackers and solopreneurs.

When he launched ShipFast, the audience was already there. The "validation" was the consistent traffic and engagement from his Twitter audience, which told him which problem to solve next. He'd watched dozens of his own followers spend weekends on auth + payments + DNS plumbing. The boilerplate solved the problem he'd seen them complain about repeatedly.

We're including this story with a caveat: ShipFast is technically a code product, not a SaaS. But the validation pattern is canonical for any audience-led launch. The audience IS the test. If 40K people who care about the problem are watching when you tweet "I'm shipping the thing that fixes the plumbing," your conversion math looks different than a cold launch.

What to copy: if you have years to play the long game, an audience-first strategy makes every subsequent product launch trivial to validate. If you don't, this isn't your move — you need a paid-traffic test instead.

9. Tony Dinh — BlackMagic.so (Twitter audience + exclusive launch)

Tony Dinh's BlackMagic.so started in 2021 as a tiny script that animated a progress bar around his Twitter profile picture to celebrate his 1,000-follower milestone. The script went mildly viral. He wrapped it as a product, then evolved it into a Twitter analytics + CRM SaaS for creators.

The validation was the launch structure: he gave early access to 1,000 hand-picked Twitter followers before going public. That cohort became the feedback loop, the feature-request engine, and the social-proof pipeline. From there, BlackMagic grew to $300 MRR in 3 months, then to $14K MRR / $168K ARR. He sold it for $128K in 2023.

The pattern is closer to story 8 than story 1: validate against a known audience first. The difference from Marc Lou is scale — Tony had 1K followers when he started, not 40K. The number that matters isn't the size of the audience, it's how cleanly the audience maps to the buyer. 1,000 Twitter creators are the exact people who'd pay for Twitter analytics.

What to copy: an exclusive-access pre-launch on a small but well-mapped audience beats a public launch on a generic channel. The first 100 users are the validation. Don't open the gates until they're nodding.

10. Arvid Kahl — FeedbackPanda (lurk, observe, build)

Arvid Kahl's FeedbackPanda story is the don't-pitch-anyone version of validation. His co-founder Danielle was an online English teacher for ESL students. They didn't survey. They didn't run ads. Arvid joined the Facebook groups where online teachers complained about their workflows, and lurked.

What he saw: teachers spent hours per week writing feedback comments to students. Almost nobody automated it. The pain was named explicitly in the groups, repeatedly. He built FeedbackPanda — a tool that let teachers compose feedback from reusable templates — and showed it inside the same groups where the complaints lived. The first paying customers came within days. They scaled to $55K MRR in two years with two people, then sold to SureSwift Capital in 2019 for a seven-figure sum.

The validation was the lurking. He read months of complaints before he wrote a feature spec. By the time he shipped, the spec was the complaint list, inverted.

What to copy: if your buyer hangs out in a forum (Facebook groups, niche Discords, Slack communities, subreddits), the cheapest validation is to lurk for 30+ days before you build anything. Take notes on the exact phrasing of complaints. The phrasing becomes your headline copy. The repeated complaints become your feature list.
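
If the lurking notes live as plain text, even a crude frequency count surfaces the repeated complaints. A toy sketch, using hypothetical phrases rather than quotes from the groups Arvid read:

```ts
// Toy sketch: turn plain-text lurking notes into a ranked list of repeated
// words. The phrases are hypothetical examples, not quotes from the groups
// Arvid read.
const notes: string[] = [
  "spent three hours writing feedback again tonight",
  "copy-pasting the same feedback for every student",
  "writing feedback takes longer than the actual teaching",
];

const counts = new Map<string, number>();
for (const note of notes) {
  for (const word of note.toLowerCase().match(/[a-z]+/g) ?? []) {
    counts.set(word, (counts.get(word) ?? 0) + 1);
  }
}

// The most repeated words point at the complaint to invert into a feature list
console.log(Array.from(counts.entries()).sort((a, b) => b[1] - a[1]).slice(0, 5));
```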

What all ten share

A few patterns hold across every one of these stories.

Each founder picked a tight buyer. Not "marketers", not "developers" — online ESL teachers, indie hackers building Next.js apps, solopreneur creators on Twitter, bootstrapped founders who needed email capture with nurturing built in. The tighter the ICP, the more legible the test result.

Each test produced auditable signal. Joel's pricing-page clicks. Drew's waitlist. Foti's Google Ads dashboard. Rob's 17 phone calls. Damon's $199 lifetime-deal Stripe charges. Auditable means: a third party could check the number. In most cases the signal also came from strangers; where it came from a warm audience (Rob's calls, Justin's list), the founders treated it as a starting point rather than proof.

Each had a kill criterion (even if implicit). Drew's was "if HN flops, the product isn't worth a year." Foti's was the email-capture CVR threshold. Rob's was 11 yeses out of 17 calls. The founders who killed ideas mostly aren't in a list like this: they killed early and never wrote it up. Pieter's 70+ killed ideas are the iceberg; the five survivors are what we read about.

Almost all of them stacked validation methods. Foti ran a smoke test, then a pricing test. Joel ran a pitch page, then a pricing page. Damon ran a Product Hunt launch, then iterated on customer feedback. The test isn't a single artifact — it's a sequence with thresholds at each step.
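
One way to make "a sequence with thresholds at each step" concrete is to write the plan down as data before running anything. The steps and numbers below are illustrative, not any of these founders' actual plans:

```ts
// Illustrative sketch of a stacked validation plan: each step has a
// pre-committed threshold, and the sequence stops at the first miss.
// The steps and numbers are examples, not any founder's actual plan.
type Step = { name: string; metric: string; threshold: number; observed?: number };

const plan: Step[] = [
  { name: "problem calls", metric: "yeses out of 20 calls", threshold: 10 },
  { name: "smoke test",    metric: "email-capture CVR",     threshold: 0.05 },
  { name: "pricing test",  metric: "plan-click CVR",        threshold: 0.02 },
  { name: "pre-sale",      metric: "paid orders",           threshold: 5 },
];

function nextCall(steps: Step[]): string {
  for (const step of steps) {
    if (step.observed === undefined) return `run "${step.name}" next`;
    if (step.observed < step.threshold) return `kill: "${step.name}" missed its threshold`;
  }
  return "all thresholds cleared: build it";
}

console.log(nextCall(plan)); // -> run "problem calls" next
```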

Where these patterns hold and where they break

The 10 stories above are heavily skewed toward indie / solo / small-team SaaS in tech-adjacent verticals. The validation patterns hold cleanly there. They get murky elsewhere.

They hold for B2C SaaS — Buffer, BlackMagic, even Drew's Dropbox at the consumer end of the spectrum. Paid-ads tests work, pre-sales work, audience-led launches work.

They hold for indie B2B SaaS — Drip, GrowthMentor, FeedbackPanda. The 17-calls move and the lurk-the-forum move are particularly strong here.

They get murky for deep tech. None of these tests would validate a quantum computing startup, a new battery chemistry, or an FDA-track diagnostic. The validation evidence in deep tech is the technical milestone — a benchmark, a working prototype, a published result. A landing page with "early access — pay $99" is not the right artifact when the risk is technical.

They get murky for true network-effect products. A two-sided marketplace can't be validated by a Stripe Payment Link from one side. Pieter Levels' Nomad List worked partly because it wasn't really a marketplace at v1 — it was a directory. Validation for network-effect products usually means picking the harder side and concierge-serving them until the loop is real.

They get murky for regulated verticals — banking, health tech, legal — where a high CVR doesn't unlock the actual constraint (a license, a SOC 2 audit, an FDA clearance). The validation methodology has to fit the dominant risk.

For mainstream SaaS, the 10-story playbook above is a reliable opening. For anything outside it, treat the patterns as inspiration, not template.

The single move that separated the survivors

Every surviving story above required something costly from the person on the other side of the test. A Stripe charge ($199 LTD). A booked phone call (Rob's 17). A public spreadsheet edit (Pieter's nomads). A waitlist signup with email and name (Drew's 75K). That cost to the respondent is what filters curiosity from intent.

The killed ideas — the 70+ Pieter dropped, the 4 Damon abandoned before Testimonial.to, the dozens of indie projects in the Indie Hackers post-mortem archive — failed in part because the validation didn't ask anything painful from the person responding. Email signups on a free waitlist with no price visible are noise. Discord upvotes from people who know the founder are noise. Product Hunt curiosity traffic is noise. Each can be validation if the funnel converts further down — but on its own, none of them is signal.

We've written about this in "validate without an MVP", "the cheapest way to validate a product idea", and "when is a startup idea actually validated". The principle is the same across all three: pick the artifact that produces the cheapest honest signal for your risk profile. Honest = costly to send. Cheap = your time and money to set up the test.

How LemonPage fits

We built LemonPage for the validation-page slot of the test stack: the page, the paid-traffic launch (Reddit / Meta / Google), the conversion measurement, in one workflow. The reason we shipped it as one tool: the math of pre-MVP validation only holds when running tests stays cheap. Spending four hours wiring Carrd + a Stripe Payment Link + a Meta pixel + analytics for every test compounds into "I'll just build it" within a month.

LemonPage doesn't replace the lurking, the calls, the audience-building, or the Wizard-of-Oz. It replaces the four hours of plumbing per test so the cycle stays fast. Pair it with whichever of the 10 patterns above fits your risk profile.

Recap

Ten founders. Ten different artifacts. One recurring pattern: each test required something costly from a stranger, each had an implicit or explicit kill criterion, and each surfaced enough signal in days-to-weeks to make the next call. The survivors didn't validate harder — they validated honestly, with auditable signal, on a tight buyer they'd already named.

Pick the pattern that fits your risk. Run it for at least 10 days, no more than 30. Pre-commit the kill number. Decide.

FAQ

How did successful founders actually validate their SaaS idea?

Across the ten stories, the survivors did one thing in common: they put their offer in front of people who had to do something costly to respond, whether that was paying, pre-ordering, editing a public artifact, or taking a scheduled call. In most cases those people were strangers with no relationship to the founder. Where the audience was warm (Rob's calls, Justin's list), the founders treated the result as a starting point rather than proof, because friends and warm contacts tend to produce flattering, misleading signal.

What's the most common SaaS validation method that actually works?

A landing page or pre-sale page with a real price visible was the most common artifact: six of the ten used one somewhere in their sequence. The rest leaned on founder calls, lifetime-deal Product Hunt launches, audience-led launches, demo videos, public spreadsheets, or community lurking. Most of the ten put a paid action somewhere in the funnel before committing to the full build, even when it took only minutes to set up.

Do these patterns hold for B2C SaaS?

Yes — Buffer, BlackMagic, and the consumer end of Dropbox all fit the patterns cleanly. The methods that work best in B2C are paid-ads smoke tests, pre-sales on a warm list, and demo videos on high-signal channels (HN, niche subreddits, YouTube). Cold outreach works less well in B2C than in B2B.

Do these patterns hold for deep tech?

Mostly not. A landing page with "early access for €99" doesn't validate a quantum-computing startup or an FDA-track diagnostic. The validation evidence in deep tech is a technical milestone — a benchmark, a working prototype, a published result. The patterns above are tuned to demand-and-pricing risk, dominant in mainstream SaaS but not in deep tech.

How much should you spend on a SaaS validation test in 2026?

Across the stories with explicit numbers, the range ran from €0 (Rob's phone calls, Arvid's lurking) to a few hundred euros (Foti's €418 Google Ads test over 14 days). As a rule of thumb: spending under €100–€150 usually doesn't produce enough volume to read, and spending over €600 is scaling, not validating.
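
To see why the low end is hard to read, here is a rough normal-approximation interval on an observed conversion rate; the click and signup counts are illustrative:

```ts
// Rough 95% interval on an observed conversion rate (normal approximation),
// to show why very small ad spends are hard to read. Counts are illustrative.
function cvrInterval(conversions: number, clicks: number) {
  const p = conversions / clicks;
  const margin = 1.96 * Math.sqrt((p * (1 - p)) / clicks);
  return { cvr: p, low: Math.max(0, p - margin), high: p + margin };
}

// €80 at ~€1 CPC -> 80 clicks, 6 signups: roughly 2%–13%, unreadable
console.log(cvrInterval(6, 80));
// €400 at ~€1 CPC -> 400 clicks, 30 signups: roughly 5%–10%, readable
console.log(cvrInterval(30, 400));
```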

How long should SaaS validation take?

Most of the founders converged on 10–21 days. Less than 7 days produces noisy data; more than 30 days usually means you're paying ad-platform learning costs without gaining new signal. Foti's 14-day Google Ads test and Damon's two-week lifetime-deal launch sit squarely in that range; Joel's Buffer test ran longer, at 7 weeks, but he spent that time talking to signups rather than waiting on ad data. The 14-day mark is the sweet spot.