When I decided to build a startup in the veterinary industry, I was attracted by what seemed like a perfect opportunity: a growing industry with momentum yet relatively little competition (most VCs and ambitious founders weren't looking there), and low barriers to entry where I believed I could outperform incumbents.
After some initial market research, I anchored on an idea - a last-minute urgent-care booking marketplace - believing I could start somewhere and gradually refine it through customer conversations. My conviction was that, given enough time, I could iterate my way to something that worked.
Recently, I’ve decided to close out this chapter. As I reflect on this journey, I've realized there are three humbling lessons worth sharing.
#1 Speed of Learning is a Key Indicator of Early Success
When I first started, I was learning quickly - devouring forums, blogs, industry reports, case studies, anything I could find online. Information was everywhere, and I absorbed it all. Then I hit a wall. I needed to go deeper: have real conversations, get inside clinics, observe the day-to-day.
A few scattered conversations and site visits revealed no meaningful patterns. Worse, getting those conversations was like pulling teeth.
In hindsight, my mistake was thinking that learning is a solo pursuit: I had a false belief that because I'm a fast learner, I could understand any industry to 80% quickly. But there is a difference between understanding an industry and building within it.
Learning to build a company means learning from customers. In my case: veterinary clinic owners, practice managers, and veterinarians. Most of them didn't want to talk to me - because I wasn't from the industry. So yes, I can learn fast. But the problem is that the industry doesn't want to teach me.
The real insight: speed of learning isn't about a founder's ability to absorb knowledge. It's a proxy for founder-market fit. Are you someone insiders naturally open up to? Do they trust you with their real problems, not just polite surface-level complaints? Will they introduce you to others, creating a network effect of insights?
So my takeaway is that if a founder is not learning quickly, it likely means that they are not positioned to have an edge in that industry. Without high-velocity, high-quality conversations, you can't find deep insights. Without deep insights, you can't iterate meaningfully. And without rapid iteration, early success becomes nearly impossible.
#2 The Building Trap: When Execution Becomes Procrastination
The conventional startup wisdom says "validate first, build later". The reality is much messier because there is usually no good way of knowing when to build vs when to keep talking.
At some point, my conversations hit diminishing returns. Each new interview felt redundant, generating the same surface-level insights. This is when customer research gets genuinely hard: Have I reached the bottom of the problem, or am I digging in the wrong place entirely?
That's when I turned to execution - I built a landing page and tried showing it around. But I quickly got the impression that people in the industry wouldn't take the product (or me) seriously unless I had something working. I convinced myself that this industry was different. That unlike SaaS customers who might buy based on demos, veterinary professionals and key decision makers needed to see a working product to believe in it.
The insidious thing about building is that it feels like progress. You enter flow state, ship features, solve technical challenges. It's measurable momentum in an otherwise ambiguous journey. When customer conversations felt difficult, retreating to my building cocoon provided comfort and a sense of control. The productive bubble burst when I finally emerged with my "better" product, only to discover the fundamental assumption was wrong.
The real lesson isn't just "validate before building" - it's recognizing when building becomes a sophisticated form of procrastination. Sometimes customers do need to see something tangible, but often "show me a product" doesn't really mean they want to see it. It could mean "I don't trust you yet" or "I'm just being polite". That's the ultimate Mom Test: can my user interviews be so good that they pull the hard truth out?
The balance I wish I'd struck: build the absolute minimum needed to have meaningful conversations, then resist every urge to polish until those conversations pull the product/problem out of you. Stay in the discomfort of ambiguity longer than it feels natural.
#3 Problem Discovery: Is it Creative, or Scientific?
Earlier in my journey, I came across a successful B2B founder who described his approach as "deep learning algorithms for humans": run massive numbers of user interviews, optimize your search across adjacent problem spaces, and iterate until you find a node with high demand and strong willingness to pay.
I was inspired by this framework - it felt elegant and systematic. So I launched into my own algorithmic problem discovery journey. I built interview scripts, created conversation frameworks, tracked pain points across spreadsheets. I treated each conversation as a data point and each insight as a parameter to optimize.
Despite building a sophisticated system, I wasn't extracting much insight. I treated it like debugging - if I wasn't finding problems, I needed better algorithms. But here is the meta challenge: when the systematic approach isn't working, how do you know whether it's a data problem or an algorithm problem?
Bad data looks like: surface-level answers, people being polite but not revealing real pain, conversations that feel scripted. The algorithm is fine but I'm processing bad inputs.
Bad algorithm looks like: asking leading questions, confirmation bias, forcing patterns where none exist. The data sources are good, but the processing is flawed.
The tricky part is that both can feel identical from the inside. In both cases, you're not finding compelling problems.
I eventually realized I was running a scientific approach in an industry that required a creative one. People don't open up on command. Especially in B2B, they don't trust strangers enough to tell you what really hurts - not until you've earned it. Build trust and relationships first, observe what people actually do, and let the real insights emerge from peripheral vision rather than direct focus. The paradox is that peripheral vision requires you to stop looking so hard.
What Would Charlie Munger Say?
I've always admired Charlie Munger and his mental models. I like to ask myself: what would Charlie Munger say about my situation?
My guess this time:
"You have to figure out what your own aptitudes are. If you play games where other people have the aptitudes and you don't, you're going to lose. And that's as close to certain as any prediction that you can make. You have to figure out where you've got an edge. And you've got to play within your own circle of competence."