If You Know You Can Sell It, Just Build It.

Show Notes

The product testing space is an important one, just ask any software engineer. But it’s also a complicated space, especially with the current economic situation in the tech world. These circumstances have led to a lot of challenges but also a lot of learnings for ProdPerfect, which is trying to help its clients improve timeframes, budgets, and priorities when it comes to testing web applications.

ProdPerfect co-founder and CEO Dan Widing joined startup coach Roland Siebelink this week on the latest episode of the Midstage Startup Momentum Podcast. The two talked about what has inspired ProdPerfect’s work and how the company has handled a host of challenges.

  • Why ProdPerfect was remote-first even before the pandemic hit.
  • The biggest challenge that lies ahead for companies that continue to be remote-first.
  • Whether it’s more important to build a product or prove you can sell it.
  • Why ProdPerfect was a services company early and how it developed from there.
  • What ProdPerfect is doing to adjust to the changing economic climate for startups.
  • What startups can do when customers outgrow their need for their product.
  • The techniques ProdPerfect uses to maintain a sense of transparency with its employees.

Transcript

Roland Siebelink:Hello and welcome to the Midstage Startup Momentum Podcast. I'm your host and scale up coach and advisor Roland Siebelink. And today I have with me one of the foremost CEOs working in the space of product testing. It's Dan Widing, the CEO and co-founder of ProdPerfect. Hello, Dan, how are you?

Dan Widing:Hi, I'm doing very well, Roland. And thank you for the very kind intro.

Roland Siebelink:Of course. It's an honor for us to have you on the show. Dan, we always just get started with the quick elevator pitch. What does ProdPerfect do? Who does it serve? And how does it make a difference in the world?

Dan Widing:All right, let's do the simple sentence first. We have a product that automatically tests web applications for quality assurance. That's the simplest way to put it for anyone in the space to know roughly what we do. After that, I usually ask, "Okay, who am I talking to next?" Am I talking to somebody that I wanna sell to? In which case I wanna make sure they know what CI is - continuous integration. Do they know agile methodology? And do they know the spectrum of testing, the test pyramid, or end-to-end tests in general? If they don't know those three concepts, we're probably not gonna have much of a conversation about what we do. We sell to high-performance software engineering teams, ideally organizations with more than $50 million in revenue. It's a subset of folks out there.

Roland Siebelink:That's very interesting. I was just gonna ask you, who's your core buyer, your core target audience? And I like that you have that qualifier on it: high-performance software engineering team. Do you just define that by the revenue of the company or are there some other markers that can tell you right away if they're a good prospect for you?

Dan Widing:If they know those three concepts and know really what they mean, we're off to the races.

Roland Siebelink:CI, agile methods, and the testing pyramid, perfect. Of course, I'm gonna have some listeners who are not necessarily familiar with all these three concepts and would still like to be counted as a high-performance software engineer. In a nutshell - and definitely in a nutshell - would you like to take us through all three concepts very quickly, please?

Dan Widing:I suppose the base concept to start with is agile methodology. Instead of setting off from the very beginning on a big project with a huge set of requirements that runs for months - where at the end you think you've got something, then you spend another month testing it, invariably as much time testing as developing, and only then finally release to customers - the idea is to work iteratively. Two decades ago, the Agile Manifesto was published, and the concept became very popular of releasing small chunks with few requirements and seeing how they do out in the wild. Continuous integration came up around this. It's a set of tools and practices where you can build those small chunks of code, quickly test them component-wise, build out those components, and better deliver them to a customer. When people say CI nowadays, they typically think of a CI server, which is like a build server. It'll take a pull request or changeset and run a set of tools around it, particularly for testing, potentially triggering a deploy to a software environment.
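
For readers who haven't worked with a CI server, the loop Dan describes can be pictured with a toy sketch: for every pull request or changeset, run each stage of the toolchain and only deploy if everything passes. This is an illustration of the concept, not any particular CI product; the stage commands are hypothetical placeholders.

```python
# Toy illustration of what a CI server does for each pull request or changeset.
# The stage commands are placeholders; real CI systems (Jenkins, GitHub Actions,
# CircleCI, etc.) express this as pipeline configuration rather than a script.

import subprocess

STAGES = [
    ("build", ["make", "build"]),
    ("unit tests", ["pytest", "-m", "unit"]),
    ("integration tests", ["pytest", "-m", "integration"]),
    ("end-to-end tests", ["pytest", "-m", "e2e"]),
]


def run_pipeline(changeset: str) -> bool:
    """Run every stage against a changeset; stop at the first failure."""
    print(f"Running CI pipeline for {changeset}")
    for name, command in STAGES:
        print(f"--> {name}")
        if subprocess.run(command).returncode != 0:
            print(f"{name} failed; not deploying.")
            return False
    # All stages green: the pipeline could now trigger a deploy to an environment.
    print("All stages passed; triggering deploy.")
    return True


if __name__ == "__main__":
    run_pipeline("pull-request-42")  # hypothetical changeset identifier
```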

Roland Siebelink:And then the testing pyramid is where things get interesting, right?

Dan Widing:Oh yeah. Yeah. In testing, the simplest way to put best practices is that you want a base layer of unit tests - as many as possible - testing components of code as quickly as possible. You want a middle layer of integration tests that test multiple components at a time, like an API: if you send it a proper request, does it send a proper response? And then a very small tier of end-to-end tests.
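
To make the pyramid concrete, here is a minimal sketch of the three tiers using pytest. It is an illustration under assumptions, not ProdPerfect's code: the application function, API endpoint, and browser selectors are all hypothetical, and the end-to-end test assumes the pytest-playwright plugin for its `page` fixture (the `integration` and `e2e` markers would be registered in pytest.ini).

```python
# A minimal sketch of the three tiers of the test pyramid, using pytest.
# The application functions, URLs, and selectors are hypothetical placeholders.

import pytest
import requests  # used for the integration-level API check


# --- Unit test: one small, fast check of a single component ---
def add_tax(subtotal: float, rate: float = 0.08) -> float:
    """Hypothetical pure function under test."""
    return round(subtotal * (1 + rate), 2)


def test_add_tax_unit():
    assert add_tax(100.00) == 108.00


# --- Integration test: if you send the API a proper request, does it respond properly? ---
@pytest.mark.integration
def test_orders_api_returns_ok():
    # Hypothetical endpoint; a real suite would point at a test environment.
    response = requests.get("https://staging.example.com/api/orders/123")
    assert response.status_code == 200
    assert "order_id" in response.json()


# --- End-to-end test: a full user flow through the browser (kept to a small tier) ---
@pytest.mark.e2e
def test_checkout_flow(page):
    # `page` assumes the pytest-playwright plugin; the selectors are illustrative.
    page.goto("https://staging.example.com")
    page.click("text=Add to cart")
    page.click("text=Checkout")
    assert page.inner_text("#confirmation") == "Thank you for your order"
```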

Roland Siebelink:And that's still, in my experience, something where not all software engineering teams are quite living up to that promise. There's still a lot of them out there that do manual testing of that functionality for the user.

Dan Widing:Yeah. We're getting into the real core of it now. Thank you for triggering me. Almost no one does. There's a lot of reasons why that is. But when you get back to it, some amount of manual testing certainly makes sense when you're building. If you're testing something for the first time, it doesn't make sense to automate testing of it, because no one's ever been through it before. And if it didn't work the first time, you wanna change it, so all the time you spent automating it was wasted. Ideally, you really wanna automate things based on the relationship between how heavily you want to test them, how frequently, and how important they are to the end user. We recommend - even if you weren't talking to us - that you think of building an end-to-end test suite that can run within 10 to 20 minutes. That's just your time budget. If you build something that takes more than that, undo it or cut something else out.
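
That 10-to-20-minute budget can be enforced mechanically. Below is a minimal sketch of a wrapper that times the end-to-end tier and fails the build when the suite blows its budget; the `pytest -m e2e` command and the 20-minute limit are illustrative assumptions, not ProdPerfect tooling.

```python
# Minimal sketch of enforcing a wall-clock budget on an end-to-end suite.
# The command and the 20-minute limit are illustrative; adjust for your CI.

import subprocess
import sys
import time

BUDGET_SECONDS = 20 * 60  # the "10 to 20 minute" budget discussed above

start = time.monotonic()
result = subprocess.run(["pytest", "-m", "e2e"])  # run only the end-to-end tier
elapsed = time.monotonic() - start

if result.returncode != 0:
    sys.exit(result.returncode)  # tests failed: fail the build

if elapsed > BUDGET_SECONDS:
    print(f"End-to-end suite took {elapsed/60:.1f} min, over the {BUDGET_SECONDS/60:.0f} min budget.")
    print("Cut or parallelize tests before adding new ones.")
    sys.exit(1)

print(f"Suite finished in {elapsed/60:.1f} min, within budget.")
```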

Roland Siebelink:And the reason behind that is that if the test takes too long, then engineers will simply not run it.

Dan Widing:Exactly. If it takes too long, people cut it from CI. Anything that takes longer than an hour or so for the total toolchain, engineers walk away, shut their computer off, and come back the next day, at which point communication gets thrown off. It takes so much longer. If you have something that runs between 10 and 30 minutes for a full toolchain, you can get a response where the developer doesn't change context and lose all that valuable time.

Roland Siebelink:Very cool. That sketches the context really well, so what is the particular thing that ProdPerfect adds to that context? The thing that really makes it special to be working with ProdPerfect?

Dan Widing:The idea of a budget for testing is one that relatively few people actually end up applying. One of the really hard parts is to figure out what should be within it - what should be tested as part of the budget and what shouldn't be. The core driver of ProdPerfect is that we can figure out how important something is to test because we ask our customers: part of our product is an analytics recording tool we install that continuously sends us data about live user behavior. We can mine that for both the patterns and the intensity of traffic. The patterns give us the test cases; the intensity tells us how important they are.
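
The pattern-and-intensity idea can be pictured with a toy example. The sketch below is not ProdPerfect's actual pipeline; it simply counts how often each recorded user path occurs, so the most-travelled flows surface as end-to-end test candidates. The session data is made up.

```python
# Toy sketch of mining recorded user sessions for test-case candidates.
# Each session is the ordered list of pages a real user visited; the data is made up.

from collections import Counter

sessions = [
    ["/login", "/dashboard", "/reports"],
    ["/login", "/dashboard", "/settings"],
    ["/login", "/dashboard", "/reports"],
    ["/signup", "/onboarding"],
    ["/login", "/dashboard", "/reports"],
]

# "Pattern" = the path a user took; "intensity" = how often it shows up in live traffic.
pattern_counts = Counter(tuple(path) for path in sessions)

print("End-to-end test candidates, highest-traffic first:")
for path, hits in pattern_counts.most_common():
    print(f"{hits:>3}x  " + " -> ".join(path))
```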

Roland Siebelink:I think you mentioned in the chat before that you originally started the company in Boston, then went through an accelerator in New York City, and are now a remote-first company headquartered in the Bay Area. Is that right? Talk to me about that journey a little bit. Why all the geographic changes in the course of the history of the company?

Dan Widing:It was also at my last company, WeSpire in Boston, where I saw how deeply Slack was changing things, particularly engineering culture. And if you wanna hold onto an engineer, saying you can't employ them anymore when they want to move just works against you. After we had the first couple of experiences where engineers we couldn't let go of moved and we still worked with them - and it worked fine cause everything we were doing was in Slack anyway - it's like, remote's gonna be the way of the future. It's just a matter of time.

Roland Siebelink:Was this, in fact, already the case before the pandemic hit?

Dan Widing:Oh yeah. We were totally remote-first. We had a big network in Boston. I'd say to start between a half and a third of our folks were in the Boston area more broadly.

Roland Siebelink:Okay. Very cool. Of course. I talk to a lot of companies that are figuring out whether to remain remote now, but the best ones are those that figured it out before the pandemic.

Dan Widing:Yeah. We could talk entirely about that for a while if you want. There's trouble brewing on the horizon because good managers are really hard to develop remotely. Engineers, you can develop them wherever. A really good manager - the kind of learning and the sense for how your employees are doing, what motivates them, how to excite them - that's really hard to pick up over a remote video call. It's so much easier to: one, have that communication in person, and two, pick it up from another experienced manager, seeing it in person.

Roland Siebelink:That's, I think, the big worry. Will people still pick up all the social cues and the socialization in general of being in the workplace? How do they get their coaching? How do they get their cues on how to become a better employee? Very good. But I also wanted to hear how the company has been developing. What has the history been since you founded it, and what have been the big growth points that really started accelerating the company?

Dan Widing:We had this dream of jumping to the automated solution very quickly. To do that, we first had to test out, "Hey, can we sell this product?" To do so, we did a lot of man-behind-the-curtain work. The journey of other pre-seed-stage companies is: can you sell anything at all? Can you sell a product? Okay, maybe you're gonna become a startup. After that, can you sell a product with product-market fit, and then can you scale it? These are oscillating points. For us, the first thing we developed was: can we get data from a customer that can inform what the testing should be? For our first few customers, it was literally me writing end-to-end tests informed by the data that we were seeing.

Roland Siebelink:Right. It was almost more of a services company at that stage if you think about it.

Dan Widing:Very much so. And the journey of the company has been to componentize each step of that and then automate it more and more and more.

Roland Siebelink:I agree with you that startups should focus first and foremost on what they can sell and how they will sell it. Yet, to many entrepreneurs, that is a little bit alien. Should we have the product first? Talk to us about your thinking around that and your experience.

Dan Widing:Ninety percent of these startups fail. When you look at it, a lot of folks say, "I've gotta have the product first and then sell." If you look around, most people fail at that approach, so I would take it with a grain of salt. If somebody can actually sell something effectively, they stay in business. There are many, many roads to this, so I don't begrudge anyone who goes into stealth mode, knows what they wanna build, builds it, then begins selling it. But if you have the least bit of uncertainty there, you really want - the methodology actually works. You want to try to close the loop on your riskiest assumptions and test and validate them as quickly as possible.

Roland Siebelink:Okay. I like that context a lot. In a business where the product itself is the biggest risk, you might wanna start with building that. But if the traction - the market demand - is the risk, then by all means start there.

Dan Widing:Exactly. If you already know you can sell the thing, just build the thing. If you don't know whether the thing is saleable yet, if you don't know whether you've got the tools to sell it, sell it.

Roland Siebelink:You already mentioned the oscillation during the pandemic and now also the renewed economic crisis in the startup world. How are you dealing with the new normal that is around all of us? That's typically the question I get most these days from listeners to this podcast.

Dan Widing:The mistake that we made: up until the beginning of the year, we were prizing growth far more than margin and profitability. In hindsight, I probably should have seen that coming - the anti-gravity that came from how much money got pumped into startups in 2020. I feel like we should have seen that coming, but I suppose we're not the only ones who got caught up in it. We previously sold more to Series A and Series B startups. They had a very similar setup, a similar stage in their growth, hadn't filled in all these gaps yet, and weren't so complex that it was really tricky to get in there or took a long time. Reliable.

Roland Siebelink:This was a way to shorten your sales cycle in a way.

Dan Widing:Exactly. Get reliable data on: can we sell to these guys? How do we sell to these guys? We're actually very comfortable selling to them at this point; it's a process. The downside, with the crunch now, is that a lot of them are struggling to find funding, so it's much, much harder to sell to them. And then the scale of growth means that even in good times, they grow into something much more complex or they die. Reliable initially; long term, we probably should have jumped a little bit earlier to figuring out how to handle the complex cases, because everybody grows into a complex case. If you eventually dump enough money into software engineering, you end up with some long-tail tech debt that is messed up in its own unique way that few folks know how to unravel.

Roland Siebelink:Are you saying that during your first years of learning how to sell this product, you found that use case of companies just getting started in this field that had enough interest - but it was a temporary window, where over time they develop so much complexity that they can handle a lot of these needs internally?

Dan Widing:Exactly. We got good at shipping what they needed initially. We got good at shipping, "Hey, I just built this project. I need to double my software engineering team. I have no idea how to also spin up an entire QA engineering team at the same time. ProdPerfect, fill the gap." For that, we’re great at filling the gap. Plug us in, we'll build tests. We'll get them in your CI system. You'll get reliable results from them. You're good to go.

Roland Siebelink:How did you then figure out that at some point in time your solution is no longer sufficient - that your customer outgrows your solution?

Dan Widing:We've got a way to scale up to more and more teams within a business by doing additional analyses, breaking down the data into deeper and deeper layers to find, "Okay, where is a particular pod of developers working," and build testing just around that. They've got a dedicated test suite that works just for them. For their CI and for their build tools, they can run this thing alone or run the suites in parallel, either/or. That was all by-hand work and not really part of the product. But it was tricky to tell customers this thing existed, make sure that we had it configured right, and ship it consistently to them. If we miss the boat - they just spin up a second pod and that pod isn't getting enough testing for the component they're working on - they start building their own stuff. At which point they're attached to their own stuff. We can't fill the gap anymore.
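
One way to picture the per-pod breakdown Dan describes - a sketch under assumptions, not how ProdPerfect actually implements it - is to tag each generated test with the area of the application it covers and let each pod's CI run only its own slice, alone or in parallel with the others. The pod names, ownership map, and test IDs below are hypothetical.

```python
# Sketch of slicing one shared end-to-end suite into per-pod suites.
# The pod names, ownership map, and test IDs are all hypothetical.

from collections import defaultdict

# Which pod owns which area of the application.
OWNERSHIP = {
    "checkout": "payments-pod",
    "search": "discovery-pod",
    "profile": "accounts-pod",
}

# Generated test cases, each tagged with the area it exercises.
TESTS = [
    {"id": "test_checkout_happy_path", "area": "checkout"},
    {"id": "test_search_filters", "area": "search"},
    {"id": "test_profile_update", "area": "profile"},
    {"id": "test_checkout_coupon", "area": "checkout"},
]

suites = defaultdict(list)
for test in TESTS:
    pod = OWNERSHIP.get(test["area"], "platform-pod")  # unowned areas fall back to a catch-all
    suites[pod].append(test["id"])

# Each pod's CI job would run only its own slice; the slices can also run in parallel.
for pod, tests in suites.items():
    print(f"{pod}: {', '.join(tests)}")
```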

Roland Siebelink:Okay. So every new pod that starts up is actually a risk of losing part of them and, over time, the entirety of that customer.

Dan Widing:Exactly. We'd catch them when they'd just set up their first one or two pods. But then when they grow two more pods: "Oh crap, we either have to have the total awareness to jump in and figure out how to test for them as well - rescope the engagement - or we potentially run the risk of them filling the gap themselves." And eventually, that pattern becomes the one that takes root across the entire organization. One of the incredible learnings: we started mapping out people's development processes, and almost no one is on the same pattern. There's a theme of trunk-based development that's common in these Series A, Series B startups that we've seen a lot of. It's very simple, very effective. But the older you get, the more it becomes a bastardized version of Gitflow, historically. Some of the truly high-performance teams stay in trunk-based development; they find ways to manage it. But a lot of them end up in this really complex Gitflow. The naming for what they call their environments, what tools get run, how long it takes to deploy them, what they test where in them - everything's bespoke.

Roland Siebelink:What's the root cause would you say for that chaos?

Dan Widing:Everything evolves organically. Life finds a way. Necessity is the mother of invention. When they see a huge problem, somebody fills in the gap - with a hack or a well-intentioned solution, it doesn't really matter. Each thing gets tailored to the circumstances you're in, and the more tailored you are to those circumstances, the more different you eventually become.

Roland Siebelink:What are you doing to get over that gap to move into this new normal? How do you keep your people on board with the changing vision with new ways of working? What have you figured out? What is not working for you yet? I'm just interested in hearing how founders deal with this new normal and the transition toward it.

Dan Widing:We've always been very transparent. Our financial model is actually an open spreadsheet within the company. Anyone can look at it and see how much money we have, how much money we're bringing in, and what we project. That's been a huge tool for us. And then we openly talk about it and have a state of the business every month, where we talk about the progress. We create an open space for Q&A about anything, open to what people are afraid of, because it's reasonable to be afraid of things in this. You just talk through: what's our strategy for handling this? What are our backup plans? Who are we addressing in terms of potential strategic partners if X doesn't work out? That's been our best mechanism to date - that openness and being willing to not just toe a party line, which is what burns people out very heavily. Or at least it has burnt me out heavily at other places I've worked, and it feels the same for everyone I've worked with or brought into the startup. They don't want that party line.

Roland Siebelink:Does it make you feel vulnerable when you do that?

Dan Widing:Horrendously, miserably. Cause there's parts of this where I don't know for certain what's gonna happen. I don't know what the funding market's gonna look like in one quarter, two quarters. I don't know what it'll take to really find a solid partner that can potentially share our marketing or that we might link up with to offer a suite of products. We're entrepreneurs, so we hustle; we believe in putting a lot of effort toward talking to these folks, figuring out what's working with them, seeing what resonates. But you can't say with any certainty that any of these shots is gonna be the one that does it. You just try to get as many shots as possible on goal. And by bulk, one of them has to work.

Roland Siebelink:Exactly. It's doubling down on the hustle and being convinced that if you keep trying, you'll find a way.

Dan Widing:Exactly. I've found that if you try to only double down on the things you're certain of, you lose the hand.

Roland Siebelink:But assuming things will work out, Dan, how big could this become? You already mentioned a gigantic market out there. Let's look with optimistic, rose-tinted glasses further into the future, maybe five to 10 years hence. How big do you see this growing?

Dan Widing:What an advisor pitched me on recently is that instead of being focused entirely on end-to-end testing at the moment, we should screw that and take on Copilot, take on the GitHub project. Cause it can only chatbot what other open-source projects have done. And that chatbot is great, hugely helpful for folks. But that data set only goes so far. If you can actually compare back to the observable, real-time behavior of real users using the application - that is truly the data set that product people mine to figure out what features are working and what aren't. It's a much better source for figuring out what to build. We're gonna have more than just Copilot in this space in five to 10 years. Software engineering as a whole shouldn't be something where it all just comes out of a single software engineer's head. It should be driven by much better tools that are aware of both how to write the code better and what its impact upon the world will be. Will it work? Will it actually delight the users at the end of the day?

Roland Siebelink:I love that vision. That's a really good one that I'm sure many other tool-set companies, as well as every engineer, will be really clamoring for. Dan, listeners to this podcast that are curious about ProdPerfect, what do you need help with? And also where can people find out more information? What should they download?

Dan Widing:Go visit prodperfect.com. There's a number of calls to action right there. You're welcome to just email me, [email protected]. Just email me. I'm happy to tell you whatever you want. If you wanna find me on Twitter, I'm @DanWiding. I don't use it particularly heavily, but I poke around there. If you run a software team and you want to actually get a real signal from the end-to-end testing your engineers use, if you wanna really automate testing, we'd love to see you, love to have you as part of our set of customers. In particular, I want to dive into - I wanna talk to a few more people on the front that my advisor hinted at yesterday, which is: if you have experience in using ML, in particular for extracting insights from code or observable information - errors, logs, usage patterns - I would love to speculate on some projects together, cause that idea is really exciting me at the moment.

Roland Siebelink:Okay. Really appreciate it. This was an amazing interview. Thank you so much, Dan. You've been very open and honest and even might I say vulnerable at some points in time, which I really appreciate. Thank you so much for all your time, Dan Widing, the CEO and co-founder of ProdPerfect. Thank you for joining the podcast.

Dan Widing:Thank you, Roland.

Roland Siebelink:Absolutely. And for our listeners, we will have another episode for you next week. Please keep tuning in.

Roland Siebelink talks all things tech startup and brings you interviews with tech cofounders across the world.