MVPs are not (always) Mini Versions of the Product
The Minimum Viable Product (MVP): never has a software development concept been more misused since teams started breaking their development work into two-week ‘agile sprints’, yet continued to integrate, test and ship big releases to customers every six months.
If your company and team like to think about themselves as agile or lean in any way, it now seems inevitable that the product development process will include ‘creating an MVP’.
But there’s so much confusion about this term, as its use has grown beyond its original intent, that you’d be forgiven for thinking it has already lost all meaning.
Many of us seem to use the term MVP to mean ‘the smallest possible cut down version of the final product vision we already have in mind.’ For example, the Wikipedia entry on MVPs includes this rather curious definition that even appears to go against other definitions on the same page:
A minimum viable product has just those core features sufficient to deploy the product, and no more.
But this misses a key point - that the product vision itself is a risky assumption. And a product with limited functionality may not be testing this in the cheapest way possible.
Eric Ries, author of The Lean Startup, gives a better definition:
the minimum viable product is that version of a new product which allows a team to collect the maximum amount of validated learning about customers with the least effort
This is better, because the focus is on learning about customers, not learning about your technology choices, your design or your brand.
As software product people, it’s easy to assume from the start that the way we’re going to solve a customer problem is by building software. So our MVP should be custom-built software, right? Well, no, because that may not be the cheapest way to test those big assumptions you have about customers.
One of my favourite MVP examples is this one from Steve Blank’s work with a startup that had a vision of using drones to gather agricultural data to help farmers make better decisions.
The team initially assumed that they would need to build an early, simplified version of their final product in order to reliably test their assumptions.
But they conflated the assumptions they had about whether farmers would buy the product with their own questions around technical feasibility:
We’re engineers and we wanted to test all the cool technology, but you want us to test whether we first have a product that customers care about and whether it’s a business. We can do that.
So, instead of building drones and writing software, they hired a helicopter, took photos and processed data manually - because the biggest assumption that threatened their business was that farmers would find the data analysis useful in the first place.
In software, we’ve all heard stories about companies that started out by selling a spreadsheet. Or creating a ‘web app’ that merely submits a form to a human, who does all the processing work behind the scenes and manually sends a response (the Wizard of Oz approach).
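To make the Wizard of Oz idea concrete, here’s a minimal sketch of what such a service might look like. Everything here is hypothetical and for illustration only: to the customer, the API appears automated, but behind the scenes each request is simply queued up for a human operator to handle by hand.

```python
import queue

# Hypothetical "Wizard of Oz" service: the front looks automated,
# but requests land in a human operator's inbox for manual handling.
human_inbox = queue.Queue()

def submit_analysis_request(customer_id, data):
    """The customer-facing entry point. It accepts the request as if
    software will process it, but just files a ticket for a person."""
    ticket = {"customer": customer_id, "data": data}
    human_inbox.put(ticket)
    # The promised turnaround time quietly covers the manual work.
    return {"status": "processing", "eta_hours": 24}

def next_ticket_for_human():
    """Behind the curtain: a person picks up the next ticket, does the
    analysis by hand, and replies to the customer manually."""
    return human_inbox.get()
```

The point of the sketch is what it leaves out: there is no analysis code at all, so the only thing being tested is whether customers submit requests and value the responses.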
The key point here is demonstrating that there are early adopters who will actually pay for it, and there are plenty of creative ways to do this.
But there’s a bigger point too: sometimes the most viable product is not software. We naturally assume that it is, because we want to build software, and because software has a reasonable track record of solving problems for many people at a relatively low cost.
But this doesn’t mean that software is always the best approach. After all, there are still people that use paper to keep track of their to-do lists, and agile teams that use physical scrum boards to track progress. The nerve!
Of course, if you work on a web product development team, there is usually an a priori assumption that you’re going to be building some kind of web software. Challenging that assumption can be hard, and possibly counter-productive to your personal income stream.
Have you ever worked on a web application and thought ‘Why is this even a web app’? Hit reply and tell me about it.
All the best,