Copypasta from the Lean Startup review - sorry if you have read it already - attached is a markdown file of the same
Introduction:
The lean startup method has five principles:
1) Entrepreneurs are everywhere.
2) Entrepreneurship is management, albeit a form of management that applies under the conditions of extreme uncertainty in a startup. If you think management is not cool and reject it, you'll have chaos and failure.
3) Validated Learning. Startups exist not just to make things, serve customers, or make money. They exist to learn how to build a sustainable business.
4) Build - measure - learn: Startups should go through this loop as fast as possible.
5) Innovation Accounting is needed to measure a startup's progress, set up milestones, prioritise work, and for the people in it to hold themselves accountable.
Chapter 1:
- The lean startup draws from related fields like lean manufacturing and design thinking.
- If a company commits itself to the wrong plan and executes that plan excellently at a big scale, it may not be able to pivot in time, because it has committed all its resources and time to the wrong vision. It will achieve failure.
Chapter 2:
- Startups can exist as islands of independence within big companies.
Chapter 3: Learn:
- Which actions are value-creating and which are wasteful? This question is at the heart of lean manufacturing as well.
- Validate your assumptions more cheaply than building the entire product.
- But not by asking people what they want — most of the time, they don't know in advance.
- People who fail often give the excuse that they learnt a lot.
- It's easier to raise money when you have zero revenue and users. Zero invites imagination. A small number invites questions about whether big numbers will ever materialise.
- So, it's tempting to postpone getting any real numbers until you are sure of success. But don't do that.
- Early in a startup's life, revenue growth happens slowly. But the real progress is in validated learning.
- Don't fall prey to vanity metrics, which are numbers that look good but are not the best indicators of your company's health. For example, if you have a website that encourages people to download an app, page views on the website are a vanity metric, because there are better metrics, like downloads of the app, signups, active users, etc.
- Don't waste money on PR and buying media attention and getting written up in magazines. Focus on learning.
Chapter 4: Experiment
- The founder of Zappos first tested his e-store for shoes by fulfilling orders manually — going to a nearby physical shop, buying the shoes, and shipping them. After a month, a thousand orders were placed, validating his idea.
- He observed real customer behaviour, interacted with them, and learnt about their needs, rather than asking hypothetical questions.
- Customers react in unexpected ways, revealing information you might not have known to ask about, like returning shoes.
- Startups have a value hypothesis and a growth hypothesis.
- The value hypothesis is that customers derive value from the product or service once they start using it.
- The growth hypothesis is about how new customers will discover a product or service.
- Give your first few users wonderful attention, as if you're a concierge.
- An experiment is actually a startup's first product, not just a theoretical enquiry.
Chapter 5:
- Startups have a build - measure - learn feedback loop.
- The learning is how to build a sustainable business.
- This learning is more important than revenue.
- Minimise the time it takes for you to iterate through this loop.
- People are often trained and specialised in one aspect of this loop, like engineers trained to build. What matters is not any one part, but how fast you can iterate through the entire loop.
- Startups should use a scientific method.
- To do so, they should know what hypotheses to test.
- The two most important hypotheses are the value hypothesis and the growth hypothesis.
- Every startup is based on assumptions, often not recognised as such by founders.
- Some assumptions are validated by the existence of other products. For example, when Apple built the iPod, one assumption was that people want to listen to music in public places using earphones. The popularity of the Walkman had already validated that assumption.
- "Leap of faith" assumptions are trickier, like saying that people want to pay $399 for a portable music player.
- You want to validate them ASAP.
- The riskiest ones first.
- You do so by building one or more MVPs. An MVP lacks features that are needed later, but its purpose is to validate assumptions with as little time and effort as possible.
- You should identify and list assumptions before, not after, building the MVP. Ideally, give quantitative estimates, like "20% of people will be interested in our service, and 5% will be willing to pay". That way, you can't claim later on that you succeeded by defining the goal as whatever you actually achieved. (A small sketch of this follows at the end of this chapter.)
- You actually run the build - measure - learn loop in reverse: start with what you want to learn (assumptions to validate), then think about what to measure to validate those assumptions, and then build that MVP.
- Don't act as if your assumptions are true. Validate them. Otherwise your startup will fail.
- You can look for analogs and antilogs.
- An analog is a similar situation that validates your assumption, as with people listening to music in public using earphones.
- An antilog is something that goes against your assumption. For example, an assumption behind the iTunes Music Store was that people are willing to pay for music, but Napster was an antilog.
- Get out of the building and talk to users. Don't theorise.
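A minimal sketch (in Python, not from the book) of what listing assumptions with quantitative estimates before building the MVP might look like. The hypothesis names and numbers are invented; the point is that the targets are written down before the results arrive, so success can't be redefined afterwards.

```python
# Hypothetical leap-of-faith assumptions, declared BEFORE building the MVP.
# Names and thresholds are invented for illustration.
hypotheses = {
    "interested_in_service": 0.20,  # 20% of visitors will join the waitlist
    "willing_to_pay": 0.05,         # 5% of those will pre-order at the stated price
}

# Observed results from the MVP experiment (also invented numbers).
observed = {
    "interested_in_service": 0.12,
    "willing_to_pay": 0.06,
}

for name, target in hypotheses.items():
    actual = observed[name]
    verdict = "validated" if actual >= target else "NOT validated"
    print(f"{name}: target {target:.0%}, observed {actual:.0%} -> {verdict}")
```

Because the targets were committed to up front, a 12% result can't quietly be reframed as a win.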
Chapter 6: Test
- Start with a quick, crappy implementation.
- Groupon began as a themed WordPress blog, with the coupons being PDFs mailed out via Apple's Mail app to 500 people.
- An MVP is not necessarily the smallest product to build, but the quickest to build.
- It's hard for entrepreneurs to launch an MVP, because the vision they have of themselves is launching high-quality, polished products, not crappy ones. Overcome that hesitation.
- If you don't know who the customer is, you don't know what quality is.
- Users may be fine with what you think is low-quality stuff, and may actually find it better, disagreeing with your opinion as to what constitutes high or low quality.
- Low quality is a problem only if it slows down the build - measure - learn feedback loop.
- An MVP can also be a marketing pitch accompanied by a sign up page to gauge interest.
- Or a video, in Dropbox's case.
- You can have humans substitute for an algorithm.
- Don't worry that an established company will copy your idea. Try pitching it to the managers there. They will do nothing, partly because they're already overwhelmed with good ideas.
- MVPs often result in bad news. Or, rather, they bring it out. You're better off facing reality.
Chapter 7: Measure
- If you're making changes to your product and your user numbers are going up, that's not good enough. It's storytelling. How do you know that your changes are causing the results? How do you know that you're drawing the right lessons from them?
- You need innovation accounting.
- Innovation accounting works in three steps:
1) Use an MVP to establish real data on where you are. Without a clear-eyed picture of your current status — no matter how far from the goal you may be — you cannot begin to track your progress.
2) Tune the engine to move towards the ideal.
3) Decide whether to continue on your current course or pivot.
- An MVP gets you real baseline data — conversion rates, sign-up rates, trial rates, customer lifetime value (LTV), and so on.
- Don't optimise something (like making your app easier for new users to use) until you know that it's a driver of growth and that it's currently below where you'd like it to be.
- Putting all these together, start with a baseline metric, then form a hypothesis as to what will improve that metric, and then perform a set of experiments designed to test that hypothesis.
- Metrics about the customer acquisition funnel are important.
- Running AdWords ads, even on a low budget, is important because it gives you critical data.
- Cohort analysis is important. Here, you define a cohort, such as people who signed up during a given week, or those who used a certain feature. Then you track the performance of your app for that group of users. (A minimal sketch follows at the end of this chapter.)
- Cohort analysis lets you prove or disprove theories such as: if your total number of users is declining, is it because people who signed up recently are abandoning the app while older users continue to use it?
- Cohort analysis can point out problems when other metrics are all up and to the right (hockey sticks).
- When you get poor quantitative results, they force you to declare failure and create the motivation, context and space for more qualitative research.
- If you pivot, and the experiments you run afterward are more productive than the ones before, that's the sign of a successful pivot.
- Don't focus on optimising, whether the conversion rate or the performance of your app, because you may be building the wrong thing, in which case no amount of optimisation will help.
- A startup has to measure progress against a high bar: evidence that a sustainable business can be built. This is possible only if you've made clear, verifiable predictions ahead of time.
- Sometimes, when you make changes and launch them, it's hard to establish cause-and-effect relationships after the fact. In that case, do an A/B test. (A sketch follows at the end of this chapter.)
- A/B testing can also tell you things like whether the social features you've added to a product matter.
- Hypothesis testing can require you to build new infrastructure. For example, if you're testing delayed sign-up, you'll need to support a state where users have their data in the system but haven't yet signed up.
- Industry norms, like delayed sign-up improving conversion, may not hold in your case.
- That may, in turn, reveal an insight, such as: customers were not basing their decision to use your product on your demo, but perhaps on positioning and marketing.
- Good metrics must follow the three As: Actionable, Accessible, and Auditable.
- Go by actionable metrics, not vanity metrics. Vanity metrics are those where the cause-and-effect relationship isn't clear: you don't know which of your changes led to an increase (or decrease) in the metric, like page views, or whether it had anything to do with you at all, like a mention on a popular blog. The number of customers, by contrast, is actionable: if it decreases by 50K, you know something is wrong, and you can work on it and hopefully fix it.
- Accessible means that you can understand what it means, like a "customer" as opposed to a "hit on your web site".
- Auditable means that if a question arises as to the validity of the metrics, you should be able to verify it. The best way is to talk to customers, who will also tell you why something is happening, not just that it is. In addition, the mechanism that generates the results must not be too complex for the metric to be auditable.
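To make the cohort-analysis point concrete, here is a minimal sketch using pandas (my choice, not the book's; the column names and data are invented). It groups users by sign-up week and tracks what fraction of each cohort is still active in later weeks.

```python
import pandas as pd

# Invented event log: one row per (user, week in which they were active).
events = pd.DataFrame({
    "user_id":     [1, 1, 1, 2, 2, 3, 4, 4, 5],
    "signup_week": ["W1", "W1", "W1", "W1", "W1", "W2", "W3", "W3", "W3"],
    "active_week": ["W1", "W2", "W3", "W1", "W2", "W2", "W3", "W4", "W3"],
})

# Size of each sign-up cohort.
cohort_size = events.groupby("signup_week")["user_id"].nunique()

# For each cohort, how many distinct users were active in each week.
active = (events.groupby(["signup_week", "active_week"])["user_id"]
                .nunique()
                .unstack(fill_value=0))

# Retention: the fraction of each cohort still active, week by week.
retention = active.div(cohort_size, axis=0)
print(retention.round(2))
```

A total user count can look flat or even healthy while a table like this shows each new cohort retaining worse than the last, which is exactly the kind of problem a single top-line number hides.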
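Similarly, a minimal sketch of the A/B-testing point: users are randomly split between the old and new versions and conversion rates are compared. The counts are invented, and the two-proportion z-test is just one illustrative way to check the difference; the essential ingredient is the random assignment, which is what lets you attribute the difference to your change.

```python
import math

# Invented counts: users randomly assigned to version A (control) or B (variant).
visitors_a, conversions_a = 1000, 50
visitors_b, conversions_b = 1000, 72

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Two-proportion z-test (normal approximation).
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

print(f"Conversion A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}")
# Rule of thumb: |z| > 1.96 corresponds to p < 0.05 (two-sided).
```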
Chapter 8: Pivot
- Companies that can't pivot may be stuck in the land of the living dead, neither growing quickly enough nor dying, consuming the time and money of the people involved.
- Launching early and iterating means that if you pivot, you waste less time, energy and money. If you drag it on, you won't want to pivot because of sunk costs.
- Go by actionable metrics, rather than vanity metrics that can give a feeling of false success.
- A startup's runway is conventionally defined as the number of months of cash it has left, but it is better defined as the number of pivots it can still make.
- Don't cut costs by slowing down the build - measure - learn loop. Then you're just going out of business slowly.
- Two telltale signs that you need to pivot are the decreasing effectiveness of product experiments and the general feeling that product development should be more productive.
- Not having PR and media attention on you is good, because you can pivot without drama.
- Some types of pivots are:
+ Zoom-in pivot (where you focus on a subset of your original product)
+ Zoom-out pivot
+ Customer segment pivot (where you realise that you're more successful with different customers from the ones you expected)
+ Customer need pivot (where you discover that the customer has more important needs than the ones you thought they had)
+ Platform pivot (where an app becomes a platform or vice-versa)
+ Value capture pivot (commonly called monetisation, but monetisation is more like a feature while value capture is more central to the product)
+ Engine of Growth pivot (moving between viral, sticky and paid engines of growth)
+ Channel pivot (moving between sales channels)
+ Technology pivot
- A pivot is a hypothesis; we don't know ahead of time whether it will succeed.
Chapter 9: Small Batch Sizes:
- Optimise the entire system, not a piece of it.
- Have a small batch size: deliver work in smaller units.
- Launch each feature independently.
- Continuous deployment. Launch many times a day.
- Have lots of automated tests.
- Have your designer sit with the engineer and have them design and implement each screen together. As opposed to your designer working by herself for weeks and then delivering the entire result at once.
- Smaller batch sizes are actually more efficient, despite our intuition. (A toy model at the end of this chapter illustrates this.)
- Quality problems can be identified much sooner. If you make something no one wants, you'll learn sooner.
- Large-batch systems tend to malfunction, and when they do, people blame themselves.
- Large batches lead to multiple rounds of rework.
- ... and to still larger batches, which becomes a death spiral.
- And to interruptions, people being blocked on others, communication gaps, scheduling problems, and so on.
- The longer a project takes, the more bugs, problems and conflicts it has.
- Have minimum work in progress.
- Pull, don't push. Start from the hypothesis that needs to be validated or the experiment that needs to be run, and pull work from product development in the smallest batch size to validate that hypothesis.
- Small batches will also let you work with less capital.
- Companies can stay lean as they grow. They don't need to become bureaucratic.
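Because the small-batch claim is counterintuitive, here is a toy model (mine, not the book's). Assume each feature passes through three stages, say design, build and test, each staffed by one person and taking one day per feature. The model ignores rework, hand-offs and context switching, all of which make large batches even worse in the book's telling.

```python
# Toy model: N features flow through a pipeline of STAGES stages,
# one person per stage, one day of work per feature per stage.
N, STAGES = 10, 3

def deliver_in_batches(batch_size):
    """Return (day the first batch is fully delivered, day everything is done)
    when work is handed to the next stage in batches of `batch_size`.
    Assumes batch_size divides N evenly."""
    batches = [batch_size] * (N // batch_size)
    stage_free = [0.0] * STAGES          # when each stage is next free
    batch_done = [0.0] * len(batches)    # when each batch cleared the previous stage
    first_feedback = None
    for s in range(STAGES):
        for i, size in enumerate(batches):
            start = max(stage_free[s], batch_done[i])
            done = start + size          # one day per feature in the batch
            stage_free[s] = done
            batch_done[i] = done
            if s == STAGES - 1 and first_feedback is None:
                first_feedback = done
    return first_feedback, max(batch_done)

for size in (10, 5, 1):
    first, total = deliver_in_batches(size)
    print(f"batch size {size:2d}: first delivery on day {first:3.0f}, "
          f"all done on day {total:3.0f}")
```

With one big batch, the first real feedback arrives on day 30; with single-piece flow it arrives on day 3, and everything is finished by day 12.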
Chapter 10: Engines of Growth:
- New customers come from the actions of past customers. This happens in four ways:
1) Word of mouth.
2) As a side effect of product usage.
3) Through advertising.
4) Through repeat purchases (sticky)
- Each of these engines has a feedback loop that leads to success.
- One of the most expensive forms of potential waste for a startup is spending time arguing about prioritisation of new features.
- The engines of growth help you prioritise better.
- There are always a zillion new ideas about how to make the product better floating around, but most make a difference only at the margins. They are merely optimisations.
- If you're using the sticky engine of growth, you will grow if the rate of new customer acquisition exceeds the churn rate. Track both. (The sketch at the end of this chapter models all three engines.)
- The metric to focus on is the compound growth rate. If it's high, you're doing well.
- Activation rate and revenue per customer have little impact on growth. (They're better suited to testing the value hypothesis)
- If the churn rate and customer acquisition rate are the same, then the standard intuition to invest in sales and marketing doesn't work, because you will lose your new customers as well.
- This is an example of vanity metrics misleading you.
- The viral engine of growth depends primarily on people sharing it with friends, as a central feature of the app, not an afterthought.
- The metric to focus on is the viral coefficient, which determines how quickly your app spreads. If it's 0.1, it means one in ten users refers a friend.
- If the coefficient is less than one, the cycle of growth fizzles out: with a coefficient of 0.1, 100 users bring in 10 more, who bring in 1 more, at which point the loop ends.
- Exactly 1 gives you linear growth: if you gain 10 new users this week, you will gain 10 the week after that, 10 the third week, and so on. That's not good enough.
- The coefficient needs to be > 1 for exponential growth.
- Tiny changes in this number cause dramatic changes. A multiplier of 1.01 per week compounds to roughly 1.7 times as many users by the end of the year.
- A multiplier of 1.1 per week compounds to roughly 140 times as many users by the end of the year.
- Viral products are often free and ad-supported, because asking people to pay gets in the way of viral growth.
- The paid engine of growth relies on reinvesting revenue from each customer into paid acquisition of new customers. It's different from the sticky engine, which relies on repeat business from the same customers.
- If one company earns a revenue of ₹10 per user, and another earns ₹100, and they both reinvest their profit in acquiring new users, which one grows faster? A: It depends on the Cost Per Acquisition (CPA). If they are proportional, like ₹2 and ₹20, both grow at the same rate.
- For faster growth, you need to reduce the CPA or increase revenue per customer.
- The lifetime value (LTV) of a customer is the total revenue they generate over their lifetime, minus variable costs.
- If LTV > CPA, the company will grow.
- If LTV < CPA, it won't, despite one-time tricks like spending invested capital or pulling publicity stunts, which buy growth only temporarily.
- Don't pursue multiple engines of growth, since it's complex to model all these effects simultaneously. Startups usually focus on one.
- Product-market fit is the moment when a startup finally finds a widespread set of customers that resonate with its product.
- A great market — a market with lots of potential customers — pulls product out of the startup. In a terrible market, the best product and best team are going to fail.
- When you achieve product-market fit, it's exhilarating.
- If you have to ask, you're not there yet.
- Depending on which engine you're using, look at the appropriate metric, like viral coefficient for a viral engine. If it's 0.9 or more, you're on the verge of success.
- The number doesn't matter as much as the direction and degree of progress.
- Every engine eventually runs out of fuel.
- Moving from early adopters to mainstream users is not automatic. The engine may stop and may require tremendous additional effort.
- Be careful not to mistake growth coming from an engine that's already working efficiently for growth caused by your current product development. Your work may be having no effect, in which case growth can stop suddenly.
- To prevent this, focus on actionable metrics rather than vanity metrics, and use innovation accounting rather than traditional accounting. In other words: are you making progress on your actionable metrics? Are you running experiments and building MVPs to improve them? If you ran an experiment to reduce the churn rate, for example, are you verifying that it actually reduced the churn rate, rather than assuming it did because revenue went up?
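To tie the three engines together, a minimal numerical sketch. Every figure here is invented, and the formulas are the simple ones this chapter implies: the sticky engine compounds at (acquisition rate minus churn rate), the viral engine multiplies each cycle's new users by the viral coefficient, and the paid engine grows when LTV exceeds CPA, assuming the margin is reinvested in acquisition.

```python
# All numbers below are invented for illustration; none come from the book.

# --- Sticky engine: compounds at (new-customer acquisition rate - churn rate) ---
users = 10_000
acquisition_rate, churn_rate = 0.06, 0.04      # per month, as a fraction of the base
for month in range(12):
    users *= 1 + (acquisition_rate - churn_rate)
print(f"Sticky engine after a year: {users:,.0f} users (2% net monthly compounding)")

# --- Viral engine: each cycle, the newest users bring in viral_coefficient more ---
def viral_total(initial, coefficient, cycles):
    total, newest = initial, initial
    for _ in range(cycles):
        newest *= coefficient
        total += newest
    return total

for k in (0.1, 0.9, 1.1):
    print(f"Viral coefficient {k}: 100 initial users lead to "
          f"{viral_total(100, k, 52):,.0f} total users after 52 cycles")

# --- Paid engine: reinvest each month's margin into paid acquisition ---
monthly_margin_per_customer = 50.0   # revenue minus variable costs, per month
monthly_churn = 0.08                 # 8% of customers leave each month
cpa = 400.0                          # cost to acquire one customer

ltv = monthly_margin_per_customer / monthly_churn   # expected lifetime margin
customers = 1_000
for month in range(12):
    new = customers * monthly_margin_per_customer / cpa   # bought with this month's margin
    customers = customers * (1 - monthly_churn) + new
print(f"Paid engine: LTV {ltv:.0f} vs CPA {cpa:.0f}; after a year: {customers:,.0f} customers")
```

In this simplified paid-engine model, where the entire margin is reinvested, the base grows exactly when LTV > CPA; with an LTV of 625 against a CPA of 400, it compounds at about 4.5% a month.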