I recently picked up 'Good Strategy / Bad Strategy' by Richard Rumelt after some Googling suggested that it was an essential read for product managers (I figured I should be more 'product-y' working in a product team). The book was surprisingly enjoyable for a non-MBA person like me. It has plenty of real-world examples, especially engineering ones. Many points resonated with me when I thought about the technical strategy work I have seen and done.
tl;dr: My key takeaways from the book were:
The kernel of a strategy has three components: a diagnosis, a guiding policy, and coherent actions.
Strategy is just a hypothesis; executing a strategy is testing it.
Understand the context around a strategy before copying it.
Analyse methodically...and just start somewhere!
Kernel of a strategy
The author describes the "kernel" of a good strategy as a diagnosis that leads to guiding policies that drive coherent actions. This resonated a lot because it is the process a typical engineer should follow: when presented with a nebulous problem, gather data to identify a diagnosis and develop some guiding principles for the solution, after which we can derive logical and coherent next steps. The book describes several examples, including how one might address the decline of Starbucks or IBM. As a technical example from my experience, a common problem in a large distributed system is 'we can't easily ensure all actions are authorised'. One starting diagnosis would be 'we don't have a consistent user context throughout the system', from which we can produce principles such as needing a single trusted source of user context with distributed verification, leading to actions such as building a central service (like an OAuth authorisation service) and libraries for distributed verification of OAuth tokens.
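To make that last step concrete, here is a minimal sketch of what the 'distributed verification' library code could look like, assuming the central authorisation service issues RS256-signed JWT access tokens and publishes its signing keys at a JWKS endpoint. The URL, audience value, and function name are illustrative assumptions, and the example uses the PyJWT package rather than any particular internal library.

```python
# Hypothetical sketch: trust stays centralised (only the auth service mints
# tokens), while each service verifies tokens locally against the auth
# service's published JWKS keys, with no per-request call to the auth service.
import jwt  # PyJWT
from jwt import PyJWKClient

JWKS_URL = "https://auth.internal.example.com/.well-known/jwks.json"  # assumed endpoint

jwks_client = PyJWKClient(JWKS_URL)  # fetches signing keys from the auth service

def verify_request_token(token: str) -> dict:
    """Return the user-context claims if the token is valid.

    Raises jwt.InvalidTokenError if the signature, expiry, or audience
    checks fail, so callers can reject the request.
    """
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience="internal-services",  # assumed audience claim
    )
```

Every service that embeds a library like this gets the same user context from the same trusted issuer, which is exactly the kind of coherence the guiding principle is meant to enforce.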
I've noticed many people skip over the principles part to focus on action: 'Just tell me what needs to be done.' The problem is that without principles, it is challenging to ensure that actions are coherent and consistent. Granular decisions must be made over time, long after an initial strategy is developed. Following principles ensures we (yourself or the teams you work with) can make consistent decisions in alignment with the original strategy. Leaders often won't ask you for the principles behind your strategy, and sometimes they'll ask you to skip straight to the plan; don't! If this happens to you, treat it as a presentation problem and put the action plan first (e.g., in the summary at the top of docs), before your principles. Those who want to understand the principles behind your strategy can read further.
Another interesting point raised in the book is that many diagnoses are possible depending on how one looks at the data. It is very easy to get caught up in your own perspective, so it is vital to get feedback from various people with different backgrounds and be willing to adjust your views. The book also suggests having an internal 'panel of experts' based on tough people you have interacted with.
Strategy is a hypothesis
How do you know your strategy is right? Answer: you don't. It's an educated guess based on your interpretation of data. This made me realise a few things:
Stay calm when company strategies change. What typically happens is that a company strategy is presented, perhaps at some big town hall event, and then several months later it changes. As employees working hard towards a plan, we find it frustrating that we can't finish our work before the next change happens; we're pivoting again, so why can't we just stick with a plan until we get to the end? The reason is that the employees' actions prove (or disprove) the hypothesis, and leaders can and should adjust the strategy accordingly. In other words, an evolving strategy is a Good Thing.
Treat your technical strategies as hypotheses. As you develop the next steps and plans for your technical strategies, ensure there are incremental steps to prove or disprove your theories. Maybe you need to do performance testing to demonstrate scale. Or seek out additional use cases to confirm that your conceptual model is adaptable. As the data comes in, be comfortable with needing to evolve your thinking.
Others will assume your technical strategy is static. Just as we bemoan company strategies seemingly changing on a whim, others may get frustrated if you change your technical strategy. Set expectations upfront about how you will prove or disprove your strategy. Explain the big rocks and risks and how the various milestones in your subsequent plan provide data to address them. Then, when your strategy evolves, be prepared to explain why.
Understand the context around a strategy before copying it
This is me paraphrasing a few points in the book. Strategies work because of things unique to a company in a specific environment. Good strategies are developed to make use of a company's advantages (sometimes self-created; there's an excellent section on 'interesting advantages' that exist when you have insights into improving the value of an advantage you already have), dynamics that exist at a point in time (e.g. deregulation, or predictable biases of competitors), or a moat due to a chain-link system that only works because all the policies fit together (e.g. no one can replicate IKEA). Simply copying what you see happening at another company, even if you're the one who developed it there, doesn't work. Unfortunately, I've seen plenty of examples where folks say, 'Google does X, so we'll do it too,' and it fails to reap the benefits they expect. What makes it worse is if they don't treat their (copied) strategy as a hypothesis and adjust it as they see it failing in their current environment.
Seeing what other companies do is an excellent source of inspiration. I've found that many innovations come from applying lessons from different companies or industries to my current environment. The key is understanding why they work so you can judge whether they will work for your situation.
Analyse methodically...and just start somewhere!
The book explains some techniques for developing sound strategies that I'm sure we have all used in some capacity without giving them names.
One technique is the use of 'proximate objectives'. As senior engineers, we are often presented with nebulous requirements or a massive 'problem space' filled with so many problems that many people don't know where to start. The book presents the story of the Surveyor project at NASA's Jet Propulsion Laboratory to land an unmanned vehicle on the Moon. This hadn't been done before, so the requirements were completely unclear. The leader gave the team a proximate objective: the surface of the Moon looks like a rocky desert, so let's design for that. This educated guess made the problem tractable for engineers, and adjustments could be made as experiments were run.
Another technique the book talks about is 'making a list': create a list of the ten most important things you need to do, and start with the first one. The act of making the list forces you to think hard about your company and problem space and to structure your thoughts. I use this technique when proposing or starting large projects, and even mid-project when I feel overwhelmed: what are the issues the project is solving, what are the most critical jobs to be done, and how do they tie back to the original problems? This ensures I am working on the right things, either re-affirming the plan or identifying needed changes.
Summary: The Scientific Method applied in real life!
I thoroughly enjoyed this book with its raft of real-world examples from familiar companies, and I feel I'll be able to remember its lessons easily (I've already referenced them in various mentorship conversations). Maybe that's because, if you look at the core message, it's the Scientific Method we learned at school: strategy is a loop of hypothesising, experimenting, and analysing!