Learning from data vs experience

Today's post takes a journey from fatherhood to corporate strategy, and I think the way we get there makes sense. As I discussed a bit in my intro, some of these posts will have a personal essay flavor.

I am not the kind of person who plays up big events in my life. In fact, I think sometimes the people who care about me most are a little irritated by the way I downplay certain milestones and accomplishments. It's just not how I'm wired. One of my sisters once told me that I'm a great person to talk to when you need a lift, but not a great cheerleader when you've accomplished something you're proud of. So at least I'm internally consistent. But I digress.

As a recent new parent, I'm unlikely to talk much about just how profoundly parenthood has changed my life, even after just a month. And in some ways, it really isn't so profound. A lot of early parenthood is simple, even if it feels convoluted. Caring for a one-month-old is like monitoring a computer program that processes a series of inputs (food, sleep, nurture) into outputs (poop, pee, growth), with error codes (spit up, fussing, wild crying) that trigger at random and often with no connection to the inputs and outputs. But in between managing those inputs, outputs, and errors, I find myself thinking about the future for this child, and sometimes I feel overwhelmed not by the responsibility we undertook, but by how much of my knowledge and experience I want to impart to him.

I think this started in the second trimester of my partner's pregnancy. One day I had this massive urge to make a playlist that would teach my child the entirety of music history, but would also nod towards my love of lesser-known artists who may not be included in that broader history. I was basically trying to design both the survey course and the doctorate-level course at the same time, as if they were supposed to be experienced concurrently (this is what we call bad design). I spent way too much time curating a woefully inadequate playlist that didn't match my ambition, and I finally abandoned the project (for now). More recently, I've felt a huge responsibility to ensure that Keegan does not repeat the mistakes I made during my life. This has come with the unfortunate side effect of randomly reliving, as a fully formed adult, the throwaway embarrassments of a thirteen-year-old that I thought had been purged from my memory. Do not recommend. Fortunately or unfortunately, though, I don't know if I have much choice in the way Keegan chooses to learn. We can teach and guide all we want, but some mentees and protégés (read: young me) will simply refuse to learn the lesson without having the experience. Sometimes you need to touch the hot stove to learn that it is, indeed, hot.

But we don't want everyone to learn their lessons from previous experts, right? Of course we love and respect our elders, but if we, collectively, learned every lesson through their teaching and carried that forward, that wouldn't be a world that makes much progress. Progress is frequently non-linear, which means it often bucks years and centuries of collected and accepted knowledge. So as I pondered the futility of saving Keegan from years and years of hot stoves, I considered two expansions of my thoughts. First, what is the right balance of learning from data — defined as knowledge from the past imparted by a trusted advisor — and learning from experience — defined as "testing the waters" even if someone told you it wasn't worth it? Second, how can organizations, more broadly, reconsider how they learn from data and experience?

The first question is not about someone who learns from experience 90% of the time (this person would never survive the world) vs someone who learns from experience 5% of the time (this number might still be too high). We are human; we learn the vast majority of what we know from mentors of all sorts, not from experience. Our species has an enormous dependence on trusted advisors passing on important information to us precisely so we don't have to experience things on our own to know they are true. We would have gone extinct long ago if all of us had to learn, on our own, that animals with big sharp teeth are dangerous. Most of our less survival-oriented knowledge is the same. In order to learn basic arithmetic and a number of logical derivations from those basics, we do not need to derive a formulation of the Peano axioms on our own. We just accept certain things as true without formal proof. But even though we have a massive, readily available suite of knowledge at our fingertips these days, sometimes it's more interesting and beneficial to discover how things work on our own. It was way more fun for me to play around on pianos and guitars to discover how different chords sounded together than it would have been to learn the same lesson from the many books that could have taught me everything I wanted to know about music theory.

In many circumstances that look and feel like my musical experience, I find experiential learning is best. I use this approach when I'm training analysts. In almost all cases, I'd rather have them struggle with problems I already have solutions for than teach them my answer. It is worth it, to me, to sacrifice short-term efficiency for a long-term gain in efficiency. And honestly, the young analysts frequently come up with better and more creative solutions than the one I already built, which is a major point in favor of being judicious with how often you inject your expert knowledge into a problem-solving situation. It also forces junior people, early in their careers, to grapple with and solve big, open-ended questions. But there are risks. If you push people to come up with their own creative solutions, those solutions could be error-riddled, and if you, or the people you lead who are responsible for quality control, are stretched thin, those errors could end up in final products. I imagine my error-prone version of experiential teaching and learning is not a great solution for, say, accountants, for whom the risk of inaccuracy and imprecision is likely much higher than the reward of future efficiency in how the accounting is done. As my partner often reminds me, CFOs are on the hook legally. As is typical for these kinds of questions, the answer to "how much should I experience on my own vis-à-vis learning from others?" is "it depends".

In the end it is a risk-reward question. The low-risk answer is to rely on tried-and-true experience; the higher-risk answer is to try something out. It's important to note that the low-risk answer is not a no-risk answer. The problem is that in business, most companies grow to over-index on the low-risk strategy, and this is where stagnancy takes root. And as I said in my post about Silicon Valley Bank, complacency (a derivative of stagnancy) eventually sounds the death knell. SVB needed to make a large capital investment to hedge its interest rate risks (and separately, needed someone to tell Peter Thiel and other VCs not to be so jumpy, but that's another conversation), and this kind of hedging was a departure from its business as usual. While hedging, itself, is a risk mitigator, such a change in day-to-day operations feels risky because it's not the way things were done before, and when leadership is thinking "we've never needed this in the past", they may be slow to act. This was apparently true at SVB, and it's also true at most large companies.

Most companies are incredibly risk-averse, and while I think most leaders would say they are always looking to try new things, the reality is that most truly new initiatives die on the vine. Leaders at all levels would typically much rather take on easy, high-success-rate projects (more business as usual), or cut costs, than take big swings. Essentially, established big businesses have decided, en masse, to default to less innovation, and thus to stop learning from experience, which is bad for growth. Everyone is acting like the accountants (I promise I'm not knocking accountants; it's just a good example!), who just want to keep everyone out of trouble. I'm sure there are dozens of hypotheses and organizational design theories about why this is true (there is plenty of discussion to be had about incentives, for example, that I won't broach here), but I'm going to talk about one of my hunches. I think it has something to do with the way we frame strategic questions in business.

Your mileage may vary, but in my experience, when a new strategic initiative comes along in an organization, the first step is to find a reason not to pursue it (particularly if it's especially radical). It's not that companies say that outright, but the framework of the analysis pushes them there. One analysis that I see dumb down or kill projects quickly is competitor analysis. This is the strategy-implementation version of learning from data rather than learning from experience. Analyzing the behavior of competitors pushes companies to reject initiatives for which there is no clear first-mover advantage, and to reject implementations that don't match those of their competitors. Sometimes that results in a bunch of competitors producing the same bad products because nobody was willing to take the big swing and make them better. Sometimes a focus on cost cutting and easy, low-innovation management can undermine products so badly that an entire industry could suffer the fate of the big three automakers in the US (at least Detroit got bailed out; that's not something I'd stake my business decisions on).

There is a place for traditional strategic analysis (I'm duty-bound to say this as an MBA who worships at the altar of the GE matrix), but I find that when it comes down to deciding on a path forward for implementation, competitor behavior carries too much weight. I'm not certain why that is, but I suspect it's because competitor analysis is one of the easiest and earliest analyses of a strategic initiative, and its results have staying power. Perhaps deciding to ignore the competition is worthwhile; the worst-case scenario is that you cut your losses and learn from the experience.

Relatedly, when companies finally try to do something new, they often insist on tepidly testing the waters. A proof of concept is great, but when companies pilot new ideas (not just extensions of their current business) they tend to be too conservative with the size and length of the pilot. Small, short rollouts are a recipe for finding every possible reason you might not want to move forward. Any new thing you do has a major disadvantage relative to business as usual: you are new to doing it! The headline numbers from a pilot can also steer you in the wrong direction if the pilot is too small, or if your analysis is unsophisticated. Consider a company with 100 customers that tests a pilot product on 20 of those customers. Let's say the original product had a 75% success rate (by whatever definition) and the new product test shows a 65% success rate. That looks like the new product is worse. But a standard statistical check, the chi-squared test, tells us that once we account for how small the pilot is compared to the rest of the customer population, the difference is statistically indistinguishable from noise. Rather than scrap the new product, maybe we should improve it, and with improvements maybe we surpass our original 75% success rate.
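To make that concrete, here's a minimal sketch of that comparison in Python with SciPy. The counts are the hypothetical numbers from the example, and I'm assuming the 80 non-pilot customers stay on the original product at its 75% success rate:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts from the example above:
#   pilot product:    20 customers at 65% success -> 13 successes, 7 failures
#   existing product: 80 customers at 75% success -> 60 successes, 20 failures
observed = [
    [13, 7],   # pilot product
    [60, 20],  # existing product
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.2f}, p-value = {p_value:.2f}")
# The p-value lands around 0.5, far above the usual 0.05 threshold:
# the pilot's lower success rate is indistinguishable from noise.
```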

If you're going to commit to learning from experience (this is what a proof of concept or pilot is for, in the end), give yourself a real chance to learn! Bigger and longer pilots boost buy-in from all the stakeholders and give the initiative a real chance to succeed. It's the best chance you have to get both the product and the go-to-market right, and to move in a truly innovative direction. And when you're successful, you might find your competitors following in your footsteps.
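As a rough illustration of what "bigger" means, here's a sketch using statsmodels' power calculation. The 75%-vs-65% rates are the assumed numbers from the earlier example, and the alpha and power values are the conventional defaults:

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

# How many customers per group would we need to reliably detect
# a real 75%-vs-65% gap in success rates?
effect_size = proportion_effectsize(0.75, 0.65)  # Cohen's h for the two rates

n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,              # conventional false-positive rate
    power=0.80,              # 80% chance of catching a real difference
    alternative="two-sided",
)
print(f"customers needed per group: {n_per_group:.0f}")
# Roughly 160+ per group -- a 20-customer pilot never had a
# realistic chance of telling these success rates apart.
```

That's not to say every pilot needs hundreds of customers; it's that the size of the pilot should be driven by the size of the effect you're hoping to see, not by how little risk you can get away with.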

