Tuesday, September 6, 2016

You can't weigh risk if you don't know what you don't know

There's an old saying we've all heard at some point, usually attributed to Donald Rumsfeld:

There are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns -- the ones we don't know we don't know.

If you've ever been in a planning meeting, a variant of this has no doubt come up at some point. It came up for me last week, and every time I hear it I think about all the things we don't know we don't know. If you're not familiar with the concept, it works a bit like this. I know I don't know how to drive a boat. But because I know I don't know this, I could learn. If you know you lack certain knowledge, you can find a way to learn it. If you don't know what you don't know, there is nothing you can do about it. The future is often an unknown unknown. In many instances there is nothing we can do about the future; you just have to wait until it becomes a known and hope it isn't anything too horrible. There can also be blindness when you think you know something but you really don't. That's when people tend to stop listening to the actual experts, because they think they're the expert.

This ties back into conversations about risk and how we deal with it.

If there is something you don't know you don't know, by definition you can't weigh the possible risk in whatever it is you are (or aren't) doing. A great example here is trying to understand your infrastructure. If you don't know what you have, don't know which machines are patched, and aren't sure who is running what software, you have a lot of unknowns. It's probably safe to say that at some future date there will be a grand explosion when everything starts to fall apart. It's also probably safe to say that if you have infrastructure like this, you don't understand the pile of dynamite you're sitting on.
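
As a toy illustration of what sorting that out might look like, here's a minimal sketch (the hostnames, fields, and inventory data are entirely hypothetical) that splits a rough asset list into knowns, known unknowns, and the machines nobody can account for at all:

    # Minimal sketch: classify a hypothetical asset inventory.
    # A real inventory would come from a CMDB, scanner output, or
    # whatever source of truth you actually have.

    inventory = [
        {"host": "web01",   "owner": "web team", "patched": True},
        {"host": "web02",   "owner": "web team", "patched": False},
        {"host": "db01",    "owner": "dba team", "patched": None},  # patch state unknown
        {"host": "legacy9", "owner": None,       "patched": None},  # nobody knows anything
    ]

    known = [h for h in inventory if h["owner"] and h["patched"] is not None]
    known_unknowns = [h for h in inventory if h["owner"] and h["patched"] is None]
    unknown_unknowns = [h for h in inventory if not h["owner"]]

    print("Known (patch state recorded):", [h["host"] for h in known])
    print("Known unknowns (owned, patch state unknown):", [h["host"] for h in known_unknowns])
    print("Unknown unknowns (no owner, no data):", [h["host"] for h in unknown_unknowns])

The point isn't the code; it's that anything landing in that last bucket is risk you can't weigh at all until someone claims it.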

Measuring risk can be like trying to take a picture of an invisible man. Do you know where your risk is? Do you know what it should look like? How big is it? Is it wearing a hat? There are so many things to keep track of when we try to understand risk. More people are afraid of planes than of cars, yet flying is orders of magnitude safer. Humans are really bad at judging risk. We think we understand something (or at least think it's a known, or a known unknown), but often we're actually mistaken.

How do we deal with the unknown unknowns in a context like this? We could talk about being agile or quick or adaptive, whatever you want to call it. But at the end of the day, what's going to save you is your experts. Understand them; know where you are strong and where you are weak. Someday the unknowns become knowns, usually with a violent explosion. To some of your experts those risks are already known; you may just have to listen.

It's also important to have multiple experts. If you only have one, they could believe they're smarter than they are. This is where things get tricky: how do we decide who is actually an expert and who just thinks they're an expert? That's a long, complex topic in itself, which I'll write about someday.

Anyway, back to risk and unknowns: there will always be unknown unknowns. Even if you have the smartest experts in the world, it's going to happen. Just make sure your unknown unknowns are worth it. There's nothing worse than not knowing something you should.