What’s the bigger challenge in implementing Artificial Intelligence within your enterprise? Risk or uncertainty?
Hang on a minute, you say, aren’t they sort of the same thing? They’re related, but I’m going to argue uncertainty is emotional.

We’ll play a game. You may be familiar with it from your undergraduate studies. Imagine I’ve got a jar with 100 balls in it: 50 red and 50 black. The jar is opaque and you can’t see inside. Pick a color, red or black. I’ll draw a ball from the jar. If it’s your color, I pay you $10,000. If it’s not your color, you get nothing. And I’m only going to let you play this game with me once.
Do you have a color preference? The right answer is no. Everything about the balls is identical: weight, shape, everything but color. If there is any preference at all, it skews slightly toward red in the general population.
The odds are 50-50. How much would you pay for the chance to win $10,000? A dollar? $2,000? $4,999? Given the risk, the expected value is $5,000. A cunning financial wizard might want a little margin of safety and play for $4,000.
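The arithmetic behind that $5,000 figure is just probability times payout. A minimal sketch:

```python
# Expected value of the Jar 1 game: 50 red and 50 black balls,
# with a $10,000 payout if the drawn ball matches your color.
p_win = 50 / 100                    # probability the draw matches your pick
payout = 10_000
expected_value = p_win * payout     # 0.5 * 10,000 = 5,000
print(f"Expected value: ${expected_value:,.0f}")
```

Paying anything under $5,000 leaves you with positive expected profit; the $4,000 offer is simply a $1,000 margin of safety.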
You know the odds. You know the risk. You know how much you’d pay to play. Let’s play a second game. We’ll call it Jar 2. It also has 100 balls, but this time you don’t know the proportion. The only guarantee is that there are at most two colors, red and/or black. It could be 100% red, 50-50, or anything in between, and I get to choose the proportion.
Like our first example, you can only play this game once. What color do you want and how much would you pay to play?
You might pick red, knowing it’s the slightly more popular choice in the general population. But I know you know that, so I pick black; or, knowing that you know about the bias toward red and might pick black instead, I choose red. But you suspect that and choose black. I know that you know that I know that, so I mix it up in favor of black. We could go on with the known knowns and unknown unknowns ad infinitum, but we’ll still end up with 50-50 odds. So how much will you pay to play the Jar 2 game just once?
Let’s say you were willing to pay $4,000 for the Jar 1 game. How much would you be willing to pay for Jar 2? $1? $1,000? $3,500? $4,999? Most of you won’t come anywhere near what you were willing to pay for Jar 1.
You feel like Jar 2 is a bigger risk. You might say there’s an uncertainty about this risk, but I can prove to you that Jar 2 carries exactly the same risk as Jar 1. How? I’ll ask you to pick your color by flipping a coin: heads, red; tails, black. We’ve just randomized your color choice and eliminated any possibility of my choosing a distribution that disadvantages you. Neither you nor I know what color you’re going to pick.
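A quick Monte Carlo sketch makes the coin-flip argument concrete (the helper name `play_jar2` is my own). Whatever proportion I choose for the jar, your randomized pick wins roughly half the time:

```python
import random

def play_jar2(n_red, trials=100_000):
    """Simulate Jar 2: the house fixes n_red red balls out of 100,
    but the player picks a color with a fair coin flip."""
    wins = 0
    for _ in range(trials):
        my_color = random.choice(["red", "black"])              # coin flip
        drawn = "red" if random.random() < n_red / 100 else "black"
        if drawn == my_color:
            wins += 1
    return wins / trials

# Algebraically: P(win) = 0.5*p + 0.5*(1 - p) = 0.5 for any proportion p.
# The simulation agrees: the win rate hovers near 0.5 no matter what I choose.
for n_red in (0, 25, 50, 75, 100):
    print(f"{n_red:3d} red balls -> win rate {play_jar2(n_red):.3f}")
```

Even at the extremes, 0 or 100 red balls, the coin flip means you match the jar’s dominant color exactly half the time, so my choice of proportion cannot hurt you.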
You intuitively understand this, yet most of you still won’t play Jar 2, or would pay significantly less, even though the odds are the same as Jar 1’s. This is an emotional response, what economists call ambiguity aversion. Humans will play the odds, but they flee from uncertainty.
The tension between risk and uncertainty in AI pilot projects can paralyze an organization. We can’t eliminate our emotional bias against uncertainty, but being aware of it may give us the longer runway we need to get projects off the ground.