Google diversity memo, global warming, Pascal's wager, and other stuff
There are about 765 million blog posts about the diversity “memo” that leaked out of Google a couple of weeks ago. I think the case for any biological difference is pretty weak, and it bothers me when people refer to an “interest gap” as anything other than a product of the environment. Maybe it's because I have a daughter, maybe it's because too many female friends have told me stories about how they were held back or discriminated against.
But disregarding my own opinion here, something else kept me annoyed for days. It seems like all the arguments and counterarguments are very hung up on science and proof, and it struck me as a very binary view of the world. It's great that we have research, but as long as people can cite studies showing almost anything, I'm not sure it really settles the debate. Anyway, I think there's a weird meta-argument that gets interesting when you think of it in terms of probabilities instead. It lays out a case for action almost no matter what you think causes the gender imbalance.
Let me explain what I mean. Let's say the gender imbalance is explained x% by biology and y% by the environment (nature vs nurture). So obviously x and y add up to 100%. They could even fall outside that range (e.g. women have higher innate ability than men, so x is negative, but peer pressure and discrimination and whatever push y above 1). Or maybe you think it's the other way around… I welcome you to my blog either way.
Now, an ultra hardcore conservative might say that it's 100% explained by nature and put the whole mass of their probability distribution at x = 1, and a super progressive liberal would put it all at x = 0. But come on… if you really had to bet money on it, would you bet your entire fortune that x is exactly 0? Say the odds are that you make one dollar if you're right and lose all your money if you're wrong. So in general, everyone's belief is a probability distribution, something like this:
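Here's a rough Python sketch that draws one such belief curve. The bell shape, the center, and the spread are all made up; the only point is that the mass is spread out rather than piled on a single value:

```python
import numpy as np
import matplotlib.pyplot as plt

# A made-up belief about x, the share of the gap explained by biology.
# The normal shape and the parameters (0.2, 0.25) are placeholders, not estimates.
xs = np.linspace(-0.5, 1.5, 500)
belief = np.exp(-0.5 * ((xs - 0.2) / 0.25) ** 2)
belief /= belief.sum() * (xs[1] - xs[0])  # normalize so it integrates to 1

plt.plot(xs, belief)
plt.xlabel("x = share of the gap explained by biology")
plt.ylabel("probability density")
plt.show()
```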
Of course, we're never going to figure out the true value of x, but let's assume some alien is able to replicate Earth inside a simulator and keeps tweaking various parameters so they can figure out x to 9 decimal places. And they come to Earth one day and offer to sell a contract that pays $x. What would you pay for that contract? Personally, I would probably buy it at -$0.3 and sell it at $0.3 – some old-fashioned person might buy at $0.3 and sell at $0.8 or something. I don't know. Most people would assign some probability mass across a wide interval, reflecting some kind of uncertainty.
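As a toy illustration (again with a completely made-up belief distribution), the risk-neutral price of such a contract is just the expected value of x under your beliefs, and a bid/ask spread is one crude way of expressing how unsure you are:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up belief over x; nothing here is an estimate of the real answer.
samples = rng.normal(loc=0.2, scale=0.25, size=100_000)

fair_price = samples.mean()        # what a risk-neutral buyer of the $x contract would pay
bid = np.percentile(samples, 25)   # one crude choice: happily buy below the 25th percentile
ask = np.percentile(samples, 75)   # ...and happily sell above the 75th

print(f"fair ~ ${fair_price:.2f}, buy below ${bid:.2f}, sell above ${ask:.2f}")
```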
This hypothetical setup reminds me of how I feel when I read arguments trying to prove or disprove global warming. It's all fine, and I'm a big supporter of research. But at the end of the day we're still going to end up with some probability distribution. Sometimes I wonder if the focus on “truthiness” prevents action. Instead of getting together across the spectrum, agreeing that x has some uncertainty, and acting accordingly, we get stuck debating whether x is exactly 0.0 (no human contribution to the Earth's temperature) or exactly 1.0 (all of the temperature increase in the last 100 years is caused by humans).
It gets more interesting when you weigh the uncertainty against the cost of action and inaction. For every course of action, integrate the impact of acting minus the cost of acting over the probability distribution.
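As a sketch of that calculation (every input below is a placeholder I made up, not an estimate of anything), a Monte Carlo version looks something like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_net_benefit(belief_samples, impact, cost):
    """Average impact(x) over samples from your belief about x, minus the cost of acting."""
    return np.mean([impact(x) for x in belief_samples]) - cost

# Placeholder belief about x, and a placeholder impact curve saying that the more of
# the problem that is environmental (smaller x), the more the intervention helps.
belief = rng.normal(loc=0.2, scale=0.25, size=100_000)
impact = lambda x: (1 - x) * 100.0   # benefit of acting, in made-up units
cost = 10.0                          # cost of acting, same made-up units

print(expected_net_benefit(belief, impact, cost))  # positive => worth doing under this belief
```

The exact shape of the impact function doesn't matter much here; the point is that you average over your whole belief instead of arguing about a single value of x.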
In the case of climate change, let's say acting averts a 1% chance of human extinction. That's worth spending a lot of money on! Similarly, regardless of your thoughts on biological determinism, diversity efforts seem like a pretty good thing to focus on. Worst case it's insurance, best case it's an investment.
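To see why, plug in some deliberately extreme, made-up numbers:

```python
# All of these numbers are invented purely to show the shape of the argument.
p_extinction_averted = 0.01    # suppose acting removes a 1% chance of extinction
value_at_stake = 1e15          # a huge (hypothetical) dollar value on what we'd lose
cost_of_action = 1e12          # a huge (hypothetical) price tag for aggressive action

expected_benefit = p_extinction_averted * value_at_stake - cost_of_action
print(expected_benefit)  # 9e12: still hugely positive, so the spending is worth it
```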
Notes
- I updated this blog post to incorporate my own values since I realized I feel too strongly to write a neutral one. I also updated the choice of probability distribution (Beta is not a good choice, since it can't put any mass outside [0, 1]).
- A funny thing when you do these cost analyses is that it's basically some weird form of Pascal's wager in disguise. Pascal's “bet” was that believing in God had a pretty limited downside, while not believing had a potentially infinite downside (at least that's my layman recollection).
- We actually do discuss this kind of uncertainty sometimes – for instance when spending money to prevent some uncertain number of terrorism victims. It's not that we talk about the actual probabilities, but no one is hung up on trying to prove or disprove that there will be a terrorist attack. There's a debate about exactly how much money we should spend, but I don't think anyone is suggesting it should be $0.
- Of course it gets a lot more complicated if you actually try to do the math, since you can't really assign the probabilities in any rigorous way. It's some kind of Knightian uncertainty, which is sort of what Donald Rumsfeld referred to as “unknown unknowns”.
- The argument about global warming isn't exactly a novel idea, but surprisingly I haven't heard it many times. Maybe I have lame friends.