I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%.
This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:
Assume that before COVID, you were considering two theories:
- Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
- Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.
And suppose before COVID you were 50-50 about which of these were true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.
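For what it's worth, the arithmetic in the footnote does check out. A minimal sketch that reproduces it, using only the numbers from the quoted text (the function name and structure are mine):

```python
# Reproduce the quoted footnote's update: two theories about how often
# lab-leak pandemics occur, 50-50 prior, updated on one decade's outcome.
# All probabilities come from the quoted text, not my own estimates.

def posterior_common(prior_common, p_common, p_rare, leak_observed):
    """Bayes update: probability of 'Lab Leaks Common' after one decade."""
    like_common = p_common if leak_observed else 1 - p_common
    like_rare = p_rare if leak_observed else 1 - p_rare
    num = prior_common * like_common
    return num / (num + (1 - prior_common) * like_rare)

P_COMMON, P_RARE = 0.33, 0.10   # per-decade pandemic chance under each theory
PRIOR = 0.50                    # 50-50 between the two theories

for leak in (True, False):
    post = posterior_common(PRIOR, P_COMMON, P_RARE, leak)
    predictive = post * P_COMMON + (1 - post) * P_RARE
    print(f"leak observed={leak}: P(Common)={post:.0%}, "
          f"pandemic chance per decade={predictive:.1%}")
```

Observing a leak moves P(Common) from 50% to about 77% (the quoted "76-24") and the per-decade pandemic probability from 21.5% to about 27.7%, i.e. the quoted 21% and 27.5% up to rounding; observing no leak drops it to about 19.8%, matching "19-20%".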
Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.
I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.
Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.
Nah, we’re just not giving him the benefit of the doubt, and we also have a lot of context to work with.
Consider the fact that he explicitly writes that you are allowed to reconsider your assumptions on domestic terrorism if a second trans mass shooter incident “happens in a row,” yet a few paragraphs later, when Effective Altruists blow up both FTX and OpenAI in the space of a year, that second incident is immediately laundered away as the unfortunate result of them overcorrecting in good faith against unchecked CEO power.
In my opinion, this inconsistency should stick out even to someone approaching the piece with a blank slate.