Most likely is not likely

https://www.lesswrong.com/posts/7hAwnpA5JKYeQLAFG/most-likely-is-not-likely

Cross-posted from Telescopic Turnip.

"And I stop at every mirror just to stare at my own posterior." – Megan Thee Stallion

I love Bayesian statistics just as much as you do. However, there is a trap that people keep falling into when interpreting Bayesian statistics. Here is a quick explainer, just so I can link to it in the future.

You roll a 120-faced die. If it falls on one, you stop. If it gives any other number, you re-roll until you get a one. How many times do you think you will have to roll the die?

Let’s do the math: you have a 1⁄120 chance of rolling the die only once, a 119⁄120 × 1⁄120 chance of rolling it twice, and so on. So a single roll is the most likely outcome; any other number of rolls will happen less often. Yet a 1⁄120 chance is not even 1%. In this case, "most likely" is actually pretty unlikely (a short numerical sketch follows below).

While this problem is pretty obvious in a simple die-roll game, it can become very confusing for more complicated questions. One of them, a classic Internet flamewar topic, is the doomsday argument. It’s a calculation, using Bayesian statistics, of the year in which humanity is most likely to go extinct. Briefly, it goes like this:
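As a quick numerical check of the die example above, here is a minimal Python sketch (the geometric-distribution framing and the median calculation are my own illustration, not from the post):

```python
from math import log, ceil

p = 1 / 120  # probability of rolling a one on any single roll

def prob_exactly(k: int) -> float:
    """Probability that the first one appears on roll k (geometric distribution)."""
    return (1 - p) ** (k - 1) * p

print(f"P(exactly 1 roll)  = {prob_exactly(1):.5f}")  # ~0.00833: the single most likely outcome
print(f"P(exactly 2 rolls) = {prob_exactly(2):.5f}")  # ~0.00826: slightly smaller, and so on

expected_rolls = 1 / p                      # mean of the geometric distribution: 120 rolls
median_rolls = ceil(log(0.5) / log(1 - p))  # smallest k with P(done within k rolls) >= 1/2: 83
print(f"expected rolls = {expected_rolls:.0f}, median rolls = {median_rolls}")
```

Running it shows the point of the example: the single most likely outcome has well under a 1% chance, while on average you need 120 rolls.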

Comment

https://www.lesswrong.com/posts/7hAwnpA5JKYeQLAFG/most-likely-is-not-likely?commentId=WXojyJwvTbEtJ6EGj

I think the true reference class relevant for the doomsday argument is "humans who think about the doomsday argument", rather than "humans in general". Thus, one way in which the doomsday argument might not apply is if future generations stopped thinking about the doomsday argument.

As an illustration, imagine that humanity was replaced by p-zombies/automata without any internal thoughts at all. That seems like it would satisfy the doomsday argument’s prediction. Thus, removing only the internal thoughts related to the doomsday argument should do so as well.

I’d therefore like to suggest the doomsday argument as a possible infohazard, one we should try to avoid discussing where we can. (To be clear, I don’t blame you at all for bringing it up, and I agree with your main point that conflating "most likely" with "very likely" is harmful and happens often. I just think that mentioning the doomsday argument is a bad idea.)

Comment

https://www.lesswrong.com/posts/7hAwnpA5JKYeQLAFG/most-likely-is-not-likely?commentId=k3NeXzRonZSv8D5re

What if future generations just stop thinking about it because they’ve made a safe enough world that they no longer worry about such things? That’s not a doomsday.