the thing about EA, and rationalists in general, and the offshoots — high openness optimizes *against* false negatives, i.e. bizarre-sounding ideas that turn out to have value. high openness evaluates those ideas seriously despite their odd packaging.

this is a genuine tradeoff! you get false positives instead.

however: doing the opposite also entails a genuine tradeoff. scrupulously excluding the false positives means you will incur the false negatives: valuable ideas dismissed for sounding weird.

hence it is healthy for there to be various cultures with different epistemic approaches sifting through all the ideas out there