I put an “importance” score on my posts: it rates the topic of the post.¹ The scale goes from 1 (“applies only to Gavin’s life”) to 10 (“applies to the entire possible future of all life”). So it’s superexponential: a “10” is on a topic much more than a trillion times more important than a “2”.
Anything rated “4” or above could affect your life, and might be worth knowing about if you’re a generally optimising, curious, or altruistic person.
It is an enduring mystery to me why my interest in a topic is not more correlated with my own estimate of its importance. ¯\_(ツ)_/¯
- Existential risk as common cause
- AI alignment & academia
- AI ethics for present & future
- Worldview in Five Books
- 'Strangers Drowning' (2015) by MacFarquhar
- Consent as conclusive evidence
- 'Blindsight' (2006) by Watts
- 'The Great Influenza' (2004) by Barry
- Why is quality rare?
- What's the highest moral wage?
- 'Ficciones' (1944) by Borges
- Maths at the Open University
- π vs τ as clash of worldviews
- The presumed worth of anthropology
- Taking Hume's name
- Punk as ideology
- 'The Odyssey' (2017) by Emily Wilson
¹ I stole the idea from Gwern.