I put an “importance” score on my posts: it rates the topic of the post.1 The scale goes from 1 (“applies only to Gavin’s life”) to 10 (“applies to the entire possible future of all life”). So the scale is superexponential: a “10” topic is much more than a trillion times more important than a “2”.
Anything rated “4” or above could affect you, and might be something you should know as a generally optimising, curious, or altruistic person.
It is an enduring mystery to me why my interest in a topic is not more correlated with my own estimate of its importance. ¯\_(ツ)_/¯
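One way to make the “superexponential” claim concrete: if importance grew merely exponentially in the score (say, a fixed factor of 10 per point), a “10” would be only 10⁸ times a “2”, which falls short of “much more than a trillion”. A hypothetical mapping consistent with the claim (the exact function is my illustration, not Gavin’s) makes the exponent itself grow with the score:

```python
def importance(score: float) -> float:
    # Hypothetical superexponential mapping (illustration only):
    # the exponent grows with the score, so ratios between scores
    # blow up faster than any fixed-base exponential would.
    return 10 ** (score * (score - 1) / 4)

# Under this mapping, importance(10) / importance(2) = 10**22,
# comfortably "much more than a trillion" (10**12).
# A plain exponential scale (10**score) would give only 10**8.
```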
|> Drake, Hanson, and the meaning of life
|> Existential risk as common cause
|> Why worry about future AI?
|> Against the Culture
|> AI alignment & academia
|> AI ethics for present & future
|> Worldview in Five Books
|> 'Strangers Drowning' (2015) by MacFarquhar
|> 'The Decline' (1999) by NOFX
|> Pop auteurs
|> Mathematical dark matter
|> 'Trompettes de la Mort' (2005)
|> hope dies last
|> Nation playlists
|> 'The Odyssey' (2017) by Emily Wilson
|> To be led out
|> Misreading Russell on radical scepticism
1. I stole the idea from Gwern.