I put an “importance” score on my posts: it rates the topic of the post.[^1] The scale goes from 1 (“applies only to Gavin’s life”) to 10 (“applies to the entire possible future of all life”). The scale is superexponential: a “10” is on a topic much more than a trillion times more important than a “2”.
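To make the “trillion times” figure concrete, here is a minimal check, under my own simplifying assumption that each step up the scale multiplies importance by a constant ratio $r$ (the post itself only gives the endpoints):

$$
r^{8} \ge 10^{12} \;\implies\; r \ge 10^{12/8} = 10^{1.5} \approx 31.6
$$

So even a plain exponential scale needs each of the eight steps from “2” to “10” to multiply importance by roughly 32; on a superexponential scale the per-step ratio itself grows with the score, so the gap is wider still.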
Anything rated “4” or above could affect your life, and might be something you should know about if you’re a generally optimising, curious, or altruistic person.
It is an enduring mystery to me why my interest in a topic is not more correlated with my own estimate of its importance. ¯\_(ツ)_/¯
- Drake, Hanson, and the meaning of life
- Existential risk as common cause
- Why worry about future AI?
- Against the Culture
- AI alignment & academia
- AI ethics for present & future
- Worldview in Five Books
- 'Strangers Drowning' (2015) by MacFarquhar
- Consent as conclusive evidence
- 'Blindsight' (2006) by Watts
- 'The Great Influenza' (2004) by Barry
- Effective Altruism Global: x: Oxford
- Why is quality rare?
- What's the highest moral wage?
- 'Trompettes de la Mort' (2005)
- hope dies last
- Nation playlists
- 'The Odyssey' (2017) by Emily Wilson
- Staring at mystics
- Misreading Russell on radical scepticism
[^1]: I stole the idea from Gwern.