Known as Mad Max for my unorthodox ideas and passion for adventure, I have scientific interests ranging from artificial intelligence to the ultimate nature of reality.
Interesting tough love from @ESYudkowsky for everyone who thinks they probably have a workable plan for solving the AI control problem and preventing ASI from killing us all soon:
A list from @ESYudkowsky of reasons AGI appears likely to cause an existential catastrophe, and reasons why he thinks the current research community — MIRI included — isn't succeeding at preventing this from happening. https://t.co/hW4LRIAZuD