Future of Life Institute

The Future of Life Institute (FLI) is a nonprofit working to reduce global catastrophic and existential risk from powerful technologies. In particular, FLI focuses on risks from artificial intelligence (AI), biotechnology, nuclear weapons, and climate change. The Institute's work comprises three main strands: grantmaking for risk reduction, educational outreach, and advocacy within the United Nations, the US government, and European Union institutions. FLI has become one of the world's leading voices on the governance of AI, having created one of the earliest and most influential sets of governance principles: the Asilomar AI Principles.

Future of Life Institute's tracks
Liron Shapira on Superintelligence Goals by Future of Life Institute published on 2024-04-16T13:46:13Z
Annie Jacobsen on Nuclear War - a Second by Second Timeline by Future of Life Institute published on 2024-04-04T15:16:13Z
Katja Grace on the Largest Survey of AI Researchers by Future of Life Institute published on 2024-02-29T09:11:44Z
Holly Elmore on Pausing AI, Hardware Overhang, Safety Research, and Protesting by Future of Life Institute published on 2024-02-28T08:56:57Z
Sneha Revanur on the Social Effects of AI by Future of Life Institute published on 2024-02-16T10:44:28Z
Roman Yampolskiy on Shoggoth, Scaling Laws, and Evidence for AI being Uncontrollable by Future of Life Institute published on 2024-02-02T10:15:52Z
Special: Flo Crivello on AI as a New Form of Life by Future of Life Institute published on 2024-01-19T13:51:32Z
Carl Robichaud on Preventing Nuclear War by Future of Life Institute published on 2023-12-13T13:02:28Z
Frank Sauer on Autonomous Weapon Systems by Future of Life Institute published on 2023-12-13T12:46:44Z
Darren McKee on Uncontrollable Superintelligence by Future of Life Institute published on 2023-12-01T15:57:10Z