The Big Nonprofits Post 2025
Description
There remain lots of great charitable giving opportunities out there.
I have now had three opportunities to serve as a recommender for the Survival and Flourishing Fund (SFF). I wrote in detail about my first experience back in 2021, when I struggled to find worthy applications.
The second time around, in 2024, there was an abundance of worthy causes. In 2025 there were even more high-quality applications, many from organizations that were growing beyond our ability to support them.
Thus this is the second edition of The Big Nonprofits Post. It is primarily aimed at sharing my findings on various organizations I believe are doing good work, at helping you find places to consider donating in the cause areas and intervention methods you think are most effective, and at offering my general perspective on how I think about choosing where to give.
This post combines my findings from the 2024 and 2025 rounds of SFF, and also includes some organizations that did not apply to either round, so inclusion does not mean an organization necessarily applied at all.
This post is already very long, so the bar is higher for inclusion this year than it was [...]
---
Outline:
(01:39 ) A Word of Warning
(02:50 ) A Note To Charities
(03:53 ) Use Your Personal Theory of Impact
(05:40 ) Use Your Local Knowledge
(06:41 ) Unconditional Grants to Worthy Individuals Are Great
(08:59 ) Do Not Think Only On the Margin, and Also Use Decision Theory
(10:03 ) Compare Notes With Those Individuals You Trust
(10:35 ) Beware Becoming a Fundraising Target
(11:02 ) And the Nominees Are
(14:34 ) Organizations That Are Literally Me
(14:49 ) Balsa Research
(17:30 ) Don't Worry About the Vase
(19:04 ) Organizations Focusing On AI Non-Technical Research and Education
(19:35 ) Lightcone Infrastructure
(22:09 ) The AI Futures Project
(23:50 ) Effective Institutions Project (EIP) (For Their Flagship Initiatives)
(25:29 ) Artificial Intelligence Policy Institute (AIPI)
(27:08 ) AI Lab Watch
(28:09 ) Palisade Research
(29:20 ) CivAI
(30:14 ) AI Safety Info (Robert Miles)
(31:00 ) Intelligence Rising
(31:46 ) Convergence Analysis
(32:43 ) IASEAI (International Association for Safe and Ethical Artificial Intelligence)
(33:28 ) The AI Whistleblower Initiative
(34:10 ) Organizations Related To Potentially Pausing AI Or Otherwise Having A Strong International AI Treaty
(34:18 ) Pause AI and Pause AI Global
(35:45 ) MIRI
(36:59 ) Existential Risk Observatory
(37:59 ) Organizations Focusing Primarily On AI Policy and Diplomacy
(38:37 ) Center for AI Safety and the CAIS Action Fund
(40:17 ) Foundation for American Innovation (FAI)
(43:07 ) Encode AI (Formerly Encode Justice)
(44:12 ) The Future Society
(45:08 ) Safer AI
(45:47 ) Institute for AI Policy and Strategy (IAPS)
(46:55 ) AI Standards Lab (Holtman Research)
(48:01 ) Safe AI Forum
(48:40 ) Center For Long Term Resilience
(50:19 ) Simon Institute for Longterm Governance
(51:16 ) Legal Advocacy for Safe Science and Technology
(52:24 ) Institute for Law and AI
(53:07 ) Macrostrategy Research Institute
(53:41 ) Secure AI Project
(54:20 ) Organizations Doing ML Alignment Research
(55:36 ) Model Evaluation and Threat Research (METR)
(57:01 ) Alignment Research Center (ARC)
(57:40 ) Apollo Research
(58:36 ) Cybersecurity Lab at University of Louisville
(59:17 ) Timaeus
(01:00:19 ) Simplex
(01:00:52 ) FAR AI
(01:01:32 ) Alignment in Complex Systems Research Group
(01:02:15 ) Apart Research
(01:03:20 ) Transluce
(01:04:26 ) Organizations Doing Other Technical Work
(01:04:31 ) AI Analysts @ RAND
(01:05:23 ) Organizations Doing Math, Decision Theory and Agent Foundations
(01:06:43 ) Orthogonal
(01:07:38 ) Topos Institute
(01:08:34 ) Eisenstat Research
(01:09:16 ) AFFINE Algorithm Design
(01:09:45 ) CORAL (Computational Rational Agents Laboratory)
(01:10:35 ) Mathematical Metaphysics Institute
(01:11:40 ) Focal at CMU
(01:12:57 ) Organizations Doing Cool Other Stuff Including Tech
(01:13:08 ) ALLFED
(01:14:46 ) Good Ancestor Foundation
(01:16:09 ) Charter Cities Institute
(01:16:59 ) Carbon Copies for Independent Minds
(01:17:40 ) Organizations Focused Primarily on Bio Risk
(01:17:45 ) Secure DNA
(01:18:42 ) Blueprint Biosecurity
(01:19:31 ) Pour Domain
(01:20:19 ) ALTER Israel
(01:20:56 ) Organizations That Can Advise You Further
(01:21:33 ) Effective Institutions Project (EIP) (As A Donation Advisor)
(01:22:37 ) Longview Philanthropy
(01:24:08 ) Organizations That Then Regrant to Fund Other Organizations
(01:25:19 ) SFF Itself (!)
(01:26:52 ) Manifund
(01:28:51 ) AI Risk Mitigation Fund
(01:29:39 ) Long Term Future Fund
(01:31:41 ) Foresight
(01:32:31 ) Centre for Enabling Effective Altruism Learning & Research (CEELAR)
(01:33:28 ) Organizations That Are Essentially Talent Funnels
(01:35:24 ) AI Safety Camp
(01:36:07 ) Center for Law and AI Risk
(01:37:16 ) Speculative Technologies
(01:38:10 ) Talos Network
(01:38:58 ) MATS Research
(01:39:45 ) Epistea
(01:40:51 ) Emergent Ventures
(01:42:34 ) AI Safety Cape Town
(01:43:10 ) ILINA Program
(01:43:38 ) Impact Academy Limited
(01:44:15 ) Atlas Computing
(01:44:59 ) Principles of Intelligence (Formerly PIBBSS)
(01:45:52 ) Tarbell Center
(01:47:08 ) Catalyze Impact
(01:48:11 ) CeSIA within EffiSciences
(01:49:04 ) Stanford Existential Risk Initiative (SERI)
(01:49:52 ) Non-Trivial
(01:50:27 ) CFAR
(01:51:35 ) The Bramble Center
(01:52:28 ) Final Reminders
---
First published:
November 26th, 2025
Source:
https://www.lesswrong.com/posts/FJxc4Lk6mijiFiPp2/the-big-nonprofits-post-2025
---
Narrated by TYPE III AUDIO.