There’s a quote from An Inconvenient Truth that stuck with me: “There are a lot of people who go straight from denial to despair.”1 You can have an impact. Maybe it doesn’t feel like you can have a “meaningful” impact; maybe it feels like your impact wouldn’t be enough. Don’t worry. No one is asking you to do anything all on your own.
I see three avenues toward helping prevent human extinction from AI.
1.) Raising Consciousness
2.) Earning to Give
3.) Direct Impact (Research and Engineering)
This is a list of a bunch of ideas that have occurred to me. Obviously, it’s not exhaustive.
1.) Raise Consciousness
- Share this link. (I know, I know, this is slimy and memetic, and I’m truly sorry about that. But I would be remiss if I didn’t mention it.)
- Email it to your parents or your children.
- You know that Facebook friend of yours who always gets hundreds of likes even when his posts aren’t anything special? Get him to read this, and maybe he’ll share it.
- Talk about it with people. I know, how old fashioned.
- Write an article/book/screenplay/TV series. (I did warn you, these were high effort). If you do go this route, be sure to look at these blog posts.
- Found an organization that recruits math, CS, and philosophy Ph.D. students to this field.
- Found an extracurricular program for K-12 students that introduces them to effective altruism, of which extinction risk might be one of a few focus areas. (I don’t think you could justify the whole program focusing on extinction risk).
- Create an effective-altruism-based career hotline for people to call when they don’t know what to do with their life, but want to make a difference in the world. (I don’t know; I’m spitballing here.)
- Create an umbrella organization for collegiate student organizations to help recruit AI Safety researchers.
- Create a cousin of this website in another language. I don’t think a direct translation would work that well for the whole thing, and I think it would be better if you gave it your own voice, but feel free to plagiarize as much as you want. Just cite me once at the end.
- Fundraise. (I’ll get into specific organizations in a moment.)
- Think of people you know who could do one of these things, and talk to them.
2.) Earn to Give
Earning to give requires a bit of an introduction. Suppose a lawyer wants to feel like he has helped the homeless. One thing he could do is go to a soup kitchen for an hour. Now suppose he wants to actually help the homeless. What if, instead of going to the soup kitchen himself, he worked an extra hour at his job and donated the extra money he made? With that money, the soup kitchen could hire someone else to work for tens of hours. If the lawyer does this, then in the same amount of time, he has helped many more people.
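The leverage in that example comes down to a few lines of arithmetic. Here’s a minimal sketch, using a hypothetical $300/hour lawyer and a $15/hour soup-kitchen wage (both numbers are mine, chosen only for illustration):

```python
# Hypothetical rates, chosen only to illustrate the leverage of donating.
LAWYER_HOURLY_RATE = 300.0   # dollars earned per extra hour of legal work (assumed)
KITCHEN_HOURLY_WAGE = 15.0   # dollars to fund one hour of soup-kitchen staffing (assumed)

def hours_of_kitchen_work_funded(extra_hours_worked: float) -> float:
    """Hours of soup-kitchen staffing funded by donating the extra earnings."""
    donation = extra_hours_worked * LAWYER_HOURLY_RATE
    return donation / KITCHEN_HOURLY_WAGE
```

With these made-up rates, one extra billable hour funds twenty hours of kitchen work, consistent with the “tens of hours” in the example above.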
Earning to give, sadly, puts at odds our desire to help others with our desire to have the experience of directly helping others. Some psychological tricks might allow for a win-win here. Maybe our putative lawyer could spend that hour he’s working at the office reminding himself that it’s like he’s Vishnu serving soup at the soup kitchen… because it’s like he’s serving many more people… Get it? Vishnu has lots of arms? Probably good at serving soup? *Sigh* Never mind. Let’s just enjoy the thought of Vishnu having found his calling and ladling soup like there’s no tomorrow.
If your goal is to forestall human extinction, then for most people reading this, the most effective thing you can do is probably to earn money to help pay the salaries of people doing the other things on this list. I know that giving away money can feel like losing hit points. I have been tempted by the feeling: “Really, it’s people richer than me who can afford philanthropy.”
And yet, if you donate, you can know that at least you’ve done something. I have a friend named Ben who’s living that starving artist life, and he still donates to MIRI. You don’t have to make a lot to give what you can.
Depending on the dollar amount, this approach could be low effort or high effort.
3.) Direct Impact
- Error 404. Sorry, I couldn’t come up with any of these.
- Research any of the open problems in AI Safety at MIRI or FHI. For MIRI in particular, if you don’t think you have the qualifications to do research, but you like math, check this out. And this is, I think, the most compelling research agenda put forward so far; it outlines which research questions we should pursue today.
- FHI Research Areas
- AI Safety
- Technology Forecasting and Risk Assessment
- Policy and Industry
- MIRI Research Areas
- Realistic World-Models
- Logical Uncertainty
- Error Tolerance
- Value Specification
- Do similar research at a university (probably within a computer science department, or potentially a philosophy department).
- Work for OpenAI or DeepMind, while keeping up to speed with AI safety research.
- 80,000 Hours wrote an article that has some more good ideas.
CFAR (the organization I mentioned that’s trying to expand the pipeline to AI safety) has a link on their website where you can sign up for a 20-minute conversation. They’d be excellent at brainstorming with you about how you could help in a way that’s a good fit for you. They’d certainly be much more helpful than this unpersonalized list I’ve made.
Alternatively, you can contact me, and we can talk about how you can use your skill set to help preserve biological life. Don’t be shy if you’re only interested in helping in a small way. That’s totally fine!
I almost forgot to mention: thanks for your interest in helping out. Or at least being interested enough to click the link to this page. I was hoping you’d end up here. Thanks for reading!
Some extra ideas for college students:
- Create a college organization or club whose mission is to expand the pipeline of people working on AI Safety.
- Try out some computer science, math, and philosophy classes, and see if you like them. If so, take lots. If not, consider taking classes that will help you get a well-paying job, so that you can donate a lot of money to the cause.
1. Guggenheim, D. (Director). (2006). An inconvenient truth: A global warning. Hollywood: Paramount. ↩