I don’t regularly read Freddie deBoer, but Sam Atis and Scott Alexander directed me to one of his articles on Effective Altruism this week. A lot of it felt intuitively wrong, and here is an attempt to state simply why. The initial caveat is that this is what EA is for me; I don’t speak for EA. That’s true for everyone.
1. EA is about consequentialism, not utilitarianism. The consequences of feeding the bread to two starving children rather than your own starving son (assuming it’s the same amount of bread, which is a weird setup) aren’t just about who survives; they’re about the world we live in. A pure strawman utilitarian might commend a doctor for saving the lives of five people who urgently need organ donors by harvesting from the perfectly healthy person coming in for a prescription. A consequentialist would look at the effect on society of people scared to go to the doctor for fear of being harvested - i.e. the actual total consequences of the action - and see that it’s a net negative thing to do. This is also why EA isn’t “live as a pauper and donate everything to the most effective charities” but “donate 10% and do what you want with the rest”. The actual consequence of donating everything is that you’ll give up on it, because that life sucks. Donating 10% is more like “this is fine, life is good, I feel good about this, and I’m also happy to buy an ice-cream for my kids when I feel like it”. It’s something you can keep going with in the long term.
2. EA can’t be about branding, because a normal person lowers their standing by bringing it up nearly anywhere. This happens to me: my personal and social life is probably worse for each time I’ve mentioned EA to someone.
3. Revealed preferences are a thing. The logic of EA is sound, but people generally don’t act on it. The movement is about acting on the logic, not about the logic itself.
4. The edges are not the middle. Just because some people argue that carnivores should be pushed into extinction doesn’t mean many or most people who consider themselves EA think that. Ditto with the other issues. Is more going into longtermism than it should? Yeah, look, probably. I personally think the best way to protect against long-term risk is to get another billion people on board with helping - but I really don’t know.
5. EA dissolves into weirdness when you demand substance from particular people who like to have a new answer to everything. You’re asking for more when you don’t need more. “Do good well” is enough if the focus of the phrase is the “do” rather than the rest. See point 3 for further detail.
6. You could argue that consequentialism shares utilitarianism’s flaw of pretending it can predict the future - honestly, the flaw is probably magnified with consequentialism, because the “total consequences” of an action are so spread out. I’d say it’s a moot point in the discussion of EA, because the gap between a “Top Charity” and the average one is so vast that you don’t need to run the calculation very far. The top charities are most likely very good, so just go with that. Be a satisficer, not a maximiser.
7. The article says: “Generating the most human good through moral action isn’t a philosophy; it’s an almost tautological statement of what all humans who try to act morally do. This is why I say that effective altruism is a shell game. That which is commendable isn’t particular to EA and that which is particular to EA isn’t commendable.” I don’t think I 100% understand what’s going on here. I can say that many humans aren’t focussed on trying to act morally, so getting people to focus on it more is good. And if there are more EAs, then there’s more of the commendable stuff, right? Which is good? I’d also say that much of the particular EA stuff which isn’t commendable isn’t stuff that most EAs are keen on. The most particular stuff in any movement usually happens at its fringes, which by definition aren’t the movement.
8. Growing the EA movement is worth a lot. If I hadn’t heard about it, my life would be different: I wouldn’t have passed it on, and I wouldn’t be giving away much money. I haven’t run the numbers on $10 million promoting a book, but if the average EA gives away 10% of a $70,000 income every year for 40 years, that’s $280k each. At that rate it only takes 36 signups to make the initial investment worth it, as the arithmetic below spells out. That ballpark seems like a decent bet.
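For the record, here is the back-of-the-envelope arithmetic. The $70,000 average income and 40-year giving horizon are the assumptions from the paragraph above, and the sketch ignores discounting and drop-off, as the original estimate does:

$$
\begin{aligned}
\text{lifetime giving per signup} &= 0.10 \times \$70{,}000 \times 40 = \$280{,}000\\
\text{break-even signups} &= \frac{\$10{,}000{,}000}{\$280{,}000} \approx 35.7 \;\Rightarrow\; 36
\end{aligned}
$$

The conclusion is fairly robust: even if you halve the income figure or the horizon, break-even only rises to about 72 or 71 signups respectively, still a tiny number against a $10 million campaign.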