Charity Entrepreneurship (CE) is researching EA meta as one of four cause areas in which we plan to launch charities in 2021. EA meta has always been an area we have been excited about and think holds promise (after all, CE is a meta charity). Historically we have not focused on meta areas for a few reasons. One of the most important is that we wanted to confirm the impact of the CE model in more measurable areas such as global health and animal advocacy. After two formal programs we are now sufficiently confident in the model to expand to more meta charities. We were also impressed by the progress and traction that Animal Advocacy Careers made in their first year. Founded based on our research into animal interventions, this organization works at a more meta level than other charities we have incubated.
In this document, I summarize the results of 40 interviews conducted with EA experts. These interviews constitute part of CE’s research into potentially effective new charities that could be founded in 2021 to improve the EA movement as a whole.
In discussing meta charities, we are using a pretty broad definition. We include both charities that are one step removed from impact but in a single cause area (such as Animal Advocacy Careers), and more cross-cutting meta charities within the effective altruist movement.
Generally our first step when approaching an area would involve broad research and literature reviews. However, given the more limited resources focused on meta EA and our stronger baseline understanding of the area, we wanted to supplement this information with a broad range of interviews. We ended up speaking to about 40 effective altruists (EAs) across 16 different organizations and 8 current or former chapter leaders. We tried to pick people who had spent considerable time thinking about meta issues and could be considered EA experts, and overall aimed for a diverse range of perspectives.
The interviews ranged from 30 minutes to 2 hours, averaging about an hour. Not everyone answered every question, but most questions were answered by most people: the average question received ~35 responses, and none received fewer than 30. Interviewees were informed that an EA Forum post would be written containing the aggregated data but not individual responses. The background notes ran to ~100 pages.
We broke the questions down into the three sections described below.
The descriptions below aim to reflect the aggregate responses I got, not what CE thinks or my impression after speaking to everyone (that will be a different post). The results constitute one (but not the only) piece of data CE will use when coming to recommendations for a new organization.
1. Open questions
This was the hardest area to synthesize. It was surprising how much divergence there was between different people in terms of which ideas and concepts were seen as the most important.
Many ideas that came up in the open questions were already covered in the category areas, but the open questions also surfaced original ideas that I subsequently added to the categories. One of the most interesting findings was a set of concepts that came up frequently yet fell outside our categories and crucial considerations. Concepts that came up but are not described more deeply in other sections were:
Chapter support: Quite a few chapters mentioned that they could share resources and coordinate much better. It seemed like chapters can lose momentum when their leadership turns over or when the structure of support that is being offered changes dramatically. Quite a few chapters felt uncertain and lacked confidence in what the chapter landscape would look like long term, or what the career paths moving forwards from working in chapters would be.
Consistency and clarity in the scope of meta orgs: A lot of people mentioned that others overestimate the scope of what various organizations cover, and that posts explaining current and future scoping plans are extremely helpful. Many said that new career orgs benefited greatly from 80,000 Hours clarifying and describing its scope on the EA Forum, and that more posts like this could further help new meta charities get started.
Covered ideas: Many people and organizations also described to me the ground they plan to cover. In some cases, ideas are therefore not mentioned in the sections below, as they seem likely to be covered well by existing actors in the near future and are thus less relevant to the potential of new organizations.
2. Crucial considerations
Expand vs improve: The first crucial consideration I asked most people about was expanding the EA movement vs improving the people already involved. I also sometimes described this as an internal focus vs an external focus. 35% of interviewees thought it was better to generally focus on expanding, 42% to focus on improving, and 23% were unsure whether one was better than the other.
Time vs money vs information: This is a common way of breaking up meta charities; the question asked which of the three is the most important focus for new charities. Time/talent covered careers and volunteer hours; money covered funding and fundraising; ideas/information covered research output and the creation of concepts. Overall, 34% of people thought money was the most important focus, 26% thought ideas, and 23% thought talent; 17% were unsure or thought all areas were equally well covered.
Broad vs narrow: The next question was focused on broader meta organizations (for example, general mentorship or TED Talk-style outreach) vs more narrowly focused approaches (for example, training a small number of fellows or outreach for a specific idea such as effective giving). 41% leaned towards broad, 32% leaned towards narrow and 26% were unsure. In general, people seemed more tentative when considering these options than the other crucial considerations I asked about.
EA community trajectory: The next question asked for general thoughts on the EA community and about any positive trends that could be supported or negative trends that could be mitigated. This ended up tying quite closely to the next question asked, about the biggest current flaws in the EA movement. As such, I will talk about the results from both questions here.
A few concerns with the EA movement that seemed promising for new organizations to tackle came up a number of times. The 7 listed below each came up 10 times or more, and so are worth diving into in a bit more depth.
The biggest concern, and a trend that came up again and again, was that EAs tend to reinvent the wheel. Things like financial advice, management training, or even just organizational best practices are often re-derived or arrived at by trial and error rather than by talking to experts outside the EA community. People generally thought this was more of an issue for areas well established outside of EA than for areas more unique to EA (e.g., approaches to prioritization, charity reviewing, etc.).
A related issue that came up was overconfidence in EA and misplaced deference. Many people expressed major concerns that EAs will often defer to other EAs who have thought only briefly about an issue, rather than to outside experts who have put considerably more time into the area. This was often described as EAs being far too confident in the movement as a whole and its abilities relative to those outside it. It was also mentioned that more sophisticated processes are often assumed than are used in practice. For example, the assumption is that grantmakers in EA put GiveWell-like levels of time and rigor into their grantmaking, when in practice far less rigor tends to be applied. Another example was cause selection: many people expressed concerns about others assuming much stronger methodologies for initial EA cause selection than actually happened.
Another common concern with the EA movement was the limited opportunities for engagement and involvement. Past posts that were strongly upvoted on the EA Forum have discussed related issues, so it was no surprise that this was on many people's minds. A related concern was that the wrong advice is often given out, encouraging people to upskill and apply for the few highly competitive EA jobs instead of pursuing other paths to impact. In general, many suggested that having a greater variety of avenues that are socially respected and seen as impactful would be a really important way for the EA community to improve.
Overly abstract work was another weakness brought up. This concern was often directed at philosophical research: many expressed concerns that this is high-status and fun work to do but has pretty questionable impact on the world. Many people expressed a similar sentiment that EA needs more "do-ers" vs thinkers.
Lack of transparency and groups being overly concerned with info hazards (i.e. the potentially harmful consequences of sharing true information; read more here) was another recurring theme. A few people suggested that certain groups are extremely risk averse about info hazards in a way that is disproportionate to what experts in the field find necessary, and that this ultimately harms the community. Some also questioned the motives behind the risk aversion (e.g. it being used to avoid scrutiny of work). Many commented that transparency and open discussion used to be quite valued in EA and now feel far more discouraged.
The above concern tied into another worry: EA closed-mindedness and intellectual stagnation. In particular, concerns came up about closed-mindedness toward new ideas expressed by those who did not signal EA-ness clearly enough, e.g. by using different vocabulary. More generally, many expressed that certain cause areas, had they been around earlier, would likely be seen as top EA cause areas today (biorisk and mental health were both mentioned specifically). Although many had concerns about general intellectual stagnation, people were unsure whether this was due to low-hanging fruit being picked, more conversations happening behind closed doors, or the EA movement getting more closed-minded in general.
The last concern that came up many times was a worry about EAs helping each other vs people outside of their group. This was described as both an issue with funding (e.g. a lot of funding coming through personal relationships and trust) and with project direction (e.g. if many EAs were dissatisfied with an issue, a lot of work would go into it even if the area did not much affect the wider world).
3. Sub areas
We took ideas that people had historically suggested on the EA Forum (for example here, here, and here). We then organized them into nine categories, providing examples for each. For each category, we were interested in whether it was seen as above or below average, as well as whether any specific ideas stood out as promising. The summary results, ordered by average score, can be seen in table 1. Each category is described in more detail below.
Many other ideas were also mentioned. People were generally pretty positive toward them, but few came up across a wide range of people; I chose to focus on the more common views, as more data were available for them.
Overall we found this information very helpful and identified both more new ideas and a greater range of views than we expected across EAs from different spaces.
If you might be interested in founding ideas like these in the EA meta space, we recommend applying to the CE 2021 Incubation Program, which runs June 28 to August 27, 2021. First round applications are open now and close on January 15.