Rethink Charity

Powering High-Impact Charitable Projects

EA Survey 2017 Series Part 2: Community Demographics and Beliefs

2017-08-29

By: Katie Gertsch

The annual EA Survey is a volunteer-led project of Rethink Charity that has become a benchmark for better understanding the EA community. This post is the second in a multi-part series intended to provide the survey results in a more digestible and engaging format. Bear in mind the potential for sampling bias and the other considerations outlined in the methodology post published here. You can find key supporting documents, including prior EA surveys and an up-to-date list of articles in the EA Survey 2017 Series, at the bottom of this post.

 

Summary

  • EAs remain predominantly young and male, though there has been a small increase in female representation since the 2015 survey.
  • The five cities with the highest concentration of EAs are the San Francisco Bay Area, London, New York, Boston/Cambridge, and Oxford.
  • The proportion of EAs who identify as atheist, agnostic, or non-religious fell from 87% in the 2014 and 2015 surveys to 80% in the 2017 survey.
  • The number who saw EA as a moral duty or an opportunity increased, and the number who saw it only as an obligation decreased.

Age


The EA community is still predominantly represented by a young adult demographic, with 81% of those giving their age in the EA survey falling between 20 and 35 years of age[1]. This year, ages ranged from 15 to 77, with a mean age of 29, a median age of 27, and a standard deviation of 10 years. The histogram below shows the distribution of ages.

Graph depicting ages of EAs.

[1] Ages were calculated by subtracting the self-reported birth year from 2017.
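As the footnote describes, ages were derived from self-reported birth years. A minimal sketch of that calculation, using made-up birth years rather than the actual survey data:

```python
import statistics

SURVEY_YEAR = 2017

# Illustrative birth years only -- the real responses are in the anonymized raw data.
birth_years = [1990, 1988, 1995, 1979, 1992, 2000, 1985, 1940, 1990, 1991]

# As in the survey, age = 2017 minus the self-reported birth year.
ages = [SURVEY_YEAR - y for y in birth_years]

mean_age = statistics.mean(ages)
median_age = statistics.median(ages)
sd_age = statistics.stdev(ages)

# Share of respondents aged 20-35 inclusive (81% in the actual survey)
share_20_35 = sum(20 <= a <= 35 for a in ages) / len(ages)
```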

Gender

The survey respondents were male by a wide majority. Of the 1,080 who answered the question about gender self-identification, 757 (70.1%) identified as male, 281 (26.0%) identified as female, 21 (1.9%) identified as “other”, and another 21 preferred not to answer. This is similar to the 2015 survey, in which 73% identified as male.

Chart depicting responses to the question "In which country do you live?"

Consistent with the results of the previous survey, the US and UK are the main hubs for EA, home to the majority (63.4%) of this year’s surveyed EAs. Additionally, the top five countries by number of respondents (US, UK, Germany, Canada, and Australia) from the 2015 survey remain the top five in 2017. Australia and New Zealand both dropped slightly in rank, and we saw a small increase in EAs living in northern and central European countries, such as Germany, Denmark, Sweden, the Netherlands, and the Czech Republic. Representation from continental Europe overall rose from 14% to 18%.

Chart depicting responses to the question

The San Francisco Bay Area (which includes Berkeley, San Francisco, Oakland, Mountain View, Menlo Park, and other areas) remains the area with the most EAs in our survey, but it outnumbers London by only a very small margin. This gap between London and the Bay Area has shrunk substantially since 2015.

 

Oxford, Boston/Cambridge (US), and Cambridge (UK) all show consistently high numbers of EAs. Washington, D.C. dropped from fifth to eleventh among cities by number of EA respondents. Newly reported additions include Berlin, Sydney, Madison, Oslo, Toronto, Zürich, Munich, Philadelphia, and Bristol.

Chart depicting responses to the question

The proportion of atheist, agnostic, or non-religious respondents is lower than in the 2015 survey: 80.6% this year, compared with 87% in 2015. That figure had not changed between the previous two surveys, so this could be an indicator that inclusion of people of faith in the EA community is increasing.

As noted in 2015, it has been suggested that EA should make greater efforts to be more inclusive of religious groups. The numbers still show considerable room for growth in religious communities.

Chart depicting responses to question

The distribution of responses regarding moral philosophy is extremely similar to the last survey. In 2015, 56% selected Consequentialism (Utilitarian), 22% selected No opinion or not familiar with these terms, 13% Non-utilitarian consequentialism, 5% Virtue Ethics, and 3% Deontology. Among this year’s respondents, the distribution of philosophical stances has not noticeably changed.

 

Do they see EA as an opportunity or an obligation?

This question was inspired by Peter Singer’s classic essay on whether doing a tremendous amount of good is an obligation or an opportunity, which inspired commentary by Luke Muehlhauser (see this post) and Holden Karnofsky (see this post), among others. Perhaps even more than a preferred moral philosophical stance, this gives us a view into participants’ motivations for being effective altruists.

 

The 2015 survey posed this question a little differently, presenting the choices as ‘Opportunity,’ ‘Obligation,’ or ‘Both,’ with no ‘Moral Duty’ option. Both surveys also included ‘Other’ as a choice. About the same proportion chose ‘Both’ in 2015 as selected ‘Moral Duty’ this year, perhaps because ‘Moral Duty’ carries a richer connotation than the narrower, somewhat negatively tinged ‘Obligation.’

 

From 2015 to this year, those who saw EA as only an opportunity stayed the same, while those seeing it only as an obligation decreased significantly.

 

By offering ‘Moral Duty’ as a response, we may have given those who see participating in EA as primarily a dutiful action a more neutral (less negative) and/or more principled (less self-focused) match for their personal interpretation.

 

Credits

Post written by Katie Gertsch, with edits from Tee Barnett and analysis from Peter Hurford.

 

A special thanks to Ellen McGeoch, Peter Hurford, and Tom Ash for leading and coordinating the 2017 EA Survey. Additional acknowledgements include: Michael Sadowsky and Gina Stuessy for their contribution to the construction and distribution of the survey, Peter Hurford and Michael Sadowsky for conducting the data analysis, and our volunteers who assisted with beta testing and reporting: Heather Adams, Mario Beraha, Jackie Burhans, and Nick Yeretsian.

 

Thanks once again to Ellen McGeoch for her presentation of the 2017 EA Survey results at EA Global San Francisco.

 

We would also like to express our appreciation to the Centre for Effective Altruism, Scott Alexander via Slate Star Codex, 80,000 Hours, EA London, and Animal Charity Evaluators for their assistance in distributing the survey. Thanks also to everyone who took and shared the survey.

 

Supporting Documents

EA Survey 2017 Series Articles

I – Distribution and Analysis Methodology

II – Community Demographics & Beliefs

III – Cause Area Preferences

IV – Donation Data

V – Demographics II

VI – Qualitative Comments Summary

VII – Have EA Priorities Changed Over Time?

Please note: this section will be continually updated as new posts are published. All 2017 EA Survey posts will be compiled into a single report at the end of this publishing cycle.

 

Prior EA Surveys conducted by Rethink Charity (formerly .impact)

The 2015 Survey of Effective Altruists: Results and Analysis

The 2014 Survey of Effective Altruists: Results and Analysis

 

Raw Data

Anonymized raw data for the entire EA Survey can be found here.


EA Survey 2017 Series Part 1: Distribution and Analysis Methodology

2017-08-29

By: Ellen McGeoch and Peter Hurford

The annual EA Survey is a volunteer-led project of Rethink Charity that has become a benchmark for better understanding the EA community. This post is the first in a multi-part series intended to provide the survey results in a more digestible and engaging format. You can find key supporting documents, including prior EA surveys and an up-to-date list of articles in the EA Survey 2017 Series, at the bottom of this post.

Platform and Collection 

Data was collected using LimeSurvey. This year, a “Donations Only” version of the survey was created for respondents who had filled out the survey in prior years. This version was shorter and could be linked to responses from prior years if the respondent provided the same email address each year.

Distribution Strategy

Author Note: Any mention of distributing “the survey” refers to both the URL of the full effective altruism (EA) survey and the URL of the “Donations Only” version. Each URL carried a unique tracking tag identifying the organization or group sharing it and the medium on which it was shared. For example, the URLs shared in the 80,000 Hours newsletter had the tracking tag “80k-nl”.
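A tagging scheme like this can be sketched as follows. The base URL and the query parameter name are hypothetical; only the “80k-nl” tag comes from the text:

```python
from urllib.parse import urlencode

# Hypothetical survey URL -- stand-in for the real LimeSurvey link.
BASE_URL = "https://example.org/ea-survey-2017"

def tracked_url(tag: str) -> str:
    """Append a referrer tracking tag so responses can be attributed to a channel."""
    return f"{BASE_URL}?{urlencode({'r': tag})}"

# e.g. the link shared in the 80,000 Hours newsletter carried the tag "80k-nl"
link = tracked_url("80k-nl")
```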

 

Distribution began on April 19, 2017 and continued on a rolling basis until the survey closed on June 16, 2017. The expansive outreach plan, and the lag time associated with particular forms of outreach, necessitated distributing the survey on a rolling basis. We reached out to over 300 individuals and groups who posted the survey on our behalf, or whose group administrators needed to grant permission for a member of the survey team to post the link to a specific site.

 

To minimize undersampling and oversampling of different parts of EA, and to make the survey as representative of the community as possible, we initially followed the distribution plan from the 2014 and 2015 EA surveys and made modifications based on team consensus. This plan was implemented in 2014 by Peter Hurford, Tom Ash, and Jacy Reese to reach as many members of the EA population as possible. Certain additions and omissions were made depending on the availability of particular channels since the plan was first drafted. Anyone who had access to the survey was encouraged to share it.

 

An appropriate amount of caution should accompany any interpretation of the EA survey results. While the distribution plan included all known digital avenues to reach the EA population, there is room for error and bias in this plan. Claims that a certain percentage of respondents to the survey have certain predispositions or harbor certain beliefs should not necessarily be taken as representative of all EAs or “typical” of EAs as a whole. Any additional suggestions on how to reach the EA community are welcome.

 

In an attempt to maximize community engagement, we distributed the survey through email mailing lists, the EA Slack group, social networks, forums, websites, emails to prior survey takers, and personal contact.

 

The survey was shared on the following websites and forums:

EA Forum, EA Hub, LessWrong, and Slate Star Codex

The survey team reached out to the following mailing lists and listservs to share the survey (those marked with an asterisk confirmed that they had shared it):

2013 EA Summit Alumni, 80,000 Hours, Animal Charity Evaluators, Bay Area LessWrong, Center For Applied Rationality Alumni, EA Epic, EA Global, EA London, Giving What We Can, Harvard EA, Leverage, Official EA Newsletter, Overcoming Bias NYC, Stanford EA

The survey was posted to the following general Facebook groups:

List of general Facebook groups

The survey was shared with the following local and university Facebook groups, though it may not have been posted to all of them, depending on administrator permissions:

List of cities and universities

The survey was also emailed to those who had taken the 2014 and/or 2015 survey and had provided their email address.

 

Data Analysis

 

Analysis began on June 16, 2017 when the dataset was exported and frozen. Any responses after this date were not included in the analysis. The analysis was done by Peter Hurford with assistance from Michael Sadowsky.

 

Analysis was done in R. All scripts and associated data can be found in the public GitHub repository for the project (see the repository here and the anonymized raw data for the 2017 survey here). Data was collected by Ellen McGeoch and then transferred to the analysis team in an anonymized format, as described in the survey’s privacy policy. Currencies were converted into US dollars and standardized, then processed and analyzed using the open-source surveytools2 R package created by Peter Hurford.
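The currency standardization step might look roughly like the following sketch (in Python rather than R, and with placeholder exchange rates rather than the rates used in the actual analysis):

```python
# Placeholder exchange rates to USD -- not the rates used in the real analysis.
USD_RATES = {"USD": 1.0, "GBP": 1.28, "EUR": 1.12, "AUD": 0.75}

def to_usd(amount: float, currency: str) -> float:
    """Convert a reported donation into US dollars so amounts are comparable."""
    return amount * USD_RATES[currency]

# Illustrative (amount, currency) pairs, not actual survey responses.
donations = [(100.0, "USD"), (50.0, "GBP"), (200.0, "EUR")]
standardized = [to_usd(amount, currency) for amount, currency in donations]
```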

 

Subpopulation Analysis

 

In general, people found our survey via Facebook (such as the main EA Facebook group, but not including Facebook pages for local groups), SlateStarCodex, local groups (mailing lists and Facebook groups), the EA Forum, the EA Newsletter, people personally sharing the survey with others, LessWrong, Animal Charity Evaluators (social media and newsletter), 80,000 Hours (newsletter), and an email sent to prior survey takers.

 

By numbers, the referrers broke down like this:

Chart of survey referral data

Referrer data was gathered via URL tracking. We also asked people to self-report where they heard about the survey. As in the 2014 and 2015 surveys, the self-report data does not line up perfectly with the URL data (e.g., only 72.73% of those whose URL tracking shows they took it from the EA Newsletter said they heard about the survey from the EA Newsletter). While we don’t know the cause of this, one possibility is that some individuals first hear of the survey from one source but don’t actually take it until they see it posted via another source. Given this discrepancy, we consider URL tracking the more reliable way to determine referrers.
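The agreement check between URL-tracked and self-reported referrers can be sketched like this, with illustrative records rather than actual survey responses:

```python
# Illustrative (url_tracked_referrer, self_reported_referrer) pairs.
records = [
    ("ea-newsletter", "ea-newsletter"),
    ("ea-newsletter", "facebook"),      # heard of it in one place, took it from another
    ("lesswrong", "lesswrong"),
    ("ssc", "ssc"),
]

def agreement_rate(rows, referrer):
    """Of respondents whose URL tag names `referrer`, the share who self-reported the same source."""
    tagged = [row for row in rows if row[0] == referrer]
    return sum(row[1] == referrer for row in tagged) / len(tagged)

rate = agreement_rate(records, "ea-newsletter")  # 1 of 2 agree
```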

 

Since we know what populations we are drawing from, we want to answer two key questions:

 

  • Do our subpopulations successfully capture EA as a whole? If 2.2% of our population (19 LessWrong referrals divided by 856 respondents) came from LessWrong, is this close to the “true” proportion of self-identified EAs who frequent LessWrong more than other channels? Are we over- or under-sampling LessWrong or other channels? Are we systematically missing any part of EA by not identifying the correct channels to get people to respond?

 

  • Do we successfully capture our subpopulations? Are the people who take the survey from LessWrong actually representative of EAs who frequent LessWrong more than other channels? Are we systematically misrepresenting who EAs are by getting a skewed group of people who take our survey?

 

Do our subpopulations successfully capture EA as a whole?

 

Unfortunately, we can’t answer this question outright without knowing what the “true” population of EAs actually looks like. However, we can evaluate the strength of that concern by seeing how different our subpopulations are from each other. If our subpopulations vary substantially, then oversampling and undersampling can dramatically affect our representativeness. If our subpopulations don’t vary by a large margin, then there is less risk from undersampling or oversampling individual populations that we did sample from, but there is still risk from missing populations that we did not sample.

 

Based on the above table, it seems our subpopulations do differ in demographics and affinity toward causes, but not in donation amounts or income. There is a definite risk that oversampling some groups and undersampling others could introduce bias in demographics and answers like top causes.

 

As a contrived example to demonstrate what this bias could look like, imagine that SSC truly has 500 EAs on the site, all of whom are male, and 400 of them take our survey, while the EA Facebook group has 1,000 EAs, all of whom are female, but only 100 of them take our survey. The “true” population (in our contrived example) would then be 33% male, whereas our sampled population would be 80% male.
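The arithmetic of the contrived example checks out as follows:

```python
# Contrived figures from the example above.
true_male, true_female = 500, 1000       # SSC: 500 EAs, all male; EA FB group: 1,000 EAs, all female
sampled_male, sampled_female = 400, 100  # 400 SSC EAs respond; 100 EA FB group EAs respond

true_share_male = true_male / (true_male + true_female)              # 500 / 1,500, i.e. 33%
sampled_share_male = sampled_male / (sampled_male + sampled_female)  # 400 / 500, i.e. 80%
```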

 

Unfortunately, without knowing the true distribution of EAs, there’s no real way we can know whether we oversampled, undersampled, or got things close to right. This means we should be careful when interpreting EA survey results.

 

Do we successfully capture our subpopulations?

The next question is how well we capture our subpopulations. Again, without an unbiased census of each subpopulation, this is difficult to tell. However, we can compare against another survey. We did some detailed analysis on this for the 2014 EA Survey. There haven’t been many other surveys of EAs lately, but there was a 5,500-person survey of SlateStarCodex readers launched just two months before ours.

 

The SSC Survey had many more SSC readers who were EAs than our EA Survey had takers who were SSC readers. However, our EA Survey matched the SSC Survey on many demographics, with the exception that the EA Survey had a more consequentialist audience that donated slightly more while earning slightly less. This indicates a good chance that we adequately captured at least the SSC-survey-taking EA population in our EA Survey.

 

Credits

 

Post written by Ellen McGeoch and Peter Hurford, with edits from Tee Barnett and analysis from Peter Hurford.

 

A special thanks to Ellen McGeoch, Peter Hurford, and Tom Ash for leading and coordinating the 2017 EA Survey. Additional acknowledgements include: Michael Sadowsky and Gina Stuessy for their contribution to the construction and distribution of the survey, Peter Hurford and Michael Sadowsky for conducting the data analysis, and our volunteers who assisted with beta testing and reporting: Heather Adams, Mario Beraha, Jackie Burhans, and Nick Yeretsian.

 

Thanks once again to Ellen McGeoch for her presentation of the 2017 EA Survey results at EA Global San Francisco.

 

We would also like to express our appreciation to the Centre for Effective Altruism, Scott Alexander of Slate Star Codex, 80,000 Hours, EA London, and Animal Charity Evaluators for their assistance in distributing the survey. Thanks also to everyone who took and shared the survey.

 
