This is the third post in the LEAN Impact Assessment series.
In the previous Quantitative and Qualitative reports we aimed to describe our findings with minimal commentary in order to allow the data to ‘speak for itself’ and allow readers to draw their own impressions without being influenced by our own interpretations and conclusions. In this report we aim to offer more interpretation and analysis of the implications of our findings for LEAN strategy and for EA local groups.
The LEAN Impact Assessment has aimed to provide insight both into the status and value of EA local groups as a whole, and the efficacy of efforts (by LEAN and others) to support these groups. These topics occupy the first two sections of this report, respectively, and in the final section we outline LEAN’s strategic plans formed in the light of these findings.
Our findings represent only a first step in researching EA Local Groups and we plan to conduct further, more specific, research into local groups and LEAN’s services in the future.
- EA groups report producing significant impact e.g. counterfactual pledges and donations influenced
- Most members report that groups are a large or very large factor in their engagement with EA
- Organisers express a strong demand for personal support and feedback, written guides and research on group impact
Group Size and Activity
The LEAN Impact Assessment offers the most comprehensive empirical investigation into EA local groups to date. Prior to the LEAN Impact Assessment there was little, if any, systematic data about EA local groups as a whole, though minimal related data was gathered through the EA Survey (also run by Rethink Charity). As such, the first stage was to provide basic information about the number and size of EA groups and their activities.
Our sample (disseminated with help from CEA and EAF) included 98 discrete local groups and a larger number of organisers and members. This sample likely does not include all EA groups, nor did respondents answer every question. However, it seems reasonable to assume that non-respondents were, on the whole, disproportionately less active groups, or groups with less impact to report for particular metrics.
Reported group size ranged from 1 to 1350, with a median of 10. This means that the reported group size numbers are heavily dominated by a small number of very large groups. Of a total 4280 reported group members, across all groups, more than half the reported members come from the largest 3 groups, almost 70% from the largest 5 groups and almost 78% from the largest 10 groups. By contrast, more than 50% of groups contained 10 members or fewer and slightly more than 76% reported 20 members or fewer.
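To make the degree of concentration concrete, the following sketch computes the share of all reported members accounted for by the largest k groups. The group sizes used here are hypothetical stand-ins, not the actual survey data:

```python
def top_k_share(sizes, k):
    """Fraction of all members accounted for by the k largest groups."""
    total = sum(sizes)
    return sum(sorted(sizes, reverse=True)[:k]) / total

# Hypothetical group sizes for illustration (not the survey data)
sizes = [1350, 600, 400, 150, 100, 40, 20, 10, 10, 5, 5, 3, 2, 1]
print(round(top_k_share(sizes, 3), 2))  # 0.87
```

Applied to the real survey responses, this computation yields the more-than-half (top 3), almost 70% (top 5) and almost 78% (top 10) shares reported above.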
As noted previously, the extent to which a small number of very large groups account for almost all EA local group members is likely somewhat overstated due to differences in how group “members” were counted by different organisers. For example, the number of group members reported by the largest group in our sample corresponds to the number of members in their Facebook group, but their reported number of people who have attended multiple of their events is much lower (1350 and 160, respectively). While it is possible that smaller groups may also be ‘over-reporting’ their numbers in this way, it does not seem possible that they could be inflating their numbers to the same extent (unless we suppose that the majority of groups have one member or fewer).
Accounting for this, the gap between the typical group and the very largest groups is less astronomical, though still substantial. Of course, the number of group members is not in itself important, but rather the impact that each group can have, which follows later in the report.
EA Survey Data
It is worth briefly comparing this data with the results of the EA Survey on local group membership, both to check whether our figures are plausible and to gain insight from a distinct source on the scale and scope of EA local groups. We would not expect these numbers to match up particularly closely, given the different questions asked and the very different sampling techniques: the EA Survey was distributed widely, answered by EAs and non-EAs alike, and had several thousand responses, whereas the Local Groups Survey was distributed directly to group organisers (as well as posted in relevant Facebook groups) and was specifically for local group organisers and members. Nevertheless it seems worthwhile to compare the results, to see whether EAs in a broader sample report being influenced by EA groups, rather than only in a sample specifically targeting group organisers and members.
The EA Survey found 469 EA respondents indicating that “Yes” they were a member of a local group, compared to 953 answering “No” and 430 non-responses for that question. This means slightly more than 25% of EA respondents reported being in an EA group (counting those who did not answer the question at all in the denominator, to give a more conservative estimate).
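The conservative figure quoted above can be reproduced with a short calculation (the three counts are the survey figures reported in this paragraph):

```python
yes, no, no_answer = 469, 953, 430  # responses to the local group question

# Conservative estimate: non-respondents are counted in the denominator
conservative = yes / (yes + no + no_answer)
# Estimate among those who actually answered the question
among_respondents = yes / (yes + no)

print(f"{conservative:.1%} vs {among_respondents:.1%}")  # 25.3% vs 33.0%
```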
We cannot assume this proportion applies across the whole EA population, though, since EAs who answer the survey seem plausibly more likely to be in an EA group (assuming they are generally more active and connected). On the other hand, the absolute number of EAs in local groups is likely to be higher than the absolute count reported here, since the EA Survey doubtless did not reach all EAs. Regrettably, no-one knows precisely how many EAs (by any definition) there actually are, or whether, and how fast, EA is growing. Nevertheless, almost 500 self-reported EA local group members from an EA Survey sample of ~1800 seems an appreciable number.
The EA Survey also provided data on whether individuals reported that they would attend an EA group if there were one near their home. In addition to the 469 EAs who said they do attend an EA local group, a further 271 reported that they would if there was one near their home and a further 310 who do not attend suggested that they were “unsure” whether they would attend or not. This is suggestive of fairly significant demand for EA local groups from individuals who presently cannot attend one due to lacking one nearby.
The EA Survey also offers data regarding the importance of EA local groups in getting EAs into, and more involved in, EA. 3.6% of the sample reported first hearing about EA from a local group, slightly exceeding Doing Good Better (3.4%) and Facebook (2.8%). Due to the wide variety of different answers indicated in the survey, few ‘routes’ received very high percentages (the highest were Personal Contact and LessWrong on 15.5% and 15.3% respectively, followed by ‘Other blog post’ and SlateStarCodex on 9.4% and 7.4%). (http://effective-altruism.com/ea/1h5/ea_survey_2017_series_how_do_people_get_into_ea/, Table 2)
A survey sample drawn specifically from the EA Facebook group (using a different methodology) returned higher numbers reporting that they first heard about EA from an EA local group. Here 7% reported hearing about EA first from a local group, beaten only by Friends (15%), Peter Singer/TLYCS (13%) and 80,000 Hours (13%). Including the 5% who reported first hearing about EA from a university group, the total hearing about EA from groups was 12%, much closer to the top options and comfortably beating the other options, including Facebook, SlateStarCodex, LessWrong and Doing Good Better. (ibid)
These findings are suggestive of EA local groups playing a significant role as the first place that many EAs encounter EA, albeit as one among a wide variety of different sources each making up a minority of the whole.
The EA Survey also offers data on how many EAs thought that EA groups were important in getting them into, or more involved with, EA. Here respondents could select multiple options as each being important. 261 respondents indicated that local groups were important for getting them more involved in EA, whereas the most commonly cited factors, GiveWell and ‘Book or blog’, were cited 532 and 519 times respectively. 261 EAs in the sample getting more involved in EA seems like a significant source of potential impact. It is substantially smaller than the number citing other factors, but this is presumably partly because almost every EA has encountered GiveWell or an EA book or blog, whereas likely fewer than 50% of EAs have encountered a local group.
Overall, we take these findings from the EA Survey to offer reassurance that EA groups are reaching and influencing a substantial number of EAs, even in a broader sample not specifically targeting group organisers.
Age of Groups
Readers may be concerned by the fact that so many individual groups appear to be so small (for example, 9/85 contain fewer than 4 members). Likewise, looking at various metrics of impact, it is reliably the case that a number of groups report little impact. If many EA groups are largely quiescent or inefficacious, then this may be a matter for concern.
To try to understand this better, we looked at how long groups had been running. A large number of EA groups reported being very new.
This may be reassuring context when considering why some groups have not had much of an impact: they may not have had much impact yet, as many groups have not even seen a complete academic year or two seasonal pledge drives. This may be a neglected consideration in EA movement building discussions: a large number of EA groups (see the left hand side of the chart above) are still in their infancy and may come into their own in the coming years.
If we look at the number of groups of certain sizes at different ages, we see that while a majority (68%) of groups founded less than a year ago have fewer than 10 members, this proportion declines substantially for older groups (38% for groups 1-2 years old, 33% for groups 2-3 years old, and 14% for groups 3-4 years old).
Similarly, the median group size increases with group age (we have excluded the 5-7 year old groups from this graph because, with only 2 groups in these categories, the median is fairly unindicative):
A consequence of this is that even though most groups are younger groups (58% less than 2 years old, >80% less than 3 years old), most members are in older groups.
Another possibility we considered is that EA group size and activity might be quite variable across time: a group might, for example, be moderately active with a number of members, fall away when a core organiser leaves or between academic years, and then gain members once a new academic year has started. If so, then while at any one time (such as the snapshot our survey provides) a number of groups may appear all but non-existent, the same groups may cycle through periods of activity and inactivity. Our qualitative data supports this: city-based group organisers reported high turnover among young professional members, who would stay in a city only a short period before moving on, and a number of organisers described groups falling into abeyance after previously being very active. It is also suggested by our data on the perceived ‘precarity’ of groups, with a significant minority (30/89) of organisers thinking their group was unlikely or very unlikely to continue to function after the current organisers left. We presently lack data on how common this pattern is, but this, along with how to handle transitions better and how groups could be kept active, is a potential topic for future research.
Calculating the impact of local groups is exceedingly difficult in a number of ways. EA groups aim to have an impact in a variety of different ways: attracting new members to EA, encouraging members to become more engaged with EA (through providing social connection, motivation and information) and (directly and indirectly) encouraging members to take more effective actions (such as donating to effective charities, taking the pledge, considering their career based on EA principles), and increasing member retention.
Some of these things are intrinsically difficult to measure and quantify. For example, many EAs report that being in a local group increases their motivation and engagement with EA. Some impacts which we think EA groups are likely to have are all but impossible to measure. It seems plausible that EA groups contribute to the retention of EAs, as some EAs report that they would likely have left the movement but for their involvement in a local group; but EAs who do leave the movement, and their reasons for leaving, are typically inaccessible to EA data-gathering, and people who never join the movement, but counterfactually would have had they encountered a local group, are a fortiori inaccessible. As such, much of the true impact of local groups may be excluded from our analysis.
Further, it is exceedingly difficult to discern the counterfactuality of each of these outputs. While we can ask members and organisers for their self-reported estimations of what they would have done were they not in a group, it will often be hard for them to genuinely know. Many report that they would not have taken certain actions were they not in a local group, but counterfactually, it is possible that they would have been moved by something else to, for example, take the pledge had it been impossible for them to join a group. Conversely, an individual in a group may feel confident that they would have continued to take effective actions if they weren’t involved in a local group, but it is possible that they may be mistaken about this. This is particularly difficult given the wide variety of influences on any given EA’s actions: their interactions with EAs face to face in a local group, their contact with other EAs online, their reading EA literature, their attendance at an EA conference, may all influence their EA actions and attitudes and which of these factors are important may vary substantially across different individuals and the influence of all of these factors may not be transparent even to the individual in question.
Likewise, EAs may reasonably disagree quite radically about how much value to attach to each of these things: for example, the value of a given person taking the Giving What We Can pledge, or of getting an additional person involved in the EA movement. We do not aim to settle such fundamental debates here, but rather report figures for a variety of metrics which will be considered of value to a wide range of EAs with different values.
Below we report figures for the collective ‘outputs’, for various measures, across all the groups in our sample:
Clearly these measures cannot capture all the impact of EA groups, since many potential effects of EA groups were not, or could not be, measured by the survey: e.g. the less tangible benefits of reducing EA attrition and value drift, or of connecting EA group organisers to valuable opportunities.
Even looking solely at the counterfactual impact of local EA groups as estimated by group organisers, these figures seem quite substantial. As, at present, most EA groups are run by unpaid volunteers with little or no funding, the direct costs associated with local groups are quite low. The costs associated with volunteer or paid staff time are likely to be larger, but here there will be higher variance. For many organisers the opportunity costs of their time spent running an EA group may be very low, and there may even be net personal gains from their involvement in terms of experience, career capital, boosts to motivation and connection with others in EA. (http://effective-altruism.com/ea/1fh/lessons_from_a_fulltime_community_builder_part_1/cea) Of course, for other EAs, with higher impact elsewhere, running a group would be a net loss. We expect that, for the most part, individual organisers may be best placed to make these judgements themselves. We do, however, think that further research into the activities and opportunity costs of EA group organisers would be valuable.
Having looked at the total outputs across all EA local groups it is, of course, important to try to look at this more at an individual group level. The mean impact per group continues to appear to be quite high, but these numbers are driven upwards by a small number of very successful groups.
Therefore we present the median figures across a variety of metrics:
While these figures are much lower than the figures reported by the most successful groups on each metric or the mean figures, they still seem plausibly to represent significant impact. As noted, it is impossible to attach a definitive value to these various outputs, but 3 counterfactual GWWC pledges, and 5 people becoming counterfactually actively committed to EA may represent significant impact.
Of course, the median figures across different metrics capture only one facet of the distribution of impact across individual groups. If one wants a picture of how much impact a ‘typical’ active EA group has, one might think the median figures under-represent this, given that they include a number of groups which are very new or quiescent (having one organiser, but no group). For example, if you want to know how much groups typically raise through group fundraisers, you might want to exclude from these figures the many groups which did not even attempt to run a fundraiser. By contrast, if you want to estimate the expected value of starting a group, the overall median might be the more appropriate metric; or you might want to identify a more specific reference class by looking at groups in comparable situations (e.g. in cities which you judge to be similar) and draw inferences from that. We are wary about digging into these different ways of representing the data too much, because it introduces too much freedom for judicious selection of flattering results, and we recommend that the reader look at the specific results we report in previous sections.
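The reference-class point can be illustrated with a short sketch. The fundraiser totals below are invented for illustration only; the point is how much the median shifts when groups that never attempted a fundraiser (recorded here as 0) are excluded:

```python
from statistics import median

# Hypothetical fundraiser totals; 0 means the group did not attempt one
raised = [0, 0, 0, 0, 0, 150, 300, 500, 1200, 4000]

# Relevant if estimating the expected value of starting a group
median_all = median(raised)  # 75.0
# Relevant if asking what a 'typical' attempted fundraiser raises
median_attempted = median(x for x in raised if x > 0)  # 500
```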
Is it all the largest groups?
One of the trends that appeared most clearly in the results we posted in the quantitative data report, was how far the total figures (e.g. for members or money raised) appeared to be dominated by a small number of ‘super groups.’ Much of our data exhibited a very strong power law distribution.
We think this is likely to create the impression that, since almost all the impact from local groups as a whole seems to come from a small number of ‘top’ groups, this is where attention and investment should be focused. This is worth considering further.
Firstly, it should be noted that strategically this doesn’t necessarily follow. If we can, at very low cost (for example, time valued at $500), cause a smaller group to bring about 3 counterfactual pledges (valued at, say, 3 x $10,000 each), this may represent a better investment than trying to help a large group producing 100 pledges produce yet more pledges: even though the large group’s mean impact may be great, producing extra marginal impact may be very costly. Even if we suppose that the number of members in a group is all that matters (for producing impact), it may be that it is most effective to focus investment in the smaller groups, in order to increase their size to that of the larger groups. This may hold true for other metrics as well. Suppose we are interested in recruiting top EA talent. It may be more likely that talented individuals will come from larger groups (since there are more members), but it may be that such individuals would likely be found anyway (due to already being surrounded by and connected with many EAs in an EA hub). It might therefore be more important to support smaller local groups in connecting EAs with promising opportunities, as such EAs are more likely to be missed. None of this is intended as an argument that smaller groups actually are more important to invest in, merely that it does not automatically follow that, if certain groups produce most impact, it is more important to attend to and invest in these groups relative to the more numerous smaller groups.
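The marginal cost-effectiveness comparison above can be made explicit. The small-group figures are the paragraph's illustrative numbers; the large-group marginal cost is a further hypothetical figure, not an estimate of any real group:

```python
def benefit_cost_ratio(pledges, value_per_pledge, cost):
    """Value produced per dollar of support cost."""
    return (pledges * value_per_pledge) / cost

# Small group: 3 counterfactual pledges at $10,000 each, for $500 of support
small_group = benefit_cost_ratio(pledges=3, value_per_pledge=10_000, cost=500)

# Large group: suppose 5 *additional* pledges cost $25,000 of support
# (hypothetical figure, for illustration only)
large_group_marginal = benefit_cost_ratio(pledges=5, value_per_pledge=10_000, cost=25_000)

print(small_group, large_group_marginal)  # 60.0 2.0
```

On these (invented) numbers, the small group returns 60x the cost of support at the margin while the large group returns 2x, despite the large group's far greater total output.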
However, there is also more to be said in terms of understanding the data and which groups are producing most impact. Given the striking distributions of group size and various measures of impact described above, it would be tempting to conclude that it is the largest groups (with the most members) that are producing most of the impact. Further, one might suspect that it is simply having more members that drives increased success on various metrics (more counterfactual pledges, more donations influenced and so on).
Further consideration of the data suggests that this is only somewhat true. As noted in the quantitative report, for most measures there was a positive correlation between the number of group members and various outputs, though this varied. For example, there was a strong correlation between group size and the number of members basing their career on EA principles, but little relationship between the size of a group and the number of new attendees at events who were unfamiliar with EA.
Moreover, as we noted in the quantitative report, the data did not suggest that larger groups were better at producing outputs from their members (e.g. getting members to make career choices based on EA) – in fact, there appeared to be a weak negative relationship between the size of a group and the proportion of members who were basing career choices on EA or taking the pledge (plausibly explained by the fact that larger groups may contain relatively more new members and smaller groups may contain, as a proportion of their total size, relatively more core organisers).
It is also important to note that these correlations and various positive outputs do not necessarily suggest that having more group members causes higher impact in terms of pledges taken, funds raised etc. It may be that a third factor (group activity, propitious environment, good organisation) reliably causes groups to be larger and to have more pledges, funds raised etc.
To gain more insight into these questions we looked at the highest performing groups across different metrics (e.g. number of members, amount raised in group fundraisers, number of members becoming actively committed to EA, number of pledges, number of career changes based on EA principles, and number of new attendees at events who were unfamiliar with EA). Just because each metric seemed to be dominated by a small number of groups performing exceedingly well on that metric, it didn’t necessarily follow that it was the same groups dominating across each metric: it might be that the groups accounting for almost all the total group members were different from the groups accounting for almost all the pledges, for example.
Our analysis suggested that there was quite a lot of overlap between the groups dominating across different metrics. We don’t name specific groups because we wish to avoid publicly ranking groups in terms of success. Of the 5 groups with the highest member count, 4 recurred multiple times (3-4) over the other 4 categories, with the largest group topping the categories for active commitments, pledges and career changes. Notably, the other group in the top 5 for size did not recur across any of the other categories.
However there was scope for divergence from this pattern of dominance by the ‘top’ groups. Across the ‘top 5’ for the other 5 categories, each time at least 2 of the top 5 didn’t appear in the top 5 for any other category (e.g. at least two of the groups reporting the most pledges did not report the most members, most funds raised, most EA career choices or most new attendees). Moreover, one local group appeared in the top 5 for all but one of the remaining categories (funds raised, active commitments, pledges and new event attendees) despite not being one of the groups with the most members (indeed, they only have 35 members, placing them outside the top 10 for size).
Two of the categories examined were also striking outliers. The top 5 for ‘group fundraisers’ did not include the group that topped most other categories, and 3 of the 5 places were taken by groups which were not among the largest groups. As this graph shows, relatively little of the variance here was explained by group size:
The number of new event attendees, for a given group, who were not familiar with EA beforehand was even more striking. Here none of the largest groups were among the top groups (though the largest group tied for 5th place) and only 1 of the 5 groups recurred across any other categories. This is quite noteworthy because this metric seems like the best measure we have (from this survey) of groups reaching out to non-EAs and succeeding in at least introducing them (via an event) to EA, yet it seemed to bear little connection to success on any other metric. One other striking feature of this category is that all of the top groups (except for the largest group, which tied for 5th place with only half as many new event attendees as the top group in this category) were from non-Anglo-American countries. While this is purely speculative, one explanation for this pattern might be that these groups are aggressively reaching out to people unfamiliar with EA in their areas and getting them to attend events, but largely not seeing success in converting this into increased group membership. This issue probably bears further research, as it seems plausible that EA groups outside of the traditional geographical areas may face distinct challenges and require more tailored support (such as translation of materials).
These findings suggest that though a small number of groups tend to be very successful across different categories, there is still clear scope for other groups, which are not particularly large, to produce great impact, competitive with the largest and highest performing groups.
It is also notable that many of the groups in the top 5 for various metrics were not from obviously propitious environments (e.g. elite universities or major cities). The fact that there is a fair amount of variance in success across different metrics, including success outside of the largest groups and outside of obvious EA ‘hubs’, suggests that there are influences on group impact beyond size and being in propitious locations. Further research seems needed to investigate how the most impactful groups attain their success and to see how far this can be propagated as best practice.
A final observation worth noting is that the much larger influence of the biggest group(s) may be a result of particular, contingent policies, rather than an inevitable feature of the way that EA groups’ impact will be distributed. For example, EA London reliably dominates most metrics by a substantial amount, but is unusual among groups in having had a full-time paid organiser, since 2016 (and now two dedicated staff). Looking at it from the present vantage point, it may appear inevitable that London would have grown to the size and influence that it has now. However, despite having been running since 2013, until quite recently, EA London was much smaller, averaging around 5 social event attendees every other month in 2014 compared to slightly more than 50 at the end of 2017. Of course, not all of this growth should be attributed to the presence of a funded organiser, and nor does it suggest that an organiser would have been equally successful in a different city, but it does somewhat count against the view that certain groups were simply inevitably going to be very large.
Further Evidence of Group Impact
Above we noted the median results for various metrics of group impact (e.g. pledges, donations etc.). However, as noted, groups also aim to have impact in a variety of ways which are harder to measure and quantify or which simply can’t be translated into a median value per group. We note some of these here:
We asked individual organisers and group members to indicate how much of a factor being involved with an EA group was for their involvement with EA. As noted in the quantitative report, a majority reported that being involved in an EA group had been a “large” or “very large” factor for their engagement with EA. Similarly, 89% of organisers and 78% of members reported that the way they thought about the world and/or their behaviour had changed since becoming a member of a local group, with large majorities of these reporting that they expected to have more social impact as a result of these changes.
Though hard to attach a precise value to, and reliant on self-reports, this is strongly suggestive that local groups are having a positive impact on EAs and increasing their engagement with EA. While this is not direct evidence of impact, it seems likely that increasing people’s engagement with EA may lead to impact by, on the whole, making individuals more motivated and promoting EA actions as norms.
Our qualitative data also strongly supported this, with many individuals explicitly reporting the importance of “personal interaction” with other EAs for their motivation and engagement. It seems plausible that, for at least a subset of EAs, face-to-face interaction with other EAs, rather than only online contact, may be important.
There was also a general indication that local groups are valuable to EAs, as a large majority (93%) of members rated their group’s activities as “valuable” or “very valuable,” and over half (55%) of all members who responded rated them “very valuable.” Interestingly, group organisers, while still having a majority (57%) rating their group’s activities as “valuable” or “very valuable,” and 89% rating them between “moderately” and “very” valuable, were noticeably less positive than group members. We speculate that this may be partly explained by organisers and members interpreting the question of how valuable their group’s activities are slightly differently, with members taking it to be about how valuable the activities are to them and organisers taking it to be asking for an evaluation of how valuable the group is as a whole. It is also possible that organisers felt the need to be more self-critical about their group’s activities and achievements, even though members near-uniformly rated the groups as valuable.
Groups also introduced a fairly large number of individuals who were unfamiliar with EA to EA via events (nearly 6000 in total, and a median of 30 per group, which is quite substantial given a median group size of only 10). This cannot be directly identified with impact, as we do not know what proportion, if any, of these individuals became more engaged with EA (an avenue for potential future research). However, there is suggestive evidence from our findings that slightly more than half (138/254) of our respondents did not consider themselves an EA prior to joining an EA group. This may be suggestive of non-EAs joining groups and coming to identify with EA, though it may also reflect individuals applying a very stringent definition of EA to themselves and only identifying as EAs once they have become actively involved.
Value of existing programme services and resources
Where identifying the impact of local groups is, as we note above, exceedingly messy, identifying the impact of efforts to support local groups is substantially messier.
Just as local groups aim to produce impact through a wide variety of means (introducing people to EA, making EAs more motivated and engaged, making them more informed and connected to opportunities, encouraging impactful actions, and increasing retention and reducing dropout), so efforts to support and promote the work of EA groups operate through a similarly broad set of channels, with a similarly broad set of means.
For example, LEAN has aimed to support EA groups in a host of ways:
- Providing technical infrastructure services (e.g. group e-mails, group websites, meetup.com etc)
- Providing informational resources (e.g. guides to running groups, the map of EA groups)
- Personal support and feedback to group leaders
- Supporting group communication (newsletter, group calls etc.)
Take even the most straightforward of these services: providing a group with e-mails and a website. These may have positive effects on a host of outcomes via a variety of mechanisms. They aim to make the group appear more professional, which may attract more members and/or make the group (or the wider EA movement) seem more appealing, and may allow the group to access opportunities (by, as one of our qualitative respondents put it, “showing people that we’re not some fringe thing that’s only locally run”). They may also make organisers and current members feel better about their group, increasing motivation. Providing them as a service may make running the group more convenient and less costly for organisers (who might otherwise feel they need to set up these solutions themselves), encouraging people to run groups and increasing retention. (For diverse other positive effects of support offered to local groups, see the qualitative report.)
However, for each of these putative mechanisms and metrics, there are innumerable other factors influencing how professional an EA group appears, making EA groups more or less convenient or costly to run, and influencing how many people are attracted to or engage with the group. And likewise for every other means of supporting groups: providing leaders with mentoring or guidance, making written guides available, providing video calls for leaders to discuss issues and so on. Counterfactuals here are, of course, particularly difficult to discern, given the mutual influence of so many different factors on the same outcomes.
As such, identifying the causal impact of any particular service on the performance of groups (whose own impact is itself very difficult to identify, as we noted in the previous section) is all but impossible. As a consequence, in this report, we rely on the reports of group leaders on how useful they found different services offered to support groups. These evaluations certainly have their limitations, reliant as they are on the self-reports of organisers, but we think they are among the best evidence available to us of whether services are actually helping support groups.
Alternative Analytic Strategies
It is worth briefly sketching out some alternative means of calculating the impact of efforts to support local groups and why they are either unworkable or severely limited:
- Randomised assignment of local groups to receive support or not (for example, one set of groups might receive websites or personal support, and another set would not): this is a gold standard of experimental design highly familiar to EAs. In this case, though, it may be impractical to aid some groups and not others. Given very small samples of highly heterogeneous groups, it may be very difficult to ensure that the treatment and control groups are comparable. Furthermore, some of the services provided are easily accessible public goods (e.g. online resources and guides). Moreover, withholding treatment from a ‘control’ group may have its own effects: e.g. severely dispiriting group organisers who are denied assistance, or encouraging organisers to adopt their own ersatz solutions (for example, if the treatment groups are, very visibly, being given new websites, this may move those assigned to the control group to set up their own websites).
- Comparing pre/post treatment metrics: for example, seeing how a group performs before and after receiving support. We already have some data along these lines: for example, groups reporting increased response rates after using an official-looking group e-mail address. We aim to expand this kind of small-scale experimentation, where practicable, next year. However, in many cases, where there are no discrete metrics (such as e-mail response rates), this approach will also be limited, as multiple other changes over the course of a year will confound the changes observed after the implementation of a particular service.
- Comparison of outcomes from observational data for different kinds of groups e.g. CEA groups vs LEAN groups: this particular suggestion would be essentially impossible, because many groups receive support from a number of different sources, meaning there are few, if any, “pure CEA” or “pure LEAN” groups. It may be possible to make informative comparisons of groups of other types.
- Comparing EA individuals who are or are not in EA local groups (based on the EA Survey) to see their impact, rates of attrition etc. This suggestion seems hamstrung by the fact that these differences would likely be severely confounded. Individuals who are (or are not) in an EA group are likely to systematically differ in other ways due to self-selection, among other things. For example, more motivated individuals may be more likely to attend an EA group. Alternatively people who are not in an EA group may be more likely to live in isolated areas away from an EA hub compared to those in a group, which may independently exert an effect. Efforts could be made in an analysis to ‘match’ comparable people and overcome this in other ways, however we are not sure this would be reliable or worthwhile. (The EA Survey data is freely available to anyone who wishes to make an attempt.)
Evaluations of Group Support and Services
During the interviews and the Local Group Survey, we received feedback about a range of resources and services, some of which had been provided by LEAN exclusively and some of which had been jointly offered by LEAN and other EA organisations or independent EA individuals. Disaggregating the impact of different organisations’ services is therefore difficult to do systematically, though it is sometimes possible to identify individual cases of impact from the qualitative interviews. Indeed, often group organisers themselves couldn’t identify which services were provided by which organisations. Nevertheless, we think these results offer a reliable indication of the value which organisers attach to different services and thus, at least to some extent, of the degree to which they are likely to help support EA groups’ work.
Technical support is the area where we can connect respondent experiences most directly to LEAN’s investment, because, during the time frame considered, most of these facilities were not made available by any other organisation. Some organisers supplied their own websites and Meetup.com accounts, but these were in the minority. Furthermore, organisers came to LEAN over time and arranged for us to take on hosting for existing websites, or payment for existing Meetup.com accounts.
Our quantitative report offers ratings of the usefulness and impact of the various tech support services. Combined with our qualitative data, these follow a recurring pattern: the services were invaluable to a small number of groups, minimally valuable or not valuable to others, and not used at all by many. That said, “technical support” as a whole (for instance, subscriptions for online services, free websites, and group email addresses) was rated as useful or very useful by over 70% of respondents.
Websites: organisers were asked whether their group website “makes a non-trivial difference in the effectiveness of your group’s outreach efforts” and websites were rated as “significantly useful” rather than “no more than trivially useful” by a slim majority of respondents. Response rates were low though (39/98), in line with the fact that relatively few groups have a website.
Meetup.com subscription: we asked organisers to estimate the counterfactual increase (%) in attendees they had as a result of using Meetup.com. There was an exceedingly wide range, with estimates ranging from 0% to 100% and a median of 15%, and a small number of groups reporting very large increases (50-100%) from using the service.
E-mails: we did not directly survey individuals about the usefulness of group e-mail addresses, but some indication of interest came from the fact that a majority (29/54) of respondents indicated that they would like one for their group. Qualitative data offered further confirmation, with a number of organisers noting that they preferred having a group e-mail because it looked more “professional,” and that they noticed higher response rates after using one.
We are not surprised that these services were extremely valuable to some groups and of little value to others, since we find that EA groups differ greatly in a wide variety of ways. As such, a tailored approach may be necessary to direct specific services to the specific groups which find them useful. It does not follow from the fact that many groups gain relatively little benefit from Meetup.com that it is not worth continuing to provide it, if a small number of groups gain substantial benefits from it. This is especially true in cases such as Meetup.com, where it is cheap and easy to provide accounts to groups who want one, and where the service allows three different groups to share one subscription.
Personal Support and Expertise
LEAN provides expertise and support to groups through the following services and activities:
- Personal Feedback
- Practical support and ideas
- Written guides
- Connecting and introducing
- Impact assessment and research support
The LEAN Impact Assessment quantitative report found that group organisers attached extremely high value to receiving personal feedback, practical support and ideas, and guidance in the form of written guides.
Personal feedback and support: 78% useful or very useful
Practical support and new ideas for group activities: 94% useful or very useful
Written guides: 83% useful or very useful
Our qualitative data filled in the details about exactly what organisers found valuable and why. A recurring theme was that some organisers felt insecure about running a group and about whether what they were doing was effective, felt isolated from the broader community, and found that personal contact offering feedback, reassurance and advice was beneficial for motivation.
Increased guidance and provision of EA materials (for example, stock content which could be posted to their Facebook group) were also cited as things which would make running a local group easier. Some organisers reported that they felt that they were having to invent things themselves, even though they knew that other groups must have worked on similar problems before.
Other related factors which came up in our qualitative data, but which were not captured in terms of our quantitative metrics, involved helping connect EAs and groups with relevant people (either directly, through our networks, or indirectly through the use of the Effective Altruism Map.)
A further recurring theme was a demand from group organisers for more research which would inform how they should be optimally running their groups. Many EAs wished to know whether they were acting effectively and having an impact, and wished to see centralised data from other groups to better discern what would be effective.
- Group Calls
- EA Group Newsletter
- Mentoring Programme
LEAN, in some cases in collaboration with CEA and EAF, supports a number of platforms for EA group organisers to discuss with each other and with organisations supporting movement building, and to share expertise and information. There was substantial variability in how useful each of these platforms was rated as being: some were new, little used, or not yet known to organisers, whereas others were widely valued.
Group Calls
Slightly more than half (54.3%) of respondents rated group calls as being useful or very useful.
Qualitative data suggested that organisers were broadly supportive of measures to improve interconnectivity between groups and group leaders, though there was little specific mention of group calls. Furthermore, some interviewees mentioned that the one-size-fits-all nature of group calls meant that they had not found the issues discussed particularly relevant to their own groups.
EA Group Newsletter
LEAN published eleven newsletters for group organisers between 2015 and May 2016. The group organisers newsletter was relaunched as an inter-organizational service during 2017, and four editions have been published (note that this is not the same as the general EA Newsletter, which is also provided in collaboration between CEA, LEAN and EAF).
Feedback on the concept is limited because few respondents received the newsletter: 32% indicated that they receive it, whereas 61.6% of respondents expressed an interest in being added to the mailing list. Of respondents to a question about the usefulness of the newsletter, 53% rated it as useful or very useful, while 40% rated it as neither useful nor useless.
EA Mentoring Programme
A pilot mentoring programme was initiated by CEA in late 2015, and in 2017 the concept was rebooted by LEAN in beta form. As with the rebooted EA Groups Newsletter, few participants in the sample would have direct experience of the Mentoring Programme so far. As of the time of writing, 9 mentoring pairs are included in the programme. Given this, it is not surprising that the majority (61%) of respondents rated the programme as ‘neither useful nor useless,’ though 30.7% deemed it either ‘useful’ or ‘very useful.’ We received two or three comments from survey participants explicitly stating that they had not heard of the Mentoring Programme or the Newsletter, and asking to be sent further information. We will be following up on the pairs involved in the mentoring programme to investigate how impactful it is.
This section summarises the strategic plans that LEAN has for 2018 based on these findings.
Tech support and group communication services will continue, via more streamlined techniques. Our tech resources will be focused on developing a high quality interface for written content and resources. In response to demand from organisers for these services, we will expand our investment in offering personal support and feedback to organisers, and will consolidate our online resources for group leaders. We think that these written guides and personalised support will be synergistic with LEAN’s expanded work researching EA groups and how they can best produce impact. More broadly, we hope to work more on discerning the specific needs of groups in different contexts and offering tailored support and advice.
Having found that our tech services are highly useful to a minority of groups, we have decided to continue, but streamline, our service provision. While it might seem that services which are useful only to a minority of groups are not worth providing, we have concluded that the high level of usefulness to a small number of groups justifies providing these services. A significant consideration is that, in many cases, if LEAN ceased providing these services, the financial and time costs would likely simply fall to individual organisers, making running groups more costly for individuals. We have, however, switched to employing simpler, more automated processes for providing services (for example, a static site generator for EA group websites, and an improved email host) which require less time and financial investment. At the same time, we have invested more in dedicated tech staff to ensure our systems are more reliable in the future.
Based on the evident popularity of existing written content, and the widespread wish to see existing content streamlined, we are investing in a new web interface for organisers. This will involve editing existing community generated content, and assimilating it into a central, visually appealing interface. The new interface will be based on the EA Hub, which will itself be modernised and restructured. Where possible, we will facilitate editorial additions from the community, in order to make the tool a logical home for sharing resources.
We will continue to facilitate group calls alongside other groups, as there is still demand for these, though we will work on promoting more targeted calls (and, in particular, offering groups more individual calls where requested). We will continue to run the group newsletter (which now has a larger number of organisers added) to keep organisers up to date on relevant developments, and to help facilitate the mentoring programme for interested organisers.
Personal support and feedback
In response to very high ratings of the usefulness of these kinds of services in our survey, we have decided to dedicate more attention to providing personal support and feedback to organisers where requested. Much of this will take the form of one-to-one calls with individual organisers to talk through challenges their group is facing and connect them with relevant individuals or resources. We hope this kind of tailored support will help guide organisers, and also increase their motivation and confidence in their actions. It will also allow a closer determination of i) how much counterfactual impact groups are having, and ii) how far particular services are supporting groups.
Written guides and resources
Similarly, in response to a strong indication of organiser interest in this, we will dedicate time to consolidating and adding to existing written resources aimed at helping organisers run groups. This will contribute to the new content tool for organisers on the EA Hub, mentioned above. We aim to update these resources on an ongoing basis based on our ongoing research into how EA groups are functioning and how they might function better.
Many organisers expressed an interest in using metrics to determine their impact and in having access to evidence-based research on how best to run EA groups. We have also noted, throughout this report, various questions about EA movement building where further empirical research is needed. Some of this research may be conducted in concert with the EA Survey (also run by Rethink Charity). We aim to help individual groups to measure their impact, and to produce general research regarding EA groups for the wider community.
Recognising that funds are important for groups’ success, and having heard from a number of group organisers that lack of funds was a bottleneck, we are planning to make small grants to groups on a case by case basis to help facilitate group growth and are exploring ways to collaborate with CEA to direct funds to groups. Where appropriate we will align this with our ongoing research into EA groups, to try to discern how far targeted grants boost EA groups’ impact.
This report was analysed and authored by Richenda Herzig and David Moss.
Special thanks to Tee Barnett for editorial feedback, and for coordinating those involved in the Impact Assessment, both internal and external. Further thanks to Peter Hurford for input as an internal advisor.
We are very grateful to Greg Lewis for his ongoing input as our external advisor.