Survey Methodology

  • Fielding the Overhead Cost Survey

In the fall of 2001, we drew a sample of nonprofit organizations that had recently filed Form 990 with the IRS. We drew upon the most comprehensive sampling frame available at the time, the year 2000 Core File developed by the National Center for Charitable Statistics. The Core Files combine descriptive information from the IRS Business Master File and financial variables from the Return Transaction Files. In an attempt to represent the full universe of nonprofit organizations that are required to file Form 990 in a given year, the Core Files insert previous-year records for organizations that fail to file or have not yet filed at the time the data set is finalized. Consequently, the Core Files err on the side of including defunct organizations, small organizations, and organizations that may no longer be operating as nonprofits.

Since the Overhead Cost Project has specific research aims, we chose to exclude several kinds of organizations from the sample. Our first criterion for exclusion concerns the size of the organization. Because we wanted to focus on organizations with meaningful fundraising or administrative expenses, we excluded organizations that reported less than $100,000 in gross receipts, and we removed any remaining organizations that filed Form 990-EZ. Our second criterion for exclusion concerns the types of organizations in the nonprofit universe. Because we wanted to focus mainly on operating charities that match a common conception of a public-serving nonprofit, we excluded several categories of charities made up mostly of organizations that do not fit this conception: organizations coded primarily as mutual or membership benefit organizations, pension and retirement funds, and real estate organizations. We also scanned named foundations and trusts and removed those that appeared to operate primarily as single-person charitable giving accounts.

We stratified organizations along two dimensions, and then sampled proportionately within the categories created by the intersection of these dimensions. First, we divided organizations according to whether they reported fundraising expenses, reported zero fundraising expenses, or did not indicate whether they had fundraising expenses. Second, we divided organizations into four categories of annual revenue.
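The stratification scheme above can be sketched in code. This is an illustrative sketch only: the field names, category labels, toy frame, and sampling fraction below are hypothetical stand-ins, not the actual Core File variables or study parameters.

```python
import random
from collections import defaultdict

def stratified_sample(orgs, fraction, seed=0):
    """Sample the same fraction from every stratum, where a stratum is the
    intersection of fundraising-expense status and revenue category."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for org in orgs:
        strata[(org["fundraising"], org["revenue_cat"])].append(org)
    sample = []
    for members in strata.values():
        k = round(len(members) * fraction)  # proportionate allocation
        sample.extend(rng.sample(members, k))
    return sample

# Toy frame: three fundraising-expense statuses crossed with four
# revenue categories, as described in the text (labels are hypothetical).
statuses = ["reports_fundraising", "zero_fundraising", "not_indicated"]
revenue_cats = ["cat1", "cat2", "cat3", "cat4"]
frame = [{"id": i, "fundraising": statuses[i % 3], "revenue_cat": revenue_cats[i % 4]}
         for i in range(1200)]

sample = stratified_sample(frame, fraction=0.10)
print(len(sample))  # 120: ten from each of the 12 strata
```

Because the same fraction is drawn from every cell, the sample reproduces the frame's joint distribution of fundraising status and revenue category.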

    Because we did not know how many organizations would be required to generate a viable sample, we drew an initial batch of 5000 organizations and delivered the list to the Center for Survey Research (CSR) at Indiana University (Bloomington). The CSR sleuthed for phone numbers that matched the name or address provided by the nonprofit on its Form 990. Organizations for which we could find no phone number were removed from the study. For organizations with a matching phone number, we conducted pre-mailing calls. We had three reasons for conducting these calls. First, we wanted to verify that the organization still existed. Second, we wanted to verify the mailing address and acquire the name of a specific individual who would be the appropriate recipient of the survey. Third, if possible, we wanted to discuss the project briefly with this person and alert him or her to the imminent arrival of the survey. In an effort to obtain a sample of approximately 3000 organizations, the CSR attempted pre-mailing calls with 3782 organizations. They were able to complete a call by reaching a live person and obtaining the required information at 3114 of these.

Pre-mailing calls, subsequent mailouts, and all follow-up procedures were conducted identically in seven waves. The mailout package, consisting of a cover letter, survey, and stamped return envelope, was delivered via FedEx to the designated individual in each organization. The cover letter promised a $50 donation to organizations that fully completed the survey. Some packages were returned by FedEx because of bad addresses, and 45 of these organizations resisted our efforts at re-contact to obtain a better address. Consequently, the surveys presumably reached the desks of specific individuals in 3069 organizations.

After two weeks, CSR mailed a postcard to remind these individuals about the importance of returning the survey. After an additional two weeks, they mailed a second full package to nonrespondents. In this second mailing, CSR included a username and password that allowed organizations to access a web-based version of the survey. After several more weeks, CSR called nonrespondents to personally invite the return of the survey. After several more weeks, nonrespondents who had said that they intended to return the survey received a second reminder call.

The field period lasted approximately four months. At the end of the field period, we had received 1540 completed surveys from the 3069 organizations that received packages, a response rate of 50.2 percent.

    This methodology, along with an analysis of nonresponse and length of time to response, was presented at the 2002 ARNOVA meeting in Montreal. Both the paper and the slides are available.

  • Survey Questionnaire

A copy of the overhead cost survey instrument is available, as is a copy of the survey with variable names.

  • Pre-Test

Before we fielded the overhead cost survey, we conducted a pre-test so that we could better understand the conditions that would maximize the response rate. In the summer of 2001, we drew a random sample of nonprofit organizations that had recently filed IRS Form 990. Since we were interested in studying organizations that are most likely to have and account for fundraising and administrative costs, we limited the study to organizations with more than $100,000 in annual revenues. The sample was stratified by size and industry to increase the chances that the organizations in the study would be proportionally representative of the nonprofit universe by size and industry of activity.

Our sample numbered 141 organizations. We obtained phone numbers for these organizations by consulting phone directories or searching the Internet for organizational web pages. We called the organizations to verify the mailing address, to obtain the name of an appropriate executive contact, and, if possible, to briefly describe the project to the executive contact. For some of the organizations, we were unable to reach any representative by phone. Several others had nonworking phone numbers, suggesting that the organization was inactive or defunct. Two organizations specifically asked not to be included in the study. At the end of the pre-call stage, we had 120 active organizations with executive contact names, most of whom had been alerted to the imminent arrival of our survey.

We randomly assigned the 120 cases to one of 12 treatment conditions, a fully crossed 2x2x3 design: delivery by FedEx vs. plain envelope, short survey vs. long survey, and no incentive vs. $5 in the mailout envelope vs. $5 in the mailout plus the promise of an additional $50 upon delivery of a completed questionnaire.
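The balanced random assignment described above can be sketched as follows. The condition labels and random seed are illustrative, not the study's actual implementation:

```python
import itertools
import random
from collections import Counter

# The three factors of the 2x2x3 design (labels are hypothetical).
delivery = ["fedex", "plain_envelope"]
length = ["short", "long"]
incentive = ["none", "five_dollars", "five_dollars_plus_fifty"]

# Full cross of the factors yields the 12 treatment conditions.
conditions = list(itertools.product(delivery, length, incentive))

case_ids = list(range(120))   # the 120 pre-tested organizations
rng = random.Random(2001)     # illustrative seed
rng.shuffle(case_ids)

# Deal the shuffled cases round-robin into conditions: exactly 10 per cell.
assignment = {case: conditions[i % 12] for i, case in enumerate(case_ids)}

cell_sizes = Counter(assignment.values())
print(len(conditions), sorted(cell_sizes.values()))
# 12 cells, each with 10 cases
```

Shuffling first and then dealing round-robin guarantees equal cell sizes while keeping the assignment of any particular organization random.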

Our survey packets included a cover letter that described the larger project, one of the two versions of the survey, and a self-addressed stamped return envelope. As a good-faith effort to allow anonymity to organizations that wished to reply anonymously, we did not number the surveys or envelopes. Two out of three packets contained a $5 bill and a brief note reiterating our appreciation for returned surveys. Two weeks after sending out the surveys, we mailed a reminder postcard to all nonrespondents. After an additional two weeks, we mailed a new cover letter and replacement survey to nonrespondents. One week later, we called nonrespondents with an offer to fax an additional copy of the survey. Two weeks later, we called the nonrespondents a final time to request the return of the survey. By the end of the study period, we had received 60 completed surveys, a 50 percent return rate.

Half of the study sample (60 organizations) received a seven-page instrument with questions that would not necessarily require them to consult internal records or reports. The other half of the sample received the same seven pages plus an additional page that substantially complicated the survey instrument. Of the 60 relatively short surveys, 32 were returned (53.3%). Of the 60 longer surveys, 28 were returned (46.7%). This difference is not statistically significant at the .05 level (chi-square = 0.533; critical value = 3.841), so we cannot conclude that the shorter and less complicated instrument leads to greater returns.

Half of the survey packets were mailed via Federal Express. The other half were mailed via regular mail in 9x12" white envelopes. Of the 60 packets sent via Federal Express, 37 were returned (61.7%). Of the 60 packets sent via regular mail, 23 surveys were returned (38.3%). This difference is statistically significant at the .05 level (chi-square = 6.533) and approaches the critical value for p<.01 (6.635). Consequently, we conclude that use of Federal Express for the initial survey mailouts has a positive impact on returns.

We divided the survey sample into three groups of 40 organizations so that we could test our two questions regarding financial incentives. Of the 40 that received no financial incentives, 19 surveys were returned (47.5%). Of the 40 that received a $5 bill with their survey, 18 surveys were returned (45.0%). This difference is not significant at the .05 level (chi-square = 0.050), and is not substantial enough to claim a difference between treatment conditions.

The remaining 40 organizations in the study sample received both a $5 bill with their survey and a promise of a $50 donation to their organization upon delivery of a completed survey. This condition resulted in 23 returned surveys (57.5%). Compared to the group that received the $5 and no promise of an additional donation, the difference in returns is not statistically significant at the .05 level (chi-square = 1.251).
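The chi-square statistics reported in the three comparisons above can be reproduced with Pearson's chi-square for a 2x2 table of returned vs. not-returned counts, computed without continuity correction (an assumption consistent with the reported values). This sketch recomputes each statistic from the counts in the text:

```python
# Pearson chi-square statistic (no continuity correction) for a 2x2 table
# comparing returned vs. not-returned counts in two treatment groups.
def chi_square_2x2(returned_a, n_a, returned_b, n_b):
    observed = [[returned_a, n_a - returned_a],
                [returned_b, n_b - returned_b]]
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    n = sum(row_totals)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed[i][j] - expected) ** 2 / expected
    return chi2

# Recompute the four statistics reported in the text.
print(f"{chi_square_2x2(32, 60, 28, 60):.3f}")  # short vs. long survey: 0.533
print(f"{chi_square_2x2(37, 60, 23, 60):.3f}")  # FedEx vs. regular mail: 6.533
print(f"{chi_square_2x2(19, 40, 18, 40):.3f}")  # no incentive vs. $5: 0.050
print(f"{chi_square_2x2(23, 40, 18, 40):.3f}")  # $5 plus $50 promise vs. $5: 1.251
```

Each printed value matches the corresponding statistic reported above, confirming that the tests were uncorrected Pearson chi-squares on one degree of freedom.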

    Consequently, we conclude that delivery of surveys by FedEx is helpful in studies of nonprofit organizations. Moderately shorter survey length and financial incentives did not substantially influence response. Here is the citation for the full write-up of the pre-test experiment and its results:

    Mark A. Hager, Sarah Wilson, Thomas H. Pollak & Patrick Michael Rooney. 2003. "Response Rates for Mail Surveys of Nonprofit Organizations: A Review and Empirical Test." Nonprofit and Voluntary Sector Quarterly 32(2): 252-267.