Survey Methods

Posted on July 11th, 2013 by Jean Cook in What We're Learning.

Previous: Developing Language for Survey Questions

Internet Survey Methods

The survey was open to the public from September 6, 2011, through October 28, 2011. We used the Internet survey service SurveyMonkey to conduct the survey. The SurveyMonkey software afforded us flexibility; for instance, it allowed us to insert pop-up definitions of terms that some respondents might find overly technical. SurveyMonkey was also less expensive than many alternative survey-software platforms. Finally, FMC had experience using SurveyMonkey for a survey about musicians and health insurance.

We designed three versions of the survey: short, medium, and long. All three versions started with the same 18 questions, which we will refer to as the “core questions.” The core questions covered some demographic information to establish eligibility for the survey: having U.S. citizenship and being at least 18 years old. They also covered basic labor-market outcomes, membership in musical organizations, and revenue sources. We estimated, based on our beta testing, that the core questions would take approximately ten minutes to answer. Question 18 then asked respondents to choose their survey version, or path. [Note 7]

We designed the short, medium, and long versions to take an average of 10, 20, or 30 additional minutes to complete, respectively. The long version of the survey asked detailed questions about every role that the respondent reported was relevant to their experience as a working musician: composer (of music, lyrics, or both), recording artist, live performer, session musician, or teacher. We will call these questions the “role questions.” The medium version of the survey shortened the respondent’s completion time by asking role questions only about the role from which the respondent reported earning the most revenue. Respondents choosing the short version answered only two questions about the roles they play, without any detailed follow-up questions.

All three versions of the survey closed with the same 18 questions, which covered a range of topics and asked about additional demographic information. We will refer to these as the “closing questions.”

We conducted four rounds of beta testing, in which people outside the research team took draft versions of the survey. We sought feedback on ease of understanding, proper use of music-industry vocabulary, and the organization of the survey questions. The tests were conducted in June, July, and August of 2011. In total, several dozen individuals served as beta testers, some taking multiple versions of the survey or testing at different times for comparison. We recorded a total of 110 practice run-throughs of the survey.


[Note 7]: Below, in Part II.E, we discuss whether the endogenous survey-path choice introduced biases through differences in attrition rates.

Next: Soliciting Participation
