The Planner’s Use of Information, Ch. 2 

“Survey Methods for Planners,” by Nancy Nishikawa

The survey is a way of creating an area-specific, customized database. 

  • A “sample survey” allows planners to generalize findings from a relatively small number of respondents to a larger population. 
  • Sources of error
    • “Intrusive measures”
    • Operationalization of complex social issues
  • Procedure
    • Define central concepts of study
    • Choose indicators to assess these phenomena
    • Construct questions whose responses will better describe, evaluate, or explain each element
  • Cross-validate survey results against other measures and findings

Survey Research Objectives

Begin by asking questions, rather than identifying specific data to be obtained.

  • Reason for conducting survey
  • Kind of information needed
    • When to consider alternatives
      • Inappropriate for obtaining accurate information about sequence of historical events
      • Not good for tapping the flow of activities at individual or group level
      • Ineffective for assessing involvement in illicit activities
    • Factors difficult to control
      • Level of interest (can affect response rate)
      • Contamination effects (experience with prior surveys)
      • Relationship with respondents (some groups more responsive because of authority issues)
      • Sense of security and privacy
  • Importance of types of information
    • Profile data (characteristics of survey population)
    • Environmental data (circumstances in which respondents live)
    • Behavioral data (relevant social behavior)
    • Psychological data (opinions, preferences, attitudes, awareness, motives, expectations)
  • Level of accuracy needed
  • Users of information
  • Consider the alternatives
    • Observation
      • Observations must be fully recorded (don’t rely on memory)
      • Distinguish between observation of actual occurrences and interpretation of those occurrences
    • Key-informant interview
      • Qualifications of key informants
      • Potential sources of bias from each key informant
      • Particularly useful where there is a communication gap
    • Group interviews
      • Focus groups
      • Brainstorming
      • Nominal group technique
      • Delphi process

Administrative Costs of Surveys

  • Interview costs are usually largest item in budget
    • One-third of interviewer’s time is direct interviewing
    • Typical cost $50-120 per completed interview
  • Telephone surveys are typically less expensive ($15-40 per completed interview)
  • Mail surveys are least expensive
  • Time budgets also often underestimated
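Using the per-completed-interview cost ranges quoted above, a rough budget calculation can be sketched; the target of 400 completed interviews is a hypothetical figure:

```python
# Rough interviewing-cost calculator using the per-completed-interview
# ranges quoted above; the 400-interview target is hypothetical.
COST_PER_COMPLETE = {            # (low, high) dollars per completed interview
    "face-to-face": (50, 120),
    "telephone": (15, 40),
}

def interview_budget(mode: str, n_completes: int) -> tuple[int, int]:
    """Return the (low, high) total interviewing cost for n completed interviews."""
    low, high = COST_PER_COMPLETE[mode]
    return low * n_completes, high * n_completes

for mode in COST_PER_COMPLETE:
    low, high = interview_budget(mode, 400)
    print(f"{mode}: ${low:,}-${high:,}")
```

Even this crude range (face-to-face runs several times the telephone total) shows why interview costs dominate the budget.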

Design of Surveys

  • Cross-Sectional Surveys
    • Unweighted
    • Weighted (oversample certain subgroups)
    • Contrasting samples (sampling from groups already known to show substantial differences)
  • Longitudinal Surveys
    • Before-and-after Study (baseline data can be difficult to obtain)
    • Trend Analysis (periodic data collection)
    • Panel Analysis (trend analysis from the same group of respondents)
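A weighted cross-sectional design oversamples a subgroup, then weights it back down to its population share at analysis time. A minimal sketch, with all numbers hypothetical:

```python
# Sketch: recovering a population estimate from a weighted (oversampled) design.
# Suppose a subgroup is 10% of the population but was deliberately oversampled
# to 50% of the sample. Each respondent gets weight
# (population share) / (sample share), and estimates use weighted means.
population_share = {"subgroup": 0.10, "rest": 0.90}
sample_share     = {"subgroup": 0.50, "rest": 0.50}
group_mean       = {"subgroup": 4.0,  "rest": 2.0}   # hypothetical responses

weights = {g: population_share[g] / sample_share[g] for g in population_share}

# The unweighted mean treats the oversampled subgroup as half the population:
unweighted = sum(sample_share[g] * group_mean[g] for g in group_mean)
# The weighted mean restores each group to its true population share:
weighted = sum(sample_share[g] * weights[g] * group_mean[g] for g in group_mean)
print(unweighted, weighted)
```

Here the unweighted estimate (3.0) overstates the population mean; weighting recovers the correct value (2.2).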

Sampling the Population to Be Surveyed

  • Census (“sampling” everyone in a target population)
  • Probability Sampling
    • Findings of sample survey accurately relate only to population from which sample was drawn
    • Determining target population:
      • Spatial area
      • Respondent selection
    • Bias—systematic discrepancy between distribution of characteristics in sample and population
  • Sampling Frame
    • Sources:  lists, registers, maps
    • Criteria:
      • Cover entire population
      • Complete (exclude no one)
      • Avoid duplication (include no one twice)
      • Accurate (up to date)
      • Available (must have access to information)
    • Types of error from sampling
      • Sampling error (sample differs from population by chance, because only part of the population is measured)
      • Coverage error (some areas of population missing from sample)
      • Measurement error (response does not accurately reflect respondent)
      • Nonresponse error (systematic difference between respondents and nonrespondents)
    • Types of samples
      • Systematic sample (randomly select first unit, then choose every nth member)
      • Simple random sample (each individual selected by random number)
      • Stratified sample (population divided by stratum, individuals randomly sampled from strata)
      • Cluster sample (population divided into clusters, random sample of clusters completely sampled)
      • Nonprobability samples
        • Availability sample (survey people passing a specific location)
        • Convenience sample (survey people already assembled in a group)
        • Purposive/judgmental sample (survey based on familiarity with population)
        • Quota sampling (availability sample based on matrix of population characteristics)
        • Volunteer sampling (self-selected group of respondents)
      • Nonprobability Case Sampling
        • Matching/contrasting cases
        • Typical cases
        • Critical cases
        • Snowball technique
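The probability designs above can be sketched in a few lines of Python; the frame of 100 units and the stratum/cluster structures are hypothetical stand-ins for a real sampling frame:

```python
import random

def systematic_sample(frame, n):
    """Randomly select a starting unit, then take every k-th member of the frame."""
    k = len(frame) // n
    start = random.randrange(k)
    return frame[start::k][:n]

def simple_random_sample(frame, n):
    """Each individual selected with equal probability, without replacement."""
    return random.sample(frame, n)

def stratified_sample(strata, n_per_stratum):
    """Population divided into strata; individuals randomly sampled within each."""
    return [u for stratum in strata for u in random.sample(stratum, n_per_stratum)]

def cluster_sample(clusters, n_clusters):
    """Randomly choose clusters, then completely sample each chosen cluster."""
    return [u for cluster in random.sample(clusters, n_clusters) for u in cluster]

# Hypothetical sampling frame of 100 units:
frame = list(range(100))
print(len(systematic_sample(frame, 10)))  # 10 units, evenly spaced through the frame
```

Note how each function encodes the defining trade-off: systematic and simple random sampling draw directly from the whole frame, stratified sampling guarantees coverage of each stratum, and cluster sampling trades some precision for cheaper fieldwork.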
  • Sampling Issues
    • Population with greater similarity can be represented with smaller sample
      • Stratified sampling calls for the fewest cases
    • The more categories needed for each analysis (“cross-tabulations”), the larger the sample needed
      • Rule of thumb:  no fewer than 20 cases per cell
    • Confidence levels:  measure of confidence that estimate from sample correctly describes population
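Two of the rules above can be made concrete. The sample-size formula below is the standard one for estimating a proportion under simple random sampling (implied by the discussion of confidence levels, not stated in the chapter); the second function applies the 20-cases-per-cell rule of thumb:

```python
import math

def required_sample_size(margin_of_error: float, z: float = 1.96, p: float = 0.5) -> int:
    """Simple-random-sample size needed to estimate a proportion.
    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the most
    conservative (largest-n) assumption about the true proportion."""
    return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

def min_cases_for_crosstab(rows: int, cols: int, per_cell: int = 20) -> int:
    """Rule-of-thumb minimum sample for a cross-tabulation: 20 cases per cell."""
    return rows * cols * per_cell

print(required_sample_size(0.05))    # 385 respondents for +/-5 points at 95% confidence
print(min_cases_for_crosstab(4, 5))  # a 4 x 5 cross-tab needs at least 400 cases
```

The first calculation also illustrates the point about homogeneity: a smaller p(1-p) term (a more similar population) shrinks the required sample.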

Methods for Gathering Survey Data

  • Face-to-face Personal Interview
    • Increases chance that individual will participate
    • Increases likelihood of complete and accurate answers
    • Decreases likelihood of skipped questions
    • Permits “probing” (clarifying) of responses
    • Permits control over sequencing of questions
    • Increases accessibility to isolated populations
    • Permits use of visual aids
    • But has high costs, requires skilled interviewers, and can introduce interviewer bias
  • Telephone Interview
    • Greater economy
    • Permits screening respondents for desired characteristics
    • Can avoid some physical risks
    • CATI permits
      • Programmed skip patterns & preset follow-ups
      • Random presentation of order of responses
      • Compressing data collection and data entry tasks
    • Random-digit dialing overcomes some sampling bias issues
    • But rapport is harder to establish, hang-ups are a problem, and questions must be shorter and less demanding.
  • Mail-in Questionnaire
    • Cost-effective
    • Dillman’s (2000) “Tailored Design Method”
      • Warm-up (pre-notice letter)
      • Sales pitch (detailed, persuasive cover letter)
      • Thank-you postcards (a few days to a week later—also serves as a reminder)
      • Replacement questionnaires to non-respondents (two weeks later)
      • Phone/special delivery final contact to nonrespondents (week later)
  • Web-based Questionnaires
    • Even more cost-effective
    • Permits greater control over survey experience
    • Can include visual aids
    • But
      • respondents will have different computer hardware, screen configurations, browsers, and transmission speeds
      • problem of representativeness of respondents
        • coverage error is an issue, although online user population is looking more and more like general population
        • indiscriminate access (persons outside target population accessing questionnaire) is also a problem
      • competence to design online questionnaires may not be readily available

Designing the Questionnaire

  • Constructing questionnaire items
    • Open-ended questions (but creates problem of post-coding the responses)
    • Closed questions (multiple answers create coding problems)
      • Useful for making distinctions of degree
        • Rating scale (like semantic differential—no order implied)
        • Likert scale (ordinal scale)
        • Numerical scale (ordered scale with equal intervals)
    • General guidelines
      • Questions should be straightforward and unambiguous—avoid opportunities for misinterpretation
      • Avoid double-barreled questions
      • Avoid expertise error—questions must be within the expertise of intended respondents
      • Avoid slang or jargon (such terms are subject to misinterpretation, or convey no meaning)
  • Ordering questionnaire items
    • Funnel sequence vs. inverted funnel
    • Arrange questions so relationship to overall purpose of study makes sense to respondents
      • Pay particular attention to skip patterns
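A skip pattern (the same logic CATI software programs automatically) can be sketched as a lookup from an answer to the next question; the question IDs and wording here are hypothetical:

```python
# Minimal skip-pattern sketch: the answer to a screening question
# determines which question is asked next. IDs and wording are hypothetical.
QUESTIONS = {
    "Q1": {"text": "Do you rent or own your home?",
           "next": {"rent": "Q2", "own": "Q5"}},
    "Q2": {"text": "How much is your monthly rent?",
           "next": {}},
    "Q5": {"text": "Do you have a mortgage?",
           "next": {}},
}

def next_question(current_id: str, answer: str):
    """Return the next question ID implied by the answer, or None to stop."""
    return QUESTIONS[current_id]["next"].get(answer)

print(next_question("Q1", "rent"))  # renters are routed to Q2, owners to Q5
```

On paper the same routing must be spelled out in instructions ("If you own, skip to question 5"), which is why skip patterns deserve particular attention when ordering items.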

Evaluating the Survey Instrument

  • Pilot study—preliminary study, to see what is there
  • Pretest—determine which alternative procedure to use (debrief respondents)
    • How do respondents react?
    • Able to follow instructions?
    • Completed in expected time?
    • Choices in closed questions adequate?
  • Trial run—evaluate operational plan as a whole prior to final run (including training personnel)



1.  Consider the MRAP Proposal to the EPA, and presume your task is #3 (Interrelationships Between the Basin’s Diverse “Cultures” and Basin Policy Decisions).  Design a survey to provide data for this stage of the process.

2.  Suppose you were evaluating an in-house staff development program, “Women in Leadership,” like the Midwest Women’s Leadership Institute.  Devise a survey to assess the impact of the program on the participants, the organization, and the community, with the data to be collected 5 years after the participants completed the program.

3.  Suppose you were asked to assess the city’s new neighborhood housing program to address the issue of student housing.  You have access to the consultant’s report that led to the program.  Design a community survey to measure residents’ satisfaction with the new program.

Additional Readings (other than those at end of chapter)

Bearden, William O. & Richard G. Netemeyer.  1999.  Handbook of Marketing Scales.  Newbury Park, CA:  Sage Publications.

Davis, James A.  1971.  Elementary Survey Analysis.  Englewood Cliffs, NJ:  Prentice-Hall.

Hyman, Herbert.  1955.  Survey Design and Analysis.  NY:  The Free Press.

Miller, Delbert C.  & Neil Salkind.  2002.  Handbook of Research Design and Social Measurement, 6th Ed.  Newbury Park, CA:  Sage Publications.

Nardi, Peter M.  2006.  Doing Survey Research:  A Guide to Quantitative Methods.  Boston, MA:  Pearson Education Inc.

Rosenfeld, Paul, and others.  1993.  Improving Organizational Surveys.  Newbury Park, CA:  Sage Publications.




© 2006 A.J.Filipovitch
Revised 3 October 2009