
Adventures in Research: Issue 6

A friend recently introduced me to the term “Maycember,” and man, is that resonating right now. Graduations, dances, soccer tournaments, end-of-year gifts—if I have to Venmo money for one more pizza party, I might lose my mind.

But in a few short weeks, it will abruptly stop, and we’ll have two magical months of New England summer. And let’s be honest: no one does summer quite like New England. (Hello, lobster rolls, anyone?!)

Before I think too deeply about how to develop a survey about the best places to get lobster rolls in New England, I’d better dive into this quarter’s newsletter.

—Alicia

(More) Survey Strategies That Work

We discussed survey strategies in our last newsletter, but this topic will never die. Today, we have three survey-related nuggets.

First up, let’s talk about those “previously validated scales.”

When creating a survey, it’s always wise to search for previously validated scales that assess the constructs you’re interested in. These scales are often built by professionals with specific expertise and tested thoroughly.

However, not all previously validated scales are valid for all samples.

Here are three questions you can ask to evaluate whether a previously validated scale is a good fit for your sample:

  1. What sample was used to validate the scale? Let me guess—a bunch of middle-income white kids? If the sample the scale was validated on does not match the sample you want to survey, it may not be a good fit for your sample.
  2. What age group was the scale tested on? Young people’s cognitive and emotional functioning changes by leaps and bounds. A scale built for elementary-age children is likely not a fit for tweens (and vice versa). And no, you should not take a scale validated for adults and just “tweak” the language.
  3. Who created the scale? If there’s no indication that young people were involved in developing or piloting the scale, you run the risk of using a scale that is statistically valid but does not accurately reflect how young people think about or talk about a concept.

Understand your survey’s goals.

According to keyword research, people in the U.S. search “how to write a survey” 260 times a month. (People search “how to write a good survey” only 70 times a month.)

If you’re one of those googling folks, stop right there and answer these questions: Who have you discussed the goals of this survey with? Where did you write those goals down?

If the answers are “fewer than four people” and “on a napkin I lost,” hit the pause button.

The first step in effective survey development is CREATING SHARED GOALS.

  • What are the stakeholders’ goals for the survey?
  • What do stakeholders want to be able to do with the results?
  • What are their concerns?

Remember, people invest in what they create. Engage stakeholders early in the goal-creation phase, and your entire survey process will go more smoothly.

PRO TIP: Record your shared goals in a file everyone can access. Refer to the goals whenever you discuss survey progress and results.

Don’t forget to randomize the order when presenting several options in an online survey.

Recently, Kristina attended a Portuguese sweet bread tasting at Groundwork Coworking Space in New Bedford. The entries were labeled “A” through “H,” and participants voted for the best one.

“A” won.

This brings up a good reminder: be sure to randomize the order whenever you present several options in an online survey. You can randomize the choices within a “select all that apply” item or even randomize whole blocks of items in a longer survey. Switching up the order helps you avoid primacy bias (options near the top of a list get picked more often) and systematic missingness from survey fatigue (items near the end get skipped more often).
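If you ever need to script this yourself (say, for a homegrown survey form or a quick simulation), here’s a minimal sketch in Python. The lettered options are placeholders, not Kristina’s actual ballot:

    import random

    # Placeholder options; swap in your own survey choices.
    options = ["A", "B", "C", "D", "E", "F", "G", "H"]

    def randomized_order(choices):
        """Return a fresh random ordering for one respondent."""
        shuffled = list(choices)   # copy so the master list stays intact
        random.shuffle(shuffled)   # in-place Fisher-Yates shuffle
        return shuffled

    # Each respondent sees an independently shuffled ballot.
    for respondent in range(3):
        print(respondent, randomized_order(options))

In practice, you rarely need code for this: most major survey platforms expose choice and block randomization as a question setting, so look for that checkbox before scripting anything.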

Kristina says sweet bread “A” really was delicious, though. 🙂

What We’re Geeking Out About . . .

Progress Alert

When reviewing the results of a survey, we always encourage people to consider, “What isn’t being asked, and why isn’t it being asked?”

Sometimes, what’s not included in the survey is as revealing as what is.

For the first time since 1997, the federal government has updated its policy for collecting data related to race and ethnicity by asking the question, “Who are we excluding?”

They spent two years reviewing over 20,000 comments and conducted almost 100 listening sessions. We’re hopeful that this process will result in federal data that are more representative and inclusive in years to come.

Read more about it here.

Fab Collab

Alicia had a blast collaborating with Kim Firth Leonard and Sheila B. Robinson to create a blog post discussing how to capture authentic youth perspectives through surveys.

It’s a quick read.

What’s Next For Us

If you know you want to conduct a program evaluation this fall, there’s still time to start planning. Remember, a lot of work goes into the planning stage: ironing out your logic model, engaging with key stakeholders, and piloting surveys. The longer the runway, the stronger the evaluation.

Avoid a fire drill in August. Contact us ASAP to get started before summer fever takes hold of all of us.

Wishing you a wonderful summer! See you in September.

Best,
The Team at Lynch Research Associates

Email us or just put a time on our calendar to talk more.