Highlights from our MI AIR Conference Discussion in Ann Arbor, Michigan

Nov 21, 2017 | Events

We spoke – well, SoundRocket Research Consultant Julie M. Smith, Ph.D.; Research Programmer Rob Young; and Research Analyst Jillian Hunsanger spoke – at the 31st Annual Michigan Association of Institutional Research (MI AIR) Conference in Ann Arbor recently and, while they had a hoot of a great time at the conference, our three team members took their presentation on Maximizing Response: Practical Lessons from Campus Climate Surveys very seriously.

If you missed the conference you missed a ton of great information covering what Michigan colleges and universities are doing in the field of Institutional Research. But that doesn’t mean you need to miss the great insights, analyses and data (if we do say so ourselves) that the SoundRocket team presented. You can download the entire (free) PDF.

In the meantime, take a look at a short summary of two sections of their presentation, below.

Mobile Optimization: Help Ensure a Positive Survey Experience on Any Device

Ensuring that your campus survey is optimized for mobile phones helps increase your response and completion rates (students can answer your questions when it fits their schedules). But mobile also requires care, so that your questions “translate” well on whatever device your respondents happen to use.

In recent studies we have often found that more than one-third of students complete surveys on a mobile device.

Survey device options are great! But…beware of asking too many open-ended questions, because such queries can take too long on a smartphone or tablet. Also, keep an eye on your survey’s length: questionnaires that take more than 20 minutes to complete are best suited to a laptop or desktop, or to studies where the survey-taking environment is more controlled.

Creating an Email Strategy that Encourages Participation

Just like throwing a party to which no one comes (we’ve lived that nightmare!), putting together a campus climate survey to which few respond can be costly and discouraging.

To help ensure that you won’t be (to paraphrase Field of Dreams) building a survey to which no one will come, make sure you create respondent communications messages (invitations and reminders) that encourage participation. Inviting survey participation can sometimes be rocket science!

Our suggestions:

  • Send reminders every four days. Reminders that come too quickly, or too far apart, reduce overall response.
  • Send messages from a person (e.g., Dr. Jane Smith, Dean of Student Affairs) instead of from an institution or department.
  • Keep messages short, but include important information – greeting, survey purpose, value of each response, confidentiality, survey length, incentive info, advance thanks, survey link. (Yes, it is possible to create such messages!)
  • Communicate regularly with your IT department to help make sure messages get delivered (a box of donuts every now and then also never hurts).
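For teams that automate their survey communications, the four-day reminder cadence above is easy to generate programmatically. Here is a minimal sketch; the function name and parameters are our own illustration, not part of any particular survey platform:

```python
from datetime import date, timedelta

def reminder_schedule(launch: date, n_reminders: int, interval_days: int = 4) -> list[date]:
    """Return the send dates for the initial invitation plus n_reminders
    follow-ups, spaced interval_days apart (four days, per the guidance above)."""
    return [launch + timedelta(days=i * interval_days) for i in range(n_reminders + 1)]

# Example: a survey launching January 8 with three reminders
schedule = reminder_schedule(date(2018, 1, 8), n_reminders=3)
# schedule -> Jan 8 (invitation), Jan 12, Jan 16, Jan 20
```

You could feed these dates into whatever mailing tool your institution uses; the point is simply to keep the spacing consistent rather than sending reminders ad hoc.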

What about subject lines, e-mail send rates, and the optimal number of contacts? Well, you’ll just have to check out the presentation to learn more. Yes, we’re a tease: the info above is just a sampling of the actionable strategies for great response to your surveys in our presentation’s PDF, which you can get your hot little hands on right now simply by downloading it.

Don’t like free? Don’t like to read PDFs? Prefer to send emails to ask us to reach out to you? We can do that! Email away!

Currently, the FDA regulates only true direct-to-consumer (DTC) genetic tests, which have no health care provider involved either before or after testing. Consumer-initiated, physician-mediated genetic tests are considered lab-developed tests (LDTs), which currently do not require FDA oversight.


Our Study Design

Our study was designed to simulate the experience of an everyday person who is considering a health-related genetic test. For this reason, we reviewed only the website content presented to a consumer before ordering a test. By limiting our data collection to pre-test content, rather than digging around or contacting the companies to fill in missing data points, we revealed the gaps in the public-facing information that consumers use to make ‘informed’ decisions.

Also, while a genetic counselor supervised the project, a research assistant (RA) conducted most of the website investigations. The RA was familiar enough with genetics and genetic testing to understand and identify the information presented on the websites, but had not had the clinical exposure that might create bias from knowing how specific tests work behind the scenes.


To Sum Up

We set out to understand the landscape of health-related consumer genomics testing from the public perspective. Because we limited our research (by design) to public-facing pre-test website content, we could not complete our data collection as set out in the protocol. However, that limitation itself uncovered an important observation: consumer genomics websites are highly variable in content, readability, and ease of use.

This raises the question: if we can’t find basic test information on a consumer genomics website, how does a consumer have enough information to make an informed choice about testing?

Stay tuned for Part 2 in this series, where we will dig into our study findings and reveal our most interesting observations.  



As experts in FDA user comprehension studies for consumer genomics companies seeking 510(k) clearance, we are interested in how everyday people access and understand health content that is meant for them. If you need help optimizing your consumer-directed health communications, we’ve got the in-house expertise and experience to meet your needs. Let’s chat!

About the Author


Understanding human behavior—individually and in groups—drives our curiosity, our purpose, and our science. We are experts in social science research. We see the study of humans as an ongoing negotiation between multiple stakeholders: scientists, research funders, academia, corporations, and study participants.