Accountability, Evaluation, Thinking

Remember the Qualitative Data

The phrase “data-driven decision-making” has become the gold standard for proposing policies, research, or other plans to improve outcomes in higher education. You cannot apply for funding for anything without evidence to justify the proposed project and a detailed evaluation plan. This, of course, should not be startling for higher education. Building cases for our research and related decisions is at the heart of all that we do. What has changed is what we define as sufficient evidence. Spoiler alert – it is quantitative data.

Much of this transformation is to the good. We have new tools that allow us to gather quantitative data far more easily than we once did. With Survey Monkey or Qualtrics, a well-designed questionnaire can be launched and its data analyzed with ease. Getting a good sample is still a challenge, but digital tools make reaching potential respondents easier than ever before. The tools for statistical analysis have similarly evolved, putting analyses that were once reserved for the most advanced mathematical thinkers within nearly everyone’s reach. And with access to large databases, the sky’s the limit for exploring the behaviors of various populations.

Then there is “big data,” which is proving so powerful in medical research right now. With access to studies from all over the world, scientists can reach a level of analysis far more nuanced than we once experienced. It is exciting to see that, with all of the information available, physicians can move from a generalized chemo cocktail to one tailored to the genetic traits of an individual. It is truly breathtaking to see the advances in science that these tools provide.

In higher education, the data-driven movement is reshaping our evaluation of university outcomes at every level. We move from the big picture – overall graduation and retention rates – and then fully scrutinize factors that might show we are systematically leaving specific groups of students behind. Colleges and universities are no longer (and should no longer be) satisfied with gaps in retention and graduation – often referred to as the achievement gap – that break down along gender, income, first-generation status, and other socio-cultural lines.

Attending to these gaps is, indeed, driving policies and programs at many universities. At WCSU, it has led to a revision of our Educational Access Program (Bridge) and to the addition of a peer mentor program. We’re tracking the impact on our overall retention rates, but also taking a deeper dive into different clusters in our community to see where we need to do more. What has really changed for us is that we design these efforts with follow-up analyses built in from the start, so that we don’t just offer things and then move on. We have a plan to refine as we go. This is a good change.

Still, this focus on statistical data can lead to gaps in understanding that are significant. As always, our results are only as good as the questions we ask. Our questions are only as good as our ability to see beyond our worldview and make room for things we never anticipated. This is a challenge, of course, because we often don’t realize we are making assumptions or that our worldviews are limited. It is the nature of our disciplinary frames; it is the nature of being human.

Although my education has included anthropology and media ecology (both with lots of attention to our biases and to qualitative data), I realize that I have been struggling to find ways to incorporate more qualitative analysis into all that we are doing at WCSU. It is tricky because it is more labor- and time-intensive than analyzing statistical outcomes or neatly structured survey data. It is also tricky because we need to be informed by the qualitative without falling into the problem of generalizing from a single case. And, of course, it is tricky because, well, it takes sustained practice with ethnography to do qualitative work well.

I was reminded of this as I began to read Gillian Tett’s Anthro-Vision: A New Way to See Business and Life. The book explores the ways in which the habits of anthropology can transform business processes of all kinds. It isn’t so much a “new” way to see things – after all, anthropology has existed as a discipline for over a century – nor is it new to see it as a tool of business and government (see Edward T. Hall for a glimpse of the past) – but it is an excellent reminder that anthropology offers a powerful lens. Tett’s book is full of examples of missteps in the tech industry and in marketing that occurred because those in charge never even questioned their assumptions about how people interact with technology; the hiring of a full-time anthropologist helped to address some of that. She also reminds us of the difference between asking questions and observing behavior – not because people lie, but because our questions were calculated to get the responses we got and therefore missed the bigger context. Our narrow lenses go beyond market research; they explain socio-political challenges and misunderstandings on a global scale. These are important reminders, all.

So, I am reminded to take the time to dive into the questions that most statistical research will miss. There is more to understand than the percentage of students who answer the question “How often do you use the tutoring resource center?” with often, sometimes, or never. There’s the whole long list of feelings that complicate seeking help. There’s that long list of other priorities (work, co-curricular activities, family). There are the things I haven’t thought of yet that are barriers to using the resources we are providing. There is research on this, I know, but I think there is more to know.

Yes, I am a fan of quantitative data, but I must admit that I have learned much more from qualitative data over the course of my life. The insights of the unexpected interaction, or the opportunity to observe for long(ish) periods of time, have improved my questions and understandings, and generated much more interesting follow-up work than summary data ever have. This is important for the work on academic success that we are engaged in at our universities. It is even more important (and not at all unrelated) when we try to see the barriers to creating a diverse, equitable, and inclusive environment. I’m thinking it may be time to put a few anthropologists on the institutional research payroll.

1 thought on “Remember the Qualitative Data”

  1. Missy,

    As a qualitative researcher, I applaud your post. Number crunching only goes so far. It is the story or experience that has greater meaning for most people. Sometimes we need to get at the heart of the matter when numbers do not represent a true picture of what is occurring, what people are thinking, and what people will remember in the long run. For example, my students are continually telling me that it is the clinical stories from my vast experience in the trenches that they remember most about a therapeutic intervention. I try to make things REAL for them. Qualitative research speaks to the reader in a unique and human way.

    I enjoyed your blog immensely.
    Well done!

    Mary Ellen Doherty

