Accountability, Evaluation, Thinking

Remember the Qualitative Data

The phrase “data-driven decision-making” has become the gold standard for proposing policies, research, or other plans to improve outcomes in higher education. You cannot apply for funding for anything without some evidence to justify the proposed project and a detailed evaluation plan. This, of course, should not be startling for higher education. Building cases for our research and related decisions is at the heart of all that we do. What has changed is what we define as sufficient evidence. Spoiler alert – it is quantitative data.

Much of this transformation is to the good. We have new tools that allow us to gather quantitative data much more easily than we once did. With SurveyMonkey or Qualtrics, a well-designed questionnaire can be launched with ease and the results analyzed quickly. Getting a good sample is still a challenge, but digital tools make reaching potential respondents easier than ever before. The tools for statistical analysis have similarly evolved, allowing ordinary researchers to perform analyses that were once reserved for the most advanced mathematical thinkers. And with access to large databases, the sky’s the limit for exploring behaviors of various populations.

Then there is “big data,” which is transforming medical research right now. With access to studies from all over the world, scientists can reach a level of analysis that is much more nuanced than we once experienced. It is exciting to see that, with all of the information available, physicians can move from a generalized chemo cocktail to one tailored to the genetic traits of an individual. It is truly breathtaking to see the advances in science that these tools provide.

In higher education, the data-driven movement is reshaping our evaluation of university outcomes at every level. We move from the big picture – overall graduation and retention rates – to close scrutiny of factors that might show us that we are systematically leaving specific groups of students behind. These disparities are often referred to as the achievement gap, and colleges and universities are no longer (and should no longer be) satisfied with gaps in retention and graduation that break down along gender, income, first-generation status, and other socio-cultural lines.

Attending to these gaps is, indeed, driving policies and programs at many universities. At WCSU, it has led to a revision of our Educational Access Program (Bridge) and to the addition of a peer mentor program. We’re tracking the impact on our overall retention rates, but also taking a deeper dive into different clusters in our community to see where we need to do more. What has really changed for us is that we are designing these efforts with follow-up analyses built in from the start, so that we don’t just offer things and then move on. We have a plan to refine as we go. This is a good change.
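As a rough illustration of the kind of disaggregated follow-up analysis described above, here is a minimal sketch in Python. The group labels and retention values are entirely hypothetical; real figures would, of course, come from institutional research data.

```python
from collections import defaultdict

# Hypothetical student records: (group, retained) pairs.
# These values are illustrative only, not WCSU data.
students = [
    ("first_gen", True), ("first_gen", False), ("first_gen", True),
    ("continuing_gen", True), ("continuing_gen", True),
    ("continuing_gen", True), ("continuing_gen", False),
]

def retention_by_group(records):
    """Return the retention rate per group and the overall rate."""
    counts = defaultdict(lambda: [0, 0])  # group -> [retained, total]
    for group, retained in records:
        counts[group][0] += int(retained)
        counts[group][1] += 1
    rates = {g: r / t for g, (r, t) in counts.items()}
    overall = sum(int(r) for _, r in records) / len(records)
    return rates, overall

rates, overall = retention_by_group(students)
# A negative gap flags a group the institution may be leaving behind.
gaps = {g: rate - overall for g, rate in rates.items()}
```

The point of disaggregating this way is exactly the one made above: an acceptable overall rate can hide a subgroup whose gap is persistently negative.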

Still, this focus on statistical data can lead to gaps in understanding that are significant. As always, our results are only as good as the questions we ask. Our questions are only as good as our ability to see beyond our worldview and make room for things we never anticipated. This is a challenge, of course, because we often don’t realize we are making assumptions or that our worldviews are limited. It is the nature of our disciplinary frames; it is the nature of being human.

Although my education has included anthropology and media ecology (both with lots of attention to our biases and qualitative data), I realize that I have been struggling to find ways to incorporate more qualitative analysis into all that we are doing at WCSU. It is tricky because it is more labor- and time-intensive than analyzing statistical outcomes or neatly structured survey data. It is also tricky because we need to be informed by the qualitative without falling into the problem of generalizing from the single case. And, of course, it is tricky because, well, it takes sustained practice with ethnography to do qualitative work well.

I was reminded of this as I began to read Gillian Tett’s Anthro-Vision: A New Way to See Business and Life. This text explores the ways in which the habits of anthropology can be transformative to business processes of all kinds. It isn’t so much a “new” way to see things – after all, anthropology has existed as a discipline for over a century – nor is it new to see it as a tool of business and governments (see Edward T. Hall for a glimpse of the past) – but it is an excellent reminder that anthropology offers a powerful lens. Tett’s book is full of examples of missteps in the tech industry and in marketing that occurred because those in charge never questioned their assumptions about how people interact with technology. The hiring of a full-time anthropologist helped to address some of that. She also reminds us of the difference between asking questions and observing behavior – not because people lie, but because our questions were calculated to get the responses we got and, therefore, missed the bigger context. Our narrow lenses go beyond market research; they explain socio-political challenges and misunderstandings on a global scale. These are important reminders, all.

So, I am reminded to take the time to dive into the questions that most statistical research will miss. There is more to understand than calculating the percentage of students who answer the question “How often do you use the tutoring resource center?” with “often,” “sometimes,” or “never.” There’s the whole long list of feelings that complicate seeking help. There’s that long list of other priorities (work, co-curricular, family). There are the things I haven’t thought of yet that are barriers to using the resources we are providing. There is research on this, I know, but I think there is more to know.

Yes, I am a fan of quantitative data, but I must admit that I have learned much more from qualitative data over the course of my life. The insights of the unexpected interaction, or the opportunity to observe for long(ish) periods of time, have improved my questions and understandings, and generated much more interesting follow-up work than summary data ever have. This is important for the work on academic success that we are engaged in at our universities. It is even more important (and not at all unrelated) when we try to see the barriers to creating a diverse, equitable, and inclusive environment. I’m thinking it may be time to put a few anthropologists on the institutional research payroll.

Accountability, Quality, Return on Investment

Outcomes Based Funding Metrics

This morning I read with interest a report from The Education Trust, entitled Re-Imagining Outcomes Based Funding. I was following up on Emma Whitford’s piece in Inside Higher Ed that focused on outcomes-based funding (OBF) as a tool for supporting equity. I must admit, I shuddered as I considered the hundred ways outcomes funding goes wrong, but Whitford and the report helped me to think about things in new ways. Chief among them: this approach actually supports a focus on who campuses admit, not just retention and graduation rates, and suggests that funding should take that into account. It seems we are getting somewhere on raising awareness about the bluntness of those measures. Hooray.

As I read through the metrics suggested, I saw some thoughtful connections between the students enrolled and the ways that our legislators might think about funding. Instead of just looking at retention and graduation rates, this approach prioritizes investing in campuses that serve more diverse student bodies. It also brings an important new variable into OBF: campus climate.

Campus climate is often an invisible component in the retention and graduation rates of a university. We spend a lot of time looking at ways to support under-prepared students, and we seek out scholarship opportunities for our under-funded students; these are really important things to do. But for first-generation students and students of color, these efforts are not sufficient. They must also feel welcome.

So how do we do that? Well, campus climate surveys are one way. Interestingly enough, they are not inexpensive to administer, and they are even more expensive to use well. It isn’t enough to gather the data; we need qualified personnel to analyze it and help the campus community find opportunities to improve. The funding for this work has to be new dollars. If it isn’t, it will get cut from the budget as soon as we have to prioritize our efforts; we will always choose direct student support over the broader climate. So, I’m glad this idea was raised in the report, but there are important financial implications to consider.

Then there was another piece in the report that gave me pause. In the section called “Ten Steps for Design” (of outcomes based funding models), the following was step five:

“Discourage institutions from reducing access to high-quality degrees or credentials for students from low income backgrounds and students of color.”

This statement is a response to the negative consequences of OBF as it has been implemented in the past. In short, the easiest way to improve retention and graduation rates is to change your admissions standards. Better-prepared students do better than those on the margins. Better-funded students do better than those who struggle to pay for their education. First-generation college students manage more uncertainty than their second- or third-generation peers and may be retained at somewhat lower levels. All of these students are likely to take longer than four years to graduate. Yes, the older model incentivized a less-inclusive campus. The new suggested strategies are a marked improvement.

At WCSU, about 35% of our student body are the first in their families to go to college. We are a relatively affordable school and find that this is attractive to many Pell-eligible students. We are also an increasingly diverse community, something we view as entirely positive, but our history is less so, and we are still learning about our invisible barriers and biases as we seek to be an inclusive campus. Most of what we do fits well into this re-imagined OBF framework with its focus on equity. In theory, we should benefit from greater support for our campus under this model.

But I must admit I do worry about additional unintended consequences if timelines for effectiveness are not robust enough and if there is not continuous dialog with our state representatives about how they read our metrics. For example:

  1. Even when recruiting and admission standards are comparable, a majority residential campus will do better on retention and graduation measures than a majority commuter campus. It is simply easier to help a student who is struggling when they have a regular presence in the campus community.
  2. Sufficient funding to create comparable experiences for our needier students is also an important consideration. Opportunities for internships, research experiences, or study abroad may require a cash infusion, or higher-need students will skip them in favor of more work hours. They simply need the funds. Unfortunately, these are the same high-impact educational experiences that inspire degree completion and graduate school applications and broaden career opportunities. Without that funding stream, schools that serve the less wealthy are likely to have outcomes measures that look weaker than those of their better-funded peers.
  3. Finally, timelines for evaluation are critical. Improvement of anything cannot really be seen in under six years in higher education. While degrees are imagined in four-year increments, the students who need more support tend to take five to six years. The effectiveness of an intervention on retention could show up quickly, but its sustained impact will take time. All other interventions are better seen over the course of a degree. And six years is only a minimum, because at that mark you are measuring just the first cohort. Sustained improvement is better captured in 8-10 year cycles.
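The timeline arithmetic in that last point can be made concrete with a small sketch. The six-year window reflects the common 150%-of-four-years graduation measure; the example years and the three-cohort threshold for “sustained” evidence are my own assumptions for illustration.

```python
def first_observable_year(intervention_year, horizon_years=6):
    """Year in which the first full cohort's graduation rate is measurable.

    A cohort entering in `intervention_year` is tracked for `horizon_years`
    (six years is the common 150%-of-a-four-year-degree window).
    """
    return intervention_year + horizon_years

def sustained_evaluation_year(intervention_year, horizon_years=6, cohorts=3):
    """Year by which several consecutive cohorts have completed the window,
    approximating the 8-10 year cycle described above (cohort count assumed)."""
    return intervention_year + horizon_years + (cohorts - 1)
```

So a program launched in 2022 yields its first six-year graduation rate in 2028, and three cohorts’ worth of evidence only in 2030 – spanning several election cycles, which is the dialog problem raised below.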

These nuances are hard to convey when elections are in 2- and 4-year cycles. No matter how invested elected officials are in education, there is opportunity for too narrow a view. So, I remain skeptical about the ability to create an outcomes based funding model that can truly support great education that is equitable. But I am very excited to see equity put at the center of the question. That is a great leap forward.