Evaluation, Growth Mindset

Staring at the Data

Let me give it to you straight – WCSU is struggling to improve our retention rates. As long as I have been at this university (since 2012), we have been trying to figure out why so many students leave prior to degree completion and why so many leave after the first year. We have tried numerous strategies, each time hanging our hopes on a new effort to crack the code. Here is the list, since 2012:

We had a limited FY program (not required, so only partial enrollment). Part of that program in my first two years here included cohorting of classes for FY students (they took three courses together). Though the idea was a good one, it did not have the desired effect. Without full community backing, the impression among the students was that they were being punished in some way. Among the students involved, it had a (very small) negative impact on retention. We stopped doing that.

We now have a full FY program (required as part of our general education curriculum). Departments have developed courses that act as extended orientations to the discipline, which seems to have improved our students' time to degree completion (that rate has been moving up from 48.9% to 53%, starting with the 2012 cohort). This, combined with published four-year plans and a general education curriculum that has fewer hidden requirements than the prior version, seems to be moving us in the right direction on graduation rates.

Unfortunately, the FY program and the related efforts did not improve our retention rates, which hovered around 73% for years, with one up-year – we hit 75.5% in 2019, only to crash to 69.9% in 2020. That is definitely a COVID impact. We'll see how recovery goes.

I've been paying attention to our data, and the number one predictor of student retention at WCSU is high school GPA. Students with an 85% or higher high school GPA tend to be retained in the 80% or higher range (often as high as 85%, which is a great number for a regional public comprehensive university). That number drops immediately at an 84% high school GPA (down 6-10%) and declines in a fairly linear fashion with each point below that. A perfectly nice B/B- high schooler is at least 10% more likely to be lost in the first year than a solid B/B+ or higher student. This isn't just WCSU; it is a national trend. This is data we can act upon.
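
For colleagues who want to see the mechanics, here is a minimal sketch of how that GPA-band analysis might be run. It assumes a student-level extract with hypothetical columns (hs_gpa on a 0-100 scale, retained coded 0/1); the file and column names are illustrative, not our actual institutional research schema.

```python
import pandas as pd

# Hypothetical extract: one row per first-year student.
# Column and file names are invented for this sketch.
students = pd.read_csv("first_year_cohort.csv")

# Bucket high school GPA into one-point bands and compute the retention
# rate per band, to check how steep and how linear the drop-off is.
students["gpa_band"] = students["hs_gpa"].clip(70, 95).round()
retention_by_band = (
    students.groupby("gpa_band")["retained"]
    .agg(rate="mean", n="count")
    .sort_index(ascending=False)
)
print(retention_by_band)  # expect ~0.80-0.85 at 85+, sliding downward below 84
```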

And we have. We've launched a peer mentor program to support students in that B/B- category. We're hoping just a little support from their friends will help them stay. A group of folks from tutoring, the library faculty, the first-year coordinator, academic advising, orientation, and our alternate admissions team gathered to make a plan. They agreed on a training plan for the peer mentors and recruited a diverse group of students to do this outreach. They are now analyzing what worked and where to improve for next year, and the team is expanding. We won't know the impact on retention until the fall, but this work has been exciting, and it feels good to take action and measure results.

With this program launched, there is more data to consider, and it is time we do so. The director of institutional research has been running some queries on a few variables that seem to be related to retention. The two reports he just sent my way were startlingly clear. High school GPA is predictive across majors (not just as a general indicator of losing a student): the higher the percentage of students in a major with an 85% high school GPA or above, the higher that major's retention rate. Not surprising, based on what we knew before this point, but here is where it gets interesting. There is a striking exception to this relationship, and it occurs in a major that is organized as a very close cohort. The students who started in that cohort, and were not fully successful in their original major, stayed at WCSU even when they had to change their major.
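
A hedged sketch of what such a cross-major query might look like, again with hypothetical file and column names; the point is that an outlier like the cohort major surfaces as a large positive residual above a simple fitted line.

```python
import numpy as np
import pandas as pd

# Same hypothetical extract as before, now with a major column.
students = pd.read_csv("first_year_cohort.csv")

# Per-major retention rate vs. the share of students at or above
# an 85% high school GPA.
by_major = students.groupby("major").agg(
    high_gpa_share=("hs_gpa", lambda g: (g >= 85).mean()),
    retention=("retained", "mean"),
    n=("retained", "count"),
)

# Fit a simple line; a major sitting well above it (like the tight-cohort
# program) retains better than its GPA profile alone would predict.
slope, intercept = np.polyfit(by_major["high_gpa_share"], by_major["retention"], 1)
by_major["residual"] = by_major["retention"] - (slope * by_major["high_gpa_share"] + intercept)
print(by_major.sort_values("residual", ascending=False).head())
```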

What can we do in light of this data? Perhaps a deeper dive into why some degrees attract more students with lower high school GPAs, paired with a more specific series of supports in those majors? Maybe. Or maybe we should think about how to create cohorts in programs where they aren't naturally occurring. Maybe this is the glue that will help the students succeed. With all of the national data that links student persistence and transparent pathways through degrees, cohorts might help us focus those pathways even further. For a majority commuter campus, cohorts might also help our students connect with peers and feel connected to the community. Cohorts provide exciting possibilities that I'm keen to explore with my colleagues.

The second piece of data was about math. Math success is not a new problem, and our math department has tried numerous strategies to improve student outcomes. They are working hard, incorporating new technologies and trying new approaches. Nevertheless, we have a persistent problem.

A review of five years of data shows us the following (a sketch of how these splits might be computed follows the list):

  • A shocking number of students are placed into Intermediate Math (credit-bearing, pre-general education math): nearly 50%. This is true both for students with an 85% or higher HS GPA and for those below it.
  • Students who earn a D, F, or W in any math class in their first year are retained at a rate between 63% and 68%. Those who fail Intermediate Math, the course with embedded remediation, are retained at a rate below 50%.
  • Students who fail any math class in their first year are 20% more likely to have a high school GPA below 85% than those who pass.
  • Students who pass any math class (including the one with embedded remediation) are retained at a rate of about 82-84%.
  • The retention rate for students who take no math class in their first year drops back down to about 70%.
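
A minimal sketch of how those splits might be computed from a student-level extract, with invented column names (first_math_result, intermediate_math, retained) rather than our actual schema:

```python
import pandas as pd

# Hypothetical columns: first_math_result is 'pass', 'dfw' (D, F, or W),
# or 'none' (no math in year one); intermediate_math flags the
# embedded-remediation course; retained is 0/1.
students = pd.read_csv("first_year_cohort.csv")

def math_group(row):
    if row["first_math_result"] == "none":
        return "no math in year one"
    if row["first_math_result"] == "dfw":
        return "DFW, Intermediate Math" if row["intermediate_math"] else "DFW, other math"
    return "passed a math class"

students["math_group"] = students.apply(math_group, axis=1)
print(students.groupby("math_group")["retained"].agg(rate="mean", n="count"))
```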

Given all of this, we might want to re-think our math strategies. We need to dig deeper into the patterns of who is ending up with the D, F, or W in those first courses, to see if we have a few other predictors besides HS GPA. We appear to need better placement strategies than those we are currently using. We might even need a prep course prior to placement testing. We probably need differentiated delivery of our curriculum based on a combination of variables, not just our placement tests. We may need some kind of intersession process to keep students from failing or withdrawing in the first place. We've tried variations on each of these in the past, but, like the half-implemented FY program, they haven't fully worked. It is time to focus and develop a plan for better math outcomes. Those outcomes are so directly related to our retention rates that it is imperative that we do so.
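
One hedged way to do that digging, sketched below with a basic logistic regression; the predictor columns are placeholders for whatever we actually track, not a claim about our data warehouse.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# dfw = 1 if the student earned a D, F, or W in a first-year math course.
# The predictor columns are illustrative placeholders.
enrollments = pd.read_csv("first_math_enrollments.csv")
features = ["hs_gpa", "placement_score", "credits_attempted", "works_20plus_hours"]

X = enrollments[features].fillna(enrollments[features].median())
X = StandardScaler().fit_transform(X)  # put coefficients on a comparable scale
y = enrollments["dfw"]

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.3f}")  # sign and magnitude hint at which variables matter
```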

Yes, I'm staring at the data and trying to imagine next steps. I know I have a community of colleagues who want more for our students. I know I have a community of colleagues who are eager to find better paths. I'm hoping my colleagues will ask even better questions of our data, and embrace trying new things as a result. I hope those new things are coordinated, measured, and refined as we continue to learn more about our students. I hope all of this, because one thing is certain: staring at the data is not enough.

Engagement, Evaluation

Learning from Students

For the last ten years I have been a full-time administrator. In that time, I've focused on student learning outcomes and university effectiveness. I've obsessed over better pathways through WCSU, hoping to eliminate the unintentional barriers to graduation and the policies that are too heavy-handed, punishing all students for the poor behavior of a few. I regularly review all the data I can gather about who is succeeding and who is not, trying to address gaps and make things better. Some of those efforts have been effective, improving our overall outcomes; some do not seem to have made a difference. Nevertheless, I forge ahead in that continuous improvement cycle, because it is my job and because I care.

This semester, due to a series of events (read: COVID), I am back in the classroom. Adding one course to my insane workload might seem crazy, but it turns out to be the very best part of my week. I am teaching Public Speaking (something I can manage to keep up with, since so much of the feedback is in the classroom), and truly enjoying the interactions with the students. They are as I remember them: equal parts interested and ambivalent about their education. Some are always early to class, others often late. Everyone starts the morning looking at their phones. It is my job to get them to look up.

This is a very active class, with a lot of what I call "pop-ups" to help students fight the pervasive fear of public speaking. During most classes, everyone gets up in front of the class to tell us something. You can learn a lot about your students from pop-ups. They reveal attitudes, interests, and experiences that help me see what they are experiencing in the class and in their lives. This is also a first-year class focused on orientation to college, so a lot of the prepared speeches focus on things at the university. Last Friday the students presented their first informative speeches, and I learned a lot about the student experience at WCSU.

Lesson 1: Our study spaces matter. It is not surprising that many students focused their informative speeches on physical spaces. It is a very open-ended assignment – tell us about something at WCSU – so several students identified locations to describe. Those who did emphasized the places where they can sit down and get some work done. I was happy to hear their tales of using our library, computer labs, quiet lounges, and not-so-quiet spaces to get through the day. Developing these kinds of spaces has been part of our campus master plan, and the facilities team has done a great job of finding spaces in every building for students to land. Our library faculty and staff have completely reimagined the library as a campus hub, with academic supports (tutoring, research, writing center) and a bagel shop. This one assignment tells me that our efforts were worth it.

But it isn't just that they described the spaces; they described their days. They told tales that were familiar to me because I was a commuter student many years ago. With classes spread out throughout the day, and the inefficiency of going home or traveling between our two campuses, our spaces are essential for managing gaps between classes. Having those spaces near help (the library) and faculty (the science building in particular) was seen as a big bonus. Having access to computers (all over campus) helps them do assignments that are a pain on their mobile devices (even laptops). And being able to find a quiet space to study, or a more social space that might help them meet other students, proved just as important.

Lesson 2: Our students are interested in co-curricular activities as part of their undergraduate experience. As a majority commuter campus, we sometimes worry about the students who stop in for class and just go home. Yet, this was not what the students in my class focused on. There are athletes (commuter and residential) who described the demands of their practices and games and how they juggle those demands. As first-year students, the athletes faced a big transition from high school sports to college athletics. This transition was described as both intimidating and rewarding. Other students talked about being part of our arts programs and hoped to lure some other students to the performances. This group seemed to have a built-in buddy system with their ensembles, exhibitions, and performances. Both of these groups of students appear to be thriving already because they have well-defined communities at WCSU, filled with both curricular and co-curricular activity.

But our offerings are not meeting all of our students' needs. For several, who are not in those well-defined cohorts, our clubs are falling short. Every campus likes to brag that students can start any club they'd like, and that is sort of true, but it is not something that a first-year student is inclined to do. Finding something of interest is important for these students so that they do connect with others and with the campus experience outside of the classroom. It was clear that our communication about clubs is falling short. I must admit I flinched as I heard tales of broken links and missing details about who is involved or when a club might meet. In addition, the meeting times for these student-run organizations absolutely dissuade our commuter students from participation. They would have to return to campus after 8:00 pm, when they have already been to class, hung around between classes, and perhaps even gone to a part-time job. Even young people don't really want to do that.

So, we have work to do here. One student suggested we survey students about their interests; I think we might need to do this every year. We also need to carve out some time slots during the day with no classes scheduled so that we can invite more students to join in these activities. These are details about our campus that I suspected to be true, but hearing them directly from the students really brought them into focus. We need to help them participate if we want them to thrive.

Lesson 3: Given half a chance, the natural inclination of our students is to be supportive. This is particularly true in a class where everyone has to stand up in front of the room and deliver a speech. We all applaud, of course, that's just good manners, but the supportiveness comes out in other ways. As we summarized the successes and areas for improvement after our first prepared speeches, students observed growth in their peers already. One noted that everyone's voice was stronger and more controlled than in the first pop-up; another observed that the topics were interesting, and the speakers were prepared. Suggestions for improvement focused on degrees, not absolutes – try to look around the room a bit more, make more eye contact, and try not to pace. These were offered as gentle encouragement. No one felt the need to be negative or harsh in those pointers.

This supportiveness is also expressed in their desire not to offend me as they apologize for lateness or absences or messing up a due date on an assignment. Surely they want my forgiveness (no points off), but I feel that there is also a desire not to appear rude or dismissive of the work we are doing together. In this FY class, I want to encourage that behavior; I want them to feel that I am supportive of them, too. I think carefully about my responses, hoping to support each student while encouraging improvement.

Most of what I have learned so far confirms the data that I regularly review, but teaching gives me a great opportunity to move away from my spreadsheets and see things first-hand again. Being in the classroom brings the trend lines to life and, in some cases, makes clear some patterns that those lines don't fully reveal. I am not sure I will be able to teach another course anytime soon, but I am grateful for this opportunity to learn from our students. The lessons they provide are powerful, indeed.

Evaluation, Quality

The Follow Through

Here is a question that I am frequently asked: Why would you want to be in administration? It started with my first truly administrative role, assistant dean, and it persists even now that I have served as provost for nearly six years. As a person whose career began as an adjunct faculty member, then moved to a tenure-track and eventually tenured faculty line, it is not lost on me that there are losses when one leaves the classroom. The dreams that led me to higher education were built around love of my discipline and the desire to help students see its value. Teaching is hard work, often frustrating, often rewarding, but it carries with it a clarity of purpose – teach the students in front of you. Living that purpose is exhilarating.

So, why move to administration? Well, for me it was about an ever-widening circle of concerns about how students were experiencing their education. One of my earliest questions was about whether or not students were getting the most out of the totality of their degrees, instead of just focusing on the major. I worried about the connections students were not making between those required humanities or social sciences or science courses and their major. Once I opened that can of worms, my attention moved away from my discipline and toward education as a whole. Thus, an administrator was born.

What does that holistic perspective mean now that I am a provost? It means I continuously examine data about who we serve, who is thriving, who is not, what students are learning, where our programs are strong and where they need support, what new ideas about teaching are emerging and how to engage faculty with those ideas, and of course, since WCSU is in New England, what to do about enrollment. There's more. There are questions about equity for everyone (students, faculty, staff). There are questions about processes and organizational structures, and whether they are doing what we want them to do. There are questions about the balance of scholarship, teaching, and service for faculty, and about appropriate support for professional development for everyone. There is no shortage of things to think about when you are trying to imagine an effective and rewarding whole.

Unsurprisingly, I do a lot of reading about higher education developments and trends. Indeed, this Sunday, as I settled in to review the news and enjoy my morning coffee, I found my attention drawn to a publication from the Chronicle of Higher Education called The Truth about Student Success. I know, why ruin a perfectly nice Sunday? But I am worried about outcomes, and so I downloaded the document and read it through. When I was done, all I could think was: but we've done all that already!

Except we haven’t quite. Despite my best efforts to foster an environment where ideas are welcome, strategies for improvement are implemented, and then results are examined, I think I am falling short on the part where we learn from it all. It reminds me of my early days in administration when I realized that higher education is very good at starting (adding) things, but terrible at finishing (subtracting) things. Even worse, we are often missing the part where we examine results and act on them, you know, closing the loop.

Over the last twenty years, higher education as a whole has developed some reasonable habits around the use of assessment to improve curriculum. Everyone has learning outcomes now and assessment plans to trigger reviews of the results. Some plans are better than others, and some programs are more committed to the meaning of those outcomes than others, but overall, folks are trying to learn from their efforts at assessment. At WCSU, I can see the impact of this work on curriculum and to some degree on teaching strategies. This has room to grow, and the sharing of this information is spotty, but it is going on.

We are (I am) less successful at systematic use of the data about the rest of what we do. For example, has the implementation of Degree Works improved academic advising? Has asking about advising practices in annual reports resulted in any changes in strategies at the department level? Are the pre-major pathways (meta-majors) reducing the time to graduation and the accumulation of excess credits? When faculty have participated in teaching institutes, has it changed their teaching strategies? Has it improved outcomes? Has the transition to embedded remediation reduced the number of students stuck in foundational courses? When we see that some courses have very high withdrawal or failure rates, are we acting on that information? There is so much more, but this is the main idea.

As fast as I run, I can't seem to stay on top of all of this. I have not even managed to implement a good data dashboard to try to keep people in the loop on these things. I hope to complete one this semester, but in the meantime, things are filtering from Deans to Department Chairs to Faculty (maybe), listed in my weekly announcements (sometimes), announced at our University Senate meetings (when time allows), and listed in annual reports (usually). I have to do better.

Without consistent examination of information by the whole community, all of those good things we are doing will just be in pockets (silos). Departments (academic and otherwise) will continue to try new things, but we’ll never see the full impact. We risk not learning from each other and duplicating efforts that would be better if coordinated across areas. We risk abandoning strategies too soon or simply forgetting they are underway. We risk under-investing in things that show signs of working. Most of all, we squander the value of a shared effort to be better, and that is a fundamental waste of talent and resources.

So, as I finished my coffee and that darned report on The Truth about Student Success, I realized that there is no more pressing initiative than establishing good processes for gathering, analyzing, and distributing the information we already have. There is nothing new to do but that. We're doing all of the other things that everyone else is doing. If we get this part right, we might be able to re-double our efforts on things that are working and stop doing the things that are not. That's the follow through, folks. We need to learn from what we do.

Examining our processes and making sure that a data dashboard gets done this semester are two more things on my endless list of duties, of course, and I wonder how I'll get them done. But I have to, because there are no magic bullets to discover; there are only evaluations of what we have already done and plans for next steps. The data dashboard is on me, but I hope that the result is for everyone. I'm hoping that with better follow through, many more members of our community will work together to improve the whole of the university experience.

Accountability, Evaluation, Thinking

Remember the Qualitative Data

The phrase "data-driven decision-making" has become the gold standard for proposing policies, research, or other plans to improve outcomes in higher education. You cannot apply for funding for anything without some evidence to justify the proposed project and a detailed evaluation plan. This, of course, should not be startling for higher education. Building cases for our research and related decisions is at the heart of all that we do. What has changed is what we define as sufficient evidence. Spoiler alert – it is quantitative data.

Much of this transformation is to the good. We have new tools that allow us to gather quantitative data much more easily than we once did. With Survey Monkey or Qualtrics, a well-designed questionnaire can be easily launched and its data analyzed. Getting a good sample is still a challenge, but digital tools make reaching potential respondents easier than ever before. The tools for statistical analysis have similarly evolved, putting functions that were once reserved for the most advanced mathematical thinkers within reach of any study's analysis section. And with access to large databases, the sky's the limit for exploring the behaviors of various populations.

Then there is "big data," which is proving so powerful in medical research right now. With access to studies from all over the world, scientists can get at a level of analysis that is much more nuanced than we once experienced. It is so exciting to see that, with all of the information available, it is possible for physicians to move from a generalized chemo cocktail to one tailored to the genetic traits of an individual. It is truly breathtaking to see the advances in science that these tools provide.

In higher education, the data-driven movement is really impacting our evaluation of university outcomes at every level. We move from the big picture – overall graduation and retention rates – to a full scrutiny of factors that might show us that we are systematically leaving specific groups of students behind. These disparities are often referred to as the achievement gap, and colleges and universities are no longer (and should no longer be) satisfied with gaps in retention and graduation that break down along gender, income, first-gen, and other socio-cultural groupings.

Attending to these gaps is, indeed, driving policies and programs at many universities. At WCSU, it has led to a revision of our Educational Access Program (Bridge) and to the addition of a peer mentor program. We're tracking the impact on our overall retention rates, but also taking a deeper dive into different clusters in our community to see where we need to do more. What has really changed for us is that we are designing these efforts with follow-up analyses built in from the start, so that we don't just offer things and then move on. We have a plan to refine as we go. This is a good change.

Still, this focus on statistical data can lead to gaps in understanding that are significant. As always, our results are only as good as the questions we ask. Our questions are only as good as our ability to see beyond our worldview and make room for things we never anticipated. This is a challenge, of course, because we often don’t realize we are making assumptions or that our worldviews are limited. It is the nature of our disciplinary frames; it is the nature of being human.

Although my education has included anthropology and media ecology (both with lots of attention to our biases and qualitative data), I realize that I have been struggling to find ways to incorporate more qualitative analysis into all that we are doing at WCSU. It is tricky because it is more labor- and time-intensive than analyzing statistical outcomes or neatly structured survey data. It is also tricky because we need to be informed by the qualitative without falling into the problem of generalizing from the single case. And, of course, it is tricky because, well, it takes sustained practice with ethnography to do qualitative work well.

I was reminded of this as I began to read Gillian Tett's Anthro-Vision: A New Way to See Business and Life. This text explores the ways in which the habits of anthropology can be transformative to business processes of all kinds. It isn't so much a "new" way to see things – after all, anthropology has existed as a discipline for over a century – nor is it new to see it as a tool of business and governments (see Edward T. Hall for a glimpse of the past) – but it is an excellent reminder that anthropology offers a powerful lens. Tett's book is full of examples of missteps in the tech industry and in marketing because those in charge never even questioned their assumptions about how people interact with technology. The hiring of a full-time anthropologist helped to address some of that. She also reminds us of the difference between asking questions and observing behavior – not because people lie, but because our questions were calculated to get the responses we got and, therefore, missed the bigger context. Our narrow lenses go beyond market research; they explain socio-political challenges and misunderstandings on a global scale. These are important reminders, all.

So, I am reminded to take the time to dive into the questions that most statistical research will miss. There is more to understand than calculating the percentage of students who answer the question: “How often do you use the tutoring resource center?” with often, sometimes, or never. There’s the whole long list of feelings that complicate seeking help. There’s that long list of other priorities (work, co-curricular, family). There are the things I haven’t thought of yet, that are barriers to using the resources we are providing. There is research on this, I know, but I think there is more to know.

Yes, I am a fan of quantitative data, but I must admit that I have learned much more from qualitative data over the course of my life. The insights of the unexpected interaction, or the opportunity to observe for long(ish) periods of time, have improved my questions and understandings, and generated much more interesting follow up work than the summary data have ever done. This is important for the work on academic success that we are engaged in at our universities. It is even more important (and not at all unrelated) when we try to see the barriers to creating a diverse, equitable and inclusive environment. I’m thinking it may be time to put a few anthropologists on the institutional research payroll.

Evaluation, Higher Education, Hope

Continuous Improvement

With Passover and Easter upon us and the daffodils beginning to push through the soil, it is that time of year when I feel the joyous rebirth and renewal that comes with spring. It is always a welcome sensation that helps lift me up from the endless to-do lists as I take the opportunity to reflect on all we have accomplished this year. As is natural to our structure, we are heading toward an intense period of productivity – exams, papers, grading, annual reports, assessments, and even a few accreditation visits. It could be too much, except we all know there is a break at the end, so we push ahead in this fury of activity, breathless, exhausted, and, I hope, proud.

I have been thinking about our reflective practices a lot lately. In higher education, we have a way of broadening our students’ perspectives while unintentionally narrowing our own. We introduce ideas and worldviews with the passion we feel for our disciplines. We strive to develop the habits of inquiry that have served us so well as scholars, and perhaps even as citizens. But we are also specialists, focused on one field and even one aspect of our field. We train ourselves to attend to the details of that specialty and sometimes we miss the connections to other things that are so important.

If I am totally honest, we also get a little insular, not just in our fields, but also within our universities and our departments. This insularity can lead us to think we are better than everyone else or, much more commonly, that we do not measure up. Neither of these is a productive position for educators. So, as the rituals and rush of spring are upon me, I am thinking about the value of external perspectives on our work.

When I began teaching in an undergraduate program in communication, our department had a habit of cultivating student research so that students might attend the professional conferences in our field. Several of my colleagues routinely took students to the regional and national communication conferences. There was an expectation that I would do so, too. I succeeded, starting at the regional level, but I must say that I was terrified. I was worried that the work was not good enough and that I had inadvertently set my students up for embarrassment. This did not happen. Participation showed me that my students' work was well within the normal range, with some exceeding expectations. This boosted my confidence as a professor and did wonders for my students. It was an amazing peer review experience.

Soon I was involved in program review. I contributed to the department report and listened carefully to the feedback from colleagues from two external programs that our department admired. At that university, the norm was to select visitors from programs that we aspired to emulate. This, too, can inspire insecurity. Our admiration for the visitors' programs made us think we were somehow second-rate. Yet, the experience was incredibly helpful. There was lots of positive feedback, and some good suggestions for how to improve. We took those suggestions to heart, and the impact was clearly visible in our evaluation of our learning outcomes the next year. It was another eye-opening experience.

These days, I spend a lot of time reading reports written for accreditors. While I am fully on board with regional accreditation, I confess that I have some misgivings about the many discipline-specific accreditations that we subscribe to in higher education. Defining the norms and expectations of a field at a national level is incredibly helpful, and I have zero doubt that this is productive and supports continuous improvement. What gives me pause is that some of these require overly complex evaluations and, well, the costs are not insignificant. I am not all that convinced that the results are more powerful than the simple peer review provided by colleagues from programs we admire. Nevertheless, there is value in the reflective process and the external perspective that these accreditation processes require.

Really, there is value in all of our self-assessments, external reviews, and even our annual reports. These tasks and processes force us to look up from our to-do lists and think about all we have accomplished. They force us to look around and ask ourselves how we fit into the higher education landscape. They ask us to consider whether we measure up to the expectations of our fields. Best of all, they provide an opportunity to think about what we might do better. For me, that last bit is where the fun begins.

Yes, I said fun. Amid the drudgery of doing assessments, writing annual reports, and preparing for site visits, the excitement is in the possibility for growth. We might revise a course or a program. We might find an opportunity to expand or re-focus our offerings. We might see room for building interdisciplinary partnerships within the university or with external programs and organizations. We might get a new idea. Nothing is more exciting than a new idea.

So, as we welcome spring and face the big race to the finish line, I am inviting everyone to see their to-do lists through this lens. We are not just finishing things; we are looking for opportunities to grow and improve. This is the why of it all and the true opportunity for rebirth.