Higher education institutions generate a wealth of data that can be used to improve student success, but the sheer volume of data and a lack of analysis often prevent it from having the impact it could. “I think it’s hard for the general faculty population or administrator population to really have a handle on the data that is really driving decisions,” says Margaret Martin, Title III director and sociology professor at Eastern Connecticut State University. “They don’t get a chance to see it or they just get very infrequent information about it. So there may be too much data, but it’s often not communicated effectively to people in ways that are both understandable and useful to them.”
In recent years, Eastern Connecticut State University has made efforts to address this issue by creating a data-driven approach to a longstanding priority of the university: helping low-income, minority, and first-generation students succeed.
Incorporating this goal into the strategic plan has kept the issue at the forefront. Title III grants and participation in the Nellie Mae Education Foundation’s Project Compass, a five-year initiative to improve retention and graduation rates, have provided the funding and accountability to make this effort possible and to sustain its momentum.
The first step in Project Compass was to identify the various sources of data. “Student affairs and housing are always collecting data. The library collects usage data. We have data on who gets tutoring. Everyone’s collecting some data. At the same time, all universities have to provide external reports on retention rates, not only for all students but also broken down by ethnic groups.
Then there are evaluation surveys that freshmen and seniors fill out—satisfaction surveys and engagement surveys. All these data are there, but we wanted to know: Who’s at risk of leaving in the first year? Who’s at risk of leaving after the second year? Can we use the information on the students coming in and develop a model that will predict who’s more likely to leave? And that’s what we did,” says Carmen Cid, dean of the School of Arts and Sciences at Eastern Connecticut State University.
Understanding the student population
Although it’s important to understand effective practices from peer institutions, each institution has a unique culture that needs to be understood in order to help students succeed. At the outset, “most people really couldn’t characterize our student population. We might have had some sense of gender distribution, maybe a little bit about ethnicity, but not a whole lot. So part of it was plodding along, trying to ask very simple questions about our students and adding that to our dataset,” Martin says.
Before this initiative, the university was not collecting first-generation status or high school GPA data, so the admissions form was modified to capture them.
The university uses a logistic regression model to predict which students will be at risk of withdrawal. The model uses high school GPA, engagement, and other factors to assign each student to one of five withdrawal-risk levels. In addition, project participants collected qualitative data through focus groups to determine some of the reasons why students were staying or leaving. “Numbers are very important, but you have to do student focus groups,” Cid says.
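The article does not detail how the model is implemented, so the following is only a minimal sketch of the general approach it describes: fit a logistic regression on predictors such as high school GPA and an engagement measure, then bin each student’s predicted withdrawal probability into one of five risk levels. All column names, sample data, and cut points here are hypothetical, not the university’s.

    # Illustrative sketch only; the university's actual model, predictors, and
    # risk thresholds are not described in the article. Column names such as
    # hs_gpa, engagement_score, first_gen, and withdrew are hypothetical.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical historical cohort: predictors plus a 0/1 withdrawal outcome.
    cohort = pd.DataFrame({
        "hs_gpa":           [3.6, 2.4, 3.1, 2.8, 3.9, 2.2, 3.3, 2.9],
        "engagement_score": [0.8, 0.3, 0.6, 0.4, 0.9, 0.2, 0.7, 0.5],
        "first_gen":        [0,   1,   0,   1,   0,   1,   0,   1],
        "withdrew":         [0,   1,   0,   1,   0,   1,   0,   0],
    })

    features = ["hs_gpa", "engagement_score", "first_gen"]
    model = LogisticRegression().fit(cohort[features], cohort["withdrew"])

    # Score an incoming class and bin each predicted withdrawal probability
    # into one of five risk levels (cut points are illustrative).
    incoming = pd.DataFrame({
        "hs_gpa": [3.0, 2.3],
        "engagement_score": [0.5, 0.25],
        "first_gen": [1, 1],
    })
    prob_withdraw = model.predict_proba(incoming[features])[:, 1]
    risk_level = np.digitize(prob_withdraw, bins=[0.2, 0.4, 0.6, 0.8]) + 1

    for p, level in zip(prob_withdraw, risk_level):
        print(f"predicted withdrawal probability {p:.2f} -> risk level {level} of 5")

In practice a model like this would be fit on several years of historical cohorts and validated before anyone acts on the scores; the point of the sketch is simply how a continuous predicted probability can be turned into a small number of actionable risk levels.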
For example, analysis of quantitative data showed that transfer students do particularly well in some majors, which makes them a key target population to recruit. Understanding why they succeed is a different matter. Cid and her colleagues got at this important information by asking successful transfer students from each of the university’s feeder schools about their experience at the university and what was helpful for them.
“You have to work this from various angles. You really have to develop a student success network. You have to know who needs extra help, given what you know from these data. Half the people who leave the first year are not necessarily academically deficient. They’re leaving to transfer to other places, perhaps a community college because it’s cheaper or closer to home. Or they actually improved their calculus skills, and they’re going to another university that has an engineering program. These are things you find out, but the main thing is that you have to have various people looking at the data together and talking about it,” Cid says.
One way this dialogue has been facilitated has been through the community of practice created as part of Project Compass. Funding was contingent on regular meetings of this group of people from across the university to generate a work plan and share accomplishments. “It’s kind of like having an academic coach. [Project Compass] also provided professional development to us twice a year,” Cid says.
Martin adds: “While in some respects it was difficult, one of the things that did happen was that lots of people from different parts of the university got to grapple with the raw data as it was coming forward and really participated in the analysis. Through that process, they got a chance to really identify what the group thought was important as well as what the researchers thought was important.”
Faculty involvement
Faculty involvement in this initiative is essential, and two things have motivated faculty to participate: the desire to better serve students and the potential to engage in activities that employ their skills (and could potentially produce publishable research).
Faculty can serve as experts in analyzing data. There are many talented people at any university who might think of this as a really nice research project. So you engage the faculty as problem solvers—psychologists, sociologists, and mathematicians—who are interested in helping students.
Funding and release time might encourage some faculty members to get involved. Others are motivated by their sense of obligation to their students.
“We know that people are leaving, but when we see how they leave our own majors, it asks us to really reconsider what we’re doing individually with relatively small groups of people that we have relationships with. I said to faculty, ‘This is not just about data. It’s not just about our admission process. It’s not just about student affairs and residence life making life OK for students. It’s about you. You have to have some ownership of not just the academic success but the persistence of students and their timely graduation.’ That, I think, is beginning to be a big cultural shift for people. And it’s not fully accepted by every faculty member. But I think that sense that we have to be responsible for this really came from the data,” Martin says.
Another way to engage faculty in the issue of student success has been through an online course for faculty. The goal is to get faculty to understand the issues involved with low-income, first-generation, and minority students. “We’re reading the research about this and having lots of online discussion about pedagogy and good practices,” Martin says.
Excerpted from Data-Driven Student Success, Academic Leader, 28.1 (2012): 4.