Panel Paper:
Studying the Integrated Planning and Advising for Student Success (iPASS) at Three Institutions
MDRC and the Community College Research Center are working with three institutions to evaluate enhancements to iPASS strategies using randomized controlled trials. This paper examines the following research questions:
- How did the colleges enhance their iPASS implementations?
- Did the iPASS enhancements produce a different experience for students compared with standard iPASS?
- Did the enhancements produce short-term gains in student outcomes compared with standard iPASS?
Eligibility criteria differed at each institution, but at a high level, continuing students at risk of not graduating were eligible for the study. Two cohorts of students were randomly assigned, totaling more than 8,000 students across the three institutions.
The enhancements evaluated in this study were designed to give students and advisors better, more personalized data about their academic progress, to help advisors intervene earlier, to facilitate advising sessions informed by rich data sources, and to help advisors and students make adjustments to reach students’ academic goals. The institutions used new technologies such as predictive analytics to identify at-risk students, early alerts derived from course progress, degree and career planning tools, and personalized communication campaigns. Advisors followed a new protocol for in-person advising sessions that drew on the data gathered from these technology tools. Students randomly assigned to the program group received the program for two semesters.
To estimate the program’s effects on students’ academic outcomes, the evaluation uses an individual-level random assignment design, comparing students in a program group, who had access to the enhancements, with students in a control group, who had access to the college’s standard advising services. Implementation research was conducted to measure implementation fidelity and the treatment contrast.
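For readers who want a concrete sense of how impacts are typically estimated under a design like this, the sketch below regresses a short-term outcome on a treatment indicator while controlling for randomization blocks. The outcome (credits earned), the variable names, the block structure (college by cohort), the simulated data, and the use of statsmodels are all illustrative assumptions, not the study’s actual specification.

```python
# Minimal, hypothetical sketch (not the study's actual model or data):
# estimate the impact of assignment to the enhanced iPASS group on a
# short-term outcome with OLS, controlling for randomization blocks.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated placeholder data; column names are illustrative assumptions.
rng = np.random.default_rng(0)
n = 8000
df = pd.DataFrame({
    "program": rng.integers(0, 2, n),   # 1 = assigned to enhanced iPASS
    "college": rng.integers(1, 4, n),   # three institutions
    "cohort": rng.integers(1, 3, n),    # two randomly assigned cohorts
})
# Fake outcome with a small built-in treatment effect for illustration.
df["credits_earned"] = 12 + 0.3 * df["program"] + rng.normal(0, 3, n)

# Impact estimate = coefficient on the treatment indicator, with
# college-by-cohort indicators reflecting the blocked randomization.
model = smf.ols("credits_earned ~ program + C(college):C(cohort)", data=df).fit()
print(f"estimated impact: {model.params['program']:.3f} "
      f"(s.e. {model.bse['program']:.3f})")
```

In an individually randomized design like this one, the unadjusted difference in mean outcomes between the program and control groups is also an unbiased impact estimate; the block indicators simply improve precision.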
The institutions made implementation progress but had mixed success in substantially changing the student experience; most enhancements led to incremental changes. Institutions experienced challenges using predictive analytics tools and had mixed results getting faculty to use the new technologies. As a group, however, the colleges’ experiences provide important lessons for the field, particularly as more institutions look to technology and student advising for solutions. At APPAM, we will present early impact findings on student outcomes, implementation fidelity, and treatment contrast, discuss the opportunities and challenges the colleges face moving forward, and draw lessons for the field.
Full Paper:
- iPASS_Interim_Report.pdf (3284.0KB)