When Dr. Tammy Russell and Barb Murk from their institution’s Office of Institutional Planning, Assessment, and Research realized their course-level assessment model wasn’t generating usable data, it became clear that change was necessary. In a recent SPOL webinar, they shared their journey transitioning from a fragmented, inherited system to a streamlined, purpose-driven program-level assessment model and the impact it’s had across their campus.
Before the shift, faculty engagement in assessment felt like “jumping through hoops.” The process was unclear, time-consuming, and ultimately failed to yield meaningful data. A pending accreditation visit forced the team to take a hard look at the data they had and the data they lacked, and that review exposed a clear gap in outcomes data at the ground level. With support from academic leadership and key assessment champions, the team knew it was time to move forward with a different approach.
The accreditation team’s feedback was the catalyst. Their observations not only validated internal concerns but also gave momentum to a procedural shift. Barb Murk, a newly hired Institutional Research Assistant with nearly two decades of institutional knowledge, helped reignite the office’s assessment capacity. Through fresh training and focused re-engagement with SPOL, the groundwork was laid for a more sustainable, useful model.
Making the Shift to Program-Level Assessment
At the core of the transition was a renewed commitment to meaningful, program-level learning outcomes (PLOs). Rather than capturing data from every course, the team mapped PLOs to three or four key courses per program, targeting summative assessments already in place. For example, an existing speech course was selected to assess the institution’s communication learning outcome, minimizing redundancy and leveraging what was already working.
One of the keys to success was demystifying assessment for faculty. A fall in-service session was held not only to explain the differences between course- and program-level assessment, but also to walk departments through revising their learning outcomes and mapping them to appropriate courses. Individualized support was provided to every department, ensuring the rollout felt collaborative rather than prescriptive.
To reduce the burden on faculty, a simplified process was adopted. Faculty submit assessment information via a form, with no system logins required. A single staff member handles the data entry in SPOL, preserving consistency and easing the transition. Automated reminders and SPOL templates keep the process organized and efficient.
Outcomes and Takeaways
Since the transition, the institution has reported clearer alignment between student performance, program goals, and institutional mission. Redundancy in data collection has been reduced, and assessment is now seen less as compliance and more as a tool for improvement. Perhaps most importantly, faculty and administrators alike feel a greater sense of clarity and shared purpose.
The presenters closed with three takeaways for other institutions:

- Trust your instincts. If the data isn’t useful, it’s okay to reimagine the system.
- Challenge the status quo. Even longstanding systems can and should be improved.
- Use your tools wisely. SPOL isn’t just software; it’s a partner in your assessment strategy.

As institutions continue to evolve their assessment practices, this case study stands as a testament to the power of reflection, collaboration, and the right tools at the right time.