8 Course (Re)Design Implications

Value of Peer-to-Peer Feedback

SAIL follows the classic curriculum improvement principles of being faculty-driven, educational developer-supported, and data-informed (Wolf, 2007). Most importantly, SAIL draws on the rich experiential and theoretical expertise of faculty within a postsecondary institution.

Evidence-based practice suggests that peer review is most effective when it follows an iterative and reflective process designed to improve teaching (Chism, 2007; Hyland et al., 2018; Keig, 2000). Formative peer feedback and opportunities for peer-to-peer learning were identified as the greatest strengths of SAIL by faculty participants in Pilots #1 and #2. SAIL’s developmental, faculty-led approach has been shown to foster trust, collaboration, and cross-disciplinary conversations, and to support a reflexive approach to learning (Hoessler et al., 2023).

I think it’s super important to team teach, collaborate with co-workers etc. This pilot program has been a very big learning opportunity for me to learn from peers and also to mentor some junior members.

– Faculty member, Pilot #1

Faculty members’ feedback highlighted the sensitive nature of teaching. One faculty participant commented that it was a “little nerve-wracking to have a peer assess your students’ work. It felt a bit like a performance evaluation” (Pilot #2). The past couple of years were noted as especially challenging due to the global pandemic. Faculty reflected on feeling vulnerable and being forced to take risks in their teaching as they rapidly pivoted to remote learning. This highlights the importance of building trust within the ILO Pods.

A community of practice approach must be implemented with care. Significant attention should be given to building an environment in which faculty feel safe to share, be vulnerable, and take risks with their colleagues. We believe that adequate time spent together in the ILO Pods is, therefore, critical to success.

Favourite part of the process was the collaborative rubric design, reflecting together, debriefing together, building a community of practice in an interdisciplinary team. Build in more opportunities to collaboratively assess assignments and debrief!

– Faculty member, Pilot #1


I did crave the opportunity to return to the student work and discuss it with the assessors. I think that would help me think through how I might improve the assignment (but this was not possible under the REB approval). 

– Faculty member, Pilot #2


To increase the time spent together while ensuring that time is well spent, we suggest that future iterations of SAIL consider running ILO Pods on a three-semester cycle: faculty plan and develop their course in the first semester, deliver and assess their course in the second semester, and review and revise their course in the third semester, all while maintaining the supportive environment of their ILO Pod.

Impact on Teaching Practice

During the Debrief, faculty noted several opportunities for improving their teaching practice, including revising course assignments to more intentionally address an ILO, mentoring junior faculty members within their department, using the rubric as a pedagogical tool to teach students about the skills they are learning, and modifying assignments to push students from strategic thinking to action.

I think this process was very helpful and insightful. As a co-op team, my recommendation is for us to review/revise curriculum so we can adjust and align better. Through developing the rubric and seeing what the institutional learning outcomes are, I see where there are ways to improve. I would like to take our course back to the drawing board and ask those tough questions about how it aligns with the rubric and overall ILO for Lifelong Learning.

– Faculty member, Pilot #1

Hutchings et al. (2013) argue that effective assessment requires processes that produce evidence that is “credible, suggestive, and applicable to decisions that need to be made.” This, in turn, requires considering in advance how the assessment results will be used, by whom, and for what purpose. Careful consideration is therefore given to the level of reporting (i.e., course, departmental, or institutional) and the primary use of results (i.e., formative or summative). These decisions take place within each ILO Pod and are based on consensus among the participating faculty members.

Assessing outside one’s own discipline was challenging but assisted by having assignment instructions. This relates to your point about inter-rater reliability; key terms in the rubric descriptions can be interpreted differently by discipline. I would endorse ongoing reflection on the breadth of the rubrics to ensure inter-disciplinarity.

– Faculty member, Pilot #2


The opportunity to think more deeply about your assignment was valuable. The intentional thinking time was so valuable. I appreciated the facilitated process. It makes you think about how much of a priority Social Responsibility is in the curriculum at the course level and how important your role is in helping students meet the program and institutional learning outcomes.

– Faculty member, Pilot #2

ILOs at TRU are still new to us. I did not but will in future have a more specific intention in my assignments to achieve certain foci. Then a SAIL assessment could focus on those foci only. 

– Faculty member, Pilot #2

SAIL provides faculty with the data and language to communicate information about student learning. It helps them name expectations for learning, communicate those expectations to students, determine the extent of student learning, and strategize how to close the gap between expectations and results.

I felt inspired by the students’ assignments that we had to review and I’m motivated to incorporate those ideas into my own class.

– Faculty member, Pilot #2


Using course assignments is valuable because it is organic, even if it only captures part of a course.

– Faculty member, Pilot #2

References

Chism, N. V. (2007). Peer review of teaching: A sourcebook (2nd ed.). Jossey-Bass.

Hoessler, C., Hoare, A., Austin, L., Dhiman, H., Gibson, S., Huscroft, C., McKay, L., McDonald, B., Mihalicz, L., Noakes, J., & Reid, R. (2023). Faculty in action: Researching a community of practice approach to institutional learning outcomes assessment. Journal of Formative Design in Learning, 7, 171-181. https://link.springer.com/article/10.1007/s41686-023-00084-6

Hutchings, P., Ewell, P., & Banta, T. (2013). AAHE principles of good practice: Aging nicely. American Association for Higher Education (AAHE).

Hyland, K. M., Dhaliwal, G., Goldberg, A. N., Chen, L. M., Land, K., & Wamsley, M. (2018). Peer review of teaching: Insights from a 10-year experience. Medical Science Educator, 28(4), 675-681.

Keig, L. (2000). Formative peer review of teaching: Attitudes of faculty at liberal arts colleges toward colleague assessment. Journal of Personnel Evaluation in Education, 14(1), 67-87.

Wolf, P. (2007). A model for facilitating curriculum development in higher education: A faculty-driven, data-informed, and educational developer-supported approach. In P. Wolf & J. Christensen Hughes (Eds.), Curriculum development in higher education: Faculty-driven processes and practices (pp. 15-20). New Directions for Teaching and Learning, 112.

License


Strategic Assessment of Institutional Learning Copyright © by Carolyn Hoessler and Alana Hoare is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
