Evidence of Effectiveness

Evidence of effective practice in educational development ought to be measured in different ways, and at different points in time, depending on the depth of faculty support provided and the intended outcomes of the program.

Below, I provide examples of my effectiveness as an educational developer, with the evidence aligned to the variety of support I have provided. I also highlight the different approaches to evaluation and the different forms of evidence I use to illustrate both my own effectiveness and that of the programs I support.

One-on-one consultations

After consulting with a faculty member on how best to update their teaching practice to incorporate technology, I received the following email (2015):

“Just wanted to let you know that I taught all 6 of my sections this week using my Surface. I absolutely love it. I was going to wait until after reading week so that I could get more familiar with it, but it was so easy to use that I decided to just go ahead this week.

Anyway, just wanted to send you that update and to tell you how much I have appreciated all of your encouragement and support with getting going with this. It’s SO MUCH BETTER than the overhead transparencies.”

I selected this excerpt because it helps illustrate some of the claims in my philosophy of educational development, specifically where I state that my work has impact when it is relevant to the learner. In this case, I believe it is easy to see how the initial consultation led to a change in practice and, in turn, to increased instructor satisfaction.


“One-off” workshops: Flipping the Engineering Classroom (2017).

In addition to so-called “happy sheets” (Spowart et al., 2017) asking for participant satisfaction at the end of a single workshop session, testimonials provided by faculty members offer a snapshot of effectiveness. The following was sent to my colleague and me after we facilitated a three-hour workshop:

“Dear Gavan and Andrea,

I would like to thank you again for the excellent session on flipped classroom offered to Engineering Faculty yesterday.

There were clear learning outcomes and great ideas, which will serve as a solid foundation for those faculty members willing to try this pedagogy.”

Multi-day workshops: Preparing for Curriculum Review (2017).

[Figure: Participants’ ratings of the workshop’s overall effectiveness]

The graph above shows faculty members’ self-rated reports of effectiveness for a two-day workshop designed to orient program chairs to Western’s Institutional Quality Assurance Process. Of the participants who responded to the invitation to complete the survey (n = 17), 100% rated the overall effectiveness of the workshop as either “very effective” or “extremely effective”.

My role in Preparing for Curriculum Review was, in part, to coordinate the team of co-facilitators, to manage the logistics of the two-day event, and to facilitate or co-facilitate sessions across the two days.

I see this as a measure of my own effectiveness as an educational developer because the workshop’s collaborative facilitation was the result of the educational development team’s efforts to develop and deliver meaningful sessions.

Multi-week workshops: Instructional Skills Workshop, Online (2016, 2017).

The Instructional Skills Workshop Online (ISW-O) is a six-week online course focusing on designing and facilitating effective eLearning activities. An intended outcome of the ISW-O is an increase in instructors’ confidence in teaching online, as self-rated confidence influences new instructors’ willingness to incorporate new teaching strategies into their practice (Sadler, 2013).

I have co-facilitated the annual offering of the ISW-O since 2015. At the conclusion of the workshop, we distribute an online survey to participants asking for feedback on a variety of topics related to the outcomes of the workshop and our approach as facilitators.

One such measure is self-rated confidence, collected using a then/post evaluation design and posed as two questions: “Please rate your confidence in facilitating online instruction before the workshop” and “Please rate your confidence in facilitating online instruction after the workshop” (each answered on a 4-point Likert scale ranging from “not at all confident” to “very confident”).

As a result of our data collection, we can report that participants in 2016 and 2017 were 3 to 5 times more likely to report feeling moderately to very confident with online instruction after their participation in the ISW-O than before it.
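As an illustration of how a then/post ratio of this kind can be computed, the sketch below compares the proportion of respondents rating themselves at or above “moderately confident” before and after the workshop. The ratings are hypothetical, invented purely for the example; they are not our survey data.

    # Minimal sketch of a then/post confidence ratio calculation.
    # The ratings below are hypothetical, for illustration only; they
    # are not the actual ISW-O survey data.

    # 4-point Likert scale: 1 = "not at all confident" ... 4 = "very confident"
    before = [1, 2, 1, 3, 2, 1, 2, 2, 1, 3]  # confidence before the workshop
    after = [3, 4, 3, 4, 3, 3, 4, 3, 2, 4]   # confidence after the workshop

    def share_confident(ratings, threshold=3):
        """Proportion of respondents rating at or above the threshold
        (i.e., moderately to very confident)."""
        return sum(r >= threshold for r in ratings) / len(ratings)

    p_before = share_confident(before)
    p_after = share_confident(after)

    # The ratio of the two proportions yields the "X times more likely" figure.
    print(f"Before: {p_before:.0%}, after: {p_after:.0%}, "
          f"ratio: {p_after / p_before:.1f}x")

With these invented counts, 20% of respondents report moderate-to-high confidence before the workshop and 90% after, a ratio of 4.5, which is how a “3 to 5 times more likely” figure of this kind is derived.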

Program evaluation

Western Active Learning Spaces Faculty Support Model Evaluation

I engaged in a scholarly, qualitative research project to evaluate the effectiveness of a faculty support model developed to introduce and orient first-time instructors to our active learning classrooms (ALCs). The broad premise investigated was that the instructor support model helps orient and support instructors teaching in these ALCs, which, in turn, supports student learning. Using semi-structured interviews and a grounded theory approach, I collected data from 11 instructor participants and reported the results.

Results of the project were shared at the Society for Teaching and Learning in Higher Education Conference in 2017.

Program evaluation of this nature (scholarly, peer-reviewed) helps illustrate my effectiveness as an educational developer not because of whether the findings had implications for practice, but because it shows that I take, when appropriate, a scholarly and evidence-based approach to evaluating the impact of the work I engage in.


References

Sadler, I. (2013). The role of self-confidence in learning to teach in higher education. Innovations in Education and Teaching International, 50(2), 157–166. http://doi.org/10.1080/14703297.2012.760777

Spowart, L., Winter, J., Turner, R., Muneer, R., McKenna, C., & Kneale, P. (2017). Evidencing the impact of teaching-related CPD: beyond the “Happy Sheets.” International Journal for Academic Development, 22(4), 360–372. http://doi.org/10.1080/1360144X.2017.1340294