Curriculum Development

Synchronous, collaborative curriculum mapping with Google Sheets

Prideaux’s (2003) description of three levels (the planned, the delivered, and the experienced) for examining curriculum is helpful for thinking about what kinds of questions get asked about curricula. With a focus on the delivered curriculum, faculty members engage in curriculum mapping to see, over the course of a program, what gets taught, when it gets taught and how it gets assessed. The mapping described below occupies Prideaux’s “delivered curriculum” level, with faculty members’ experiences in their individual classrooms as a key component of the analysis.

Two key assumptions here: the data is meant to be examined in aggregate, that is, the unit of assessment is the program, not individual courses ((This is largely an act of “bracketing”, as individual courses can be seen in the data. The intent is not to use the data to say “Professor X is not doing Y in class Z”.)); and the data does not drive decisions; rather, it drives discussions amongst faculty members which, in turn, drive decision making.

A challenge with curriculum mapping can be the logistics of it: done by hand and following a collaborative model, there are sticky notes to transport and transcribe, not to mention the additional challenge of getting faculty members together in one room at one time.

At Western University we’re in the midst of developing a web-based curriculum visualization tool that will create a series of curriculum visualizations. In the meantime, we felt we could improve our analog process by taking a small step into the digital world. Enter: Google Sheets.

Before I go into any sort of detail on what we’ve done, I’m just going to go ahead and share an example map with dummy data entered ((It should be noted that the template is licensed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License)).

The data collected produces two visualizations. The first visualizes the progression of learning through the program (what we sometimes call an IRM chart, where I stands for Introduce, R stands for Reinforce and M stands for Master). The approach of asking instructors to weigh the complexity of an outcome was inspired, in part, by Veltri, Webb, Matveev & Zapatero (2011). The second asks instructors whether an outcome is taught and / or assessed in their course (what we’ll often call a T/A chart).

Sandwiched between these two visualizations is an opportunity for faculty to enter the assessment methods and instructional methods used to assess and teach the particular program-level learning outcome in their course.

What’s elegant about this stop-gap is the fact that as instructors enter data, they create the visualization. There is no additional transcription or translation — my colleague, Dr. Beth Hundey, and I set up the Google Sheet to automatically update the colour of the cell, for example, in the IRM chart. At a glance, there’s the opportunity to see how a particular program-level learning outcome progresses through a program curriculum. The data can become “useful” the moment that faculty are done entering data.
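To make the colour logic concrete, here’s a minimal Python sketch of the idea behind the IRM chart’s conditional formatting. This is illustrative only, not the sheet itself: the course names, outcomes and colour palette below are hypothetical dummy data.

```python
# Hypothetical palette: each IRM entry maps to a fill colour,
# mirroring the conditional formatting rules in the sheet.
IRM_COLOURS = {"I": "yellow", "R": "orange", "M": "red"}

# Rows are program-level outcomes; columns are courses in program order.
irm_chart = {
    "Outcome 1": {"COURSE 1000": "I", "COURSE 2000": "R", "COURSE 3000": "M"},
    "Outcome 2": {"COURSE 1000": "I", "COURSE 2000": "I", "COURSE 3000": "R"},
}

def colour_of(entry):
    """Return the fill colour for a cell; blank cells stay uncoloured."""
    return IRM_COLOURS.get(entry, "white")

def progression(outcome):
    """The left-to-right sequence of I/R/M entries for one outcome."""
    return [entry for entry in irm_chart[outcome].values() if entry]

print(progression("Outcome 1"))  # ['I', 'R', 'M']
```

Reading a row left to right is exactly the “at a glance” view described above: an outcome that runs yellow, then orange, then red shows a plausible Introduce–Reinforce–Master progression through the program.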

Less elegant is the interpretation of the assessment and instruction data. We provide a list of methods linked to numbers, asking instructors to enter the list numbers that match the methods used in the course. There isn’t, however, an easy way to visualize the data entered; it requires the extra step of downloading and manipulating the data in a program like Excel. It should be noted that Google has added an “Explore” option that interprets the data entered and creates automatic visualizations. My cursory look at the graphs it creates doesn’t make me want to suggest it as a viable option for creating useful visualizations.
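That extra manipulation step can be done in a few lines outside the sheet. Here is a minimal sketch, assuming a hypothetical CSV export in which each course column holds the comma-separated method numbers an instructor entered; the column names and values are made up for illustration.

```python
import csv
from collections import Counter
from io import StringIO

# Stand-in for a downloaded export of the assessment-methods section.
export = StringIO(
    "Outcome,COURSE 1000,COURSE 2000\n"
    'Outcome 1,"1, 4","4, 7"\n'
    'Outcome 2,"2","1, 4"\n'
)

counts = Counter()
for row in csv.DictReader(export):
    for column, cell in row.items():
        if column == "Outcome" or not cell:
            continue
        # Split "1, 4" into individual method numbers and tally them.
        counts.update(n.strip() for n in cell.split(","))

# How often each method number appears across the program:
print(counts.most_common())
```

A tally like this answers the program-level question (“which methods dominate?”) without pointing at any single course, which keeps the analysis consistent with the bracketing assumption above.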

Regardless, as we work with programs undergoing curriculum review, our collaborative sheet allows for the quick collection and interpretation of data. There’s certainly some work required to set the sheet up, as well as to introduce the task to individual instructors. A curriculum visualization process set up in Google Sheets can work in very specific situations to simplify the task of collecting curriculum data for both faculty members and curriculum developers.


Prideaux, D. (2003). ABC of learning and teaching in medicine: Curriculum design. British Medical Journal, 326(7383), 268–270.

Veltri, N. F., Webb, H. W., Matveev, A. G., & Zapatero, E. G. (2011). Curriculum mapping as a tool for continuous improvement of IS curriculum. Journal of Information Systems Education, 22(1), 31.


Considerations for Podcasting as a Higher Education Assignment

This morning, I’m doing a quick scoping of the teaching and learning resources related to using Podcasts as an assessment tool in the Higher Education classroom. The intended outcome of this environmental scan is to see what the evidence suggests as best practice for designing and facilitating Podcasting assignments. My sources are varied, from Blog posts to peer-reviewed journals (see the bottom of the post for relevant links to the literature).

Initial reactions

Generally speaking, the literature describes students as reacting positively to Podcasting as an assignment type in their course.

Curiously, much of the peer-reviewed literature around Podcasts seems to “peak” at the end of the oughts. Google’s trend data using the search term “Podcast” appears to support this: an explosion of searches for Podcasts, which reaches its relative peak in 2006 ((Curiously, there’s another peak in December 2014 which Google attributes to the Serial Podcast)). If this is the height of the Podcast hype, then it’s not surprising to see papers start to appear in the closing years of the 2000s reporting on the use of Podcasts in the higher ed classroom. But Podcasts, as an assessment type, seem to have moved along the educational technology “hype cycle”.

Rather than work that describes the use of Podcasts as a kind of assessment, I’ve noticed more research on the use of Podcasts as:

  • a kind of instructional technology (e.g. recording lectures and making them available as Podcasts) and
  • a way to provide student feedback.

Can’t help but think there’s an opportunity here for some kind of introspective and retrospective look at eLearning, using Podcasting as a case study.

Design & facilitation considerations

So, without further ado, here’s what people have said about creating Podcast assignments:


  • It will take students more time to produce their Podcasts than you initially imagine.
    • Limiting the length of the Podcast can limit the scope of production and, subsequently, time.
  • Consider whether it is a group assignment or an individual assignment: if creating Podcasts for the first time, students want to be able to troubleshoot tech issues with peers rather than feeling it’s up to them alone to solve their problems.
  • To help tackle the scope of the project, and help with the technical side of things, consider scaffolding the Podcasting assignment: break the Podcast production down into discrete steps and have students submit these along the way, in addition to the final version of the Podcast.
  • Is the Podcast a means to an end or an end in itself: are you assessing the quality of production or the quality of ideas?
    • Consider providing explicit direction on the amount of time students should spend on post-production.
    • Reflect this in the assignment rubric.

Podcasting as a skill

  • Don’t assume that the “digital natives” in your class know what tools to use to create Podcasts, or how to use those tools: Podcasting is a skill and they need to be taught that skill.
    • Having exemplars of other students’ Podcasts can help students grasp the expectations and scope of the assignment; or, create an example yourself.
    • One suggestion is to consider creating a Podcast as a live demo in-class: it can set students at ease and can demystify the production process.
  • Make use of your campus’ technology resources when introducing the assignment; having the appropriate campus support introduce the tools and how to use them is a great first step but…
    • Be prepared to devote class time to addressing on-going technical issues.
    • Don’t assume that there are sufficient campus resources to offer individualized support for each group or individual in your class.
  • Keep the tools inexpensive and simple: GarageBand (OS X & iOS, $5) or Audacity (Windows, Linux, OS X, free) should suffice for production.


Other considerations

  • Who is the audience for the Podcast? Have a clear notion of who the intended audience is and be able to communicate that to students.

Podcast Resources

As an assessment tool

Podcasts as an assessment tool in Higher Ed (Blog Post, 2013)

Student Thoughts about Podcasting Assignments (Blog Post, 2012)

Four Mistakes I Made when Assigning Podcasts (Blog Post, 2012)

Can Creating Podcasts be a Useful Assignment in a Large Undergraduate Chemistry Class? (Conference Proceeding, 2010)

Podcasting (Blog Post, 2010)

As a feedback tool

Reflections on using podcasting for student feedback (Article, 2007)

It was just like a personal tutorial: Using podcasts to provide assessment feedback as an instructional tool (Conference paper, 2008)

As an instructional tool

Podcasts and Mobile Assessment Enhance Student Learning Experience and Academic Performance (Article, 2010)

The value of using short-format podcasts to enhance learning and teaching (Article, 2009)

The effectiveness of educational podcasts for teaching music and visual arts in higher education (Article, 2012)