Background
Last semester, I experimented with allowing test corrections on every exam. I appreciated the opportunity for students to come back from mistakes. At the same time, I did not feel that I was really seeing students learn the material. During final exam week, I participated in an AIM workshop on "Interactive assessments in open source textbooks." Some of the discussion centered on how such resources could be leveraged in mastery-based teaching. Long story short, I decided to go all in.

My Mastery Strategy
What is my strategy? I give a 25-minute mastery quiz once a week. We're accumulating about three or four new mastery outcomes for each week of class. The quiz includes questions for those outcomes as well as new versions of questions for outcomes assessed in earlier weeks. I adopted a four-level scoring system.

- M = "mastered". The student provided a solution that both answered the question (essentially) correctly and conveyed a clearly communicated solution.
- R = "resubmit". The student provided a solution that was nearly correct or nearly conveyed a clear solution, but there were some items that were unclear. I tend to think the student probably has the idea mastered but needs to correct a mistake or clarify their communication.
- S = "shows progress". The student is making headway but has at least one significant issue such that I'd like them to attempt a new version later.
- L = "lacks evidence". Not enough work was shown to evaluate their understanding or the work shows a beginning development of understanding.
Once a student scores "M" on an objective, they do not need to repeat it. If they score "R", then they need to submit corrections within a week, and I decide whether the update convinces me they are "M" or "S". For scores of "S" or "L", they will need to try another version of the question in a later week to demonstrate mastery.
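The progression rules above amount to a small state machine. Here is a sketch of that logic in Python; the function names are my own illustration, not anything from Canvas.

```python
# Sketch of the M/R/S/L progression rules described above.
# Once "M" is earned, an outcome stays mastered; "R" is resolved by a
# resubmission regraded to "M" or "S"; "S" and "L" mean a fresh attempt later.

def update(current, new_rating):
    """Combine a student's standing on an outcome with a new quiz rating."""
    if current == "M":
        return "M"               # mastered outcomes are never repeated
    return new_rating

def needs_new_attempt(status):
    """True when the student must try another version on a later quiz."""
    return status in ("S", "L")  # "R" is handled by resubmission instead

print(update("S", "M"))          # a later attempt can still earn mastery
print(needs_new_attempt("R"))
```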
Setting Up Mastery Tracking on Canvas
Our university uses the Canvas learning management system. Having been referred to a video by Sharona Krinsky about how to track progress on learning outcomes in Canvas, I decided to give that a try. The rest of this post is about what I've learned so far in doing this.
First, it was a pain to set up the learning outcomes. I realized very quickly that I would not be doing it through the built-in interface, which requires far too many clicks. Each outcome needs several different descriptions as well as a grading point scheme matching my M/R/S/L system above. With several dozen planned learning outcomes, entering them one at a time through Canvas's interface was out of the question.
Canvas does allow you to import your outcomes from a CSV file. However, you can't export your existing work as a CSV file, edit it, and reimport the new version. Instead, you have to follow the documented directions to figure out the required CSV format.
I found the instructions about the columns for the mastery ratings and the points required for mastery to be incomplete. If mastery can be achieved with multiple ratings, I could not figure out how to identify the minimum score for mastery. The instructions say there is a column before the first rating for this, but I only got errors when I tried, so I must have been doing something else wrong.
One additional item that caught me was the vendor_guid field. The instructions make it seem that these can be any unique values in the .csv file, such as going through the alphabet. However, I learned that they perhaps need to be unique for the user, at least for the outcome groups. When I used the same vendor_guid entries of "a", "b", "c", etc., for groups, I got an import error saying that the field had been used previously. I eventually settled on creating a short id for each group and outcome and then generating the vendor_guid field as a concatenation of the course id with the group or outcome id.
Here is a portion of my import .csv file.
vendor_guid,object_type,title,description,display_name,calculation_method,calculation_int,workflow_state,parent_guids,ratings,,,,,,,,,
M234Sp20_D,group,G2: Derivatives,Derivatives and Differentiation,D,,,active,,,,,,,,,,,
M234Sp20_L,group,G3: Limits,Limits and Continuity,L,,,active,,,,,,,,,,,
M234Sp20_A,group,G4: Applications,Applications of Derivatives,A,,,active,,,,,,,,,,,
M234Sp20_T,group,G5: Trigonometry,Trigonometry,T,,,active,,,,,,,,,,,
M234Sp20_I,group,G6: Integration,Antiderivatives and Integration,I,,,active,,,,,,,,,,,
M234Sp20_D1,outcome,D1: Derivative from Graph,Find the derivative of a function defined by a graph by approximating the slope of a tangent line.,D1,n_mastery,1,active,M234Sp20_D,4,Mastery,3,Resubmit,2,Shows Progress,1,Lacks Evidence,0,No Work
M234Sp20_D2,outcome,D2: Numerical Derivative,Use the definition of the derivative to approximate the derivative of a function numerically.,D2,n_mastery,1,active,M234Sp20_D,4,Mastery,3,Resubmit,2,Shows Progress,1,Lacks Evidence,0,No Work
M234Sp20_D3a,outcome,D3a: Definition of Derivative: Polynomial,Apply the definition of the derivative to find the derivative of a quadratic polynomial.,D3a,n_mastery,1,active,M234Sp20_D,4,Mastery,3,Resubmit,2,Shows Progress,1,Lacks Evidence,0,No Work
M234Sp20_D3b,outcome,D3b: Definition of Derivative: Algebraic,Apply the definition of the derivative to find the derivative of a simple algebraic function.,D3b,n_mastery,1,active,M234Sp20_D,4,Mastery,3,Resubmit,2,Shows Progress,1,Lacks Evidence,0,No Work
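Rather than hand-editing rows like these, a short script can generate them. This is a sketch, not anything official: the column layout follows the header row of my import file, and the helper names and course prefix are my own conventions.

```python
import csv
import io

# Sketch of generating Canvas outcome-import rows from short ids.
# The course prefix keeps each vendor_guid unique for the user.
COURSE = "M234Sp20"
RATINGS = ["4", "Mastery", "3", "Resubmit", "2", "Shows Progress",
           "1", "Lacks Evidence", "0", "No Work"]
HEADER = ["vendor_guid", "object_type", "title", "description", "display_name",
          "calculation_method", "calculation_int", "workflow_state",
          "parent_guids", "ratings"] + [""] * (len(RATINGS) - 1)

def group_row(gid, title, description):
    # Group rows leave the calculation, parent, and rating columns blank
    return [f"{COURSE}_{gid}", "group", title, description, gid,
            "", "", "active", ""] + [""] * len(RATINGS)

def outcome_row(gid, oid, title, description):
    # Outcome rows carry the full M/R/S/L rating scale and point at their group
    return [f"{COURSE}_{oid}", "outcome", title, description, oid,
            "n_mastery", "1", "active", f"{COURSE}_{gid}"] + RATINGS

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(HEADER)
writer.writerow(group_row("D", "G2: Derivatives", "Derivatives and Differentiation"))
writer.writerow(outcome_row("D", "D1", "D1: Derivative from Graph",
                            "Find the derivative of a function defined by a graph "
                            "by approximating the slope of a tangent line."))
print(buf.getvalue())
```

Adding the rest of the groups and outcomes is then just more calls to the two helpers, and regenerating the whole file after an edit is painless.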
Scoring Assessments
Well, now I have outcomes coded in Canvas and have written an assessment quiz. After the students have completed their work, it is my turn to evaluate their solutions and record their mastery ratings.

I create an Assignment in Canvas for the week. I make the assessment worth 1 point so that I can use a score of 1 to mean the student did the assessment and 0 to mean they were absent and did not. I'm not actually using the points for any purpose other than as a simple flag.
Canvas has a SpeedGrader that would work really well if students uploaded their own work. Unfortunately, Canvas does not support instructors uploading student work and assigning it to the individual students. I wish it did, because that would make it so I could type comments and get feedback to students more quickly. So I am currently marking my feedback on the paper copies.
I do, however, still use SpeedGrader because I can turn on rubrics. I create a rubric for the assignment and import all of the outcomes that appear on the quiz. For later quizzes, I can reuse that same rubric; just be sure it comes from the same course so that you don't import another course's outcomes. Then for later weeks, I only need to add the new outcomes for that week.
I do wish there were keyboard shortcuts for entering data on the rubric. You have to tab too many times to make keyboard entry efficient. I prefer clicking on the ratings, but wish I could just type the scores associated with the ratings and tab to the next outcome.
Viewing Student Progress
My biggest gripe is the poor formatting for viewing student progress. The student view lets students see their progress by group, with outcomes sorted alphabetically.
The instructor view is not as nice. The Gradebook does have a Learning Mastery view. Unfortunately, you can only view 18 students at a time, and I have no idea what order Canvas uses to sort the outcomes. You can rearrange them on the screen, but your new order is not saved.
As I am including more and more outcomes as the weeks pass, I realized I needed a custom way to look at student progress. Canvas allows you to export the Learning Mastery report from the Gradebook as a .csv file. I didn't find the format to be much better than the original report.
Fortunately, once the data is free from Canvas, I can use other tools to parse the data and generate anything I want. I took some time to create a script in R that imports the data and then generates a custom progress report.
The CSV file has two columns per outcome and one row per student. There is no obvious order to how the students are sorted or how the outcomes are listed, so part of the script's task is to sort both into a meaningful order. I wrote the script to identify outcome groups and the outcomes within each group. The script also reports only the outcomes that have been assessed so far in the term, skipping future outcomes.
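My script is written in R, but the core idea is simple enough to sketch in a few lines of Python. The column names and the miniature export below are hypothetical stand-ins for the real Learning Mastery file, which has more columns per outcome.

```python
import csv
import io

# Labels in my M/R/S/L scheme, keyed by the point values from the import file.
LABELS = {"4": "Mastery Shown", "3": "Resubmit",
          "2": "Shows Progress", "1": "Lacks Evidence"}

def progress_report(export_csv):
    """Turn a (simplified) Learning Mastery export into per-student report
    lines, sorting outcomes by their short code instead of export order."""
    reader = csv.DictReader(io.StringIO(export_csv))
    lines = []
    for row in reader:
        lines.append(row["Student name"])
        for col in sorted(c for c in reader.fieldnames if c != "Student name"):
            code = col.split(":")[0]   # e.g. "D1" from "D1: Derivative from Graph"
            lines.append(f"  {code}: {LABELS.get(row[col], '')}")
    return lines

# Hypothetical two-outcome export row, with outcomes in scrambled order
sample = ("Student name,D2: Numerical Derivative,D1: Derivative from Graph\n"
          "Test Student,4,2\n")
print("\n".join(progress_report(sample)))
```

The real script additionally groups outcomes by their group prefix and drops any outcome column that has not yet been assessed.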
Here is sample output for my Test Student, who has been marked with mastery on only two outcomes.
Test Student
Cross-Over CO1:
Cross-Over CO2a:
Cross-Over CO2b: Mastery Shown
Cross-Over CO3: Mastery Shown
Cross-Over CO4:
Derivatives D1:
Derivatives D2:
Derivatives D3a:
Derivatives D3b:
The script itself is available from my GitHub repository as mastery-progress.R. Please feel free to use and adapt it in any way you like.