Happy 2017! I escaped the snow and ice of the Atlanta Joint Mathematics Meetings for the wind and cold (and snow) of northwest Iowa. While at JMM, I had a nice discussion with a few Gold ’14 dots about their implementations of MBT, and as a result, I decided to write a post about how I manage the logistical side of things in my MBT (and often specifications graded) courses, and share some of the resources I’ve developed.
My goal in managing the logistics has always been to make things as simple as possible for me. That might sound selfish, but it’s really more about self-preservation. Your mileage may vary when implementing any of the ideas/tools below, and that’s okay! You need to find the system that works for you and the students in your classes at your institution. Here are some general principles and ideas that have worked for me.
Do not create custom versions of exams and quizzes
I learned this one the hard way. The first semester I implemented MBT in a calculus course, I started offering students the opportunity to take mastery quizzes approximately halfway through the semester. Quizzes were administered one day/week, and students were to sign up for up to two mastery objectives by the night before, so I had time to print off the correct number of copies of each objective. Thus, if only two people wanted to do objective 3, I only made two copies of a page with an objective 3 problem, and space for a solution. If no one wanted to do objective 1, I would make no copies.
There are lots of issues with this approach to logistics, but let me highlight the two that haunt me to this day:
- Sorting and copying custom quizzes is overwhelming. My system, at least, was to print the (hopefully!) correct number of copies of each problem, write the students’ names on them, and sort by student before going to class. This ensured that everyone received exactly the problems they had signed up for, without spending 10 minutes of class time sorting pages. But it took me at least 20-30 minutes every week, and was a real pain to manage.
- Not all students remembered to sign up. Now, this was certainly the students’ fault, not mine, but it did prevent the students who forgot to sign up from showing whether or not they had learned any calculus since the last mastery opportunity. If I had extra copies of problems relevant to the students who didn’t sign up, I would sometimes offer them, but what if there were multiple students who had forgotten? Who gets the benefit despite not following the instructions?
Instead, make your exams/quizzes just a list of problems tied to the mastery objectives (see an example from Austin Mohr). Include all the problems students have available for mastery, and let them choose the one(s) they need to complete. Leave no space for solutions; instead, hand out blank paper and ask that students complete their work on it (making sure to label the problems and turn them in in order).
Spreadsheets are your friend
I spoke at JMM on implementing specifications grading in abstract algebra, and one of the questions I got (which I’ve gotten in previous MBT/specs grading talks) is how well I think the system would scale to a course with N students in it, where N >> 10. One of the main concerns seems to be tracking student performance on revisions/repeated attempts at the same problem.
My answer is always that it scales really well. To assist in the task of tracking student performance on repeated mastery opportunities, I’ve created an Excel spreadsheet. The instructions are in the file, but the idea is that each assessment gets its own worksheet in which you enter numbers corresponding to the students’ performance. Suppose that cell B8 corresponds to Student A’s performance on objective 1; this will be the case in every worksheet corresponding to an assessment. There is then a master worksheet that takes the maximum value of B8 over all the worksheets corresponding to the assessments. The master sheet then counts the number of times a student achieves a passing designation over all the objectives (and also counts the number of ‘high passes’, if your system has such a thing), which you can then plug into an appropriate formula to determine the student’s final grade.
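If you’d rather see the master-sheet logic outside of Excel, here is a minimal sketch of the same idea in Python. The numeric score coding (0 = no attempt, 1 = progressing, 2 = pass, 3 = high pass) and the function name `summarize` are hypothetical, just for illustration; substitute whatever scale your own spreadsheet uses.

```python
# Hypothetical score coding (an assumption, not from the spreadsheet itself):
# 0 = no attempt, 1 = progressing, 2 = pass, 3 = high pass.

def summarize(assessments, pass_score=2, high_pass_score=3):
    """Each item in `assessments` maps objective -> score for one student
    (one dict per quiz/exam, like one worksheet per assessment).
    Returns (passes, high_passes), judging each objective by its best
    score across all assessments -- the MAX-over-worksheets idea."""
    objectives = set()
    for a in assessments:
        objectives.update(a)

    passes = high_passes = 0
    for obj in objectives:
        best = max(a.get(obj, 0) for a in assessments)  # best attempt ever
        if best >= pass_score:
            passes += 1
        if best >= high_pass_score:
            high_passes += 1
    return passes, high_passes

# Student A's scores on three assessments:
quiz1 = {"obj1": 1, "obj2": 2}
quiz2 = {"obj1": 2, "obj3": 3}
exam1 = {"obj2": 3}
print(summarize([quiz1, quiz2, exam1]))  # (3, 2)
```

The key design point, in either medium, is that only a student’s best attempt on each objective counts, so repeated attempts can never hurt them.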
Put a price on out-of-class revision opportunities
Along the lines of the first suggestion, make things easier on yourself by limiting out-of-class revision opportunities. In fact, through Fall 2016, I have not allowed students any out-of-class revision opportunities. In Spring 2017, I will be allowing some in-office revision opportunities, but with a price. As a devotee of specifications grading, my students receive a limited number of tokens each semester (they usually start with 5 or 7, and can sometimes earn more by doing additional optional work). My Spring 2017 calculus students will receive 7 tokens, and can cash in two tokens for one additional mastery opportunity, taken orally in my office. The goal is to limit the number of such instances (I have 50 students) and to encourage the students to prepare well before coming to my office. I’ll let you know how it goes!
Anyway, these are a few of the things I’ve learned about streamlining and managing the logistical side of running an MBT course. What have you come up with?