My "stealth-flipped" Intro to Ancient Rome class sat for their first midterm a few weeks ago. Honestly, I was dreading this exam and its aftermath. In the fall, everything was going extremely well until the first midterm. Afterward, students suddenly slacked off, started to complain about having to attend class, and ranted on Facebook about the flipped class model. In a matter of weeks, it went from a class I looked forward to teaching to one I dreaded and resented. I spent several months talking to learning specialists, reading evaluations, and trying to figure out what went wrong and, more importantly, how to prevent the same thing from happening again. One clear answer (besides weekly quizzes and less heavily weighted midterms): a much more challenging first midterm. To this end, I added a fair chunk of course content (on archaeology) and expanded the chronological range of the exam. I made sure the questions were challenging. My aim was to have the best students score in the low-to-mid 90s, with the rest of the class scattered across the Bs and Cs. If a student studied, they wouldn't fail the exam, but I expected a sizable cluster of Bs and Cs. I wanted the exam to be a wake-up call of sorts, a reminder that they had to stay engaged and keep working hard if they hoped to earn a high grade.
I was stunned when the results came back. Despite my most concerted efforts--and a substantially more difficult first exam--this spring class outscored the two previous cohorts by a wide margin. Some rough data for comparison:
Fall 2011 (c. 220 students, traditional lecture): Avg. 80; Median 85
Fall 2012 (c. 400 students, flipped): Avg. 81.87; Median 87
Spring 2013 (c. 400 students, stealth flipped): Avg. 86.92; Median 92
In Fall 2012: A (159); B (109); C (56); D (24); F (33) [numbers approximate]
In Spring 2013: A (215); B (93); C (34); D (17); F (22) [numbers approximate]
I am a tough grader, and when I want to write a tough exam, it's a tough exam. Comparing Fall 2011 and Fall 2012, the flipped class scored only a few points higher than the traditional lecture course--but the Fall 2012 exam was substantially more challenging. The grades can't really be compared directly because, thanks to the flipped method, I was able to write a more difficult exam that tested deeper learning. I upped the ante once again in the spring; yet, somehow, this spring cohort aced the exam and significantly outpaced the fall cohort. In part, this may reflect the fact that it is the second semester for freshmen and they are better able to manage the challenges of college--but freshmen make up only about 25-30% of my students. So why such a dramatic difference--and off-the-charts grades on a very difficult exam?
First of all, weekly quizzes. The spring students had taken three weekly quizzes leading up to the midterm, covering all but the final week of course material. In addition, they had access to practice quizzes for all the content that would be covered on the exam. On Blackboard, we can see who tries the quizzes (I set it so that we don't see how they perform). Over 90% of the class tried all available practice quizzes. That is, they had a lot of practice answering multiple-choice and "mark all correct" questions.
In addition to multiple-choice and "mark all correct" questions, the midterm had 8 short-answer questions. In the fall, these short-answer questions tripped up a number of students, primarily because they did not answer them completely, directly, or with enough detail. We received a number of complaints, and my "grade czar" had to deal with many dissatisfied students who did not understand why they had lost partial credit for vague and incomplete answers. This time, I made a short Echo video reviewing my expectations for the short-answer questions and giving sample questions and answers, with explanations. My head TA also did a lot of work practicing the short-answer questions during his weekly Supplementary Instruction sessions. Finally, we made available to the students a very long list of potential short-answer questions. None of this was unavailable last semester--but this time, I said explicitly that all questions would be drawn from the (exhaustive) list, albeit perhaps in recombined form. Once they knew that work on this document would be rewarded, they focused their attention on answering the long list of questions rather than speculating. In Fall 2012, I told them that the questions would be drawn from in-class questions and review questions embedded in the lectures; but, for some reason, this was never enough to motivate them to assemble a list and learn the material. By assembling the list ourselves, distributing it to the class, and encouraging them to create a Google doc and pool knowledge, we got them to focus their energy on studying rather than fretting.
In the days leading up to the exam, I also observed some notable changes in learning behavior. First, the class discussion board--Piazza--was very quiet. In the past, this was a place where students would pose questions whose answers could easily be found with a bit of effort. That behavior had stopped (perhaps in part because I had warned against it in the syllabus). From what I gather, Facebook was also fairly quiet. Instead of the usual pre-midterm anxiety venting that was a hallmark of the fall class, this group seemed to be studying. Their anxiety levels were noticeably lower (the teaching team was not inundated with emails), and they seemed to have a clear grasp of what to expect. The efforts I made to convey expectations as transparently as possible, along with the practice they had with the quizzes, seemed to pay off.
Typically, in the 48 hours before an exam, there is a big spike in Echo views of recorded lectures. We waited and waited for the spike, but it never came. Instead, what we did see was students going back into the recorded class sessions to check details or refresh their memories. I was delighted by this behavior--it was exactly what I wanted. In the fall, I pleaded with students to go back to the recorded class sessions, to recognize that exam questions were coming from class, and to prepare by reviewing the recordings. Few of them ever did. Without a word from me, this spring cohort is using the recorded class sessions exactly as they were intended (and making it worth the expense and effort of recording class meetings).
I am learning a lot from my current cohort of students. Two things stand out: the value of frequent, low-stakes assessments for supporting learning (and, more generally, a positive experience) in a large-enrollment class, and the importance of lecture--not so much as a means of knowledge transfer as a tool for orienting students, letting them get to know me, feel a connection to me, and experience my enthusiasm for ancient Rome.
We are now heading swiftly toward the second midterm. This will be a tough exam--it covers the part of the course that led me to flip the class in the first place and inspired me to toughen up the first section. Thus far, quiz grades have not dropped off at all, and the students themselves have observed that the material is getting more difficult and requiring more work. In the past, I tried to warn classes about the increased difficulty, to no avail--midterm grades always dropped by 10 points. I have a feeling that this cohort will finally break that pattern, and I will not have said a word to them about the increased difficulty. They have figured it out themselves and adjusted their effort accordingly.