Friday, May 24, 2013

The Early Returns on Flipping a Large Enrollment Humanities Course

Back in Fall 2012, I embarked on an instructional experiment. I wanted to see whether it was possible to run a large enrollment, core requirement lecture course as a flipped class, with the same sorts of results (e.g., increased learning, higher order thinking, more student responsibility and self-direction, higher grades) that seem to accompany flipping smaller (15-30 student) classes. To be completely honest, I had no idea what I was doing. I was working from instincts honed over a decade of teaching every possible kind of course at UT Austin, and with a lot of help from our IT department and our Center for Teaching and Learning.

It's been an interesting year, to say the least. I've chronicled many of the ups and downs, and the lessons learned along the way, in this blog. In the coming weeks, I'll be adding substantially to the blog for those who are interested in trying something similar. Especially at this moment, it's crucial for faculty to be willing to step outside their comfort zones and experiment with new ways of motivating learning in their students, even in the learning-hostile environment of the large enrollment "lecture" class. What worked for me won't work for everyone, but I think it is important to make the point that institutional money invested in the development of more effective "on the ground" instruction will go a long way toward addressing some of the loudest detractors of higher education. Unlike MOOCs, we can actually demonstrate improved learning and better learning experiences for our enrolled students.

Figuring out how to teach large enrollment classes in a student-focused way is not easy; it's a horrendous amount of work, trial and error. A lot of error. And work. That said, when you get it right (or pretty close to right), it's incredibly rewarding to see the students engaging with the content at a much deeper level, and to see their grades reflecting this more substantial grasp of the course material.

Some rough data. What I am posting is final course grades from Fall 2011 (traditional lecture course; 220 students), Fall 2012 (attempted flipped class that didn't really flip; 387 students), and Spring 2013 (redesigned and successful flipped class; 387 students). This is preliminary data, but nothing major will change.

A couple of interesting things: In Spring 2013, 41% earned As and 10% earned A-, for a total of 51% of the class in the A range. In Fall 2012, 25% earned As and 15% earned A-, for a total of 40%. In Fall 2011, 34.4% earned As and 12.6% earned A-, for a total of 47%. The Fall 2012 course was more difficult than the Fall 2011 course, which partially accounts for the drop in As. The Fall 2012 group was also, on the whole, resistant to "flipping," and about 2/3 of them used the familiar "cram for the exam" approach. Notably, the Spring 2013 course was substantially more challenging than the Fall 2012 course, yet this cohort had by far the highest grades, especially in the A category.

In Spring 2013, 76.6% of the class earned a B- or higher, and 87.3% earned a C- or higher (1.3% earned credit). In Fall 2012, 79.3% earned a B- or higher, and 90.2% earned a C- or higher. In Fall 2011, 80.6% earned a B- or higher, and 94.1% earned a C- or higher (1% earned credit). The differences between Fall 2011 and the subsequent semesters are probably due to increased difficulty, and to the fact that the Fall 2012 grading scheme made it difficult to earn an A or high B but relatively easy to earn a B or lower. That was much less true of the Spring 2013 version, which valued consistency of performance over a wide range of activities. Still, it is worth noting that the doubled class size had very little effect on the overall grades of the students, though it did seem to increase substantially both the number of people who failed the class (largely because they did not do any of the assigned work) and the number who failed to complete the class via W or Q drop.
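For readers who want the three semesters side by side, here is a minimal sketch that collects the grade-band figures reported above into one comparable structure. The numbers are simply the rounded percentages from this post; the small "credit" categories are omitted.

```python
# Grade-band percentages as reported in this post (rounded; the small
# "credit" categories are omitted).
reported = {
    "Fall 2011":   {"A": 34.4, "A-": 12.6, "B- or higher": 80.6, "C- or higher": 94.1},
    "Fall 2012":   {"A": 25.0, "A-": 15.0, "B- or higher": 79.3, "C- or higher": 90.2},
    "Spring 2013": {"A": 41.0, "A-": 10.0, "B- or higher": 76.6, "C- or higher": 87.3},
}

for term, bands in reported.items():
    a_range = bands["A"] + bands["A-"]  # total of the A range
    print(f"{term}: A range {a_range:.0f}%, "
          f"B- or higher {bands['B- or higher']}%, "
          f"C- or higher {bands['C- or higher']}%")
```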

In the Spring 2013 version, 23/371 students failed the class, compared to 17/375 in Fall 2012 and 10/215 in Fall 2011. In addition, 16 students did not finish the class in Spring 2013; that number was 12 in Fall 2012 and 5 (of 220 enrolled) in Fall 2011. Overall, these are relatively small numbers and seem to remain stable as a percentage of total enrollment. I was disappointed that they did not decrease in the spring semester, when students were given substantially more feedback on their performance in the class. Despite knowing that they had little chance of passing, this cohort of students made few adjustments and often waited until the very end of the semester to drop the class, having done almost nothing all semester.
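As a quick sanity check on the "stable as a percentage" point, here is a minimal sketch that turns the counts quoted above into per-semester rates. The denominators for the failure rates are the ones given in this paragraph; for the did-not-finish rates I am assuming the enrollment headcounts cited earlier in the post (220, 387, and 387), which is an assumption on my part.

```python
# Failure and non-completion counts as quoted in this post. The "graded"
# denominators are given above; the "enrolled" figures for Fall 2012 and
# Spring 2013 are the headcounts cited earlier in the post (assumed here
# as the denominator for the did-not-finish rate).
cohorts = {
    "Fall 2011":   {"enrolled": 220, "graded": 215, "failed": 10, "did_not_finish": 5},
    "Fall 2012":   {"enrolled": 387, "graded": 375, "failed": 17, "did_not_finish": 12},
    "Spring 2013": {"enrolled": 387, "graded": 371, "failed": 23, "did_not_finish": 16},
}

for term, c in cohorts.items():
    fail_rate = 100 * c["failed"] / c["graded"]
    dnf_rate = 100 * c["did_not_finish"] / c["enrolled"]
    print(f"{term}: {fail_rate:.1f}% failed, {dnf_rate:.1f}% did not finish")
```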

Overall--and this is admittedly impressionistic--the most significant difference between the "lecture-based" model of instruction and the more active, flipped model is that the latter raised the grades of the B-C students. The A students performed at an even higher level (about 10% of the class scored over 100%), but the real beneficiaries were those who would normally have been in the B-C range. These students seem to have moved up at least half a grade, if not a full grade, from previous cohorts. They were the students who benefited most from the constant engagement, the increased structure, and a quiz/exam format that strongly incentivized a "study as you go" approach to the course material.
