Friday, January 22, 2010

iClickers: Writing Questions for "Classroom Response Systems"

After attending several workshops over more than a year, in Fall 2009 I started using iClickers in my World History survey course. (These are devices on which students push one of five buttons in response to a question an instructor poses, and the results can be displayed immediately on-screen.)

I summarized my own experience with the clickers by putting together all of the questions (and results) over the course of the quarter in one iClicker PowerPoint presentation (saved as a PDF). In sum, I was somewhat disappointed by my first attempt. First, as the announcements on my World History course homepage show, I spent an inordinate amount of time getting stragglers to register their devices, whose serial numbers were rubbing off. But more importantly, I wasn't very good at designing questions that promoted interaction (nor at taking time to discuss the results). Part of that was due to a technology glitch: I could not see the results as they came in, but only at the same moment the students saw them, so I had no time to interpret them before they went up on screen. But mainly, my questions were simply not well suited to discussion.

My main insight on designing questions is to ask about things like causes, and to phrase the answer choices so that each points to a different aspect of the issue. One can then ask students who chose a certain answer to explain why they did so. In other words, interpretation-dependent opinion questions work better than purely factual ones.

In any case, UCSB Physics colleague Roger Friedman sent around a link to the "Best Practices for Writing Clicker Questions" entry on Derek Bruff's blog "Teaching with Classroom Response Systems". Bruff is an assistant director at Vanderbilt University's Center for Teaching, and author of Teaching with Classroom Response Systems: Creating Active Learning Environments (Jossey-Bass, 2009) ($31 and previewable at Amazon). Searching "history" in the Amazon preview yields some interesting pages: 67 on reading quizzes, 76 on conceptual understanding questions, 95 on peer evaluation of presentations, 99 on learning about student preconceptions (evolution).

As a sample of what I found useful in Bruff's Jan. 11 blog entry, consider the post "Why writing clicker questions is so hard" on Ian Beatty's blog: you not only need to explicitly articulate your learning goals, but you also need to "engage, not assess" students, as the commenter (Bruff!) notes.

To add some content to this post, here's a list of types of clicker questions, from the first link on Bruff's Jan. 11 blog entry, a Univ. of British Columbia pdf Clicker Resource Guide, p. 6:

1) Quiz on the reading assigned in preparation for the class
2) Test recall of lecture point
3) Do a calculation or choose next step in a complex calculation
4) Survey students to determine background or opinions
5) Elicit/reveal pre-existing thinking
6) Test conceptual understanding
7) Apply ideas in new context/explore implications
8) Predict results of a lecture demo, experiment, simulation, video, etc.
9) Draw on knowledge from everyday life
10) Relate different representations (graphical, mathematical, …)

"Teaching with Classroom Response Systems" is a great blog, by the way; note that all the tags in the right-hand column are links that bring up a thread of all entries with that tag.
