A simple out-of-class assignment that could have a big payoff in a flipped class



Peer Instruction Network member Claire, a K-12 English teacher from Michigan, worries that far too many of her students are falling through the cracks, and she is looking for ways to reach them. Could student-generated questions be one possible low-threshold intervention? This post digs into that question. However, if you are looking for a quick protocol for student-generated questions, skip to the bottom.

It’s that moment. An hour before class starts, and for a variety of reasons (committee meetings, your chair beating down your door for this, that, or another thing, kids with the flu, etc.), you didn’t have time to prepare the totally engaging, mind-blowing experience that usually rocks your students’ worlds during class.

What to do?

I faced this predicament when I last taught my education theory course at TC. Being someone who takes their teaching extraordinarily seriously, I was panicked and feeling guilty about being underprepared. So, I whipped open Keynote and started to eke out a few clicker questions. Halfway through, I stopped, dejected, when I realized that in rushing I could only come up with a few lame memorization or fact-recall prompts. Scratching my head but determined to figure out something worthwhile, I decided to try an idea I had been batting around in my mind for weeks.

At this point in the semester, the students were used to my flipped approach: they were diligently doing their readings and Just-in-Time Teaching (JiTT) exercises before class and working on application questions in class via Peer Instruction and ConcepTests. For one of their JiTT exercises, I planned to have students write ConcepTests/clicker questions about the material. My goal was to engage students at the higher levels of Bloom’s taxonomy (e.g., creating and synthesizing material) by getting them to create questions in a low-stakes environment.

I also planned to use some of the questions as ConcepTests during class.

On this one bad day, I decided to fast-track that plan and have students spend the first 15 minutes in groups writing clicker questions that we would then run in class.

While a few of the student questions were terrific, most were too easy, and a small handful were terrible. I tried the exercise a few more times but soon took it out of my active learning toolbox. I simply did not find enough student-generated questions that hit that sweet spot between too hard and too easy.

Did I abandon ship too quickly? Maybe.

What does research say about student-generated questions?

Recently, a colleague at the University of Texas at Austin, Zachary Williamson, pointed me to a study by Berry and Chew (2008), which evaluated the potential benefits of student-generated questions. Berry and Chew found, in a study of undergraduate psychology students (n=102), that “generating questions related positively to improved student performance,” particularly for low-performing students.

A caution: the students who opted to participate by generating questions about course content had performed worse on their first two exams than those who opted out. As such, there is some noise: was it motivation to improve, more intense study, or the act of generating the questions itself? Hard to tell.

However, Berry and Chew also documented that the students who generated questions showed an upward trajectory in their exam performance (they did better on the third exam than would be predicted from their previous performance). The students who did not participate in the question-generation exercises did worse than statistically expected. Furthermore, the more questions the lower-performing students generated, the more their performance improved. The improvement appeared to be related to the frequency of question generation and independent of the depth or quality of the questions. Similar studies in other disciplines, such as physics, found the opposite pattern. For example, Harper, Etkina, and Lin (2003) found that the number of questions posed was unimportant, but the quality or depth of the questions was. Berry and Chew explain this by suggesting that the effects of question generation may be discipline-specific.

A possible intervention to help struggling students?

While the jury is still out, I am intrigued by the possibility of such a low-threshold instructional intervention helping teachers like Claire and her lower-performing students. In Berry and Chew’s study, instructors used the same prompt to solicit questions every time and did not even give the students feedback about their questions; they simply responded with an email awarding the extra credit. (Note: I stop short of advocating a lack of feedback, and I think feedback experts would probably discount that tactic as well.)

I do think it is worth trying student-generated questions to see if they might help under-performing students in our classes improve. Even in interactive teaching environments with master teachers, there are students who show early warning signs of struggle. It is possible that creating incentives for generating questions could be an easy way to help them succeed.


If you want to test student-generated questions in your class, here is the prompt Berry and Chew (2008) used:

“Provide us with three questions that you would like answered concerning the topics covered in your textbook readings or in lecture. These can be any questions you might have, as long as the questions are about the material or are stimulated by the material. They can be questions about concepts you are still unclear about, about further information you would like to have, or questions about how some issue applies to your own life or to other course concepts” (p. 306).

A recent blog post by Jackie Gerstein also has some protocols developed by others.

Here is my distilled protocol based on my research into this topic:

Student-Generated Questions

[Image: Protocol for student-generated questions © Julie Schell]

Online systems for student-generated questions 

There are fantastic online systems that facilitate student-generated questions in a higher-threshold way. My favorite is PeerWise. PeerWise is free and also provides a platform for peer and instructor social interaction.

IN CLOSING, I do not know if my efforts to get students to generate clicker questions had any statistically significant effect on their actual learning. When I abandoned ship, I assumed that the lack of depth in the questions my students generated was a sign that the exercise was not very useful. But maybe I was being too teacher-centered: just because most of the questions were not useful for me does not mean creating them was not useful for my students.

Claire, I think trying out a structured approach to student-generated questions may be worth a shot.

READERS: Provide us with three questions you would like answered concerning the topics covered in this post in the comments section below…

Berry, J. & Chew, S. (2008). Improving learning through interventions of student-generated questions and concept maps. Teaching of Psychology, 35, 305-312.

Harper, K. A., Etkina, E., & Lin, Y.-F. (2003). Encouraging and analyzing student questions in a large physics course: Meaningful patterns for instructors. Journal of Research in Science Teaching, 40, 776–791.


Comments
  1. Simon Bates

    I’ve just been browsing through some of the (nearly) 2000 questions that our first year non-majors Physics class have produced during 2 in-course assessments with PeerWise, selecting one or two for the upcoming final exam. Very few are trivial or irrelevant, most are pretty solid, a substantial fraction are exceptional.

    One key thing we’ve found in getting students to create their own content is that scaffolding the activity (literally helping them ‘know what it is they don’t know’) seems to be one of the key ingredients for getting good-quality questions that go beyond simple factual recall.

    • Julie Schell

      This is a great point Simon – we shouldn’t expect that students immediately know how to write questions. Zachary Williamson told me: “Because students tend to lack exposure to question generation in formal educational settings, some approach the task with apprehensiveness.” So, scaffolding is a great idea.

  2. Pip Bruce Ferguson

    Hi Julie

    I totally agree with your recommendation for PeerWise. It was developed by staff from the University of Auckland here in New Zealand, and I attended a workshop run by them to introduce us to the concept.

    It can be used for both formative and summative purposes, and really seems to encourage deep learning. I would endorse your recommendation and hope that it’s useful for others in this group. Clicking in just now, I see that it’s being widely used around the world.

    Kind regards


    • Julie Schell

      Thanks so much, Pip! I know many people who use PeerWise and have seen some demos of actual student work from @simonbates that are pretty phenomenal. Thanks for the note.

  3. James Lerch

    I believe that just the act of having students ‘do something’ may not improve their grades, but it definitely gets their ‘attention’. I am thinking of doing this and awarding ‘extra credit’ points for the questions that I use. Another thought: the students who will do poorly in the class will do so no matter who generates the questions or topics. I believe this will give a spark to those students who would do well anyway, helping them understand the topics more fully and think about them in ways, or at a depth, that they might not have prior to this exercise.

    • Julie Schell

      James – I am eager to hear what you find out.

      I do recall that my students were extremely engaged with a similar question-generation assignment that I posed for extra credit on one of their JiTT exercises with a slight tweak:

      “Prepare a clicker question for students in NEXT year’s class to answer about this theory.”

      For some reason, creating questions for the poor underclassmen who would have to slog through my course the next semester appeared to be highly motivating. Moreover, the questions students generated in response to this prompt seemed to be much harder and deeper than those they posed for themselves.

      • Kevin Young (@KevinAWC)

        Julie, I think I might try a prompt of writing questions that will be asked of students in the other section of the same class! Two of my sections keep asking about the other and seem quite competitive. I think they would like to “zap” each other in a friendly way.

      • Julie Schell

        I will be very interested to see how this goes! What do you teach?
