Tag Archive for collaboration

Students Share Quiz Study Guides

Creating online study resources and sharing them via Schoology

Cross-posted May 20 at http://blogs.universityprep.org

Knowing that a test was coming up soon, two students in 6th grade Integrated Science created an online study guide using Quizlet and shared it with their classmates through the Schoology Updates tool. This was all student-generated: no teacher input or prompting. They also invited their classmates to contribute to the Quizlet deck so that they could all benefit from each other’s work. Collaboration and self-directed learning in action!

The next morning, 6 other students had already used the study guide

The Franken-Paper: Constructing a Best Response

This post originally appeared on my first attempt at blogging on January 10, 2013. I’ve shut down that blog and am slowly moving posts over.

A key challenge in the collaborative classroom is balancing the inherent benefits of group work with the accountability and data of individual assessments. In my IB courses in particular, there’s a desire to prepare students for the types of exams they’ll see at the end of their IB studies, without losing the goals of the IB learner profile, which include Reflection and Collaboration. By adding a couple of steps to our assessment process with mock IB-style written exams, I can integrate a crowdsourcing element that helps all the students benefit from each other’s work, without sacrificing the data from an individual exam. I call it the “Franken-Paper”: a response produced by the class, spliced together from the best responses to each individual question.

1. Design open-ended questions.

Following the IB exam model for our Information Technology in a Global Society course, I know that students will encounter a certain number of questions in each category and level of complexity. I also know that the exam questions will always draw from the same “command terms,” or question stems (e.g., “Define,” “List,” “Justify,” “Compare”), specified in the course syllabus. With that as a framework, I try to design the written assessments to always include open-ended questions matching the command terms and structure of the prompts they’ll see in the IB exams.

The key, though, is simply to make sure that all questions are free-response and open-ended. Multiple choice, fill-in-the-blank, or other “guided-response” prompts don’t work with this system.

2. Choose the “best responses.”

As I’m reading all of the submissions, I keep a document open in the background to capture the best responses. Most of my exams are submitted electronically, so it’s easy to copy and paste, but with hand-written mocks I’ll simply type the best answers in.

When I encounter what I think is a particularly good answer to a prompt, I copy it into the document. If I later come across a better answer in another submission, I replace the first with the second. The goal is to finish my marking with the best answer for #1, the best for #2, and so on. These often (in fact, so far always) come from different students’ submissions; no one student has the best answer for every question. This is key to the discussion and analysis that comes later.

When several responses could claim “best,” I try to give the nod to the set that represents as wide a range of the class as possible; the more students who can identify their own contribution to the final product, the better.

3. Present the “Franken-Paper.”

I now have an exam spliced together from the best individual answers the class submitted. There are a variety of ways to handle what comes next: I can distribute it for reading and follow with a class discussion, ask the students to evaluate the answers against the rubric, or present them myself and explain what made each answer the most successful one in my reading. Since these are open-ended questions, I’m careful at this stage not to label anything “the right answer,” but rather the answer which, in my reading, best fits the criteria or rubric. Any disagreements, questions, or alternate answers should be discussed at this point so that everyone can see why a given response was a successful answer to the prompt.

The key is that I do all of this before I…

4. Return their individual papers and reflect.

Now that we’ve discussed the group’s best combined thought and knowledge, we can examine each student’s individual responses. Reflection, goal setting, and self-evaluation can now compare each student’s own performance against successful examples. Finally, the best-response paper goes into their archives for studying and reviewing the subject in the future. Rather than having incomplete or unsuccessful responses to draw from, they have the best product of the class to learn from and continue to use.

Variations:
The easiest variation is to divide the submissions among small groups and ask the students to decide on a “best answer” from their set. This works best when groups are chosen so that no one is evaluating or discussing their own answer. You can also use Google Docs to have groups construct a combined paper synchronously, drawing on each group’s chosen best answer or using a Jigsaw method.