Adaptive testing is one of the buzziest trends in Ed Tech right now: the ISTE conference was absolutely awash in companies selling adaptive testing engines, aligned with Common Core and complete with packaged curriculum materials. It’s easy to see the appeal of adaptive testing: students are assessed on a complete package of learning objectives, and any areas of struggle or difficulty are identified and targeted. Students work at a level of rigor and complexity appropriate for them, and can move ahead or be given additional reinforcement as needed. Unfortunately, adaptive testing systems are incredibly complex, which makes them very hard to modify to reflect each individual teacher’s course and curriculum.
Schoology has released a handful of new features to Enterprise customers over the summer which, when used together, form a very powerful formative assessment environment. By using these tools, it’s possible to build quizzes which offer students opportunities to practice skills and content as needed, and report data back to teachers in a very granular and performance-oriented manner. For classes or schools which use standards- or learning objectives-based grading and reporting, the backwards design process of writing curriculum and assessment to match those objectives fits perfectly into this new package. The combination of Learning Objectives, Question Banks with Random Questions and the Mastery reporting panel allows teachers to generate randomized practice opportunities targeted to individual or multiple performance goals, and analyze each for diagnostic data on each student’s performance. Each of these tools requires some setup to accomplish this, so let’s dive in.
Schoology’s Learning Objectives tool allows teachers to enter specific performance objectives or standards into a course. In addition to Common Core, State or other National standards sets, teachers can create their own personal ones unique to each course. Once these are entered, they can be tagged to, in Schoology’s words, “anything that can be graded.” This means homework, graded discussions, and quizzes and tests; Media Albums cannot be graded, so they are out.
Learning objectives and standards are really easy to write but very difficult to make effective. I recommend “Writing Measurable Learning Outcomes” as a resource for building strong learning objective statements that will mesh neatly with the Question Bank. In our Middle School, we use learning objectives on our report cards, so our students and teachers are used to linking in-class activities to specific learning goals.
To enter custom learning objectives, the appropriate permissions must be activated by the system administrator (they are off by default). Once activated, Learning Objectives will appear in a teacher’s Personal Resources section. These can be organized by folders as with any other form of resource, although they cannot be shared with other teachers. Hopefully Schoology will add this in the future, as it would be very helpful for teachers who co-teach or alternate on courses to be able to share the objectives with colleagues.
Using concrete, specific and measurable learning objectives is a standard (ha!) method of building out and organizing curriculum. Following a backwards design methodology such as Understanding by Design, all of a teacher’s assessments should directly measure a learning objective. Once the objectives are entered into Schoology, assignments and activities can be tagged with those learning objectives to indicate that they are aligned. With graded discussions and homework submissions, activities are best linked to only one learning objective (an important limitation; more on this below). With quizzes and tests, though, each question is tagged to a learning objective. This means that individual questions can be tagged with their unique skill, content or process in the Question Bank.
In my opinion, one of the greatest features of Moodle has now made it to Schoology: the randomized Question Bank. Previously in Schoology, questions were purpose-built for each quiz or test. Now, quizzes and tests can be populated with random questions.
This allows teachers and students to take advantage of randomly-generated practice exercises within the Schoology environment. I could create, for example, a question bank of 30 questions on a common objective and tag them with that corresponding objective. I would then create a quiz that allows students to practice multiple times. Teachers can configure these options to their preference, but I set these practice environments to unlimited attempts and report the average score across all attempts. By setting the quiz to select 10 questions at random from among the 30 choices, the quiz is unique at each attempt. As the size of the question bank grows, so does the variety of each attempt. Having students each write 10 practice questions, for example, is a great way to generate raw examples very quickly. Quizzes can be populated from multiple banks as well, so practice quizzes don’t have to be limited to only one learning objective.
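To get a feel for how quickly the variety pays off, here is a back-of-the-envelope calculation. This is purely an illustration of the combinatorics, not anything Schoology exposes; the bank and quiz sizes match the example above.

```python
from math import comb

# How many distinct quizzes can "pick 10 of 30" produce?
# Setting question order aside, this is the binomial coefficient "30 choose 10".
bank_size = 30
questions_per_quiz = 10

distinct_quizzes = comb(bank_size, questions_per_quiz)
print(f"{distinct_quizzes:,} possible question sets")  # 30,045,015

# Doubling the bank makes a repeated question set vanishingly unlikely:
print(f"{comb(60, questions_per_quiz):,} possible sets from a 60-question bank")
```

Even a modest 30-question bank yields over 30 million distinct question sets, which is why repeated attempts feel fresh to students.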
By using these two tools, we can generate dynamic practice activities tied to specific learning objectives. The purpose here is not to generate summative assessments, but rather to create opportunities for students to practice skills as they see fit by taking advantage of access to the LMS from home or during class. Outside of a structured “practice exam,” this increases the opportunity for students to have low-risk practice that they can repeat until they are satisfied with their performance and are ready to move on. We can look at some of the data produced by these practice attempts in Schoology’s gradebook (whether that’s the official course gradebook or not), but that will only tell us one piece of information: the average score a student earns on the quiz (or the highest or most recent, depending on how the quiz is configured). To truly give us a granular diagnostic tool, though, we need more data. This is where the Mastery page comes in.
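The three reporting modes mentioned above are easy to picture with a quick sketch. The attempt scores here are invented for the example; the point is just what each mode would report for one student’s string of attempts.

```python
# Hypothetical illustration of the three quiz-score reporting modes
# (average, highest, most recent); the attempt scores are made up.
attempts = [60, 75, 85, 90]  # one student's scores across repeated attempts

average_score = sum(attempts) / len(attempts)  # 77.5
highest_score = max(attempts)                  # 90
most_recent = attempts[-1]                     # 90

print(average_score, highest_score, most_recent)
```

Note how the average rewards the whole practice history, while highest and most recent both hide the early struggles; that difference is worth considering when configuring a practice quiz.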
A few considerations when composing these practice quizzes:
- Chapter/unit review quizzes will bring in multiple skills under one unit.
- You may want to include questions from previous units to keep prior learning fresh or accessible.
- A differentiated class may include questions of varying levels of complexity in the same practice exercise. In this case, the same learning objective may be duplicated and changed slightly to indicate multiple potential levels of performance: “Identifies notes on the Treble Clef,” “Identifies notes on the Treble and Bass Clef,” “Identifies notes on Treble, Bass and Alto Clefs.” Questions can be tagged with multiple learning objectives.
For each learning objective, the Mastery page helps answer questions like:
- Is the student succeeding with this objective?
- Is the student at the right “stretch” level for them, or do they need more or less complexity at this point?
- Is the student actively working on this objective?
- Are there large numbers of students struggling with a similar concept, indicative of something I need to adjust class-wide?
- Is this student struggling with multiple concepts or related concepts?
A few limitations are worth noting:
- It does require setup time, particularly in generating the question banks. It’s worth noting, though, that question banks can be shared through resources, and can be generated in group resources as well, meaning that departments or schools could collaborate on them.
- If the learning objectives are weak, non-specific, or if the assessments and objectives don’t really match, then the Mastery data will be less instructive. Tightly aligned learning objectives and assessments will produce the best data.
- All items should really only be tagged with one learning objective (or, in the case of our differentiated example above, one set of differentiated objectives). This is fine for questions, since most of these drill-and-kill practice questions only apply to one objective. It’s unfortunate for assignments, though, as one homework assignment may include multiple learning objectives. If any item is tagged with more than one learning objective, the same score will be applied to all of them. In the differentiated example, this is fine, since a question regarding note identification in the Treble Clef is equally correct no matter which standard is applied (Treble only; Treble and Bass; or Treble, Bass and Alto; either way, the question is simply right or wrong). A homework assignment that involves more than one learning objective, though, would likely merit different scores for different objectives (e.g. drawing a map might assess both “Identifies cities in Washington State” and “Identifies rivers in Washington State”).
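The last limitation can be sketched in a few lines. This is a hypothetical model of the behavior described above, not Schoology’s actual implementation: a graded item carries a single score, so tagging it with several objectives simply copies that one score to every objective.

```python
# Hypothetical sketch: one graded item's single score fans out to every
# objective it is tagged with.
def fan_out(score, objectives):
    return {objective: score for objective in objectives}

# Fine for a differentiated question (right or wrong under any clef standard):
print(fan_out(1.0, ["Treble only", "Treble and Bass", "Treble, Bass and Alto"]))

# Misleading for multi-skill homework: both objectives receive 0.8 even if the
# student identified cities well but rivers poorly.
print(fan_out(0.8, ["Identifies cities in Washington State",
                    "Identifies rivers in Washington State"]))
```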
The Whole Package
Despite these limitations, this package is a major step toward what I’d like to see from adaptive testing engines: the ability for teachers to control content, see granular performance data, and offer ample practice opportunities for students. Drill-and-kill and computer-aided assessment are not new, but we are getting more tools to observe and dissect the data they produce in meaningful ways. Nearly every discipline has some basic content knowledge or skills that require plugging through lots and lots of practice to reach automatic recall, and having flexible, targeted tools helps us address those needs so that we can progress to higher levels of thinking and complexity. These features are new this year, so there is much for us to explore in how best to set them up and use them.