New Schoology Features – (Almost) Adaptive Assessment for Your Curriculum?

Adaptive testing is one of the most buzz-worthy trends in Ed Tech right now– the ISTE conference was absolutely awash in companies selling adaptive testing engines, aligned with Common Core and complete with packaged curriculum materials. It’s easy to see the appeal of adaptive testing: students are assessed on a complete package of learning objectives, and any areas of struggle or difficulty are identified and targeted. Students work at a level that is appropriate for them in rigor and complexity, and can move ahead or be given additional reinforcement as necessary. Unfortunately, adaptive testing systems are incredibly complex, which makes them very hard to modify to reflect each individual teacher’s course and curriculum.

Schoology has released a handful of new features to Enterprise customers over the summer which, when used together, form a very powerful formative assessment environment. By using these tools, it’s possible to build quizzes which offer students opportunities to practice skills and content as needed, and report data back to teachers in a very granular and performance-oriented manner. For classes or schools which use standards- or learning objectives-based grading and reporting, the backwards design process of writing curriculum and assessment to match those objectives fits perfectly into this new package. The combination of Learning Objectives, Question Banks with Random Questions and the Mastery reporting panel allows teachers to generate randomized practice opportunities targeted to individual or multiple performance goals, and analyze each for diagnostic data on each student’s performance. Each of these tools requires some setup to accomplish this, so let’s dive in.

Learning Objectives

Schoology’s Learning Objectives tool allows teachers to enter specific performance objectives or standards into a course. In addition to Common Core, State or other National standards sets, teachers can create their own personal ones unique to each course. Once these are entered, they can be tagged to, in Schoology’s words, “anything that can be graded.” This means homework, graded discussions, and quizzes and tests– Media Albums cannot be graded, so they are out.

Learning objectives and standards are really easy to write but very difficult to make effective. I recommend “Writing Measurable Learning Outcomes” as a resource for building strong learning objective statements that will mesh neatly with the Question Bank. In our Middle School, we use learning objectives on our report cards, so our students and teachers are used to linking in-class activities to specific learning goals.

To enter custom learning objectives, the appropriate permissions must be activated by the system administrator (they are off by default). Once activated, Learning Objectives will appear in a teacher’s Personal Resources section. These can be organized by folders as with any other form of resource, although they cannot be shared with other teachers. Hopefully Schoology will add this in the future, as it would be very helpful for teachers who co-teach or alternate on courses to be able to share the objectives with colleagues.

Adding a Learning Objective in the Personal Resources

When creating a resource, hit the Target icon in Advanced to link the resource to the Learning Objective

Using concrete, specific and measurable learning objectives is a standard (ha!) method of building out and organizing curriculum. Following a backwards design methodology such as Understanding by Design, all of a teacher’s assessments should directly measure a learning objective. Once the objectives are entered into Schoology, assignments and activities can be tagged with those learning objectives to indicate that they are aligned. With graded discussions and homework submissions, activities are best linked to only one learning objective (an important limitation to note–more on this below). With quizzes and tests, though, each question is tagged to a learning objective. This means that individual questions can be tagged with their unique skill, content or process in the Question Bank.

Question Bank

In my opinion, one of the greatest features of Moodle has now made it to Schoology: the randomized Question Bank. Previously in Schoology, questions were purpose-built for each quiz or test. Now, quizzes and tests can be populated with random questions.

Importing from Question Banks

Importing Individual Questions or Random Questions

This allows teachers and students to take advantage of randomly generated practice exercises within the Schoology environment. I could create, for example, a question bank of 30 questions on a common objective and tag them with that corresponding objective. I would then create a quiz that allows students to practice multiple times. Teachers can configure these settings to their preference, but I set these types of practice environments to unlimited attempts, and report the average score of all attempts. By setting the quiz to select 10 questions at random from among the 30 choices, the quiz is unique at each attempt. Obviously, as the size of the question bank increases, the randomization increases as well. Having students each write 10 practice questions, for example, is a great way to generate raw examples very quickly. Quizzes can be populated from multiple banks as well, so practice quizzes don’t have to be limited to only one learning objective.
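Under the hood, the random selection described above amounts to sampling without replacement from the bank. Here is a minimal Python sketch of that idea; the question IDs and objective tag are invented for illustration, and Schoology's actual implementation is of course not public:

```python
import random

# Hypothetical 30-question bank, all tagged with one learning objective.
bank = [{"id": i, "objective": "Identifies notes on the Treble Clef"}
        for i in range(1, 31)]

def build_practice_quiz(bank, n=10):
    """Draw n distinct questions at random, so each attempt differs."""
    return random.sample(bank, n)

attempt = build_practice_quiz(bank)
print(len(attempt))  # 10 questions, no repeats within one attempt
```

With 30 questions and 10 drawn per attempt, back-to-back attempts will almost never present the same set, which is what makes repeated practice worthwhile.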

By using these two tools, we can generate dynamic practice activities tied to specific learning objectives. The purpose here is not to generate summative assessments, but rather to create opportunities for students to practice skills as they see fit by taking advantage of access to the LMS from home or during class. Outside of a structured “practice exam,” this increases the opportunity for students to have low-risk practice that they can repeat until they are satisfied with their performance and are ready to move on. We can look at some of the data produced by these practice attempts by looking in Schoology’s gradebook (whether that’s the official course gradebook or not), but that will only tell us one piece of information– the average score a student earns on the quiz (or highest or most recent, depending on how the quiz is configured). To truly give us a granular diagnostic tool, though, we need more data. This is where the Mastery page comes in.
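The single gradebook number comes from collapsing a student's attempt history under one of the configured rules (average, highest, or most recent). A quick sketch of those three reporting modes; the function and mode names here are my own, not Schoology's:

```python
def report_score(attempt_scores, mode="average"):
    """Collapse a student's attempt history into one gradebook value."""
    if mode == "average":
        return sum(attempt_scores) / len(attempt_scores)
    if mode == "highest":
        return max(attempt_scores)
    if mode == "latest":  # most recent attempt only
        return attempt_scores[-1]
    raise ValueError(f"unknown mode: {mode}")

scores = [60, 70, 90]
print(report_score(scores))             # about 73.3
print(report_score(scores, "highest"))  # 90
```

Whichever rule is chosen, the result is one number per quiz, which is exactly the limitation the Mastery page addresses.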


If our practice exercises are only linked to one learning objective, then it is possible to look at the quiz results and see whether a student is succeeding with that objective. In reality, though, these exercises often cross multiple learning objectives:
  • Chapter/unit review quizzes will bring in multiple skills under one unit.
  • You may want to include questions from previous units to keep prior learning fresh or accessible.
  • A differentiated class may include questions of varying levels of complexity in the same practice exercises (in this instance, the same learning objectives may be duplicated and changed slightly to indicate multiple potential levels of performance– “Identifies notes on the Treble Clef,” “Identifies notes on the Treble and Bass Clef,” “Identifies notes on Treble, Bass and Alto Clefs.” Questions can be tagged with multiple learning objectives.)
In looking at student performance towards a learning goal, there are several questions I want to ask:
  1. Is the student succeeding with this objective?
  2. Is the student at the right “stretch” level for them, or do they need more or less complexity at this point?
  3. Is the student actively working on this objective?
  4. Are there large numbers of students struggling with a similar concept, indicative of something I need to adjust class-wide?
  5. Is this student struggling with multiple concepts or related concepts?
The new Mastery panel gives me much of this information since it presents information categorized by the Learning Objective, not the assignment. Again, since my practice exercises often include multiple objectives, being able to see performance by specific objective allows me to see specific conceptual or factual struggles. Furthermore, since one learning objective will appear in multiple assignments or activities, I can see if an individual concept is a recurring struggle over time.
Schoology’s new Mastery panel

Mastery, like Learning Objectives, is only available as an Enterprise feature of Schoology. Accordingly, Schoology has two articles relating to Mastery: the publicly accessible version, and the version for Enterprise customers (login required).
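Conceptually, the shift Mastery makes is a regrouping: instead of totaling scores by assignment, it groups them by the objective tag. A small sketch with made-up student records (names, objectives, and scores are all invented for illustration):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (student, objective, score on one tagged item).
results = [
    ("Ana", "Treble Clef", 1.0), ("Ana", "Bass Clef", 0.4),
    ("Ana", "Treble Clef", 0.9), ("Ben", "Bass Clef", 0.6),
]

by_objective = defaultdict(list)
for student, objective, score in results:
    by_objective[(student, objective)].append(score)

# Performance per objective across all activities, not per assignment.
mastery = {key: mean(scores) for key, scores in by_objective.items()}
print(mastery[("Ana", "Treble Clef")])  # 0.95
```

Because one objective appears across many activities, this view surfaces a recurring struggle (Ana's Bass Clef scores, here) that an assignment-by-assignment gradebook would bury.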


The combination of these features gives an extremely powerful diagnostic assessment engine which allows students to practice material freely, while generating much more actionable data for teachers. It does have some limitations, however:
  • It does require setup time, particularly in generating the question banks. It’s worth noting, though, that question banks can be shared through resources, and can be generated in group resources as well, meaning that departments or schools could collaborate on them.
  • If the learning objectives are weak, non-specific, or if the assessments and objectives don’t really match, then the Mastery data will be less instructive. Tightly aligned learning objectives and assessments will produce the best data.
  • All items should really only be tagged with one learning objective (or, in the case of our differentiated example above, one set of differentiated objectives). This is fine for questions, since most of these types of drill-and-kill practice questions only apply to one objective. It’s unfortunate for assignments, though, as one homework assignment may include multiple learning objectives. In the case that any item is tagged with more than one learning objective, the same score will be applied to all of them. In the differentiated example, this is fine, since a question regarding note identification in the Treble Clef is equally correct no matter which standard is applied (Treble only, Treble and Bass, or Treble, Bass and Alto– either way, the question is only right or wrong). A homework assignment that involves more than one learning objective, though, would likely get a different score for different objectives (e.g. drawing a map might include: Identifies cities in Washington State vs. Identifies rivers in Washington State).

The Whole Package

Despite these limitations, this package is a major step forward towards what I’d like to see out of adaptive testing engines– the ability for teachers to control content, see granular performance data, and offer ample practice opportunities for students. Drill-and-kill and computer-aided assessment are not new, but we are getting more tools to observe and dissect the data they produce in meaningful ways. Nearly every discipline has some level of basic content knowledge or skills that requires plugging through lots and lots of practice to commit to automatic recall, and having flexible, targeted tools helps us address those needs so that we can progress to higher levels of thinking and complexity. These features are new for this year, so there is much for us to explore in how best to set up and use them.


  1. Scott Genzer says:

    Very nice description of the new Schoology SBG features. Thanks, Jeff. I got a sneak preview of this in November but was sworn to secrecy until it came out. :) I agree with you – it’s a good step in a good direction but not quite where I’d like it to be – yet. Not being able to have multiple standards marked separately on one assessment (with different marks, of course) is a big one. Yuck.

    Any thoughts on the adaptive learning environments out there (as opposed to the adaptive testing)? I am really getting excited with this – particularly those companies that are using Knewton’s API as their engine.

    Thanks for the blog. Always love hearing your thoughts, Jeff.

  2. Jeff says:

    Thanks for the feedback (and the share!), Scott– I hope that they can add multiple standards per assignment in the future, but that would also basically force users to use rubrics. Not that I disagree with that, but I wonder if that’s too restrictive for them?

    My blanket complaint to this point with adaptive engines is the generation of the content. Right now, the systems that I’ve seen all seem to err on one of two extremes:
    –Concept mapping is really complex, so take our standards/content/lessons and use it canned. I chafe against that one, because I don’t want to use the canned lessons or assessment strategies– I want to use my own. Frankly, the “engine plus content” models are way too Common Core-driven and uninspiring to me right now.
    –Build your own systems from scratch using all of your resources. Very few teachers have the time/energy/tech skills to do this. You can do some great things with Google Scripting and Forms as an adaptive engine using your own materials, but that’s a lot of setup.

    I guess my holy grail would be if we can achieve a) a critical mass of OER’s where teachers have a great variety of choices that can be tagged with b) an API like Knewton’s that would handle the data analysis on the back end.

    So basically– an epistemological map of all content supported by a detailed, intuitive and open-access contextual tagging system. No big deal.

  3. Tim Patton says:

    Time is the key here for sure. Have any adaptive learning software suggestions that can operate in less-than-ideal internet connectivity situations? Ideally, when the internet doesn’t work at all?

    Keep up the Lord’s work. Always appreciate your two cents on tech and education.
