In Part 1, we took a look at how to deconstruct an NGSS performance expectation (PE) into student-friendly objectives, and in Part 2 we examined how to bundle PEs together to make coherent units (and not just by teaching the ones grouped together in the standards). Now we can really do some backwards design and create an assessment that is aligned to our deconstructed standards, so we have a clear picture of mastery before we do any instruction.
I took the liberty of bundling and deconstructing some middle school earth science PEs. A list of those PEs and the deconstructed standards (I can statements) derived from them can be seen below. (Or click here to access the document.)
(Please note that any time you first write a set of I can statements, they should be considered a draft until you actually teach with them, and they should be reviewed each year for changes to make them better. This is where the art of teaching comes in – you write the I cans with as much foresight as you can, but you never really know how well they will work until you actually use them with students! Don’t be afraid to make minor tweaks as you move through a unit, either – we’re all co-learners in this process.)
Now, if you take a peek at the document above, you’ll find that it is chock-full of all sorts of earth sciencey goodness. In fact, when I compiled these, my first thought was that there was way too much stuff in this unit and that there would be no way I would hand this entire list out to a gaggle of middle-school students. This is exactly why you need to deconstruct the PEs before finalizing your bundled units. You never really know how much stuff there is in a PE until you deconstruct it, which gives you a better sense of what and how much to bundle together. If I were still in the classroom, I might do each of the sections (Water, Weather, and Climate) as separate mini-units, then use MS-ESS3-3 and 3-5 to assess students’ upper-level thinking/strategic thinking skills in a separate unit. Or, for a more problem-based/inquiry approach, I could start with MS-ESS3-3 and 3-5 and let students discover the content in the course of searching for viable solutions and asking clarifying questions.
But I digress – what about assessing these I can statements? Let’s take a look at the I can statements under the topic of “Weather,” which were deconstructed from PE MS-ESS2-5:
- I can tell the difference between weather and climate.
- I can explain and tell the difference between the different components of weather: temperature, humidity, pressure, precipitation, and wind.
- I can determine the relationships between density of air masses and the different components of weather.
- I can predict what weather will occur when two air masses interact and explain why it will occur.
- I can collect and analyze data from maps and scientific experiments I design in order to predict weather events.
Before we get started, let’s remember that the purpose of any assessment is to draw valid inferences regarding what students know, understand, and are able to do. To make those valid inferences, we need a clear picture of what evidence will show us whether students know, understand, and are able to do what the I can statements call for. How do we get that clear picture? By looking at the verbs we used in our I can statements, of course.
Those verbs at the start of the I can statements will determine what assessment questions or activities we will have students do. If you tell students they should be able to tell the difference between something, they should do that on the assessment. Same goes with explaining, predicting, collecting, and analyzing – if those are the verbs you use, then that’s what students should be doing. Below you can see an outline (some call it an “assessment plan”) of what the assessment would look like for the I can statements.
Using the plan above, an assessment can be written that determines, ahead of time, what mastery of the I can statements looks like. Please note that while basic understanding of concepts is assessed, what the verb says students should be able to do is always part of the assessment. However, as I mentioned in my first post, giving those verbs to students makes things clearER, but not crystal clear. For example, what does it really look like when students analyze something? Predict? Summarize? Interpret? Describe? Explain? The meaning of these can vary from educator to educator, so you’ll have to make it clear to students up front what those verbs look like so there’s little confusion. When I was in the classroom, I usually discussed the list of I can statements with students on the first day of a unit, activating prior knowledge about concepts but also looking at the verbs and predicting what they would have to do on the summative assessment to show me that they had mastered the I can statements. It didn’t take very long, but having students understand what the verbs meant paid a huge return on investment as they worked towards mastery during the unit itself.
One of the main errors I see when reviewing teacher-created assessments for alignment to objectives is that the I can statement itself is never actually assessed. I see a lot of questions asking students to recall facts and concepts related to the I can statement, but students are never asked to actually do what the I can statement says they should be able to do with those facts and concepts. If we want to make valid inferences regarding student mastery, we must assess the center of the target (the I can statement), not just the stuff around the target.
If we’re not assessing the center of the target, we do nothing but confuse students – why give them I can statements that they will never actually have to do? The point is that the assessment is aligned to the I can statements deconstructed from the standards, so you can get an accurate picture of how well students have mastered those standards. You cannot infer mastery of standards if the assessment was never designed to show mastery in the first place.
Part of this assessment misalignment issue comes from teachers still having to make the leap from assessing lots of bits of content to assessing how well students can use that content, and questions that ask students to really use concepts and facts are not questions teachers are traditionally used to writing. But, from my own experience, the more you try to write those upper-level thinking questions, the better you get. Often, getting teachers to realize that they can write perfectly valid assessment questions (and not have to rely solely on textbook-created test bank questions) is the first step in helping them write quality assessments aligned to standards. (Want more information on how to write your own aligned assessments? Click here.)
Now that the assessment is created and you have an idea of what mastery looks like for students, you can plan instruction that helps students practice and work towards that vision of mastery. How to plan aligned instruction from your assessment will be explored in our next post.