It came from a two-day NGSS workshop: Take-aways and some resources

I had the privilege of attending a 2-day workshop this week regarding the NGSS and how to train teachers to implement the standards in their classrooms.  The best thing about this workshop was the clear entry points it gave teachers looking to implement the NGSS in their classrooms.  And by “implement” I don’t mean “teach content the same old way, only now just teach the stuff mentioned in the NGSS”; I mean really getting students to be scientific thinkers and problem-solvers by transforming what science instruction looks like.  In sum, the most powerful take-aways from this workshop for changing science instruction under the NGSS are listed below:

  • Go for deep understanding to foster scientific literacy.  This means giving students more time to think rather than rushing to cover content.
  • Curriculum, instruction, and assessment should all be three-dimensional, intertwining the crosscutting concepts, the science and engineering practices, and the disciplinary core ideas so that students can master the performance expectations.
  • Instruction must shift from teaching topics to teaching phenomena.
  • Instruction must shift from telling students stuff to having students “figure it out.”
  • Assessments must evaluate student understanding based on proficiency scales that are developed from the level of thinking demanded in the performance expectation.

It’s those last two bullet points that seem like major obstacles to teachers who may be used to traditional methods of science instruction that make the teacher the focal point of the classroom most of the time.  It means giving up some control of what’s happening in the classroom and handing the learning over to students, letting them view scientific phenomena, come up with their own questions, and generate their own explanations before the teacher tells them anything.  The quick and easy example the presenter gave (to engage students in disciplinary core idea ESS2.B) was an anticline and a syncline (upfolds and downfolds in rock layers) that had been revealed by road construction; we were shown a picture similar to the one below (mainly an anticline, but you get the idea):

[Image: anticline exposed by road construction along NJ Route 23]

Image source

After the teacher shows students the picture (and points out the human in it for size reference), students generate their own observations and questions regarding the phenomenon shown, which sets the stage for the focus of the unit.  Example questions would be:

  • How much energy is needed to bend rocks that way?
  • What types of energy are needed to cause the bend in the rocks?
  • How long did that bend take to form?
  • What causes the rocks to bend in the first place?
  • Why are the rocks in layers?
  • What types of rocks are the layers made out of?
  • Why are some layers different colors than other layers?

Students first do their own research on the questions they generated, seeking to come up with their own explanations for the phenomenon.  After that, teachers can have students verify their explanations in some way and then extend their understanding via some sort of application activity – e.g., a lab, creating and/or using a model, analyzing data – in other words, engaging in those science and engineering practices.  Teachers then evaluate student understanding based on proficiency scales (rubrics) derived from the performance expectation being assessed.  These proficiency scales should show what students CAN do at each level of understanding rather than state what students can’t do.

As someone who spent the first 10 years or so of her 18-year science teaching career feeling like it was my job to dispense information first and then give labs that simply confirmed what I had told my students, I understand that this process is going to be hard for some teachers to implement in their classrooms.  However, the advice the presenter gave (which is also the advice I give the science teachers in my district) is this:  Start small.  Start by integrating phenomena.  Start by making rubrics from performance expectations for evaluation.  Start by letting students create explanations first rather than lecturing right away.

Start small…but just get started.  Just like mountains and earthquakes and those anticlines and synclines pictured above, small changes will eventually lead to big changes in classroom instruction and student learning.  And the process outlined above is way better for student learning than having them sit and passively watch teachers do their jobs.

In order to help teachers get the small starts they need, a lot of good resources were explored over the two days of this workshop.  I have collected them all in a Blendspace learning playlist for easier access.  They are divided into categories, with a text page at the start of each category listing what the next few links will be.  Please feel free to share them with anyone who needs resources for NGSS implementation.

https://www.tes.com/lessons/BRDv6-gWbJw-SA/ngss-resources-links?feature=embed

Deconstructing the NGSS Part 4: Aligned Instruction


In Part 3 of this series on deconstructing the NGSS, we looked at how to align assessments to our deconstructed learning targets.  When aligning assessments to targets, the main rule to remember is this:  The verb of the learning target should indicate to students what they need to do on the assessment to show mastery.  And it’s that assessment that reveals what mastery looks like for students.

If the assessment is what mastery looks like, then we need to plan instruction to get students there.  In other words, our instruction should be aligned to what mastery looks like on the assessment.

For example, let’s take a look at the assessment plan presented in the last post again:

[Image: assessment plan from Part 3]

Take a look at the first objective or learning target in the list: “I can tell the difference between weather and climate.”  Do students have to internalize the definitions of “weather” and “climate” in order to master this objective?  Yes.  But they also need to go one step further and actually state a difference – and the difference cannot be merely the definitions of those two terms.  Students need to examine the definitions, extract a difference from them, and state it – not just parrot back definitions they copied out of a text or off the internet.  A true difference reveals understanding, not just a capacity for rote memorization.  So, on the test, students would encounter a differences chart such as the one below:

[Image: differences chart]

So how should your instruction help students master this objective (which looks really simple, but actually requires some thinking)?  If I were back in the classroom, here’s how I’d go about it:

  1. Have students look up the text/internet definitions of weather and climate and write them out.
  2. In pairs, have them discuss what those definitions mean in their own words and write the “student-translated” definitions down in their notes.  The teacher confirms that each translation is acceptable before students move on.
  3. On their own, students complete a Frayer model diagram for each term.  Students should share their Frayer model diagrams with at least two other students and the teacher before moving on.
  4. On their own, have students extract a difference from their student-translated definitions and Frayer models.  They will write it down in a chart just like the one pictured above and then, underneath the chart, compile that difference into a well-thought-out, logical, beautifully constructed sentence that will bring the teacher to her knees with joy.  These charts and sentences will be teacher-reviewed for feedback.

Please note two things about the instructional plan above.  First, the teacher was only involved in setting up the learning activities ahead of time and giving feedback during the activities – in every step, the students were responsible for doing the work of learning.  (Also note that the “answer” to the differences chart was never gone over as a whole group, to avoid students writing down something that wasn’t their own thinking to begin with.)  Second, the students were practicing the thinking they would need to master the objective, not just the stuff that would help them master the objective.  Too often as teachers we think that if we give students the content stuff, they will magically know how to put it all together…and they don’t.  We have to give them practice at not only learning the stuff, but also learning how to think with and use the stuff.

Now, I know what some people are probably thinking: “You gave them the question before they took the test!  Of course they’re going to do well if you make it easy for them like that!”  Well, I don’t know if beginning with the end in mind is equivalent to “making it easy.”  If you just hand out test questions as your instruction and expect kids to get the right answers and nothing else, then sure – it’s really easy!  However, learning activities geared towards having students practice thinking certainly aren’t that easy for students, in my opinion, especially when your students are the product of an educational system where quick right answers are valued more in class and on assessments than patient problem-solving.  But also don’t forget that students can only master a target they can see; if we keep what mastery looks like from them by never giving them the end-goal for mastery, then we’re just setting them up for failure rather than success.

(Also note that, as far as multiple-choice questions are concerned, you should definitely NOT give students the same questions during instruction as on the test, because they will simply memorize the correct answers.  Questions assessing similar content, concepts, and skills should be given during instruction and on formatives, but they should not be the same questions as on the summative assessment if you want to draw valid inferences about what students have actually mastered.)

Bottom line: your instruction needs to be planned in a way that aligns with what mastery looks like on your assessment.  And students not only have to practice with the content/conceptual stuff they’ll need to master, but also with the thinking they will have to do with it.  To me, the “thinking practice” is much more important than any science stuff I ever taught students.  Why?  Because I remember hearing once that, after students graduate from high school, they forget about half of what they learned within 6 months because they simply don’t use it or don’t need it for what they are doing with their lives.

But will all students need the ability to think, no matter where their lives take them? Absolutely.


Deconstructing the NGSS Part 3: Writing Aligned Assessments

In Part 1, we took a look at how to deconstruct an NGSS performance expectation (PE) into student-friendly objectives, and in Part 2 we examined how to bundle PEs together to make coherent units (and it’s not just teaching the ones grouped together in the standards document).  Now we can really do some backwards design and create an assessment that is aligned to our deconstructed standards, so we have a clear picture of mastery before we do any instruction.

I took the liberty of bundling and deconstructing some middle school earth science PEs.  A list of those PEs and the deconstructed standards (I can statements) derived from them can be seen below. (Or click here to access the document.)

(Please note that any time you first write a set of I can statements, they should be considered a draft until you actually teach with them, and they should be reviewed each year for changes that make them better.  This is where the art of teaching comes in – you write the I cans with as much foresight as you can, but you never really know how well they will work until you actually use them with students!  Don’t be afraid to make minor tweaks as you move through a unit, either – we’re all co-learners in this process.)

Now, if you take a peek at the document above, you’ll find that it is chock-full of all sorts of earth-sciencey goodness.  In fact, when I compiled these, my first thought was that there was way too much stuff in this unit and that there was no way I would hand this entire list out to a gaggle of middle-school students.  This is exactly why you need to deconstruct the PEs before finalizing your bundled units.  You never really know how much stuff there is in a PE until you deconstruct it…which gives you a better sense of what and how much to bundle together.  If I were still in the classroom, I might do each of the sections (Water, Weather, and Climate) as separate mini-units, then use MS-ESS3-3 and MS-ESS3-5 to assess students’ upper-level/strategic thinking skills in a separate unit.  Or, if I wanted a more problem-based/inquiry approach, I could start with MS-ESS3-3 and MS-ESS3-5 and let students discover the content in the course of searching for viable solutions and asking clarifying questions.

But I digress – what about assessing these I can statements?  Let’s take a look at the I can statements under the topic of “Weather,” which were deconstructed from PE MS-ESS2-5:

  1. I can tell the difference between weather and climate.
  2. I can explain and tell the difference between the different components of weather: temperature, humidity, pressure, precipitation, and wind.
  3. I can determine the relationships between density of air masses and the different components of weather.
  4. I can predict what weather will occur when two air masses interact and explain why it will occur.
  5. I can collect and analyze data from maps and scientific experiments I design in order to predict weather events.

Before we get started, let’s remember that the purpose of any assessment is to draw valid inferences regarding what students know, understand, and are able to do.  In order to make those valid inferences, we need a clear picture of the evidence that will allow us, as instructors, to see whether students know, understand, and can do the things the I can statements require.  How do we get that clear picture?  By looking at the verbs we used in our I can statements, of course.

Those verbs at the start of the I can statements determine what assessment questions or activities we will have students do.  If you tell students they should be able to tell the difference between two things, then that is what they should do on the assessment.  The same goes for explaining, predicting, collecting, and analyzing – if those are the verbs you use, then that’s what students should be doing.  Below you can see an outline (some call it an “assessment plan”) of what the assessment would look like for these I can statements.

[Image: assessment plan for the Weather I can statements]
Using the plan above, an assessment can be written that determines ahead of time what mastery of the I can statements looks like.  Please note that while basic understanding of concepts is assessed, what the verb says students should be able to do is always part of the assessment.  However, like I mentioned in my first post, giving those verbs to students makes things clearER, but not crystal clear.  For example, what does it really look like when students analyze something?  Predict?  Summarize?  Interpret?  Describe?  Explain?  The meaning of these verbs can vary from educator to educator, so you’ll have to make clear to students up front what those verbs look like so there’s little confusion.  When I was in the classroom, I usually discussed the list of I can statements with students on the first day of a unit, activating prior knowledge about concepts but also looking at the verbs and predicting what they would have to do on the summative assessment in order to show me that they had mastered the I can statements.  It didn’t take very long, but having students understand what the verbs meant paid a huge return on investment as they worked towards mastery during the unit itself.

One of the main errors I see when reviewing teacher-created assessments for alignment to objectives is that the I can statement itself is never actually assessed.  I see a lot of questions asking students to recall facts and concepts related to the I can statement, but students are never asked to actually do what the I can statement says they should be able to do with those facts and concepts.  If we want to make valid inferences regarding student mastery, we must assess the center of the target (the I can statement), not just the stuff around the target.

[Image: target diagram – assess the center of the target]

If we’re not assessing the center of the target, we do nothing but confuse students – why give them I can statements that they will never actually have to do?  The point is that the assessment is aligned to the I can statements deconstructed from the standards, so you can get an accurate picture of how well students have mastered those standards.  You cannot infer mastery of standards if the assessment was never designed to show mastery in the first place.

Part of this assessment misalignment issue comes from teachers still having to make the leap from assessing lots of bits of content stuff to assessing how well students can use that content stuff, and questions that ask students to really use concepts and facts are not questions teachers are traditionally used to writing.  But, from my own experience, the more you try to write those upper-level thinking questions, the better you get.  Often, getting teachers to realize that they can write perfectly valid assessment questions (and not have to rely solely on textbook-created test bank questions) is the first step in helping them write quality assessments aligned to standards.  (Want more information on how to write your own aligned assessments?  Click here.)

Now that the assessment is created and you have an idea of what mastery looks like for students, you can plan instruction that helps students practice and work towards that vision of mastery.  How to plan aligned instruction from your assessment will be explored in the next post.


Deconstructing the NGSS Part 2: Bundling PEs


In my last post, I briefly described how to deconstruct a high school life science NGSS standard in order to make clear learning targets (objectives) for students.  Deconstructing a standard is a necessary process for unraveling all of the stuff students have to know, understand, and be able to do, since those standards are often stuffed to the gills with skills and content.  Standards are also written for adults, not students, so we need to kidify them to the point that students can actually see what they are supposed to master.  Let’s look at another example from the NGSS, this time from the 4th-grade Earth & Space Science standards:

4-ESS2-2. Analyze and interpret data from maps to describe patterns of Earth’s features. [Clarification Statement: Maps can include topographic maps of Earth’s land and ocean floor, as well as maps of the locations of mountains, continental boundaries, volcanoes, and earthquakes.]

What I can do when I have mastered the standard:

  • I can use a physical, topographic, or other given map to describe what patterns I see in the features of Earth’s surface.
  • After describing patterns that I see in the features of Earth’s surface, I can explain why those patterns are there using what I know about tectonic plates.

What I need to know, understand, and be able to do before I can master the standard:

  • I can identify the following parts of the Earth from a diagram: crust, mantle, core.
  • I can create an analogy for each part of the Earth that I identify on a diagram that shows I understand what it is.
  • I can describe a tectonic plate using my own words.
  • I can draw how tectonic plates can move against each other.
  • I can summarize how tectonic plates are arranged after looking at a map of Earth’s tectonic plates.
  • I can read physical and topographic maps by summarizing what they are showing.

Just a reminder: notice that the PE was deconstructed first, and then the associated DCI was examined in order to figure out what students would need to know to master the standard.  Common sense was also used – it just makes sense that students can’t analyze a map if they don’t know how to read it first…hence the last objective.

So now you’ve deconstructed a standard…now what?  I know I said in my last post that we would talk about writing assessments from your kidified objectives, but I realized we first need to talk about something else: bundling PEs to make units, which is the next step after deconstruction.  Deconstruction must come first so you can more readily see connections between concepts and skills and pick the right PEs to bundle together into a unit – the ones whose concepts and skills make a natural fit.

One deconstructed performance expectation does not a unit make.  PEs were never designed to be assessed one at a time; they were meant to be bundled together into coherent units.  From my work with them in the classroom, I found that bundling 2-3 deconstructed PEs together made for a manageable unit, depending on the complexity of the content.  Also, be warned – just because two PEs are together on the same page in the same box doesn’t mean it always makes sense to bundle them.  For example, the PE below is in the same “Earth Systems” box as the PE we deconstructed above:

4-ESS2-1. Make observations and/or measurements to provide evidence of the effects of weathering or the rate of erosion by water, ice, wind, or vegetation.

While this PE definitely belongs in the category of Earth’s systems and how they interact, I’m not sure it makes sense to teach it alongside plate tectonics concepts.  I would rather pair the PE above with the engineering PE below:

4-ESS3-2. Generate and compare multiple solutions to reduce the impacts of natural Earth processes on humans.* [Clarification Statement: Examples of solutions could include designing an earthquake resistant building and improving monitoring of volcanic activity.] [Assessment Boundary: Assessment is limited to earthquakes, floods, tsunamis, and volcanic eruptions.]

This PE could be deconstructed to have students look more closely at what earthquakes, tsunamis, and volcanic eruptions are, how they are tied to plate tectonics, and the impact they have on humans when they occur.  Then students would be ready to generate and compare multiple solutions to the problems these natural Earth processes cause humans.

By the way, remember the PE 4-ESS2-1?  I would pair that PE regarding weathering and erosion with the PE below:

4-ESS1-1. Identify evidence from patterns in rock formations and fossils in rock layers to support an explanation for changes in a landscape over time.

The connection between 4-ESS1-1 and 4-ESS2-1 is all about changing landscapes, so those (to me, anyway) would make a much better fit for a unit.

Deconstructing the PEs to really see what students will have to know, understand, and be able to do reveals which PEs are truly connected, so you can make better bundles for units.  While I don’t think there’s one “right” way to bundle PEs, I do believe some ways are better than others – and unpacking standards first to reveal connections just makes sense.

Now that we know how to bundle PEs, the next step after making our units is to create the assessment for each unit using our deconstructed standards.  Stay tuned for assessment-creation goodness.