We’re All Learners

I think most everyone has seen this graphic before:


Photo credit: oklanica via Flickr

I have been far outside of my comfort zone for the past two days.  This former high school teacher has been attending a local conference concerning Eureka Math, which we have been implementing K-5 for the past 2 years.  I thought it best to attend this three-day mathstravaganza since I am asking teachers to implement this curriculum in their classrooms…and I am the Curriculum Director, after all; I should be here.

The first day I attended a 5th grade module study (did I mention I taught high school science for 18 years??).  I spent the day doing 5th grade math, which, in Eureka, looks a lot like this:

I was introduced to the world of place value and all its chip diagrams, vertical number lines, bundling and unbundling, pictorial representations, and standard algorithms.  I was completely out of my depth (after all, we high school teachers expect our students to be pre-loaded with this information when they get to us), an imposter among the real 5th-grade teaching experts in the room.  The teachers around me took pity on me after I confessed I was a Curriculum Director with a high school background, whispering definitions of things like “decimal fractions” and “number sentences” and explaining the basics of a Eureka lesson while the wonderful facilitator at the front of the room demonstrated problems and answered questions.

Boy, did I feel stupid.  But I sure did learn a lot.  I learned that this curriculum has a level of focus and coherence that mirrors that of the CCSS math standards, with concepts and skills building and spiraling, containing absolutely no fluff.  I learned that, even though this is a pre-packaged curriculum, you must still prioritize concepts and activities to maximize the time you have with students.  I learned that the main intent of Eureka Math is to have students understand the math using the mathematical practices rather than the usual emphasis on finding right answers as fast as you can.  I also learned that people who don’t look at the big picture of how Eureka Math is set up often focus on one piece of it that they don’t like and use that to vilify it (hence all those posts on the internet ranting about the “ridiculous” ways students are told to “do” math problems these days).

I truly believe that administrators are learners too.  Sometimes people feel that administrators should have all the answers and know everything…but how can that be possible?  No one has all the answers.  No one knows everything.  And, just like the shift Eureka takes with its focus on student understanding rather than correctness, I think administrators have to cultivate an emphasis on growth and improvement rather than “rightness.”  And it means making ourselves uncomfortable at times, being willing to learn along with our teachers.

When we realize that we’re all learners, that administrators and teachers should be learning together because we’re all in this business of improving student learning together…that’s when the magic happens.

Deconstructing the NGSS Part 4: Aligned Instruction


In Part 3 of this series on deconstructing the NGSS, we looked at how to align assessments to our deconstructed learning targets.  When aligning assessments to targets, the main rule you need to remember is this:  The verb of the learning target should indicate to students what they need to do on the assessment to show mastery.  And it’s that assessment that reveals what mastery looks like for students.

If the assessment is what mastery looks like, then we need to plan instruction to get students there.  In other words, our instruction should be aligned to what mastery looks like on the assessment.

For example, let’s take a look at the assessment plan presented in the last post again:

 

Take a look at the first objective or learning target in the list: “I can tell the difference between weather and climate.”  Do students have to internalize the definitions of “weather” and “climate” in order to master this objective?  Yes.  But they also need to go one step further and actually state a difference – and the difference cannot be merely the definitions of those two terms.  Students need to examine the definitions, extract a difference from them, and state it, rather than just parroting back definitions they copied out of a text or off the internet.  A true difference reveals understanding, not just the capability of rote memorization.  So, on the test, students would encounter a differences chart such as the one below:

So how should your instruction help students master this objective (which looks really simple, but actually requires some thinking)?  If I were back in the classroom, here’s how I’d go about it:

  1. Have students look up the text/internet definitions of weather and climate and write them out.
  2. In pairs, have them discuss what those definitions are/what they mean in their own words and write the “student-translated” definition down in their notes.  Have the teacher confirm that the translation is acceptable before moving on.
  3. On their own, students complete a Frayer model diagram for each term.  Students should share their Frayer model diagrams with at least two other students and the teacher before moving on.
  4. On their own, have students extract a difference from their student-translated definitions and Frayer models.  They will write it down in a chart just like the one pictured above and then, underneath the chart, compile that difference into a well-thought-out, logical, beautifully constructed sentence that will bring the teacher to her knees with joy.  These charts and sentences will be teacher reviewed for feedback.

Please note two things about the instructional plan above.  First, the teacher was only involved in setting up the learning activities ahead of time and giving feedback during the activities – in every step, the students were responsible for doing the work of learning.  (Also note that the “answer” to the differences chart was never gone over as a whole group, to avoid students writing down something that wasn’t their own thinking to begin with.)  Second, the students were practicing the thinking they would need to master the objective, not just the stuff that would help them master the objective.  Too often as teachers we think if we give students the content stuff they will just know how to magically put it all together…and they don’t.  We have to give them practice at not only learning the stuff, but also learning how to think with and use the stuff.

Now, I know what some people are probably thinking: “You gave them the question before they took the test!  Of course they’re going to do well if you make it easy for them like that!”  Well, I don’t know if beginning with the end in mind is equivalent to “making it easy.”  If you just hand out test questions as your instruction and expect kids to get the right answers and nothing else, then sure – it’s really easy!  However, the learning activities that are geared towards having students practice thinking certainly aren’t that easy for students, in my opinion, especially when you have students who are products of an educational system where quick right answers are more valued in class and on assessments than patient problem-solving.  But also don’t forget that students can only master a target they can see; if we keep what mastery looks like from them by never giving them the end-goal for mastery, then we’re just setting them up for failure rather than success.

(Also note that, as far as multiple choice questions are concerned, you should definitely NOT give students the same questions during instruction as on the test, because they will simply memorize the correct answers and you will no longer be able to draw valid inferences about what they have actually mastered.  Questions assessing similar content, concepts, and skills should be given during instruction and on formatives, but they should not be identical to the questions on the summative assessment.)

Bottom line, your instruction needs to be planned in a way that aligns to what mastery looks like on your assessment.  However, students not only have to practice with the content/conceptual stuff they’ll need to master, but also the thinking they will have to do with it.  To me, the “thinking practice” is much more important than any science stuff I ever taught students.  Why?  Because I remember hearing once that, within six months of graduating from high school, students forget about half of what they learned because they simply don’t use it or don’t need it for what they are doing with their lives.

But will all students need the ability to think, no matter where their lives take them? Absolutely.

 

 

 

Deconstructing the NGSS Part 3: Writing Aligned Assessments

In Part 1, we took a look at how to deconstruct an NGSS performance expectation (PE) into student-friendly objectives, and in Part 2 we examined how to bundle PEs together to make coherent units (and it’s not just teaching the ones that are grouped together in the standards).  Now we can really do some backwards design and create an assessment that is aligned to our deconstructed standards, so we have a clear picture of mastery before we do any instruction.

I took the liberty of bundling and deconstructing some middle school earth science PEs.  A list of those PEs and the deconstructed standards (I can statements) derived from them can be seen below. (Or click here to access the document.)

(Please note that any time you first write a set of I can statements, they should be considered a draft until you actually teach with them, and should be reviewed each year for changes to make them better.  This is where the art of teaching comes in – you write the I cans with as much foresight as you can, but you never really know how well they will work with students until you actually use them with students!  Don’t be afraid to make minor tweaks as you move through a unit, either – we’re all co-learners in this process.)

Now, if you take a peek at the document above, you’ll find that it is chock-full of all sorts of earth sciencey goodness.  In fact, when I compiled these, my first thought was that there was way too much stuff in this unit and that there would be no way I would hand this entire list out to a gaggle of middle-school students.  This is exactly why you need to deconstruct the PEs before finalizing your bundled units.  You never really know how much stuff there is in a PE until you deconstruct it, which gives you a better sense of what and how much to bundle together.  If I were still in the classroom, I might do each of the sections (Water, Weather, and Climate) as separate mini-units, then use MS-ESS3-3 and 3-5 to assess students’ upper-level/strategic thinking skills in a separate unit.  Or, if I wanted to do a more problem-based/inquiry approach, I could start with MS-ESS3-3 and 3-5 and let students discover the content in the course of searching for viable solutions and asking clarifying questions.

But I digress – what about assessing these I can statements?  Let’s take a look at the I can statements under the topic of “Weather,” which were deconstructed from PE MS-ESS2-5:

  1. I can tell the difference between weather and climate.
  2. I can explain and tell the difference between the different components of weather: temperature, humidity, pressure, precipitation, and wind.
  3. I can determine the relationships between density of air masses and the different components of weather.
  4. I can predict what weather will occur when two air masses interact and explain why it will occur.
  5. I can collect and analyze data from maps and scientific experiments I design in order to predict weather events.

Before we get started, let’s remember that the purpose of any assessment is to draw valid inferences regarding what students know, understand, and are able to do.  In order to make those valid inferences, we need a clear picture of the evidence that will allow us, as instructors, to see whether students know, understand, and can do what the I can statements ask of them.  How do we get that clear picture?  By looking at the verbs we used in our I can statements, of course.

Those verbs at the start of the I can statements will determine what assessment questions or activities we will have students do.  If you tell students they should be able to tell the difference between something, they should do that on the assessment.  Same goes with explaining, predicting, collecting, and analyzing – if those are the verbs you use, then that’s what students should be doing.  Below you can see an outline (some call it an “assessment plan”) of what the assessment would look like for the I can statements.


Using the plan above, an assessment can be written that determines what mastery of the I can statements looks like ahead of time.  Please note that while basic understanding of concepts is assessed, what the verb says students should be able to do is always a part of the assessment.  However, like I mentioned in my first post, giving those verbs to students makes it clearER to students, but not crystal clear.  For example, what does it really look like when students analyze something?  Predict?  Summarize?  Interpret?  Describe?  Explain?  The meaning of these can vary from educator to educator, so you’ll have to make it clear to students up front what those verbs look like so there’s little confusion.  When I was in the classroom, I usually discussed the lists of I can statements with students on the first day of a unit, activating prior knowledge about concepts but also looking at verbs and predicting what they would have to do on the summative assessments in order to show me that they had mastered the I can statements.  It didn’t take very long, but having students understand what the verbs meant had a huge return on investment when they were working towards mastery during the unit itself.

One of the main errors I usually see when reviewing teacher-created assessments for alignment to objectives is that the I can statement itself is never actually assessed.  I see a lot of questions asking students to recall facts and concepts related to the I can statement, but students are never asked to actually do what the I can statement says they should be able to do with those facts and concepts.  If we want to make valid inferences regarding student mastery, we must assess the center of the target (the I can statement), not just the stuff around the target.


If we’re not assessing the center of the target, we do nothing but confuse students – why give them I can statements that they will never actually have to do?  The point is that the assessment is aligned to the I can statements deconstructed from the standards so you can get an accurate picture of how well students have mastered the standards.  You cannot infer mastery of standards if the assessment was never designed to show mastery in the first place.

Part of this assessment misalignment issue comes from teachers still having to make the leap from assessing lots of bits of content stuff to assessing how well students can use that content stuff, and questions that ask students to really use concepts and facts are not questions teachers are traditionally used to writing.  But, from my own experience, the more you try to write those upper-level thinking questions, the better you get.  Often the first step in helping teachers write quality assessments aligned to standards is getting them to realize that they can write perfectly valid assessment questions themselves (and not have to rely solely on textbook-created test bank questions).  (Want more information on how to write your own aligned assessments?  Click here.)

Now that the assessment is created and you have an idea of what mastery looks like for students, you can plan instruction that helps students practice and work towards that vision of mastery.  How to plan aligned instruction from your assessment will be explored in our next post.

 

For improvement, more practice-based evidence.

I know I promised everyone Part 3 of my Deconstructing the NGSS series regarding creating assessments (which is coming soon, I promise), but I was perusing my feed reader this morning and I came across a post by Larry Ferlazzo that contained a quote that really hits home with me, and I couldn’t help blogging about it.  The quote comes from an article by Anthony Bryk titled Accelerating How We Learn to Improve, where he coins the phrase “practice-based evidence”:

“The choice of words practice-based evidence is deliberate. We aim to signal a key difference in the relationship between inquiry and improvement as compared to that typically assumed in the more commonly used expression evidence-based practice. Implicit in the latter is that evidence of efficacy exists somewhere outside of local practice and practitioners should simply implement these evidence-based practices. Improvement research, in contrast, is an ongoing, local learning activity.”

This quote conjured up a few questions in my mind after reading it:

  • How many schools and districts have been held back from real improvement simply because they won’t deviate from what’s sold to them as research/evidence-based (often coming to schools in the form of slick pre-packaged kits of materials from publishing companies that are touted to help all students achieve but are aimed so squarely at the middle that they actually help relatively few students)?
  • How many teachers have been held back from helping their students improve because they have been told to implement a research-based curriculum/intervention/strategy that isn’t targeted to their students’ learning needs, to the exclusion of other less-researched strategies or curricula?
  • Why do we discourage rather than encourage teachers to gather practice-based evidence, thereby stunting their growth as learners in their own right?
  • How does this encourage schools and districts to be the blueprint, not the copy?  (Spoiler alert: It doesn’t.)  Or does it just encourage educators to work hard in the wrong direction with the wrong tools, with little to no improvement in their practice or student learning?

 

Real improvement comes from within.  We need to encourage teachers to seek out practice-based evidence in their own classrooms in order to help students, improve their practice, and pass on that learning to others around them.  Our students can’t wait for an outside entity to deem something “classroom worthy.”