Assessment in Education #3 … an ongoing series on assessment
Have you ever wondered where multiple-choice testing originated? See this Edutopia blog for a snapshot of the history of multiple choice. Over the years multiple-choice tests have been criticized, yet they are still given today.
Why? Because multiple-choice questions offer several advantages.
Many people feel that multiple-choice questions are oversimplified and do not reflect the shades of gray students are likely to encounter in the workplace. Yet K–12 multiple-choice assessments in English Language Arts (ELA), for example, are an excellent means of rapidly assessing higher-order thinking skills. In a well-crafted item, students can be asked to:
determine an author’s point of view or purpose
identify an argument or claim and the relevant supporting evidence
determine the meaning of unfamiliar vocabulary words through context clues
recognize a theme or central idea and analyze how it is developed through details in a text
Let’s take a closer look at what makes a good multiple-choice item in ELA.
Assessment in Education #2 … an ongoing series on assessment
What do we mean when we talk about assessments in education? Testing can provoke a lot of anxiety for students, parents, and teachers alike, so a close consideration of the goals of assessment is essential.
As we discussed last week, the billion-dollar assessment market is in flux. While this situation creates uncertainty, it also affords test makers, school systems, and other stakeholders a valuable opportunity to rethink the goals and the design of assessments. As a result, in the future we may see more models for assessments rooted in new educational philosophies. Although no assessment format is perfect, a few key models seem to endure.
Formative assessments offer instant information about a student’s educational progress. The goal of a formative assessment isn’t usually to gauge the efficacy of a teacher or a school, but rather to shed light on where students stand within a particular lesson, unit, or course. Think of these as the “present tense” of assessment: a snapshot of a student’s learning.
The International Literacy Association (ILA) recently issued its “What’s Hot and What’s Important” report. The report highlights the hot topics the country is talking about and, just as telling, what the educational community considers important. Here are some reflections on the report.
The number one hot topic in the country is assessment/standards. Given the polarization of politics, this is not surprising. The world of assessment is in flux now, with the educational community waiting to see what will happen. Will there be more pushback against the Common Core State Standards? And if so, how will it affect assessments aligned directly to the Common Core? Will states receive more direct funding and create their own assessments? Will assessment companies have to revise their current tests? And, most important, when will the world of assessment settle down?
We think the pushback against the Common Core will continue. This may be an opportunity for assessment developers: if some states write their own standards, they will want revised assessments aligned to those standards. One thing is clear: assessment and standards will remain in flux for most of the year as the political battles continue.
At Victory, we have been developing many kinds of assessments. Whether it takes the form of high-stakes summative testing, a performance-based task, or formative student self-assessment, assessment has a huge impact on classroom instruction. This makes assessment literacy a critical tool for teachers as they develop curriculum and apply classroom strategies.
What Is Assessment Literacy?
What does assessment literacy mean? It may help to consider other types of literacy. Science literacy, for example, means being prepared to understand and discuss science issues, especially as they impact social and political decisions. Visual literacy involves understanding how people use visual information, and choosing a visual approach that supports your goals. Digital literacy is the ability to use technology tools, and choose which tool is most appropriate for a given purpose. In the same way, assessment literacy is the ability to understand the purpose of each type of assessment and then use this knowledge to make better assessment decisions.
In our experience, these are the 5 keys to assessment literacy:
5 Keys to Assessment Literacy
Understanding different kinds of assessments and their purposes
Recognizing the best assessment to use for a given situation
Knowing how to prepare students for each kind of assessment
Knowing how to implement each kind of assessment
Getting comfortable with interpreting assessment data
Recently, we premiered our digital lesson on the Boston Massacre at the ISTE and ILA conferences. The lesson was a big hit. It inspired many discussions with technology coordinators and educators on what makes a lesson good for digital literacy. The table below summarizes what we learned, and the video that follows gives concrete examples of how the 5 keys to digital literacy are executed in the Boston Massacre lesson.
5 Keys to Digital Literacy
Make sure the lesson has a beginning, a middle, and an end.
Each interactive should build on the previous one so that students gain practice and automaticity in skills and strategies.
Processes for working through a digital lesson need to be consistent.
Cross-curricular activities encourage students to employ skills and strategies from other disciplines in new ways.
Make sure students are gathering data, analyzing it, and applying 21st-century skills.
In this blog, we’ll explain why we expanded the performance task into an interactive lesson with embedded performance tasks. So this is really an evolution, not a revolution.
Here is a sneak preview of the lesson:
What Was Missing in the Original Task?
Our original Boston Massacre performance task was unique in several respects:
It developed critical thinking through the analysis and comparison of key characters.
Students evaluated multiple causes and effects to rank the importance of earlier events that led up to the key event.
Students needed to do a close reading to find evidence to support their arguments.
But, a performance task is rarely used in isolation. It is either part of a summative assessment or used formatively in a lesson. For a lesson, providing a stand-alone performance task requires the teacher to do a lot of work before it can be effective. The teacher has to decide:
whether it supports the lesson objectives,
when to use it in a lesson,
how to provide support if students struggle, and
how to use the scores.
Even the world’s best performance task won’t help students if teachers won’t use it!
In our first blog on this topic, we began to reveal the secret behind the magnetic allure of games, simulations, and other online performance-based digital learning experiences. We showed you a well-aligned, well-designed simulation-based science performance task.
In this blog, we’ll show you how the design of a digital performance task directly influences the richness of the data we can gather on student learning. This time, we’ll use a social studies task we developed:
Behind the Scenes: Designing the Boston Massacre Performance Task
We designed this performance task to give students opportunities to visually, graphically, and dynamically analyze text, make inferences based on evidence, and synthesize their understanding.
Why? Let’s examine the standards. The C3 Framework, the CCSS, and other state and national efforts to align learning expectations to 21st-century workforce demands all emphasize critical analysis and the use of evidence from text.
We have become proficient at developing performance tasks closely aligned to NGSS (Next Generation Science Standards). Of course, a good performance task aligns to standards across multiple disciplines. The following task was developed for middle grades and for these learning goals.
Please watch the video and then try the performance task. We’d love to hear your feedback!
This is the third in our series of posts about the changing landscape in education.
In the video below, Victory’s staff discuss what’s next in assessment. We’ve added a few more thoughts below. Feel free to use the comments to join in the conversation.
A Variation on “The Chicken or the Egg?”
Which comes first, change in curriculum or change in assessment? People often see standards as a driver of change, and certainly both assessment and curriculum are affected by standards reform. But, as with so many things, you don’t really know what assessment is (or can be) until you see it in action.