Victory’s spinoff metacog has been busy adding new features. When companies look to incorporate metacog into their digital products, they want to know two things:
How does metacog work?
What can metacog help me do now that I couldn’t do before?
The answers to both questions lie in our unique approach to guided deep learning: machine learning steered by an understanding of real student outcomes.
In education, deep learning is different from deeper learning, which is a pedagogical approach to instruction. In the world of Big Data, deep learning is an artificial intelligence (AI) approach that builds neural networks out of many simple interconnected “neurons,” each performing a small computation. This layered, parallel structure loosely mimics how the human brain processes information.
Deep learning can be very effective, but it has a drawback: neural networks are so complex that even their designers often cannot explain how they arrive at particular decisions.
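As a concrete (if drastically simplified) illustration, here is a toy two-layer network in Python. The weights below are made-up values chosen for this sketch; a real network would learn them from data:

```python
import math

def forward(x, w1, b1, w2, b2):
    # Hidden layer: each "neuron" is just a weighted sum of the inputs
    # passed through a nonlinearity (tanh here).
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w1, b1)]
    # Output layer: another weighted sum over the hidden neurons.
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# Hand-picked toy weights; training would normally set these.
w1 = [[0.5, -0.2], [0.8, 0.3]]
b1 = [0.1, -0.1]
w2 = [0.7, -0.4]
b2 = 0.05

print(round(forward([1.0, 2.0], w1, b1, w2, b2), 3))
```

Even in this two-neuron example, the output emerges from interleaved weighted sums and nonlinearities; scale that up to millions of neurons and it becomes clear why tracing a network’s decision back to its inputs is so hard.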
The educational market is in flux. States are pushing back against both the Common Core State Standards (CCSS) and assessments linked to the CCSS. States and publishers are waiting to see:
Will funding be directed to charter schools?
How many more states will drop out of the CCSS?
Will states want summative, formative, or competency-based tests?
How will products align to changing state standards?
What products should states, districts, and publishers develop to meet current market needs?
Many states are moving to create their own standards. How will these new standards affect the educational market? What steps must states and publishers take?
All the uncertainty in the market calls for gap analyses. A gap analysis identifies how current products are aligned to new standards, which standards still correlate, and what’s missing—gaps where new standards are not well covered.
Publishers need to ensure that their products and assessments readily address the changing needs of states and districts.
States and districts need to know how their new standards align to older standards. Because most states adopted the CCSS, new standards are usually analyzed by comparing them to the CCSS.
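At its core, a gap analysis like the one described above is a set comparison: which new standards does the product already cover, and which are missing? The sketch below illustrates the idea in Python; the lesson names and standard codes are hypothetical placeholders, not real alignments:

```python
# Hypothetical mapping from a product's lessons to the standards each covers.
product_alignment = {
    "Lesson 1": {"NEW.MATH.1", "NEW.MATH.2"},
    "Lesson 2": {"NEW.MATH.2", "NEW.MATH.4"},
}

# Hypothetical new state standards the product must address.
new_standards = {"NEW.MATH.1", "NEW.MATH.2", "NEW.MATH.3", "NEW.MATH.4"}

covered = set().union(*product_alignment.values())
gaps = new_standards - covered    # standards no lesson covers yet
extras = covered - new_standards  # coverage that no longer correlates

print(sorted(gaps))
print(sorted(extras))
```

A real gap analysis adds human judgment about partial alignment and depth of coverage, but the set logic above captures the basic bookkeeping.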
World Languages & Education #1 … an ongoing series
As we discussed in recent posts, the assessment market is in flux. But this is nothing new. The passage of No Child Left Behind (NCLB) in 2002 disrupted the market, and for some companies this turned out to be a boon, as spending on state-level assessments nearly tripled in the next 6 years. As you can see from this graph, state-level assessment spending has decreased since 2008, while classroom assessment spending has continued to grow.
Just as the change in 2002 represented an opportunity for many companies, the shifts we see now may also have a silver lining. One area in particular, Spanish assessments, may see continued growth, especially in the classroom market. Why? Regardless of other shifts that may occur, students whose first language is Spanish are by far the largest population of English Language Learners (ELLs) in the United States, at 71%, according to the Migration Policy Institute.
At Victory, we have been developing many kinds of assessments. Whether the assessment is high-stakes summative testing, a performance-based task, or formative student self-assessment, assessment has a huge impact on classroom instruction. This means assessment literacy is a critical tool for teachers as they develop curriculum and apply classroom strategies.
What Is Assessment Literacy?
What does assessment literacy mean? It may help to consider other types of literacy. Science literacy, for example, means being prepared to understand and discuss science issues, especially as they impact social and political decisions. Visual literacy involves understanding how people use visual information, and choosing a visual approach that supports your goals. Digital literacy is the ability to use technology tools, and choose which tool is most appropriate for a given purpose. In the same way, assessment literacy is the ability to understand the purpose of each type of assessment and then use this knowledge to make better assessment decisions.
In our experience, there are five keys to assessment literacy:
5 Keys to Assessment Literacy
Understanding different kinds of assessments and their purposes
Recognizing the best assessment to use for a given situation
Knowing how to prepare students for each kind of assessment
Knowing how to implement each kind of assessment
Getting comfortable with interpreting assessment data
In this blog, we’ll explain why we expanded the performance task into an interactive lesson with embedded performance tasks. This is really an evolution, not a revolution.
Here is a sneak preview of the lesson:
What Was Missing in the Original Task?
Our original Boston Massacre performance task was unique in several regards:
It developed critical thinking through the analysis and comparison of key characters.
Students evaluated multiple causes and effects to rank the importance of earlier events that led up to the key event.
Students needed to do a close reading to find evidence to support their arguments.
But a performance task is rarely used in isolation. It is either part of a summative assessment or used formatively within a lesson. Providing a stand-alone performance task for a lesson requires the teacher to do a lot of preparation before it can be effective. The teacher has to decide:
whether it supports the lesson objectives,
when to use it in a lesson,
how to provide support if students struggle, and
how to use the scores.
Even the world’s best performance task won’t help students if teachers won’t use it!
In our recent blog post, Instructional Design 101, we provided an overview of several popular instructional design models. One of these, the original ADDIE model, was a linear approach with some iterative features. It evolved to be more cyclical, and spawned many other models. In similar fashion, our linear workflows at Victory have evolved to keep up with rapid changes in our industry.
Watch this video for a quick look at Victory’s vendor and partnership processes. Many projects do not require a partnership process; we originally used it to develop digital products, but it has many benefits for complex print products as well.
The video also references backward design, which we first blogged about in Talking to the Test: The Learning Continuum. In backward design, the initial development focuses on assessments because they determine what evidence we will accept as proof of mastery of the associated learning objectives. Again, not every project warrants a backward-design approach. It makes the most sense for subjects with open-ended user experiences that are hard to assess and hard to teach. We have found that if most of the assessment is traditional, then a traditional development process generally will also be sufficient.
In our first blog on this topic, we began to reveal the secret behind the magnetic allure of games, simulations, and other online performance-based digital learning experiences. We showed you a well-aligned, well-designed simulation-based science performance task.
In this blog, we’ll show you how the design of a digital performance task directly influences the richness of the data we can gather on student learning. This time, we’ll use a social studies task we developed:
Behind the Scenes: Designing the Boston Massacre Performance Task
We designed this performance task to give students opportunities to visually, graphically, and dynamically analyze text, make inferences based on evidence, and synthesize their understanding.
Why? Let’s examine the standards. The C3 framework, the CCSS, and other state and national efforts to align learning expectations with 21st-century workforce demands emphasize critical analysis and the use of evidence from text.
We have become proficient at developing performance tasks closely aligned to NGSS (Next Generation Science Standards). Of course, a good performance task aligns to standards across multiple disciplines. The following task was developed for middle grades and for these learning goals.
Please watch the video and then try the performance task. We’d love to hear your feedback!
You have been asked to make the jump safe. The video below explains how to set up a simulation to investigate.
Click “Playground” in the PhET® simulation below and set up a jump as shown in the video. Remember to set friction to zero and always release the skateboarder from a height of 5 meters.
Then modify the setup to make the jump safe, where “safe” is defined as converting less than 1/4 of the total energy into thermal energy.
Use your observations of the skateboarder’s motion to explain why reducing thermal energy transfer reduces the risk of injury.
The Skate Park simulation was developed by PhET.
PhET Interactive Simulations, University of Colorado Boulder, http://phet.colorado.edu.
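For readers who want to check the “safe” criterion by hand, the arithmetic behind the task can be sketched in a few lines of Python. The skater’s mass below is an assumed value for illustration only; the simulation reports the energy quantities directly:

```python
# Assumed values for illustration; the PhET simulation displays these directly.
g = 9.8          # gravitational acceleration, m/s^2
mass = 60.0      # hypothetical skateboarder mass, kg
h_release = 5.0  # release height from the video's setup, m

# With zero initial speed, all of the energy starts as gravitational
# potential energy: E = m * g * h.
total_energy = mass * g * h_release

def jump_is_safe(thermal_energy):
    """'Safe' per the task: less than 1/4 of total energy became thermal."""
    return thermal_energy < total_energy / 4

print(total_energy)         # total mechanical energy at release, in joules
print(jump_is_safe(600.0))  # well under the 1/4 threshold
print(jump_is_safe(900.0))  # over the threshold
```

Because total energy is fixed by the release height, every joule that friction converts to thermal energy is a joule unavailable as kinetic or potential energy, which is the trade-off students explore when they modify the setup.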
This is the third in our series of posts about the changing landscape in education.
In the video below, Victory’s staff discuss what’s next in assessment. We’ve added a few more thoughts below. Feel free to use the comments to join in the conversation.
A Variation on “The Chicken or the Egg?”
Which comes first, change in curriculum or change in assessment? Often people see standards as a driver of change, and certainly both assessment and curriculum are affected by standards reform. But as it is with so many things, you don’t really know what it is (or can be) until you see it in action.