Even before remote learning, I had been considering different ways of assessing students in my Integrated Math 2 and Introduction to Statistics courses. I felt (and students agreed) that they were not usually successful on more traditional multiple choice exams and that those exams were not a good measure of what they understood. With everything being upended by the pandemic, I felt this was the perfect time to upend what I was doing in my classes, too. And because my research team is investigating how implementing different literacy strategies supports mathematical content understanding, I wanted to see how I could incorporate this into our assessments.
With the support of a colleague, an idea emerged. Students would be asked to create a portfolio on the essential content from the quarter and provide evidence of their understanding; I wanted to see if they had mastered each objective. I chose the objectives based on the enduring understandings of the course – what are the 10 things students should be able to explain without looking at their notes (influenced by the Ask Good Questions blog post here). I gave students a lot of latitude in how to provide their evidence. They could use examples from previous class assignments or provide new explanations. They could type their responses or upload screen recordings of their explanations. They could select whatever work they felt showed mastery. Doing an assessment this way meant that students had a choice in how to demonstrate their understanding, and it also made it possible to assess the depth of their knowledge of a skill or standard (something that is challenging when I can only assess a standard once on a multiple choice test).
I have been focusing on academic discourse in my classes, assessing whether students are using academic language appropriately and writing explanations. The examples below, from my statistics class, are from quarter 1. I felt that many of the students showed understanding but did not do a great job of putting together coherent explanations of the topic. I focused more intentionally on this during quarter 2 in both classes. We built exemplar responses and annotated them for claim, evidence, and reasoning. We took student work examples and compared them, combining pieces we liked into one exemplar class response. Earning full credit on the rubric required a specific, detailed, and original response, because the idea was that if you are able to explain the topic, you have reached mastery (as opposed to being able to follow an example or use a formula, which demonstrates some understanding but not full mastery).
Here are the Directions for Statistics.
I was incredibly nervous, but the results blew me away for both Math 2 and Statistics. Quite a few of my students who struggled throughout the quarter did really well. Being able to see how much they knew about a topic was such valuable information, because I'm willing to bet that if some of them saw a standard deviation problem on an exam, they wouldn't even attempt it. This way, I was able to give at least some credit because I could see that they did know some of the material.
This is now something I have implemented in my Integrated Math 2 course, and other Math 1 and Math 2 teachers have done the same thing. We also felt we had similar successes here – students worked really hard on writing their explanations and it was much easier for us to see how deeply they understood the content. This is something that I hope will become a routine practice not only in my class, but in others too.
What would I do differently?
- Some students copied/pasted directly from previous assignments – this was allowed, but some students chose evidence that didn't answer the guiding question or was incomplete. This tells me that they could find the connecting lesson and had a general idea of what to look for, but had a shallow understanding of the content. Next time, I will discourage copying work from previous lessons unless new commentary is provided. This will push students to synthesize the information and succinctly share what they know about the math involved.
- As I was grading the portfolios, it was clear that some of the guiding directions weren't getting to the depth of content that I actually wanted. As originally written, a lot of the directions were along the lines of "define and give an example." For the next iteration, I want the general guidance to be "define, example, explain, interpret" and have that also connect to the rubric, because if a student is able to explain and interpret something, I think that means they definitely have a proficient understanding. For some topics, I may need to provide a little more guidance around what I want, but for the most part I think this structure will work well.
- Claim, evidence, reasoning worked for many students, but they really struggled with some of the objectives (the more straightforward, process-based questions in Math 2, for example). We had spent a lot of time looking at claim, evidence, and reasoning as three distinct sections, when in reality a response may sometimes look more like: claim, evidence, reasoning, evidence, reasoning, etc. As we spend more time working on our academic discourse (and learning English in general), I'm hoping this will improve!