Curiosity may have killed the cat, but it may just secure an accurate ATLAS Reading score too

By Sarah Stratton


What does curiosity have to do with the ATLAS Reading test? It may well be the key to getting accurate test results from our students.


High-stakes tests tend to produce accurate results. Why? Students have to pass them to graduate. They have a stake in the test. The test matters to them. It impacts their lives. Their future.


Having spent close to 20 years teaching high school English in upstate NY, charged with getting my students through their high-stakes English Regents Exam, I personally know the difference between preparing students for a state exam and preparing students for a high-stakes state exam: student motivation. Yes, some students will try their hardest on their state exams regardless of whether they are high stakes or not, especially the younger students. However, once students realize or find out that it doesn't matter how they do (even when threatened with remedial classes), they won't try hard. Unfortunately, reading tests are hard; they require close reading and intense concentration.


Just last year, I was reading some passages on a practice test and found myself wanting to skim. To be even more honest, I didn't want to read any of it. It was boring. It was long. I didn't really have to read it. It dawned on me: this is how students feel. When you don't have to do something, your brain knows it. It wants to slack.


My students in NY didn’t care that the text was boring. Or hard. Or long. They had to pass the test to graduate. That was enough, especially since they were well-prepared going into the exam.


Even though our students may be well-prepared going into the ATLAS exam, they need the motivation to struggle through the boring and the hard and the long. Curiosity just may be the key.


Before I get into discussing the curiosity factor, I want to acknowledge that other factors impact student motivation. Having a relationship with your students is huge. Students are definitely more apt to try hard on their state exams when they know their teachers care about them and work hard for them. When I taught in 2020, I asked my students how many of them had tried their hardest on their ACT Aspire Reading test. Dishearteningly, I didn't see as many hands as I wanted to see. I asked them, if they wouldn't try hard for themselves, would they please try hard for me, knowing that I worked hard for them and cared about them. One of my lowest students grew from a 409 (In Need of Support) to a 419 (Close). Yes, he grew as a reader that year, but he also tried his hardest. That matters.


In 2022, I created a quick assessment rubric using the ACT Aspire language, designed to assess an answer that requires analysis. When I was teaching in 2020 (after nearly 10 years out of the classroom), I required writing on all of my assessments and did not want to give zeros for written responses. One thing I did know was that struggling students would have no motivation to improve if they were always getting zeros on their incorrect answers. So I invented a simple 1, 2, 3, 4 grading system: a 1 was NO CLUE, a 2 was I KIND OF UNDERSTAND, a 3 was I HAVE IT, and a 4 was I UNDERSTAND THIS DEEPLY. At the time, I used this grading system only to avoid the zero trap; I didn't think of it as a window into a student's thinking and reading ability. When I created the quick assessment rubric, I had an epiphany: if I had created it back in 2020, I could have predicted my student's growth. His responses went from 1s to 2s and 3s. By the end of the school year, his responses demonstrated he was CLOSE.


Why am I sharing this with you? Two weeks ago, I came across my quick assessment rubric and wondered why I wasn't using it. My students had just taken a quiz on a text they had read themselves, so I made copies of the rubric and picked one question from the quiz to analyze with it. The students were instructed to use the CLAIM, EVIDENCE, COMMENTARY format to answer their questions; they had to write 3-5 or more sentences.


When I passed back the quizzes, I conferenced with each of my students and told them what I had seen in their responses and what led me to each score. I did tell them that I hadn't warned them about this check, so they may well have tried harder, and scored higher, if they had known. (They were curious.) I scored some responses as 1/2 or 2/3 because they were dangling between two categories; I told those students they could bump up easily by being more consistent in their responses. I had some 4 responses as well. It was fun to tell students that their responses were a 4. I succeeded in piquing their curiosity. They wanted to know how each other did, how to bump their scores up, and when I was going to check them again. When the assistant principal came in to get something from my classroom during the conferences, one of my students bragged to him that her response had been a 4. She was excited.


What I emphasized to my students was that the quick assessment rubric was a window into their thinking, and their thinking reflected their reading comprehension.


What am I trying to accomplish? First and foremost, I’m trying to get my students invested in reading closely, responding thoughtfully, and demonstrating their learning accurately. 


Then, to seal the deal, as we get closer to the test I'm going to remind students that if they try their hardest, their test scores should match the thinking reflected in their quick assessment rubric scores. I want them to want to do their best even when the texts on the exam are boring, or hard, or long. I'm hoping that causing CURIOSITY CHAOS produces more accurate results.

© Sarah Stratton 2025. All Rights Reserved.
