QUANTITATIVE GROWTH

My students' quantitative growth is measured each year by the STAAR, the State of Texas Assessment of Academic Readiness. In this section, I include important background information about the STAAR, share and analyze student data, and reflect on whether or not the STAAR is an accurate test of how much students have learned. 

TABLE OF CONTENTS

STAAR Explanation and Context

Data Charts and Analysis

Teacher Reflection and References

 

STAAR EXPLANATION AND CONTEXT

In Texas public schools, quantitative academic growth is measured by the STAAR. The STAAR is a state standardized test that measures student mastery of the TEKS, the Texas learning standards. The STAAR is a static, pencil-and-paper exam. All students, regardless of whether they receive special education services, take the same exam. The STAAR Reading test is a multiple-choice exam that does not include any short answer or essay questions. 


The questions above come from the 2017 fifth grade STAAR Reading test. The STAAR often requires students to think beyond basic comprehension and make inferences about both the texts and the author's purpose in writing them. In addition, the questions are usually quite lengthy and include excerpts from the passages. 

In fifth grade, students are tested in math and reading in March. In 2017, the reading test consisted of 7 passages and 44 questions, 6 of which were field-test questions that were not scored. Students who do not pass the exam the first time are required to take it a second time and attend summer school; students who do not pass the second time are required to take it a third time at the end of summer school. Passing the STAAR is normally a requirement for advancing to sixth grade, although that requirement was waived this school year because of Hurricane Harvey. Most middle schools do not use this data in any significant way, except to identify which students may need additional academic intervention. Because teachers receive the first administration scores late in the school year (only one week before the second administration date), I use the STAAR data to identify which students will need to attend summer school.


My school district does not use any additional progress monitoring to assess student growth quantitatively; each school in the district has its own way of benchmarking students throughout the school year. At my campus, students take the most recent fifth grade STAAR test during the first week of school as a benchmark exam to see where they would fall without any exposure to the fifth grade learning standards. Students take another released STAAR test in January as a mid-year benchmark, and during the 2016-2017 school year the real STAAR was given in March. 


The STAAR classifies student performance into one of four categories:

  • mastering grade level (scoring 87% or above)

  • meeting grade level (scoring 78% or above)

  • approaching grade level (scoring 58% or above)

  • not meeting grade level (scoring 57% or below)

 

Students who approach, meet, or master grade level pass the STAAR test, while students who do not meet grade level fail. 
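These categories amount to a simple threshold check on the percentage of questions answered correctly. As a rough sketch of that rule (the function name is mine, the 38 scored items come from the 2017 fifth grade test described below, and the actual cutoffs are set by the state and can change from year to year):

```python
def staar_category(raw_score, scored_items=38):
    """Map a raw STAAR Reading score to its performance category.

    Uses the cutoff percentages listed above: 87% masters, 78% meets,
    58% approaches; anything below 58% does not meet grade level.
    scored_items defaults to 38, the number of scored questions on the
    2017 fifth grade reading test.
    """
    percent = raw_score / scored_items * 100
    if percent >= 87:
        return "mastering grade level"
    elif percent >= 78:
        return "meeting grade level"
    elif percent >= 58:
        return "approaching grade level"
    else:
        return "not meeting grade level"
```

For example, a raw score of 30 out of 38 is about 79% correct, which falls in the meeting grade level band, while 21 out of 38 (about 55%) does not meet grade level.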

The chart above comes from Lead4Ward, which provides educators with resources related to the STAAR. This chart shows the cutoff scores for the three passing levels: approaching grade level (58% correct), meeting grade level (78%), and mastering grade level (87%). It also converts the raw score (number of questions correct) and the scale score into a percentage correct. 

DATA CHARTS AND ANALYSIS

When teachers receive the STAAR results from the first administration, the scores are provided on paper. Teachers do not have access to the scores electronically until several weeks later, and once fifth graders leave the elementary school, their scores disappear from the online system my district uses. I created the data charts below from the paper version of my students' STAAR results. First, I include a spreadsheet of individual student scores, and then I include several charts that give an overview of my students' results. 

The spreadsheet above shows each student's scale score (out of 1996) and raw score (out of 38). It shows whether my students passed the STAAR, and whether they approached, met, or mastered grade level. It also includes the number of questions each student answered correctly in the three question categories: across genres, literary texts, and informational texts. Students highlighted in red received special education services throughout the year. These students received STAAR accommodations (such as extended time and oral administration of test questions), but they took the same test as their general education classmates. 

Looking at the data above, several things become clear. First, all of my special education students (highlighted in red) failed the STAAR test. I believe this is because the STAAR does not have adequate accommodations for special education students, many of whom are reading multiple grade levels behind their peers. Second, the average number of questions correct was highest in the first category, Understanding and Analysis Across Genres. This shows me that most of my students had internalized my lessons about how to make connections between texts, and that these lessons helped them on the STAAR. Third, my students performed worst on questions in the second category, Understanding and Analysis of Literary Texts. This is significant because that category has the most questions on the test. Going forward, I know I will need to place a continual emphasis on literary text skills throughout the school year, which I can do through our class novels. 

The charts above show the percentage of my students who approached and who met the grade level standard. 83% of my students approached grade level, which means they passed the STAAR on their first attempt; after the second administration of the test in May 2017, my passing rate increased to 88%. 57% of my students met grade level, meaning they scored 78% or higher on the test. 


It is also worth pointing out that my students scored higher than the school and state averages for the percentage of students who approached and met grade level. This is true even when the data is broken down by racial demographics. I believe this is because I use an authentic literacy approach centered on class novels and independent reading books. Students spend at least 20 minutes per day reading their independent reading book, and we spend another 20 minutes reading our class novel aloud, stopping to discuss what we are reading along the way. This means that students are reading an authentic text (as opposed to a passage or a textbook) for approximately half of our 90-minute class time. I know that this is not the norm at many Title I schools in Houston and across Texas. At many of these schools, which serve a high percentage of low-income and Hispanic students, authentic texts are pushed to the side and there is a school-wide emphasis on year-round test prep built on reading passages and multiple-choice questions. However, research has shown that students who read books they enjoy experience higher levels of academic achievement (Sullivan & Brown, 2015), and I believe this is evident in my STAAR results. 

The chart above compares my class's scores to my school average and the state average. My approaching grade level percentage was 12 points above the state average and 10 points above the school average. When the data is separated to look specifically at Hispanic students, my scores are 5 points above the school average and 14 points above the state average. I did not separate the African-American data because I taught only 3 African-American students, all of whom passed. The only area in which I did not exceed the school or state average was the percentage of students who mastered grade level. 

In conclusion, my STAAR data from the 2016-2017 school year shows that my students made greater academic growth than their peers in my school and around the state. This is true even though I teach students who, according to demographic statistics, generally score lower on these types of standardized tests because of language barriers and low reading levels. The percentage of my students who met and who approached the grade level standard exceeded the state averages by 12 to 14 points. 


While I am pleased with my students' performance on this test, I remain skeptical that the STAAR is an accurate way to measure a student's reading performance. Because the STAAR is administered only toward the end of the year, it does not provide teachers and families with information about how a student is growing throughout the school year. In addition, because it is a promotional exam in fifth grade, the STAAR can place an immense amount of pressure on students, which can lead to anxiety and poor test performance. There is also the issue that many of the passages require background knowledge that most low-income, minority students simply don't have. For example, on the 2017 test, the paired selection focused on the obscure hobby of cherry pit-spitting in Wisconsin, a topic that is foreign to many Hispanic and African-American students and thus difficult for them to comprehend. Finally, Lexile studies have shown that the passages on the STAAR reading tests are written two grade levels above the grade being tested, which means my fifth graders were taking a test written at a seventh grade level. Knowing this, it is no surprise that the score needed to pass is only 58%, and that so many special education students fail the STAAR regardless of how many times they are forced to take it. 


Sullivan, A., & Brown, M. (2015). Reading for pleasure and progress in vocabulary and mathematics. British Educational Research Journal, 41(6), 971-991. Retrieved from https://onlinelibrary.wiley.com/doi/full/10.1002/berj.3180#references-section
