
Wednesday, February 3, 2016

Are Oral Reading Norms Accurate with Complex Text?

Teacher Question:  
          A question has come up that I don't know how to address, and I would love your input. For years, we have used the Hasbrouck/Tindal fluency norms as one of the ways we measure our students' reading progress. For example, the 4th grade midyear 50th percentile is 112 CWPM. The fourth grade team has chosen a mid-year running record passage and is finding that many of our students have gone down instead of up in their CWPM. One teacher said that is because the common-core aligned texts are more challenging and that the passage is really the equivalent of what used to be considered a 5th grade passage. She said that the norms were established using text that is easier than what the students are now expected to read. I know that the texts are more complex and challenging and therefore more difficult for the students to read, and that this particular text may not be a good choice to use for an assessment. But it does raise the larger question--are these fluency norms still applicable?

Shanahan response:
         This is a great question, and one that I must admit I hadn’t thought about before you raised it. If average fourth-graders read texts at about 112 words correct per minute by mid-fourth-grade, one would think that their accuracy and/or speed would be affected if they were then asked to read texts that in the past would have been in the fifth-grade curriculum.

         However, while that assumption seems to make sense, it would depend on how those norms were originally established. Were kids asked to read texts characteristic of their grade levels at particular times of the year or was the text agenda wider than that? If the latter, then the complex text changes we are going through would not necessarily matter very much.

         So what’s the answer to your question? I contacted Jan Hasbrouck, the grand lady herself, and put your question to her. Here is her response:

         I guess the most honest answer is "who knows?" I hope that we may actually have an answer to that question by this spring or summer because Jerry Tindal and I are in the process of collecting ORF data to create a new set of norms, which should reflect more current classroom practice.

         My prediction is that the new ORF norms won't change much from our 2006 norms (or our 1992 norms). My prediction is based on the fact that ORF is, outside of expected measurement error (which Christ & Coolong-Chaffin, 2007 suggest is in the range of 5 wcpm for grades 1 and 2 and 9 wcpm in grades 3-8+), fairly stable. You can see evidence of this on our 2006 norms when looking at the spring 50th %iles for grade 6 (150), grade 7 (150), and grade 8 (151). When you consider that these three scores represent approximately 30,000 students reading a variety of grade-level passages, that's pretty darn stable. Other studies of older readers (high school; college) also find that 150 wcpm is a common "average."

         Of course this stability assumes that the ORF scores were obtained correctly, using the required standardized procedures, which unfortunately is too often not the case. Standardized ORF procedures require that students read aloud for 60 seconds from unpracticed samples of grade level passages, and the performance is scored using the standardized procedures for counting errors. In my experience most educators are doing these required steps correctly. However, I see widespread errors being made in another step in the required ORF protocol: Students must try to do their best reading (NOT their fastest reading)!  In other words, in an ORF assessment the student should be attempting to read the text in a manner that mirrors normal, spoken speech (Stahl & Kuhn, 2002) and with attention to the meaning of the text. 

         What I witness in schools (and hear about from teachers, specialists, and administrators in the field) is that students are being allowed and even encouraged to read as fast as they can during ORF assessments, completely invalidating the assessment. The current (2006) Hasbrouck & Tindal norms were collected before the widespread and misguided push toward ever faster reading. It remains to be seen whether students are in fact reading faster. Other data, including NAEP data, suggest that U.S. students are not reading "better."

         And yes, of course the number of words read correctly per minute (wcpm) would be affected if students were asked to read text that is very easy for them or very difficult, but again, ORF is a standardized measure that can serve as an indicator of reading proficiency. 
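The scoring arithmetic Jan describes (a timed, unpracticed read of grade-level text, with words correct per minute computed from the error count and normalized to 60 seconds) can be sketched in a few lines of Python. This is only an illustration: the function names are mine, and the measurement-error thresholds are the Christ & Coolong-Chaffin (2007) ranges quoted above (roughly 5 wcpm for grades 1-2, 9 wcpm for grades 3-8+).

```python
def wcpm(words_read, errors, seconds=60):
    """Words correct per minute from a timed oral reading sample."""
    words_correct = words_read - errors
    return words_correct * 60 / seconds

def exceeds_measurement_error(score_change, grade):
    """Is a change in wcpm larger than the expected measurement error?

    Thresholds follow the ranges quoted above (Christ &
    Coolong-Chaffin, 2007): ~5 wcpm for grades 1-2, ~9 wcpm for 3-8+.
    """
    threshold = 5 if grade <= 2 else 9
    return abs(score_change) > threshold

# A fourth-grader reads 120 words in 60 seconds with 8 errors:
print(wcpm(120, 8))                      # 112.0 wcpm
# A midyear drop of 6 wcpm is within measurement error for grade 4:
print(exceeds_measurement_error(-6, 4))  # False
```

On these thresholds, a midyear drop of a few wcpm, like the one the fourth-grade team observed, could fall within ordinary measurement error rather than signal a real decline.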

         Given Jan's response, I assume the norms won't change much. The reason is that the data collection isn't tightly controlled: reading procedures and texts vary across sites (not surprising with data on 250,000 readers). That means that the current norms do not necessarily reflect the reading of a single level of difficulty, and I suspect that future norms determinations won't have such tight control either.

         The norms are averages, and they will remain averages; that suggests using them as rough estimates rather than exact statistics (a point worth remembering when trying to determine whether students are sufficiently fluent readers).

          Last point: your fourth-grade teachers are correct that the texts they are testing with may not be of equivalent difficulty, which makes it difficult to determine whether real gains (or losses) are being made. We've known for a long time that text difficulty varies a great deal from passage to passage. Just because you take a selection from the middle of a fourth-grade textbook doesn't mean that the passage is a good representation of appropriate text difficulty. That is true even if you know the Lexile rating of the overall chapter or article you have drawn from (since difficulty varies across a text). The only ways to be sure would be to do what Hasbrouck and Tindal did--use a lot of texts and assume the average is correct--or to measure the difficulty of each passage used for assessment. The use of longer texts (having kids read for 2-3 minutes instead of 1) can improve your accuracy, too.
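The two safeguards suggested above (having kids read for 2-3 minutes instead of 1, and averaging across several passages) amount to simple arithmetic: normalize each timed read to a per-minute rate, then average the rates. Here is a minimal sketch; the passage data are invented purely for illustration.

```python
def wcpm(words_read, errors, seconds):
    """Words correct per minute, normalized for reads longer than 60 seconds."""
    return (words_read - errors) * 60 / seconds

# Hypothetical results from one student reading three different passages,
# each for 2-3 minutes; difficulty varies from passage to passage.
samples = [
    (230, 10, 120),  # (words read, errors, seconds)
    (340, 12, 180),
    (215, 15, 120),
]

rates = [wcpm(w, e, s) for w, e, s in samples]
average_rate = sum(rates) / len(rates)
print(round(average_rate, 1))  # 106.4
```

Averaging over several passages, as Hasbrouck and Tindal did, damps the passage-to-passage variation in difficulty that a single one-minute sample cannot.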


Sunday, January 10, 2016

Close Reading and the Reading of Complex Text Are Not the Same Thing

          Recently, I was asked to make some presentations. I suggested a session on close reading and another on teaching with complex text. The person who invited me said, “But that’s just one subject… the close reading of complex text. What else will you talk about?”

          Her response puzzled me, but since then I’ve been noting that many people are confounding those two subjects. They really are two separate and separable constructs. That means that many efforts to implement the so-called Common Core standards may be missing an important beat.

          Close reading refers to an approach to text interpretation that focuses heavily not just on what a text says, but on how it communicates that message. The sophisticated close reader carefully sifts what an author explicitly expresses and implies, but he/she also digs below the surface, considering rhetorical features, literary devices, layers of meaning, graphic elements, symbolism, structural elements, cultural references, and allusions to grasp the meaning of a text. Close readers take text as a unity—reflecting on how these elements magnify or extend the meaning.

         Complex text includes those “rhetorical features, literary devices, layers of meaning, graphic elements, symbolism, structural elements, cultural references, and allusions.” (Text that is particularly literal or straightforward is usually not a great candidate for close reading). But there is more to text complexity than that—especially for developing readers.

          Text complexity also includes all the other linguistic elements that might make one text more difficult than another. That includes the sophistication of the author’s diction (vocabulary), sentence complexity (syntax or grammar), cohesion, text organization, and tone.

          A close reader might be interested in the implications of an author’s grammar choices. For example, interpretations of Faulkner often suggest that his use of extended sentences with lots of explicit subordination and interconnection reveals a world that is nearly fully determined… in other words, the characters (like the readers) do not necessarily get to make free choices.

          And, while that might be an interesting interpretation of how an author’s style helps convey his meaning (prime close reading territory), there is another more basic issue inherent in Faulkner’s sentence construction: reading comprehension. Readers have to determine what in the heck Faulkner is saying or implying in his sentences. Grasping the meaning of a sentence that goes on for more than a page requires a feat of linguistic analysis and memory that has nothing to do with close reading. It is a text complexity issue. Of course, if you are a fourth-grader, you don’t need a page-long sentence to feel challenged by an author’s grammar.

          Text complexity refers to both the sophisticated content and the linguistic complexity of texts. A book like “To Kill a Mockingbird” is a good example of sophisticated content with little linguistic complexity. It is a good candidate for a close reading lesson, but it won’t serve to extend most kids’ language. A book like “Turn of the Screw,” by contrast, could be a good candidate for close reading, but only if a teacher is willing to teach students to negotiate its linguistic challenges.

         The standards are asking teachers to do just that: to teach kids to comprehend linguistically complex texts and to carry out close reading. The two definitely are not the same thing.

Sunday, August 30, 2015

More on the Instructional Level and Challenging Text

Teacher question:
I’ve read your posts on the instructional level and complex texts and I don’t think you understand guided reading. The point of guided reading placements is to teach students with challenging text. That’s why it is so important to avoid texts that students can read at their independent level; to make sure they are challenged. The Common Core requires teaching students with challenging texts—not frustration level texts.

Shanahan response: 
I’m having déjà vu all over again. I feel like I’ve covered this ground before, but perhaps not quite in the way that this question poses the issue.

Yes, indeed, the idea of teaching students at their instructional level is that some texts could be too easy or too hard to facilitate learning. By placing students in between these extremes, it has been believed that more learning would take place. In texts that students find easy (your independent level), there would be little for students to learn—since they could likely recognize all or most of the words and could understand the text fully without any teacher help. Similarly, texts that pose too much challenge might overwhelm or frustrate students so they could not learn. Thus, placing them in instructional level materials would be challenging (there would be something to learn), but not so challenging as to be discouraging.

Or, at least that’s the theory.

So, I do get that the way you seem to be placing kids in books is meant to be challenging. But please don’t confuse this level of challenge with what your state standards are requiring. Those standards are asking that you teach students to read texts of specified levels of difficulty—levels of difficulty that for most kids will exceed what you think of as challenging.

This means that everyone wants kids to be challenged. The argument is about how much challenge. You may think that a student will do best if the text used for teaching is only so challenging that he/she would make no more than 5 errors per 100 words of reading, while your state may think the appropriate challenge level is grade-level texts that represent a progression that would allow students to graduate from high school with a particular level of achievement. That means in many circumstances the state would say kids need to read book X, and you’d say, “No way, my kids make too many errors with book X to allow me to teach it successfully.”

The Lexile levels usually associated with particular grade levels are not the ones that the standards have assigned to those grades. The Lexile grade designations of the past were an estimate of the level of text that the average student could read with 75-89% comprehension. Those levels weren’t claiming that all kids in a particular grade could read such texts successfully, only that the average ones could. Thus, you’d test individual kids and place them in books with higher or lower Lexiles to try to get them to that magical instructional level.

The new standards, however, have assigned higher Lexile bands to each grade level. That means that even the average kids will not be able to read those texts at an instructional level; some kids might manage it, but not the majority. Teachers would therefore need to teach students to read books more challenging than what has typically been their instructional level. In other words, plenty of kids will need to be taught at their frustration level to meet the standards.

I do get the idea that the instructional level is meant to be challenging. But for the majority of kids, teaching at the instructional level will not meet the standards. That degree of challenge undershoots the level of challenge established by your state (and the level at which your state will test your students). Perhaps you can take solace in the fact that research has not been able to validate the idea that there is an instructional level; that is, kids can be taught to read successfully with texts more challenging than those you’ve apparently used in the past.



Wednesday, May 13, 2015

How Much Text Complexity Can Teachers Scaffold?

How much of a "gap" can be compensated through differentiation? If my readers are at a 400 Lexile level, is there an effective way to use an 820-level chapter book?

            This is a great question. (Have you ever noticed that usually means the responder thinks he has an answer?)

            For years, teachers were told that students had to be taught with books that matched their ability, or learning would be reduced. As a teacher I bought into those notions. I tested every one of my students with informal reading inventories, one-on-one, and then tried to orchestrate multiple groups with multiple book levels. This was prior to the availability of lots of short paperback books that had been computer scored for F & P levels or Lexiles, so I worked with various basal readers to make this work.

            However, a careful look at the research shows me that almost no studies have found any benefits from such matching. In fact, if one sets aside those studies that focused on children who were reading no higher than a Grade 1 level, then the only results supporting specific student-text matches are those arguing for placing students at what we would have traditionally called their frustration level.

            Given this research and that so many state standards now require teachers to enable students to read more challenging texts in grades 2-12, teachers are going to need to learn to guide student reading with higher level text than in the past.

            Theoretically, there is no limit to how much of a gap can be scaffolded. Many studies have shown that teachers can facilitate student success with texts that students can read with only 80% accuracy and 50% comprehension, and I have no doubt that, with even more scaffolding, students could probably bridge even bigger gaps.

            I vividly remember reading a case study of Grace Fernald when I was in graduate school. She wrote about teaching a 13-year-old, a total non-reader, to read with an encyclopedia volume. That sounds crazy, but with a motivated student, and a highly skilled teacher, and a lot of one-on-one instructional time, without too many interruptions… it can work.

            But what is theoretically sound or possible under particularly supportive circumstances does not necessarily work in most classrooms.

            I have no doubt teachers can scaffold a couple of grade levels without too much difficulty. That is, the fifth-grade teacher working with a fifth-grade book can successfully bring along a student who reads at a third-grade level in most classroom situations. But as you make the distance between student and book bigger than that, then I have to know a lot more about the teacher’s ability and resources to estimate whether it will work this time.

           Nevertheless, by preteaching vocabulary, providing fluency practice, offering guidance in making sense of sentences and cohesion, requiring rereading, and so on, I have no doubt that teachers can successfully scaffold a student across a 300-400 Lexile gap--with solid learning.

            But specifically, you ask about scaffolding a 400-Lexile reader to an 820-Lexile text. If you had asked about 500 to 920, I wouldn't hesitate: yes, a teacher could successfully scaffold that gap. I'm more hesitant with 400 as the starting point, because 400 is a first-grade reading level. This would be a student who is still mastering basic decoding skills.

            I do not believe that shifting to more challenging text under those circumstances is such a good idea.

            To address this student’s needs, I would ramp up my phonics instruction, including dictation (I want my students to encode the alphabetic system as well as decode it). I might increase the amount of reading he or she is expected to do with texts that highlight rather than obscure how the spelling system works (e.g., decodable text, linguistic text). I would increase work on high frequency words, and I would increase the amount of oral reading fluency work, too. I’d do all of these things.

            But I would not shift him/her to a harder book because of what needs to be mastered at beginning reading levels. We’ll eventually need to do that, but not until the foundations of decoding are more firmly in place.

           An important thing to remember: no state's standards raise the text demands for students in Kindergarten or Grade 1. They do not do this because they are giving students the opportunity to firmly master their basic decoding skills. It isn't the distance between 400 and 820 that concerns me--that kind of distance can be bridged--but a 400 Lexile represents a limited degree of decoding proficiency, and I wouldn't want to shift attention from achieving proficiency in reading those basic words.