The Spirit is Willingham, but the Flesch is Weak

  • 15 July, 2016

Teacher’s Question:

I have read a few articles and books by Daniel Willingham in the past, and I wonder if you are familiar with his work. I recently read an article (attached) about reading comprehension strategies and am curious to know what you think of his ideas. He says that focusing heavily on reading strategies isn’t really necessary.
(I often question the need for so many reading strategies, particularly when they take away from reading being a pleasurable activity. I can understand the importance of visualizing, using prior knowledge, and maintaining focus, but teaching the other “strategies”, in my opinion, is confusing the issue. I realize there are many studies to say otherwise, but, I just can’t be convinced.)
Anyway, again, just wondering what you think of Willingham’s paper.
Shanahan’s Response:
Thanks. This is the second time in two weeks I’ve been asked about Daniel Willingham’s writing on comprehension strategies. I don’t know Dr. Willingham, but I’ve read his vita.
Daniel Willingham is a cognitive psychologist with a good research record—on topics other than reading education. Although I know of his book, it is written for lay audiences, and the short excerpts and off-shoots that have come to my attention suggest to me that he hasn’t actually read much of the research that you are asking about. He has, however, read some appropriate summary pieces about the subject and/or talked to some respected experts.
In my opinion, he is kind of right. 
What is the good Doctor W right about? He is right that comprehension strategies (e.g., summarization, questioning, monitoring) are effective. There are a number of research reviews of this work, both of individual strategies and of strategy teaching overall, and they are consistently positive. Teaching comprehension strategies appears to improve students’ reading comprehension, and it doesn’t matter whether the review is fairly comprehensive (NICHD, 2000) or highly selective, including only the highest quality studies (Shanahan, et al., 2010); the answer is the same. 
And, he is especially right to raise the issue of, “How much of this kind of teaching is needed?” 
But that’s where my answer would deviate from his, and where reading the actual studies instead of the reviews can make a big difference. He claims students learn everything they need after 2 weeks of strategy instruction, and that we should cap such teaching at that very brief duration. 
I think that claim is on very thin ice and it ignores a lot of issues and a lot of studies (remember the National Reading Panel reviewed more than 200 studies on the topic). 
I say three cheers for Dan Willingham for questioning the amount of strategy instruction, and I give him the raspberries for then answering his own question with the claim that two weeks of strategy teaching is appropriate.
One thing that originally shocked me in reading the studies in that research literature was how brief the interventions were. Most studies focused on 6 weeks of instruction or less (though there were a few longer studies). That such brief interventions are potent enough to impact standardized reading tests is good. That we have no idea whether stronger doses have any added benefit is a serious problem. That’s why I agree with the notion that we are probably overdoing the strategy teaching. The only evidence we have on amount of strategy teaching is correlational and it is weak at best.
My conclusions: 
(1)  Strategy instruction is effective when the instruction is concentrated. In all of the studies, students were given daily, ongoing instruction in and practice with strategies. Programs that give occasional doses of instruction in various strategies may be effective, but there are no studies of that kind of practice.
(2)  Strategy instruction can be effective at improving reading comprehension scores at a variety of grade levels, including the primary grades. This surprised me, too. I was pretty sure that comprehension strategies made sense with older students, but not so much with younger ones. That’s not what the research has found, however.
(3)  Strategies are not all equal. There is a greater payoff to some strategies than to others, so I would definitely put my instructional nickel on the ones with the big learning outcomes. The most powerful strategies by far are summarization (stopping throughout a text to sum up) and questioning (asking and answering your own questions about the text). The weakest: teaching students to think about how to respond to different question types (effect sizes so small that I wouldn’t waste my time).
(4)  Strategy instruction can be effective with about 6 weeks of teaching and practice. Here I’m going with the modal length of strategy studies. Perhaps the effects would have been apparent with fewer weeks of instruction, per Willingham’s contention, and, yet, this hasn’t been studied. Weaker dosages may work, too, but with so little evidence I’d avoid such strong claims. 
(5)  Even more strategy instruction than this may be effective, but, again, with so little research no one knows. We do have studies showing that 3 years of phonics instruction are more effective than 2 years of phonics instruction, but we don’t have such studies of reading comprehension teaching, so let’s not pretend. 
(6)  You raise a question about the value of different strategies; Willingham does not. The research reviews show that the teaching of multiple strategies, either singly in sequence or all together, is beneficial—with stronger results than from single strategies. Multiple strategy teaching may be better because different strategies provide students with different supports (one strategy might help readers to think about one aspect of the text, another might foster some additional insights or analysis). Teach multiple strategies.
(7)  The Willingham claim fails to consider the outcome measures. He treats strategies as simply good or bad, without focusing on what they may be good at. His focus is on motivating readers, but the studies of strategy teaching do not focus on that outcome. I think we overdo the strategy thing, and yet I’d be surprised if an overemphasis on strategies is why kids don’t like reading. The whole point of strategy teaching is to make students purposeful and powerful, focused on figuring out what a text says. Those kinds of inputs usually have positive motivational outcomes.
(8)  It is great that comprehension strategies improve performance on standardized reading tests, but their bigger impact has usually been on specially designed instruments made for the research. Thus, summarizing usually helps students to summarize a text more than it builds general reading comprehension. I think the best test of strategies would be to give two groups a really hard text—like a science textbook—and have them read it and see who would do the best with it (passing tests, writing papers, etc.). I suspect strategies would have a bigger impact on that kind of outcome than on passing a test with fairly short, easy passages and multiple-choice questions in a brief amount of time. If I’m correct about that, then strategies would be worth a more extensive emphasis. Willingham apparently hasn’t read the studies, so he is considering only what they have found, not what they haven’t examined.
(9)  Most students don’t use strategies. Though we know strategies improve comprehension, they are not used much by students. I suspect the reason for this is our fixation on relatively easy texts in schools. The only reason to use a strategy is to get better purchase on a text than one would accomplish from just reading it. If texts are easy enough to allow 75-89% comprehension (the supposed instructional level that so many teachers aim at), there is simply no reason to use the strategies being taught. Teachers may be teaching kids to use strategies, but their text choices are telling the kids that the strategies have no value.
(10)   Willingham is trying to reduce the amount of comprehension strategy instruction so that kids will like school better. I doubt that he spends much time in schools. He hasn’t been a teacher or principal or even a teacher educator, and his own research hasn’t focused on practical educational applications. I’ve been conducting an observational study of nearly 1000 classrooms for the past few years, and we aren’t seeing much strategy instruction at all. There definitely can be too much strategy teaching, but in most places the problem is the absence of any dosage, not an overdose.


See what others have to say about this topic.

R Kelleher Jun 11, 2017 09:21 PM


You mention a number of times that you don't believe that Daniel Willingham has read the research or the papers he is writing about, which, if you were familiar with his work at all, you would realise is a ridiculously pejorative point of view, and I would suggest says more about you than him. Especially as your questioner specifically mentions and links to an article (I assume this one), while you repeatedly refer to "his book" without mentioning which one.

Furthermore as some of Willingham's writing has been amongst the most influential recent contributions to practice referred to and used by practicing teachers, to admit to making judgements based on "short excerpts or off-shoots that have come to my attention" strikes me as rather extraordinary for someone writing under the byline "Distinguished Professor Emeritus of urban education".

Timothy Shanahan Jun 11, 2017 09:21 PM

I wrote that Professor Willingham’s claims reveal that he has read the research summaries/meta-analyses rather than the studies themselves, since he was ignoring key features of the original studies that were important to the claims he was making. You indicate that this is "ridiculously pejorative." It is not. It is commonplace among scholars: they read the actual studies in their own fields, but they usually rely on summary pieces when it comes to areas outside their expertise (Shanahan, Shanahan, & Misischia, 2010). In fact, on Thursday I received a note from Dan admitting that to be the case in this instance, and if you check his blog from yesterday you will see that he does not take my criticism as pejorative, but accepts my conclusion—since I have first-hand knowledge of the research he was talking about.

You are correct, I mentioned his book without specifying which of his four books I was speaking of; however, since it is his first book on reading education after 30 years in the field of psychology, I didn’t think readers would be likely to mix this up with his books on other topics.

You claim that Willingham’s book is one of "the most influential recent contributions to practice referred to and used by practicing teachers." That may be true, but I don’t know how you would measure such influence. His work isn’t cited much in the practitioner literature, which is not surprising given how recently it appeared.

As I indicated in my blog entry, I haven’t read his book. However, that is largely because I have been in the field for more than 40 years and have both extensive instructional and research experience addressing and studying the problems that he is opining on. Scholars don’t usually spend considerable time on what is considered to be "tertiary literature" (as opposed to primary literature—that is, the original empirical research studies, or secondary literature—that is, meta-analyses and systematic syntheses of such studies). Tertiary literature is usually not that informative to people in the field because they have already read the studies that such authors might use to form their opinions.

Researchers usually wouldn’t read many books by practitioners either, though I do, because it is possible that a teacher would have as-yet-unstudied insights that could be useful. Professor Willingham, however, is not a teacher, nor has he spent much time in schools, so that wouldn’t drive me to his book either. I’m glad that you are finding his insights to be useful, but you just have to be aware that he is writing outside of his field of practice and research, so some of his claims may unintentionally be at odds with the facts of the matter, as was the case here. Tertiary literature often misses the nuance of the information being shared.

His insight that we can overdo strategy instruction is a good one and well reasoned from the summary evidence. His claim that two weeks of strategy instruction is sufficient is a claim without any kind of research support (his reasoning, not stated in the article, was that since the outcome effect is so big, you wouldn’t need as great a dosage as was provided in the studies).

Think of a medical analogy: research finds that if you have strep throat and the doctor puts you on a 10-day schedule of antibiotics, then 90% of patients will be cured of the infection by the end of the schedule and the infection will not recur within 30 days. There are no studies of a 2- or 3-day medication schedule for this malady, but there is a physician, a podiatrist let’s say, in any event someone outside this particular field of study, who reasons that since the effect is so big, the dosage could be too high; he writes a book saying that patients should stop the medication after only 3 days’ use. I promise you most researchers and physicians in the area of Ear, Nose, and Throat would reject such a claim without any direct research evidence, and frankly, without consideration of many other issues addressed in the original studies (like the differences in outcomes among patients with certain conditions).

Sorry you didn’t like the article. Good luck.

Joyful Jun 11, 2017 09:22 PM


Interesting! I greatly enjoy learning more about what the research has to say about best practices.

I have a clarification question. You said, "Strategies are not all equal... The weakest: teaching students to think about how to respond to different question types (effect sizes so small that I wouldn’t waste my time)." As a long-time test preparation coach, I was stunned to read this. If I may, can I make sure I correctly understood what you're saying? By teaching them (or not) to think about how to respond to different questions, do you mean that reading instruction shouldn't primarily focus on *how to respond* to questions, or did you mean that they shouldn't be taught primarily to focus on *thinking* about how to respond, or, possibly, that they shouldn't be taught how to respond to *different* questions? Or maybe all of the above?

I teach students to think about and articulate their answers to reading test questions *before* looking at the answer choices (actually, I should qualify that to say I encourage them to think about what the passage tells them that may be used to answer the question - so, in other words, it isn't the student's own answer but a passage-informed answer). I find this very helpful for many students since it often prevents them from spending an inordinate length of time considering the legitimacy of each answer choice (a pitfall for many, I have found). I realize that my field involves a specialized application of approaches based (or not) on reading comprehension research, so perhaps the research you are referring to is not purporting to study impacts on test scores but rather long-term reading abilities. Depending on your perspective (a different can of worms), reading test scores and reading abilities may or may not overlap.

Regardless, I would be very interested in clarification as to what you specifically meant here. Though I can't speak for all in my profession, I personally do my best to utilize research-based best practices in my work with students.

In addition, I would definitely love to read more about the research that has studied these questions if you could kindly point me to it. Many thanks!

Timothy Shanahan Jun 11, 2017 09:23 PM



Yes, I do mean that all of those activities are a waste of time if the goal is improving reading achievement. Various approaches encourage teaching kids question types and how to think about those aspects of text (one popular example is QAR). Consistently, the research on these instructional approaches shows a tiny effect size (averaging about .15, which is a substantially smaller payoff than one gets from teaching kids to summarize what they are reading). Furthermore, the research is also consistent and clear that teaching kids to respond to particular kinds of questions (such as those that we find on reading comprehension tests) cannot possibly work, because those questions are so highly correlated with each other that they are not actually tapping different cognitive outcomes. There is no consistent difference in performance between literal recall questions and inferential or higher-order thinking questions. There are no differences among main idea, comparison, drawing conclusions, etc. questions. There are no measurable differences among right there, think and search, author and me, etc. questions. Spend your time on teaching kids to read better and you'll be way ahead of the game.

Anonymous Mar 08, 2018 06:47 AM

Can I be clear, then?
The teaching of reading strategies is useful = yes. Does it have more impact if the reading content is directly related to the curriculum students are studying?
E.g. school A removes 6 children for “comprehension” intervention and builds this around an already-published SEN resource - contents include football, how to make a cup of tea, where pizzas came from, etc.
School B removes 6 children for the same purpose but builds this around non-fiction articles on The Russian Revolution to support their reading of Animal Farm in English. Is there any guidance on which will have greater impact? Everything I’ve read over the last few years suggests the latter?

Donna scherr May 03, 2018 07:52 PM

Professor Shanahan, thank you for your informed comments. Can you direct me to the studies showing benefits of CRS in primary years?
