I'm a curriculum and instruction supervisor for a smaller district. We feel like we have a pretty firm grasp on assessing and diagnosing when it comes to phonemic awareness, decoding, fluency, and comprehension. However, we're struggling with vocabulary. Is there any assessment you would recommend that would give us a sense of whether a student is approaching standard or at standard in that area?
In recent years, I’ve become concerned about the amount of school testing.
My complaint isn’t with the annual accountability tests (though those are overused, too). No, my grievance is with the many screening and monitoring tests now at epidemic levels in our schools.
It is sensible to be aware of student progress. But teachers can only use so much information, and testing isn’t the only way to get it.
I oppose, for instance, weekly tests of oral reading fluency—they aren’t accurate enough, and fluency growth rates don’t justify it. Testing fluency 2-4 times per year should be sufficient.
A better plan, perhaps, is to have teachers monitoring performance during daily fluency lessons… five observations per day within a half-hour lesson would let a teacher get to each student about once a week in a typical class, or roughly 9 looks per report card. And, there’d be little need for all that formal testing.
But I digress.
One of the big drawbacks with classroom assessment is that it overemphasizes foundational skills (e.g., phonemic awareness, phonics, fluency). Don’t get me wrong, those are perfectly reasonable things to dipstick and their results really can help us to target instruction.
As sensible as that approach is, it tends to lead teachers to see the kids (and their needs) through the lens of these tests.
If we only monitor progress in foundational skills, then teachers can only see foundational skills. Reading instruction then devolves to that… kids either are “doing fine” with reading (even if they are struggling with language, comprehension, and composing), or they need extra help and that extra help will be aimed at, you guessed it, foundational skills—even if the kiddos need something else.
Teachers come to not even see those other aspects of literacy. After all, the principal isn’t monitoring kids’ success with vocabulary or comprehension. But if we’re talking about foundational skills—the ones regularly tested—then “oh doctor.”
Given all that, your desire to assess vocabulary is a laudable one.
I can find no reliable/valid vocabulary test that can easily be given multiple times per year, that is predictive of kids’ growth in reading, and that provides a good measure of student progress. There are lots of vocabulary measures, but none that fits that bill.
You can use something like the Peabody Picture Vocabulary Test (PPVT) which will provide a reasonably good estimate of overall vocabulary performance. But I’m not convinced that those scores would allow you to track student growth in any meaningful way.
That’s not a knock on the PPVT or its rivals. Vocabulary assessments just have not been designed to monitor vocabulary learning in any meaningful way (Pearson, Hiebert, & Kamil, 2007).
Vocabulary growth is just so diffuse. Kids watch a television show and bang, they know a new word. The same goes for talking to a friend on the playground or eating dinner with mom or taking part in a reading lesson. Vocabulary comes from pretty much everywhere (as a boy, I learned “serviette” from the Three Stooges).
Words can get pretty specialized or particular in their use, too. I now have an extensive French vocabulary. If I come across words in that lexicon, I can understand them; that is, I have a good receptive French vocabulary. But, s’il vous plaît, don’t ask me what the French word for something is because my expressive vocabulary is embarrassing.
Keeping track of such dispersed and sundry learning is a formidable—and so far an impossible—task.
I think the best you can do is to adopt or create a formal vocabulary curriculum. Decide on a set of words and/or morphemes that the kids in your district will master by a particular grade level. Then teach the heck out of those words.
Vocabulary assessment, in this context, should aim to monitor student progress with the taught vocabulary. These results should correlate reasonably well with students’ overall growth in word learning. Kids who learn the most words from instruction will likely pick up a lot of words from their own reading, and so on.
As with most correlations, there will be kids whose learning isn’t typical.
For example, what about the kids who are smart but who don’t read much? They may learn most of the taught words without picking up many on their own. Or what about those who do read a lot but aren’t particularly diligent students? They may look like they are making no vocabulary progress, while doing reasonably well on their own. Such exceptions will probably balance out.
This plan should work reasonably well—as long as the kids don’t already know most of the target words. Screenings of the year’s vocabulary agenda early on should give you a sense of that. Without that information, you could convince yourself that there was a lot of learning going on, when you are only seeing that the students already knew the target words. In such a case, your curriculum is undershooting the kids.
After the first year, you could even check out what growth on your vocabulary instrument would mean. There is no reason why you could not correlate student growth in vocabulary with your state test results.
I would discourage you from trying to do this with a multiple-choice instrument. That adds too many other variables to the assessment. Just print up a list of the words for the kids to write definitions for. Don't try to get into fine-grained distinctions. The student either understood a word or didn't. You need to keep both the test design and the scoring simple enough that 3-4 administrations per year would be possible.
Thanks for keeping the attention on language and meaning.
Like Tim, I am glad educators are looking beyond the basic three or four early reading components and beginning to ask important questions about vocabulary, i.e., measuring progress, identifying areas for increased/focused instruction... but I'm wondering what the source(s) of these graded word lists might be? Are there several? How were they developed? Are some better than others? What would be a reasonable goal or number of words taught/learned each year? Has someone already answered these questions? Thank you!
Hello Friends, I have been working actively with teachers in 1st-5th grades to develop and implement comprehensive vocabulary instruction for more than a decade now. It has been exciting work - designing, implementing, and researching multifaceted vocabulary instruction has been an engaging process for the teachers and me (and super engaging for students - they inevitably say, "vocab is one of my favorite subjects!").

One tool that we have employed is a series of relatively short multiple-choice vocabulary assessments that test students' knowledge of "less familiar high-frequency words." These are words like... approached, identify, standard, harmony, principles, etc... that appear toward the lower end of Fry's 3000 Instant Words. Most teachers who look at them inevitably say, "Yep, those seem just about right" in terms of words that students should know, will indeed see, but often are not as familiar with as one might think. I have separate "SVKA" ("specific vocabulary knowledge assessments") tests for 3rd, 4th, and 5th grades, and we recently used the 3rd grade test in a study involving six 2nd grade classes in a dual immersion school - so, they can be used flexibly. The psychometric properties on these have been strong. I am currently running correlations with the Gates-MacGinitie test of word knowledge, a normed test of general vocabulary that we also administer, but it looks to me like those will be strong as well.

For us, the words are actually a representative sample of a larger set of similar words that we teach explicitly as one part of our multifaceted vocab instruction. Thus, the assessments serve as pre/post tests to help us gauge the effectiveness of the instruction. However, I think that the SVKAs could be highly useful for folks who want to get a quick sense of students' knowledge of "sweet spot" words and, perhaps, to see how much this knowledge grows over the course of a year.
I am happy to share if someone like the teacher-questioner is interested - we do this all the time simply by sharing the pdfs over Google Docs...
Patrick - is it possible to provide a link, or could I post them to my site, or could you leave an address for those who want to write to you about those tools?
Patrick- Your work sounds awesome. I would be very interested in the SVKAs.
Patrick- I’m also very interested in the SVKAs. This is very much a need for our school. Thank you!
I would like the SVKAs as well, and would also welcome suggested activities to teach the words. I'd love to know how to make vocab instruction one of my students' favorite subjects! Thank you so much! Catherine
Hey Friends, Here are links to the Google Docs of the Grade 5, 4, and 2-3 SVKAs. This last year's 3rd grade group scored, on average, 17.5 at beginning of year on the 2-3rd version, so this was quite high (they ended the year at 22). I am thinking that this sample of words is a bit easier than the entire set of these "less familiar high-frequency words" that we teach, and I will likely change some items. But, this group also began the year at 59.75 NCE (think percentile) on the Gates word knowledge test, so they were fairly high coming into 3rd. They ended at 77.25 NCE; so, much greater growth than the norming population.

This was the 3rd straight year for this kind of "more than expected growth" in general vocabulary in the VALE 3rd grade project (paper in progress). We have published several practical articles on the VALE instructional methods in The Reading Teacher, and I am currently finishing up the research reports for the 3rd grade work and the 2nd grade dual immersion school work (also greater than expected growth in general vocabulary on the GM in 1/2 the instructional time in English! And, slightly greater growth for the ELs than the proficient English speakers, although both groups grew). If you don't have access to The Reading Teacher, you can email me at firstname.lastname@example.org and I can send you PDFs of our current articles. Hope this helps!
Dr. Michael Graves would also be a great resource to learn more about multifaceted vocabulary instruction and intervention. Here is a link to an International Literacy Association practice guide that he coauthored: http://www.literacyworldwide.org/docs/default-source/member-benefits/e-ssentials/ila-e-ssentials-8035.pdf
I studied with Dr. William Nagy who also has done key vocabulary and morphology research.
I did several vocabulary in-services this year for my school district. Several resources I found to be very useful included two books by Dr. Graves: The Vocabulary Book and Teaching Vocabulary to English Language Learners. Other vocabulary experts share this multifaceted model, such as Blachowicz and Fisher (Teaching Vocabulary in All Classrooms).
It is interesting to note that measures of morphological awareness correlate quite highly with vocabulary measures. This would suggest that teaching derivational morphology is quite important and that measures of morphology (knowledge of multimorphemic words and of derivational morphemes) are useful.
Dr. Graves, like other vocabulary researchers, emphasizes the first 4,000-word list for ELL students and students in early grades with delayed vocabulary development. Other corpus-based lists of the most common vocabulary words are also important. Another very useful website is Dr. Elfrieda Hiebert's TextProject: http://www.textproject.org/archive/webinars/vocabulary-matters/
Regarding standardized measures of vocabulary, the Peabody Picture Vocabulary Test-5 and other measures could at least help identify those students with significant delays who need intervention. However, this test can't be given frequently. Also, distal measures of vocabulary are not sensitive to the teaching of a specific set of words and complex morphology.
Too-frequent testing can limit or interfere with time spent on classroom instruction. But what happens when teachers take this to the extreme and do no or limited amounts of testing on basic skills? My granddaughter took her first spelling test this year as a fourth grader. Phoneme awareness, phonics, grammar, fluency... not assessed, therefore not taught at her school. It isn’t just about measuring/grading students. Testing of basic skills helps inform teachers about the effectiveness of their instruction, understand whether their students are performing on grade level, and determine next steps. For free, downloadable 6-week and end-of-year assessments visit https://readingtests.info/.
Where are those classrooms that are overemphasizing foundational skills (e.g., phonemic awareness, phonics, fluency)? The schools I am working with rarely have a single professional that even knows what phonemic awareness is.
I’m a Title 1 Lead Teacher in a medium sized SW OH district. I would also be appreciative of your sharing. As Tim stated, developing both expressive and receptive vocab is critical to reading and writing success. Frequent formative assessments as we teach are important, but admins always want something more formal. Thanks in advance!
To the teacher who wrote the question:
You wrote that you are set with assessing and diagnosing comprehension. What do you use?
Here's an important reminder from a study published in Scientific Studies of Reading (2017), Examining Child and Word Characteristics in Vocabulary Learning of Struggling Readers. The authors conclude:
"Teaching students with reading difficulties a series of vocabulary strategies to induce meaning of unknown words from context, decompose words using morphological analysis, and use a glossary to find word meanings was effective for improving knowledge of the target vocabulary. This finding is consistent with the literature that has demonstrated the importance of explicitly teaching vocabulary to struggling readers (Bryant et al., 2003; Elleman et al., 2009). In addition, one child-level predictor, prior background knowledge was associated with vocabulary acquisition. Students who had higher levels of prior knowledge of the topics covered in the texts acquired more vocabulary than those who had less knowledge."
Here's a blog post I wrote about effective vocabulary instruction: https://theliteracycookbook.wordpress.com/2018/08/08/techniques-and-tools-for-vocabulary-instruction/
I also recommend considering a "Root of the Week" approach. For an explanation of this approach with free resources, check out this page: https://www.literacycookbook.com/page.php?id=16
Tim, I'm a fan of your work and appreciate this post. I agree that multiple-choice questions add too many variables, but having memorized definitions does not mean students know how to use the words (see Beck, McKeown, and Kucan), so I would recommend using sentence stems or the various approaches to quizzes mentioned in my blog post. Bottom line: There really isn't a QUICK way to assess true vocabulary knowledge; "matching" doesn't prove you can use the word.
As an EFL teacher, I have read several incredibly useful books and papers on teaching and learning vocabulary by these authors: Paul Nation (who sometimes publishes as I.S.P. Nation), How Vocabulary Is Learned and Learning Vocabulary in Another Language; Norbert Schmitt, Vocabulary in Language Teaching; and Michael McCarthy, Vocabulary.
I like teaching the most frequently encountered affixes and Greek and Latin roots in order to give students something that will transfer. Students need to be able to deconstruct a word and rebuild it in order to make meaning. The better they can do this, the higher their comprehension goes. I’ve written 20 small-group lessons to incorporate into teacher-led instruction. The way I assess affixes and roots is by giving a bolded new word in a sentence and having students generate a definition in their own words. Ten bolded words in context per week make for a quick check for understanding.
I agree that students are on testing and screening overload. We have to find a way to accomplish our goals without constant testing and screening.