By Jacy Ippolito, Joshua Lawrence, Joanna Yau, Judy Liu, and Rachel Strumpf, with support from the Hellman Foundation
According to the Society to Improve Diagnosis in Medicine, diagnostic error causes an estimated 40,000 to 80,000 deaths annually, even though diagnosis research receives relatively robust funding in the medical sciences. Educational leaders often see the symptoms of troubling practices in their schools, but diagnosing the underlying causes and providing appropriate “treatment” is challenging. Most school-level decisions are extremely complex and lack clear indicators of success or failure. Good decisions yield modest improvements in some areas without compromising performance in others. Perhaps more importantly, how a decision is reached can strongly influence its success. For example, when we hope to identify leverage points for improving instructional practice, the way staff contribute to the decision-making process strongly shapes how successful the resulting professional development is likely to be. At the same time, arriving at the “right” decision matters. We want to be sure that when we embark on a program of professional development, we are selecting the right focus, instructional program, or intervention.
With support from the Carnegie Corporation, the SERP Institute (Strategic Education Research Partnership) developed the Carnegie Content-area Literacy Survey (CALS) to assess the literacy dispositions and habits of students and teachers. After the survey was used in several research contexts, SERP put it online so that teachers, principals, and school district representatives could register and administer it digitally. Reading Ways uses the survey with its partner schools and has made a freely accessible version available through Google Forms (it is available here).
To illustrate how school leadership teams are beginning to use dynamic, online diagnostic tools such as the CALS, we turn to a case example from Hudson, Massachusetts. Hudson is a semi-rural town of more than 17,000 people and a metropolitan suburb of Boston, about 40 miles west of the city. The town supports three elementary schools (preK-4th grade), one middle school (5th-7th grade), and one high school (8th-12th grade). District and school leadership began investigating literacy diagnostic tools for teachers and students in 2011, knowing that their teachers and students might need additional support in meeting the challenges of the Common Core standards. As a first step toward increasing students’ literacy achievement across the board, a team of 17 educators (teachers and administrators) formed a Literacy Action Team (LAT), which began by reviewing existing sources of student data. The team examined state standardized test scores as well as SAT, PSAT, and Advanced Placement scores for graduating students; however, these quantitative data did not help the LAT better understand how secondary teachers in Hudson were supporting their students’ deeper reading, writing, and communication skills. The team quickly realized that they needed more information from both teachers and students about their collective understanding of literacy instruction within and across content areas. The LAT began designing a survey to administer to teachers and students, and in their search for existing templates, they found the CALS. After contacting Dr. Joshua Lawrence and discussing the possibility of using the CALS with a group of high school teachers and students (a small shift from the survey’s original target population of middle school teachers and students), the Hudson LAT agreed to administer it.
LAT members reported that using the CALS was an easy decision because “it was already on the computer,” and that “it actually didn’t take teachers and students as long as we thought it might, because we were really sensitive to that.” Todd Wallingford, the Hudson Curriculum Director for English Language Arts and Social Studies grades 6-12 (and one of the primary conveners of the LAT) commented further: “We didn’t want anything that was going to take more than 20 minutes… and it didn’t!”
The Hudson team administered the CALS during the winter of 2012 to 185 students and 88 teachers across grade levels and content areas. The LAT was excited to receive immediate feedback on questions they had been wrestling with, for example, learning from students how much and what kinds of reading and writing they do, both in and out of school. Similar data were collected from teachers about their literacy practices in the content areas, and, importantly, the LAT was able to gather data about teachers’ previous literacy-focused professional development experiences. The results of the CALS were illuminating because the survey data helped confirm and quantify a number of suspicions the team had already formed from other data points: a majority of the secondary teachers had not had access to literacy-specific professional development; meeting the literacy needs of students with special needs and low literacy levels was difficult; and students struggled to draw inferences when reading. Several new findings arose as well. Teachers reported spending a fair amount of time teaching vocabulary, yet they also reported that students were not demonstrating strong understanding of academic and subject-specific vocabulary. Meanwhile, students reported not enjoying learning new vocabulary in the subject areas, even though they recognized its importance in understanding course content. Such findings were important because they created opportunities for faculty to talk and collaborate with one another as part of district-designed, targeted professional development.
Perhaps more important than any one piece of data or particular finding was the process that the LAT underwent in analyzing and reporting the data. The team engaged in multiple data analysis sessions, meeting as a large group and then as small sub-groups to explore different aspects. Notably, the team created a discussion-based protocol (see above; a PDF version is here), adapted from the Data Driven Dialogue Protocol from the School Reform Initiative. The protocol allowed the team to explore their own assumptions about teachers’ and students’ literacy practices, make nonjudgmental observations about the resulting CALS data, and then craft evidence-based inferences.
Years later, the power of working in such a collaborative and deliberate manner is clear. The school and district have used these data to spur several waves of district-designed professional development addressing areas such as inferencing, vocabulary instruction, and the refinement of disciplinary literacy practices. LAT members and other expert content-area teachers and leaders formed and led study groups and summer institutes. The study groups have been reading Doug Buehl’s excellent book, Developing Readers in the Academic Disciplines, and in response, teachers have been collaboratively designing instruction within and across content areas to address achievement gaps.
Todd Wallingford reports that while the LAT does not often revisit the CALS data, the process of administering and analyzing it was pivotal in surfacing areas of challenge and confirming suspicions about areas of need. Looking at data collaboratively built the faculty’s and leaders’ capacity to tap local expertise and provide targeted professional development efficiently (as opposed to blindly purchasing new, expensive curricular packages).