Misconception: It is harder to make growth with students from certain demographic or socioeconomic backgrounds.
It is widely known that students with certain socioeconomic or demographic (SES/DEM) characteristics tend to score lower, on average, than students with other SES/DEM characteristics, and there is concern that educators serving those students could be systematically disadvantaged in the modeling.
However, adjusting for these characteristics is not statistically necessary for the most sophisticated value-added models, such as those used for PVAAS in Pennsylvania. PVAAS uses all available testing history for each individual student and does not exclude students with missing test data. Each student serves as their own control, and to the extent that SES/DEM influences persist over time, those influences are already represented in the student's own data.
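The "each student serves as their own control" idea can be illustrated with a small simulation. This is a simplified gain-score sketch with entirely hypothetical numbers, not the actual multivariate PVAAS model: a persistent student-level factor shifts achievement levels, but because it appears in both the prior and current scores, it largely cancels out of the growth measure.

```python
import numpy as np

rng = np.random.default_rng(0)
n_students = 10_000

# Hypothetical simulation (not PVAAS data): a persistent student-level
# factor (e.g., an SES-related influence) shifts achievement levels.
persistent_shift = rng.normal(0, 10, n_students)
baseline = rng.normal(500, 25, n_students)

prior_score = baseline + persistent_shift + rng.normal(0, 5, n_students)
# Assume the same expected growth (20 points) for every student.
current_score = baseline + persistent_shift + 20 + rng.normal(0, 5, n_students)

# Score *levels* correlate with the persistent factor...
level_r = np.corrcoef(persistent_shift, current_score)[0, 1]
# ...but *growth* measured against each student's own prior score does not:
# the persistent influence appears in both years and cancels out.
growth = current_score - prior_score
growth_r = np.corrcoef(persistent_shift, growth)[0, 1]
print(round(level_r, 2), round(growth_r, 2))
```

In this sketch the level correlation is substantial while the growth correlation is near zero, which is the intuition behind not needing an explicit SES/DEM adjustment.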
PVAAS in Theory
As a 2004 Ed Trust study stated, specifically with regard to the PVAAS modeling, which is the approach used in Pennsylvania's LEA/district, school, and teacher reporting:
[I]f a student's family background, aptitude, motivation, or any other possible factor has resulted in low achievement and minimal learning growth in the past, all that is taken into account when the system calculates the teacher's contribution to student growth in the present.*
A 2007 paper by RAND researchers J.R. Lockwood and Dan McCaffrey examined the models used for PVAAS LEA/district, school, and teacher reporting and largely supported this claim:
William Sanders, the developer of the TVAAS model, has claimed that jointly modeling 25 scores for individual students, along with other features of the approach, is extremely effective at purging student heterogeneity bias from estimated teacher effects. ... The analytic and simulation results presented here largely support that claim.†
UCLA researchers Kilchan Choi, Pete Goldschmidt, and Kyo Yamashiro reached a similar finding in their study comparing value-added models:
First, adding in an adjustment for student SES (as measured by eligibility for free- or reduced-price lunch) adds very little once a student's initial status is controlled... This indicates that student initial status captures many of the effects that SES is attempting to measure. In other words, by controlling for initial status, the model already captures the preceding effects that SES might have on students.‡
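The finding quoted above can be illustrated with a small simulation (all numbers hypothetical, not actual achievement data): if SES influences achievement only through a student's initial status, then adding an SES indicator to a model that already controls for initial status contributes almost nothing.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Hypothetical simulation: SES influences initial status, and later
# achievement depends only on initial status (plus noise).
ses = rng.binomial(1, 0.4, n).astype(float)       # 1 = FRL-eligible
initial = 500 - 15 * ses + rng.normal(0, 20, n)
later = 0.9 * initial + 60 + rng.normal(0, 15, n)

# Fit later scores on initial status alone, then with SES added.
X_base = np.column_stack([np.ones(n), initial])
X_ses = np.column_stack([np.ones(n), initial, ses])
coef_base, *_ = np.linalg.lstsq(X_base, later, rcond=None)
coef_ses, *_ = np.linalg.lstsq(X_ses, later, rcond=None)

# With initial status controlled, the estimated SES coefficient is near
# zero: initial status already captures what SES is trying to measure.
print(round(coef_ses[2], 1))
```

Under these assumptions, the SES coefficient in the second model is statistically indistinguishable from zero, mirroring the researchers' conclusion.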
In essence, these independent researchers have found that a sophisticated value-added approach typically does not systematically advantage or disadvantage educators based on the types of students they serve. By including many prior test scores for each student, the model controls for many student characteristics that might affect students' entering achievement or their growth during the year.
Ultimately, policymakers may have political or policy reasons to include socioeconomic or demographic adjustments in value-added models, but sophisticated models tend neither to advantage nor to disadvantage educators even without such adjustments.
PVAAS in Practice
Although the statistical literature presents evidence that educators are not advantaged or disadvantaged by the types of students they serve in sophisticated value-added reporting, actual data may be the most readily apparent evidence. The figures below provide teacher-level data; the results are similar at the LEA/district and school levels.
The first figure below plots the percentage of tested students considered economically disadvantaged on a teacher's roster in Pennsylvania against that teacher's growth index (the value-added estimate divided by its standard error) for PSSA Mathematics in grade 5 in 2019. Each dot represents one teacher, and verified rosters were used where available. Regardless of the characteristics of the students a teacher serves, there is little to no correlation with the growth index. In other words, the dots do not trend up or down as the percentage increases; the cluster of dots is fairly even across the spectrum. In the graph below, the actual correlation between the growth index and the percentage of economically disadvantaged students is -0.05, which is negligible.
PENNSYLVANIA GROWTH INDEX VERSUS PERCENTAGE TESTED ECONOMICALLY DISADVANTAGED BY TEACHER
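As a concrete sketch of the quantities being plotted, the growth index and its correlation with a roster characteristic can be computed as below. All teacher values here are simulated for illustration; they are not actual PVAAS results.

```python
import numpy as np

rng = np.random.default_rng(2)
n_teachers = 500

# Simulated teacher-level values (illustrative only, not PVAAS results).
estimate = rng.normal(0.0, 1.5, n_teachers)       # value-added estimates
std_error = rng.uniform(0.8, 1.6, n_teachers)     # their standard errors

# The growth index is the value-added estimate divided by its standard error.
growth_index = estimate / std_error

# If the index is unrelated to a roster characteristic, the correlation
# between the two is near zero, as in the figure's reported r = -0.05.
pct_econ_disadvantaged = rng.uniform(0, 100, n_teachers)
r = np.corrcoef(pct_econ_disadvantaged, growth_index)[0, 1]
print(round(r, 2))
```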
The next figure provides similar information for the percentage of students considered English Learners (EL); again, there is little to no correlation with the growth index. In the graph below, the actual correlation between the growth index and the percentage of students testing as EL is 0.11, which is negligible.
PENNSYLVANIA GROWTH INDEX VERSUS PERCENTAGE TESTED EL BY TEACHER
The figure below provides similar information for the percentage of students receiving special education services; again, there is little to no correlation with the growth index. In the graph below, the actual correlation between the growth index and the percentage of students testing as special education is -0.06, which is negligible.
PENNSYLVANIA GROWTH INDEX VERSUS PERCENTAGE TESTED SPECIAL EDUCATION BY TEACHER