Misconception: It is harder to make growth with students from certain demographic or socioeconomic backgrounds.
It is widely known that students with certain socioeconomic or demographic (SES/DEM) characteristics tend to score lower, on average, than their peers, and there is concern that educators serving those students could be systematically disadvantaged in the modeling. One proposed remedy is to adjust the models directly for students' SES/DEM characteristics.
However, such an adjustment is not statistically necessary for the most sophisticated value-added models, such as those used for PVAAS in Pennsylvania. This is because PVAAS uses all available testing history for each individual student and does not exclude students who have missing test data. Each student serves as their own control, and to the extent that SES/DEM influences persist over time, those influences are already represented in the student's testing history.
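To make this reasoning concrete, the toy simulation below is a minimal sketch, not the actual PVAAS model, which is far more sophisticated. It generates students whose scores are all shifted by a persistent, unobserved background influence; once a student's own prior scores are included as predictors, adding an explicit background indicator improves the fit only marginally. All variable names, effect sizes, and sample sizes are illustrative assumptions.

```python
# Toy simulation (illustrative only; not the PVAAS model): a persistent
# background influence shifts every score a student earns, so the student's
# own prior scores already carry that information.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

background = rng.integers(0, 2, n)   # hypothetical 0/1 background indicator
ability = rng.normal(0.0, 1.0, n)    # stable, unobserved student-level factor

# Prior and current scores all reflect ability plus the persistent background shift.
prior_1 = ability - 0.5 * background + rng.normal(0.0, 0.5, n)
prior_2 = ability - 0.5 * background + rng.normal(0.0, 0.5, n)
current = ability - 0.5 * background + rng.normal(0.0, 0.5, n)

def r_squared(predictors, y):
    """R-squared from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ coef
    return 1.0 - residuals.var() / y.var()

r2_priors_only = r_squared([prior_1, prior_2], current)
r2_with_background = r_squared([prior_1, prior_2, background], current)

print(f"R^2, prior scores only:          {r2_priors_only:.3f}")
print(f"R^2, prior scores + background:  {r2_with_background:.3f}")
# The two values are nearly identical: the prior scores already account for
# the persistent background influence, consistent with the research quoted below.
```

Because the same shift appears in both the prior and current scores, the prior scores act as the student's own control, which is the intuition behind not needing an explicit SES/DEM adjustment.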
PVAAS in Theory
As a 2004 Ed Trust study stated, specifically with regard to the PVAAS modeling, which is the approach used in Pennsylvania's LEA/district, school, and teacher reporting:
[I]f a student's family background, aptitude, motivation, or any other possible factor has resulted in low achievement and minimal learning growth in the past, all that is taken into account when the system calculates the teacher's contribution to student growth in the present.*
A 2007 paper by RAND researchers J.R. Lockwood and Daniel McCaffrey specifically examined the models used for PVAAS LEA/district, school, and teacher reporting, writing:
William Sanders, the developer of the TVAAS model, has claimed that jointly modeling 25 scores for individual students, along with other features of the approach, is extremely effective at purging student heterogeneity bias from estimated teacher effects... The analytic and simulation results presented here largely support that claim.†
A study by UCLA researchers Kilchan Choi, Pete Goldschmidt, and Kyo Yamashiro comparing value-added models reached a similar finding:
First, adding in an adjustment for student SES (as measured by eligibility for free- or reduced-price lunch) adds very little once a student's initial status is controlled... This indicates that student initial status captures many of the effects that SES is attempting to measure. In other words, by controlling for initial status, the model already captures the preceding effects that SES might have on students.‡
In essence, these independent researchers have found that a sophisticated value-added approach does not systematically advantage or disadvantage educators based on the types of students they serve. By including so many prior test scores for each student, the model controls for many student characteristics that might affect their entering achievement or their growth during the year.
Ultimately, political and policy considerations may lead policymakers to include socioeconomic or demographic adjustments in value-added models, but sophisticated models tend neither to advantage nor disadvantage educators in either case.
PVAAS in Practice
Although the statistical literature presents evidence that, in sophisticated value-added reporting, educators are not advantaged or disadvantaged by the types of students they serve, actual data may be the most readily apparent evidence supporting this conclusion. The figures below provide teacher-level data; the results are similar at the LEA/district and school levels.
The first figure below plots, for each teacher in Pennsylvania, the percentage of tested students on the teacher's roster who are considered economically disadvantaged against the teacher's growth index (the value-added estimate divided by its standard error) for PSSA Mathematics in grade 5 in 2019. Each dot represents one teacher, and verified rosters were used where available. Regardless of the characteristics of the students a teacher serves, there is little to no correlation with the growth index. In other words, the dots do not trend up or down as the percentage increases; the cluster of dots is fairly even across the spectrum. In the graph below, the actual correlation between the growth index and the percentage of economically disadvantaged students is -0.05, which is negligible.
Figure: Pennsylvania Growth Index versus Percentage Tested Economically Disadvantaged, by Teacher
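For readers who want to run a similar check on their own data, the hedged sketch below shows one way to compute a teacher-level growth index (the value-added estimate divided by its standard error) and its Pearson correlation with the percentage of economically disadvantaged students. The file name and column names are assumptions made for illustration; this is not PVAAS code or data.

```python
# Minimal sketch (illustrative only): correlate a teacher-level growth index
# with the percentage of tested students who are economically disadvantaged.
# Assumed input: a CSV with one row per teacher and hypothetical columns
# value_added_estimate, standard_error, pct_econ_disadvantaged.
import csv

import numpy as np

growth_index = []
pct_econ_disadvantaged = []

with open("teacher_growth.csv", newline="") as f:   # hypothetical file name
    for row in csv.DictReader(f):
        estimate = float(row["value_added_estimate"])
        standard_error = float(row["standard_error"])
        growth_index.append(estimate / standard_error)   # growth index as defined above
        pct_econ_disadvantaged.append(float(row["pct_econ_disadvantaged"]))

# Pearson correlation; a value near zero (such as the -0.05 reported above)
# indicates no meaningful relationship.
r = np.corrcoef(growth_index, pct_econ_disadvantaged)[0, 1]
print(f"Correlation between growth index and % economically disadvantaged: {r:.2f}")
```

The same computation applies to the EL and special education percentages discussed next.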
The next figure provides similar information for the percentage of tested students considered English Learners (EL); again, there is little to no correlation with the growth index. In the graph below, the actual correlation between the growth index and the percentage of tested students who are EL is 0.11, which is negligible.
Figure: Pennsylvania Growth Index versus Percentage Tested EL, by Teacher
The final figure provides similar information for the percentage of tested students considered special education; again, there is little to no correlation with the growth index. In the graph below, the actual correlation between the growth index and the percentage of tested students considered special education is -0.06, which is negligible.
Figure: Pennsylvania Growth Index versus Percentage Tested Special Education, by Teacher