# Teacher Value-Added

## Composite

A teacher's composite is a combined measure of all tested subjects, grades, and Keystone content areas for which a teacher received a value-added report. Teachers with one year of PVAAS teacher reporting see a one-year composite. Teachers with two consecutive years of PVAAS teacher reporting see a two-year composite. Teachers with three consecutive years of PVAAS teacher reporting see a three-year composite, also known as the three-year rolling average. This three-year rolling average is the only PVAAS teacher-specific score that is to be used on a teacher's final rating form for teacher evaluation.

For some teachers, the composite combines data for PSSA Math and ELA with data from PSSA Science or a Keystone exam. Because the growth measures are on different scales, it's not possible to provide a meaningful composite growth measure.

However, growth index values across all grades, subjects, and courses are on the same scale, so they can be combined in an appropriate and meaningful way. As a result, you will see a single growth index value for the composite. This value incorporates both the growth of the teacher's group of students and the associated standard error.

To calculate the composite, a simple average is taken of all of the teacher's individual index values for up to three years. Then the average is multiplied by the square root of the number of individual index values that went into the average. This step accounts for the fact that more data was used to generate the average than was used to generate each individual index, which affects the standard error for the composite.
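The calculation described above can be sketched in a few lines of Python. This is an illustration of the arithmetic only, not PVAAS's actual implementation:

```python
import math

def composite_index(indices):
    """Combine individual growth index values into a composite.

    Average the indices, then multiply by the square root of the
    number of indices. The multiplication accounts for the smaller
    standard error that comes from combining more data.
    """
    n = len(indices)
    return (sum(indices) / n) * math.sqrt(n)
```

For instance, `composite_index([1.28, 1.74])` returns approximately 2.14, matching Example 1 later in this section.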

The report includes a table listing each year, grade, and subject included in the teacher's Composite Growth Index.

## Composite Color Might Differ from the Value-Added Colors

The colors used in PVAAS indicate how strong the evidence of growth is for a group of students. The assumption is that the achievement level of the teacher's group of students is maintained (green) unless there is enough evidence in the assessment data to say otherwise. When more data is available, there is more evidence to determine whether the group of students exceeded, met, or fell short of the growth standard. The color of the composite might differ from the color of any single PSSA subject, grade, or Keystone content area because the composite combines all of the data, and data from multiple subjects (as well as multiple years, when available) provides more evidence than any single subject, grade, or course alone.

As an analogy, let's say we have 1,000 bags of different colored candies. We can estimate how many blue candies are in each bag by looking at one bag, but there would be less error if we looked in several bags. We would have a better estimate—with more evidence and less error on our measure—if we looked at 100 bags of candies.

Remember, the colors are categories or ranges of growth indicators, and we cannot average categories. In other words, green plus red does not necessarily equal yellow; likewise, green plus dark blue does not necessarily equal light blue.

The growth index is about the amount of evidence we have that the group of students exceeded or fell short of the growth standard.

If we have two different indicators that separately provide moderate evidence that the group of students exceeded the growth standard, combining them provides stronger evidence of exceeding the standard. So it is not surprising that when we combine two separate indicators of moderate evidence of exceeding the growth standard (e.g., +1.9 and +1.1, both light blue), we might now have enough evidence of exceeding the standard to reach a higher growth color indicator (i.e., combining +1.9 and +1.1 yields +2.12, or dark blue).

In the same way, if we have two different indicators that separately indicate moderate evidence that students did not meet the growth standard, combining those two different indicators provides even stronger evidence that students fell short of the standard. So, it is not surprising that when we combine two separate indicators of moderate evidence of falling short of the standard (e.g., -1.42 and -1.81, yellow and yellow), we might now have significant evidence that students did not meet the standard instead of just moderate evidence (i.e., combining -1.42 and -1.81 = -2.28 or red).

## Additional Technical Details

Composite = (mean of indices) / (1 / sqrt(n))

Since the indices that go into this mean are already divided by their standard error, they approximate a normal distribution with a mean of 0 and a standard deviation of 1. The standard formula for the standard error of a mean is (standard deviation of the population/sqrt(n)) (see: https://en.wikipedia.org/wiki/Standard_error). As PVAAS has already normalized the indices going into the mean, the standard deviation of all of them is 1, so the formula for the standard error of the mean becomes (1/sqrt(n)). To test that a mean is different from zero, PVAAS divides the mean by its standard error. The result is the composite = (mean of indices)/(1/sqrt(n)).
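The identity in the derivation above, that dividing the mean by its standard error of 1/sqrt(n) is the same as multiplying the mean by sqrt(n), can be checked directly (a minimal sketch, not production code):

```python
import math

def composite_via_standard_error(indices):
    # The indices are assumed to be already normalized (each divided
    # by its own standard error), so the standard error of their mean
    # is 1 / sqrt(n).
    n = len(indices)
    mean = sum(indices) / n
    se = 1 / math.sqrt(n)
    # Dividing by 1/sqrt(n) is algebraically identical to
    # multiplying by sqrt(n).
    return mean / se
```

Using the light blue pair from earlier in this section, `composite_via_standard_error([1.9, 1.1])` gives approximately +2.12.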

The two examples below show that two light blue indicators do not necessarily combine to light blue, and that one yellow and two green indicators do not necessarily combine to green. Both examples also show how the composite can be larger in magnitude than any of the separate indices. In other words, the composite is not really an average but an accumulation of the evidence toward meeting, exceeding, or falling short of the growth standard.

Example 1:

Grade 8 Math = +1.28 (light blue)

Algebra I = +1.74 (light blue)

Composite = average of growth indices multiplied by square root of number of growth indices

= [ (1.28 + 1.74) / 2 ] * sqrt(2)

= 1.51 * 1.414214

= 2.14 (dark blue)

Example 2:

Grade 7 Math = -1.21 (yellow)

Grade 8 Math = -0.46 (green)

Algebra I = -0.89 (green)

Composite = average of growth indices multiplied by square root of number of growth indices

= [ (-1.21 - 0.46 - 0.89) / 3 ] * sqrt(3)

= -0.853 * 1.73205

= -1.48 (yellow)
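Both worked examples can be reproduced with a few lines of Python (for illustration only):

```python
import math

# Example 1: average the two growth indices, then scale by sqrt(2)
example1 = ((1.28 + 1.74) / 2) * math.sqrt(2)
print(round(example1, 2))  # 2.14 (dark blue)

# Example 2: average the three growth indices, then scale by sqrt(3)
example2 = ((-1.21 - 0.46 - 0.89) / 3) * math.sqrt(3)
print(round(example2, 2))  # -1.48 (yellow)
```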

The following table illustrates the relationship between the growth color indicator and Composite Growth Index.

Growth Color Indicator | Growth Index Compared to the Growth Standard | Interpretation
---|---|---
Well Above | At least 2 standard errors above | Significant evidence that the group of students exceeded the growth standard
Above | Between 1 and 2 standard errors above | Moderate evidence that the group of students exceeded the growth standard
Meets | Between 1 standard error above and 1 standard error below | Evidence that the group of students met the growth standard
Below | Between 1 and 2 standard errors below | Moderate evidence that the group of students did not meet the growth standard
Well Below | More than 2 standard errors below | Significant evidence that the group of students did not meet the growth standard
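The bands in the table can be expressed as a simple lookup. This is a sketch: the cut points come from the table above, and the handling of indices that fall exactly on a boundary (e.g., exactly +1 or -2) is an assumption, since the table's wording is ambiguous there:

```python
def growth_color(index):
    """Map a composite growth index to its growth color indicator.

    Boundary handling at exactly +/-1 and -2 is assumed, not
    specified by the table.
    """
    if index >= 2:
        return "Well Above"   # significant evidence of exceeding (dark blue)
    if index > 1:
        return "Above"        # moderate evidence of exceeding (light blue)
    if index >= -1:
        return "Meets"        # evidence the growth standard was met (green)
    if index >= -2:
        return "Below"        # moderate evidence of falling short (yellow)
    return "Well Below"       # significant evidence of falling short (red)
```

Applied to the examples in this section, +2.14 maps to Well Above and -1.48 maps to Below.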