Growth of Student Groups
There are two ways to view information on this report: View Growth by Subject Area or View Growth by Student Group for an individual school in the selected year. Depending on which tab you select, each row of the table shows growth across all tests and grades for a subject, Keystone content area, or student group. Each row displays the growth measure, standard error, growth index, and growth color indicator where available.
You can expand or collapse the information in the table. To see all information in one table, click Expand at the top of the table. To see all information in each row, click the arrow beside the student group or subject.
Technical Details
This report is only available for some subjects.
This report provides School value-added measures for specific student groups, for example, students who are economically disadvantaged and students who are English Learners. In Pennsylvania, the student groups for which School value-added measures are available are:
- All students
- American Indian/Alaskan Native
- Asian
- Black
- Economically disadvantaged
- English learners
- Hispanic
- Lowest performing 33% of students
- Two or more races
- Hawaiian/Pacific Islander
- Students with GIEPs
- Students with IEPs
- White
The PSSA row in each table is the overall growth index that combines PSSA growth measures across grades for that student group and subject. The website does not display a growth measure or standard error for across-grades measures. For more information about how these measures are calculated, see the Statistical Models and Business Rules.
The Keystone row in each table is the overall growth index that combines Keystone growth measures for that student group and subject.
Local assessments have a growth index only for the individual grade and subject.
The minimum number of students required to calculate growth for a student group is the same as for general value-added reporting. A student group must include at least 11 students for the gain model, or at least 11 students with sufficient testing history for the predictive model, in a specific year, subject, and grade to receive a growth measure for that year, subject, and grade. The across-grades measures are calculated using the same minimum-student thresholds. A user will not see a row for a group if there is not enough data available to provide a measure.
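As a minimal sketch of this reporting rule (the function name and inputs are hypothetical, not part of PVAAS):

```python
MIN_STUDENTS = 11  # minimum group size for both the gain and predictive models

def has_reportable_measure(n_students: int) -> bool:
    """Return True if a student group is large enough to receive a growth measure.

    n_students is the number of students in the group for a specific year,
    subject, and grade; for the predictive model, count only students with
    sufficient testing history.
    """
    return n_students >= MIN_STUDENTS
```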
The same rules for excluding students from other value-added reports also apply to this report. These rules are described in Statistical Models and Business Rules. (For example, students not enrolled for a full academic year and students in their first year of receiving English Learner services are excluded.)
A list of students for each student group is not available within this report. Many of the student groups can be found by selecting Student Search from the Reports menu and choosing Who last tested in along with the grade, race, and demographic.
Students are placed in groups based on data from their PSSA, Keystone, or local assessments, with some exceptions.
GIEP students: Students are identified as GIEP students based on the enrollment snapshot data provided by LEAs/districts. To be included, a student must have a GIEP or receive gifted services through an IEP.
Lowest performing 33% of students: PVAAS identifies these students using their scores on assessments. For all Keystone assessments and the PSSA science assessments, the group is identified from the lowest 33% of predicted scores within each assessment. For PSSA math and ELA, the group is identified from the lowest 33% of the average NCEs from the two most recent years of the same subject, calculated per subject and grade. In all cases, the lowest 33% is determined within the school.
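To make the PSSA math and ELA selection concrete, here is a minimal sketch of that logic using pandas. The column names are hypothetical, and the code illustrates the rule as described above rather than the actual PVAAS implementation:

```python
import pandas as pd

def flag_lowest_third(scores: pd.DataFrame) -> pd.DataFrame:
    """Flag the lowest performing 33% of students within a school.

    Assumes hypothetical columns: student_id, school, subject, grade,
    nce_prior_year, nce_two_years_ago (PSSA math/ELA NCEs).
    """
    df = scores.copy()
    # Average the NCEs from the two most recent years of the same subject.
    df["avg_nce"] = df[["nce_prior_year", "nce_two_years_ago"]].mean(axis=1)
    # Rank within the school by subject and grade; the bottom third is flagged.
    df["pct_rank"] = df.groupby(["school", "subject", "grade"])["avg_nce"].rank(pct=True)
    df["lowest_33"] = df["pct_rank"] <= 1 / 3
    return df
```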
Understanding the Report
The model we use to calculate growth measures depends on the assessments administered. Concept of Growth provides additional details.
Growth Measure
Each growth measure is a conservative estimate of the academic growth the students made, on average, in a grade and subject or Keystone content area. Because the growth measures are estimates, consider their associated standard errors as you interpret the values.
The growth measure is calculated differently for assessments analyzed with the growth standard methodology than it is for PSSA Science and Keystone assessments.
Standard Error
All growth measures on the PVAAS reports are estimates. Every estimate has some amount of uncertainty, which is quantified by the standard error. The standard error defines a confidence band around the growth measure, which describes how strong the evidence is that the group of students exceeded, met, or fell short of the growth standard.
For more information about standard errors, see Growth Measures and Standard Errors.
Calculating the Growth Index
The standard error is used in conjunction with the growth measure to calculate the growth index. Specifically, the growth index is the growth measure divided by its standard error. This calculation yields a robust measure of growth for the group of students that reflects both the growth and the amount of evidence. All index values are on the same scale and can be compared fairly across years, grades, and subjects throughout the reference group.
For some of the overall growth indices, a corresponding growth measure and standard error might not be available. A user will not see a row for a group if there is not enough data available to provide a measure.
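As a simple illustration of this calculation, the sketch below divides a growth measure by its standard error; the values used are hypothetical:

```python
def growth_index(growth_measure: float, standard_error: float) -> float:
    """Growth index: the growth measure expressed in standard-error units."""
    return growth_measure / standard_error

# Example: a growth measure of 1.8 with a standard error of 0.9 yields an index
# of 2.0, meaning the measure is two standard errors above the growth standard.
print(growth_index(1.8, 0.9))  # 2.0
```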
Growth Color Indicators
The Growth Color Indicator column is color-coded to aid in interpretation. The colors indicate how strong the evidence is that the student group exceeded, met, or fell short of the growth standard.
The legend at the bottom of the report provides guidance for interpreting the colors.
Growth Color Indicator | Growth Index Compared to the Growth Standard | Interpretation |
---|---|---|
Well Above | At least 2 standard errors above | Significant evidence that the group of students exceeded the growth standard |
Above | Between 1 and 2 standard errors above | Moderate evidence that the group of students exceeded the growth standard |
Meets | Between 1 standard error above and 1 standard error below | Evidence that the group of students met the growth standard |
Below | Between 1 and 2 standard errors below | Moderate evidence that the group of students did not meet the growth standard |
Well Below | More than 2 standard errors below | Significant evidence that the group of students did not meet the growth standard |
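Because the growth index is already expressed in standard-error units, the legend above can be read as a simple classification of the index. The sketch below follows the thresholds in the table; how values falling exactly on a boundary (exactly 1 or 2 standard errors) are categorized is an assumption here, not something the table specifies:

```python
def growth_color_indicator(index: float) -> str:
    """Map a growth index to the color category described in the legend above."""
    if index >= 2:
        return "Well Above"   # at least 2 standard errors above
    if index >= 1:
        return "Above"        # between 1 and 2 standard errors above
    if index >= -1:
        return "Meets"        # within 1 standard error of the growth standard
    if index >= -2:
        return "Below"        # between 1 and 2 standard errors below
    return "Well Below"       # more than 2 standard errors below
```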
As you consider the student group growth data, note any patterns of growth you observe. These patterns provide insight into the overall impact of the school's instructional program. For example:
- Looking across subjects/grades and/or Keystone content areas, did some student groups consistently make more growth than others?
- As you look at each group, consider whether there was stronger evidence of growth in one subject/grade than in others. Is this pattern similar across groups? Does this pattern persist across multiple subjects, grades, or Keystone content areas?
Schools should also consider the proportion of the school's enrollment represented by each group. For example, in a school where 99% of students are economically disadvantaged, the growth of all students at the school will be very similar to, if not the same as, the growth of the economically disadvantaged student group. In addition, growth patterns highlight both strengths and areas for improvement within specific student groups.