2004 HSA Technical Report

Technical Documentation for the
Maryland High School Assessment Program

Algebra/Data Analysis, Biology, English I, Geometry
and Government End-of-Course Assessments



 
TABLE OF CONTENTS

Introduction

Section 1 Test Construction and Administration

Test Development
Table 1.1 Number of Items on Operational HSA Forms by Item Type

Test Specifications
Table 1.2 Algebra/Data Analysis Blueprint
Table 1.3 Biology Blueprint
Table 1.4 English I Blueprint
Table 1.5 Geometry Blueprint
Table 1.6 Government Blueprint

Item Selection and Form Design
Table 1.7 January Administration
Table 1.8 May Administration

Figure 1.1 Test Characteristic Curve: Algebra

Figure 1.2 Conditional Standard Error of Measurement: Algebra

Figure 1.3 Test Characteristic Curve: Biology

Figure 1.4 Conditional Standard Error of Measurement: Biology

Figure 1.5 Test Characteristic Curve: English I

Figure 1.6 Conditional Standard Error of Measurement: English I

Figure 1.7 Test Characteristic Curve: Geometry

Figure 1.8 Conditional Standard Error of Measurement: Geometry

Figure 1.9 Test Characteristic Curve: Government

Figure 1.10 Conditional Standard Error of Measurement: Government

Appendix 1.A. Linking Study: 2000-2001 to the Operational Scale (2003)
Table 1.A.1 Algebra
Table 1.A.2 Biology
Table 1.A.3 English I
Table 1.A.4 Geometry
Table 1.A.5 Government
Table 1.A.6 Correlations of Reference (Anchor) and Link Item Parameters

Section 2 Validity

Appendix 2.A Factor Analysis Results
Figure 2.A.1 Algebra Scree Plot
Figure 2.A.2 Biology Scree Plot
Figure 2.A.3 English I Scree Plot
Figure 2.A.4 Geometry Scree Plot
Figure 2.A.5 Government Scree Plot

Section 3 Scoring Procedures and Score Types

Scale Scores

Conditional Standard Errors of Measurement

Subscore Scoring

Lowest and Highest Obtainable Test Scores
Table 3.1 LOSS and HOSS Values

Cut-Scores
Table 3.2 HSA 2004 Cut-Scores

Appendix 3.A Review and Replication Analysis English 2003
Table 3.A.1 Number and Type of Item by Form
Table 3.A.2 Comparison of January Forms Relative to Previous Administrations
Table 3.A.3 Number and Type of Item by Form
Table 3.A.4 Composition of 2003 Operational Forms Relative to Previous Administrations
Table 3.A.5 CTB/McGraw-Hill Summary Statistics English 2003
Table 3.A.6 CTB/McGraw-Hill Summary Statistics by Administration and Year
Table 3.A.7 Characteristics of Calibration Samples by Form for January 2003

Table 3.A.8 Characteristics of Calibration Samples by Form for May 2003

Figure 3.A.1 S&L January 2003

Figure 3.A.2 Differences in Item Parameter A Values Compared to 2002

Figure 3.A.3 Differences in Item Parameter B Values Compared to 2002

Figure 3.A.4 Differences in Item Parameter C Values Compared to 2002

Table 3.A.9 Descriptive Statistics January 2003 after Stocking and Lord

Table 3.A.10 Descriptive Statistics January 2003 after Linear Equipercentile

Table 3.A.11 Descriptive Statistics January 2003 Omitting Form W S&L Link

Figure 3.A.5 S&L May 2003

Figures 3.A.6-3.A.8 Differences in Anchor Item Parameter Values: Forms A-C Compared to Forms D-L

Table 3.A.12 Summary Statistics May 2003 after Stocking and Lord

Table 3.A.13 Descriptive Statistics May 2003 after Linear Equipercentile


Appendix 3.B Evaluating the Use of Item-Pattern and Number-Correct to Scale Score Scoring for Reporting Subscores

Table 3.B.1 Distribution of Items by Type for Each Subscore

Table 3.B.2 Summary Statistics

Figures 3.B.1-3.B.5 Bivariate Plots of NC and IP Scale Scores

Figures 3.B.6-3.B.10 Empirical Conditional Standard Errors of Scale Scores for Item Pattern (IP) and Number Correct (NC) Scoring Methods

Table 3.B.3 Number and Percent of Simulees Assigned the LOSS by Subscore

Table 3.B.4 Expectation 3.2 Item Parameters

Figure 3.B.11 Expectation 3.2 Item Characteristic Curves and Expectation 3.2 Characteristic Curve

Table 3.B.5 Distribution of IP and NC Scale Scores for Expectation 3.2 within the True Score Grouping 320-359

Table 3.B.6 Expectation 3.2 Item Pattern Response Patterns and Associated IP and NC Scale Scores

Table 3.B.7 Expectation 1.1

Table 3.B.8 Expectation 1.2

Table 3.B.9 Expectation 3.1

Table 3.B.10 Expectation 3.2

Table 3.B.11 Total Test

Table 3.B.12 Simulation of Aggregate Scores

Table 3.B.13 Differences Between Mean IP and NC Scores

Figure 3.B.12 Bivariate Plots IP and NC Mean Scores


Appendix 3.C Establishing the HOSS and LOSS

Section 4 Test-Level Analyses

Demographic Distributions
Table 4.1 Demographic Information for Algebra
Table 4.2 Demographic Information for Biology
Table 4.3 Demographic Information for English
Table 4.4 Demographic Information for Geometry
Table 4.5 Demographic Information for Government

Score Distributions and Summary Statistics
Table 4.6 Mean Scores by Administration

Figure 4.1 Comparison of Scale Score Distribution: Algebra 2004

Figure 4.2 Comparison of Scale Score Distribution: Biology 2004

Figure 4.3 Comparison of Scale Score Distribution: English 2004

Figure 4.4 Comparison of Scale Score Distribution: Geometry 2004

Figure 4.5 Comparison of Scale Score Distribution: Government 2004
Table 4.7 Comparison of Mean Scores from 2002, 2003, and 2004
Table 4.8 Comparison of Passing Rates from 2002, 2003, and 2004
Table 4.9 Comparison of Geometry Passing Rates from 2002, 2003, and 2004

Speededness
Table 4.10 Proportion of Students Omitting the Last 5 Items in the First Session: January
Table 4.11 Proportion of Students Omitting the Last 5 Items in the First Session: May

Reliability
Table 4.12 Summary Statistics for Algebra Primary Forms
Table 4.13 Summary Statistics for Algebra Make-Up Forms
Table 4.14 Summary Statistics for Biology Primary Forms
Table 4.15 Summary Statistics for Biology Make-Up Forms
Table 4.16 Summary Statistics for English Primary Forms
Table 4.17 Summary Statistics for English Make-Up Forms
Table 4.18 Summary Statistics for Geometry Primary Forms
Table 4.19 Summary Statistics for Geometry Make-Up Forms
Table 4.20 Summary Statistics for Government Primary Forms
Table 4.21 Summary Statistics for Government Make-Up Forms

Section 5 Field Test Analyses

Classical Item Analyses

Differential Item Functioning (DIF)

IRT Calibration and Scaling

Government Constructed Response Study

Statistical Summary Tables
Table 5.1 Distributions of P-Values for January Field Test SR Items
Table 5.2 Distributions of P-Values for January Field Test CR Items
Table 5.3 Distributions of Item-Total Correlations for January Field Test SR Items
Table 5.4 Distributions of Item-Total Correlations for January Field Test CR Items
Table 5.5 Distributions of P-Values for May Field Test SR Items
Table 5.6 Distributions of P-Values for May Field Test CR Items
Table 5.7 Distributions of Item-Total Correlations for May Field Test SR Items
Table 5.8 Distributions of Item-Total Correlations for May Field Test CR Items
Table 5.9 Field Test Items Excluded from Analyses: January
Table 5.10 Field Test Items Excluded from Analyses: May

Appendix 5.A Maryland High School Assessment Special Study: Directional Statements Accompanying the Government Constructed Responses

Figure 5.A.1 Government Brief Constructed Response Item: With Instruction

Figure 5.A.2 Government Brief Constructed Response Item: Without Instruction
Table 5.A.1 Classical Item Statistics
Table 5.A.2 Frequency Distribution of Score Points
Table 5.A.3 IRT Parameter Estimates

Figure 5.A.3 Item Characteristic Curve for CR Item 1

Figure 5.A.4 Item Characteristic Curve for CR Item 2

Figure 5.A.5 Item Characteristic Curve for Each Response Option of Item 1

Figure 5.A.6 Item Characteristic Curve for Each Response Option of Item 2

Figure 5.A.7 Information Function for CR Item 1

Figure 5.A.8 Information Function for CR Item 2

Attachments

HSA 2004 Scoring Contractor's Report
January 2004 Agreement Rates
May 2004 Agreement Rates
Handscoring Activity Timelines
Score Distribution (Field Test)
Score Distribution (Operational)

A complete copy of the 2004 HSA Technical Report is also available.

Contact Information
Leslie Wilson, Assistant State Superintendent
Division of Accountability and Assessment
Maryland State Department of Education
200 West Baltimore Street
Baltimore, MD 21201