
Educational policy analysis archives


Material Information

Title: Educational policy analysis archives
Publisher: Arizona State University; University of South Florida
Place of Publication: Tempe, Ariz.; Tampa, Fla.
Publication Date: October 16, 2001
Subjects / Keywords: Education -- Research -- Periodicals (lcsh); non-fiction (marcgt); serial (sobekcm)

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00239
usfldc handle - e11.239

Full Text
Education Policy Analysis Archives
Volume 9, Number 42, October 16, 2001. ISSN 1068-2341

A peer-reviewed scholarly journal. Editor: Gene V Glass, College of Education, Arizona State University. Copyright 2001, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Significance of Test-based Ratings for Metropolitan Boston Schools

Craig Bolon
Planwright Systems Corporation (USA)

Citation: Bolon, C. (2001, October 16). Significance of Test-based Ratings for Metropolitan Boston Schools. Education Policy Analysis Archives, 9(42). Retrieved [date] from

In 1998 Massachusetts began state-sponsored, annual achievement testing of all students in three public school grades. It has created a school and district rating system for which scores on these tests are the sole factor. It proposes to use tenth-grade test scores as a sole criterion for high school graduation, beginning with the class of 2003. The state is treating scores and ratings as though they were precise educational measures of high significance. A review of tenth-grade mathematics test scores from academic high schools in metropolitan Boston showed that statistically they are not. Community income is strongly correlated with test scores and accounted for more than 80 percent of the variance in average scores for a sample of Boston-area communities.


Once community income was included in models, other factors--including percentages of students in disadvantaged populations, (Note 1) percentages receiving special education, percentages eligible for free or reduced price lunch, percentages with limited English proficiency, school sizes, school spending levels, and property values--all failed to associate substantial additional variance. Large uncertainties in residuals of school-averaged scores, after subtracting predictions based on community income, tend to make the scores ineffective for rating performance of schools. Large uncertainties in year-to-year score changes tend to make the score changes ineffective for measuring performance trends.

Contents

Section 1: Background
A. State Testing in Massachusetts Public Schools
B. Schools in the Boston Metropolitan Area
C. Statewide MCAS Test Results
D. Test Score Studies
E. Sources of Data
Section 2: Data Analysis
A. Trends Study of Variability
B. Effects Study Involving Social Factors
C. Observations
D. Summary Analysis
Section 3: Results
A. Opportunities and Questions
B. Conclusions
Appendix 1: Education Reform in Massachusetts


Appendix 2: Massachusetts Vocational Schools
Appendix 3: Metropolitan Boston MCAS Mathematics Scores
Appendix 4: Metropolitan Boston School Characteristics

Section 1: Background

A. State Testing in Massachusetts Public Schools

The most recent form of state testing in Massachusetts public schools is the Massachusetts Comprehensive Assessment System (MCAS), a set of achievement tests sponsored and produced by the Massachusetts Department of Education and administered in the spring of years beginning in 1998 (see Appendix 1 and Bolon, 2000). These tests have four sections: English language arts, mathematics, science and technology, and history and social science. For the years 1998-2000, tests were administered in grades 4, 8 and 10. Beginning in April, 2001, the former grade 4 test sections were divided between grades 4 and 5; new tests have been added in grades 3, 6 and 7.

MCAS tests are loosely timed and include questions in multiple choice, short answer and extended answer formats; they are provided in English and Spanish. All public school students are required to take MCAS tests; there are no "opt-out" provisions. Students taught in parochial schools, other private schools, home schooling and out-of-state schools are not required to take or pass MCAS. For 1998 through 2000, the Department of Education has published test questions used for scoring approximately six months after test administration (see Mass. DoE, 2000g for example). It has produced new test forms each year. According to current plans, starting with the class of 2003 minimum scores on the English language arts and mathematics sections will be required to graduate from high school (Mass. DoE, 1999f) and to enroll at state colleges, except for MCAS test preparation courses at two-year colleges.

The Massachusetts Department of Education publishes MCAS results as scaled scores in a range of 81 scale points, using the integers 200 through 280 (Mass. DoE, 2000h). The Department assigns labels it calls "performance levels" to four scaled score intervals (Mass. DoE, 1998b) and currently considers 220 the minimum passing scaled score on all test sections (Mass. DoE, 1999f). The Department has not fully disclosed details of assigning scale factors, assuring consistent scores across test forms, or assuring that scores quantitatively reflect published academic standards. It has not published distributions of either raw scores or scaled scores. It has released limited information about test design and properties in "technical reports" for the years 1998 (Mass. DoE, 1999c) and 1999 (Mass. DoE, 2000i). After an independent analysis of score averages by "racial" and "ethnic" categories for 1998 (Uriarte and Chavez, 1999), the Department published its own analysis of this type for 1999 (Mass. DoE, 2000c).

Massachusetts tests appear to rank near the high end of state achievement tests in difficulty, although failure rates are lower than those for some tests used in Arizona and Virginia. As in several other states, substantially higher failure rates are found on mathematics than on language tests in high schools. Since the tenth-grade version of the mathematics test section sets the graduation threshold for most students, its scores have been used as subjects for these studies.

B. Schools in the Boston Metropolitan Area


Metropolitan Boston is diverse. Besides the City of Boston it includes many smaller municipalities, all operating their own school systems. These studies consider communities inside Route 128, a highway designed in the late 1940s (now an Interstate), enclosing areas within about 9-12 miles from Boston's government center--that is, Boston and its inner and middle suburbs. They share a public transit system, several public and private utilities, and an economy dominated by service industries. They include poverty areas, concentrations of wealth, middle-income communities, prosperous suburban towns, a few medium-sized cities and one large city. The areas are bounded by the Massachusetts Bay and Atlantic Ocean to the east, Salem and Peabody to the north, Waltham and Newton to the west, and Braintree and Quincy to the south (see Metropolitan Area, 1997).

Schools in the Boston metropolitan area are also diverse. These studies, focusing on testing for graduation, consider only high schools. While a majority of the area's population of high-school age attends public schools, (Note 2) a substantial proportion attends parochial schools that began to be established by the Roman Catholic Church more than 150 years ago. A smaller fraction is taught in other private schools or through home schooling. The Metropolitan Council for Educational Opportunities (METCO), founded in 1963, uses state funding to help send over 3,000 Boston minority students to suburban schools (Orfield, et al., 1997).

Within public school systems there is also substantial diversity. All communities must support regional "vocational," "technical" and "agricultural" high schools. Some such schools began as "manual training" schools in the 1800s. Some communities have closed their local vocational schools; some have merged them with their academic schools. These studies look in detail only at academic schools, because the curriculum of vocational schools is substantially different and is not designed to prepare students for MCAS tests, an issue of controversy (Nicodemus, 2000). For purposes of these studies there are difficulties with a few communities, including Cambridge, Quincy, Revere and Waltham, which provide vocational education in the same facilities as academic programs (Mass. DoE, 2000f). I chose to include such schools in these studies while noting their special characteristics.

Several communities also operate experimental schools, including "pilot schools" in Boston and "charter schools" in several communities (Partee, 1997 and Wood, 1999), as regulated under the Massachusetts Education Reform Act of 1993. All that offer ninth-grade curriculum and above are smaller than the regular academic schools. These schools provide motivational environments and may exercise indirect forms of student selection that differentiate them from other public schools. Primarily because of concerns about small sample sizes, schools with fewer than 100 students per grade are excluded from these studies. So far no experimental school is that large.

The City of Boston presents a unique situation. Of its large academic high schools, three are exam schools: the Boston Latin School (founded in 1635) and the more recent Latin Academy (formerly Girls Latin) and O'Bryant School of Mathematics and Science (formerly Boston Technical). These draw away many Boston students who tend to score well on achievement tests, promoting a longstanding social stratification in Boston schools. Over half the students at Boston Latin come to it from parochial and other private schools (Daley, 1997); some say those students would not otherwise attend Boston schools. However, other public school students who are not admitted leave the


district for high school. Starting in 1975, because of federal court orders to desegregate, exam school admission policies included a 35 percent set-aside for African American and Latino students, maintained voluntarily after 1987. As a result of another federal court decision (McLaughlin, 1996), this approach was weakened in 1997. As with academic schools that provide vocational education, the Boston exam schools are included in these studies, but their special characteristics are noted.

C. Statewide MCAS Test Results

Table 1-1 shows that statewide, tenth-grade MCAS test scores have remained nearly constant in English language arts and in science and technology for the years 1998-2000, while scores in mathematics have risen substantially (Mass. DoE, 2000h). (Tenth-grade tests were not given in history and social science.)

Table 1-1
MCAS Statewide Results, 1998-2000

Section   Year   Average   % Level 4   % Level 3   % Level 2   % Level 1
English   2000     229         7          29          30          34
English   1999     229         4          30          34          32
English   1998     230         5          33          34          28
Math.     2000     228        15          18          22          45
Math.     1999     222         9          15          23          53
Math.     1998     222         7          17          24          52
Science   2000     226         3          23          37          37
Science   1999     226         3          21          39          38
Science   1998     225         1          21          42          36

Source of data: Mass. DoE, 2000h

Table 1-1 reflects the Massachusetts Department of Education practice of recording students absent for a test section as scoring 200 and in Level 1, the lowest level (Mass. DoE, 2000h, Table 11 footnote). An undisclosed fraction of students were excluded from testing because of special conditions and are not counted in this report; others may have been provided with an alternative assessment. As currently planned, students in Level 1 will be ineligible to graduate from high school as of 2003. Based on this record of scores, about half of all Massachusetts public school students are at risk of being denied graduation.

The labels of the four "performance levels" designated by the 1998 Board of Education (Note 3) for reporting MCAS results are:

Level 4, Advanced
Level 3, Proficient
Level 2, Needs Improvement
Level 1, Failing
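The scaled-score-to-level mapping described above can be sketched as a simple lookup. The 220 passing threshold is stated in the text; the 240 and 260 cutpoints for Proficient and Advanced are illustrative assumptions, not figures from this article.

```python
# Sketch: map an MCAS scaled score (integers 200-280) to a performance level.
# The 220 passing cut comes from the text; the 240 and 260 cutpoints for
# Proficient and Advanced are assumed here for illustration only.
def performance_level(score):
    if not 200 <= score <= 280:
        raise ValueError("scaled scores range from 200 to 280")
    if score < 220:
        return 1, "Failing"
    if score < 240:          # assumed cutpoint
        return 2, "Needs Improvement"
    if score < 260:          # assumed cutpoint
        return 3, "Proficient"
    return 4, "Advanced"

print(performance_level(219))   # just below the minimum passing score
print(performance_level(222))   # at the statewide 1999 math average
```

Under this sketch, a school average of 222 (the statewide 1999 math figure) sits barely inside the "Needs Improvement" band, which is why roughly half of students fall in Level 1.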


Although these levels have qualitative descriptions (Mass. DoE, 1998b), there are no quantitative links to levels of achievement specified in academic standards; content of standards has not been prioritized; nor have standards been promulgated through state regulations, as anticipated by law. (Note 4) Although Massachusetts law requires "competency determination" in mathematics, science and technology, history and social science, foreign languages and English, (Note 5) Massachusetts laws and regulations continue to require only US history and physical education as subjects of instruction. Massachusetts tries to set legal standards for learning indirectly (Note 6) through MCAS tests, procedures to set scale factors, and regulations for minimum scaled scores. It lacks corresponding legal commitments for instruction. It has made major changes to "curriculum frameworks" every few years (Mass. DoE, 2000k) and has not provided reasonable spans of time for instruction to catch up before using revised "curriculum frameworks" as a basis for revised MCAS tests. Its teachers, parents and students cannot find out exactly what must be learned in order to meet minimum standards for high school graduation. The 1993 Education Reform Act left several such problems; few have been addressed yet by the Massachusetts legislature or Board of Education.

Students with disabilities (also called special education students) and students with limited English proficiency (LEP students) tend to receive drastically lower MCAS scores than other students, although some students with disabilities are soon to be provided alternate assessments (Mass. DoE, 2000l), and some LEP students have been able to take tests in Spanish (Mass. DoE, 2000d). The Department of Education has not disclosed the fractions of students who are eligible for or have utilized its special accommodations, although it has published statewide summary data using these student categories (Mass. DoE, 2000i, Table 14.5). Most minority students also receive lower scores than other students. The Department of Education has published 1999 statewide and district summary data for students categorized as "African American / Black," "Asian or Pacific Islander," "Hispanic / Latino," "Native American," "White" and "Mixed" (Mass. DoE, 2000c, Tables 5-10). As previously noted, most students in vocational programs receive lower MCAS scores than students in academic programs; this can readily be shown for the state's more than 30 vocational, technical and agricultural high schools (Appendix 2).

Based on the sources of information cited, Table 1-2 shows statewide impacts of these known risk factors on average 1999 tenth-grade mathematics scores and rates of failure.

Table 1-2
MCAS 1999 Grade 10 Math Scores by Risk Factors

Category                       Average Score   Percent Failing
All students                        222              53
Students with disabilities          203              92
Limited English proficiency         206              84
African American                    209              80
Hispanic / Latino                   208              85
Native American                     211              77


Vocational, technical, agricultural 210              78

Source of data: Mass. DoE, 2000i

The Department of Education has not reported scores classified by other potential risk factors on which it collects information. These include:

Gender of students
Tests taken in Spanish or as alternate assessments
Free or reduced price lunches, as indicators of poverty
Schools with large class sizes, especially in early grades
Students retained below grade or placed below grade level
Teachers who lack certification in their subjects of instruction

There is also little published information about combinations of risk factors. However, since the Department of Education lists regional vocational, technical and agricultural schools as separate districts in its reports of MCAS results, it is possible to use their categories of minority students (Appendix 2). For those schools for which categories are reported, results are shown in Table 1-3.

Table 1-3
MCAS 1999 Grade 10 Math Scores by Combined Risk Factors

Combined Category                 Average Score   Percent Failing
Vocational + African American          203              97
Vocational + Hispanic / Latino         205              95

Sources of data: Mass. DoE, 2000i; Mass. DoE, 2000h

While the results in Table 1-3 are not strictly comparable with Table 1-2, because not all the schools and categories can be found in published data, they indicate that factors can combine to worsen the scores of students with more than one risk factor.

D. Test Score Studies

Recent studies question assumptions that "high-stakes" tests like MCAS can provide valid measures of either student achievement or school performance, showing gains on them that are not matched by gains on other tests for closely related educational content (Haney, 2000 and Klein, et al., 2000). Political environments of "high-stakes" tests create heavy pressure to improve scores, regardless of underlying educational progress. For "low-stakes" tests aimed at measuring long-term trends, like those of the federal NAEP, it has been shown that "family variables explain most of the variance across scores in states" (Grissmer, et al., 2000, Chapter 9). Individual and longitudinal studies demonstrate strong influences of parenting practices, family structure, parent education and degrees of poverty on cognitive development (for example, Smith, et al., 1997). Other longitudinal and cross-sectional studies show cumulative responses of test scores to educational environments (for example, Phillips, et al., 1998 and Ferguson, 1998). However, the data generally available for test score research fail to capture much of the critical information needed to understand development of cognitive abilities and educational achievement in the settings of public schools.

MCAS test scores have already been the subject of several attempts to explain, predict or interpret them (Mass. DoE, 2001; Gaudet, 2001; Tuerck, 2001a and Tuerck, 2001b). These prior MCAS test score studies fall into three main categories: 1) trends studies of year-to-year and multi-year changes; 2) effects studies involving social factors for the


population; 3) effects studies involving operating factors for the schools. Research on scores from school-based standard tests suggests that many such studies are likely to yield results of low significance. Grissmer, et al., 2000, among others, show that:

Real year-to-year changes in average student performance, as assessed by conventional tests, are relatively small; they can easily be masked by statistical uncertainties.
Social factors are strongly associated with test scores.
Self-reported social information tends to have high error and omission rates.
Census and other community-based social information often includes confounding factors that require adjustment to reflect the households for a school population.
Uncategorized school spending is only weakly associated with test scores.

The MCAS test score studies cited use scores and statistical data to estimate the performance of schools or districts according to simple formulas, unsupported by other evidence. They frequently present results in a table that is ranked or can be ranked like the teams in a sports league. The "league table" approach to presenting such results begs the question of whether the ordering of schools or districts and the differences in performance estimates have educational significance; that is, whether such rankings may instead be largely matters of chance or be associations with factors other than school performance. This article presents a trends study and an effects study I conducted to explore the significance that can be associated with such results.

E. Sources of Data

The school characteristics used in these studies are taken from information reported by public schools to the Massachusetts Department of Education for 1999 and published by the Department (Mass. DoE, 2000f). MCAS test scores summarized by schools are from 1998-2000 Department reports (Mass. DoE, 2000h). Other information is published by the Department for school districts, including program budgets and percentages of special education students. Information for census tracts and communities is available from the US Bureau of the Census and other sources. Data analysis for these studies focuses on information associated with individual schools, because aggregate information for school districts or general populations can mask school characteristics. Data used in these studies are reproduced in Appendix 3 and Appendix 4; interested readers can confirm them at the sources and can repeat these studies or perform other analysis with them.

The Department of Education and the school districts collect other potentially useful information that is not currently published. Of particular interest are data on class size and teacher preparation. Recent research has shown significant association of educational achievement as measured by "low-stakes" tests with small class size in elementary schools (Nye, et al., 1999 and Krueger, 1999) and with teacher certification and education (Darling-Hammond, 2000), after adjustments for student backgrounds. Studies of the development of cognitive abilities cast doubt on whether other information currently published by government sources about population and economic characteristics in large geographical areas would substantially improve the understanding of test scores.


Section 2: Data Analysis

A. Trends Study of Variability

This study considers 47 academic high schools in 32 metropolitan Boston communities through the average tenth-grade MCAS mathematics test scores recorded for years 1998-2000. Achievement tests in mathematics typically require substantial skill at language interpretation (see, for example, Gipps and Murphy, 1994, Chapter 6, p. 183). Haney, 2000, in a study of another state, found stronger correlations of state mathematics test scores with grades in English than with grades in math. As previously noted, the tenth-grade mathematics test is used in this study of significance because it sets a graduation threshold for most students.

Test boycotts have been organized by students in several schools each year (Steinberg, 2000), involving 10 to 31 percent of students in 19 cases out of the 141 test samples. To be able to compare average scores of schools more accurately, the average scores reported by the Department of Education have been adjusted by removing the scores of 200 that were assigned to students who did not take the test, averaging only scores of students who participated.

Table 2-1 shows changes in schools' average scores (Appendix 3) between 1998 and 1999 and between 1999 and 2000, expressed in units of scale points and of standard deviations.

Table 2-1
MCAS Grade 10 Math Test Score Changes by School, 1998-2000

                                     Changes 1998-1999    Changes 1999-2000
City or Town   High School           Points    Delta      Points    Delta
Arlington      Arlington                4        3            5       -1
Belmont        Belmont                 -2       -4            4       -2
Boston         Boston High              1        0            6        0
Boston         Brighton                 2        1            3       -4
Boston         Charlestown             -1       -2            4       -2
Boston         Dorchester              -2       -3            1       -5
Boston         East Boston              1        0            5       -1
Boston         Hyde Park                0       -1            1       -5
Boston         Jeremiah Burke           4        3            3       -3
Boston         South Boston             0       -1            5       -1
Boston         The English High         1        0            4       -2
Boston         West Roxbury             3        2            5       -1
Boston Exam    Boston Latin             8       10            8        3
Boston Exam    Latin Academy            3        2           19       17
Boston Exam    O'Bryant Science         4        4            8        3
Braintree      Braintree               -1       -3           13       10
Brookline      Brookline                2        1            5       -1
Cambridge      Rindge & Latin*         -2       -5           -1      -10
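The participation adjustment described above is simple arithmetic: since each non-participant is recorded as scoring 200, a participant-only mean can be recovered from the reported mean, the enrollment, and the number of non-participants. A minimal sketch (the function and variable names are mine, not the author's):

```python
# Recover the average over test participants only, given a reported
# school average that counts each non-participant as a score of 200.
def participant_average(reported_mean, n_students, n_nonparticipants):
    if n_nonparticipants >= n_students:
        raise ValueError("no participants to average over")
    total = reported_mean * n_students       # implied sum over all students
    total -= 200 * n_nonparticipants         # remove the assigned 200s
    return total / (n_students - n_nonparticipants)

# Hypothetical example: 100 students, reported mean 224, 10 boycotters.
print(round(participant_average(224.0, 100, 10), 1))   # -> 226.7
```

The adjustment always raises the mean (unless participants averaged exactly 200), which is why boycotts would otherwise bias school comparisons downward.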


Table 2-5
Factor Correlations for MCAS Grade 10 Math Test Scores

Factor                                        A      B      C      D      E      F      G
A  School population, average per grade     1.00   -.11    .24    .02   -.03   -.01    .16
B  Percent African American                 -.
C  Percent Asian or Pacific Islander         .
D  Percent Hispanic / Latino                 .
E  Percent limited English proficiency      -.
F  Percent free or reduced price lunch      -.
G  Percent reduction, grades 9+10 to 11+12   .

Sources of data: Appendix 4; Statistica model

Some of the correlations in Table 2-5 are strong enough that multiple regression coefficients are likely to be unstable. Therefore a model was developed in stages, examining factors for significance.

The full model from the factors in Table 2-4 was first evaluated with weights proportional to numbers of test participants. It yielded two strong factors with low correlation (C and E): "Percent Asian or Pacific Islander" at p<.02, with a positive coefficient, and "Percent limited English proficiency" at p<.002, with a negative coefficient. Factors for school population and percent grade reduction had particularly small coefficients and low significance. They were removed, and a model with the remaining five factors then associated 67 percent of the variance and produced the factor weights shown in Table 2-6.

Table 2-6
5-Factor Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                   Coefficient   Standard Error
Intercept, for all factors zero             229.4          1.936
B  Percent African American                 0.047          0.104
C  Percent Asian or Pacific Islander        0.347          0.154
D  Percent Hispanic / Latino               -0.002          0.183
E  Percent limited English proficiency     -0.637          0.217
F  Percent free or reduced price lunch     -0.174          0.157

Sources of data: Appendix 3; Appendix 4; Statistica model

With model factors in Table 2-6, the high factor weight and significance found in other studies for percentages of African American or Latino students disappear. Both factors have small coefficients and low significance. Statistical weight that might have been attached to these factors instead follows cultural and economic factors: "Percent limited English proficiency" and "Percent free or reduced price lunch." As an experiment, the model was rerun with the latter factors removed; only 57 percent of the variance was associated, and factor weights became those shown in Table 2-7.
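The staged modeling described here, a linear fit with weights proportional to the number of test participants, can be sketched as a weighted least-squares regression. This is a generic numpy reconstruction under my own assumptions, not the author's Statistica procedure, and the numbers below are synthetic:

```python
import numpy as np

def weighted_ols(X, y, w):
    """Weighted least squares: minimize sum(w * (y - X @ beta)**2).
    Scaling each row by sqrt(w) reduces WLS to ordinary least squares."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

# Synthetic check with 47 "schools" (the study's sample size):
rng = np.random.default_rng(0)
n = 47
factors = rng.uniform(0, 50, size=(n, 2))          # two percentage factors
X = np.column_stack([np.ones(n), factors])          # intercept + factors
true_beta = np.array([229.4, 0.35, -0.64])          # shaped like Table 2-6
y = X @ true_beta                                   # exact linear outcome
w = rng.integers(100, 400, size=n).astype(float)    # test participants
print(np.round(weighted_ols(X, y, w), 2))
```

Because the synthetic outcome is exactly linear, the fit recovers the chosen coefficients; with real scores, the weights matter because averages from larger schools carry smaller sampling error.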


Table 2-7
Racial and Ethnic Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                   Coefficient   Standard Error
Intercept, for all factors zero             230.0          2.107
B  Percent African American*               -0.221          0.068
C  Percent Asian or Pacific Islander        0.219          0.156
D  Percent Hispanic / Latino*              -0.435          0.114

Sources of data: Appendix 3; Appendix 4; Statistica model

In Table 2-7, two "racial" or "ethnic" factors (marked *) have become significant at a p<.05 level. The coefficient for "Percent African American" has turned from positive to negative, and the coefficient for "Percent Hispanic / Latino" has become strongly negative. It seems likely that these two factors are acting as proxies for cultural and economic factors with more predictive power.

Residuals from the five-factor model of Table 2-6 are shown in Table 2-8. These include standard error estimates based on results from the trends study of Section 2A.

Table 2-8
Residuals for MCAS Grade 10 Math Test Scores, 5-Factor Model

City or Town   High School        Residual   Std. Error   Ratio
Arlington      Arlington             4.6        2.6         1.8
Belmont        Belmont              12.2        2.7         4.5
Boston         Boston High          -8.9        5.1        -1.7
Boston         Brighton              1.7        3.3         0.5
Boston         Charlestown           1.0        4.8         0.2
Boston         Dorchester           -1.6        4.3        -0.4
Boston         East Boston           2.1        4.2         0.5
Boston         Hyde Park            -6.1        4.8        -1.3
Boston         Jeremiah Burke        4.1        5.0         0.8
Boston         South Boston         -8.6        3.7        -2.3
Boston         The English High     11.4        5.0         2.3
Boston         West Roxbury          1.2        3.9         0.3
Boston Exam    Boston Latin         21.3        3.5         6.0
Boston Exam    Latin Academy         4.0        4.1         1.0
Boston Exam    O'Bryant Science      4.65       4.3         1.1
Braintree      Braintree            -1.78       2.5        -0.7
Brookline      Brookline             9.5        2.3         4.1
Cambridge      Rindge & Latin*      -6.3        4.0        -1.6


Chelsea        Chelsea               2.2        7.0         0.3
Dedham         Dedham               -1.6        3.1        -0.5
Everett        Everett*             -2.8        2.6        -1.1
Lexington      Lexington             4.4        2.8         1.6
Lynn           Classical            -6.5        3.3        -2.0
Lynn           English              -5.0        3.9        -1.3
Malden         Malden               -6.6        3.2        -2.1
Marblehead     Marblehead            3.4        3.2         1.1
Medford        Medford*             -7.4        2.7        -2.7
Melrose        Melrose              -2.7        2.7        -1.0
Milton         Milton               -1.7        3.1        -0.6
Newton         North**              -7.2        2.5        -2.9
Quincy         North Quincy         -9.2        3.6        -2.5
Quincy         Quincy*             -14.1        3.1        -4.5
Revere         Revere*             -10.3        2.7        -3.9
Salem          Salem*               -1.5        3.1        -0.5
Saugus         Saugus               -2.8        2.9        -1.0
Somerville     Somerville**         -7.4        2.6        -2.8
Watertown      Watertown             4.7        3.1         1.5
Weymouth       Weymouth*            -7.0        2.3        -3.0
Winchester     Winchester           12.5        2.9         4.4
Winthrop       Winthrop*            -5.5        3.5        -1.6
Woburn         Woburn               -1.9        2.6        -0.7

* school providing vocational education

Sources of data: Appendix 3; Appendix 4; Statistica model

At first glance, some residuals in Table 2-8 look substantial, several scale points of difference from the average scores predicted by the model. However, residual ratios for most schools are within +/- 2 standard errors, not significant at a p<.05 level. Someone familiar with metropolitan Boston will recognize that schools with high and low residual ratios tend to be in high-income and low-income communities, respectively. It therefore
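The significance screen applied to Table 2-8 divides each school's residual by its standard error and treats ratios within about +/- 2 as indistinguishable from chance. A minimal sketch, using a few rows copied from the table:

```python
# Flag schools whose model residual exceeds roughly 2 standard errors,
# the approximate p < .05 screen used in the text. Rows are a small
# subset of Table 2-8: (school, residual, standard error).
rows = [
    ("Belmont",     12.2, 2.7),
    ("Brighton",     1.7, 3.3),
    ("Quincy",     -14.1, 3.1),
    ("Winchester",  12.5, 2.9),
]
for school, residual, std_err in rows:
    ratio = residual / std_err
    verdict = "significant" if abs(ratio) > 2 else "within chance"
    print(f"{school:12s} ratio {ratio:+.1f}  {verdict}")
```

Only a minority of schools clear this bar, which is the article's point: most residuals carry too much uncertainty to rank schools.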


seems likely that adding a factor for incomes can increase the predictive power of the model.

The most recent community income data were from the US Census of 1990, for 1989 per-capita income. Comparable 1999 income statistics were not yet available. The Massachusetts Department of Revenue could produce current community income statistics but has not done so; the state continues to use 1989 federal census data on incomes to apportion aid to public schools. After adding 1989 per-capita community income in $1,000s as a factor (Mass. DoR, 1999), without any attempt to adjust incomes so as to reflect school districts or student households, the model associates 80 percent of the statistical variance, and factor weights became those shown in Table 2-9.

Table 2-9
6-Factor Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                    Coefficient   Standard Error
Intercept, for all factors zero              202.7          5.400
B  Percent African American                 -0.020          0.083
C  Percent Asian or Pacific Islander*        0.371          0.121
D  Percent Hispanic / Latino                 0.044          0.144
E  Percent limited English proficiency*     -0.695          0.171
F  Percent free or reduced price lunch       0.050          0.131
H  Per-capita community income (1989)*       1.186          0.230

Sources of data: Appendix 3; Appendix 4; Statistica model

Three factors in Table 2-9 (marked *) have substantial significance, at a p<.005 level or better, and three have very low significance. Factor weight has shifted from "Percent free or reduced price lunch" to "Per-capita community income (1989)," while "Percent limited English proficiency" retains a large coefficient and high significance. Dropping low-significance factors, the resulting three-factor model is shown in Table 2-10.

Table 2-10
3-Factor Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                    Coefficient   Standard Error
Intercept, for all factors zero              204.9          4.446
C  Percent Asian or Pacific Islander         0.381          0.109
E  Percent limited English proficiency      -0.626          0.081
H  Per-capita community income (1989)        1.104          0.197

Sources of data: Appendix 3; Appendix 4; Statistica model

The three-factor model of Table 2-10 also associates 80 percent of the statistical variance. All of its factors are statistically significant at a p<.001 level. For each school included in these studies, Table 2-11 presents adjusted average 1999 tenth-grade MCAS mathematics test scores and residuals from the three-factor statistical model of Table 2-10, with the uncertainties in average scores and residuals expressed as


standard errors, based on the variance estimate calculated in the trends study of Section 2A.

Table 2-11
Residuals for MCAS Grade 10 Math Test Scores, 3-Factor Model

City or Town   High School        Average   Std. Error   Residual   Std. Error
Arlington      Arlington          234       2.1          4.2        2.4
Belmont        Belmont            243       2.2          6.7        2.7
Boston         Boston High        204       2.9          -10.3      3.1
Boston         Brighton           205       2.2          -0.1       2.9
Boston         Charlestown        206       2.7          -1.6       3.7
Boston         Dorchester         204       3.0          -0.5       3.5
Boston         East Boston        205       2.3          -1.0       2.9
Boston         Hyde Park          203       3.3          -5.0       3.7
Boston         Jeremiah Burke     208       2.7          5.1        3.4
Boston         South Boston       205       2.6          -7.8       3.1
Boston         The English High   204       2.3          9.7        3.7
Boston         West Roxbury       205       2.0          0.0        2.7
Boston Exam    Boston Latin       254       1.7          23.6       2.7
Boston Exam    Latin Academy      233       2.1          5.0        2.8
Boston Exam    O'Bryant Science   227       2.1          4.3        3.4
Braintree      Braintree          228       1.9          1.7        2.3
Brookline      Brookline          240       1.6          -0.5       2.7
Cambridge      Rindge & Latin*    220       1.6          -4.9       1.8
Chelsea        Chelsea            216       2.3          3.7        2.8
Dedham         Dedham             227       2.5          1.3        2.8
Everett        Everett*           221       1.9          2.3        2.5
Lexington      Lexington          238       1.7          -5.8       3.0
Lynn           Classical          216       2.0          -3.3       2.8
Lynn           English            213       2.1          0.9        2.5
Malden         Malden             221       2.0          -2.8       2.6
Marblehead     Marblehead         232       2.7          -6.3       3.6
Medford        Medford*           221       2.2          -2.3       2.5
Melrose        Melrose            226       2.1          -1.2       2.5
Milton         Milton             228       2.3          -2.0       2.6
Newton         North*             239       1.5          0.8        2.5


Newton         South              242       2.0          2.2        2.8
Peabody        Veterans*          220       1.8          -2.9       2.3
Quincy         North Quincy       227       1.9          -7.2       3.0
Quincy         Quincy*            212       2.3          -11.9      2.6
Revere         Revere*            218       1.9          -6.1       2.4
Salem          Salem*             220       2.1          0.7        2.5
Saugus         Saugus             226       2.2          0.8        2.6
Somerville     Somerville*        216       1.8          1.7        2.1
Stoneham       Stoneham           227       2.5          1.5        2.9
Swampscott     Swampscott         240       2.6          7.6        3.0
Wakefield      Memorial           227       2.2          0.6        2.6
Waltham        Waltham*           220       1.8          -4.0       2.1
Watertown      Watertown          231       2.6          4.3        2.8
Weymouth       Weymouth*          222       1.5          -3.8       2.1
Winchester     Winchester         243       2.3          2.8        3.2
Winthrop       Winthrop*          223       3.0          -1.4       3.3
Woburn         Woburn             226       2.0          1.3        2.4

* school providing vocational education
Sources of data: Appendix 3, Appendix 4, Statistica model

Stepwise analysis shows that combinations of the factors in the three-factor model of Table 2-10 associate statistical variance in the amounts listed in Table 2-12.

Table 2-12
Factor Combinations for 1999 MCAS Grade 10 Math Test Scores

Factors   R2
C         .01
E         .62
H         .47
C E       .65
C H       .52
E H       .74
C E H     .80

C. Percent Asian or Pacific Islander
E. Percent limited English proficiency
H. Per-capita community income (1989)


Sources of data: Appendix 3, Appendix 4, Statistica model

Table 2-12 shows that the major factors are "Percent limited English proficiency" and "Per-capita community income (1989)." Although statistically significant, "Percent Asian or Pacific Islander" is a weak cofactor, associating only 1 percent of the variance by itself. The three-factor model of Table 2-10 was evaluated for predictions of 1998 and 2000 average test scores. It was not expected to perform as well, since most factor data were for 1999. However, the factor weights and significance proved robust, and the model associated statistical variance as shown in Table 2-13.

Table 2-13
Year Comparisons for MCAS Grade 10 Math Test Scores

Year   R2
1998   .83
1999   .80
2000   .77

Sources of data: Appendix 3, Appendix 4, Statistica model

The residuals in Table 2-11 suggest systematic contributions to test score averages at some schools that might have been produced by unusual efforts. Probable outliers for the predictive model with correspondingly large year-to-year average score changes had a strongly positive bias: those for Boston Latin, Latin Academy and Swampscott. Model behavior for metropolitan Boston schools may be biased by special characteristics of City of Boston schools, because the students who score well on school-based standard tests are selected for admission to the three exam schools. (Note 9) Current data also attribute average Boston per-capita income equally to all school districts instead of adjusting by districts or census tracts. The Boston cross-enrollment and busing programs would complicate an income analysis. Behavior of the three-factor model of Table 2-10 was examined for 1999 tenth-grade MCAS mathematics test scores, considering only schools outside the City of Boston. These 34 schools had a 1999 total population per grade of about 10,200 students out of 13,730 for all schools considered in these studies.
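Comparisons like those in Table 2-12 come from refitting the model on every subset of the factors and recording R-squared for each subset. The sketch below shows the procedure with synthetic stand-ins for factors C, E and H; all values are invented, and for brevity it uses unweighted ordinary least squares, whereas the study weights schools by numbers of test participants.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study's factors and scores (all invented).
n = 40
factors = {
    "C": rng.normal(5, 3, n),    # % Asian or Pacific Islander
    "E": rng.normal(4, 2, n),    # % limited English proficiency
    "H": rng.normal(20, 6, n),   # per-capita income, $1,000s
}
scores = 200 - 0.6 * factors["E"] + 1.2 * factors["H"] + rng.normal(0, 3, n)

def r_squared(cols):
    """R^2 of an OLS fit of scores on the chosen factor columns."""
    X = np.column_stack([np.ones(n)] + [factors[c] for c in cols])
    beta, *_ = np.linalg.lstsq(X, scores, rcond=None)
    resid = scores - X @ beta
    return 1.0 - resid.var() / scores.var()

# One line per factor combination, as in Table 2-12.
for k in (1, 2, 3):
    for combo in combinations("CEH", k):
        print(" ".join(combo), round(r_squared(list(combo)), 2))
```

Because each larger subset nests the smaller ones, R-squared can only stay level or rise as factors are added; what Table 2-12 exposes is how little some additions contribute.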
When applied only to schools outside the City of Boston, the three-factor model of Table 2-10 showed significance at the p<.05 level for "Percent limited English proficiency" and "Per-capita community income (1989)" but not for "Percent Asian or Pacific Islander." A two-factor model based on the first two of these factors, with schools weighted by numbers of test participants, associates 86 percent of the statistical variance. Factor weights became as shown in Table 2-14.

Table 2-14
2-Factor Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                   Coefficient   Standard Error
Intercept, for all factors zero          201.5         2.934


E  Percent limited English proficiency   -0.325        0.136
H  Per-capita community income (1989)    1.307         0.126

Sources of data: Appendix 3, Appendix 4, Statistica model

Several trial factors were added individually to the two-factor model of Table 2-14, but none showed statistical significance at the p<.05 level. The statistical variance associated when each trial factor was added to this two-factor model is shown in Table 2-15.

Table 2-15
Factor Comparison for 1999 MCAS Grade 10 Math Test Scores

R2    Trial Factor Added
.86   None
.88   A  Population per grade
.88   B  Percent African American
.86   C  Percent Asian or Pacific Islander
.86   D  Percent Hispanic / Latino
.87   F  Percent free or reduced price lunch
.87   G  Percent reduction, grades 9+10 to 11+12
.87   Percent special education
.86   Per-capita property value, 1998, $000s
.88   Spending, regular education, $000s

Sources of data: Appendix 3, Appendix 4, Statistica model

The last three trial factors in Table 2-15 are district averages; the last and third from last are for all schools in all grades. Three districts, Lynn, Newton and Quincy, each operate two academic high schools, which will not be distinguished by these factors. Outside the City of Boston, only "community income" and "limited English proficiency" are significant contributors to 1999 tenth-grade MCAS mathematics test scores; their indicators are effective predictors.

A one-factor model, using only "Per-capita community income (1989)," performed almost as well as any combination of factors shown in Table 2-15, associating 84 percent of the variance. The factor weight is in Table 2-16.

Table 2-16
1-Factor Model for 1999 MCAS Grade 10 Math Test Scores

Factor                                  Coefficient   Standard Error
Intercept, for all factors zero         197.0         2.395
H  Per-capita community income (1989)   1.465         0.114


Sources of data: Appendix 3, Appendix 4, Statistica model

"Percent limited English proficiency" is much less effective in a one-factor model, associating only 38 percent of the variance. Community income appears to be the dominant factor associated with these test scores. The 1999 adjusted average tenth-grade MCAS mathematics test scores by school, plus residuals from the two-factor and one-factor models for 1999 (shown in Table 2-14 and Table 2-16), with standard error estimates for each, are in Table 2-17.

Table 2-17
Residuals for MCAS Grade 10 Math Test Scores, 1,2-Factor Models

City or Town   High School       Average Score   Std. Error   2-factor Residual   Std. Error   1-factor Residual   Std. Error
Arlington      Arlington         234             2.1          …                   …            …                   …
Belmont        Belmont           243             2.2          …                   …            …                   …
Braintree      Braintree         228             1.9          …                   …            …                   …
Brookline      Brookline         240             1.6          …                   …            …                   …
Cambridge      Rindge & Latin*   220             1.6          -5.0                1.8          -6.1                1.8
Chelsea        Chelsea           216             2.3          …                   …            …                   …
Dedham         Dedham            227             2.5          …                   …            …                   …
Everett        Everett*          221             1.9          …                   …            …                   …
Lexington      Lexington         238             1.7          …                   …            …                   …
Lynn           Classical         216             2.0          …                   …            …                   …
Lynn           English           213             2.1          …                   …            …                   …
Malden         Malden            221             2.0          …                   …            …                   …
Marblehead     Marblehead        232             2.7          …                   …            …                   …
Medford        Medford*          221             2.2          -1.7                2.3          -0.8                2.3
Melrose        Melrose           226             2.1          -1.7                2.3          -0.6                2.2
Milton         Milton            228             2.3          -2.6                2.4          -1.9                2.4
Newton         North*            239             1.5          0.6                 1.9          -0.2                1.9
Newton         South             242             2.0          …                   …            …                   …
Peabody        Veterans*         220             1.8          -3.1                2.0          -1.9                1.9
Quincy         North Quincy      227             1.9          …                   …            …                   …
Quincy         Quincy*           212             2.3          -9.4                2.5          -10.5               2.5
Revere         Revere*           218             1.9          -1.3                2.1          -0.5                2.1
Salem          Salem*            220             2.1          -0.4                2.3          -0.6                2.3
Saugus         Saugus            226             2.2          …                   …            …                   …
Somerville     Somerville*       216             1.8          -0.4                2.3          -3.2                2.0
Stoneham       Stoneham          227             2.5          …                   …            …                   …
Swampscott     Swampscott        240             2.6          …                   …            …                   …
Wakefield      Memorial          227             2.2          …                   …            …                   …
Waltham        Waltham*          220             1.8          -2.3                2.0          -1.6                2.0


Watertown      Watertown         231             2.6          …                   …            …                   …
Weymouth       Weymouth*         222             1.5          -3.5                1.8          -1.9                1.7
Winchester     Winchester        243             2.3          …                   …            …                   …
Winthrop       Winthrop*         223             3.0          -1.3                3.2          -0.1                3.1
Woburn         Woburn            226             2.0          …                   …            …                   …

* school providing vocational education
Sources of data: Appendix 3, Appendix 4, Statistica model

In an attempt to improve accuracy of the model in Table 2-14, schools with residuals from the two-factor model for 1999 greater than two standard deviations were dropped: Belmont, with a positive residual, and Cambridge Rindge & Latin, Marblehead High and Quincy High, with negative residuals. The two-factor model for 1999 scores then produced the factor weights shown in Table 2-18.

Table 2-18
2-Factor Trial for 1999 MCAS Grade 10 Math Test Scores

Factor                                   Coefficient   Standard Error
Intercept, for all factors zero          200.6         2.188
E  Percent limited English proficiency   -0.216        0.101
H  Per-capita community income (1989)    1.357         0.095

Sources of data: Appendix 3, Appendix 4, Statistica model

Chi square for the two-factor model of Table 2-18 was 36.2 with 27 degrees of freedom (p=.11). The one-factor model of Table 2-16 was also evaluated for 1999 with the set of cases used in Table 2-18, producing the factor weights shown in Table 2-19.

Table 2-19
1-Factor Trial for 1999 MCAS Grade 10 Math Test Scores

Factor                                  Coefficient   Standard Error
Intercept, for all factors zero         197.6         1.793
H  Per-capita community income (1989)   1.463         0.087

Sources of data: Appendix 3, Appendix 4, Statistica model

Chi square for the one-factor model of Table 2-19 was 43.4 with 28 degrees of freedom (p=.03). However, the cases and models from Table 2-18 and Table 2-19 did not provide significant chi square probabilities for 1998 or 2000 scores; all other attempts to improve estimation by removing outliers also proved unstable. Residuals from the two-factor model of Table 2-14 for schools outside the City of Boston are strongly autocorrelated. The coefficient was .45 between 1998 and 1999 and .67 between 1999 and 2000.
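The chi-square probabilities quoted for Tables 2-18 and 2-19 can be checked against the chi-square distribution's survival function. A minimal sketch, using the statistics and degrees of freedom reported in the text and assuming scipy is available:

```python
from scipy.stats import chi2

# Chi-square statistics and degrees of freedom reported in the text.
trials = {
    "2-factor model (Table 2-18)": (36.2, 27),
    "1-factor model (Table 2-19)": (43.4, 28),
}

for name, (stat, df) in trials.items():
    p = chi2.sf(stat, df)  # survival function: P(X >= stat)
    print(f"{name}: chi2 = {stat}, df = {df}, p = {p:.2f}")
```

A chi-square statistic near its degrees of freedom (as in Table 2-18) indicates residual scatter consistent with the estimated standard errors; a markedly larger statistic (as in Table 2-19) signals residuals larger than the error model predicts.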
Scatterplots for successive years are shown in Figure 2-4 and Figure 2-5, comparing 1998 and 2000 residuals with 1999 residuals:

Figure 2-4: MCAS Grade 10 Math Test Score Residuals, 1998 versus 1999


Figure 2-5: MCAS Grade 10 Math Test Score Residuals, 2000 versus 1999
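The autocorrelation coefficients quoted above (.45 and .67) are ordinary Pearson correlations between each school's residual in one year and its residual in the next. A minimal sketch with invented residuals, not the study's data:

```python
import numpy as np

# Invented residuals (scale points) for the same ten schools in two
# successive years, chosen to persist from one year to the next.
resid_1999 = np.array([4.2, 6.7, -5.8, -2.8, 0.9, -6.3, 2.3, -1.2, 1.7, -3.3])
resid_2000 = np.array([3.5, 5.9, -4.1, -1.0, 1.4, -7.0, 1.1, 0.3, 2.5, -2.2])

# Year-to-year autocorrelation: Pearson r between the two residual series.
r = np.corrcoef(resid_1999, resid_2000)[0, 1]
print(round(r, 2))
```

A coefficient near zero would mean departures from the model are mostly year-to-year noise; coefficients like .45 and .67 mean part of each school's departure persists across years, which is what the stratification argument below rests on.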


To the extent that a trend can be observed by applying the model of Table 2-14 to different years, it appears that schools scoring higher than predicted tend to increase scores in successive years, and schools scoring lower than predicted tend to decrease scores. Departures from model predictions are not all random. The two spans of years available for analysis suggest a systematic trend that could stratify high-scoring and low-scoring schools.

C. Observations

The trends and effects studies presented in Section 2A and Section 2B show how studies of these types tend to yield results with low statistical significance unless "something unusual" is going on. In the cases of the 2000 tenth-grade mathematics test and of the high scores and large increases at Boston Latin, Latin Academy and Swampscott, the studies do indicate "something unusual," although they cannot tell what it is. They also illustrate that commonly published "league tables" of scores strongly reflect social factors associated with school populations, not factors clearly associated with school performance. The effects study shows a robust, positive correlation of scores with household incomes, plus a smaller, negative correlation with limited English proficiency. Factor weights and statistical significance for other factors considered are small. School-based data are not currently published for several potentially significant factors, such as class sizes in elementary grades, mathematics course enrollments and levels of teacher preparation.

By far the strongest factor in predicting tenth-grade MCAS mathematics test scores is "Per-capita community income (1989)." For the schools outside the City of Boston this factor alone performed nearly as well as all available factors combined, associating 84 percent of the variance compared with 88 percent when all available factors were used. The scatterplot in Figure 2-6 shows 1999 average scores for these schools versus 1989


per-capita community income:

Figure 2-6: 1999 MCAS Grade 10 Math Test Scores versus Community Income

In Figure 2-6 the relation between school-averaged test scores and per-capita community income looks linear over an income range of about 2 to 1 for this set of schools. There is no obvious threshold or saturation behavior. As it happens, spending on regular education programs in school districts also varies over a range of about 2 to 1 for those schools ( Mass. DoE, 2000f ). However, there is only a weak relationship between 1999 spending on regular education programs in those school districts and their 1999 tenth-grade MCAS mathematics test scores, associating only about 3 percent of the variance and not statistically significant at a p<.05 level, as shown in the scatterplot of Figure 2-7.

Figure 2-7: 1999 MCAS Grade 10 Math Test Scores versus School Spending


Although a tendency for scores on school-based standard tests to rise with incomes has been recognized in the US for more than 70 years ( Bolon, 2000 ), there has been relatively little research on this phenomenon in ordinary income ranges, compared with the attention given to associations between test scores and conditions of poverty or race. The strong, apparently linear association of average test scores with community incomes shown here, as contrasted with their weak association with school spending, calls for investigation but is beyond the scope of this report. It seems likely that community incomes are providing something beyond what school programs provide, but if so we cannot tell from these data what it might be. Perhaps this effect should not be surprising, since most students spend more than three-fourths of their waking hours outside school.

The factor "Percent limited English proficiency" was the second strongest influence on predicted test scores. (Note 10) Previous effects studies of MCAS scores might not have shown this if they failed to utilize the entire variety of school-associated data available, including "limited English proficiency," "racial" or "ethnic," and "free or reduced price lunch" student categories. A hypothesis inviting study is that students classified with "limited English proficiency" might receive mathematics instruction less relevant to curriculum tested by MCAS than other students. Another hypothesis is that strengthening English language instruction for "limited English proficient" students might improve their MCAS mathematics test scores, provided all current efforts to teach mathematics and other subjects are maintained. While statistical associations suggest these conjectures, only observations and experiments could prove or disprove them.
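The linear relation seen in Figure 2-6 corresponds to the one-factor model of Table 2-16: predicted average score = 197.0 + 1.465 × (1989 per-capita income in $1,000s). A minimal sketch of using those coefficients; the $24,000 income below is an invented example, not a value from the study's data.

```python
# One-factor model of Table 2-16 (schools outside the City of Boston).
INTERCEPT = 197.0   # scale points, for zero income
SLOPE = 1.465       # scale points per $1,000 of 1989 per-capita income

def predicted_score(income_thousands):
    """Predicted school-average grade 10 MCAS mathematics score."""
    return INTERCEPT + SLOPE * income_thousands

# Hypothetical community with $24,000 per-capita income (1989):
print(predicted_score(24.0))
```

Doubling income from $15,000 to $30,000 moves the predicted average by about 22 scale points, which matches the roughly 2-to-1 income range and score spread visible in Figure 2-6.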
Factors of "Percent African American" and "Percent Hispanic / Latino" did not make significant contributions to school-averaged tenth-grade MCAS mathematics test scores after other factors were introduced. Although "Percent Asian or Pacific Islander" retains


statistical significance in certain models for some years, it is a much weaker factor than "Per-capita community income (1989)" or "limited English proficiency." The latter factors, and not "racial" or "ethnic" percentages, provide by far the strongest statistical associations with school-averaged tenth-grade MCAS mathematics test scores in metropolitan Boston. (Note 11)

Statistical significance of test-based ratings has been the principal focus of the studies. Some studies developing or using such ratings do not provide an analysis of variability, without which significance cannot be determined; or they estimate variability from single test session reliability measurements, which were shown to be optimistic (for example, Gaudet, 2001 and Tuerck, 2001a ). The trends study in Section 2A of this report provides evidence derived from tenth-grade MCAS mathematics test scores for larger variability estimates.

D. Summary Analysis

My summary analysis is based on a one-factor model for metropolitan Boston communities that each operate only a single academic high school, weighted by numbers of students tested. The effects study showed that "Per-capita community income (1989)" was the dominant factor in predicting 1999 school-averaged tenth-grade MCAS mathematics test scores. All other factors made only small contributions to predictions, with much lower significance. From published data, community income could not be estimated reliably for each of the multiple academic high schools in Boston, Lynn, Newton and Quincy. Reduction in scope leaves 28 high schools in the same number of communities, with a total of about 8,200 students per grade recorded for 1999. Estimates of uncertainties in school-averaged test scores are based on findings of the trends study, which showed year-to-year variability of school-averaged scores several times greater than the variability implied by conventional test reliability measurements.
Plots of results include uncertainty intervals (sometimes called "error bars") equivalent to +/- 1.4 estimated standard errors. When item intervals do not overlap, when standard errors have been accurately estimated, and when items are uncorrelated, then differences between items are significant at the p<.05 level. The one-factor model for 1999 with this set of cases associated 82 percent of the variance and produced the factor weights shown in Table 2-20.

Table 2-20
Community Income Model for 1999 MCAS Grade 10 Math Test Scores

Factor                               Coefficient   Standard Error
Intercept, for all factors zero      198.4         2.667
Per-capita community income (1989)   1.397         0.129

Sources of data: Appendix 3, Appendix 4, Statistica model

The 1999 adjusted average tenth-grade MCAS mathematics test scores by school, plus residuals from the foregoing one-factor model for 1999, with standard error estimates for each, are shown in Table 2-21.

Table 2-21
Residuals for MCAS Grade 10 Math Test Scores, Income Model


City or Town   Average Score   Std. Error   1-factor Residual   Std. Error
Arlington      234             2.1          5.6                 2.2
Belmont        243             2.2          7.1                 2.5
Braintree      228             1.9          3.5                 2.0
Brookline      240             1.6          1.0                 2.1
Cambridge*     220             1.6          -6.2                1.8
Chelsea        216             2.3          1.4                 2.6
Dedham         227             2.5          2.0                 2.6
Everett*       221             1.9          2.7                 2.2
Lexington      238             1.7          -3.4                2.3
Malden         221             2.0          0.5                 2.2
Marblehead     232             2.7          -9.2                3.2
Medford*       221             2.2          -1.1                2.3
Melrose        226             2.1          -0.7                2.2
Milton         228             2.3          -1.8                2.4
Peabody*       220             1.8          -2.2                1.9
Revere*        218             1.9          -1.0                2.1
Salem*         220             2.1          -1.0                2.3
Saugus         226             2.2          2.7                 2.3
Somerville*    216             1.8          -3.6                2.0
Stoneham       227             2.5          3.1                 2.7
Swampscott     240             2.6          5.8                 2.8
Wakefield      227             2.2          2.0                 2.3
Waltham*       220             1.8          -1.9                2.0
Watertown      231             2.6          4.1                 2.7
Weymouth*      222             1.5          -2.1                1.7
Winchester     243             2.3          1.8                 2.8
Winthrop*      223             3.0          -0.4                3.1
Woburn         226             2.0          2.2                 2.1

* school providing vocational education
Sources of data: Appendix 3, Appendix 4, Statistica model

The plot in Figure 2-8 shows school-averaged adjusted scores on the 1999 tenth-grade MCAS mathematics test and the corresponding uncertainty intervals (or "error bars"). In this and the next two plots, the 28 schools considered in this summary analysis have been rank-ordered from the lowest to the highest average scores on the 1999 tenth-grade MCAS mathematics test.

Figure 2-8: Average 1999 MCAS Grade 10 Math Test Scores by School
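The +/- 1.4 standard-error bars in these plots encode a pairwise test: for two uncorrelated estimates, the standard error of their difference is sqrt(SE1^2 + SE2^2), which for comparable standard errors is about 1.414 times either one. Non-overlapping +/- 1.4 SE intervals therefore imply a difference beyond 1.96 standard errors of the difference, i.e. p<.05. A sketch of the check; the function names are mine, not the report's, and the score values are invented examples.

```python
import math

def bars_overlap(mean_a, se_a, mean_b, se_b, k=1.4):
    """True if the +/- k*SE intervals of two estimates overlap."""
    return abs(mean_a - mean_b) <= k * (se_a + se_b)

def significant_at_05(mean_a, se_a, mean_b, se_b):
    """Two-sided z-test on the difference of two uncorrelated estimates."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(mean_a - mean_b) > 1.96 * se_diff

# With equal standard errors, non-overlap of +/- 1.4*SE bars implies
# |difference| > 2.8*SE, while p < .05 requires only 1.96*sqrt(2)*SE.
print(1.96 * math.sqrt(2))  # ~2.77, just under the 2.8 non-overlap threshold
```

The criterion is slightly conservative: a pair of bars can overlap marginally while the difference is still significant, but non-overlap always implies significance under the stated conditions.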


The plot in Figure 2-9 shows residuals for school-averaged 1999 tenth-grade MCAS mathematics test scores, the differences left after subtracting away predictions calculated from the factor "Per-capita community income (1989)."

Figure 2-9: Residuals of 1999 MCAS Grade 10 Math Test Scores by School


The picture in the plot of average scores of Figure 2-8, with significant separations between high-scoring and low-scoring schools, is shown by the residuals plot of Figure 2-9 to be associated largely with differences in community income. Chi square for the residuals distribution is 63.2 with 27 degrees of freedom. After subtracting predictions based on community income, residuals of average scores settle to just a little more than statistical noise. Only five or six schools can be reliably distinguished other than by community income.

The last plot, Figure 2-10, shows the changes in school-averaged 1999 tenth-grade MCAS mathematics test scores from the same schools' average scores in 1998.

Figure 2-10: Changes in MCAS Grade 10 Math Test Scores by School, 1998-1999


There was low statistical significance in year-to-year changes of the school-averaged adjusted scores on tenth-grade MCAS mathematics tests between 1998 and 1999. The distribution of changes about an average change of +1.2 scale points, as shown in Figure 2-10, fits a normal distribution with weighted chi square of 22.7 for 26 degrees of freedom ( p = .65). Despite the possibility of little significance in score changes, such changes are criteria by which the Massachusetts Department of Education has begun to rate school performance (Note 12) as "Failed to Meet Expectations," "Approached Expectations," "Met Expectations" or "Exceeded Expectations." There are severe penalties, including state closure or seizure, for schools with low scores and ratings of "Failed to Meet Expectations."

The statistical significance of differences in school-averaged scores, aside from their reflection of community income, may not be enough to compare schools reliably and may not be enough to evaluate short-term changes in teaching and learning. Whatever aspects of school performance these scores might measure can be lost in the fluctuations for a particular school over any one year or few years. Only averages and trends in scores over several years would be likely to yield statistically useful information. Some observers question whether it is realistic to expect standardized tests to yield significant information comparing school performance, even over a period of years (for example, Rowe, 1999a and Rowe, et al., 1999b ). They argue that variations in student achievement are commonly greater within schools than between schools. Others contend that adaptive and defensive behavior encouraged by "high-stakes" political environments grossly distorts outcomes from all types of educational assessment and robs them of meaning (for example, Sacks, 1999 and Hayman, 1998 ).
It is important to point out that significance attributed to results in these studies is purely statistical. Neither these studies


nor any others known to the author have shown that MCAS test scores have practical significance, in the sense of predicting success in adult activities to any greater degree than could be done with knowledge of student backgrounds.

Section 3: Results

A. Opportunities and Questions

In future work, it may be helpful to examine categories of metropolitan Boston schools that were excluded from or specially identified in these studies:

Vocational, technical and agricultural schools
Academic schools that also provide vocational education
Pilot schools, charter schools, specialty schools and small schools

Other questions might be answered by extending data coverage:

Do other test scores yield similar results?
Do small class sizes have significant effects?
Does teacher preparation have significant effects?
Do math course enrollments have significant effects? (Note 13)
Can other factors be found that increase significance?
Are the same effects found in other Massachusetts schools?
Are similar effects observed in elementary and middle school grades?
Are the same or different patterns observed in other US metropolitan areas?
Can income factors be estimated for communities with multiple high schools?
If individual data were available, would multi-level analysis show different results?

The strong, linear relation found between school-averaged tenth-grade MCAS mathematics test scores in a relatively calm year and community income in a previous year, within ordinary ranges of incomes, leads to several questions. Does community income primarily determine educational achievement, regardless of school performance? Do MCAS and similar tests measure skills and knowledge that are acquired in schools, or do they measure skills and knowledge that are largely acquired outside schools? Do communities with substantially different incomes have substantially different expectations for student performance on MCAS and similar tests?
Is current community income also strongly correlated with test scores? After a turbulent period of local efforts to raise scores, will the correlation between income and scores remain as strong? Exploring and understanding some of these issues will require a different approach.

B. Conclusions

Community income has been found strongly correlated with tenth-grade MCAS mathematics test scores, and it associated more than 80 percent of the variance in school-averaged 1999 scores for a sample of Boston-area communities. The influence of community income was robust against several sets of model variables and cases. Community income swamped the influence of the other social and school factors examined. Once community income was included in models, other factors--including percentages of students in disadvantaged populations, percentages receiving special education, percentages eligible for free or reduced price lunch, percentages with limited


English proficiency, school sizes, school spending levels, and property values--all failed to associate substantial additional variance.

Large uncertainties in residuals of school-averaged scores, after subtracting predictions based on community income, tend to make the scores ineffective for rating performance of schools. Large uncertainties in year-to-year score changes tend to make the score changes ineffective for measuring performance trends. In their present state, considered as a means to rate the performance of public schools, tenth-grade MCAS mathematics tests mainly appear to provide a complex and expensive way to estimate community income.

Appendix 1: Education Reform in Massachusetts

Massachusetts has experienced many education experiments and reforms over the past few centuries. (Note 14) During the 1990s Massachusetts education reform was driven by the McDuffy school finance lawsuit ( McDuffy, 1993 ), originally filed in 1978. In its McDuffy decision, the Massachusetts Supreme Judicial Court said that Massachusetts funding disparities harmed the quality of education for some students, denying them education to which they were constitutionally entitled. This June, 1993, decision was widely anticipated. The Massachusetts Education Reform Act of 1993 ( Education Reform, 1993 ) was signed less than a week after the decision was released. A group called the Massachusetts Business Alliance for Education, (Note 15) organized in 1988 and led by the late John C. (Jack) Rennie, then CEO of the former Pacer Infotec, Inc., of Burlington, MA (now the AverStar division of Titan Corp., San Diego, CA), and S. Paul Reville, then director of the Worcester Public Education Fund, wrote the reform bill sponsored by the Education Committee of the legislature. In 1991 the Business Alliance produced a document entitled Every Child a Winner. (Note 16) A story from the May 2, 1993, Northwest edition of the Boston Globe quoted former Rep.
Mark Roosevelt, then House Education Committee Chair, as saying that the House education reform bill then pending "is essentially [the Business Alliance document]." In a publication of MassINC, Rennie is quoted as saying, "We bought change" ( Walser, 1997 ). Most of this work was carried out in secret. As late as December 1992, then Lt. Gov. Cellucci was calling on the Education Committee chairs, Sen. Thomas Birmingham of Chelsea and Rep. Roosevelt of Beacon Hill, to disclose their bill ( Howe, 1992 ). Almost all the controversy generated by this legislation focused on its funding formulas. Until 1993, the public had hardly any knowledge of its sweeping changes in school policy and regulation. The following newspaper report was printed December 23, 1992 ( Overdue, 1992 ): "The bill also calls for higher student achievement and curriculum standards." This was the most thorough description in mainstream news media from 1988 through 1992. The bill was released in an emergency legislative session of January 4-5, 1993, but it failed to pass.

Soon after the bill became public, education and public interest groups began to react. As reported in the Boston Globe on January 26, 1993, a coalition headed by Stephen Bing of the Massachusetts Advocacy Center predicted major problems with the legislation, including these ( Ribadeneira, 1993 ):

The reform bill will institutionalize unfair teaching practices such as using tests to track students into different ability levels.


By substituting different certificates in place of the high school diploma, the bill will contribute to the dropout problem rather than ameliorate it.

The legislation provides no mechanism for meaningful participation by parents or students in the development of a remedial education plan nor any opportunity to contest an inadequate plan.

Such objections were ignored. Neither the mainstream news media nor the Great and General Court gave these or other educationally oriented issues further attention in 1993. Rep. Thomas Finneran of Mattapan, then chair of the House Ways and Means Committee, secretly inserted anti-abortion provisions in the bill, provoking a storm of protest ( Howe, 1993a ). Other House controversy centered on a salary cap for teachers, which was removed. Proposals for "school choice," charter schools and gambling revenues became the focus of activity in the Senate. The bill quickly became a hodge-podge of added provisions with no coordination. Many observers became skeptical about overall benefit. Geoffrey Beckwith of the Massachusetts Municipal Association was quoted as saying, "It certainly doesn't appear at this time that this bill will bring about any fundamental reform" ( Howe, 1993b ). In February, 1993, the Supreme Judicial Court heard testimony in the McDuffy case. In March, former Rep. Roosevelt began a (losing) campaign for Governor ( Lehigh, 1993 ; Howe, 1993c ). In April, the Edison Project, a business corporation, announced interest in privatizing Massachusetts schools ( Nealon, 1993 ). By May, an impasse over "school choice" had developed, the then Senate President William Bulger of Boston demanding it and the then House Speaker Charles Flaherty of Cambridge rejecting it. At the time, the Business Alliance opposed the "school choice" and charter school amendments ( Taylor, 1993 ).
However, another business group calling itself "CEOs for Fundamental Change in Education" had appeared, dominated by banking and large business interests and supported by the Pioneer Institute. It was actively promoting charter schools and "school choice" through the Massachusetts Senate ( Vennochi, 1993 ). After compromising with limits and delays on "school choice" and charter schools, the House passed the bill through second reading June 2 and the Senate passed it June 3. The Supreme Judicial Court released its McDuffy case decision June 15. Former Gov. William Weld signed the Education Reform Act on June 18, 1993.

In seven years under the Education Reform Act, state aid to Massachusetts public schools has grown from $1.3 billion to $3.0 billion per year, almost all the increase going to communities with low household incomes. For example, Holyoke, a low-income community, now receives over 90 percent of its school funding from the state, while Brookline, a high-income community, receives only about 10 percent ( Mass. DoE, 2000e ). In 1992, Holyoke spent less than 75 percent as much per student as Brookline, but now it spends about 95 percent of what Brookline does ( School reform, 1992 ). Still, the Act has tended to provide more of a windfall for Holyoke's taxpayers than for its public school students.

Besides setting state commitments to equalize school funding, the Education Reform Act made many changes to Massachusetts education policy and regulations, including the following, as described by the Business Alliance ( Taylor, 1993 ):

New goals, standards and indicators of performance for schools, students and teachers


Financial rewards to teachers and schools that excel

Decentralized authority, limiting school committees to policy-making and oversight, making CEOs of superintendents, and giving hiring and firing power to principals

Preschool for all 3- and 4-year-olds

Expanded professional development for teachers

More than seven years later, some of these changes are only starting to be implemented. Before 1996, the Massachusetts Board of Education regarded test scores as only one component of school accountability. In a 1993 policy advisory cited by Wheelock, 1999, the Board warned that an accountability system based primarily on test scores would be likely to produce harmful long-term consequences, including: exclusion of weaker students from the assessed pool of students; lowered morale among teachers and students; the loss of experienced educators from schools enrolling many disadvantaged students; distortion of instruction and curriculum to reflect test content and format; cheating and corruption of test scores.

Nevertheless the Board began development of testing programs to satisfy the provisions of the Education Reform Act, which eventually became MCAS. It appointed committees of educators and parents to help insure that tests were meaningful, fair and free from overt forms of bias ( French, 1998 ). From 1993 through 1996, Massachusetts invested more than $2 million to support education reform study groups seeking ways to set high expectations for students ( Antonucci, 1997a ).

By March of 1996 the Department of Education had completed a Common Core of Learning ( Mass. DoE, 1994 ), released six of seven planned curriculum frameworks based on it, and begun the development of MCAS based on the frameworks. In addition, it had announced plans ( Mass.
DoE, 1996a) to: award grants to school districts for assessment activities such as portfolio development; hold statewide conferences on local assessment strategies; publish examples of student work that meet the statewide standards so that districts have a model of what to strive for; develop a bank of assessment exercises linked to the curriculum frameworks for use by classroom teachers. All of these satisfy or support provisions of the Education Reform Act.

Development of MCAS took a sharp turn away from public participation (Mass. DoE, 1996b) after the appointment of John Silber as chair of the Massachusetts Board of Education in November, 1995 (Pawlack-Seaman, 1996). In August, 1996, Silber, former president of Boston University and an unsuccessful candidate for Governor, working with then Gov. Weld, engineered replacement of the 17-member Board of Education, including four African-Americans and Latinos, with a 9-member board, including several with ties to school privatization and charter schools, only one African-American and no Latinos (Jackson, 1996; Wong, 1996). At his first meeting with the Board, Silber said the Education Reform Act's underlying principle that all students are capable of learning at high levels was "rubbish" (Future, 1996). Responding to a demand for his resignation in 1997, he commented, "Some of the things that pass for learning disabilities used to be called stupidity" (Pawlack-Seaman, 1997). Soon after the Board replacement, the committees of educators and parents that had been formed to oversee curriculum


frameworks, test development and other education reforms were disbanded (Antonucci, 1997b).

In December, 1996, Silber proposed a two-track system (Avenoso, 1996) with a general diploma awarded for passing the GED, a test introduced during World War II by Everett F. Lindquist, developer of the Iowa test series, and now administered by the American Council on Education. An honors diploma would be awarded for high scores on the Massachusetts test series. Silber was forced to abandon the plan in January, 1997, when his personally chosen Board of Education refused to support it (Leung, 1997). However, a legacy of Silber's proposal remains, the view that MCAS should be aimed at the exceptional student.

In August, 1999, the Business Alliance revived the two-track concept (Still, 1999) with a proposal to award general diplomas to students who satisfy "essential requirements in English and math." The Business Alliance did not specify how this would be administered, and the Department of Education and Board of Education still oppose the concept. What they have done instead is to make a "competency determination" required by the Education Reform Act for a high-school diploma depend on achieving relatively low MCAS test scores, (Note 17) answering about 40 percent of the questions. (Note 18) A "certificate of mastery," as specified by the Act, is to be awarded for much higher scores, (Note 19) answering about 80 percent of the questions to achieve an "advanced" rating on one or more tests.

After the loss of two Education Commissioners in rapid succession (Battenfeld and Pressley, 1999), Silber resigned during a struggle over a new Commissioner in March, 1999. The outcome of the controversy (Estrin, 1999) was replacement of Silber by James Peyser (see Peyser, 1996 and Peyser, 1998), head of the reactionary Pioneer Institute, tied to school voucher and privatization movements, and retention of the compliant acting Commissioner David Driscoll.
Since the Silber era, MCAS development has been closely monitored by Board of Education member Abigail Thernstrom, a fellow of the Manhattan Institute, (Note 20) and hired consultant Sandra Stotsky (see Stotsky, 1999), a writer for the Fordham Foundation and now an Associate Commissioner of Education. Both of these right-wing foundations have supported forms of school privatization. Four rounds of MCAS tests have now been administered, in the spring of 1998, 1999, 2000 and 2001. The Board of Education has made the questions used in scoring available to the public, although it has not disclosed its standards for evaluating essay questions or all the details of its approach to computing scores. (Note 21) Students in religious-run and other private schools and students being taught at home are not required to take or pass MCAS tests. Bills have been filed but have not been enacted to include private schools in testing and to exclude charter schools. A system of "school accountability" has been defined by the Department of Education (Mass. DoE, 1999b). It is based entirely on MCAS scores, a violation of Education Reform Act requirements. (Note 22)

MCAS has been heavily promoted by a business-oriented group organized as Mass Insight Education and Research Institute, Inc., in Boston, founded in 1997 by registered Massachusetts lobbyist William H. Guenther, who is its president. (Note 23) Guenther is also involved with three other public relations organizations, Mass Insight Corp., in Cambridge, Opinion Dynamics Corp., in Cambridge, and New England Economic Project, in Walpole. Mass Insight Education and Research Institute is a non-profit corporation that coordinates several policy groups and has close relationships with business and education executives. Leaders of its "Campaign for Higher Standards" have included Gloria Larson, former Mass. Secretary of Economic Affairs, the late John C.


Rennie, former Chairman of the Massachusetts Business Alliance for Education and Vice-Chairman of AverStar, Inc., and Cathy Minehan, President of the Federal Reserve Bank of Boston. Leaders of its "Coalition for Higher Standards" include James Caradonio, Superintendent of Worcester Public Schools, and Thomas Payzant, Superintendent of Boston Public Schools. Its board of directors has included Maura Banta, Manager for External Programs at IBM Corporation, John Rennie, Abigail Thernstrom, Senior Fellow at the Manhattan Institute and member of the Massachusetts Board of Education, and Bruce Tobey, Mayor of Gloucester. Financial supporters of Mass Insight Education and Research Institute include BankBoston (now FleetBoston Financial), State Street Corp., Bell Atlantic (now Verizon), Boston Edison (now an NSTAR division), Liberty Mutual Group, PricewaterhouseCoopers, Goodwin, Procter & Hoar, AverStar, Inc. (now a division of Titan Corp.), Gorton's Seafoods, Hewlett-Packard, IBM and Intel.

Mass Insight publications promoting MCAS have been distributed to public schools through the Massachusetts Board of Education, (Note 24) and Mass Insight has received public funds for its services. Mass Insight has been cited in minutes of the Board of Education as a source of policy initiatives, (Note 25) including a proposal to use a score of 220 on tenth-grade language arts and mathematics tests as the initial "competency determination" for high-school graduation, which was adopted by the Board in November, 1999. Mass Insight presents a simple but misleading picture of MCAS, saying that it measures "skills that students will need after graduation, at college or on the job" (Why, 1999). No such significance has ever been demonstrated for MCAS or other state accountability tests.

Massachusetts schools are often castigated by newspapers and politicians as mediocre, (Note 26) but actually they are superior.
In the October, 1999, Boston Magazine, Jon Marcus wrote:

"According to assessment tests and other measures, Massachusetts schools are among the nation's best. Students here rank fourth nationally in reading, sixth in math, and eighth in science on the National Assessment of Educational Progress, administered by the US Department of Education. They scored higher this spring in reading than 69 percent of their peers across the country on the Iowa Test of Basic Skills; a third of the state's third graders were at the advanced level, compared to 19 percent nationwide. A Boston College correlation of NAEP results with international tests found that in eighth-grade science Massachusetts students performed as well as, or better than, their counterparts in 40 out of 41 other countries, including Germany and Japan; only kids in Singapore were rated higher. More students study algebra and upper-level math and science than the national average, and Massachusetts also has the fifth-lowest high school dropout rate, the nation's highest percentage of graduates who enroll in college, and the third-highest proportion of students who take the SAT. Massachusetts students' SAT results have risen steadily since 1994; last year, they were the highest in a decade." (From Marcus, 1999.)

Such a contrast between political bombast and educational reality has become common. Part of the long record of declining SAT scores in the 1960s and 1970s, for example, had a straightforward cause, the rapidly expanding number of students taking the tests, including many low-income students and students with lower grades who would not have taken them in prior years (Koretz, 1992; Berliner and Biddle, 1995; Berliner and Biddle,


1996). When education researchers looked at comparable groups of students, SAT scores were gradually rising during much of this period; unadjusted averages began to rise as the growth in the number of test takers slowed. With the gratuitous abuse regularly heaped on public schools during this time, few members of the public would have guessed that some of the real trends in scores were positive. Even now, many politicians and most news media find the actual results inconvenient; they prefer simple, strident bashing of public schools, uncomplicated by facts.

Many observers and columnists have commented on the complex language, mental tricks and obscure bits of knowledge found in MCAS questions (see, for example, Vaishnav, 2000 and Kohn, 2000). How elitist are the MCAS tests? One way to look at this is to ask the fraction of questions that must be answered to pass them and the fraction of students who cannot do this. Table A1-1 compares the current graduation level tests in Massachusetts (Note 27) (10th grade MCAS) with those in New York (Note 28) (the revised Regents series) and Texas (Note 29) (the TAAS series).

Table A1-1
Comparison of State Achievement Tests

State           Typical percent of     Typical percent of
                questions to pass      students failing (Note 30)
Texas                  70                      20
New York               55                      20
Massachusetts          40                      50

Source of data: see text, Appendix 1

Massachusetts, with by far the lowest passing score, has by far the highest rate of failure. Yet year after year, nationwide measures of academic performance rate Massachusetts students well ahead of those in New York and Texas. (Note 31) Passing an MCAS test says little about the education imparted through many years of schooling. On an MCAS tenth-grade mathematics test, the difference between passing and failing can be getting 24 questions right rather than 23 (see Mass. DoE, 1999d and Mass. DoE, 1999c).
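As a rough illustration of the thin pass-fail margin described above, the sketch below converts raw-score cutoffs into percent-of-questions figures like those in Table A1-1. The 60-item test length is an assumption chosen only because it makes 24 correct answers come out near the cited 40 percent; it is not a published MCAS scoring parameter.

```python
# Sketch: relate raw-score cutoffs to "percent of questions to pass".
# The 60-item test length is an illustrative assumption, not a published
# MCAS scoring parameter.

def percent_of_questions(raw_score: int, total_items: int) -> float:
    """Express a raw score as a percent of all questions on the test."""
    return 100.0 * raw_score / total_items

TOTAL_ITEMS = 60   # assumed length of a tenth-grade mathematics test
PASSING_RAW = 24   # about 40 percent of the questions, per the text

passing = percent_of_questions(PASSING_RAW, TOTAL_ITEMS)          # 40.0
just_failing = percent_of_questions(PASSING_RAW - 1, TOTAL_ITEMS)

# The pass-fail line falls between two adjacent raw scores:
print(f"pass at {passing:.1f}%, fail at {just_failing:.1f}%")
```

On these assumptions, the entire difference between a diploma and no diploma is one question, or well under two points of percent-correct.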
A recent study performed by Catherine Horn and others at Boston College (Horn, et al., 2000) showed that a barely passing score on the tenth-grade MCAS math test was approximately equivalent to the 50th percentile score for the PSAT math test. Students taking the PSAT are aiming for college. Many are taking the test as part of applying for National Merit and other scholarships; they tend to be good students. Therefore it should not be surprising when half or more of the general student population may "fail" the current tenth-grade MCAS math test.

A large share of MCAS test questions is aimed at students with exceptional skills and knowledge rather than at typical students. If Massachusetts designed tests to measure competence rather than mastery, it would be setting much higher passing percentages. If Massachusetts genuinely cared about assessing student skills and knowledge, it would satisfy Education Reform Act requirements (Note 32) calling for a "variety of assessment instruments," including "consideration of work samples, projects and portfolios," facilitating "authentic and direct gauges of student performance," and it would provide for circumstances of special education students, students entering the public schools from


households that speak a first language other than standard English, (Note 33) and students whose immediate aims focus on employment rather than higher education.

The Massachusetts Board of Education has ample access to information of this sort and has received many recommendations to improve its practices and make its system of assessments more realistic and fair. It has had more than $25 million to spend on developing MCAS (Szechenyi, 1998). It is also well aware that "high-stakes" testing systems in other states have sharply narrowed the school curriculum (Note 34) and increased the population of school dropouts, (Note 35) who are likely to be eligible only for the "McJobs" of the future. So far, however, members of the Massachusetts Board of Education remain rigid, programmatic and hostile to facts that do not support their policies. Their attitude does not originate from lack of information or resources.

MCAS, like the other "achievement tests" used in state accountability systems, has never been shown to predict success in adult life to any greater degree than could be done with a knowledge of student backgrounds. Instead of trying to show practical significance, the Board assumes it, in proposing to use this test as the sole state criterion to deny high-school diplomas and state college eligibility to low-scoring students, making it difficult for them to find responsible jobs and other forms of advancement.
Students from households that already have the least suffer the most from such a system, tending to widen an economic gap between haves and have-nots in our society, already among the greatest of the industrial nations.

Appendix 2: Massachusetts Vocational Schools

Massachusetts municipal school districts support, either jointly or individually, more than 30 "vocational," "vocational-technical," "technical," "agricultural-technical" and "agricultural" high schools, (Note 36) most of which the Department of Education recognizes as separate school districts. Table A2-1 includes the 29 vocational schools that are now operated as separate school districts plus the "technical" high school operated by the City of Boston. (Note 37)

Table A2-1
1999 MCAS Grade 10 Math Test Scores for Vocational Schools

Vocational School                1999 MCAS Math 10 Average    Number
Assabet Valley Voc. High                   211                  186
Bay Path Reg. Voc. Tech.                   210                  245
Blackstone Valley School                   213                  209
Blue Hills Reg. Voc. Tech.                 213                  215
Bristol County Agr. High                   212                  100
Bristol Plymouth Voc. Tech.                208                  207
Cape Cod Reg. Voc. Tech. High              214                  171
Charles McCann Voc. Tech.                  217                  105


Diman Reg. Voc. Tech. High                 208                  312
Essex Agr. & Tech. Inst.                   212                   80
Franklin County Tech.                      214                  115
Gr. Lowell Reg. Voc. Tech.                 205                  466
Gr. New Bedford Voc. Tech.                 208                  497
Greater Lawrence Tech.                     205                  320
Joseph Keefe Tech. High                    208                  160
Madison Park Tech. High                    202                  289
Minute Man Voc. Tech. High                 216                  199
Montachusett Voc. Tech.                    210                  257
Nashoba Valley Tech. High                  208                  121
Norfolk County Agr.                        214                  115
North Shore Tech. High                     210                  118
Northeast Metro. Reg. Voc.                 206                  297
Old Colony Reg. Voc. Tech.                 210                  127
Pathfinder Voc. Tech.                      210                  134
Shawsheen Valley Voc. Tech.                209                  281
So. Shore Voc. Tech. High                  212                  130
Southeastern Reg. Voc. Tech.               209                  302
Tri County Reg. Voc. Tech.                 212                  215
Upper Cape Cod Tech.                       206                  134
Whittier Reg. Voc.                         204                  352
Averages                                   210                  215

Source of data: Mass. DoE, 2000h

Vocational schools typically provide instruction for grades 9 through 12 and devote about half their instructional time to traditional academics and about half to vocational training. Some communities, including Cambridge, Quincy, Revere and Waltham, provide vocational education in the same facilities as academic programs. In contrast to academic high school programs, which mainly concern themselves with preparing students for college, vocational programs train students for specific occupations, and their faculty tend to rate themselves according to graduates' success in finding satisfactory employment in those occupations.

Vocational schools, represented through the Massachusetts Association of Vocational


School Administrators, have presented the Massachusetts legislature with a bill to decouple those schools from MCAS and substitute a special examination system based on the occupational categories for which they provide training, sponsored by Sen. David Magnani of Framingham (Magnani, 2000). They are well aware, as the foregoing table shows, that their students score far below state averages on MCAS tests; but they claim that MCAS tests are directed toward a curriculum that they do not teach and cannot teach without weakening their key programs. Table A2-2 identifies some of the major sources of information for Massachusetts vocational schools available on the Internet.

Table A2-2
Massachusetts Vocational School Information Sources

Massachusetts Association of Vocational School Administrators
Bay Path Regional Vocational Technical High School, Charlton, MA
Blackstone Valley Regional Vocational Technical High School, Upton, MA
Bristol-Plymouth Regional Technical School District, Taunton, MA
Blue Hills Regional Technical High School, Canton, MA
Diman Regional Vocational Technical High School, Fall River, MA
Greater Lowell Technical High School, Tyngsboro, MA
Greater New Bedford Regional Vocational Technical High School, New Bedford, MA
Lower Pioneer Valley Educational Collaborative, East Longmeadow, MA
Minuteman Science-Technology High School, Lexington, MA
Northeast Metropolitan Regional Vocational School, Wakefield, MA
Old Colony Regional Vocational Technical High School, Rochester, MA
Shawsheen Valley Technical High School, Billerica, MA
Tri-County Regional Vocational Technical School District, Franklin, MA
William J.
Dean Technical High School, Holyoke, MA
Whittier Regional Vocational Technical High School, Haverhill, MA
Worcester Vocational High School, Worcester, MA
Population trends at Massachusetts Regional Vocational School Districts

Source of data: Massachusetts Association of Vocational School Administrators

Appendix 3: Metropolitan Boston MCAS Mathematics Scores

Table A3-1
MCAS Grade 10 Math Test Scores for Boston-area Schools, 1998-2000

                                     ----2000----    ----1999----    ----1998----
City or Town  High School          Adj Avg  Adj Num  Adj Avg  Adj Num  Adj Avg  Adj Num
Arlington     Arlington              239      231      234      246      230      251
Belmont       Belmont                247      254      243      229      245      223
Boston        Boston High            210      111      204      132      203      168
Boston        Brighton               208      233      205      216      203      186
Boston        Charlestown            210      175      206      148      207      156
Boston        Dorchester             205      134      204      121      206      128
Boston        East Boston            210      270      205      206      204      225
Boston        Hyde Park              204      129      203       99      203      249


Boston        Jeremiah Burke         211      112      208      145      204       85
Boston        South Boston           210      177      205      159      205      182
Boston        The English High       208      200      204      203      203      230
Boston        West Roxbury           210      211      205      269      202      149
Boston Exam   Boston Latin           262      382      254      399      246      417
Boston Exam   Latin Academy          252      248      233      245      230      222
Boston Exam   O'Bryant Science       235      222      227      258      223      248
Braintree     Braintree              241      319      228      306      229      324
Brookline     Brookline              245      429      240      416      238      406
Cambridge     Rindge & Latin*        219      331      220      423      222      404
Chelsea       Chelsea                219      203      216      207      215      242
Dedham        Dedham                 238      180      227      174      226      172
Everett       Everett*               218      325      221      295      216      277
Lexington     Lexington              243      407      238      368      239      362
Lynn          Classical              223      294      216      281      216      246
Lynn          English                224      285      213      247      212      257
Malden        Malden                 225      305      221      286      219      265
Marblehead    Marblehead             241      169      232      145      237      187
Medford       Medford*               228      228      221      231      219      232
Melrose       Melrose                228      214      226      245      225      243
Milton        Milton                 232      241      228      208      229      204
Newton        North*                 248      521      239      468      242      468
Newton        South                  252      309      242      285      240      282
Peabody       Veterans*              226      426      220      354      219      372
Quincy        North Quincy           234      301      227      303      224      294
Quincy        Quincy*                220      238      212      200      214      279
Revere        Revere*                224      251      218      306      216      286
Salem         Salem*                 224      234      220      243      219      257
Saugus        Saugus                 233      193      226      226      222      232
Somerville    Somerville*            218      328      216      338      217      307
Stoneham      Stoneham               235      171      227      169      233      171
Swampscott    Swampscott             239      195      240      161      224      181
Wakefield     Memorial               231      192      227      228      230      243
Waltham       Waltham*               228      344      220      339      221      384
Watertown     Watertown              233      164      231      159      221      156
Weymouth      Weymouth*              227      466      222      455      222      433


Winchester    Winchester             249      225      243      198      244      178
Winthrop      Winthrop*              227      120      223      118      218      162
Woburn        Woburn                 237      276      226      285      226      240

Source of data: see text, Appendix 3

Schools marked with asterisks (*) in Table A3-1 are those providing vocational education in the same facility as academic programs.

Data in Table A3-1 are tenth-grade MCAS mathematics test scores for 1998-2000, averaged by schools, and numbers of test participants per school, obtained from the Massachusetts Department of Education (Mass. DoE, 2000h) and adjusted for percentages of students enrolled in schools but not taking the test. Adjustment formulas are as follows:

Nadj = N (1 - Pa/100)
Sadj = (100 S - 200 Pa) / (100 - Pa), where

N is the number of enrolled students
S is the average score on the test, per Department of Education
Pa is the percentage of "absent" students (not taking the test)
Nadj is the adjusted number of students (number taking the test)
Sadj is the adjusted average score (only students taking the test)

This procedure cannot adjust correctly for students absent for some but not all test sections. The Department has not published such information for the years 1998-2000. Adjusted results are rounded to the nearest integer.

Appendix 4: Metropolitan Boston School Characteristics

Table A4-1
Data for Boston-area School Characteristics

City or Town  High School        Pop./   % Afr.  % Asian/  % Hisp./  % Lim.   % Free  9,10-11,12  Per Cap.
                                 Grade   Amer.   Pac.Isl.  Latino    Eng.Pr.  Lunch   Reduct.     Income
ArlingtonArlington2596.


Boston        Boston High         183    56.5     3.3      26.5      14.4     64.3      4.0       15.6
BostonBrighton26749.57.935.
Boston31724.06.740.329.862.947.015.6
Boston        Hyde Park           220    72.0     1.6      18.4      23.5     53.6     15.7       15.6
Boston        Jeremiah Burke      176    86.0     3.6      26.1      32.9     56.7     20.0       15.6
Boston        South Boston        267    46.7    18.4      16.6      26.1     44.6     24.3       15.6
Boston        The English High    332    50.1     1.9      36.9      45.5     62.0     24.2       15.6
Boston        West Roxbury        328    64.7     2.4      21.1      28.7     64.3     18.8       15.6
Boston Exam   Boston Latin        398    18.5    22.7       8.4       0.6     29.3     16.2       15.6
Boston Exam   Latin Academy       255    27.7    20.2       7.7       2.8     40.0     28.3       15.6
Boston Exam   O'Bryant Science    252    45.1    31.8      13.4      18.3     48.7     23.0       15.6
BraintreeBraintree3373.
Cambridge     Rindge & Latin*     478    37.4     7.6      14.2       7.7     14.8      7.7       19.9
Chelsea       Chelsea             281     8.5    10.9      63.7      15.1     57.8     39.1       11.6
DedhamDedham1791.*3368.43.611.*26611.*4955.*4180.


QuincyNorth Quincy 3250.626.
QuincyQuincy*2824.714.
*3393.615.47.94.621.626.414.7
SalemSalem*2795.
*46717.15.914.315.452.824.615.2
StonehamStoneham1910.
*3819.
*4642.
*1402.

Source of data: see text, Appendix 4

Data in Table A4-1, except per-capita income, are gathered from the reports of public schools to the Massachusetts Department of Education for the school year ended June 30, 1999, and published by the Department in its school profiles (Mass. DoE, 2000f). Data for per-capita community income in $1,000s are from the Massachusetts Department of Revenue (Mass. DoR, 1999), based on the 1990 US Census of Population and Housing. Schools marked with asterisks (*) are those providing vocational education in the same facility as academic programs. All numbers except average student population per grade and per-capita income are percents. Grade size reduction has been calculated as the percent decrease in the grades 11+12 school population as compared with grades 9+10.

School district reports (Mass DoE, 1999a) include the following: (Note 38)

Foundation Enrollment and Student Attendance Reports, reporting foundation enrollment and student attendance as of October 1, submitted to the Department by each district no later than December 15 during each school year.

Individual School Reports and School System Summary Reports, reporting on student enrollments and classifications as of October 1, including gender, "racial" or "ethnic," limited English proficiency and low-income status, submitted to the Department by each district no later than January 31 during each school year.

School Attending Children Report, counting all school-age residents of a district classified by grade and type of school attended, as of January 1, submitted to the Department by each district no later than April 30 during each school year.
Year-End School Indicator Reports, submitted to the Department by each district no later than the August 15 following each school year.

End-of-Year Pupil and Financial Reports (EOYR), submitted to the Department by each


district no later than the September 30 following each school year. An EOYR currently consists of the following schedules:

1. Revenue and Expenditure Summary
2. Assessments Received From Member Towns or Cities or Regional School Districts
3. Instructional Services By Grade Level
4. Special Education Expenditures by Prototype
7. Pupil Transportation Reimbursement
8. Professional Development
11. Pupil Membership Summary
13. Staff Data By Major Program Area Instructional Programs
16. Pupils-Attendance Data
19. Annual School Budget

As of July 1, 2000, the Department requires school districts to compile data on instructional costs at the school building level. New financial reporting will use a revised and uniform chart of accounts in the EOYR due September 30, 2002. However, as previously noted, current school profiles do not include expenditures for schools. The Department has a long cycle for publication of data. Final summary data on per-pupil expenditures for the school year ending June 30, 1999, were released January 20, 2001.

Acknowledgements

The author acknowledges with thanks the comments of several reviewers. I particularly appreciate Anne Wheelock's keen observations about some data interpretations. Any errors or omissions remain, as always, the responsibility of the author.

Notes

1. The terms "racial" and "ethnic" will be used here in the senses of state and federal regulations, although they describe populations not always closely related by genetic or cultural backgrounds.

2. For the fall of 1997, 85 percent of Massachusetts elementary and secondary school students were in public schools and 15 percent in private schools of all types. See National Center, 2001, Tables 39 and 64.

3. On the motion of Chairman John Silber, Mass. DoE, 1998a.

4. Massachusetts General Laws, Chapter 69 (Powers), Sections 1B and 1D. The Department of Education publishes "curriculum frameworks" with only the legal status of guidelines and recommendations. See Mass.
DoE, 2000j.

5. Massachusetts General Laws, Chapter 69 (Powers), Section 1D, paragraph (i).

6. See Mass. DoE, 2000i, Chapter 12, for procedures to equate scores across test series. This document does not describe any procedures for relating test scores to education content as delivered in Massachusetts public schools.


7. Model and MANOVA evaluations performed with Statistica, Version 5.5, StatSoft, Tulsa, OK.

8. Compare "promoting power" or "holding power" in Balfanz and Legters, 2001. For issues concerning availability of accurate data on public school dropout rates, see Kaufman, 2001.

9. Other school characteristics also deserve consideration. Boston may have enrollment patterns for high-school mathematics classes differing from suburban norms. Ninth-grade students classified as "Asian" have been found more than twice as likely as average students to be enrolled in traditional "high school mathematics" courses, while students classified as "Hispanic" were found less likely than average to be taking such courses. (Anne Wheelock, personal communication, January, 2001).

10. It is possible that "limited English proficiency" acts as a proxy for other, potentially stronger factors, much as certain "racial" or "ethnic" percentages appear to act as proxies for income factors in these data.

11. Other studies have reported significant "racial" or "ethnic" test score differences for individuals after adjusting for incomes. For one survey and an evaluation, see Hedges and Nowell, 1998, pp. 149-181. However, few studies have considered school ratings based on scholastic achievement tests given in the mid-teen years and have used commensurate data for a geographical cluster of communities that were each fairly homogeneous.

12. See Mass. DoE, 1999b. For the first published ratings, see Mass. DoE, 2001. For potential penalties, see Mass. DoE, 2000b.

13. See "Relationships between student MCAS performance and courses taken," in Brian Gong, Relationships Between Student Performance on the MCAS and Other Tests--Collaborating District A, National Center for the Improvement of Educational Assessment, Inc., March, 1999, pp. 39-48, available at /MA%20report.pdf

14. See Cremin, 1970; Kaestle, 1983; Tyack, 1974; and Tyack and Cuban, 1995.

15. Referred to herein as the "Business Alliance."
Not to be confused with the Massachusetts Global Business Alliance, Business Education Alliance, Regional Education and Business Alliance, or any of several other organizations with similar names.

16. See Mass. Business, 1991. Principal author of the sections dealing with school finance was economist Edward Moskovitch, a former Massachusetts chief budget director and executive director of the Massachusetts Municipal Association. The title of this document should not be confused with the Every Child a Winner (ECAW) school programs and educational games produced by Martha F. Owens and Susan B. Rockett, beginning in 1974, and now distributed by Educational Excellence, Inc., of Ocilla, GA;


there appears to be no connection; see tw9/eptw9b.html

17. "Students in the graduating class of 2003 shall meet or exceed the...score of 220 on both the English Language Arts and the Mathematics MCAS grade 10 tests...." See Mass. DoE, 1999f.

18. See Mass. DoE, 1999c, Chapter 8, Scoring, and Chapter 10, Scaling.

19. Mass. DoE, 2000a. Also see Mass DoE, 1998c and Mass. DoE, 1999c.

20. See Antonucci, 1995, for the following: "As students are better matched to their institutions," [says Abigail Thernstrom], "as they cascade to places where they are prepared to the average level, the graduation rates should go up for minorities."

21. For most of the available information, see Mass DoE, 1999c.

22. Massachusetts General Laws, Chapter 69 (Powers), Section 1I (second paragraph). Also see Allen, 1999.

23. Information on Mass Insight is available at html

24. For example, a "Starting Now" brochure for parents. See Driscoll, 1998a and Driscoll, 1999b. The versions distributed include: Fall 1998, Spring 1999, Manufacturing and Processing Industry Spotlight, Fall 1999, Spring 2000, Fall 2000 and Spring 2001--all available from the Massachusetts Department of Education at ubs/other_pub.html

25. See the following examples: "Recent survey by Mass Insight," cited in Goals 2000 Five Year Master Plan, Goal 5, "Create a statewide infrastructure," March 1995; "Mass Insight report on education reform," minutes of March 22, 1996; presentations on the graduation requirement by Bill Guenther, Mass Insight Education, and by Susan Kiernan and John Lozada, Mass Insight's Campaign for Higher Standards, minutes of November 23, 1999.

26. "A majority of parents surveyed said they would rather pay out of their pockets to send their children to private or parochial schools than send them to Haverhill schools" (quoted in Grodsky, 1999). "Almost every school system is loaded with incompetent administrators" (John Silber, quoted in Drew and Suhler, 1998).
"We should change the teacher tenure law so we can dismiss incompetent te achers" (Lamar Alexander, former Governor of Tennessee and US Secretary of Education quoted in Patch and Wallace-Wells, 1998 ). "The dismissal of incompetent teachers is made a lmost impossible in some communities by such over-zealous delirium on the part of good people" (from Samuel P. Orth, the author of A History of Cleveland in Orth, 1909 ). 27 For passing scores, see Mass. DoE, 1999f and Mass. DoE, 1999c Chapter 8, Scoring,


and Chapter 10, Scaling. For typical scores from 1998 and 1999, statewide, see Mass. DoE, 1999e, Executive Summary.

28. For passing scores, see NY DoE, 2000a. For typical scores from 1999, see NY DoE, 2000b.

29. For passing scores, see TX DoE, 2000a. For typical scores from 2000, see TX DoE, 2000b.

30. In some formats, New York and Texas report lower failure percentages because they exclude some students or they include students who pass after multiple attempts.

31. See, for example, National Assessment, 1998, and similar summary tables from other years.

32. Massachusetts General Laws, Chapter 69 (Powers), Section 1I (second paragraph).

33. See Madaus and Clarke, 1998.

34. See McNeil, 2000 and Heubert and Hauser, 1999.

35. See Clarke, et al., 2000 and Haney, 2000.

36. As required by Massachusetts General Laws, Chapter 74 (Vocational Education), Section 7, and Mass. DoE, 1997.

37. Data from Mass. DoE, 2000h.

38. Also see EdTech, Massachusetts Department of Education, at .

References

Allen, M. (1999, September 9). SouthCoast schools condemn governor's ranking plan. New Bedford Standard-Times, New Bedford, MA.

Antonucci, R. V. (1997a). Letter from the Commissioner of Education to school superintendents, March 18, 1997. Malden, MA: Department of Education. August, 2001, available at

Antonucci, R. V. (1997b). Memorandum from the Commissioner of Education to school superintendents, May 6, 1997. Malden, MA: Department of Education. August, 2001, available at

Antonucci, T. W. (1995). Independent Women's Forum Expose. Women Leaders Online, published at

Avenoso, K. (1996, December 18). Colleagues say Silber hurting ed reform. Boston


47 of 57Globe, p. B1. Balfanz, R., & Legters, N. (2001). How many central city high schools have a severe dropout problem, where are they located, and who at tends them? Paper presented at the Conference on Dropout Research, Harvard University, Cambridge, MA. January, 2001, draft available at /dropout/balfanz.html Battenfeld, J., & Pressley, D. S. (1999, March 2). Cellucci eyes replacing Silber as education czar. Boston Herald. Berliner, D. C., & Biddle, B. J. (1995). The Manufactured Crisis: Myths, Fraud, and the Attack on America's Public Schools. Reading, MA: Addison-Wesley. Berliner, D. C., & Biddle, B. J. (1996). Making mou ntains out of molehills. Education Policy Analysis Archives 4 (3). February, 1996, available at Bevington, P. R., & Robinson, D. K. (1992). Data Reduction and Error Analysis for the Physical Sciences (2nd Ed.). New York, NY: McGraw-Hill. Bolon, C. (2000). School-based standard testing. Education Policy Analysis Archives 8 (23). May, 2000, available at Colloquia Manilana 8, 90-145. Quezon City, Philippines.Clarke, M., Haney, W., & Madaus, G. (2000). High st akes testing and high school completion. National Board on Educational Testing and Public Po licy Monographs 1 (3). January, 2000, available at Cremin, L. A. (1970). American Education; The Colonial Experience, 1607-1 783. New York, NY: Harper & Row.Daley, B. (1997, October 9). Councilor questions ex am process at Latin; Yancey also seeks admissions data. Boston Globe, p. C19. Daley, B. (1998, May 29). Admission policy at city exam schools is upheld. Boston Globe, p. B1. Darling-Hammond, L. (2000). Teacher quality and stu dent achievement: A review of state policy evidence. Education Policy Analysis Archives 8 (1). January, 2000, available at Drew, D. P., & Suhler, J. N. (1998, November 15). S trong steps urged for DISD. Dallas Morning News Driscoll, D. P. (1998a). Memorandum from the Commis sioner of Education to school superintendents, October 19, 1998. 
Malden, MA: Depa rtment of Education. August, 2001, available at Driscoll, D. P. (1998b). Letter from the Commission er of Education to school superintendents, November 16, 1998. Malden, MA: Dep artment of Education. August, 2001, available at


48 of 57Education Reform Act of 1993. Massachusetts General Court Acts of 1993, C. 71. J uly, 1997, available at Estrin, R (1999, March 4). Cellucci accepts Silber' s resignation, breaking impass. New Bedford Standard-Times, New Bedford, MA. Ferguson, R. F. (1998). Can schools narrow the blac k-white test score gap? In C. Jencks & M. Phillips (Eds.), The Black-White Test Score Gap (pp. 318-374). Washington, DC: Brookings Institution.French, D. (1998). The state's role in shaping a pr ogressive vision of public education. Phi Delta Kappan 80 (3), pp. 185-194. Future of Mass. ed. reform unclear (1996, Spring). FairTest Examiner. Cambridge, MA: National Center for Fair and Open Testing.Gaudet, R. D. (2001). Effective School Districts in Massachusetts (3rd Ed.). Boston, MA: Donahue Institute, University of Massachusetts.Gipps, C., & Murphy, P. (1994). A Fair Test? Assessment, Achievement and Equity. Philadelphia, PA: Open University Press.Gong, B. (1999). Relationships Between Student Performance on the MC AS and Other Tests--Collaborating District A. Dover, NH: National Center for the Improvement of Educational Assessment, Inc. March, 1999, available at Grissmer, D. W., Flanagan, A., Kawata, J., & Willia mson, S. (2000). Improving Student Achievement: What NAEP State Test Scores Tell Us. Santa Monica, CA: Education Division, RAND Corp. July, 2000, available at Grosky, J. B. (1999, March 5). School choice parent s unhappy with Haverhill. Lawrence Eagle-Tribune, Lawrence, MA. Haney, W. (2000). The myth of the Texas miracle in education. Education Policy Analysis Archives 8 (41). August, 2000, available at Hayman, R. L., Jr. (1998). The Smart Culture. New York, NY: New York University Press.Hedges, L. V., & Nowell, A. (1998). Black-white tes t score convergence since 1965. In C. Jencks & M. Phillips (Eds.), The Black-White Test Score Gap (pp. 149-181). Washington, DC: Brookings Institution.Heubert, J. P., & Hauser, R. M., Eds. (1999). 
High Stakes Testing for Tracking, Promotion and Graduation. Washington, DC: National Academy Press. Horn, C., Ramos, M., Blumer, I., & Madaus, G. (2000 ). Cut scores: Results may vary. National Board on Educational Testing and Public Po licy Monographs 1 (1). April, 2000, available at


49 of 57Howe, P. J. (1992, December 5). Cellucci wants plan for schools on table, urges legislators to reveal details. Boston Globe, p. 13. Howe, P. J. (1993a, January 27). School reform deba te begins but antiabortion phrases cause stir. Boston Globe, p. 14. Howe, P. J. (1993b, January 29). House in 4th day o f talks on education reform bill. Boston Globe, p. 13. Howe, P. J. (1993c, April 22). Education reform los es luster on the hill. Boston Globe, p. 57.Jackson, D. Z. (1996, August 9). A warped board of education. Boston Globe, p. A19. Kaestle, C. F. (1983). Pillars of the Republic. New York, NY: Hill and Wang. Kaufman, P. (2001). The national dropout data colle ction system: Assessing consistency. Paper presented at the Conference on Dropout Resear ch, Harvard University, Cambridge, MA. January, 2001, draft available at /dropout/kaufman.html Klein, S. P., Hamilton, L. S., McCaffrey, D. F., & Stecher, B. M. (2000). What Do Test Scores in Texas Tell Us? Education Policy Analysis Archives 8 (49). October, 2000, available at Kohn, A. (2000, March 20). Poor teaching for poor s tudents: more reasons to boycott the MCAS tests. Boston Globe, p. A11. Koretz, D. (1992). What happened to test scores, an d why? Educational Measurement: Issues and Practice 11 pp. 7-11. Krueger, A. B. (1999). Experimental estimates of ed ucation production functions. Quarterly Journal of Economics 114, 497-532. Langborgh, E. (1999). CUNY remediates itself. Accuracy in Academia, published at Lehigh, S. (1993, March 13). Roosevelt may rouse in terest in '94. Boston Globe, p. 14. Leung, S. (1997, January 16). State board rejects G ED exam plan. Boston Globe, p. B3. Madaus, G., & Clarke, M. (1998). The adverse impact of high stakes testing on minority students: Evidence from 100 years of test data. Pap er presented at the High Stakes K-12 Testing Conference, Teachers College, Columbia Univ ersity, New York, NY. December, 1998, draft available at testing98/drafts/madaus2.html Magnani, D. 
P., Havern, R. A., Resor, P. A., et al. (2000). An act recognizing vocational-technical learning in educational assess ment. Massachusetts Senate Bill 282, Senate Docket 1656. December, 2000, available at


50 of 57Marcus, J. (1999, October). The shocking truth abou t our public schools. Boston Magazine.Massachusetts Business Alliance for Education (1991 ). Every Child a Winner: A Proposal for a Legislative Action Plan for Systemic Reform of Massachusetts' Public Primary and Secondary Education System. Worcester, MA: Massachusetts Business Alliance for Education.Massachusetts Department of Education (1994). Massachusetts Common Core of Learning, Malden, MA: Department of Education. August, 2001, available at .html Massachusetts Department of Education (1996a). High expectations for students and accountability for all. Education Today 12 (1). Malden, MA: Department of Education. Massachusetts Department of Education (1996b). Asse ssment development committees represent school, district and community interests. Education Today 12 (1). Malden, MA: Department of Education.Massachusetts Department of Education (1997). Vocat ional education. Malden, MA: Department of Education, Robert V. Antonucci, Commi ssioner. Massachusetts Regulation 603 CMR 4.00 et seq., July, 1997.Massachusetts Department of Education (1998a). Minutes of the Massachusetts Board of Education, January 12, 1998 Malden, MA: Department of Education. Summary available at Massachusetts Department of Education (1998b). MCAS Performance Level Definitions. Malden, MA: Department of Education, David P. Drisc oll, Commissioner. September, 1998, available at Massachusetts Department of Education (1998c). Background on the MCAS tests of May, 1998. Malden, MA: Department of Education, David P. Dris coll, Commissioner. November, 1998, available at ontents.html Massachusetts Department of Education (1999a). Acco unting and reporting: school districts. Malden, MA: Department of Education, Dav id P. Driscoll, Commissioner. Massachusetts Regulation 603 CMR 10.03, July, 1999.Massachusetts Department of Education (1999b). School and District Accountability System, Malden, MA: Department of Education, David P. 
Dris coll, Commissioner. October, 1999, available at Massachusetts Department of Education (1999c). MCAS 1998 Technical Report, Malden, MA: Department of Education, David P. Driscoll, Com missioner. September, 1999, available at ort_full.pdf Massachusetts Department of Education (1999d). Massachusetts Comprehensive Assessment System: Guide to Interpreting the 1999 M CAS Reports for Schools and Districts, Malden, MA: Department of Education, David P. Dris coll, Commissioner.


51 of 57November, 1999, available at l_guide.pdf Massachusetts Department of Education (1999e). Massachusetts Comprehensive Assessment System: Report of 1999 State Results, Malden, MA: Department of Education, David P. Driscoll, Commissioner. Novembe r, 1999, available at 9com.html Massachusetts Department of Education (1999f). Stan dards for competency determination. Malden, MA: Department of Education, David P. Driscoll, Commissioner. Massachusetts Regulation 603 CMR 30.0 3, November, 1999. Massachusetts Department of Education (2000a). Mass achusetts certificate of mastery. Malden, MA: Department of Education, David P. Drisc oll, Commissioner. Massachusetts Regulation 603 CMR 31.03, March, 2000 Massachusetts Department of Education (2000b). Unde rperforming schools and school districts. Malden, MA: Department of Education, Dav id P. Driscoll, Commissioner. Massachusetts Regulation 603 CMR 2.00 et seq., May, 2000. Massachusetts Department of Education (2000c). Report of 1999 Massachusetts and Local School District MCAS Results by Race/Ethnicit y. Malden, MA: Department of Education, David P. Driscoll, Commissioner. May, 20 00, available at html Massachusetts Department of Education (2000d). Massachusetts Comprehensive Assessment System Requirements for the Participatio n of Students with Limited English Proficiency. Malden, MA: Department of Education, David P. Dris coll, Commissioner. June, 2000, available at Massachusetts Department of Education (2000e). Final FY01 Chapter 70 Aid and Spending Requirements. Malden, MA: Department of Education, David P. Dris coll, Commissioner. August, 2000, available at Massachusetts Department of Education (2000f). School and District Profiles. Malden, MA: Department of Education, David P. Driscoll, Com missioner. September, 2000, available at Massachusetts Department of Education (2000g). Release of Spring 2000 Test Items. Malden, MA: Department of Education, David P. Drisc oll, Commissioner. 
November, 2000, available at Massachusetts Department of Education (2000h). MCAS Results, 1998-2000. Malden, MA: Department of Education, David P. Driscoll, Com missioner. November, 2000, available at Massachusetts Department of Education (2000i). MCAS 1999 Technical Report. Malden, MA: Department of Education, David P. Driscoll, Com missioner. November, 20010, available at .pdf Massachusetts Department of Education (2000j). Massachusetts Curriculum


52 of 57Frameworks. Malden, MA: Department of Education, David P. Dris coll, Commissioner. November, 2000, available at Massachusetts Department of Education (2000k). Curriculum Framework Archive. Malden, MA: Department of Education, David P. Drisc oll, Commissioner. November, 2000, available at Massachusetts Department of Education (2000l). MCAS Alternate Assessment. Malden, MA: Department of Education, David P. Driscoll, Com missioner. November, 2000, available at tm Massachusetts Department of Education (2001). School Performance Ratings by District. Malden, MA: Department of Education, David P. Drisc oll, Commissioner. January, 2001, available at 00.pdf Massachusetts Department of Revenue (1999). Measures of Property Wealth and Income. Boston, MA: Division of Local Services, Massachuse tts Department of Revenue. September, 1999, available at McDuffy v. Robertson, 615 N.E.2d 516 (MA 1993).McLaughlin v. Boston School Committee, 938 F.Supp. 1001, 1018 (MA 1996). McNeil, L. M. (2000), Contradictions of School Reform, New York, NY: Routledge. Metropolitan Area Planning Council (1997). Sustainable Development in the Metropolitan Boston Region. Boston, MA: Metropolitan Area Planning Council, Gr ace S. Shepard, President. August, 1997, available at National Assessment of Educational Progress (1998). National and State Summary Data Tables. Washington, DC: National Assessment of Educational Progress, US Department of Education, July 1998, available at National Center for Education Statistics (2001). Digest of Education Statistics, 2000. Washington, DC: U. S. Department of Education.Nealon, P. (1993, April 15). Project may run some p ublic Mass. schools. Boston Globe, p. 39.New York State Education Department (2000a). Understanding Your School Report Card 2000, Guide to Secondary School Examinations. Albany, NY: Education Department. February, 2000, available at New York State Education Department (2000b). New York State School Report Card Database. 
Albany, NY: Education Department. March, 2000, ava ilable at Nicodemus, A. (2000, November 21). Alternative test sought for Voc-Tech. New Bedford Standard-Times, New Bedford, MA. Nye, B., Hedges, L. V., & Konstantopoulos, S. (1999 ). The long-term effects of small classes: A five-year follow-up of the Tennessee cla ss size experiment. Educational


53 of 57Evaluation and Policy Analysis 21 (2), 127-142. Orfield, G., Arenson, J., Jackson, T., et al. (1997 ). City-Suburban Desegregation: Parent and Student Perspectives in Metropolitan Boston. Cambridge, MA: Harvard Civil Rights Project, Harvard University. September, 1997, draft available at Orth, S. P. (1909, March). Plain facts about public schools. Atlantic Monthly Overdue school reform bill (1992, December 23). Boston Globe, p. 14. Partee, G. (1997). High School Reform and Systemic Districtwide Reform in Boston, Massachusetts. Washington, DC: American Youth Policy Forum. April 1997, available at Patch, B., & Wallace-Wells, B. (1998, April 22). La mar Alexander on educating America. Dartmouth Review, Dartmouth, NH. Pawlak-Seaman, S. (1996, August 9). Silber as ed cz ar? That's really scary. New Bedford Standard-Times, New Bedford, MA. Pawlak-Seaman, S. (1997, January 8). Silber: Test s cores "pathetic." New Bedford Standard-Times, New Bedford, MA. Peyser, J. A. (1996). Changing the monopoly structu re of public education. Dialogue 12. Boston, MA: Pioneer Institute.Peyser, J. A. (1998). Human Services Financing: A New Model. Boston, MA: Pioneer Institute.Phillips, M., Crouse, J., & Ralph, J. (1998). Does the black-white test score gap widen after children enter school? In C. Jencks & M. Phil lips (Eds.), The Black-White Test Score Gap (pp. 229-272). Washington, DC: Brookings Instituti on. Powers and Duties of the Department of Education. Massachusetts General Laws, Chapter 69. Available at Public Schools. Massachusetts General Laws, Chapter 71. Available at Ribadeneira, D. (1993, January 26). Coalition raps aspects of reform package. Boston Globe, p. 14. Rowe, K. J. (1999a). Assessment, performance indica tors, "league tables," "value-added" measures and school effectiveness. Paper presented at the Australian Association for Research in Education 1999 Conference, Melbourne, A ustralia. December, 1999, available at Rowe, K. J., Turner, R., & Lane, K. (1999b). 
The "m yth" of school effectiveness: locating and estimating the magnitudes of major sou rces of variation in students' year 12 achievements within and between schools over five y ears. Paper presented at the Australian Association for Research in Education 19 99 Conference, Melbourne,


54 of 57Australia. December, 1999, available at Sacks, P. (1999). Standardized Minds. Cambridge, MA: Perseus Books. School reform bill benefits and costs for fiscal 19 94 (1992, December 23). Boston Globe, p. 20.Smith, J. R., Brooks-Gunn, J., & Klebanov, P. K. (1 997). Consequences of living in poverty for young children's cognitive and verbal a bility and early school development. In G. J. Duncan & J. Brooks-Gunn (Eds.), Consequences of Growing Up Poor (pp. 132-167). New York, NY: Russell Sage Foundation.Steinberg, J. (2000, April 13). Blue books closed, students protest state tests. New York Times, p. A1. Still too early to consider graded diploma system ( 1999, August 9). Springfield Union-News, Springfield, MA. Stotsky, S. (1999). Losing Our Language: How Multicultural Classroom In struction is Undermining Our Children's Ability to Read, Write a nd Reason. New York, NY: Simon & Schuster.Szechenyi, C. (1998, April 5). Steps urged to avoid test errors; Former education commissioner recommended national oversight panel. Middlesex News, Framingham, MA.Taylor, J. (1993, May 2). Crusade for schools puts CEO in limelight he shuns. Boston Globe (Northwest Edition), p. 1. Texas Education Agency (2000a). Student Assessment Division Performance Standards. Austin, TX: Education Agency. March, 2000, availabl e at Texas Education Agency (2000b). 2000 Statewide TAAS Summary Reports. Austin, TX: Education Agency. March, 2000, available at Tuerck, D.G. (2001a). Promoting Good Schools through Wise Spending. Boston, MA: Beacon Hill Institute, Suffolk University. January, 2001, summary available at .pdf Tuerck, D.G. (2001b, January 20). MCAS rating syste m needs to be fixed. Boston Globe, p. A15.Tyack, D. B. (1974). The One Best System. Cambridge, MA: Harvard University Press. Tyack, D. B., & Cuban, L. (1995). Tinkering Toward Utopia, Cambridge, MA: Harvard University Press.Uriarte, M., & Chavez, L. (1999). Latino Students and Massachusetts Public Schools. 
Boston, MA: Mauricio Gaston Institute for Latino Co mmunity Development and Public Policy, University of Massachusetts. November, 1999 available at


55 of 57 Vaishnav, A. (2000, May 26). MCAS opponents adminis ter a pop quiz: Questions stump many passersby. Boston Globe, p. B4. Vennochi, J. (1993, March 12). Dangerous liaisons. Boston Globe, p. 5. Vocational Education. Massachusetts General Laws, Chapter 74. Available at Walser, N. (1997). Financing fairness. Commonwealth 2 (2-3), Spring. Boston, MA: MassINC.Wheelock, A. (1999). High-stakes test-based account ability policies: problems and pitfalls. Cambridge, MA: National Center for Fair a nd Open Testing. December, 1999, available at ountability.html Why standards and tests? (1999). Mass Insight Education's Organization and Programs. Boston, MA: Mass Insight Education and Research Ins titute. December, 1999, available at Wong, D. S. (1996, August 10). Weld picks education panel nominees. Boston Globe, p. B5.Wood, J. (1999). An Early Examination of the Massachusetts Charter S chool Initiative. Boston, MA: Donahue Institute, University of Massac husetts. December, 1999, available at .html .About the AuthorCraig BolonPlanwright Systems CorpP.O. Box 370Brookline, MA02446-0003 USAEmail: cbolon@planwright.comPhone: 617-277-4197Craig Bolon is President of Planwright Systems Corp ., a software development firm located in Brookline, Massachusetts, USA. After sev eral years in high energy physics research and then in biomedical instrument developm ent at M.I.T., he has been an industrial software developer for the past twenty y ears. He is author of the textbook Mastering C (Sybex, 1986) and of several technical publication s and patents. He is an elected Town Meeting Member and has served as membe r and Chair of the Finance Committee in Brookline, Massachusetts.Copyright 2001 by the Education Policy Analysis ArchivesThe World Wide Web address for the Education Policy Analysis Archives is


General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-0211. (602-965-9644). The Commentary Editor is Casey D. Cobb.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
Greg Camilli, Rutgers University
John Covaleskie, Northern Michigan University
Alan Davis, University of Colorado, Denver
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Richard Garlikov
Thomas F. Green, Syracuse University
Alison I. Griffith, York University
Arlen Gullickson, Western Michigan University
Ernest R. House, University of Colorado
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Calgary
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Dewayne Matthews, Education Commission of the States
William McInerney, Purdue University
Mary McKeown-Moak, MGT of America (Austin, TX)
Les McLean, University of Toronto
Susan Bobbitt Nolen, University of Washington
Anne L. Pemberton
Hugh G. Petrie, SUNY Buffalo
Richard C. Richardson, New York University
Anthony G. Rud Jr., Purdue University
Dennis Sayers, California State University-Stanislaus
Jay D. Scribner, University of Texas at Austin
Michael Scriven
Robert E. Stake, University of Illinois-UC
Robert Stonehill, U.S. Department of Education
David D. Williams, Brigham Young University

EPAA Spanish Language Editorial Board

Associate Editor for Spanish Language: Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México

Adrián Acosta (México), Universidad de
J. Félix Angulo Rasco (Spain), Universidad de
Teresa Bracho (México), Centro de Investigación y Docencia Económica-CIDE
Alejandro Canales (México), Universidad Nacional Autónoma de México
Ursula Casanova (U.S.A.), Arizona State University
José Contreras Domingo, Universitat de Barcelona
Erwin Epstein (U.S.A.), Loyola University of
Josué González (U.S.A.), Arizona State University
Rollin Kent (México), Departamento de Investigación Educativa-DIE
María Beatriz Luce (Brazil), Universidade Federal do Rio Grande do Sul-UFRGS, lucemb@orion.ufrgs.br
Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México, javiermr@servidor.unam.mx
Marcela Mollis (Argentina), Universidad de Buenos Aires
Humberto Muñoz García (México), Universidad Nacional Autónoma de México, humberto@servidor.unam.mx
Angel Ignacio Pérez Gómez (Spain), Universidad de
Daniel Schugurensky (Argentina-Canadá), OISE/UT
Simon Schwartzman (Brazil), Fundação Instituto Brasileiro de Geografia e Estatística
Jurjo Torres Santomé (Spain), Universidad de A Coruña
Carlos Alberto Torres (U.S.A.), University of California, Los Angeles