Education Policy Analysis Archives

Volume 1 Number 11    November 2, 1993    ISSN 1068-2341

A peer-reviewed scholarly electronic journal. Editor: Gene V Glass, Glass@ASU.EDU. College of Education, Arizona State University, Tempe AZ 85287-2411

Copyright 1993, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article provided that EDUCATION POLICY ANALYSIS ARCHIVES is credited and copies are not sold.

Why Production Function Analysis is Irrelevant in Policy Deliberations Concerning Educational Funding Equity

Jim C. Fortune
College of Education
Virginia Tech University
FORTUNE@VTVM1.BITNET

Abstract: Hanushek and Walberg use production function methodology to contend that there is no relationship between school expenditures and student achievement. Production function methodology uses correlational methods to demonstrate relationships between input and output in an economic system. These correlational methods may serve to hide rather than reveal these relationships. In this paper threats to the validity of these correlational methods for the analysis of expenditure-achievement data are discussed and an alternative method of investigation is proposed. The proposed method is illustrated using data from two states (Ohio and Missouri). The method demonstrates relationships between expenditures and achievement that were overlooked by the production function method.

Introduction

"On 26 February 1988 Bennett remarked, 'Money doesn't cure school problems.' On 29 February 1988 he was more explicit: 'We've done 147 studies at the Department of Education. We cannot show a strong, positive correlation between spending more and getting a better result.'
In an earlier reference to those studies, he had said on 13 April 1987 that 'in only two or three do we find even a weak correlation between spending and achievement.'" (Baker, 1991)

The 147 studies referred to by Bennett are those summarized by Hanushek (1986) using the production function technique.

Hanushek (1989) contended that "Variations in school expenditures are not systematically related to variations in student performance" and that "... schools are operated in an economically
inefficient manner." He suggested that "increased school expenditures by themselves offer no overall promise for improving education" and that "school decision making must move away from the traditional 'input directed' policies to ones providing performance incentives." To support his contentions, Dr. Hanushek relied on the 26-year-old, much maligned study by Coleman et al., Equality of Educational Opportunity, and his summary of 187 studies using educational production functions.

Walberg appears to base his case contending no relationship between achievement and productivity on his theory of causal influences on student learning and the resulting nine productivity factors (1982), on the triad relationship of socioeconomic status, productivity, and expenditures (1989), and on Hanushek's model and the early literature related to production function analysis (1984).

POLICY RELEVANCE OF THE PRODUCTION FUNCTION METHODOLOGY

Monk (1992) described production function analysis as the relating of an input measure to an output measure using correlation or multivariate analysis (regression analysis). He reported that production research began in education some 30 years ago. The process involves the study of relationships between purchased schooling inputs and educational outcomes. The research, according to Monk, is deductively driven, although the deductive arguments tend to be abbreviated. He suggested that the approach has limited utility in policy research because of methodological and conceptual limitations. Monk pointed out that recent research includes more complex multivariate models which have greater potential for illuminating policy.

Both traditional production function analyses and the modern multivariate version to which Monk alluded are based on correlational methods which are inadequate to deal with causation.
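The inadequacy of correlation for establishing causation can be made concrete with a small simulation (a sketch with invented numbers, not data from any actual district): when a third factor drives both the input and the output, the simple correlation model reports a relationship where no causal link exists.

```python
import random

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# Hypothetical districts: community wealth drives BOTH spending and
# achievement; spending has no direct effect on achievement in this model.
wealth = [random.gauss(0, 1) for _ in range(2000)]
spending = [w + random.gauss(0, 0.5) for w in wealth]
achievement = [w + random.gauss(0, 0.5) for w in wealth]

r = pearson(spending, achievement)
print(round(r, 2))  # a strong correlation appears despite no causal link
```

The same logic runs in reverse, which is the paper's point: correlation can manufacture a relationship that is not causal, and (as argued below) it can equally obscure one that is.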
In the simple linear correlation model, a single input variable (often expenditures, but sometimes other school-related inputs such as teacher experience or teacher preparation) is correlated with a single output variable (usually achievement, but sometimes percent passing minimum competency tests or rate of graduation). The multiple dimensionality of schooling suggests that such simple representations of either input or output are inadequate to describe the production relationships.

The second major production function analysis model is based on regression procedures, where a single output variable is predicted by one or more input variables (chosen from expenditure data, teacher experience or teacher preparation) and by intervening variables (such as socio-economic variables, school size, and the like). The purpose of using the intervening variables is to control factors which may confound the actual input-output relationship. In some applications the researcher permits the intervening variables to enter the regression prior to the entry of the input variables. There exists a serious problem with shared variance among the three sets of variables that will be discussed later. Regression based on the prediction of the output variable residual (which has been created by regressing the partial correlation residual of the intervening variables, controlling for their relationship to the input variables, with the output variable) by the input variables is a more appropriate application to control for confounding variables.

Problems with the Simple Linear Correlation Approach

The Assumptions of the Linear Correlation Approach. Application of the simple correlation model must meet the data assumptions required by correlation, the limitations to inference assumed in the use of the model, and the implicit assumptions about the relationship between correlates inherent in the production function methodology. For the application of the Pearson's
Product Moment Correlation it is required that one have near or better than interval data for paired cases and that the full range of each variable be present. The coefficient attained measures the linear relationship between the two variables and indicates association, but not necessarily causation.

What Constitutes Differences in Expenditures?

"Throwing a bucket of water on a raging fire will not keep a building from burning to the ground, but no one would argue on the basis of this experience that water has no value in fire-fighting. The value of water is apparent only when enough is applied to overcome the fire by reducing the heat below a critical point, degrading the fuel, or temporarily removing the air needed for combustion. An analogous situation often occurs in education. Frequently, we judge an intervention strategy to be ineffective before we have really implemented a program that is intense enough to achieve the desired effects. 'Compensatory education' is a case in point." (Bridge, Judd, and Moock, 1979)

The above phenomenon has been labeled a threshold effect. One reason why the correlation method of production function analysis does not show effects of small differences of funding on achievement is the threshold effect. A one-dollar difference in funding will not purchase a commensurate or observable difference in achievement. Instead some larger, aggregate difference in funding, perhaps $600 or $700, is needed to purchase observable differences in achievement.

Perhaps the greatest problem in the use of the simple, linear correlation method, beyond variable specification, is the absence of the cost disparities that are essential to demonstrate differences in educational purchasing power. An ordering of districts by amount of instructional expenditures does not necessarily order the same districts by their educational purchasing power.
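The threshold effect described above can be sketched in correlational terms (the spending figures and the $600 threshold are hypothetical, chosen only for illustration): when every observed district falls below the threshold, the computed correlation is near zero even though spending genuinely matters once the threshold is crossed.

```python
import random

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sum((x - mx) ** 2 for x in xs)
                  * sum((y - my) ** 2 for y in ys)) ** 0.5

def score(spend):
    # Achievement responds only after spending clears a $600 threshold
    # over a $4000 base; below that, extra dollars buy nothing visible.
    return (75 if spend >= 4600 else 60) + random.gauss(0, 5)

random.seed(2)
narrow = [random.uniform(4000, 4500) for _ in range(500)]  # all below threshold
wide = [random.uniform(4000, 6000) for _ in range(500)]    # threshold is crossed

r_narrow = pearson(narrow, [score(s) for s in narrow])
r_wide = pearson(wide, [score(s) for s in wide])
print(round(r_narrow, 2), round(r_wide, 2))
```

In the narrow-range sample the correlation hovers near zero; only when the sample spans the threshold does the relationship become visible.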
One district may have five dollars less in per pupil expenditure than a second district, but may have to pay on the average ten dollars more per teacher than does the second district. Ordering of districts by dollar differences which are less than the measurement error associated with expenditures results in gross underestimation of the true relationship between costs and achievement.

The Truncated Variable (Attenuation). Percent passing a test as a measure of achievement represents a somewhat unusual truncation of a variable in that the variance on the achievement measure is limited to variation of dichotomies rather than variation across the full set of test scores. Variable truncation also occurs when the tests have either floor or ceiling effects, when only one specific segment of the enrollment is used (such as at-risk students or college students) or when data are not available for the entire sample being analyzed.

Potential Non-Linear Relationships. The simple, linear correlation method will not identify non-linear relationships between the input and output variables. In one of the two states discussed later in this paper, I found a quadratic relationship in exploring the data. A state department report in the second state also alluded to a potential quadratic relationship between input-output variables.

Problems with the Multiple Regression Approach

The Assumptions of the Multiple Regression Approach. The application of the regression approach is characterized by a single output variable (some form of achievement measurement or percent reaching an educational standard) being predicted by one or more input variables (expenditures, teacher characteristics, and the like) and controlling for one or more background variables (such as socio-economic variables or school size). Two ways are used to control for the background variables. The first way is to permit the background variables to enter first in the
prediction equation. The second is a residualizing technique. The residualization process involves creating the residual of the output variable by regressing the first partial of the controlling variables with the vector of predictors on the output variable. The linear combination of the predictor variables is then regressed on the output residual. The regression approach requires that the researcher meet all of the assumptions that have to be met in the simple, linear correlation analysis. In addition, the researcher is required to have a theory or rationale for establishing the order of variable entry and an understanding of the shared variance problem.

The Order of Variable Entry Problem. The order of variable entry in the calculation of the correlation is important in handling shared variance or commonality of explanation. If two correlated independent variables (predictors) are related to a dependent variable (outcome or criterion), the first variable to enter into the regression calculation gets credit for all of its correlation with the dependent variable. When the second variable is entered into the regression calculation, it gets credit only for the correlation that it has with the dependent variable that has not been explained by the first variable entered. Hence, the first variable gets credit for the correlation with the dependent variable that is shared by the second variable. Critics of Coleman showed that his ordering of effects does not hold up across applications of different regression models. (Pedhazur, 1982)

The Shared Variance Problem. In dealing with the triad relationship created by the output variable, the input variables and the controlling variables, Walberg (1989) simply failed to discuss how he handled the shared variance problem inherent in the triad.
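The order-of-entry problem can be illustrated numerically with simulated data (all coefficients are invented for the sketch): the variance credited to an expenditure-like predictor collapses when a correlated SES-like variable is entered first, using the standard two-predictor R-squared formula.

```python
import random

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sum((x - mx) ** 2 for x in xs)
                  * sum((y - my) ** 2 for y in ys)) ** 0.5

random.seed(3)
# Hypothetical data: SES and spending are correlated, and achievement
# genuinely depends on both.
ses = [random.gauss(0, 1) for _ in range(3000)]
spend = [0.7 * s + random.gauss(0, 0.7) for s in ses]
ach = [s + 0.5 * sp + random.gauss(0, 1) for s, sp in zip(ses, spend)]

r1, r2, r12 = pearson(ses, ach), pearson(spend, ach), pearson(ses, spend)

# R^2 with both predictors (standard two-predictor formula).
r2_both = (r1 ** 2 + r2 ** 2 - 2 * r1 * r2 * r12) / (1 - r12 ** 2)

credit_spend_first = r2 ** 2           # variance credited if spending enters first
credit_spend_last = r2_both - r1 ** 2  # variance credited if spending enters after SES
print(round(credit_spend_first, 2), round(credit_spend_last, 2))
```

The same predictor is credited with a large share of variance when entered first and almost none when entered last; nothing about the data changed, only the bookkeeping.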
His regression model enters socio-economic status as the first predictor of students' test performances, size as the second prediction variable, and finally expenditures as the third predictor variable. The amount of explanation shared by socioeconomic status and size and the amount of explanation shared by socio-economic status and expenditures are credited to socioeconomic status solely; the amount of explanation shared by size and expenditures is then attributed to size alone. Certainly not much variance remains to be explained by expenditures. A different order of entry would produce markedly different results. Pedhazur (1982) credits Mayeske with the development of commonality analysis to address this problem, but this methodology has been subjected to some criticism. There is in fact no effective statistical method that will unconfound shared predictive relationships. The only appropriate treatment of the shared relationship problem is, perhaps, a straightforward admission that it is the cause of the unresolvable ambiguity.

Other Design Problems for Both Correlational Models

Inadequate Variable Specification. In addition to the difficulty created by trying to represent multiple inputs and outputs by single variables, there is the additional difficulty of including confounding data elements in the input and output variable measurement. Selected single variables may provide inadequate description of key inputs or outputs, may be unlikely to have the relationship assumed by the production function paradigm, and may not be accurately measured.

Inclusion of Confounded Data Elements. Federal dollars are included in school expenditures as unrestrained expenditures. Some federal dollars are likely ear-marked for efforts that do not contribute to student performance on achievement tests and some federal funds are not involved in instruction.
The inclusion of federal funds is not nearly as serious a potential problem as the practice of some districts of testing special education students and including their scores in the test results. Hence, when there is random confounding of the performance measure or the selection of a weak input variable, each serves to reduce the size of relationships. The choice of percent passing a basic competency test is an unfortunate choice of measure for an output variable. Percent passing immediately sets
up a ceiling effect for those passing the test. The use of a dichotomized scoring process reduces the amount of variance to be explained and attenuates the observed relationship.

Inadequate Determination of the Input Variables. Variable specification problems occur in three ways in the determination of input variables. Problems occur when input measures are chosen that are not related to instruction. Perhaps the most frequent example of this problem occurs in the use of teacher salary as an input measure. Teacher salary is based on seniority and is likely not related to quality of instruction. The second way that problems occur in the selection of input measures is the selection of an input which cannot be measured adequately across all districts. An example of this can be seen where school district size varies enough that economy of scale enters into the accuracy of the measure. Very small districts require more dollars per pupil to provide educational services equivalent to those of larger districts. The third way that selection of input variables can create problems is when in some districts the input variable reflects a larger investment in special students than in other districts. Such cases are generated when districts have a large number of "At Risk" students or where a district invests highly in advanced placement instruction.

Inadequate Determination of the Output Variables. Variable specification problems occur in at least four ways in the determination of output variables. The first way is when the output variable that was chosen was a minor emphasis of many schools. Such may be the case when school districts focus more on emotional, attitudinal, behavioral, or vocational outcomes. The second way that dependent variable specification problems can occur is when there are floor and ceiling effects in the measures.
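The attenuation produced by dichotomized scoring, noted above, can be illustrated with simulated data (all values hypothetical): the same underlying relationship yields a visibly smaller correlation once continuous scores are reduced to pass/fail.

```python
import random

def pearson(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (sum((x - mx) ** 2 for x in xs)
                  * sum((y - my) ** 2 for y in ys)) ** 0.5

random.seed(4)
spend = [random.gauss(0, 1) for _ in range(3000)]
score = [0.5 * s + random.gauss(0, 1) for s in spend]  # true input-output link
passed = [1.0 if sc > 0 else 0.0 for sc in score]      # pass/fail style cut score

r_score = pearson(spend, score)   # correlation with full test scores
r_pass = pearson(spend, passed)   # correlation with a percent-passing style measure
print(round(r_score, 2), round(r_pass, 2))
```

Collapsing scores to a dichotomy discards within-group variance, so the observed correlation shrinks even though the underlying relationship is unchanged.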
If the achievement measure has a ceiling or a floor effect, then many of the students making a perfect or a zero score have accomplishments that are not being measured. The third way that variable specification problems can occur is when the output variables have no logical linkage to either the selected input variables or to school quality. An example of this problem is the "Efficiencies" notion used by Walberg (1989). "Efficiencies" are school expected outputs developed by the use of prediction based on socio-economic status. The variable can be argued to better represent an error of measurement of the socioeconomic construct than an actual measure of school output. The fourth way that variable specification problems can occur is the selection of an output measure that does not pertain to the whole student body. An example of this is the selection of freshman grade point averages for their first year of college. Differential proportions of students across districts go to college, college curricula differ in difficulty and colleges differ in difficulty.

Crossing Economic Eras. Production function studies are often grouped for interpretation and for the making of policy recommendations. The 38 publications from which Hanushek extracted his review range from the late 1950s to the early 1980s. This means that several of the studies were conducted in different economic eras. In the 1950s, there was a dearth of federal funding, but there was a wave of post-war resources and the early beginning of inflation. The 1960s brought the Elementary and Secondary Education Act, increased federal funding, escalation of inflation, baby boom growth beginning to enter schools and the emergence of civil rights as major issues in education. The 1970s brought a slowing of federal funding, abatement of inflation and more focus on growing enrollment.
The 1980s marked a reduction in federal funds, the beginning of a recession, the start of program retrenchment and the end of growing enrollment. It is quite likely that input-output relationships differ across these four decades.

Inconsistent Determination of What is to be Considered a Production Function Study. Several of the studies included in Hanushek's (1989) reviews do not have one or more of the elements required to be classified as production function analyses. One such study is a study that occurred in a large school district where teacher experience and differential teacher salaries were used as
input variables. (Murnane, 1975) In another study, college freshman grade-point averages were used as the output variable. (Raymond, 1968) It seems necessary that every study called a production function analysis must have at a minimum an input variable, an output variable, an assumption of a logical linkage between the school, total group and unbiased estimates for both variables across the units of comparison, and the computation of a correlational analysis.

Inadequate Sampling Representation. Problems with sampling representation occur in two ways: through lack of disclosure and through inadequate sample size. Sampling becomes very important in making an inference to a given population. In most production function analyses, the intent appears to be that the researcher wishes to generalize to all of the school districts in the United States. Not a single study or collection of studies appears to meet sampling requirements for this inference.

Criticism of the Work of Hanushek

As Spencer and Wiley (1981, p. 44) suggested, "Hanushek offers a provocative interpretation of the last two decades of research on educational productivity." Unfortunately, "Hanushek misinterprets the data on which he bases his conclusion and draws inappropriate policy implications from them." (Spencer and Wiley, 1981, p. 41) After reading a sampling of Hanushek's articles, I concur with Hughes (1992) that one could quote from 20 years of Hanushek and destroy his current argument with his own words. However, I choose here to look at his current thesis and see if it stands on its own foundation or falls.

Hanushek contended that "There is no systematic relationship between school expenditures and student performance" (Hanushek, 1991, p. 425) and that "... schools are economically inefficient." (Hanushek, 1986, p. 1166) He suggests that "increased school expenditures by themselves offer no overall promise for improving education" (Hanushek, 1986, p.
1167) and that "school decision making must move away from the traditional 'input directed' policies to ones providing performance incentives." (Hanushek, 1989, p. 49) To support his contentions, Dr. Hanushek relies on the 26-year-old, greatly criticized study by Coleman et al., Equality of Educational Opportunity, Washington, D.C., Government Printing Office, 1966; and his own summary of 187 studies (147 of these studies are those referred to by Bennett) of educational production functions. (Hanushek, 1989, p. 46)

The Coleman Study as Support

The Coleman Study did indeed highlight input-output relationships across a large number of districts, using a regression model. Coleman et al. concluded that family characteristics and peer group characteristics were more instrumental in promoting student achievement than were school system characteristics. Critics of the study suggested that this ordering of effects may be due to the analytic model used. Because the nature of regression analysis requires theory to specify models and order of variable entry into the computations, Coleman received considerable criticism, some of which resulted in George Mayeske's contributions to a new analytic technique, commonality analysis (Pedhazur, 1982).

The order of variable entry in the calculation of the correlation is important in handling shared variance or commonality of explanation. If two independent variables (predictors) are related to a dependent variable (outcome or criterion) and are related to each other, the first variable to enter into the computation gets credit for the correlation to the dependent variable that it shares with the second variable. Hence, if a family variable enters first in the computation of the correlation being used in predicting reading performance and then a peer variable enters into the calculation,
the regression results will show for the family variable its unique correlation with reading performance plus the correlation to reading performance that it shares with the peer variable. For the peer variable only its unique correlation to reading performance is shown. Critics of Coleman show that his ranking of effects does not hold up across applications of different regression models. A second criticism of using Coleman as a primary research foundation lies in the age of the Coleman data. Any economist should be able to see that time has likely made relationships in the Coleman data obsolete with regard to today's economy.

Hanushek's Summary of Production Functions

Hanushek's (1989) summary of 187 studies of educational production functions is a continuing theme throughout his publications. This summary began in 1981 with 29 articles and 130 studies, was continued in 1986 with 33 articles and 147 studies, and was completed in 1989 with 38 articles and 187 studies. The summary is the research foundation for Hanushek's assertion of no relationship between school districts' expenditures and student performance on standardized achievement tests.

There are several serious omissions and research flaws in the description and logic of Hanushek's summary. These include the lack of disclosure of sample sizes in the studies that were reviewed, inadequacy in size and representativeness of the 187 case studies, misinterpretation of the results of the hypothesis testing, potential misinterpretation of the summary, failure to use selected research that is not consistent with the ideas being promoted (Glass and Smith (1979), Spencer and Wiley (1981), Burstein (1980)), and inadequate specification of the key variables.

Lack of Information on Sample Sizes in the Studies that Were Reviewed. The studies that were reviewed by Hanushek were qualified in some unspecified manner. It appears that the primary criterion for qualification was publication.
Hanushek stated that at least one study deals with a district or districts in all regions of the United States, with different grade levels, and across different performance measures. He provided two tables that are purported to describe the sample. In Table 1 of his 1989 article, "The Impact of Differential Expenditures on School Performance," Hanushek showed the number of studies dealing with single districts (60) and the number dealing with multiple districts (127), but he failed to provide any information on the number of districts involved in the multiple-district studies. In his Table 2, Hanushek showed that 90 studies deal with at least one grade level in the range of grades from 1 to 6 and that 97 studies deal with at least one grade level in the range of grades from 7 to 12. No attempt is made to show replication across grade levels, number of students involved at each grade level or for each district. With so few cases, the reader must wonder where the holes are in the sample.

Inadequate Size and Lack of Representativeness of the 187 Case Studies. There are approximately 15,000 public school districts in the United States. These districts are characterized by a large variance in total enrollment. Samples that include a majority of the students and provide a 0.95 confidence level are usually selected randomly using a stratified sampling frame that involves the selection of approximately 800 districts (see the Condition of Education annual reports by the National Center for Education Statistics). A simple random sample without control for the number of students covered requires approximately 400 districts for a 0.95 confidence level and for representation (Schaeffer, Mendenhall and Ott, 1986). The sample used by Hanushek was not random and was likely smaller than either required sample size. The size is less bothersome than the scant likelihood of randomness.
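The roughly 400-district figure for a simple random sample follows from the standard sample-size formula for estimating a proportion (a sketch; the stratified ~800-district design is not derived here, and the margin of error of 5 percentage points is an assumption):

```python
import math

def srs_size_for_proportion(margin, z=1.96, p=0.5):
    """Simple-random-sample size needed to estimate a proportion within
    +/- margin at the confidence level implied by z (1.96 -> 95%).
    p = 0.5 is the conservative (worst-case) assumption."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

n = srs_size_for_proportion(0.05)
print(n)  # 385 districts, close to the ~400 figure cited in the text
```

A non-random collection of 187 studies, however large, provides no such guarantee, which is the point being made above.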
The 187 studies were likely to have been conducted in reaction to some problem or inquiry. Hence, are the relationships found in these unusual districts representative of those that exist in the other 15,000? No evidence is presented to allow the reader to judge the
generalizability of the results.

Misinterpretation of the Results of the Hypothesis Testing. In hypothesis testing, the researcher assumes the null hypothesis and seeks reason to reject it. Failure to find such evidence does not permit one to accept the null hypothesis, but only permits one to fail to accept the alternative hypothesis. Failure to gather evidence that will lead to the acceptance of the alternative hypothesis and the subsequent rejection of the null hypothesis may be due to inadequate sample size, measurement errors or inaccurate model specification.

Spencer and Wiley (1981) used the 109 studies which were analyzed in 1981 by Hanushek, who sought to argue for the conclusion of no relationship between teacher-pupil ratio and the performance of students, as an example that illustrates another of Hanushek's difficulties with the interpretation of significance tests on regression coefficients. Their argument showed that the null hypothesis can be rejected for positive results and then can be rejected for negative results, pointing out difficulty with the model used and the data set.

Potential Misinterpretation of the Summary. Baker (1991) discussed Hanushek's absence of a decision rule in his summary of the literature for the 147 studies (Hanushek, 1986). He stated that a synthesis of literature as reported by Hanushek can be conducted in one of two ways: either by the vote counting method with a stated expectancy or decision rule or by the meta-analysis method. Hanushek did not compute effect sizes, so his review must have been conducted by the vote counting method. Given the absence of the statement of a decision rule by Hanushek, Baker assumed a decision rule that 5% of the studies will be significant by chance. He then showed that 20% of the studies are significant, thus ruling out a chance relationship (Baker, 1991).
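Baker's argument can be checked with a binomial tail probability (a sketch assuming, as the vote count implicitly does, that the studies are independent): if each study had only a 5% chance of reaching significance by accident, observing 20% significant among 147 studies would be extraordinarily unlikely.

```python
import math

def binom_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p), computed by direct summation."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

n_studies = 147
k_significant = math.ceil(0.20 * n_studies)  # Baker's observed 20%
p_chance = binom_tail(n_studies, k_significant, 0.05)
print(k_significant, p_chance)  # the chance probability is vanishingly small
```

Under the 5%-by-chance decision rule the expected count is about 7 of 147; 30 significant studies lies far outside what chance alone would produce, supporting Baker's conclusion.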
In Table 3 of his 1989 article, "The Impact of Differential Expenditures on School Performance," Hanushek showed the expenditure parameters for the 187 studies for seven educational inputs as they relate to student achievement test performance. Although he reported the number of studies, he did not report the number of districts, number of students, or grade levels to which the studies pertain. For the various components he reports the number of non-significant studies found. Hence, 82% of the 152 studies relating teacher/pupil ratio to student performance were found not significant (p<0.05); 88% of the 113 studies relating teacher education to student performance were found not significant (p<0.05); 64% of the 140 studies relating teacher experience to student performance were found not significant (p<0.05); 78% of the 69 studies relating teacher salary to student performance were found not significant (p<0.05); 75% of the 65 studies relating expenditures per pupil to student performance were found not significant (p<0.05); 87% of the 61 studies relating administrative inputs to student performance were found not significant (p<0.05); and 84% of the 74 studies relating facilities to student performance were found not significant (p<0.05). For four of these seven inputs (teacher experience, teacher salary, expenditures/pupil, and administrative inputs) the ratios of significant to nonsignificant studies equal or exceed 11 to 4 odds in favor of positive relationships.

Failure to Cite Research that is not Consistent with the Ideas Being Promoted and Inadequate Specification of the Key Study Variables. Given Hanushek's liberal qualification of studies and his reliance on the Coleman study, his rejection of the Glass study as being subject to too much criticism for attempting to calculate effect sizes for different class size intervals is surprising and unaccountable. Hanushek's failure to address the criticisms of Spencer and Wiley was also surprising.
In his discussion of aggregation effects, the work of Burstein was overlooked. This work demonstrates the potential danger of aggregated data and correlation.

Inclusion of Confounded Data Elements. Federal dollars are included in school expenditures as
unrestrained expenditures. Some federal dollars are ear-marked for efforts that do not contribute to student performance scores. An even more serious potential problem is that some districts test special education students and other districts fail to test special education students. Hence, there is random confounding of the performance measure, reducing the sizes of correlations possible.

Choice of Performance Measure. The choice of percent passing the basic competency test (bct) is an unfortunate choice of measure for a performance indicator. Percent passing immediately sets up a ceiling effect for those passing the test. Even if they benefit from additional or redistributed expenditures, their gains can never be shown in the scattergrams. Gains shown by those who pass and by those who continue to fail are not reflected in the measure.

Baker (1991) noted that another major problem is Hanushek's failure to correct correlations for attenuation arising from the fact that per pupil expenditures are truncated. Baker stated that the correlation between achievement and expenditures is greatly reduced because "no schools spend a great deal more or less than others. ... It is quite easy for a significant finding to be overlooked if the observed data come from the center of a scattergram, where the attenuated data often appear to be random." (Baker, 1991, p.
4)

Criticism of the Work of Walberg

Walberg appears to base the case for no relationship between achievement and expenditures on his theory of causal inferences on student learning and the nine productivity factors (1982); on the triadic relationship of socio-economic status, productivity, and expenditures (1989); and on reliance on Hanushek's model and on the early literature related to production function analysis (1984).

Theory of causal inferences on student learning and the nine productivity factors

Walberg's review of productivity research and his development of the "theory" of school learning has received much professional praise. I am in agreement with this praise in that the model appears to synthesize a large body of research clearly and usefully. Walberg's model includes a paradigm connecting Aptitude (ability, development, and motivation), Instruction (amount and quality), and Environment (home, classroom, peers, and television) as inputs to Learning (affective, behavioral, and cognitive). I believe that this model is an accurate picture of a subset of variables that are precursors of productivity. My experience suggests that curriculum probably should not be ignored and left out of the model. Also, note that no variable entitled "expenditure" is included directly in the model. Yet expenditures are represented indirectly in both Instruction and Environment. Walberg recognized this role in the following statement: "... and expenditure levels of schools and districts, and their political and sociological organization are less alterable in a democratic, pluralistic society; are less consistently and powerfully linked to learning; and appear to operate mainly through the nine factors in the determination of achievement." (Walberg, 1982, p.
120)

What is puzzling about this statement is that Walberg appears to be stretching logic to agree with Hanushek's weak and inconsistent position, reasoning that higher expenditures follow quality instruction rather than serving as mediating factors in the purchase of quality instruction.

The triadic relationship of socio-economic status, productivity, and expenditures

Walberg appears to be interested in the triadic relationship of socio-economic status, productivity (or at least efficiency of student test performance), and expenditures. This interest is expressed in several studies and reviews authored by Walberg. In several of the studies, Walberg appears to
have problems in the specification of at least two, or perhaps all three, of the variables of the triad. Perhaps one of the major problems with how Walberg has set out to study these variables is his lack of control of certain key school variables. In the discussion of studies of the relationship of class size to achievement test performance, nothing is said as to how many of the small classes were made up of special education students or were composed for remediation. The overlooking of these two common practices in schools certainly confounds the study of class size, and the inclusion of special education students confounds the measure of student performance in reading, mathematics, science, or the other standard school curricula criteria used to define school productivity. In his studies of district size, he permits urbanism to confound his variable.

Walberg is frequently unclear as to what is being measured as a variable representing productivity. Sometimes his productivity variable is measured as percent passing. This method of measurement clearly restricts the range of the achievement construct and serves to reduce the observed correlation. At other times, Walberg uses what he refers to as an efficiency measure, which is made up of the predicted achievement score (using socio-economic status in the prediction equation) divided by the observed achievement score. This configuration called "efficiency" appears to more closely represent a measure of prediction error for socio-economic status. Clearly, his expenditure data include funds for transportation, lunch, special education, and similar programs which do not bear directly on instruction. In dealing with this triadic relationship, Walberg simply fails to discuss how he has handled the shared variance problem inherent in the relationship.
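The attenuation that range restriction produces, noted both here and in Baker's critique above, is easy to demonstrate by simulation. The sketch below uses hypothetical district data (the variable names and coefficients are assumptions for illustration, not values from any study): achievement genuinely depends on spending, yet truncating spending to its middle band sharply shrinks the observed correlation.

```python
import math
import random

def pearson_r(xs, ys):
    """Zero-order Pearson correlation, computed from raw sums."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

random.seed(42)
# Hypothetical districts: achievement depends on spending (true r = 0.6).
spending = [random.gauss(0.0, 1.0) for _ in range(4000)]
achievement = [0.6 * s + 0.8 * random.gauss(0.0, 1.0) for s in spending]

r_full = pearson_r(spending, achievement)

# Truncate the predictor, as when "no schools spend a great deal more or
# less than others": keep only the middle band of the spending range.
middle = [(s, a) for s, a in zip(spending, achievement) if abs(s) < 0.5]
r_restricted = pearson_r([s for s, _ in middle], [a for _, a in middle])

print(f"full-range r = {r_full:.2f}, restricted-range r = {r_restricted:.2f}")
```

The same underlying relationship appears strong over the full range and weak, near-random, over the truncated range, which is exactly the scattergram-center phenomenon Baker describes.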
His regression model enters socio-economic status as the first predictor of students' test performances, size as the second predictor variable, and finally expenditures as the third predictor variable. The amount of explanation shared by socio-economic status and size, and the amount shared by socio-economic status and expenditures, are credited solely to socio-economic status; the amount of explanation shared by size and expenditures is then attributed to size alone. Certainly, not much explanation remains to be credited to expenditures. A different order of entry would produce markedly different results. Mayeske developed commonality analysis to address this problem, but the methodology has been subjected to some criticism. In actuality, there is no effective statistical method that will unconfound shared predictive relationships. The appropriate treatment of the shared relationship is perhaps a straightforward discussion of the irresolvability of the problem.

Reliance on Hanushek's model

Walberg depends in several literature reviews on the productivity analyses reported by Hanushek. He appears to rely on them without critical scrutiny and uses Hanushek's work as rationale for demoting the role of expenditures in his model and in further analyses. Walberg's unquestioning acceptance of Hanushek's work raises some concern about the other studies that he uses in his argument.

Regression Analyses of New Jersey Data

The analyses performed for the New Jersey hearings (Walberg, 1989) appear to duplicate many of the faults discussed in Walberg's triad studies, and potentially contain a few new variances from standard research practice. On page 43, lines 4 and 5, of the 1989 document, Walberg's description of regression analyses is misleading. Regression analysis does not provide a method of simultaneous analysis of the predictive contribution of three variables.
Order of entry attributes the variance shared by two variables to the first one entered into the prediction process. The observed relations are most likely not independent; only the last variable to enter the equation is likely to be independent.
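The order-of-entry problem can be sketched numerically. The example below is a minimal illustration with simulated data (the coefficients and variable names are assumptions, not Walberg's data): when SES and expenditures are correlated, the incremental R-squared credited to expenditures depends almost entirely on whether it enters the equation first or last.

```python
import math
import random

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def r2_two_predictors(ry1, ry2, r12):
    """R-squared of a two-predictor regression, from zero-order correlations."""
    return (ry1 ** 2 + ry2 ** 2 - 2 * ry1 * ry2 * r12) / (1 - r12 ** 2)

random.seed(7)
# Hypothetical districts: wealthier (high-SES) districts also spend more,
# and both SES and spending contribute to test scores.
ses = [random.gauss(0.0, 1.0) for _ in range(4000)]
spend = [0.7 * s + 0.7 * random.gauss(0.0, 1.0) for s in ses]
score = [0.5 * s + 0.3 * e + random.gauss(0.0, 1.0) for s, e in zip(ses, spend)]

ry_ses, ry_spend = pearson_r(score, ses), pearson_r(score, spend)
r12 = pearson_r(ses, spend)
r2_both = r2_two_predictors(ry_ses, ry_spend, r12)

# The explanatory credit given to spending depends on when it enters:
spend_entered_last = r2_both - ry_ses ** 2   # SES first, spending second
spend_entered_first = ry_spend ** 2          # spending first

print(f"spending's increment entered last:  {spend_entered_last:.3f}")
print(f"spending's increment entered first: {spend_entered_first:.3f}")
```

The shared SES-spending variance is awarded wholesale to whichever predictor enters first, which is why entering SES first (as Walberg does) leaves expenditures with almost nothing to explain.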
Variable specification is again a problem, as confounding school factors such as special education, remediation processes, transportation costs, and lunchroom expenditures have not been removed from the studies. It appears that the variable "expenditures" rather than "expenditures per student" was run in the correlations. The "efficiencies" prediction is still used as a dependent variable, and the truncated measurement of productivity (such as percent passing) is used in several of the achievement measures. Order of entry and the attendant shared variance problem again arise in these analyses. One wonders what kind of discussion would ensue if an appropriate expenditure variable were entered first in the prediction of test performances that had not been truncated or obscured by the use of ratios.

Demonstration of the lack of validity of the production function methodology

A Suggested Alternative Approach

The production function method must be altered in three ways to make it policy relevant. First, to identify the effects of large versus small expenditures, the research task appears to demand a comparison rather than an association. Second, rather than asking if there is a consistent relationship across the whole population, it is better to ask for what kinds of districts within a state such effects exist. A third change is to create a discrepancy in expenditures large enough to reveal differences in the purchasing power of educational services.

Finding Homogeneous Sets of Districts. Districts within a state differ on many dimensions. Furthermore, the dimensions that are most discriminating in one state may not be so in another. By grouping districts in a particular state into classes (e.g., rich vs. poor) according to the key dimension for that state (e.g., wealth), homogeneous subgroups can be obtained for further analysis.
Size of district, rural/urban location, and number of exceptional children (either gifted or at risk) are variables whose subdivisions are likely to establish subsets of homogeneous groups. In states like Montana and Missouri, size is the dimension which creates homogeneous subgroups. In Alabama, rural/urban is the variable that yields homogeneous subgroups. In Ohio, income level or socio-economic status creates homogeneous subgroups. In some cases, there are one or two large, poor, urban districts which have to be considered as outliers so as to establish homogeneous subgroups.

Creating the Disparity in Funding. In 1970 a study conducted for the Office of Planning and Program Evaluation, Bureau of Elementary and Secondary Education, United States Office of Education found that approximately 300 dollars was needed to improve elementary school children's reading scores one month over the course of a year. A proration of this finding suggests that a disparity of 600 to 700 dollars is needed between the districts compared. Within each homogeneous subgroup, the districts are ordered by instructional expenditures and then divided into two groups, one formed by the upper 30% and the other by the lower 30%. The two groups are equal with regard to sample size, and the difference between the groups on expenditures should exceed 600 dollars. Given the satisfaction of these conditions, differences in achievement scores should be apparent, if they exist.

Using t-Tests to Investigate the Results of the Disparity. Given the creation of the two groups (upper and lower 30%) from a single homogeneous subgroup and the verification of a 600
dollar disparity, the independent t-test with pooled variance can be used to discover achievement test differences. If more than three homogeneous subgroups are to be analyzed, methods to deal with the inflation of the confidence level should be considered. Such methods include the recalculation of the confidence levels to compensate for the use of several t-tests (the Bonferroni procedure) or the use of the family-of-t-tests notion (e.g., the Tukey procedure).

The proposed model can be used to investigate a family of dependent or independent variables, or both. The use of several t-tests provides the method for including a number of dependent or output variables. The ordering of districts for the determination of the upper 30% and lower 30% with regard to the input or independent variables permits the consideration of any number of independent variables.

Application of the Alternative Approach to Two States

Data for the states of Missouri and Ohio were obtained through Education Policy Research, Incorporated, which participated in the suits involving the equity of each state's system for funding the public schools. These data involved the per pupil expenditure data, the proxy data for socio-economic status of the attendance areas of the districts, district enrollment, and the achievement data which were used in the preparation of the cases by both sides in the lawsuits. The achievement data for Missouri are from the Missouri Mastery Achievement Test (MMAT), prepared by the state to measure state objectives, for the year 1990-91. The achievement data for Ohio are NCEs from standardized achievement tests selected by the districts for the year 1989-90. Both sets of achievement data are judged to have adequate reliability.

In Table 1 are shown the production function correlations for the achievement data for the school districts in Missouri.
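The contrast procedure described above can be sketched end to end. The code below uses synthetic district data, not the Missouri or Ohio files; the spending range, score scale, and effect size are assumptions chosen only to illustrate the mechanics of ordering districts, taking the upper and lower 30%, verifying the dollar gap, and applying the pooled-variance t-test with a Bonferroni-adjusted criterion.

```python
import math
import random

def pooled_t(group_a, group_b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    ssa = sum((x - ma) ** 2 for x in group_a)
    ssb = sum((x - mb) ** 2 for x in group_b)
    sp2 = (ssa + ssb) / (na + nb - 2)  # pooled variance estimate
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

random.seed(3)
# Hypothetical homogeneous subgroup: (per-pupil spending, mean achievement).
districts = [
    (spend, 300 + 0.012 * spend + random.gauss(0, 20))
    for spend in (1200 + random.random() * 1400 for _ in range(330))
]

# Order by spending; contrast the upper 30% against the lower 30%.
districts.sort(key=lambda d: d[0])
k = int(0.30 * len(districts))
low, high = districts[:k], districts[-k:]

gap = sum(s for s, _ in high) / k - sum(s for s, _ in low) / k
t = pooled_t([a for _, a in high], [a for _, a in low])

# With twenty such contrasts, use a Bonferroni-adjusted per-test alpha.
alpha_bonferroni = 0.05 / 20

print(f"spending gap = ${gap:.0f}, t = {t:.2f}, per-test alpha = {alpha_bonferroni}")
```

In this sketch the spending gap between the two groups comfortably exceeds the 600-dollar threshold, so an achievement difference, if real, has room to appear in the contrast.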
Note that there is only one correlation, the one for tenth grade mathematics, that is large enough to be judged statistically significantly different from zero. Since there are twenty production functions, one would conclude from such an analysis that the production function shows no relationship between instructional costs and achievement in Missouri.

Table 1: Correlations Between Expenditures per Student and Student Performance on MMAT Achievement Tests

Grade         Reading  Mathematics  Science  Soc. Studies
4th (n=509)    0.050    0.073       -0.008   -0.025
6th (n=522)   -0.026   -0.044       -0.108   -0.062
8th (n=519)   -0.024   -0.019        0.027    0.012
9th (n=392)   -0.005    0.077        0.077    0.072
10th (n=433)   0.049    0.117*       0.027    0.065
* denotes p<0.05

In Table 2 are shown the t-tests resulting from a partial application of the alternative approach, which creates the funding threshold not included in the production function
analyses for the twenty distributions of achievement data. The creation of the threshold results in two of the distributions showing significant positive relationships using the Bonferroni procedure. Ten of the twenty t-tests reach significant levels for single applications of the t-test. Given the family-wise results, it remains risky to conclude a positive relationship between achievement and per pupil expenditures at this time.

Table 2: Contrasts of High and Low Funded Districts on the Missouri MMAT for 1990-1991
Per Pupil Expenditure Averages: Upper 30% = $2056.79; Lower 30% = $1248.48

Subject                  Group  Mean    Std Dev  n    t       Sign.
4th Grade Reading        High   316.32  25.64    154  2.441   ns
                         Low    309.36  24.07    154
4th Grade Math           High   313.47  33.75    154  1.934   ns
                         Low    306.87  25.49    154
4th Grade Science        High   330.33  41.01    154  0.367   ns
                         Low    329.48  32.05    154
4th Grade Soc. Studies   High   336.18  36.79    154  0.529   ns
                         Low    334.14  34.04    154
6th Grade Reading        High   309.83  27.54    158  0.737   ns
                         Low    307.47  23.56    158
6th Grade Math           High   360.12  42.67    158  0.298   ns
                         Low    358.82  34.39    158
6th Grade Science        High   340.02  41.81    158  -0.942  ns
                         Low    353.27  38.28    158
6th Grade Soc. Studies   High   323.94  32.54    158  0.175   ns
                         Low    323.31  31.19    158
8th Grade Reading        High   325.98  24.26    156  1.088   ns
                         Low    322.97  24.30    156
8th Grade Math           High   341.92  40.07    156  1.318   ns
                         Low    336.19  36.16    156
8th Grade Science        High   365.41  44.25    156  0.955   ns
                         Low    360.96  37.45    156
8th Grade Soc. Studies   High   326.32  27.26    156  1.764   ns
                         Low    321.08  24.84    156
9th Grade Reading        High   294.13  22.59    131  2.198   ns
                         Low    287.63  18.94    131
9th Grade Math           High   312.61  35.81    131  2.961   0.05
                         Low    299.64  23.17    131
9th Grade Science        High   367.99  37.51    131  2.143   ns
                         Low    357.41  31.98    131
9th Grade Soc. Studies   High   316.89  24.85    131  2.295   ns
                         Low    309.49  20.34    131
10th Grade Reading       High   311.82  24.52    144  1.693   ns
                         Low    306.89  18.36    144
10th Grade Math          High   339.80  32.30    144  2.525   0.10
                         Low    330.52  20.31    144
10th Grade Science       High   347.97  29.23    144  1.180   ns
                         Low    343.79  23.54    144
10th Grade Soc. Studies  High   309.53  24.59    144  1.196   ns
                         Low    306.03  18.47    144

Application of the full alternative model involves not only the creation of the threshold, but also the elimination of outliers, extreme scores which may have an unusual relationship between instructional expenditures and achievement. Such scores come from economies-of-scale effects in small districts, the concentration of at-risk students, or the amassing of more than essential wealth. In order to complete the comparison, production function analyses were performed on the twenty distributions after the outliers had been eliminated. In Table 3 are reported the results of these production function analyses. Significant non-zero correlations are found for four of the twenty coefficients: fourth grade reading, eighth grade reading and social studies, and ninth grade mathematics. The significant correlation for tenth grade mathematics was lost in the elimination of the outliers. However, only three of the correlations in Table 3 are negative, while nine are negative in Table 1. Still, these four non-zero correlations make concluding a relationship between instructional expenditures and achievement too risky. The outliers removed were school districts with enrollments of less than 300 and of greater than 25,000 students.

Table 3: Correlations Between Expenditures per Student and Student Performance on MMAT Achievement Tests with Outliers Removed
Grade         Reading  Mathematics  Science  Soc. Studies
4th (n=329)    0.142**  0.107        0.019    0.096
6th (n=329)    0.048   -0.026       -0.052    0.015
8th (n=329)    0.132*   0.066        0.078    0.121*
9th (n=268)    0.063    0.146**      0.055    0.080
10th (n=318)   0.023    0.052       -0.029    0.023
* denotes p<0.05; ** denotes p<0.01

In Table 4 are reported the results of the full application of the alternative model. Note that the threshold is about 620 dollars and that the number of districts has now been reduced to 331. Eight of the twenty t-tests are significant at Bonferroni-calculated alpha levels. Fourteen of the twenty t-tests reach the level of significance for unadjusted t-test probabilities. These results permit the conclusion of a positive relationship between expenditures per student and achievement on the MMAT. Missouri school districts can be characterized by a large number of districts with fewer than 300 students enrolled, a few extremely large districts which have a majority of high-risk students and high expenditures, and a handful of rich districts that have extremely high expenditures.

Table 4: Contrasts of High and Low Funded Districts with Outliers Removed on the Missouri MMAT for 1990-1991
Per Pupil Expenditure Averages: Upper 30% = $1906.43; Lower 30% = $1284.22

Subject                  Group  Mean    Std Dev  n   t      Sign.
4th Grade Reading        High   321.17  23.21    99  3.451  0.01
                         Low    310.44  19.20    99
4th Grade Math           High   317.13  24.14    99  3.012  0.05
                         Low    307.06  21.26    99
4th Grade Science        High   336.67  28.91    99  0.914  ns
                         Low    332.89  27.15    99
4th Grade Soc. Studies   High   345.71  27.57    99  2.764  0.05
                         Low    334.78  26.05    99
6th Grade Reading        High   312.33  20.66    99  1.921  ns
                         Low    306.98  18.16    99
6th Grade Math           High   363.47  34.60    99  1.020  ns
                         Low    358.70  30.54    99
6th Grade Science        High   358.25  36.77    99  0.748  ns
                         Low    354.46  34.01    99
6th Grade Soc. Studies   High   327.97  26.53    99  1.472  ns
                         Low    322.62  24.13    99
8th Grade Reading        High   327.68  16.69    99  3.280  0.05
                         Low    319.13  17.67    99
8th Grade Math           High   344.05  34.44    99  2.338  ns
                         Low    333.20  30.18    99
8th Grade Science        High   371.37  34.25    99  2.544  0.10
                         Low    359.66  32.64    99
8th Grade Soc. Studies   High   329.59  21.13    99  3.419  0.01
                         Low    319.24  21.13    99
9th Grade Reading        High   293.30  17.25    81  2.848  0.05
                         Low    288.01  18.21    81
9th Grade Math           High   311.95  26.83    81  2.808  0.05
                         Low    300.58  23.32    81
9th Grade Science        High   366.42  28.53    81  2.014  ns
                         Low    357.01  29.60    81
9th Grade Soc. Studies   High   316.33  19.74    81  2.275  ns
                         Low    309.24  19.15    81
10th Grade Reading       High   311.73  17.13    93  1.263  ns
                         Low    308.55  17.09    93
10th Grade Math          High   338.46  21.65    93  2.089  ns
                         Low    332.03  19.87    93
10th Grade Science       High   347.79  19.55    93  0.755  ns
                         Low    345.40  23.34    93
10th Grade Soc. Studies  High   308.67  17.31    93  0.689  ns
                         Low    306.85  18.53    93

A similar sequence of analyses was performed for data obtained for the state of Ohio. Production function analyses were performed on the school districts in the state and contrasted with the results of t-tests performed after a threshold had been created. This sequence comparing production functions with t-test contrasts was then repeated after outliers were removed.

In Table 5 are reported the nine production function analyses for Ohio. None of the nine achievement areas shows significantly non-zero correlations. In Table 6 are reported the t-test contrasts for the same nine Ohio distributions. None of the nine contrasts reaches the Bonferroni significance levels.

Table 5: Correlations Between Instructional Expenditures and Selected Variables in Ohio Database

Selected Variables         Correlation with District Instructional Expenditures per Student
4th Grade Reading          -0.012  (n = 608)
4th Grade Language Arts    -0.065  (n = 608)
4th Grade Mathematics      -0.024  (n = 608)
6th Grade Reading           0.008  (n = 608)
6th Grade Language Arts    -0.019  (n = 608)
6th Grade Mathematics      -0.006  (n = 608)
8th Grade Reading           0.004  (n = 608)
8th Grade Language Arts    -0.028  (n = 608)
8th Grade Mathematics      -0.002  (n = 608)

Table 6: Contrasts (t-tests) of School District Expenditures on Achievement Scores
Per Pupil Expenditure Averages: Upper 30% = $2442.62 (n = 183); Lower 30% = $1578.16 (n = 183)

Achievement Area  Group  Mean   St Dev  t      Sign.
4th Reading       High   54.95  5.93    1.133  ns
                  Low    54.27  5.45
6th Reading       High   54.27  5.74    1.514  ns
                  Low    53.34  5.90
8th Reading       High   54.79  5.41    1.264  ns
                  Low    54.07  5.36
4th Language      High   53.82  6.79    0.041  ns
                  Low    53.18  6.29
6th Language      High   53.05  6.21    1.057  ns
                  Low    52.36  6.30
8th Language      High   53.73  6.25    0.648  ns
                  Low    53.30  6.23
4th Math          High   52.73  7.50    1.081  ns
                  Low    51.88  7.41
6th Math          High   53.46  7.03    1.740  ns
                  Low    52.15  7.29
8th Math          High   53.70  7.31    1.712  ns
                  Low    52.43  6.89

In Tables 7 and 8 are reported the same analyses after the outliers have been removed from
the achievement distributions. In Table 7 are reported the production functions.

Table 7: Correlations Between Instructional Expenditures and Selected Achievement Variables in Ohio Database with Outliers Removed, 1989-1990

Selected Variables         Correlation with District Instructional Expenditures per Student
4th Grade Reading           0.053  (n = 458)
4th Grade Language Arts     0.034  (n = 458)
4th Grade Math              0.071  (n = 458)
6th Grade Reading           0.055  (n = 458)
6th Grade Language Arts     0.037  (n = 458)
6th Grade Math              0.074  (n = 458)
8th Grade Reading           0.072  (n = 458)
8th Grade Language Arts     0.024  (n = 458)
8th Grade Math              0.091* (n = 458)
* denotes p<0.05

The nine production functions reported in Table 7 include only one non-zero correlation, for eighth grade mathematics. From these analyses one is led to conclude no relationship between instructional expenditures and achievement in Ohio. In Table 8, however, five of the nine t-test contrasts show positive relationships, leading to the conclusion that instructional expenditures are related to achievement and demonstrating the inefficiency and inappropriateness of production function analyses.

Table 8: Contrasts (t-tests) of School District Expenditures on Achievement Scores with Outliers Removed, Ohio Database, 1989-90
Per Pupil Expenditure Averages: Upper 30% = $2187.07 (n = 106); Lower 30% = $1544.76 (n = 106)

Achievement Area  Group  Mean   St Dev  t      Sign.
4th Reading       High   55.09  6.23    1.714  ns
                  Low    53.64  5.44
6th Reading       High   54.42  5.91    2.253  0.10
                  Low    52.59  5.80
8th Reading       High   55.26  5.24    2.703  0.05
                  Low    53.25  5.52
4th Language      High   53.84  6.91    1.042  ns
                  Low    52.89  6.27
6th Language      High   53.23  6.29    1.805  ns
                  Low    51.64  6.42
8th Language      High   53.94  6.17    1.603  ns
                  Low    52.56  6.33
4th Math          High   53.44  7.85    2.258  0.10
                  Low    51.08  7.21
6th Math          High   53.94  7.36    2.759  0.05
                  Low    51.15  7.26
8th Math          High   54.00  7.60    2.454  0.10
                  Low    51.49  7.18

Conclusion

Production function analyses have been used to assist policy deliberations concerning educational funding equity. These analyses are based on correlational methods which can be misleading in the investigation of the relationship between student achievement and instructional expenditures. The correlation process fails to create a threshold of dollars needed to demonstrate differences in achievement. An alternative method has been developed for the investigation of the relationship. This method is based on creating homogeneous subgroups of districts, which are then ordered by expenditures per student. The achievement mean for the group formed by the highest funded 30% of the districts is compared to the mean for the group formed by the lowest funded 30% of the districts using a t-test. This method was used to demonstrate relationships missed by production function analyses in two states.

References

Baker, Keith, "Yes, Throw Money at Schools." Phi Delta Kappan, April, 1991, 72(8), pp. 4-6.

Bridge, G. R., C. M. Judd, and P. R. Moock, The Determinants of Educational Outcomes: The Impact of Families, Peers, Teachers, and Schools. Ballinger, Cambridge, MA, 1979.

Burstein, Leigh, "Issues in the Aggregation of Data," in Berliner, David (ed.), Review of Research in Education, Vol. 8. American Educational Research Association, Washington, DC, 1980, pp. 158-63.

Coleman, James S., et al., Equality of Educational Opportunity. Government Printing Office, Washington, DC, 1966.

Glass, G. V and M. L. Smith, "Meta-analysis of Research on Class Size and Achievement." Educational Evaluation and Policy Analysis, 1979, 1(1), pp. 2-16.
Hanushek, Eric A., "Conceptual and Empirical Issues in the Estimation of Educational Production Functions." The Journal of Human Resources, Summer, 1979, 14(3), pp.
351-88.

Hanushek, Eric A., "Throwing Money at Schools." Journal of Policy Analysis and Management, Fall, 1981, 1(1), pp. 19-41.

Hanushek, Eric A., "The Economics of Schooling: Production and Efficiency in Public Schools." Journal of Economic Literature, September, 1986, 24, pp. 1141-1177.

Hanushek, Eric A., "The Impact of Differential Expenditures on School Performance." Educational Researcher, May, 1989, 18(4), pp. 45-51.

Hanushek, Eric A., "When School Finance 'Reform' May Not Be Good Policy." Harvard Journal on Legislation, Summer, 1991, 28(2), pp. 423-456.

Hughes, Mary F., "Review of the Literature: Education Production Function Studies and Eric A. Hanushek." Fugitive document, West Virginia Education Fund, Charleston, West Virginia, 1992.

Monk, David H., "Education Productivity Research: An Update and Assessment of its Role in Education Finance Reform." Educational Evaluation and Policy Analysis, Winter, 1992, 14(4), pp. 307-332.

Murnane, Richard, Impact of School Resources on the Learning of Inner City Children. Ballinger, Cambridge, MA, 1975.

Pedhazur, Elazar J., Multiple Regression in Behavioral Research. Holt, Rinehart and Winston, New York, 1982.

Raymond, Richard, "Determinants of the Quality of Primary and Secondary Public Education in West Virginia." Journal of Human Resources, Fall, 1968, 3(4), pp. 450-470.

Scheaffer, Richard L., W. Mendenhall, and L. Ott, Elementary Survey Sampling. Duxbury Press, Boston, MA, 1986.

Spencer, Bruce D., and David E. Wiley, "The Sense and Nonsense of School Effectiveness." Journal of Policy Analysis and Management, Fall, 1981, pp. 43-52.

Walberg, H. J., and W. J. Fowler, Jr., "Expenditure and Size Efficiencies of Public School Districts." Apparently prepared for the New Jersey hearing, ERIC ED 274 471, RC 015 786, 1989.

Walberg, H. J., "Educational Productivity: Theory, Evidence, and Prospects." Australian Journal of Education, 26(2), 1982, pp. 115-122.

Walberg, H. J., D. L. Harnisch, and S. L.
Tsai, "Elementary School Mathematics Productivity in Twelve Countries." British Educational Research Journal, 12(3), 1986, pp. 237-248.

Walberg, H. J., "Improving the Productivity of America's Schools." Educational Leadership, May, 1984, 41(8), pp. 19-27.

Walberg, H. J., and K. Marjoribanks, "Family Environment and Cognitive Development: Twelve Analytic Models." Review of Educational Research, 46(4), 1976, pp. 527-550.
Walberg, H. J., and T. Weinstein, "The Production of Achievement and Attitude in High School Social Studies." Journal of Educational Research, 75(5), 1982, pp. 285-292.

Copyright 1993 by the Education Policy Analysis Archives.