[Catalog record (MARCXML/MODS; markup lost in extraction). Recoverable fields:
Title: Educational Policy Analysis Archives. Vol. 14, no. 17 (June 30, 2006).
Article: On ideology, causal inference and the reification of statistical methods: reflections on examining instruction, achievement and equity with NAEP mathematics data / Harold Wenglinsky.
Published: Tempe, Ariz.: Arizona State University; Tampa, Fla.: University of South Florida, June 30, 2006.
Host item: Education Policy Analysis Archives (EPAA), ISSN 1068-2341; volume 14, issue 17, dated 2006-06-30.
Identifier: E11-00490.]
Readers are free to copy, display, and distribute this article, as long as the work is attributed to the author(s) and Education Policy Analysis Archives, it is distributed for noncommercial purposes only, and no alteration or transformation is made in the work. More details of this Creative Commons license are available at http://creativecommons.org/licenses/by-nc-nd/2.5/. All other uses must be approved by the author(s) or EPAA. EPAA is published jointly by the Colleges of Education at Arizona State University and the University of South Florida. Articles are indexed by H.W. Wilson & Co. Please contribute commentary at http://epaa.info/wordpress/ and send errata notes to Sherman Dorn (email@example.com).

EDUCATION POLICY ANALYSIS ARCHIVES
A peer-reviewed scholarly journal
Editor: Sherman Dorn, College of Education, University of South Florida
Volume 14, Number 17, July 1, 2006. ISSN 1068-2341.

On Ideology, Causal Inference and the Reification of Statistical Methods: Reflections on Examining Instruction, Achievement and Equity with NAEP Mathematics Data

Harold Wenglinsky
Academy for Educational Development
Regional Educational Laboratory of the Southeast

Citation: Wenglinsky, H. (2006). On ideology, causal inference and the reification of statistical methods: Reflections on examining instruction, achievement and equity with NAEP mathematics data. Education Policy Analysis Archives, 14(17). Retrieved [date] from http://epaa.asu.edu/epaa/v14n17/.

Abstract
The purpose of this article is to comment on the prior article entitled "Examining Instruction, Achievement and Equity with NAEP Mathematics Data," by Sarah Theule Lubienski. That article claims that a prior article by the author suffered from three weaknesses: (1) an attempt to justify No Child Left Behind (NCLB); (2) drawing causal inferences from cross-sectional data; and (3) various statistical quibbles.
The author responds to the first claim by indicating that any mention of NCLB was intended purely to make the article relevant to a policy journal; to the second claim, by noting his own reservations about using cross-sectional data to draw causal inferences; and to the third claim, by noting potential issues of quantitative methodology in the Lubienski article. He concludes that studies that use advanced statistical methods are often so opaque as to be difficult to compare, and suggests some advantages to the quantitative transparency that comes from the findings of randomized controlled field trials.
Keywords: equity; mathematics achievement; mathematics instruction; NAEP.
Resumen (translated from the Spanish)
The purpose of this article is to comment on the paper "Examining Instruction, Achievement and Equity with NAEP Mathematics Data" by Sarah Theule Lubienski, previously published in EPAA. That article maintained that another article published by me suffered from three weaknesses: (1) it attempted to justify the federal No Child Left Behind law; (2) it drew causal inferences from data obtained in cross-sectional studies; and (3) other objections to the statistics used. This author responds to the first objection by indicating that any mention of the federal law was intended to establish the relevance of the work for a policy journal; to the second objection, by noting my own reservations about the use of cross-sectional studies to establish causal relationships; and to the third objection, by pointing out some potential problems in the quantitative methodology of Lubienski's study. This author concludes that studies using advanced quantitative methods tend to be opaque and difficult to understand and compare, and suggests that quantitative transparency comes from obtaining results in randomized controlled field experiments.

Editor's Note: This article is a response to Sarah Lubienski's (2006) article, which appears at http://epaa.asu.edu/epaa/v14n14/ and which discussed Wenglinsky's (2004) article available at http://epaa.asu.edu/epaa/v12n64/. It is the practice of Education Policy Analysis Archives to publish one round of responses to articles where it is merited. Additional discussion of this and other articles is welcome online at http://epaa.info/wordpress.

Over the last decade, the author has published half a dozen studies of relationships among school and teacher characteristics and student achievement using data from the National Assessment of Educational Progress (NAEP).
Otherwise known as the Nation's Report Card, NAEP provides test data on nationally representative samples of fourth, eighth, and twelfth graders in a variety of subjects over multiple years. There are many methodological challenges faced in the analysis of NAEP data, but none as nettlesome as its cross-sectional nature. Because the data are cross-sectional, the finding of relationships between school characteristics, such as class size, and student achievement cannot be used to draw causal inferences. This point has been made by the author in nearly all of his publications on the topic, and is reiterated by opponents of the author's conclusions. If the policy conclusions are of a constructivist nature, the author finds himself attacked by conservative-leaning researchers on the grounds that he is making causal inferences. If the policy conclusions are of a didactic nature, the author finds himself attacked by liberal-leaning researchers, generally on the same grounds. The critics usually also sprinkle in some methodological quibbles, such as wondering what the results would have been if variable X had been measured slightly differently, but the core criticism is that the cross-sectionality of the data makes causal inferences impossible. A recent instance of this occurred with Sarah Theule Lubienski's (2006) response to the current author's article on the achievement gap, "Closing the Racial Achievement Gap: The Role of Reforming Instructional Practices" (Wenglinsky, 2004). The current author's article distinguished between two types of racial achievement gap: that between schools, meaning between predominantly minority and predominantly White schools, and within schools, meaning between
White and minority students in the same school. The author found that a variety of instructional practices, mostly of the constructivist variety, but some not, were negatively related to the within-school achievement gap, but unrelated to the between-school achievement gap. The author concluded that using the identified practices might be a viable strategy for reducing the achievement gap within schools, but not the one between schools, which would require some more macro-institutional change. Lubienski's (2006) response, "Examining Instruction, Achievement and Equity with NAEP Mathematics Data," also found some instructional practices to be associated with the racial achievement gap, but with the caveats that only constructivist techniques evinced a relationship and that these techniques did not go so far as eliminating achievement gaps (the gaps she examined were analogous to the within-school gaps the author examined). Lubienski's article raised the three questions that the current author finds are commonly raised about NAEP secondary analyses. First, Lubienski suggested an ideological underpinning to her critique: the current author's study was supposed to suggest how the Bush Administration's No Child Left Behind (NCLB) could succeed, whereas her study focused on empirical support for the National Council of Teachers of Mathematics' reform-minded (read constructivist) instructional practices. Second, Lubienski raised the issue of causal inferences, claiming that the current author used causal language in his study. And, third, she proposed some statistical quibbles, which amounted to the notion that she approached her analysis slightly differently than the current author did. With regard to the first argument, the current author had no ideological agenda.
The mention of NCLB did not constitute an endorsement of it, but simply an attempt to find some relevance to policymakers in an article submitted to a policy journal. It has typically been the author's experience that the findings of NAEP secondary analyses rarely fit neatly into one ideological framework or another. Thus the author's study of school finance found that the effectiveness of school dollars depended upon how they were spent (Wenglinsky, 1997). As another example, the author's foray into the debate about whether educational technology made a difference found that technology effects depended upon how the technology was used (Wenglinsky, 2005). Rarely are the findings from statistical analyses of large-scale data unequivocal, and "Closing the Racial Achievement Gap" was no different, finding that the effective practices were an ideological potpourri, leaning somewhat towards the constructivist side. The second argument, about causal inferences, is to some extent a red herring. As Lubienski admits, the current author acknowledged repeatedly that causal inferences cannot be drawn from cross-sectional data. He specifically noted that while he would use the phrase "school effect," he meant it in the statistical sense (as in effect size) and did not intend it to connote causality. Lubienski herself sometimes falls into causal language, such as when she refers to instructional practices as "predictors" of achievement, suggesting that they are temporally, and thus causally, prior to test scores (Lubienski, 2006, p. 7). And in another analysis of NAEP data, in which Lubienski seeks to measure the relationship between attending a private school and student achievement, she talks of private-school "effects" (C. Lubienski & S. T. Lubienski, 2006). Given this challenge in both Wenglinsky's and Lubienski's work, one may rightly ask whether it is worthwhile to engage in secondary analyses of NAEP at all.
The answer is that, while correlational analyses are not good at establishing causal relationships, they are good at identifying variables that should subsequently be subject to more rigorous analyses. This view is the rationale behind the Institute of Education Sciences' framework for its discretionary grants program; it has a continuum of research goals under which researchers can apply, beginning with secondary analyses to identify variables of importance, proceeding to developmental studies that make the transition from variable identification to the creation of an intervention, and finally to experimental studies that can establish causation for interventions. The reason that correlational analyses are an important first step is that they
suggest where to concentrate and where not to concentrate. Given that, in the Lubienski study, the use of calculators proved unrelated to the racial achievement gap, it is unlikely that providing calculators to students, alone, is a worthwhile intervention. Thus Lubienski is correct in suggesting that cross-sectional data do not support causal inferences, but such secondary analyses are a crucial piece of work prior to the development and testing of educational interventions. There is perhaps a more important reason that secondary analyses must be viewed as preliminary to more rigorous research designs, and that is because secondary analyses nearly always fall victim to statistical reification. This term refers to the fact that researchers typically treat the statistical method of the day as an absolute basis for truth, even though the rationale behind using the technique, to say nothing of the mathematics involved, is typically opaque. Why use hierarchical linear modeling rather than structural equation modeling? When is multicollinearity severe enough to discredit results, given that most of the research questions of interest involve creating a significant degree of multicollinearity? Statistical quibbles can be raised about any secondary analysis. Lubienski's is no exception. Although she does not present the correlations among her factors, she refers to their being highly correlated. Highly correlated factors suggest a poorly fitting factor model and therefore potentially invalidate it. In addition, the proper way to verify a factor structure is through confirmatory factor analysis using a separate replication sample, not through creating scales and running Cronbach's alphas on the same data. And, disturbingly, here Cronbach's alphas are mostly below what many consider to be the cutoff of .7.
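As an aside on the .7 rule of thumb just mentioned: Cronbach's alpha has a simple closed form, alpha = k/(k-1) * (1 - sum of item variances / variance of the total scale score), for a k-item scale. The sketch below is purely illustrative; the scores are made up and are not drawn from either study under discussion.

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix of scale scores.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score),
    where k is the number of items and variances are sample variances.
    """
    k = len(items[0])
    item_var_sum = sum(variance(col) for col in zip(*items))  # per-item variances
    total_var = variance(sum(row) for row in items)           # variance of summed scale
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical responses: 5 respondents, 3 items (illustration only).
scores = [
    [2, 3, 3],
    [4, 4, 5],
    [1, 2, 2],
    [3, 3, 4],
    [5, 5, 5],
]
print(round(cronbach_alpha(scores), 3))  # 0.975, above the conventional .7 cutoff
```

Note that alpha only gauges the internal consistency of one scale on one sample; as argued above, it is not a substitute for confirming the factor structure on a separate replication sample.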
One other issue is that her models disaggregate teacher effects to the student level, which means that the instructional variables are only partitioning student-level variance in achievement, not school-level variance in achievement, and thus may understate the size of instructional effects. Does all of the foregoing invalidate her conclusions? The reader will probably decide by comparing the findings to his or her own experiences with education reform. Secondary analyses need to be conducted with limited goals, and some kind of more robust research design (quasi-experimental or experimental) used for the more ambitious goal of demonstrating the efficacy of an intervention. One reason for this is that experiments are designed to support causal inferences, by holding constant all variables besides exposure to the treatment. They therefore address selection bias in a way that statistical analyses cannot. But a more important reason is the transparent nature of the results of an experiment. The results are transparent because they generally involve performing a Student's t-test on two raw scores, that of the treatment and that of the control. These kinds of comparisons are more likely to be persuasive to a policymaker or educator than the elaborate debates over the appropriate multivariate method, the appropriate fit statistic, or any number of other running debates among quantitative methodologists. This is not to say that experiments are the gold standard, but simply that they are less subject to reification, and therefore more trustworthy from the standpoint of making policy decisions.

References

Lubienski, C., & Lubienski, S. T. (2006). Charter, private, public schools and academic achievement: New evidence from NAEP mathematics data. New York: National Center for the Study of Privatization in Education, Teachers College, Columbia University.

Lubienski, S. T. (2006). Examining instruction, achievement, and equity with NAEP mathematics data.
Education Policy Analysis Archives, 14(14). Retrieved June 30, 2006, from http://epaa.asu.edu/epaa/v14n14/
Wenglinsky, H. (1997). When money matters: How educational expenditures improve student achievement and how they don't. Princeton, NJ: Educational Testing Service.

Wenglinsky, H. (2004). Closing the racial achievement gap: The role of reforming instructional practices. Education Policy Analysis Archives, 12(64). Retrieved June 30, 2006, from http://epaa.asu.edu/epaa/v12n64/

Wenglinsky, H. (2005). Using technology wisely: The keys to success in schools. New York, NY: Teachers College Press.

About the Author

Harold Wenglinsky
Academy for Educational Development
Regional Educational Laboratory of the Southeast
Email: firstname.lastname@example.org

Harold Wenglinsky is a program officer at the Academy for Educational Development and the co-investigator of a randomized controlled trial of a state-level intervention, the Alabama Math, Science and Technology Initiative. He still enjoys analyzing NAEP data.
EDUCATION POLICY ANALYSIS ARCHIVES
http://epaa.asu.edu
Editor: Sherman Dorn, University of South Florida
Production Assistant: Chris Murrell, Arizona State University
General questions about appropriateness of topics or particular articles may be addressed to the Editor, Sherman Dorn, email@example.com.

Editorial Board
Michael W. Apple, University of Wisconsin
David C. Berliner, Arizona State University
Robert Bickel, Marshall University
Gregory Camilli, Rutgers University
Casey Cobb, University of Connecticut
Linda Darling-Hammond, Stanford University
Gunapala Edirisooriya, Youngstown State University
Mark E. Fetler, California Commission on Teacher Credentialing
Gustavo E. Fischman, Arizona State University
Richard Garlikov, Birmingham, Alabama
Gene V Glass, Arizona State University
Thomas F. Green, Syracuse University
Aimee Howley, Ohio University
Craig B. Howley, Ohio University
William Hunter, University of Ontario Institute of Technology
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Les McLean, University of Toronto
Heinrich Mintrop, University of California, Berkeley
Michele Moses, Arizona State University
Anthony G. Rud Jr., Purdue University
Michael Scriven, Western Michigan University
Terrence G. Wiley, Arizona State University
John Willinsky, University of British Columbia
English-Language Graduate Student Editorial Board
Noga Admon, New York University
Jessica Allen, University of Colorado
Cheryl Aman, University of British Columbia
Anne Black, University of Connecticut
Marisa Cannata, Michigan State University
Chad d'Entremont, Teachers College, Columbia University
Carol Da Silva, Harvard University
Tara Donahue, Michigan State University
Camille Farrington, University of Illinois Chicago
Chris Frey, Indiana University
Amy Garrett Dikkers, University of Minnesota
Misty Ginicola, Yale University
Jake Gross, Indiana University
Hee Kyung Hong, Loyola University Chicago
Jennifer Lloyd, University of British Columbia
Heather Lord, Yale University
Shereeza Mohammed, Florida Atlantic University
Ben Superfine, University of Michigan
John Weathers, University of Pennsylvania
Kyo Yamashiro, University of California Los Angeles
Archivos Analíticos de Políticas Educativas
Associate Editors: Gustavo E. Fischman & Pablo Gentili, Arizona State University & Universidade do Estado do Rio de Janeiro
Founding Associate Editor for Spanish Language (1998-2003): Roberto Rodríguez Gómez

Editorial Board
Hugo Aboites, Universidad Autónoma Metropolitana-Xochimilco
Adrián Acosta, Universidad de Guadalajara, México
Claudio Almonacid Avila, Universidad Metropolitana de Ciencias de la Educación, Chile
Dalila Andrade de Oliveira, Universidade Federal de Minas Gerais, Belo Horizonte, Brasil
Alejandra Birgin, Ministerio de Educación, Argentina
Teresa Bracho, Centro de Investigación y Docencia Económica-CIDE
Alejandro Canales, Universidad Nacional Autónoma de México
Ursula Casanova, Arizona State University, Tempe, Arizona
Sigfredo Chiroque, Instituto de Pedagogía Popular, Perú
Erwin Epstein, Loyola University, Chicago, Illinois
Mariano Fernández Enguita, Universidad de Salamanca, España
Gaudêncio Frigotto, Universidade Estadual do Rio de Janeiro, Brasil
Rollin Kent, Universidad Autónoma de Puebla, Puebla, México
Walter Kohan, Universidade Estadual do Rio de Janeiro, Brasil
Roberto Leher, Universidade Estadual do Rio de Janeiro, Brasil
Daniel C. Levy, University at Albany, SUNY, Albany, New York
Nilma Lino Gomes, Universidade Federal de Minas Gerais, Belo Horizonte
Pia Lindquist Wong, California State University, Sacramento, California
María Loreto Egaña, Programa Interdisciplinario de Investigación en Educación
Mariano Narodowski, Universidad Torcuato Di Tella, Argentina
Iolanda de Oliveira, Universidade Federal Fluminense, Brasil
Grover Pango, Foro Latinoamericano de Políticas Educativas, Perú
Vanilda Paiva, Universidade Estadual do Rio de Janeiro, Brasil
Miguel Pereira, Catedrático, Universidad de Granada, España
Ángel Ignacio Pérez Gómez, Universidad de Málaga
Mónica Pini, Universidad Nacional de San Martín, Argentina
Romualdo Portella do Oliveira, Universidade de São Paulo
Diana Rhoten, Social Science Research Council, New York, New York
José Gimeno Sacristán, Universidad de Valencia, España
Daniel Schugurensky, Ontario Institute for Studies in Education, Canada
Susan Street, Centro de Investigaciones y Estudios Superiores en Antropología Social Occidente, Guadalajara, México
Nelly P. Stromquist, University of Southern California, Los Angeles, California
Daniel Suárez, Laboratorio de Políticas Públicas, Universidad de Buenos Aires, Argentina
Antonio Teodoro, Universidade Lusófona, Lisboa
Carlos A. Torres, UCLA
Jurjo Torres Santomé, Universidad de la Coruña, España