USF Libraries
USF Digital Collections

Educational policy analysis archives


Material Information

Title:
Educational policy analysis archives
Physical Description:
Serial
Language:
English
Creator:
Arizona State University
University of South Florida
Publisher:
Arizona State University
University of South Florida.
Place of Publication:
Tempe, Ariz
Tampa, Fla
Publication Date:

Subjects

Subjects / Keywords:
Education -- Research -- Periodicals   ( lcsh )
Genre:
non-fiction   ( marcgt )
serial   ( sobekcm )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00308
usfldc handle - e11.308
System ID:
SFS0024511:00308




Full Text
MARC Record

Title:
Educational policy analysis archives. Vol. 11, no. 10 (March 14, 2003)
Publication:
Tempe, Ariz. : Arizona State University ; Tampa, Fla. : University of South Florida, March 14, 2003
Contents:
Exploring the achievement gap between white and minority students in Texas : a comparison of the 1996 and 2000 NAEP and TAAS eighth grade mathematics test results / Thomas H. Linton [and] Donald Kester
Subject:
Education -- Research -- Periodicals
Added Entries:
Arizona State University; University of South Florida
Host Item:
Education Policy Analysis Archives (EPAA)
Resource Identifier:
E11-00308
Electronic Location:
http://digital.lib.usf.edu/?e11.308



Education Policy Analysis Archives
Volume 11 Number 10    March 14, 2003    ISSN 1068-2341

A peer-reviewed scholarly journal
Editor: Gene V Glass
College of Education
Arizona State University

Copyright 2003, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. EPAA is a project of the Education Policy Studies Laboratory. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Exploring the Achievement Gap Between White and Minority Students in Texas: A Comparison of the 1996 and 2000 NAEP and TAAS Eighth Grade Mathematics Test Results

Thomas H. Linton
Texas A&M University-Corpus Christi

Donald Kester
Texas A&M University-Corpus Christi

Citation: Linton, T. H. & Kester, D. (2003, March 14). Exploring the achievement gap between white and minority students in Texas: A comparison of the 1996 and 2000 NAEP and TAAS eighth grade mathematics test results, Education Policy Analysis Archives, 11(10). Retrieved [date] from http://epaa.asu.edu/epaa/v11n10/.

Abstract

The Texas Assessment of Academic Skills (TAAS) has been used to document and track an achievement gap between white and minority students in Texas. Some educators have credited the TAAS with fueling a drive to close the achievement gap, while others suggest that TAAS scores may be misleading because of factors such as score inflation and a possible ceiling effect. The purpose of this study was to analyze the gap in mathematics achievement for eighth grade students. The study compared TAAS and National Assessment of Educational Progress (NAEP) test results to determine if the achievement gap between white, Hispanic, and African-American students had narrowed between 1996 and 2000. Results indicate that TAAS mean scores increased significantly for all three ethnic groups between 1996 and 2000. Comparison of the TAAS test score frequency distributions for each ethnic group indicated that white students' scores shifted from the middle to the upper portion of the test score range while minority students' scores shifted from the lower to the middle and higher score range. Both white and minority students' TAAS test score distributions were significantly more negatively skewed in 2000 than in 1996. Comparisons between white and minority students' TAAS scores showed that white students had significantly higher scores than either Hispanic or African-American students in both 1996 and 2000. Comparison of mean score differences in 1996 and 2000 indicated that the achievement gap between white and minority students had narrowed. NAEP scores increased significantly from 1996 to 2000 for Hispanic students, but not for white or African-American students. However, test score distribution patterns showed small positive changes for all three ethnic groups. Comparisons between ethnic groups indicated that there were significant differences between white and minority students' scores in both 1996 and 2000. Comparison of mean score differences in 1996 and 2000 indicated that the achievement gap between Hispanic and white students had narrowed slightly but that there was no change in the achievement gap between white and African-American students. Analysis of the TAAS test score distribution patterns indicated the likelihood that a ceiling effect had impacted students' scores. The evidence for a ceiling effect was strongest for white students. In 2000, 60.4% of white students had a TAAS score that fell in the top 10% of the score range. In contrast, there was no evidence of a ceiling effect for the NAEP. Mean score gains on the TAAS are only partially substantiated by the NAEP data. Furthermore, there is a very strong possibility that a ceiling effect artificially restricted the 2000 TAAS scores for white students and created the illusion that the achievement gap between minority and white students had been narrowed.

The Texas Assessment of Academic Skills (TAAS) has been administered as a measure of student achievement in reading, mathematics, and writing in Texas since 1990. The tests have been praised for providing disaggregated data on different ethnic groups and requiring each group to meet the same standard of proficiency. The disaggregated scores have been used to document and track an achievement gap between the scores of white and minority students. Test results have typically been reported as the percent of students passing (i.e., meeting minimum expectations) the TAAS (Texas Education Agency, 2000b). These results show that passing rates for minority students have increased at a faster rate than the passing rates for white students.

Table 1
TAAS Grade 8 Mathematics Test: A Comparison of Percent Passing for 1996 and 2000 by Ethnic Group

Ethnic Group         Percent Passing 1996    Percent Passing 2000    Percent Gain
African American             43.8                    80.9                37.1
Hispanic                     51.4                    85.5                34.1
White                        77.7                    94.8                17.1

Many educators in Texas have credited the tests with fueling the drive to close this achievement gap. Several studies have cited the increased percent of minority students passing the TAAS as evidence that the achievement gap is being closed (Hurley, Chamberlain, Slavin, & Madden, 2001; Jerald, 2000; Jerald, 2001; Texas Education Agency, 2000a). Other researchers disagree and suggest that increases in passing rates for minority students may be linked to other factors. Haney (2000) argues that increases in the TAAS passing rate for minority high school students are at least partially explained by higher dropout rates among minority students. Toenjes and Dworkin (2002) cite evidence that challenges Haney's assertions and conclude that dropout rates and special education exemptions do not explain the large increases in the TAAS passing rate.

It should be noted that none of these studies directly addresses the actual achievement level of students. Comparing the passing rate for minority and white students does not provide information about comparative achievement levels. Rather, the passing rate is simply the percentage of students that have attained the minimum achievement level deemed necessary to "pass" the test.

Rather than depending solely on TAAS data, some researchers have used National Assessment of Educational Progress (NAEP) results to analyze the achievement levels of Texas students. These studies have consistently found either no gain or small achievement gains for both minority and white students, but have been unable to substantiate the large gains reported on the TAAS. Amrein and Berliner (2002), after examining test results for 18 states with high-stakes testing programs, reported that student achievement remained at the same level or went down after the high-stakes testing policies were instituted. Camilli (2000), using a cohort analysis of gains on the NAEP math test from grade 4 to grade 8, found that Texas ranked 17th among 35 states. In a study that directly compared TAAS and NAEP test results (Klein, Hamilton, McCaffrey, & Stecher, 2000), TAAS data indicated that the achievement gap between minority and white students was closing while NAEP data did not. Klein et al. suggested that the large gains of minority students relative to white students on the TAAS were misleading and could be due, at least in part, to a ceiling effect and teaching to the test. However, Klein's study compared 1992 and 1996 NAEP scores with TAAS data for 1994 and 1998. Because of the disparity in the test administration dates for the NAEP and the TAAS, questions were raised about the conclusions of the study. Furthermore, Klein's study analyzed mean achievement gains for the TAAS and the NAEP but did not examine the actual distributions of test scores for evidence of a ceiling effect.

The National Center for Educational Statistics, in its report The Nation's Report Card: Mathematics 2000 (U.S. Department of Education, 2001a), released national and state reports of NAEP scores for eighth grade mathematics in August 2001. To date, there have been no studies comparing the NAEP 2000 results and the TAAS results for Texas.

Purpose of the Study

The purpose of the present study is to present an analysis of the Texas TAAS and NAEP results for 1996 and 2000 and to explore the mathematics achievement gap between eighth grade minority students and white students in Texas. Specific research questions for the study are:

1. Did Texas eighth grade African American, Hispanic, and white student math scores on the TAAS and NAEP increase significantly between 1996 and 2000?
2. Do TAAS and NAEP data show that the achievement gap between white and minority students decreased from 1996 to 2000?
3. Is there evidence that a ceiling effect artificially restricted the distribution of students' scores on the TAAS or the NAEP?

Methods

A causal-comparative research design was used to analyze the variables in the study. The sample for the TAAS data consisted of all African American, Hispanic, and white students in Texas who were in the Texas Education Agency's accountability subset of TAAS scores in grade 8 in 1996 and 2000 (TEA, 2000b). (Note 1) The sample for the NAEP-Texas data consisted of a random sample of approximately 2,500 eighth grade students selected according to NAEP specifications in 1996 and approximately 9,600 in 2000. The national results for the NAEP test were included for comparative purposes. (Note 2)

Two sets of analyses were used to compare scores within and between ethnic groups. First, confidence intervals were calculated and used to analyze differences between mean scores. A d statistic (Green & Akey, 2000) was calculated to measure effect size for differences between mean scores. (Note 3) Second, a chi-square goodness-of-fit test was used to compare TAAS 1996 and 2000 test score distributions and NAEP 1996 and 2000 test score distributions for white and minority students. The Cramer's V coefficient (Green & Akey, 2000) was calculated to determine effect size for the chi-square analysis. (Note 4) (An illustrative computation of both effect sizes is sketched below.)

Research Question #1

Did Texas eighth grade African American, Hispanic, and white student math scores on the TAAS and NAEP increase significantly between 1996 and 2000?

The 1996 and 2000 TAAS scores and NAEP scores for each ethnic group were analyzed in two ways. First, mean scores for each ethnic group were compared to determine if there were significant differences between the 1996 results and the 2000 results. Second, the test score distributions for each ethnic group for 1996 and 2000 were compared to determine changes in the score distribution pattern over time.
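The two effect-size measures named in the Methods section (the d statistic of Note 3 and Cramer's V of Note 4) can be computed directly from group summary statistics and score-band frequency counts. The sketch below is illustrative only: it is not the authors' code, the standard deviations, sample sizes, and year-by-band counts are hypothetical placeholders, and it operationalizes the distribution comparison as a two-row contingency-table chi-square, which is one reasonable reading of the goodness-of-fit test the authors describe.

```python
# Illustrative sketch of the effect-size calculations described above.
# Not the authors' code; the example inputs below are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

def d_statistic(mean1, sd1, n1, mean2, sd2, n2):
    """d statistic (Note 3): mean difference divided by the pooled SD."""
    pooled_sd = np.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                        / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def cramers_v(freq_table):
    """Chi-square test across score bands plus Cramer's V (Note 4)."""
    chi2, p, dof, _ = chi2_contingency(freq_table)
    n = freq_table.sum()
    k = min(freq_table.shape) - 1          # rescales phi onto the 0-1 range
    return np.sqrt(chi2 / (n * k)), p

# Means taken from Table 2 for illustration; the SDs and counts are invented.
print(f"d = {d_statistic(79.3, 11.0, 1000, 67.8, 13.0, 1000):.3f}")

# Hypothetical 1996 vs. 2000 counts across the six TLI score bands.
counts = np.array([[562, 118, 120, 106, 74, 20],     # 1996
                   [191, 113, 184, 244, 209, 60]])   # 2000
v, p = cramers_v(counts)
print(f"Cramer's V = {v:.3f} (p = {p:.4f})")
```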

TAAS Results

The comparison of 1996 and 2000 mean scores for each ethnic group on the TAAS is presented in Table 2. Results indicate that:

Mean scores increased significantly for each ethnic group from 1996 to 2000.
Effect sizes as measured by the d statistic were moderate, ranging from .472 (white) to .646 (African American).
The mean score gain for white students was less than two-thirds as large as the gain for Hispanic and African American students.

Table 2
Comparison of TAAS TLI Mean Scores for 1996 and 2000 by Ethnic Group

Group    Ethnicity           Mean TLI 1996    Mean TLI 2000    Gain      Effect Size (d Statistic)
TAAS     African American        65.0             77.2        +12.2*            .916
         Hispanic                67.8             79.3        +11.5*            .894
         White                   77.2             84.2         +7.0*            .667

* p < .05

The second set of TAAS analyses compared the 1996 and 2000 score distributions for each ethnic group. A chi-square analysis indicated that the score distributions for each ethnic group changed significantly from 1996 to 2000. Results of the analyses are presented in Tables 3 and 4.

For each ethnic group, the distribution of scores in 1996 was significantly different from the distribution of scores in 2000.
Effect sizes (Cramer's V) indicated more of a change for African American and Hispanic students than for white students.
Scores for Hispanic and African American students tended to increase from the lowest score range to the middle and upper portions of the score range.
Scores for white students were concentrated in the upper portion of the score range.

Table 3
TAAS TLI Score Distributions in Percent for 1996 and 2000 by Ethnic Group

Ethnic Group    Year    TLI 0-69    70-74    75-79    80-84    85-89    90-94
African Am.     1996      56.2       11.8     12.0     10.6      7.4      2.0
                2000      19.1       11.3     18.4     24.4     20.9      6.0
Hispanic        1996      48.6       11.9     13.1     12.9     10.3      3.3
                2000      14.5        9.0     16.3     24.9     26.0      9.4
White           1996      22.3       10.0     13.9     19.6     23.2     11.0
                2000       5.2        3.9      9.1     21.4     37.9     22.5

Table 4
Comparison of TAAS Score Distributions for 1996 and 2000 Within Each Ethnic Group

Test    Group                           Significance Level    Effect Size (Cramer's V)
TAAS    White 1996-2000                       <.01                     .332
        African American 1996-2000            <.01                     .405
        Hispanic 1996-2000                    <.01                     .410

NAEP Results

Two sets of NAEP mean scores were reported: NAEP results for Texas and NAEP results for the nation. The comparison of 1996 and 2000 mean scores within each ethnic group is presented in Table 5. Results indicated that:

For the NAEP-Texas, mean scores increased significantly for Hispanic students but not for African American or white students.
For the National NAEP, mean scores increased significantly for white students but not for Hispanic or African American students.

Table 5
Comparison of NAEP Mean Scores for 1996 and 2000 within Each Ethnic Group

Group          Ethnicity            Mean Scale Score 1996    Mean Scale Score 2000    Gain
NAEP-Texas     African American             249                      252             +3.0
               Hispanic                     256                      266            +10.0*
               White                        285                      288             +3.0
NAEP-Nation    African American             242                      246             +4.0
               Hispanic                     250                      252             +2.0
               White                        281                      285             +4.0

* p < .05

The second set of analyses used a chi-square analysis to compare the 1996 and 2000 score
distributions for each ethnic group. Results of the analyses are presented in Tables 6 and 7.

For the NAEP-Texas, there was a significant change in the score distributions of all three ethnic groups from 1996 to 2000.
The effect size (phi coefficient) for the NAEP-Texas was very small for white (.067) and African American students (.127) and slightly larger for Hispanic students (.176). This indicates that there was more of a change in the Hispanic students' score distribution than in the white or African American students' distributions.
For the National NAEP, white students' score distribution changed significantly from 1996 to 2000 but the score distributions for Hispanic and African American students did not.
The effect size for the National NAEP was very small for all three groups (.030 to .053), indicating that the changes from 1996 to 2000 were minimal.

Table 6
Distribution of NAEP Scores in Percent for 1996 and 2000 by Ethnic Group

Ethnicity       Year    Below Basic    Basic    Proficient    Advanced
Texas
African Am.     1996        69           26          4            1
                2000        60           34          6            0
Hispanic        1996        58           34          7            1
                2000        41           45         13            1
White           1996        22           45         29            4
                2000        17           46         33            4
Nation
African Am.     1996        73           23          4            0
                2000        68           27          5            0
Hispanic        1996        63           29          7            1
                2000        60           31          8            1
White           1996        27           43         25            5
                2000        23           43         28            6

Table 7
Comparison of NAEP Score Distributions for 1996 and 2000 within Each Ethnic Group

Test             Group                      Significance Level    Effect Size (Cramer's V)
NAEP-Texas       White 1996-2000                  <.05                     .067
                 Hispanic 1996-2000               <.01                     .176
                 African Am. 1996-2000            <.05                     .127
NAEP-National    White 1996-2000                  <.01                     .052
                 Hispanic 1996-2000                N.S.                    .030
                 African Am. 1996-2000             N.S.                    .053

Conclusions

Comparison of the TAAS results and NAEP-Texas results shows a significant mean score increase with a moderate to large effect size for all three ethnic groups on the TAAS. In contrast, only Hispanic students had a significant mean score increase on the NAEP-Texas, with a small effect size. The Hispanic students' increase on the Texas NAEP was not reflected on the National NAEP, indicating that the NAEP-Texas result was not part of a national trend.

The score distributions for both the TAAS and the NAEP-Texas changed significantly from 1996 to 2000. Effect sizes for changes in the TAAS score distributions were much larger than those found for the NAEP-Texas (Figure 1). In addition, the pattern of change in the distributions was different for the TAAS than for the NAEP-Texas. The TAAS distributions for African American and Hispanic students showed a very large decrease in students who failed the test (i.e., scored below 70 TLI) and an increase in scores in the middle to high range. For white students, the percentage at the lower and middle range decreased and the percentage at the top of the test score range showed a very large increase. In contrast, the NAEP-Texas distributions had changes primarily at the lower end of the score range for all three ethnic groups; that is, from "below basic" to "basic".

Figure 1. Effect Size Comparison for 1996 and 2000 Score Distributions for TAAS and NAEP
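The original Figure 1 image is not reproduced in this text version, but the comparison it summarizes can be rebuilt from the Cramer's V values reported in Tables 4 and 7. The short matplotlib sketch below is one possible rendering of that comparison; the grouped-bar layout is an assumption, not the authors' original figure.

```python
# Sketch of the effect-size comparison summarized in Figure 1, rebuilt from
# the Cramer's V values in Tables 4 and 7 (chart layout is a guess).
import matplotlib.pyplot as plt
import numpy as np

groups = ["White", "African American", "Hispanic"]
taas_v = [0.332, 0.405, 0.410]        # Table 4: TAAS 1996 vs. 2000
naep_tx_v = [0.067, 0.127, 0.176]     # Table 7: NAEP-Texas 1996 vs. 2000

x = np.arange(len(groups))
width = 0.35
plt.bar(x - width / 2, taas_v, width, label="TAAS")
plt.bar(x + width / 2, naep_tx_v, width, label="NAEP-Texas")
plt.xticks(x, groups)
plt.ylabel("Effect size (Cramer's V)")
plt.title("1996 vs. 2000 score-distribution change, by test and ethnic group")
plt.legend()
plt.show()
```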

Research Question 2

Do TAAS and NAEP data show that the achievement gap between white and minority students decreased from 1996 to 2000?

The 1996 and 2000 TAAS scores and NAEP scores for white and minority students were compared in two ways. First, mean scores for white students were compared with mean scores for African American and Hispanic students to determine if there were significant differences. Second, the distributions of test scores for white students and minority students were compared to determine if the distribution patterns became more similar over time.

TAAS—Comparison of White and Minority Scores

The comparison of mean scores for white and minority students on the TAAS is presented in Table 8. Confidence intervals were used to determine the statistical significance of differences between white and minority students' mean scores in 1996 and in 2000.

Mean scores for white students were significantly higher than African American students' in both 1996 and 2000. The difference in mean scores for white and African American students was larger in 1996 than in 2000.
Mean scores for white students were significantly higher than Hispanic students' in both 1996 and 2000. The difference in mean scores for white and Hispanic students was larger in 1996 than in 2000.
Effect sizes for the white vs. African American comparisons were large, while effect sizes for the white vs. Hispanic comparisons were moderate.

Table 8
Comparison of TAAS Mean TLI Scores for White and Minority Students

Comparison Group                     Mean Difference    Effect Size (d Statistic)
African American vs. White 1996          12.2*                   .917
African American vs. White 2000           7.0*                   .824
Hispanic vs. White 1996                   9.4*                   .682
Hispanic vs. White 2000                   4.9*                   .553

* p < .01

A chi-square analysis compared the score distribution for white students with the score distributions for Hispanic students and African American students. Results of the analyses are presented in Table 9.

The score distribution for white students was significantly different from the score distribution for African American students in both 1996 and 2000.
The score distribution for white students was significantly different from the score distribution for Hispanic students in both 1996 and 2000.
Effect sizes changed very little from 1996 to 2000. This indicated that, although the distribution patterns changed, the differences between white students' scores and minority students' scores were relatively unchanged from 1996 to 2000.

Table 9
Comparison of TAAS Score Distributions for White and Minority Students

Comparison Group                     Significance Level    Effect Size (Cramer's V)
African American vs. White 1996            <.01                    .331
African American vs. White 2000            <.01                    .333
Hispanic vs. White 1996                    <.01                    .313
Hispanic vs. White 2000                    <.01                    .286

NAEP—Comparison of White and Minority Scores

The comparisons of mean scores for white and minority students on the NAEP-Texas and the National NAEP are presented in Table 10. Confidence intervals were reported to show the statistical significance of differences between white and minority students' mean scores in 1996 and 2000.

For the NAEP-Texas:
White students' mean scores were significantly higher than African American and Hispanic students' mean scores both in 1996 and in 2000.
The differences between white and African American students' mean scores remained large and relatively unchanged from 1996 to 2000.
The difference between white and Hispanic students' mean scores decreased from 1996 to 2000.

For the National NAEP, white students' mean scores were significantly higher than African American and Hispanic students' mean scores both in 1996 and in 2000. The differences were large and did not change appreciably from 1996 to 2000.

Table 10
Comparison of NAEP Mean Scale Scores for White and Minority Students

Test             Comparison Group                     Mean Difference
NAEP-Texas       African American vs. White 1996           36*
                 African American vs. White 2000           36
                 Hispanic vs. White 1996                   29
                 Hispanic vs. White 2000                   22*
NAEP-National    African American vs. White 1996           39
                 African American vs. White 2000           39
                 Hispanic vs. White 1996                   31
                 Hispanic vs. White 2000                   31*

* p < .01

A chi-square analysis compared the score distribution for white students with the score distributions for Hispanic students and African American students. Results of the analyses are presented in Table 11.

On the NAEP-Texas, the score distributions for Hispanic and African American students were significantly different from that of white students in both 1996 and 2000.
On the NAEP-Texas, the effect size for the comparison of Hispanic students' and white students' scores in 2000 was smaller than in 1996. In contrast, the effect sizes for the comparison of African American students' and white students' scores were unchanged from 1996 to 2000.
On the National NAEP, the score distributions for African American and Hispanic students were significantly different from the score distribution for white students in both 1996 and 2000.
On the National NAEP, the effect sizes were moderate and there was little change from 1996 to 2000.

Table 11
Comparison of NAEP Score Distributions for White and Minority Students

Test             Comparison Group                  Significance Level    Effect Size (Cramer's V)
NAEP-Texas       African Am. vs. White 1996              <.01                    .415
                 African Am. vs. White 2000              <.01                    .428
                 Hispanic vs. White 1996                 <.01                    .402
                 Hispanic vs. White 2000                 <.01                    .318
NAEP-National    African Am. vs. White 1996              <.01                    .379
                 African Am. vs. White 2000              <.01                    .383
                 Hispanic vs. White 1996                 <.01                    .291
                 Hispanic vs. White 2000                 <.01                    .327

Conclusions

The TAAS results show that the difference in mean scores for white and minority students was smaller in 2000 than in 1996. This would seem to indicate that minority students were closing the achievement gap in eighth grade mathematics. However, results of the NAEP-Texas offer only partial support for this conclusion. On the NAEP-Texas,
the difference for white and African American students was unchanged from 1996 to 2000. The difference in NAEP-Texas mean scores for Hispanic and white students was smaller in 2000 than in 1996, although the difference was still large. In contrast, results from the National NAEP indicated that the difference between Hispanic and white students' mean scores actually increased slightly.

Comparison of the score distributions of white and minority students presents a similar result. Comparison of the score distributions of white and African American students yielded similar effect sizes in 1996 and 2000. In contrast, the comparison of white and Hispanic students showed that the effect size decreased from 1996 to 2000 (Figure 2).

Figure 2. Comparison of Effect Size for White vs. Minority Score Distributions

The finding that the effect sizes for comparisons of minority and white students are larger on the NAEP-Texas than on the TAAS shows that the disparity between minority and white students is greater on the NAEP-Texas. That is, the achievement gap is more evident on the NAEP-Texas than on the TAAS, both in 1996 and in 2000. However, on both tests, the achievement gap between Hispanic and white students was smaller in 2000 than in 1996. The fact that Hispanic students do not show similar gains nationally indicates that this is not part of a national trend. The disparity between the NAEP-Texas and National NAEP may be an indication that Hispanic students in Texas are beginning to close the achievement gap in eighth grade mathematics.

Research Question 3

Is there evidence that a ceiling effect artificially restricted the distribution of students' scores on the TAAS or the NAEP?

TAAS Scores

The analysis of TAAS mean score gains for each ethnic group shows that white students
gained only 7.0 TLI points from 1996 to 2000 while African American students gained 12.2 and Hispanic students gained 11.5. Since the largest percentage of white students' scores were in the upper 10% of the score range in both 1996 and 2000, the gains for their highest scoring students were limited by the maximum score possible on the test. The likely result is that they were not able to show their true achievement level because the maximum score (ceiling) of the test artificially limited their scores. If this were the case, comparison of their scores with those of minority students (whose opportunity for gain was not as restricted) would create the appearance that the lower scoring students were achieving at a greater rate and therefore closing the achievement gap.

A second analysis looked at the distribution of scores for each ethnic group for evidence of a ceiling effect. This analysis, presented in Table 12, gives the percent of students in each ethnic group with TLI scores in the upper 10% of the score range (i.e., a TLI of 85 to 94). The table shows that in 1996, 34% of the white students had test scores in the upper 10% of the TAAS score range. This increased to 60% in 2000. The fact that white students have the largest percentage of students in the upper range indicates that the score range for these students is more restricted by the maximum test score (the test ceiling). The dramatic difference in the score distributions for white and minority students provides support for the hypothesis that a ceiling effect has restricted white students' scores to a greater degree than it has restricted Hispanic and African American students' scores. If this hypothesis is correct, the result would be an artificial narrowing of the achievement gap between white and minority students' eighth grade math test scores.

Table 12
Percent of Students with TAAS TLI Scores Greater than 84

Group                       TLI 85-89    TLI 90-94    Cumulative Total
White 1996                     23.2         11.0           34.2
White 2000                     37.9         22.5           60.4
African American 1996           7.4          2.0            9.4
African American 2000          20.9          6.0           26.9
Hispanic 1996                  10.3          3.3           13.6
Hispanic 2000                  26.0          9.4           35.4

NAEP Scores

NAEP-Texas and National NAEP results showed that the mean scores for all three ethnic groups were near the middle of the test range (0 to 500). An analysis of the score distributions for the NAEP-Texas and the National NAEP (Table 13) shows that student gains have been primarily at the lower range of the test (from "below basic" to "basic"), with little change in the percent of students achieving the "advanced" range. There is no evidence to support the hypothesis that there is a ceiling effect for either the NAEP-Texas or the National NAEP results.
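In computational terms, the ceiling-effect check described above reduces to asking, for each group and year, what share of scores falls in the top band of the TLI range (85 to 94) and how strongly the distribution is skewed toward the ceiling. The sketch below is a hypothetical illustration, not the authors' analysis; the tli_scores array is randomly generated stand-in data.

```python
# Hypothetical sketch of the ceiling-effect indicators discussed above.
import numpy as np
from scipy.stats import skew

def ceiling_indicators(scores, top_band_start=85, score_max=94):
    """Share of scores in the top band of the TLI range, plus skewness."""
    scores = np.asarray(scores, dtype=float)
    in_top_band = (scores >= top_band_start) & (scores <= score_max)
    return 100.0 * in_top_band.mean(), skew(scores)

# tli_scores stands in for the per-student TLI scores of one group/year;
# a large share in the top band together with strong negative skew is the
# pattern the authors read as evidence of a ceiling effect.
tli_scores = np.random.default_rng(0).integers(40, 95, size=1000)
pct_top, g1 = ceiling_indicators(tli_scores)
print(f"{pct_top:.1f}% of scores in the 85-94 band; skewness = {g1:.2f}")
```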

Table 13
Percent of Students with NAEP Scores Above Basic

Ethnicity               Year    Proficient    Advanced    Cumulative Total
NAEP-Texas
African American        1996        4%            1%             5%
                        2000        6%            0%             6%
Hispanic                1996        7%            1%             8%
                        2000       13%            1%            14%
White                   1996       29%            4%            33%
                        2000       33%            4%            37%
NAEP-National
African American        1996        4%            0%             4%
                        2000        5%            0%             5%
Hispanic                1996        7%            1%             8%
                        2000        8%            1%             9%
White                   1996       25%            5%            30%
                        2000       28%            6%            34%

Discussion

White, African American, and Hispanic students all had large and statistically significant gains on the TAAS from 1996 to 2000. Comparison of white and minority students' scores shows that white students had significantly higher TAAS scores than African American and Hispanic students in 1996 and 2000, but that the differences were smaller in 2000. These results were consistent both for the analysis of mean scores and the analysis of the test distributions for each student group.

NAEP results were not consistent with the TAAS results. Hispanic students had a mean score gain from 1996 to 2000, but white and African American students did not. When white students' NAEP scores were compared to minority students' scores, the difference between Hispanic and white students' scores decreased from 1996 to 2000 but the difference between African American and white students' scores did not. These results were consistent for the analysis of mean scores and test distributions for each student group.

In summary, the large student gains on the TAAS, which is a minimum skills test tailored specifically to the Texas mathematics curriculum, are only partially substantiated by the smaller gains on the NAEP, which is a more general and more difficult test of mathematics. While an explanation of the reasons for the differences in the TAAS and NAEP results is beyond the scope of this research, the authors' experience in Texas public schools suggests two likely answers. First, teaching to the TAAS is widespread and
pervasive in Texas schools. Release versions of the TAAS are available from the Texas Education Agency (along with scoring services that mimic actual TAAS reports), as are a variety of commercially developed practice and test preparation materials. It is common practice for schools to administer one or more "practice TAAS" tests in the fall and use the results to guide instruction in preparation for the state-mandated TAAS testing in the spring. Second, Texas teachers and principals are evaluated, in part, on their students' success (or lack of success) on the TAAS. These factors create very strong pressure to teach to the test. It is likely that score inflation is a significant factor in the large gains that have been consistently reported for the TAAS.

TAAS data for 2000 revealed that differences between white and minority students' scores had decreased when compared to 1996. Other studies have considered this as evidence that the achievement gap between white and minority students is being narrowed. However, analysis of the distribution of scores for each ethnic group reveals that over 60% of white students scored in the upper 10% of the test score range, while about 27% of African American students and 35% of Hispanic students scored in this range.

Since a larger percentage of white students than minority students achieved the maximum score on the test (the test ceiling), white students' scores likely underestimated their true achievement level. That is, the ceiling effect has artificially restricted white students' scores and created the illusion that the achievement gap has been narrowed. The presence of a ceiling effect casts doubt on the validity of claims that the achievement gap between white and minority students has been narrowed. A more reasonable interpretation of the available data is that, because the test ceiling has differentially affected the scores for white, Hispanic, and African American students, TAAS results cannot be used to determine whether or not the achievement gap has been narrowed.

Analysis of the NAEP-Texas score distribution suggests that the achievement gap for African American and white students has not changed between 1996 and 2000. However, the gap between white and Hispanic students' scores did narrow, although the change was small when compared to the TAAS. Comparison of NAEP-Texas and National results indicates that this change was a Texas phenomenon and was not found in the National NAEP data. This finding does indicate that Texas has been partially successful in narrowing the achievement gap between Hispanic and white students.

The results of this study have implications beyond the TAAS and the State of Texas. The national emphasis on high standards and the use of high stakes criterion-referenced tests to measure progress toward those standards have become commonplace in public education. Many states depend solely on high stakes test results for making far-reaching decisions about the content and formulation of curricula, the funding of educational initiatives, and the development of educational policy. Any state that uses a high stakes test to measure progress toward state standards must be aware of the twin dangers of test score inflation and ceiling effects. Both can lead to invalid interpretations of test scores and erroneous conclusions about student achievement. The use of comparative data such as the NAEP is vital to ensure that the test data used by state and national decision-makers present an accurate picture of the educational achievement of their students.

Notes

1. TAAS data were ordered as a customized report of frequency distributions by ethnic group. The data set for each ethnic group consisted of a frequency count of the number of students by Texas Learning Index (TLI) score together with the mean, standard deviation, and SEM of the distribution. The TLI score is a scaled score derived specifically for the TAAS and is not comparable to other scaled scores. A complete description of the derivation of the TLI is contained in the TAAS Technical Digest available from the Texas Education Agency web site (TEA, 2000c).
2. All data for the NAEP-Texas and National NAEP were obtained from the following NCES reports: The nation's report card: Mathematics 2000 (U.S. Department of Education, 2001a) and The nation's report card: State mathematics 2000, report for Texas (U.S. Department of Education, 2001b).
3. The d statistic is the ratio of the mean difference between two groups divided by the pooled standard deviation. A value of .2, .5, or .8 is generally interpreted as a small, medium, or large effect size, respectively.
4. The Cramer's V coefficient is a rescaling of the phi coefficient and has a range between 0 and 1. A Cramer's V of .1, .3, or .5 is generally interpreted as a small, medium, or large effect size, respectively.

References

Amrein, A. L. and Berliner, D. C. (2002). High-stakes testing, uncertainty, and student learning. Education Policy Analysis Archives, 10(18). Retrieved March 14, 2003, from http://epaa.asu.edu/epaa/v10n18/

Camilli, G. (2000). Texas gains on NAEP: Points of light? Education Policy Analysis Archives, 8(42). Retrieved March 14, 2003, from http://epaa.asu.edu/epaa/v8n42/

Green, S. & Akey, T. (2000). Using SPSS for Windows: Analyzing and understanding data (2nd ed.). Upper Saddle River, NJ: Prentice-Hall.

Haney, W. (2000, August). The myth of the Texas miracle in education. Education Policy Analysis Archives, 8(41). Retrieved March 14, 2003, from http://epaa.asu.edu/epaa/v8n41/

Hurley, E., Chamberlain, A., Slavin, R., and Madden, N. (2001). Effects of Success for All on TAAS reading scores: A Texas statewide evaluation. Phi Delta Kappan, 82(10), 750-756.

Jerald, C. (2000, January 13). The state of the states. Education Week, 19(18), 62-65.

Jerald, C. (2001). Real results, remaining challenges: The story of Texas education reform. The Education Trust. Retrieved March 14, 2003, from http://www.brtable.org/document.efm/532

Klein, S., Hamilton, L., McCaffrey, B., & Stecher, B. (2000). What do test scores in Texas tell us? Education Policy Analysis Archives, 8(49). Retrieved March 14, 2003, from http://epaa.asu.edu/epaa/v8n49/

National Center for Educational Statistics (2001a). The nation's report card: Mathematics
2000 (NCES Document No. 2001-517). Washington, DC: Office of Educational Research and Improvement, U.S. Department of Education.

National Center for Educational Statistics (2001b). The nation's report card: State mathematics 2000, report for Texas (NCES Document No. 2001-519 TX). Washington, DC: Office of Educational Research and Improvement, U.S. Department of Education.

Texas Education Agency (2000a, April 20). State's exit-level TAAS performance sets another record: 10th graders hit 90 percent mark on two sets of tests. Austin, TX: Author.

Texas Education Agency, Office of Policy Planning and Research. (2000b, April). The 2000 accountability rating system for Texas public schools and school districts. Austin, TX: Author.

Texas Education Agency (2000c). TAAS Technical Digest. Retrieved January 2, 2002, from http://www.tea.state.tx.us/student.assessment/researchers.html

Toenjes, L. and Dworkin, A. G. (2002). Are increasing test scores in Texas really a myth, or is Haney's myth a myth? Education Policy Analysis Archives, 10(17). Retrieved March 14, 2003, from http://epaa.asu.edu/epaa/v10n17/

About the Authors

Dr. Tom Linton is an associate professor in the Educational Leadership Doctoral Program at Texas A&M University-Corpus Christi. His experience includes 25 years in the Corpus Christi ISD Office of Research, Testing, and Evaluation and with various other school districts as an outside consultant and as a project evaluator. He has also served on several advisory committees for the Student Assessment Division of the Texas Education Agency. His e-mail address is tlinton@falcon.tamucc.edu

Dr. Don Kester is an associate professor in the Educational Leadership Doctoral Program at Texas A&M University-Corpus Christi. His experience includes 27 years as an administrative consultant in the areas of educational program management, evaluation, research, and student assessment with the Los Angeles County Office of Education. His e-mail address is don.kester@mail.tamucc.edu

Copyright 2003 by the Education Policy Analysis Archives

The World Wide Web address for the Education Policy Analysis Archives is epaa.asu.edu

Editor: Gene V Glass, Arizona State University
Production Assistant: Chris Murrell, Arizona State University

General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, glass@asu.edu, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-2411. The
Commentary Editor is Casey D. Cobb: casey.cobb@unh.edu.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
David C. Berliner, Arizona State University
Greg Camilli, Rutgers University
Linda Darling-Hammond, Stanford University
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Gustavo E. Fischman, California State University–Los Angeles
Richard Garlikov, Birmingham, Alabama
Thomas F. Green, Syracuse University
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Ontario Institute of Technology
Patricia Fey Jarvis, Seattle, Washington
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Les McLean, University of Toronto
Heinrich Mintrop, University of California, Los Angeles
Michele Moses, Arizona State University
Gary Orfield, Harvard University
Anthony G. Rud Jr., Purdue University
Jay Paredes Scribner, University of Missouri
Michael Scriven, University of Auckland
Lorrie A. Shepard, University of Colorado, Boulder
Robert E. Stake, University of Illinois—UC
Kevin Welner, University of Colorado, Boulder
Terrence G. Wiley, Arizona State University
John Willinsky, University of British Columbia

EPAA Spanish Language Editorial Board

Associate Editor for Spanish Language
Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México, roberto@servidor.unam.mx

Adrián Acosta (México), Universidad de Guadalajara, adrianacosta@compuserve.com
J. Félix Angulo Rasco (Spain), Universidad de Cádiz, felix.angulo@uca.es
Teresa Bracho (México), Centro de Investigación y Docencia Económica–CIDE, bracho@dis1.cide.mx
Alejandro Canales (México), Universidad Nacional Autónoma de México, canalesa@servidor.unam.mx
Ursula Casanova (U.S.A.), Arizona State University, casanova@asu.edu
José Contreras Domingo, Universitat de Barcelona, Jose.Contreras@doe.d5.ub.es
Erwin Epstein (U.S.A.), Loyola University of Chicago, Eepstein@luc.edu
Josué González (U.S.A.), Arizona State University, josue@asu.edu
Rollin Kent (México), Universidad Autónoma de Puebla, rkent@puebla.megared.net.mx
María Beatriz Luce (Brazil), Universidad Federal de Rio Grande do Sul–UFRGS, lucemb@orion.ufrgs.br
Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México, javiermr@servidor.unam.mx
Marcela Mollis (Argentina), Universidad de Buenos Aires, mmollis@filo.uba.ar
Humberto Muñoz García (México), Universidad Nacional Autónoma de México, humberto@servidor.unam.mx
Ángel Ignacio Pérez Gómez (Spain), Universidad de Málaga, aiperez@uma.es
Daniel Schugurensky (Argentina-Canadá), OISE/UT, Canada, dschugurensky@oise.utoronto.ca
Simon Schwartzman (Brazil), American Institutes for Research–Brazil (AIR Brasil), simon@sman.com.br
Jurjo Torres Santomé (Spain), Universidad de A Coruña, jurjo@udc.es
Carlos Alberto Torres (U.S.A.), University of California, Los Angeles, torres@gseisucla.edu

