
Educational policy analysis archives


Material Information

Title:
Educational policy analysis archives
Physical Description:
Serial
Language:
English
Creator:
Arizona State University
University of South Florida
Publisher:
Arizona State University
University of South Florida.
Place of Publication:
Tempe, Ariz
Tampa, Fla
Publication Date:
December 12, 2002
Subjects

Subjects / Keywords:
Education -- Research -- Periodicals   ( lcsh )
Genre:
non-fiction   ( marcgt )
serial   ( sobekcm )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00298
usfldc handle - e11.298
System ID:
SFS0024511:00298




Full Text

Education Policy Analysis Archives

Volume 10, Number 50    December 12, 2002    ISSN 1068-2341

A peer-reviewed scholarly journal
Editor: Gene V Glass, College of Education, Arizona State University

Copyright 2002, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. EPAA is a project of the Education Policy Studies Laboratory. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

The Case That Won't Go Away: Besieged Institutions and the Massachusetts Teacher Tests

Larry H. Ludlow
Dennis Shirley
Camelia Rosca
Boston College, Lynch School of Education

Citation: Ludlow, L., Shirley, D., & Rosca, C. (2002, December 12). The case that won't go away: Besieged institutions and the Massachusetts teacher tests. Education Policy Analysis Archives, 10(50). Retrieved [date] from http://epaa.asu.edu/epaa/v10n50/.

Abstract

Teacher testing was inaugurated in Massachusetts in 1998, and a 59% failure rate among test-takers led to public shaming of the teacher candidates and their colleges and universities in the media. Within a two-year period, low-performing teacher education programs in Massachusetts initiated a wide range of test preparatory activities which
led to a dramatic increase in their students' pass rates. The authors separate colleges and universities into three categories and examine their differentiated responses to teacher testing. Their finding that institutions of higher education have responded effectively to teacher testing does not preclude critique of teacher testing as currently practiced in Massachusetts.

Teacher testing has emerged as one of the most widely disseminated educational practices related to the improvement of teacher quality in the United States in the last twenty years. What was once a state concern, however, has now become federal. With the reauthorization of Title II of the Higher Education Act (Public Law 105-244) in 1998, states are now required to report to the United States Department of Education on how well their teacher education program completers fared on teacher tests. This legislation, in effect, federalized teacher testing.

Under this legislation, colleges and universities with failing teacher education programs stand to lose federal funding for professional development programs, research, and student financial aid. Institutions of higher education (IHEs) with low passing rates risk the humiliation of being publicly designated as "low-performing" by their states (United States Department of Education, 2000). Teacher education programs whose students fail to meet minimum pass rates face sanctions and, ultimately, closure by state departments of education (Massachusetts Department of Education, 1998).

A variety of concerns have been raised about teacher testing as a new federal policy in the United States (National Research Council, 2001). Some critics have worried that the tests filter competent teachers, especially minorities, out of the profession (Melnick and Pullin, 2000); others question whether the tests measure the most critical attributes of teachers (Flippo and Riccards, 2000); and others have raised technical questions about the tests (Haney, Fowler and Wheelock, 1999; Ludlow, 2001). Many of these concerns are legitimate and require further clarification and debate among teachers, policymakers, and the public at large.

Regardless of the concerns that have been raised, however, colleges and universities with teacher education programs are currently held "accountable" for the results of their candidates on the tests. Institutions across the country have considered a variety of options for responding to this new testing regimen, of which three are most salient. They can (i) change their teacher education curricula to align them with the test; (ii) restrict their applicant pool to exclude applicants who they believe might not be able to pass the test; or (iii) develop test preparation workshops to address areas of students' academic weakness.

Thus far, we have little data on transformations in teacher education curricula or restrictions on applicants. In their research at five colleges and universities in Massachusetts conducted during the first year after the test was implemented, our colleagues Marilyn Cochran-Smith and Curt Dudley-Marling found little evidence of changes in teacher education coursework relevant to the test (2001). Furthermore, even though there has been some speculation that there have been restrictions in the applicant pool, we do not have any quantitative evidence to test this hypothesis at this point in time.

We take it for granted that teacher testing is now an established part of the American educational landscape, and we are skeptical that any social movements or political coalitions will arise which will have sufficient power to terminate teacher testing. The testing movement has been firmly embraced by the current administration in Washington, DC, and while there are some opponents, they do not seem to enjoy broad public support. Given that these tests are here to stay, we seek to pose and answer an educational question: what is it that teacher education programs are doing to ensure that their students' pass rates meet state standards?

Three Core Questions

To answer this question, we gathered data from our home state, the Commonwealth of Massachusetts. We asked faculty and staff at all 59 teacher preparatory institutions to answer three essential questions. These institutions ranged from small private colleges graduating no more than a dozen teacher candidates each year to large state colleges and universities with hundreds of future teachers completing their course work annually.

First, we asked what efforts are currently underway at your institution to prepare students to take the Massachusetts Test for Educator Licensure (MTEL)? For example, has your institution (a) recommended that students not take all three tests on the same day, (b) advised students to take tests earlier in their undergraduate programs to allow repeat opportunities to take the test, (c) offered "test taking skills" sessions and, if so, what is the purpose, format, and content of those sessions, (d) coordinated with arts and sciences departments to ensure coverage of subject matter tests, or (e) changed curricula to add or drop subject coverage?

Second, we asked what mechanisms does your institution have for keeping track of and analyzing student results? Does your institution create data files containing student test results? If so, who maintains the files and conducts analyses upon them? What kinds of analyses are performed with the data? For example, are the test results statistically analyzed by degree level or program? Or, are test scores related to SATs and GPAs? What kinds of analyses of students who are potentially at risk have been conducted, and have those analyses informed intervention strategies? What curricular decisions have resulted from analyses of the test scores?

Third, we asked what improvements in MTEL pass rates can be directly linked to curricular changes and test preparation changes? Specifically, is there any evidence to suggest that these changes or test preparation sessions improve the chances of passing the tests the first time (or on subsequent test-taking)? What test areas have shown improvements in pass rates? How have pass rates, in general, changed over time?

Key Data Sources

These questions were sent to each Title II Coordinator at the 59 teacher preparation institutions in the Commonwealth. A follow-up request for participation was sent two weeks after the initial mailing and another was sent two weeks after that. Some of their email responses prompted follow-up questions from us. In addition, some Title II Coordinators participated in phone or face-to-face interviews. Thirty-nine institutions
ultimately provided descriptions of how they prepare students for the tests (Note 1). For the 1999-2000 Title II reporting period, these 39 institutions served 94% of the program completers tested in the state.

Before answering our first query about efforts currently underway to prepare teacher candidates for the MTEL, a reminder is warranted about the historical origins of teacher testing in Massachusetts. The first administration of the teacher test in 1998 resulted in a 59% failure rate and generated national news when the speaker of the House of Representatives condemned the failed candidates as "idiots" (Pressley, 1998). Colleges and universities with high failure rates suffered public condemnation. Chastened deans, provosts, and presidents found themselves in the unfamiliar role of apologists for their teacher preparatory programs. Critics within schools of education condemned the teacher tests as a set-up insofar as the candidates knew little about the content on the tests and were told consistently (until shortly before the tests) that the results would not count. Professors, staff, and students in schools of education took careful note of how their own institutions sized up in relationship to their competitors.

In this climate of intense external scrutiny and criticism, and of self-reflection and analysis, IHEs were compelled to implement any strategy that appeared not just to improve their curricula but also to show the public they were responding to this perceived failure in their programs to prepare teacher candidates. No guidance, however, was provided by the Massachusetts Department of Education about what curricular components should be changed, added, or eliminated. Nor were any test results available from the test contractor (National Evaluation Systems: NES) that provided useful diagnostic information to students and programs about specific test content deficiencies (Note 2). As a result, programs were left to their own devices to create survival strategies.

These strategies were largely uncoordinated and were implemented at the discretion of education deans, department chairs, and program coordinators. A number of institutions hired psychometricians to help them build data files to make sense of the test results. Another strategy consisted of the immediate implementation of test preparation sessions taught by testing faculty and academic development centers. Over time, word of these efforts spread and colleges and universities began sharing their strategies with one another. With the pressure of Title II reporting of teacher candidate test results, the need for a more systematic survey and appraisal of outcomes became apparent to all parties. As a result of the expressed interest from colleges and universities to the Massachusetts Department of Education, the Department conducted a survey in the fall of 2001. The survey solicited data from IHEs relevant to new test preparation programs and addressed issues of intended audiences and expense. The survey did not inquire into the substance of the test preparation programs nor did it seek to gauge their successes.

First Question: Levels of Institutional Investment

Based on the extensive and enthusiastic responses we received, we have organized our survey results around the theme of institutional investment. This theme draws on the work of Cochran-Smith and Dudley-Marling, who noted that "the speed and degree with which institutions shifted resources to test preparation was linked to the degree of urgency and crisis that participants perceived at their institutions" (2001). Our objective
was to document the extent and types of resource shifts and their real and apparent efficacy. We identified three levels of institutional investment: those heavily, moderately, or minimally invested in shifting institutional resources to immediately address the need to raise teacher test pass rates.

We describe institutions of higher education with high numbers of candidates who failed on the first administration of the test as "besieged." In general, fewer than forty percent of teacher candidates at these colleges and universities passed the first three administrations of the teacher tests in 1998. These besieged institutions have as an aggregate responded with an extensive, heavy investment to reduce their high initial failure rates. Of our responding institutions, 20, or 51%, fall into this category.

Besieged institutions have developed a host of strategies to help their teacher candidates pass the MTEL. Administrators, usually deans, at these colleges and universities have hired consultants to teach staff how to conduct their preparatory workshops or they have hired consultants to conduct the workshops directly for future test-takers. Staff have created in-house test preparation workshops which extend across weeks and even months. Faculty and staff have designed and taught new one-credit courses to help students prepare for the test over the course of an entire semester. Administrators hired statisticians to create sophisticated longitudinal databases for the purpose of tracking individual student performances. In addition, these statisticians built models to identify program strengths and weaknesses.

At many besieged institutions, we observed a university-wide response to the crisis of low test scores. Especially noteworthy in this regard is the manner in which collaboration between teacher education and academic content faculty was catalyzed by a mutual desire to raise the scores. At one IHE, the committee leading the drive to prepare students for the test included the Vice President for Academic Affairs (who chaired the committee), the Dean of the School of Arts and Sciences, the Dean of the School of Education, the Associate Vice President for Student Services, the Director of Academic Advisement, and the Title II Coordinator (who was a professor of education).

At another heavily invested institution an all-out effort was mounted to raise teacher test scores. Teacher candidates were offered a variety of test preparatory workshops, ranging from two to twenty-four hours in duration. These workshops included not only the communication and literacy segments of the test, but also content area sections, such as history, mathematics, or chemistry. Faculty received professional development assistance in redesigning their course curricula to include test-taking skills; individual tutors were hired to help struggling teacher candidates; and the results of students' writing outcomes on the MTEL were correlated with changes in expository writing classes as part of extensive program assessment. Some of the test preparation classes were offered for academic credit. Academic advising emphasized the importance of the tests and informed students of multiple opportunities to prepare for them.

Faculty familiar with the design of the MTEL have served as guest lecturers in classes taken by prospective teachers; IHEs have hired additional faculty to teach writing skills and to serve as part-time academic advisors; and the linkage between state curriculum frameworks and the teacher test has been taught explicitly in methods classes. Some institutions have developed entirely new courses to help their students pass the test, and some have increased distributional requirements with the goal of helping the students to improve their English language capabilities.

Many institutions have concentrated on the early identification of at-risk students, often during the freshman year. For example, one institution requires its freshmen in their first undergraduate course to take the PRAXIS test so that faculty in the teacher education program can immediately identify a student's strengths and weaknesses in regard to communication and literacy. In this instance, PRAXIS serves as a kind of pre-test for the teacher candidates.

One of the more surprising innovations developed by these institutions concerns the use of middle and high school textbooks as test preparatory materials. As discussed by Ludlow (2002) at a recent regional workshop for teacher educators, a teacher candidate might be better served by studying a high school history or biology textbook in depth than by taking specialized courses in social history or evolution that contain large bodies of information not measured on the test. (Whether the focus on high school texts diminishes teacher candidates' awareness of recent debates in history or discoveries in genetics is an open question.)

Many of these transformations have involved significant allocations of university resources in the form of time, money, and financial aid (for example, for doctoral students hired to serve as tutors for students who will be taking the MTEL). Deans and department chairs at besieged institutions appear to have been resourceful and inventive program advocates who effectively reallocated money to hire consultants, develop test workshops, and measure student outcomes. Some institutions with meager financial resources used grant funding to develop test preparatory activities; for example, one IHE allocated a portion of federal Eisenhower grant funds to help its students prepare for the communication and literacy sections of the test. The costs involved in faculty attending professional development sessions relevant to the MTEL and redesigning courses accordingly are important "hidden" costs entailed in the teacher testing enterprise. Test preparation has also entailed additional expenses for teacher candidates (at one private university, test preparation workshops cost $150).

At the opposite extreme of testing performance are those colleges and universities with high pass rates that have been maintained across all administrations of the teacher test. These IHEs have generally evidenced a minimum level of investment in test preparatory activities. Although they may have made modest efforts to adjust curriculum or share test-relevant information with other IHEs, they have, essentially, no formal test preparation for their students. Orientation sessions, if held at all, tend to be voluntary and focus only on the general test format. Review or study materials may be distributed to students but there are no formal assignments. Likewise, there tend to be no requirements for passing any components of the test prior to entering the program or prior to student teaching. Six, or 15%, of the institutions that responded to our survey fit into this category.

Institutions with minimal investment give evidence of virtually no discussion between teacher education faculty and their arts and sciences colleagues regarding subject matter competence. There appears to be no serious effort to identify at-risk students or students who flunked sections of the test. Nor do they feel compelled to maintain data files for statistical analysis of performance patterns. This tone of satisfaction with the status quo is captured well by one Title II coordinator at a small liberal arts college, who reported, "we have done nothing in the way of gate-keeping or special preparation of our students for the MTEL." In a similar vein another coordinator stated that "we feel that our
curriculum is the preparation for the MTEL."

IHEs with minimal investments do tend to advise students to split the tests into separate testing sessions and they may provide handouts with test-taking advice and materials to serve as general references for self-study. Responsibility for test information, record keeping, and interaction with the state and the test contractor tends to fall on the shoulders of a single person, typically the department chair. At more than one such institution, however, the secretary for the department of education in the college is designated the MTEL coordinator who is responsible for maintaining databases and testing materials.

What is particularly interesting about these institutions is the fact that some of them will be adversely affected when the next round of Title II results is released (for the 2000-2001 period). Specifically, some of them will lose their ranking in the top Title II reporting quartile. This shift in ranking will happen because many institutions have recently implemented policies that require prospective teachers to pass the tests either prior to admission to the teacher preparation program or as part of their student teaching component. Thus, there will be an increase in the number of programs that claim a one hundred percent pass rate on the test in the next round of Title II reports. Some institutions are concerned about this particular public relations aspect of the test. One Title II coordinator said that "We plan no special preparation programs, but perhaps we should be thinking of requiring the basic skills test as a teacher education prerequisite in order to join those institutions that now are reporting a one hundred percent pass rate."

Institutions with a moderate level of investment (typically, those below the state cut-score of 80% passing but not so low as to have been shocking) generally provide required orientation sessions or workshops where the format of the test and test-taking strategies are described by teacher educators or affiliated staff. Student performance data are gathered for systematic statistical analysis and test performance over time is tracked. At-risk students and those who failed a test are identified and tutoring sessions are provided. The administrative support network tends to be elaborate, involving program faculty, the practicum office, and usually a dean. Thirteen, or 33%, of our responding institutions are in this category.

At some colleges and universities with moderate levels of investment, Saturday sessions are held for the two weeks immediately prior to the tests. Take-home exercises relevant to these sessions may be required and student advisors are encouraged to meet with each student individually before the test is administered. Teacher education faculty and affiliated staff explain test registration booklets to students; further, teacher educators design and share information in workshops to familiarize students with different types of test questions and to encourage time management strategies. Some Massachusetts teacher educators encourage teacher candidates to visit web sites, such as that of the Texas Education Agency (http://www.excet.nesinc.com/excetstudyguid/); this site provides sample test items from the ExCET test for Texas teacher candidates, which is in many ways analogous to the MTEL and is also designed by NES.

Some sections of the tests are required for admission to the teacher preparation program and some programs require the students to pass all sections of the test before student teaching. One example of an institution with a moderate level of investment is a private university which distributes test preparatory information from the Massachusetts Department of Education and refers students to test preparatory workshops at a nearby
state college. The university advises its students not to take all three sections of the MTEL on the same day and plans to require the teacher candidates to pass the communications and literacy sections of the test before they enter student teaching. An administrative assistant tracks student test scores and analyzes the results to ascertain any possible consequences for the teacher education program.

Cross-Institutional Collaborations

Across all three levels of institutional investment, faculty and staff shared some degree of confidence about their ability to prepare test-takers for the communication and literacy sections of the test. There was much less certainty about the academic content area portions. As one program coordinator confessed, "When it comes to subject matter tests, our students are on their own." Expressing similar sentiments, another respondent said, "It is interesting to me that with such emphasis being placed on the test results, no person from the Department of Education has offered to give us workshops with useful information on exactly how we can prepare the students. Most of the information and strategies have been developed through networking with other education program faculty."

Driven by this need to share information, colleges and universities with teacher preparatory programs in Massachusetts have sought forums to discuss the MTEL and to develop appropriate institutional responses. For example, Framingham State College convened a Conference on MTEL Test Preparation on 7 January 2002 to assist teacher educators in sharing and analyzing information relevant to the test. Likewise, a regional workshop funded by the Alliance for Education supported the Colleges of Worcester Consortium in their efforts to better understand test preparation practices (Ludlow, 2002). The Association of Independent Colleges and Universities in Massachusetts, led by Clare Cotton, has been instrumental in recognizing the need for IHEs to confer about the MTEL and to improve communication with National Evaluation Systems and the Massachusetts Department of Education. Finally, the Massachusetts Coalition for Teacher Quality and Student Achievement has made the preparation of teachers for urban schools a priority and focused on teacher tests at several of its conferences and institutes.

These are all voluntary efforts that institutions freely engage in at their own discretion. Perhaps not surprisingly, besieged institutions have been most visible in these broad-based networks. They report that sharing information across institutional lines has helped them to develop programs and to garner resources from their deans and presidents. Institutions with minimal investment generally do not participate in these forums, and those with moderate investment participate only intermittently.

Complicating these facets of preparing for the tests is the fact that many teacher educators previously had little, and in some cases no, contact with their arts and sciences colleagues who teach academic content knowledge. Hence, one consequence of the teacher test has been to promote collaboration between teacher education and arts and sciences faculty. One problem is that in some areas, such as history, Massachusetts has been unable to establish state curriculum frameworks, so even these cannot serve as a point of reference.

One widespread approach to the subject matter tests has been for individual academic
departments to assume the responsibility for providing informal sessions on subject preparation for the tests. For example, at one institution faculty in both the English and History Departments offer optional workshops for students. These provide a review of sample subject matter questions that are provided by state and test contractor documents. Ironically, faculty who wanted to do a good job preparing their students for the tests were frustrated by the wide range of questions that could be posed to students. "You would basically have to know the history of the world to ace this exam," one professor at a well-regarded private college stated.

Second Question: Insights from Institutional Data Analyses

Our second research question addressed analyses of data relevant to the teacher tests. We were particularly interested in the extent to which IHEs tried to systematically organize and analyze their test results for the purpose of better understanding the strengths and weaknesses of their students and their own teacher preparation programs. Given the initial relatively high failure rates, what student and program variables might be useful for understanding students' poor performances? If such variables could be identified, then how might IHEs assist future test-takers to prepare for the tests?

The following examples illustrate the kinds of statistical analyses that have been performed as more colleges and universities realize the potential benefits to be derived from the test score data.

- Correlational analyses have revealed positive relationships between the Communication and Literacy Skills Tests (CLST) and the subject matter tests, and between both the CLST and the subject tests and candidates' SATs and GPAs.
- Analyses of test-retest effects on the CLST and subject matter tests have indicated that students who fail the first time are likely to pass on the second administration.
- Longitudinal charts have revealed strengths and weaknesses in sub-areas over time.
- Ordinary least squares multiple regression models have been successfully constructed for the purpose of using GPAs and SATs as predictors of at-risk test-takers.
- Logistic regression models have been successfully employed with demographic and programmatic variables to predict success or failure on the tests.
- Independent means t-test analyses have not found significant differences in test scores based on gender but have found differences based on academic level (undergraduate versus graduate student).

These analyses were not usually performed for the purpose of hypothesis testing. They were usually conducted as exploratory analyses attempting to reveal any patterns or relationships among test results, student characteristics, and program structures that would shed light on why students either passed or failed and, more importantly, how student preparation for the tests could be strengthened.
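
The analyses just listed can be carried out with standard statistical software once candidate records have been assembled into a single flat file, which is precisely what the hand processing described below had to produce. The short Python sketch that follows is ours, offered only as an illustration of the approach: the file name, the column names, and the 2.5 GPA cut-off are hypothetical stand-ins, and the 420 SAT verbal threshold is simply borrowed from the single institution's at-risk rule reported later in this section.

    # Illustrative sketch only; column names and all thresholds except the
    # reported SAT verbal cut-off of 420 are hypothetical.
    import pandas as pd
    from scipy import stats
    import statsmodels.formula.api as smf

    # One row per candidate, keyed by hand from paper score reports.
    df = pd.read_csv("mtel_candidates.csv")

    # Correlational analysis: CLST scores against SAT verbal and GPA.
    print(df[["clst_score", "sat_verbal", "gpa"]].corr())

    # Independent-means t-test: undergraduate versus graduate candidates.
    ug = df.loc[df["degree_level"] == "undergraduate", "clst_score"]
    gr = df.loc[df["degree_level"] == "graduate", "clst_score"]
    print(stats.ttest_ind(ug, gr, equal_var=False))

    # Logistic regression: pass/fail on all attempted tests (coded 0/1)
    # predicted from GPA and SAT verbal, the kind of model used to flag
    # at-risk candidates before they test.
    model = smf.logit("passed_all ~ gpa + sat_verbal", data=df).fit()
    print(model.summary())

    # A rule-of-thumb at-risk flag (the GPA cut-off here is invented).
    df["at_risk"] = (df["sat_verbal"] < 420) | (df["gpa"] < 2.5)

Nothing in this sketch goes beyond exploratory description; as noted above, these analyses were used to surface patterns, not to test hypotheses.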

There have been results, however, that were significant and that have been shared between teacher preparatory programs in different colleges and universities. Some of the most significant findings are of potentially tremendous value to teacher candidates and teacher educators. For example, many students who took all three tests on the same day performed worse than those who took the third component of the tests (the subject matter test) on a separate day. Accordingly, most institutions now advise students not to take all three sections at once. Another example is that graduate students have outperformed undergraduate students, so some institutions have focused their test preparatory activities on undergraduates. Students with low SAT scores have performed worse than those with high scores, and students with low GPAs have performed worse than those with high GPAs. Some colleges and universities have used these data to identify at-risk students and to develop appropriate interventions for them (Note 3).

One besieged institution has defined students as at risk of failing if they have an SAT verbal score below 420, did poorly in basic college writing or introduction to education courses, have English as a second language, or have a learning disability. These students are required by the university to take test preparatory workshops that range from fifteen to twenty-four hours in duration. The workshops last for several weeks, with multiple opportunities for workshop leaders to assess student progress over time.

One major problem that every IHE in Massachusetts has faced in performing these analyses is that the test scores are not available in electronic format (Note 4). Thus, any statistical analysis first requires hand processing of the paper records from the testing contractor. This, in turn, forces each IHE which seeks to assist its teacher candidates to develop its own approach to building databases and performing analyses. It has been impossible thus far to perform any cross-institutional aggregation and analysis of data (Note 5). In some cases these data management problems have prevented institutions from analyzing their own data. As one Title II coordinator noted: "It takes too much time now for me to enter all the data and as soon as we train a graduate student to do data entry the student graduates. Unfortunately, we have not had a statistician or faculty member that has taken an interest."

Third Question: Results of Test Preparatory Activities

Our third research question concerned the efficacy of test preparation efforts in relation to improved test scores. We know little to date about the impact of the new teacher test preparation approaches, in spite of the national wave of innovations in this area in recent years. The impact of a teacher test preparation program in Arizona was examined by Fierros (2002), but his effort was specific to a single institution and explored candidates' sentiments about the program rather than their actual test score results. Consistent with Fierros' positive findings, we also received many favorable comments about the test preparatory programs in Massachusetts. For example, one Title II program coordinator said that "students feel more confident and tend to do better on the first try if they attend the one-credit test prep course … students benefited greatly from the one-on-one tutoring … the students really feel it helps." Another coordinator commented, "Everyone benefits from some MTEL test preparation. It is very helpful for even the most skilled test takers to attend a two to three hour orientation/test readiness course."

Although anecdotal evidence is useful, it is also possible to establish a statistical relationship between increased test preparation and test performance on the teacher tests. Starting in April 1998, the Massachusetts Department of Education began releasing institutional test results after each administration of the teacher test. This practice continued through June 1999, at which point the Department ceased making public reports on the data (with the exception of the Title II test results in April 2001).

Recall that we disaggregated our teacher preparatory institutions into three categories: those with a minimal investment in improving scores, those with a moderate investment, and those which are heavily invested. The latter group we labeled besieged institutions because failure to increase pass rates above 80% by 2004 could result in the Department of Education closing their teacher education programs.

The chart represents the summary pass rate for each of our resource investment categories across each of the MTEL testing dates. "Summary pass rate" is defined as "the proportion of program completers who passed all tests they took for their areas of specialization among those who took one or more tests in their specialization areas" (United States Department of Education, 2000). The bold horizontal line at 80% pass represents the Massachusetts Department of Education criterion for institutional approval for the continuation of a teacher preparatory program (Massachusetts Department of Education, 1998). The last test date (2001) refers to the September 1, 1999-August 31, 2000 Title II reporting period. Those results were submitted to the Department in April 2001.
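
Before turning to the trends themselves, the Title II definition just quoted can be made concrete with a short calculation. The Python sketch below is again only illustrative: the file and column names are hypothetical, but the rule it implements, counting a completer as passing only when every test taken in the area of specialization was passed, and the restriction of the denominator to program completers (see Note 6), follow the definition and reporting rules cited above.

    # Illustrative only; the file and column names are hypothetical.
    import pandas as pd

    # One row per (program completer, test taken): columns include
    # category ("besieged", "moderate", or "minimal"), institution,
    # completer_id, and passed (1 = passed, 0 = failed).
    tests = pd.read_csv("completer_test_records.csv")

    # A completer counts as passing only if every test taken was passed.
    per_completer = (
        tests.groupby(["category", "institution", "completer_id"])["passed"]
        .min()                       # 0 if any attempted test was failed
        .rename("passed_all")
        .reset_index()
    )

    # Summary pass rate by investment category, as a percentage, to be read
    # against the 80% approval criterion.
    summary = per_completer.groupby("category")["passed_all"].mean().mul(100)
    print(summary.round(1))

Because only completers with at least one test record enter the calculation, the denominator matches the Title II definition; Note 6 explains why this restriction alone raises reported rates relative to the earlier, unrestricted test administration reports.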

The minimally invested institutions started off with high pass rates and have maintained a relatively constant high level of success on the tests. In essence, they were not threatened by the test and did not need to exert any additional efforts to meet the state standards, although several of them have recently made some efforts in this direction. In contrast to their situation, institutions which are moderately or heavily invested in improving their pass rates have shown a steady improvement over time. The moderate category institutions show a sharp rise ending the 1999 academic year and, as a group, exceeded the 80% threshold for the Title II reporting period. Besieged institutions started off about fifteen percentage points below the moderate group and have stayed roughly at a fifteen to twenty point difference across time. For the first Title II reporting period the besieged institutions were below the 80% threshold. Not only have they dramatically narrowed the performance gap (Note 6) but, based on their trajectory, we anticipate that they will meet the state standard as a group when the next round of test score results is released (Note 7).

There is no question that the improvement of test scores is correlated with the rise of test preparatory activities in the besieged colleges and universities. The extent to which the rise in test scores was caused by those activities is, however, unclear because there are confounding variables for which we have no controls. For example, the extent to which teacher education curricula have changed to conform with MTEL content is unknown. Another variable is the extent to which admission selectivity (e.g., high school GPA and SAT scores) has become more rigorous.

We do not believe that these confounding variables should be given much weight. Teacher educators had little information about the teacher test during the first administrations and could only rely on test-takers to learn about the nature of the test. Anita Page, the director of the early childhood and elementary programs at Mount Holyoke College, asked "Well, what is it that I have to improve?", reflecting a widespread sentiment of concern about the lack of clarity in Department of Education guidelines (Tantraphol, 2002). Regarding student selectivity, it seems implausible that any increase in teacher preparatory program admission criteria could affect test results, in the aggregate, in such a narrow span of time.

Conclusion

As we have seen, besieged institutions responded rapidly with innovative strategies to enhance the content knowledge of prospective teachers as well as their writing and reading skills. In addition, some institutions changed their admission criteria, student teaching requirements, and program completer definitions to include passing the MTEL. Subsequently, test takers' scores in the besieged institutions improved dramatically in the years following the initial administration of the teacher test. At present, it is impossible to disentangle to what extent specific factors led to the rise in test scores.

Based on the results described here, the faculty, administrators, staff, and students at many institutions may feel pleased that their strategies improved teacher test results and, by implication, the skills and competencies required of all teachers. Furthermore, advocates of teacher testing might claim that state and federal teacher testing policies are working effectively to upgrade the teaching profession.

We, however, wish to advance several caveats because we share with critics of teacher testing concerns about how the tests are being used and the manner in which they are transforming teacher education. First, many of the besieged institutions recently changed the timing of their testing so that students take sections of the teacher test before even being admitted to a teacher education program. In these cases, the institutions are guaranteeing a high pass rate from their program completers. Second, we do not know if valuable facets of teacher education have been sacrificed in the effort to improve test scores. Cochran-Smith and Dudley-Marling (2001) found that significant institutional
resources were devoted to test preparatory activities and some of these resource allocations detracted from socially critical parts of a university's mission, such as recruiting students of color into teaching. Finally, the test may screen promising teachers out of the profession who could be quite effective in classes yet do not perform well on standardized tests.

Teacher testing, virtually unknown two decades ago, has now become ubiquitous in the United States. As a result of this far-reaching transformation, teacher educators are now able to use test data to review, analyze, and strengthen their programs. When combined with other innovations in the field of teacher education, such as portfolio assessment, exhibitions, and teaching demonstrations, teacher testing can play an important role as one strand of holistic assessment. Professional approval programs, such as the National Council for Accreditation of Teacher Education (NCATE), require schools, colleges, and departments of education to document their self-study efforts, and teacher test results provide additional information on teacher candidates' academic competencies.

Critics, however, have raised serious concerns about unintended consequences of teacher testing. These critics will not, and should not, be silenced simply because there are positive trends in teacher test results. Even as we applaud the progress made by besieged institutions in improving test score results, we must continue to inquire into deleterious results of testing that detract from essential components of teacher preparation.

Notes

1. Anna Maria College, Becker College, Berklee College of Music, Boston College, Boston University, Brandeis University, Bridgewater State College, Cambridge College, Clark University, College of the Holy Cross, Curry College, Eastern Nazarene College, Elms College, Emerson College, Emmanuel College, Endicott College, Fitchburg State College, Framingham State College, Gordon College, Harvard Graduate School of Education, Lesley College, Massachusetts College of Art, Massachusetts College of Liberal Arts, Merrimack College, Montserrat College of Art, Northeastern University, Salem State College, Simmons College, Smith College, Springfield College, Stonehill College, Suffolk University, Tufts University, University of Massachusetts/Amherst, University of Massachusetts/Boston, University of Massachusetts/Lowell, Wheaton College, Wheelock College, and Worcester State College.

2. Some institutions report that obtaining useful practical and technical information about the MTEL is still problematic. In fact, no technical reports have been released "following the use of each form of the tests" and no technical advisory committee has been formed to "meet up to four times annually to review the test items, test administration, scoring procedures, and score setting for validity and reliability," even though both actions have been called for in the MTEL contracts.

3. The opportunity for IHEs to conduct these valuable statistical analyses upon their own candidate test score results has just recently become more restrictive. In an April 18, 2002 letter to Judith Gill, Chancellor, Board of Higher Education, and Clare Cotton, Executive Director, Association of Independent Colleges and Universities in Massachusetts, Commissioner of Education David Driscoll stated that candidates will now be required to give explicit "consent before his or her
institution is sent any information on individual subarea performance." This requirement is imposed even though Driscoll acknowledged that language in the current registration booklet is "essentially identical in nine of the states served by the National Evaluation Systems." This additional release will inevitably result in fewer complete data records for teacher candidates and their institutions.

4. Each contract for the MTEL has stipulated that the test developer (National Evaluation Systems) provide the test scores in electronic format to IHEs (just as the NES presently does for Title II reporting purposes). The contract specifies that "The Contractor will… electronically transferring official scores to the institutions after each administration." After four years of testing, however, the NES still does not provide the data on an electronic medium suitable for processing by standard software like SPSS or Excel.

5. Commissioner Driscoll stated that "There are two reasons for my prohibition on cross-institutional pooling of data. First, we wish to protect the individual's identification. Second, we wish to lessen misuse of our licensure tests" (ibid.). The first point can be easily addressed on the data records using standard procedures such as codes and pseudonyms to protect individuals' confidentiality. Regarding the second point, we do not know what kinds of "misuse" the Commissioner has in mind. The practical consequence of this prohibition is to further reduce the capability of IHEs to serve their students and the interests of the general public by improving teacher candidates' test scores. The political impact of this prohibition is that independent analysts are prevented from conducting the sorts of rigorous validity and differential impact analyses that could assist teacher candidates.

6. It is important to note that the dramatic increase from June 1999 to the 2001 Title II results in the last column is partially attributable to different definitions of test-takers. Prior to Title II, test-takers included anyone who claimed affiliation with a college or university. There were no controls on the population of test-takers, i.e., anyone who took the test and claimed an institutional affiliation was counted in institutional results. For Title II purposes, however, only "program completers" are test-takers. According to Title II guidelines as provided by the Massachusetts Department of Education, program completers are only those individuals who took the teacher test and met all other institutional program requirements for graduation and certification. This distinction has led to the apparent paradox that presently exists when test administration results are released. That is, "On the February 23, 2002 administration, a total of 58% of 5,225 first time test takers passed all three parts of the test" (Massachusetts Department of Education, 2002), yet the statewide pass rate for Title II (1999-2000) was 81%.

7. Congress requires IHEs to send their Title II test results to their respective state agencies by April 7 of each year. The Massachusetts results for 1999-2000 were subsequently posted on most institutional web sites in accordance with the requirement that the results be available to the public. This year, however, most IHEs have not posted their results on their respective web sites as of 22 June 2002, and we checked 58 separate IHE sites. Furthermore, the Department of Education declined to provide those results for this article. Thus, even though in theory the latest Title II results are accessible by the public, we could not in point of fact obtain and use them.

References

Cochran-Smith, M., & Dudley-Marling, C. (2001). The flunk heard round the world. Teaching Education, 12, 49-63.

Fierros, E. G. (2002, April). Improving performance? A model for examining the impact of an AEPA preparation center in Arizona. Paper presented at the American Educational Research Association conference, New Orleans, Louisiana.

Flippo, R. F., & Riccards, M. P. (2000). Initial teacher certification testing in Massachusetts: A case of the tail wagging the dog. Phi Delta Kappan, 82, 34-38.

Haney, W., Fowler, C., & Wheelock, A. (1999). Less truth than error? An independent study of the Massachusetts Teacher Test. Education Policy Analysis Archives, 7(4). Retrieved 17 June 2002 from http://epaa.asu.edu/epaa/v7n4/

Ludlow, L. (2001). Teacher test accountability: Alabama to Massachusetts. Education Policy Analysis Archives, 9(6). Retrieved June 2002 from http://epaa.asu.edu/epaa/v9n6.html

Ludlow, L., & Rosca, C. (2002). MTEL test preparation: Who is doing what and how successful is it? Unpublished manuscript.

Massachusetts Department of Education. (2001). Teacher certification: 603 CMR 7.03: Educator licensure and preparation program approval.

Melnick, S. L., & Pullin, D. (2000). Can you take dictation? Prescribing teacher quality through testing. Journal of Teacher Education, 51(4), 262-275.

National Research Council. (2001). Testing teacher candidates: The role of licensure tests in improving teacher quality. Washington, DC: National Academy Press.

Pressley, D. S. (1998, June 26). Dumb struck: Finneran slams 'idiots' who failed teacher tests. Boston Herald, pp. 1, 28.

Tantraphol, R. (2002, April 11). Teachers' test results improving. Union News. Retrieved April 2002 from http://www.masslive.com/unionnews/index.ssf

United States Department of Education, National Center for Education Statistics. (2000). Reference and reporting guide for preparing state and institutional reports on the quality of teacher preparation: Title II, Higher Education Act (NCES 2000-89). Washington, DC.

About the Authors

Larry H. Ludlow
Dennis Shirley
Camelia Rosca
Boston College
Lynch School of Education
140 Commonwealth Avenue
Campion Hall
Chestnut Hill, MA 02467-3813

Larry Ludlow is Chair and Associate Professor in the Department of Educational Research, Measurement and Evaluation at the Lynch School of Education at Boston College. He teaches courses in research methods, statistics, and psychometrics. His research interests include teacher testing, faculty evaluations, applied psychometrics, and the history of statistics.

Dennis Shirley is Chair and Professor in the Department of Teacher Education, Special Education, and Curriculum and Instruction at the Lynch School of Education at Boston College. His research interests are in the areas of community organizing and school reform, and his most recent book is Valley Interfaith and School Reform: Organizing for Power in South Texas (University of Texas Press, 2002). He teaches classes in the Social Contexts of Education and the History and Politics of Curriculum. He is the Director of the Massachusetts Coalition for Teacher Quality and Student Achievement.

Camelia Rosca is a doctoral candidate in the Lynch School of Education at Boston College. Her interests include teaching, program evaluation, and large-scale assessment.

Copyright 2002 by the Education Policy Analysis Archives

The World Wide Web address for the Education Policy Analysis Archives is epaa.asu.edu

General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, glass@asu.edu, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-2411. The Commentary Editor is Casey D. Cobb: casey.cobb@unh.edu.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
Greg Camilli, Rutgers University
John Covaleskie, Northern Michigan University
Alan Davis, University of Colorado, Denver
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Richard Garlikov, hmwkhelp@scott.net
Thomas F. Green, Syracuse University
Alison I. Griffith, York University
Arlen Gullickson, Western Michigan University
Ernest R. House, University of Colorado
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Ontario Institute of Technology
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Dewayne Matthews, Education Commission of the States
William McInerney, Purdue University
Mary McKeown-Moak, MGT of America (Austin, TX)
Les McLean, University of Toronto
Susan Bobbitt Nolen, University of Washington
Anne L. Pemberton, apembert@pen.k12.va.us
Hugh G. Petrie, SUNY Buffalo
Richard C. Richardson, New York University
Anthony G. Rud Jr., Purdue University
Dennis Sayers, California State University, Stanislaus
Jay D. Scribner, University of Texas at Austin
Michael Scriven, scriven@aol.com
Robert E. Stake, University of Illinois, Urbana-Champaign
Robert Stonehill, U.S. Department of Education
David D. Williams, Brigham Young University

EPAA Spanish Language Editorial Board

Associate Editor for Spanish Language:
Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México, roberto@servidor.unam.mx

Adrián Acosta (México), Universidad de Guadalajara, adrianacosta@compuserve.com
J. Félix Angulo Rasco (Spain), Universidad de Cádiz, felix.angulo@uca.es
Teresa Bracho (México), Centro de Investigación y Docencia Económica-CIDE, bracho@dis1.cide.mx
Alejandro Canales (México), Universidad Nacional Autónoma de México, canalesa@servidor.unam.mx
Ursula Casanova (U.S.A.), Arizona State University, casanova@asu.edu
José Contreras Domingo, Universitat de Barcelona, Jose.Contreras@doe.d5.ub.es
Erwin Epstein (U.S.A.), Loyola University of Chicago, Eepstein@luc.edu
Josué González (U.S.A.), Arizona State University, josue@asu.edu
Rollin Kent (México), Departamento de Investigación Educativa-DIE/CINVESTAV, rkent@gemtel.com.mx, kentr@data.net.mx
María Beatriz Luce (Brazil), Universidad Federal de Rio Grande do Sul-UFRGS, lucemb@orion.ufrgs.br
Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México, javiermr@servidor.unam.mx
Marcela Mollis (Argentina), Universidad de Buenos Aires, mmollis@filo.uba.ar
Humberto Muñoz García (México), Universidad Nacional Autónoma de México, humberto@servidor.unam.mx
Ángel Ignacio Pérez Gómez (Spain), Universidad de Málaga, aiperez@uma.es
Daniel Schugurensky (Argentina-Canadá), OISE/UT, Canada, dschugurensky@oise.utoronto.ca
Simon Schwartzman (Brazil), American Institutes for Research–Brazil (AIRBrasil), simon@airbrasil.org.br
Jurjo Torres Santomé (Spain), Universidad de A Coruña, jurjo@udc.es
Carlos Alberto Torres (U.S.A.), University of California, Los Angeles, torres@gseisucla.edu

