Education Policy Analysis Archives

Volume 8 Number 12    February 21, 2000    ISSN 1068-2341

A peer-reviewed scholarly electronic journal. Editor: Gene V Glass, College of Education, Arizona State University. Copyright 2000, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Top-Down, Routinized Reform in Low-Income, Rural Schools: NSF's Appalachian Rural Systemic Initiative

Robert Bickel
Terry Tomasek
Teresa Hardman Eagle
Marshall University

This article has Commentary.

Abstract

Since 1991, the National Science Foundation has funded fifty-nine state, urban, and rural systemic initiatives. The purpose of the initiatives is to promote achievement in math, science, and technology among all students, and to encourage schools and communities to secure the resources needed to maintain such outcomes. The Appalachian Rural Systemic Initiative (ARSI) is a six-state consortium which focuses these efforts on low-income, rural schools. The primary means of accomplishing ARSI's aims is a one-day, one-school site visit, called a Program Improvement Review, done by an ARSI math or science expert. The centrally important Program Improvement Reviews, however, seem to be premised on unsubstantiated assumptions as to the static,
easy-to-understand, easy-to-evaluate nature of educational achievement in rural Appalachian schools. As a result, the Reviews resemble exercises in early-twentieth-century scientific management, and are unlikely to enhance achievement in science or math. Consequently, even if there is merit to the commonsense human capital approach to economic growth and development on which systemic initiatives are tacitly premised, this first-person account makes a case that desired payoffs are unlikely to follow from the work of ARSI.

Efforts to promote economic development and eliminate poverty through investment in public education have a long history in the U.S. (see, for example, Bowles and Gintis, 1976; Kaestle, 1983; Perkinson, 1995; Spring, 1997; McMurrer and Sawhill, 1998). In recent years, such efforts have included special attention to elementary and secondary schooling in science, math, and technology (Ashton and Sung, 1997; Senate Committee on Labor and Human Resources, 1997). This emphasis is premised on the assumption that in an increasingly science-based, technology-intensive world, the economic well-being, perhaps even the simple survival, of individuals and entire societies requires ever-higher levels of pure and applied scientific and mathematical knowledge (Shapiro and Varian, 1998; National Council of Teachers of Mathematics, 1998; Reich, 1992).

The National Science Foundation's Systemic Initiatives

In line with this straightforward human capital theoretic point of view, since 1991 the National Science Foundation has funded fifty-nine state, urban, and rural systemic initiatives (National Science Foundation, 1999). The purpose of each systemic initiative is to promote education in math, science, and technology (National Science Foundation, 1994a). Published research on the initiatives is hard to find, and evaluation reports are not available. The origin of the term "systemic initiative" remains unclear.
NSF's recent request for proposals for "systemic initiative research" provides no insight as to the meaning of the concept (NSF, 1998). The terminology may follow, however, from NSF's judgment that education involves entire communities (Shields, 1997). At its best, in this view, education in math and science focuses on everyday applications in the communities where schools are located (National Science Foundation, 1994b). The communities themselves, in a reciprocal process, benefit from the development of a technologically literate workforce (Consortium for Policy Research in Education, 1995).

NSF's Appalachian Rural Systemic Initiative (ARSI)

The Appalachian Rural Systemic Initiative, or ARSI, is a six-state consortium covering all of the Appalachian region of the U.S. (Harmon and Blanton, 1997). Consistent with NSF's intent, ARSI's ambitious objective is to facilitate educational change in economically disadvantaged rural schools, resulting in high achievement for all students in mathematics and science (National Science Foundation, 1997). This is to be accompanied by the development of community resources to sustain educational improvements (Brown, 1996).
Program Improvement Review

The primary means of accomplishing ARSI's aims is the Program Improvement Review. A Review, done by ARSI experts who are typically retired teachers, is meant to identify strengths and weaknesses in schools' math and science programs and to make recommendations for improvement. ARSI experts, thereby, are charged with helping low-income, rural schools make students more productively employable in a science-based, technology-intensive world. In doing this, ARSI experts aim to contribute to the production of the human capital needed for the economic and social development of low-income rural areas.

Will ARSI Promote Economic Development in Appalachia?

The uncomplicated human capital perspective on which ARSI is premised raises an important policy question. Specifically, can educational reform be used to drive a growth and development strategy whereby the availability of well-educated prospective employees attracts employment-creating investments? A tenable alternative holds that economic development is a prerequisite for effective educational change (Bickel and Spatig, 1999). For present purposes, however, we will put this reservation aside and address a more manageable question: If the education-and-development assumptions on which ARSI is premised were undeniably correct, would ARSI accomplish its objectives?

A First-Person Account

The following account is written from the vantage point of one who was first an ARSI expert-aspirant, then an ARSI expert writing his first Program Improvement Review, and finally an ARSI dropout. The descriptions of "shadowing," of neutral-site instruction, of report preparation, and of rejection of the ARSI model are based on work done as part of the process of bringing ARSI to West Virginia under the auspices of the regional university with which the paper's authors are affiliated.
Participation in this endeavor leads to the following inferences:

- ARSI experts construe the process of educational achievement as a thoroughly understood, relatively simple mechanism manifest in static indicators of school effectiveness.
- In consequence, ARSI has standardized and accelerated its centrally important Program Improvement Review process through excessively routinized observation based on short-cut procedures and unvalidated instruments.
- ARSI experts show no interest in substantiating their evaluation criteria, but nevertheless take them for granted as embodying the one right way to teach math and science anywhere.
- Student engagement and student-teacher interaction are irrelevant to ARSI evaluations. Departures from ARSI criteria, even in the presence of overwhelmingly favorable student responses, are negatively sanctioned.
The remainder of this article is devoted to clarifying these inferences through a first-hand account of ARSI at work. Throughout, one important message seems clear: ARSI's Program Improvement Reviews in low-income, rural schools are unlikely to enhance science and math achievement or to promote economic growth and development. We attribute this unfortunate set of circumstances to specious assumptions as to the existence of a taken-for-granted, science-based rationale for the top-down routinization and streamlining of educational evaluation and practice. As a result, even if the commonsense human capital framework on which systemic initiatives are based were valid, ARSI's work would not facilitate its application.

A Checklist-Guided Audit

The Program Improvement Review takes the form of a one-day, one-expert school visit, yielding a checklist-guided audit that results in degree-of-compliance scores ranging from 1 to 5 on approximately seventy Likert items. The checklist is called a "Consistency Rating Summary." For example, when evaluating a math program the first general heading is "Curriculum," subsuming ten Likert items, the first being "1.1 The math curriculum is written and used in planning the instructional program." The remaining general headings are "Instruction," "Thinking Processes," "Equity and Diversity," "School Climate," "Relevance" or "Connections," "Training and Development," and "Financial and Material Resources."

The total number of items varies slightly depending on the discipline, math or science, the grade level, and the state which provides the educational policy setting for the review. Minor variations in the wording of the general headings and individual items are geared to these same factors. For example, under "Instruction," the first item used in evaluating math programs in West Virginia elementary schools reads as follows: "2.1 Teachers use WV IGO's to guide their instructional practices."
"WV IGO's" refers to state-mandated "Instructional Goals and Objectives," around which high-profile state achievement tests are organized.

Likert item scores are used to gauge specific strengths and weaknesses in a school's math or science program. Strengths reflect consistency with the ARSI model embedded in the "Consistency Rating Summary." Weaknesses reflect departures from the model. In practice, far more attention is given to weaknesses than to strengths.

In spite of the importance of the Consistency Rating Summary, the source of its ten headings and seventy items is not identified. Are they research-based? Are they reasonable inferences based on years of teaching experience? Are they established principles in math and science education? Is their appeal based on face validity among ARSI experts? Do they represent an identifiable educational philosophy or pedagogical model? Participants are not told. Literature is nowhere to be found.

NSF Standards

NSF has promulgated a detailed set of National Science Education Standards (National Research Council, 1996). In the course of conversation and training with ARSI experts, however, these are never mentioned. If the experts are aware of the NSF Standards, they do not disclose this. If the NSF Standards are a source for the Consistency Rating Summary, participants are not told. The absence of descriptive, evaluative, or any other sort of literature concerning the Summary is again conspicuous. ARSI experts occasionally make off-handed references to "constructivism," and they are fond of
invoking the notion of "hands-on." One might reasonably surmise, therefore, that these ideas, though they typically remain vague, informed the construction of the Consistency Rating Summary and the way it is scored. In the absence of pertinent literature, however, this remains merely plausible conjecture.

State Mandates

ARSI experts often refer to state mandates, such as West Virginia's Instructional Goals and Objectives, mentioned above, and the Kentucky Core Content for Assessment. Whatever the merit of these state-level mandates, their substance appears to have been another influence on the construction of the Consistency Rating Summary, and it affects the way the instrument is applied. The heading emblazoned at the top of the Consistency Rating Summary may vary with the state in which it is being used, as in the "KERActeristics of a Good Mathematics Program" used in Kentucky, or the "West Virginia Program Improvement Review Consistency Rating Summary for Mathematics."

Beyond these tentative inferences, however, no rationale for the instrument is provided. One is left with the impression that the Consistency Rating Summary may very well have been the product of brainstorming sessions. The outcome is an instrument which appears to be vaguely current and topically correct, but which, as an evaluation tool, is of uncertain value.

Consistency Rating Summary Validation

Similarly, the technical properties of the Consistency Rating Summary as a measurement tool are not reported, and may not have been investigated. Given the organization of the instrument into ten sections, each subsuming six to ten items, one might reasonably surmise that a factor analysis would reveal ten identifiable subscales. If this is the case, however, results are not available. The same is true for routine reliability coefficients. In short, the psychometric properties of the instrument seem not to be known.
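Such a check is not onerous. By way of illustration, a routine reliability coefficient such as Cronbach's alpha takes only a few lines to compute. The sketch below uses invented 1-to-5 scores for a hypothetical ten-item subscale, since no actual Consistency Rating Summary data are available; it shows only what such a check would involve, not ARSI's instrument.

```python
# Illustrative only: the scores below are invented, not ARSI data.

def variance(xs):
    """Sample variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """Cronbach's alpha; rows holds one list of item scores per completed review."""
    k = len(rows[0])                                   # number of items
    item_vars = sum(variance(list(col)) for col in zip(*rows))
    total_var = variance([sum(r) for r in rows])       # variance of review totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 1-to-5 ratings on a ten-item subscale from five reviews.
reviews = [
    [4, 4, 3, 5, 4, 4, 3, 4, 5, 4],
    [2, 3, 2, 2, 3, 2, 2, 3, 2, 2],
    [5, 4, 5, 5, 4, 5, 5, 4, 5, 5],
    [3, 3, 3, 2, 3, 3, 3, 3, 2, 3],
    [4, 5, 4, 4, 5, 4, 4, 5, 4, 4],
]
print(round(cronbach_alpha(reviews), 2))  # prints 0.97
```

A developer who ran even this much, on real review data, could at least report whether the ten putative subscales hang together.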
The possibility that discussion of such properties might be pertinent, even essential, is not acknowledged by ARSI experts.

Reporting on a Program Improvement Review

The final report, usually written overnight and presented the next day, is organized around the same ten general headings and seventy Likert items. Since much more attention is given to weaknesses than to strengths, most reports do not address all general headings or all items, but only those deemed deficient.

Recommendations for change appear throughout the report. A recommendation pertaining to "Relevance," meaning "[relating] mathematical knowledge to students' goals and interests," for a middle school located in a low-income, rural district in West Virginia's southern coal fields reads as follows:

"Make a concerted effort to display positive, engaging images of mathematics throughout the school environment, paying particular attention to highlighting student work that is creative (not just correct) . . ." [Emphasis in the original.]

Becoming an ARSI Expert
Training in doing the Program Improvement Review, including scoring the Consistency Rating Summary, usually begins with "shadowing," accompanying an ARSI math or science expert who is doing a Review. ARSI experts also provide training at neutral sites, relying heavily on videos prepared to meet their specific instructional needs. Limited role-playing is used as a means of readying prospective experts to present Program Improvement Review findings to school personnel.

Training is informal, with little or no direct instruction. Instead, the ARSI experts serve as models during shadowing, and provide illustrative opportunities to apply the ARSI model during training sessions. Total training time varies, usually ranging from two to three days. An experienced ARSI expert may also participate in the first Program Improvement Review done by a just-trained expert.

"Shadowing" in Chemistry 8-B: Deficient Instruction

To illustrate our claim that ARSI Program Improvement Reviews are unlikely to enhance achievement, we begin with a brief case study of shadowing. Two ARSI expert-aspirants, assisting in bringing ARSI to West Virginia under the auspices of the university which employs them, are observing the in-school work of an ARSI science expert at a small, rural, low-income elementary school in eastern Kentucky.

We first attend a chemistry class. The three of us open the front door to the classroom without knocking, walk to the rear without speaking, and sit in side-by-side desks, while the class goes on around us. Students seem uninterested in our intrusion. The teacher seems unconcerned, and she makes no effort to acknowledge our presence. Even though this elementary school goes through grade 8, chemistry, rather than, say, general science, seems out of place, too advanced for an elementary school. The class, moreover, is referred to as Chemistry 8-B.
This, we learn, means that chemistry students are grouped or tracked, with the ostensibly more capable students located in section 8-A. Nevertheless, the approximately twenty-five students in section 8-B seem quite capable themselves. The teacher is reviewing chemical bonding, referring to positive and negative valences, what they mean with regard to the make-up of individual atoms, and how they govern the way different elements combine to form molecules. She makes occasional reference to a periodic table displayed within easy reach on the wall near the front of the room.

Desks are organized in traditional fashion, arranged in rows, all facing forward. The teacher's desk is at the front of the room in the middle, turned toward the students. The teacher stands slightly to the left of her desk facing the students, occasionally turning to the board or, less often, to the periodic table. The presentation, too, is traditional, relying largely on lecture and board work, with questions from the teacher and responses from students. The teacher speaks fairly rapidly. The substance of the class is in no sense trivialized to match the ostensibly limited capabilities of lower-track students.

The material covered is high school chemistry, much as I remember it from the eleventh grade. The teacher, though, seems smarter and more articulate, explaining things more clearly than I remember mine doing decades ago. Her high expectations for students are genuinely taken for granted. None of the students stands out as a stellar performer or favorite. The teacher's high expectations seem to apply equally to everyone.

The truly remarkable thing about the class is the students' responses. All white, about half male and half female, they seem genuinely engaged. They attend single-mindedly to the teacher's presentation. The students, manifestly, are putting all their time on task. Not just any task, but the conceptually difficult, even esoteric task at hand.
The teacher asks questions fairly often. Answers are quickly forthcoming, spoken thoughtfully, usually confidently, without the formality of hand-raising. Students' questions are immediately acknowledged and answered in a business-like, though not unsympathetic, fashion. The teacher, a woman of about thirty who obviously enjoys what she is doing, tries various means of explaining the same difficult ideas, sometimes complementing her oral presentation with additional board work.

Students don't talk among themselves. Two girls on the teacher's right near the front of the room are an exception, but as they whisper, they look toward the chalkboard, and one points to a diagram that the teacher had drawn earlier, illustrating the bonding of sodium and chlorine. A male student near the rear of the room on the teacher's left has a persistent problem understanding her explanation of positive and negative valences. He makes his difficulty conversationally evident:

"Yeah, but I still don't get it. The signs are the opposite . . ."

He makes his point, in the same conversational fashion, more than once:

"I still don't get it. Why isn't it negative . . .?"

The teacher explains again, varying her choice of words. She gives no evidence of impatience. She addresses the questioning student in a matter-of-fact, even collegial fashion. She moves on, still holding students' attention, and doing so effortlessly. She presents material with relaxed enthusiasm born of genuine interest. There is no exaggerated affect or undue dramatization as she continues with a traditional presentation of conceptually sophisticated material.

The puzzled student on the teacher's left remains confused about positive and negative valences, though the precise nature of his misunderstanding is still not quite clear. He remains engaged, however, and raises the issue yet again, without evidence of embarrassment or anxiety. The teacher stops and thinks, looks at her diagrams on the board, seems not to know what else to say.
A male student sitting to the immediate left of his confused colleague responds spontaneously and matter-of-factly: "I think I see . . . try this." I cannot hear what is said. After a brief exchange between the two students, the puzzled one addresses the teacher:

"If sodium is short an electron and it adds one, why isn't it negative?"

Implied in this question is a complementary query about chlorine: if chlorine has an extra electron and it gives one to sodium, why isn't chlorine positive? The source of the student's confusion is now clear: he has the direction of the transfer, and with it the sign convention, turned around. The +1 valence of sodium describes its free state, before it combines with chlorine to form table salt. In the bond it is sodium that gives up the single electron in its outer shell; shedding a negatively charged particle is what leaves sodium positive, not negative. And conversely with chlorine: one electron short of a full outer shell, it takes sodium's electron, and gaining a negatively charged particle is what makes it negative.

The nature of the difficulty having finally been clarified, the teacher is able to dispel the formerly puzzled student's misunderstanding. He is satisfied. The teacher and students continue in the same matter-of-fact but engaged manner which has prevailed from the beginning. One way to usefully characterize their approach, and the nature of the affect which accompanies it, might very well be "professional."
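For readers as far from high school chemistry as some of the observers, the exchange the class was working through can be written out in standard notation: sodium sheds its outer electron and is left positive, chlorine absorbs it and is left negative, and the two ions bind as table salt.

```latex
\mathrm{Na} \rightarrow \mathrm{Na}^{+} + e^{-}, \qquad
\mathrm{Cl} + e^{-} \rightarrow \mathrm{Cl}^{-}, \qquad
\mathrm{Na}^{+} + \mathrm{Cl}^{-} \rightarrow \mathrm{NaCl}
```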
As an observer, I was stunned. How did the teacher manage to hold the attention and active interest of this B-level (or any level) eighth-grade (or any grade) chemistry class (or any class) throughout an entire period in which she discussed, in traditional lecture form, chemical bonding? In a low-income, rural, K-8 elementary school in eastern Kentucky or anywhere else! Here, as best I could determine, was science being taught and learned about as well as it could be done. Since the aim of ARSI is to promote high achievement in math and science in low-income, rural schools, this, perhaps, was a model, though one that might prove difficult to codify.

NSF's National Science Education Standards, which may or may not be known to ARSI experts, are intended to enable educators to judge whether particular actions will serve the vision of a scientifically literate society (National Research Council, 1996). The actions of this teacher and her students emphatically did just this. Or so it seemed to me.

An Off-the-Cuff Evaluation of Chemistry 8-B

At the behest of the ARSI science expert, the three of us who had been observing left before the class was over. We had been in Chemistry 8-B for about twenty-five minutes. Going out the door at the front of the room, I said to the teacher:

"We're leavin' 'cause we can't understand this stuff."

The teacher stopped in mid-lecture, looked at me while I was speaking, and an expression of uncertainty left her face as she smiled. She gestured toward her students and said with confident pride:

"They can understand it!"

"I can see that!" I replied, as I joined the other two observers in the hall.

As the three of us walked to the next class, the ARSI science expert, striding purposefully, leading the way, offered the following judgments:

"They didn't understand a word she said." His tone was contemptuous.

"She was way, way over their heads."

"There was nothing to hold their interest, no manipulatives or anything."
"The walls were just about bare. Not much about science on them . . . nothing at all about science careers."

"She was traditional lecture the whole time. All content."

The other observer was non-committal, as if taking in what was being said but still processing it, neither concurring nor disagreeing.

There was a brief silence as we walked. Then I said, laughing, "For what it's worth, she teaches just like I do, when I'm having a really good day." Neither the ARSI science expert nor the other observer acknowledged my comment. The ARSI expert led us into the next classroom. I felt sort of silly. Not because no one had acknowledged my response, but because I had felt the need to cover it with self-deprecating laughter.

It was clear that the ARSI expert had definite preconceptions as to what eighth graders could and could not handle. His conclusion that the students in 8-B chemistry
had no idea what the teacher was talking about seemed wildly at odds with what I had seen and heard in the classroom. Even the puzzled student eventually understood, and he did so with the help of another student. His confusion, moreover, bespoke an understandable, even imaginative, failure to see the specific terminological conventions which were being employed. In a real sense, his confusion about terminological conventions actually reflected a clear understanding of the chemical bonding process itself. The teacher's method of presenting the material was traditional, to be sure. The students participated freely, however, without fear and without required hand-raising. The teacher-student and student-student exchanges were conversational and matter-of-factly animated. Students helped each other.

The Irrelevance of Students

For the ARSI expert, however, this relaxed, informal, traditional approach was inevitably ineffective. It was abundantly clear that the living presence of students in the classroom was not essential to his judgments. He seemed not to notice them, their engagement, or the informed nature of their exchanges with the teacher and with each other. The expert attended only to the teacher, her traditionally limited use of a few instructional materials, and the dearth of wall posters.

One Best Way to Teach Science

The ARSI expert clearly judged himself to be in a position to evaluate any science teacher's performance without benefit of observing or otherwise evaluating student responses, to which he seemed oblivious. In this instance, he purported to know in a matter of minutes that the teacher was clueless, and that students would not learn. Traditional lecture was bad. Absence of manipulatives was worse.

"You can use them to build molecules," he assured us. "That's what she was trying to do, but it's something you have to get your hands on.
There weren't even any [manipulatives] in the room."

Thin Description

The ARSI science expert's dismissive, almost angry assessment of the teacher's effectiveness bespoke a willingness to generalize from very limited information. His assumption, clearly, was that twenty-five minutes of haphazardly selected, barged-in-on class time enabled him to produce an accurate typification of the teacher's performance and students' consequent achievement.

His harsh judgments, moreover, seemed inconsistent with NSF's position that science teaching and inquiry can be effectively done in a variety of ways (National Research Council, 1996; also see National Science Teachers Association, 1998). But once again, the connection between NSF and ARSI may or may not entail a shared understanding about teaching science and math. NSF standards may or may not be known to ARSI experts. In any case, the experts do not mention them.

"Shadowing" in a Program Improvement Review Presentation
Three weeks after the visit to the eastern Kentucky K-8 elementary school, I was again involved as an observer. I was paired with the same ARSI expert-aspirant, shadowing another ARSI expert in another small, low-income, rural elementary school in eastern Kentucky. During an hour-long, late-morning meeting, the ARSI expert presented his previous day's findings to the school's principal and six teachers. The ARSI expert began with a weak, almost apologetic grin:

"This isn't as bad as it looks. There are a lot of 1's, 2's, and 3's, but this can be fixed . . . a lot of it . . ." [His voice trailed off.]

The criteria used in selecting the six teachers present at the meeting were not specified. They and the principal, however, remained silent as the ARSI expert went over his largely unfavorable report.

"There's no evidence of the importance of math. They come away thinking it's just what they do in school."

"They don't create their own knowledge. There is a lot of mainly lecture in the classroom."

"If you used field trips, they would be able to see math all around us."

"They don't see its importance for careers, and that it's rewarded."

The principal, in spite of the beating her school was taking, looked confident and even eager throughout, as if to say, "We're professional educators sharing information. There's nothing personal about this. We're glad to hear from outside experts, and we'll benefit from it. Please go on." The teachers seemed affectively disengaged but dutifully attentive. They betrayed no emotion. They seemed to neither accept nor reject the ARSI expert's account.

Teachers' Informal Challenge

After the report was presented, with only a few words of perfunctory discussion, we went to lunch in the school's cafeteria. By chance, I stood in the serving line with two of the teachers who had attended the meeting. Female, white, in their late forties to mid-fifties, the teachers pleasantly initiated a conversation by asking where I was from.
We talked briefly about West Virginia and work I had done in a rural county there. I likened that to what was being done by ARSI in their school. This was followed by a brief what-do-we-say-now sort of silence. By way of keeping the conversation going, I added that the West Virginia project had been a long one. One of the teachers asked how long. I replied that it had gone on for three years, relying heavily on repeated focus groups with a broad range of stakeholders, and on literally hundreds of visits to the three schools involved.

The teachers became more animated and emphatic. Speaking of the ARSI expert's report of instructional omissions and other deficiencies, they commented:

"We do a lot of that stuff, but we don't do it all the time. He was only here for one day, for a few hours . . ."

"He never came to my room. How could he know what we do?"
"I never even knew he was here."

Clearly, in this low-income, rural elementary school in eastern Kentucky, teachers were challenging the assumption that ARSI experts' one-day site visits enable them to understand a school's math or science instruction. This assumption, nevertheless, tacitly undergirds all ARSI Program Improvement Reviews.

In retrospect, it seems obvious that I invited this challenge from the two teachers with my mention of a three-year project in West Virginia. At the time, however, I was just awkwardly trying to hold up my end of a conversation. Moreover, the teachers' responses seemed genuine, something they were waiting for a chance to say. Perhaps I had given them a deserved rhetorical opportunity, rather than a naked invitation to engage in a defensive, self-serving polemic.

Training in Fixing Deficiencies: Conservation of Momentum

In addition to shadowing, the training of ARSI expert-aspirants includes neutral-site instruction offered by ARSI experts. As an example, ten ARSI expert-aspirants and a handful of interested onlookers met with an ARSI math and science expert at the small-city headquarters of a West Virginia regional education agency. It was the ARSI expert's aim to continue with the introduction of expert-aspirants to the ARSI approach to evaluating education in science and math.

ARSI Training Videos

A retired teacher, the expert relied largely on a series of videos intended to provide opportunities to illustrate the ARSI ethos in use. During one of the longer and more purposeful videos, a white female teacher in her late twenties is seen reviewing the concept of "conservation of momentum" with her high school physics class. There are approximately twenty students, all of them white, about evenly divided between males and females. Is this a functioning classroom, or something staged by ARSI to aid in the production of new ARSI science and math experts? We are not told, and no one asks.
The students in the video are more or less attentive. The teacher's presentation is brief and seems to lack focus, perhaps because the video begins near the end of her explanation, immediately following an exercise with manipulatives. Oddly, there is no teacher's introduction to the video itself. It just starts. Whether or not this is an ARSI-staged video, the absence of an introduction, an explanation to students as to why the video is being shown, is disconcerting. After all, we are supposed to be engaged in the evaluation of instruction. Maybe the ARSI expert will use the teacher's failure to introduce her video as a painfully obvious illustration of the wrong way to do things, such as using audio-visual aids in explaining conservation of momentum.

The video is devoted entirely to cars crashing. Cars crashing into each other, cars crashing into telephone poles, cars careening off guard rails and rolling onto their roofs, cars going off the road and landing in ditches . . . It is reminiscent of a demolition derby, but without a winner. It is not immediately evident to me that the video actually does illustrate conservation of momentum. The ARSI expert says nothing. The only sounds in our room, as in the classroom on the video, are made by crashing cars.

As we watch the students watching the video, they seem, for the most part, unmoved. The camera catches two male students sitting together laughing at one seemingly unexceptional collision. The crashes, presumably, were staged. All the cars
are from the middle and late 70's. The video is repetitious and seems too long; there is no narrative, just wreck after wreck, one looking more or less like another.

Finally it occurs to me that conservation of momentum, as best I can remember from twelfth-grade physics, is manifest in the cars' tendency to continue moving even after they run into something solid. Though this recollection, in retrospect, seems embarrassingly obvious, is it safe to assume that the students on the video made the same inference? After all, their teacher, much as our ARSI expert, provided no commentary. Is this an example of constructivism, of students constructing their own physical knowledge? When the conservation of momentum video is over, our video is over, too. If there was an in-class discussion of what the students had just seen, we didn't get to hear it. Employment of the video seems part of a badly disjointed instructional process.

Perhaps the point of all this has been self-evident to the other ARSI expert-aspirants. I am, however, surrounded by nine other adults, all involved in education in one way or another, some with backgrounds in science, but more from administration or higher education. I wonder how many know what conservation of momentum means. Even now, I'm not sure that I do. For all I know, my aforementioned recollection from twelfth-grade physics was in error. After all, I may have confused conservation of momentum with "objects in motion tend to stay in motion . . .", or something like that.

I wonder how many of the others see the pertinence of a video of serial collisions to understanding conservation of momentum. Were they able to recall or construct their own physical knowledge? Or is this video as bad an instructional tool as it seems to be?

The ARSI expert has very little to say about the serial-collision video. For a moment, he seems at a loss. He passes up the opportunity to fault the teacher for not providing an introduction.
He says nothing about the absence of a debriefing. Then, belatedly, he calls our attention to the fact that two male students had laughed: "You could see their interest. They weren't just being passive."

The expert says nothing more about the video. He has concluded, as far as I can tell, that it demonstrated students' engagement in the process of acquiring a clearer, deeper understanding of "conservation of momentum." Perhaps we really have seen the construction of physical knowledge. My colleagues and I are silent. In truth, the serial-collision video seemed like a silly caricature of instruction with audio-visual aids: how to misuse them rather than use them. But the ARSI expert gives no evidence of sharing this view.

Training in Fixing Deficiencies: Getting "Down and Dirty"

In another instructional video, a white female teacher in her early thirties is standing in front of a class of elementary school students. We are not told the grade, but the children appear to be eight or nine years old. Once again, all the students are white. The classroom is organized in traditional fashion, with individual desks in rows and the teacher standing at the front of the room, her back to the chalkboard. The teacher has said only a few words, and the point of her class has not yet become evident, when the ARSI expert interrupts while the video continues to run. He speaks emphatically and with excitement: "Look at her! Look at her clothes! She prepared for this!"
In truth, I saw nothing distinctive about the teacher's clothing or appearance. She was dressed modestly, wearing an open jacket with lapels, a white blouse which buttoned at the neck, a just-below-the-knees skirt, and shoes with medium heels. Her clothing was well-suited to working as, say, a bank teller, a receptionist in a family dentist's office, or a casework supervisor in a state social welfare agency. Her hair was cut short, but not extremely so. It was neatly combed, but not stylishly done. She wore makeup, but there was nothing ostentatious or extraordinary about it. She looked like the girl next door, grown up and working for a modest living.

But the ARSI expert did not see it that way. The fact that the teacher was presentable counted against her: "She can't get down-and-dirty dressed like that." "She didn't come to work."

These observations, coupled with his surmise that the teacher had come prepared to appear on a video, seemed to imbue the ARSI expert with a sense of discovery. His response to the video suggested that, perhaps, he had not seen it before. He was looking for something instructive, and quickly found it in the teacher's appearance, which still seemed unexceptional.

He judged a teacher's work as inevitably involving getting "down-and-dirty." Suitable clothes, I concluded, would have been faded jeans, a sweatshirt with holes worn in the elbows, and grass-stained tennis shoes. Why suitable attire for an elementary school teacher should take this form remained a mystery to me, just as the nature of "getting down and dirty" and why it was a pedagogical essential remained unexplained.

None of the prospective experts spoke. I saw two give obligatory grins at the "she can't get down-and-dirty dressed like that" judgment. Otherwise, the group was impenetrably difficult to read. Was the lesson clear? Did participants accept it? Did anyone find this informative?
Did the ARSI expert know that the NSF National Science Education Standards do not include a dress code? Was he aware that teachers' attire is often an issue in rural Appalachian schools because they sometimes dress too informally (Austin, 2000)? Is this what it means to become an ARSI expert?

ARSI in West Virginia

ARSI's first Program Improvement Review in West Virginia was done in mid-March of 1999. This was also the first time I worked as an ARSI expert. The same was true of my shadowing partner, who was serving as coordinator of our three-school review. Though newly-minted as an ARSI expert, he had long experience in grant writing, program development, and administration of ground-up educational change efforts. Early in his career, he had taught high school science.

Adaptation or Adoption

This Review, moreover, was to be different from those we had seen in Kentucky. It involved three schools rather than one. The schools, an elementary school, a middle school, and a high school, are in close geographical proximity to each other, situated in a low-income, rural district in the state's southern coal fields.

In addition, while the ARSI Program Improvement Review was being used as a point of departure, it was not a governing model. The Consistency Rating Summary, replete with Likert items, was still there, but as only one source of information in
preparation of a report which was to be tentative, formative, and qualitative.

Rather than one-expert school visits, as in Kentucky, there were four evaluators for each school. Most members of each team were newly-minted ARSI experts, who also had training and experience in a variety of pertinent disciplines, including assessment, math education, program evaluation, and administration.

Recommendations for improvement were to be made only after discussing the final report with a variety of local stakeholders from the three schools. Stakeholders would participate in the process of actually producing the recommendations.

Synthesizing a Final Report

My task was to synthesize a final report. The Consistency Rating Summary would have left little to synthesize, but its place was not central in West Virginia, as it had been in Kentucky. The materials for synthesis were submitted in manila folders, eleven-by-seventeen envelopes, three-ring notebooks, translucent zip-lock packets, and paper-clipped pages. Consistency Rating Summaries prepared by ARSI experts were included. The Summaries, however, were mixed in with field notes, handwritten reminders, and miscellaneous jottings on single sheets of paper. In addition, each teacher at each school had completed a Consistency Rating Summary, and these, too, had been included.

Some ARSI experts' Summaries had conspicuous marginal notes and some did not. Summaries for the same school included and excluded different headings and items. Some included experts' names and some did not. Some had a formal, finished appearance, while others looked like preliminary worksheets. In spite of our plan to make production of recommendations a collaborative effort with stakeholders, a few Summaries included recommendations.
All told, however, the material did not resemble the output of the sort of mechanically routinized process of thin description we had seen in Kentucky.

A Formative Systemic Report

Since our Review involved three schools, a systemic report seemed in order. Furthermore, even though the schools were at three different levels, dramatic cross-school commonalities in traditional educational philosophy and old-fashioned, no-nonsense practice made a single report seem fitting. The flexibly formative nature of the process was emphasized in the report's opening paragraphs under the heading "Informed Interpretation from Multiple Perspectives":

"A good deal of what we have to say, moreover, is subject to good-faith interpretation and reinterpretation by stakeholders . . ."

Similarly, use of the Consistency Rating Summary was placed in context, subsumed by "Judicious use of a Quantitative Rating Summary":

". . . but one source of information for making formative judgments. Its . . . scores . . . merely summarize some of the information used in making our essentially qualitative judgments."

A First Draft
The report characterized the math program in each of the schools as traditional, and noted that all adult stakeholders, teachers, administrators, and parents, preferred it that way. Parents were unaware of alternatives. Even some of the teachers were unfamiliar with current terminology and practice. When a newly-minted ARSI expert used the term "rubric," an elementary teacher asked what rubric meant.

The schools were autonomous to a fault. Though constituting a rudimentary feeder system, teachers and administrators had no cross-school contact. Insofar as their math curricula were cumulatively compatible, it was due to state and district requirements, and adherence to the same traditional ethos and practices.

The report went on for twenty-seven double-spaced pages, addressing topics such as "Avoidance of Innovation," "Cautious Selectivity," "Exclusion of Exploration," "Innovations Come and Go," "Traditional Parental Roles," "School-to-School Isolation," and "Staff Development and Teacher Traditionalism." The concluding sections reemphasized the importance of understanding the report as interpretative and subject to legitimate challenge by stakeholders. Readers were reminded that formulation of recommendations was to be a collaborative effort.

"Their Nickel"

When I gave this determinedly formative report to my former shadowing partner, still coordinating this first West Virginia Review, his response took me by surprise. Noting the absence of a "Consistency Rating Summary," he said, "it's their nickel." In short, whatever liberties we took with the ARSI model, this remained an ARSI endeavor. ARSI was establishing itself in West Virginia under the institutional auspices of our regional university, and some ARSI expectations had to be met. In response, I used the diverse, unstandardized information which had been submitted, and tried to synthesize a set of defensible Likert-item scores for the three-school system.
Having attached this to the narrative, I thought the job was done. The coordinator agreed. He submitted a copy to West Virginia's first ARSI Collaborative Director, and scheduled a meeting.

Meeting with ARSI Officials

The meeting with the ARSI Collaborative Director and an associate began amicably. They had read the report, and they listened with what appeared to be friendly interest as we explained our plans to meet with stakeholders to collaboratively produce recommendations for change. I characterized the approach to Program Improvement Reviews in Kentucky as "take-it-or-leave-it," "expert-centered," "prematurely codified," "top-down," and "quick-and-dirty." The evolving West Virginia approach, by sharp contrast, was "flexible," "client-centered," "qualitatively formative," and "collaborative."

The Director responded by noting that there was only one Consistency Rating Summary for three schools. I replied:

"Right. Like we said in the report, we took a systemic approach. It made sense, especially since the schools are so much alike."

The Director responded that there were no recommendations. I referred again to the report, noting that the recommendations were to be produced collaboratively with school-level stakeholders. The Director, still smiling, shook her head. She said:
16 of 22"The reports are standard. We need Summary scores f or each school, and recommendations for each." I responded that I had seen take-it-or-leav e-it reports, loaded with misguided Likert-item claims to precision, done all too quick ly during shadowing. They were the sorts of reports, I added, that later sat on shelve s gathering dust, because stakeholders were not involved in their production. The Director replied: "I'm sorry if that was your experience."She looked at her assistant and asked:"Is that the way you saw it when you made visits?" The assistant shook her head and murmured u nintelligibly. I returned to my characterization of what I had seen in Kentucky, in cluding again "take-it-or-leave-it," "prematurely codified," and "quick-and-dirty." The Director responded: "But that's just your opinion."I snapped angrily:"Of course! What else would it be?" My shadowing partner intervened. He asserte d that he had not expected to do Program Improvement Reviews exactly as they were do ne in Kentucky. He was especially concerned about formulating recommendati ons without collaboration with local stakeholders. "They need to be involved in this process. They nee d a sense of ownership. Otherwise, the report will never be implemented." The ARSI Collaborative Director was not per suaded. She said little, remained unflappable, and would not budge: ARSI Program Impr ovement Reviews were standard. I asked: "What did you think of the text of the report?" The Director and her assistant both nodded approval. Then the assistant added: "It was long. People are busy . ." (Followed by a conciliatory, partly muffled chuckle.) I asked: "What's missing from the report as it is now?" The Director repeated that Consistency Rati ng Summaries and recommendations for each school were essential parts of any ARSI Pr ogram Improvement Review report. These, in fact, as submitted by the ARSI experts, are the report. I responded: "So I just clip the three reports together? 
It's a clerical job?!"
The Director replied:

"Yes . . . in part."

I responded angrily:

"If I had known we were gonna do it this way, I'd never have gotten involved. This is the last one I'll do."

By this time, I had lost my composure, while the Director had retained hers. I left the room, acknowledging that ARSI would get the kind of report it wanted. That was the end of my involvement with the Appalachian Rural Systemic Initiative.

In Retrospect

It is worth noting that, until our Program Improvement Review, ARSI had kept a low profile in West Virginia. Unknown to me was an earlier series of three meetings with West Virginia educators hosted by ARSI representatives, one of whom is now the ARSI Collaborative Director. According to a participant, a former math teacher who is currently a professor of education and a co-author of this article, the meetings were held January through April of 1998. Her unsolicited invitation to attend described the first meeting as intended to explore "the development of a self-assessment instrument . . ." to aid counties in:

"Critically looking at their science and math programs."
"Analyzing test data to improve instruction/curriculum."
"Planning Professional Development."
"Revising Unified Plans." [Documents which are crucial to determining levels of Title I funding.]

However, the meeting actually focused on Kentucky-style Program Improvement Reviews organized around Consistency Rating Summaries. The ARSI representatives' first question suggested real openness: "Should we do this at all in West Virginia?" As the participant's account goes, however, "it seemed clear that the decision had already been made. It would be done, and in the same way as in Kentucky."

A math and science expert from ARSI's Kentucky program presented his state's version of the Consistency Rating Summary for review. Abandoning the "should we do this at all in West Virginia?"
attention-getter, ARSI representatives instructed participants to examine each item on the Summary according to unspecified national standards and state-mandated West Virginia regulations.

In a February 10, 1998, e-mail from the West Virginia Collaborative Director, participants were told they should decide if Consistency Rating Summary items "should be kept as is, deleted, or modified. You may also generate additional questions." Latitude, yes, but narrowly circumscribed by production of Likert items. Reservations as to whether and how Program Improvement Reviews should be done in West Virginia had been dispelled. The Kentucky model, giving pride of place to the Consistency Rating Summary and check-list guided audits, had been adopted.

Conclusion

When we first thought about writing this article, our main concern was that there is
nothing rural about the work of the Appalachian Rural Systemic Initiative. That remains emphatically true. For ARSI, a-school-is-a-school-is-a-school. We concluded, however, that even more important than ARSI's failure to find anything consequentially distinctive about rural education was the characterization we offered at the beginning:

ARSI experts construe the process of educational achievement as a thoroughly understood, relatively simple mechanism manifest in static indicators of school effectiveness.

In consequence, ARSI has standardized and accelerated its centrally important Program Improvement Review process through excessively routinized observation based on short-cut procedures and unvalidated instruments.

ARSI experts show no interest in substantiation for their evaluation criteria, but, nevertheless, take them for granted as embodying the one right way to teach math and science anywhere.

Student engagement and student-teacher interaction are irrelevant to ARSI evaluations. Departures from ARSI criteria, even in the presence of overwhelmingly favorable student responses, are negatively sanctioned.

How has this come about? ARSI's approach appears to be premised on two assumptions, one following from the other. First, and fundamentally, educational achievement is so thoroughly understood that its practice and evaluation can and should be routinized and streamlined. Second, and as an obvious corollary, well-informed routinization not only assures enhanced achievement but provides opportunities for cost cutting, as well.

If these assumptions, especially the first, were true, educational reform, in low-income rural schools or elsewhere, could be inexpensively programmed and monitored. Schools would become models of efficiency and effectiveness. If the assumptions are false, however, science-based routinization gives way to quick-and-dirty dogmatism, producing contemporary caricatures of early-twentieth-century scientific management.
As a result, as we have sought to make clear, ARSI's Program Improvement Reviews in low-income, rural schools, or any schools, seem unlikely to enhance achievement in science and math.

There may be merit to the claim that high levels of achievement in science and math are far more important today, for individuals and entire societies, than ever before. There may be merit to the claim that the commonsense human capital assumptions which tacitly undergird NSF's systemic initiatives are valid. Whatever the value of these claims, however, it seems unlikely that interventions made by the Appalachian Rural Systemic Initiative will facilitate educational reform and promote economic growth and development. Even if the education-and-development assumptions on which ARSI is premised were undeniably correct, this first-person account casts depressingly serious doubt on ARSI's ability to accomplish its objectives.

References

Ashton, D. and Sung, J. (1997) Education, Skill Formation, and Economic Development. In Halsey, A., Lauder, H., Brown, P. and Wells, A. (Eds.) Education: Culture, Economy, and Society. New York: Oxford, 207-218.
Austin, L. (2000) Some Schools Trying Faculty Dress Codes. Huntington [WV] Herald-Dispatch, January: 1A and 9A.

Bickel, R. and Spatig, L. (1999) Early Achievement Gains and Poverty-Linked Social Distress: The Case of Post-Head Start Transition. Journal of Social Distress and the Homeless 8: 241-254.

Bowles, S. and Gintis, H. (1976) Schooling in Capitalist America. New York: Basic Books.

Brown, D. (1996) Systemic Change and the Role of School Boards. New York: American Educational Research Association Annual Meeting.

Consortium for Policy Research in Education (1995) Reforming Science, Mathematics, and Technology Education: NSF's State Systemic Initiatives. New Brunswick, New Jersey: Consortium for Policy Research in Education, Rutgers University.

Harmon, H. and Blanton, R. (1997) Strategies for Improving Math and Science Achievement in Rural Appalachia. Tucson, Arizona: National Rural Education Association Annual Convention.

Kaestle, C. (1983) Pillars of the Republic. New York: Hill and Wang.

McMurrer, D. and Sawhill, I. (1998) Getting Ahead. Washington, D.C.: The Urban Institute.

National Council of Teachers of Mathematics (1998) Principles and Standards for School Mathematics. Reston, Virginia: National Council of Teachers of Mathematics.

National Research Council (1996) National Science Education Standards. Washington, D.C.: National Academy Press.

National Science Foundation (1994a) Foundation for the Future. Washington, D.C.: National Science Foundation, Directorate for Education and Human Resources.

National Science Foundation (1994b) Building the System: Making Science Education Work. Washington, D.C.: National Science Foundation, Directorate for Education and Human Resources.

National Science Foundation (1997) Rural Systemic Initiatives in Science, Mathematics, and Technology Education. Washington, D.C.: National Science Foundation, Directorate for Education and Human Resources.

National Science Foundation (1998) Systemic Initiative Research.
Washington, D.C.: National Science Foundation, Directorate for Education and Human Resources, URL http://www.nsf.gov/pubs/1998/nsf98140/nsf98140.htm

National Science Foundation (1999) HER Projects. Washington, D.C.: National Science Foundation, Directorate for Education and Human Resources, URL http://www.ehr.nsf.gov/project.asp
National Science Teachers Association (1998) The National Science Education Standards: A Vision for the Improvement of Science Teaching and Learning. Arlington, Virginia: National Science Teachers Association, URL http://www.natt.org/handbook/nses.htm

Perkinson, H. (1995) The Imperfect Panacea. New York: McGraw-Hill.

Reich, R. (1992) The Work of Nations. New York: Vintage Books.

Senate Committee on Labor and Human Resources (1997) Innovative Workforce Development Initiatives. Washington, D.C.: Congress of the United States, United States Senate.

Shapiro, C. and Varian, H. (1998) Information Rules. Cambridge, Massachusetts: Harvard Business School Press.

Shields, P. (1997) Evaluation of the National Science Foundation's Statewide Systemic Initiatives (SSI) Program: First Year Report. Menlo Park, California: SRI International.

Spring, J. (1997) The American School: 1642 to 1996. New York: McGraw-Hill.

About the Authors

College of Education and Human Services
Marshall University
Huntington, West Virginia 25755-2480

Robert Bickel
email@example.com

Robert Bickel is a Professor of Advanced Educational Studies at Marshall University. His recent research is concerned with school size as a variable which moderates the relationship between social class and measured achievement, and with contextual factors which occasion the at-risk designation.

Terry Tomasek

Terry Tomasek has taught high school science, and is completing a Master of Arts in Teaching and a Master of Science in Biology at Marshall University. Her thesis research focuses on the effects of valley fill construction on aquatic environments.

Teresa Hardman Eagle

Teresa Hardman Eagle is an Assistant Professor of Educational Leadership Studies at Marshall University.
A former high school math teacher, she is doing ethnographic research on women as administrators in educational institutions and social service agencies.

Copyright 2000 by the Education Policy Analysis Archives

The World Wide Web address for the Education Policy Analysis Archives is epaa.asu.edu
General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, firstname.lastname@example.org, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-0211 (602-965-9644). The Commentary Editor is Casey D. Cobb: email@example.com.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
Greg Camilli, Rutgers University
John Covaleskie, Northern Michigan University
Alan Davis, University of Colorado, Denver
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Richard Garlikov, firstname.lastname@example.org
Thomas F. Green, Syracuse University
Alison I. Griffith, York University
Arlen Gullickson, Western Michigan University
Ernest R. House, University of Colorado
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Calgary
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Dewayne Matthews, Western Interstate Commission for Higher Education
William McInerney, Purdue University
Mary McKeown-Moak, MGT of America (Austin, TX)
Les McLean, University of Toronto
Susan Bobbitt Nolen, University of Washington
Anne L. Pemberton, email@example.com
Hugh G. Petrie, SUNY Buffalo
Richard C. Richardson, New York University
Anthony G. Rud Jr., Purdue University
Dennis Sayers, Ann Leavenworth Center for Accelerated Learning
Jay D. Scribner, University of Texas at Austin
Michael Scriven, firstname.lastname@example.org
Robert E. Stake, University of Illinois at Urbana-Champaign
Robert Stonehill, U.S. Department of Education
David D. Williams, Brigham Young University

EPAA Spanish Language Editorial Board
Associate Editor for Spanish Language
Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México, email@example.com

Adrián Acosta (México), Universidad de Guadalajara, adrianacosta@compuserve.com
J. Félix Angulo Rasco (Spain), Universidad de Cádiz, felix.firstname.lastname@example.org
Teresa Bracho (México), Centro de Investigación y Docencia Económica-CIDE, bracho@dis1.cide.mx
Alejandro Canales (México), Universidad Nacional Autónoma de México, canalesa@servidor.unam.mx
Ursula Casanova (U.S.A.), Arizona State University, casanova@asu.edu
José Contreras Domingo, Universitat de Barcelona, Jose.Contreras@doe.d5.ub.es
Erwin Epstein (U.S.A.), Loyola University of Chicago, Eepstein@luc.edu
Josué González (U.S.A.), Arizona State University, josue@asu.edu
Rollin Kent (México), Departamento de Investigación Educativa-DIE/CINVESTAV, rkent@gemtel.com.mx, email@example.com
María Beatriz Luce (Brazil), Universidad Federal de Rio Grande do Sul-UFRGS, lucemb@orion.ufrgs.br
Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México, javiermr@servidor.unam.mx
Marcela Mollis (Argentina), Universidad de Buenos Aires, mmollis@filo.uba.ar
Humberto Muñoz García (México), Universidad Nacional Autónoma de México, humberto@servidor.unam.mx
Angel Ignacio Pérez Gómez (Spain), Universidad de Málaga, aiperez@uma.es
Daniel Schugurensky (Argentina-Canadá), OISE/UT, Canada, dschugurensky@oise.utoronto.ca
Simon Schwartzman (Brazil), Fundação Instituto Brasileiro de Geografia e Estatística, firstname.lastname@example.org
Jurjo Torres Santomé (Spain), Universidad de A Coruña, jurjo@udc.es
Carlos Alberto Torres (U.S.A.), University of California, Los Angeles, torres@gseisucla.edu