
Educational policy analysis archives


Material Information

Title:
Educational policy analysis archives
Physical Description:
Serial
Language:
English
Creator:
Arizona State University
University of South Florida
Publisher:
Arizona State University
University of South Florida.
Place of Publication:
Tempe, Ariz
Tampa, Fla
Publication Date:
July 11, 2000

Subjects

Subjects / Keywords:
Education -- Research -- Periodicals (lcsh)
Genre:
non-fiction (marcgt)
serial (sobekcm)

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00176
usfldc handle - e11.176
System ID:
SFS0024511:00176




Full Text
Educational policy analysis archives. Vol. 8, no. 32 (July 11, 2000). Tempe, Ariz.: Arizona State University; Tampa, Fla.: University of South Florida. ISSN 1068-2341.
Contents: Student assessment as a political construction: the case of Uruguay / Luis Benveniste.
Electronic access: http://digital.lib.usf.edu/?e11.176



Education Policy Analysis Archives

Volume 8 Number 32    July 11, 2000    ISSN 1068-2341

A peer-reviewed scholarly electronic journal. Editor: Gene V Glass, College of Education, Arizona State University. Copyright 2000, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Student Assessment as a Political Construction: The Case of Uruguay

Luis Benveniste
The World Bank

Abstract

This article reveals the interplay between assessment policies in Uruguay and the nature of State-societal relations. The central State has historically been a staunch defender of public education and has championed the cause of equalizing opportunities for the most disadvantaged sectors of society. The national evaluation system of student performance has been constructed as an expression of this tradition. The Uruguayan government sought to build a wide level of consensus with respect to the assessment instruments by encouraging educators to participate and buy into the assessment initiative. Moreover, the national government shifted the focus of the national evaluation from measuring schooling outcomes to addressing the social wants that condition student learning. Hence, the national evaluation has come to symbolize an agreed-upon mechanism of social accountability by which the central government upholds its responsibility for educational provision as it intervenes on behalf of impoverished communities. (Note 1)

This article is also available in Spanish in Adobe Acrobat format at http://www.grade.org.pe/gtee-preal/docs/Benveniste.pdf

This study reveals the interplay between assessment policies in Uruguay and the nature of State-societal relations. The central State has historically been a staunch defender of public education and has championed the cause of equalizing opportunities for the most disadvantaged sectors of society. The national evaluation system of student performance has been constructed as an expression of this tradition.

The first section describes the educational system as a highly centralized organizational structure. It then provides a brief overview of the education reform initiative launched in 1995 by the National Administration of Public Education to promote and consolidate social equity.

The second section portrays the Unidad de Medición de Resultados Educativos (the evaluation agency of primary education) as a temporary unit created in 1996 within the framework of a project financed by the World Bank. In spite of its short history, the assessment system has garnered substantial popular support and spurred a curricular and pedagogical renovation among teachers, principals and supervisors.

The third section explores the reasons behind the public embrace of the national assessment system. This has been no slight accomplishment in light of the fact that the evaluation of student performance may potentially exert a destabilizing role by highlighting deficiencies in educational service provision. First, the central State circumscribed teacher liability for poor performance, largely assuming itself the responsibility for the character of schooling. Second, the national government built a wide level of consensus with respect to the assessment instruments by encouraging educators to participate and buy into the assessment initiative. Third, the national government shifted the focus of the national evaluation from measuring schooling outcomes to addressing the social wants that condition student learning. Hence, the national evaluation has come to symbolize an agreed-upon mechanism of social accountability by which the central government upholds its responsibility for educational provision as it intervenes on behalf of impoverished communities.

Assessment may in fact reify centralized control by imposing standards that must be uniformly enforced throughout the country. Paradoxically, in Uruguay's highly concentrated model of governance, the national evaluation proves that centralization need not be incompatible with democratic participation.

The process of education reform in Uruguay

The Uruguayan educational system

The educational system of the Republic of Uruguay is organized in three levels. (Note 2) Initial education caters to children between 3 and 5 years of age. Preschool instruction is not presently compulsory, but the government plans to make it obligatory for 4 and 5 year-old children in the near future. Primary education consists of six grades and serves 6 to 11 year-old children. Secondary education consists of two sub-cycles. The Ciclo Básico Único (Unique Basic Cycle) is a three-year course common to all students between 12 and 14 years of age. Students may then opt to proceed to baccalaureate or technical-professional instruction to round off their secondary education. Training at this level may last between 2 and 7 years depending on the course. Primary schooling and the Unique Basic Cycle constitute the national compulsory educational requirements (Uruguay—Ministerio de Educación y Cultura, 1996).

The administration of the education sector is highly centralized, but falls under the jurisdiction of several independent de-concentrated councils. The Ministry of Education and Culture is responsible for devising broad national educational policies. Despite its overarching mandate, this Ministry has a subsidiary role in the operations of the education sector. The Administración Nacional de Educación Pública (ANEP), the National Administration of Public Education, is the agency responsible for the management of the public educational system. The ANEP is fully autonomous from the Ministry of Education and Culture and is configured by several bodies: (a) the Central Board Council (CODICEN), (b) the Council of Primary Education (CEP), (c) the Council of Secondary Education (CES), and (d) the Council of Technical-Professional Education. The Central Board Council is the highest administrative authority in the education sector. It is comprised of 5 members elected by the President and approved by the Senate. The other three councils are subordinate to the CODICEN, but they function largely autonomously. They are responsible for imparting, administering and supervising educational services. The directors of these councils are appointed by the CODICEN (see Figure 1).

Educational policy is also shaped by several independent official advisory bodies to the ANEP. The Coordinating Commission of Education consists of the Minister of Education and Culture, the highest authorities of the autonomous councils, and representatives of universities and post-graduate institutions. It propounds guidelines and draft agreements for the coordination of the education sector. The Asambleas Técnico-Docentes (Technical-Pedagogical Assemblies or ATDs) are national and regional deliberative bodies comprised of teachers elected through secret compulsory voting. ATDs pronounce opinions regarding the conditions of education and may initiate educational policy directives (González Rissotto, 1997).

Basic education has reached universal proportions in Uruguay. In 1995, net enrollment rates at the primary school level encompassed 95% of the 6 to 11 year-old cohort. At the Unique Basic Cycle level, matriculation rates averaged 67% for the
relevant school-aged population in Montevideo and 57% for all other urban areas in the rest of the country. Participation rates drop sharply in the second cycle of high-school instruction. Net enrollments at this level were below 30%. Total expenditures in education amounted to US$ 578 million in 1995, which represents 3.4% of the gross national product. The private sector caters to 13% of primary school students and 14% of secondary school enrollments (Uruguay—Ministerio de Educación y Cultura, 1997).

Uruguay has a shortage of teachers. The imbalance between teacher supply and demand has prompted governmental authorities to allow instructors to work double shifts. Teachers' real income has deteriorated steadily, even declining during periods of private real income recovery. Between 1960 and 1989, real salaries for teachers declined by 46.6%. Monthly wages in 1996 ranged between US$ 270 and US$ 407 (Uruguay—Ministerio de Educación y Cultura, 1996). Low salaries have forced teachers to search for alternative sources of income.

The Uruguayan education reform

A concern for the inequities in the Uruguayan educational system has prompted the government to embark on an ambitious reform initiative. Net enrollment rates for the population in chronic poverty reach 27% for preschoolers and 34% for high school students. The dropout rates for the poorest children in the first cycle of obligatory secondary education surpass 37%. There is also growing weariness about the deterioration of the quality of education. The national assessment of student achievement revealed that 6th graders in extreme poverty responded correctly, on average, to 37% of a language test and 17% of a mathematics test. The national means are nearly 20 percentage points above these levels. Primary school repetition rates have remained stable at around 10% during the past fifteen years. The repetition rate in the first grade, however, has reached 22%. In Montevideo, 63 out of 257 schools have a first-grade repetition rate above 30%, and another 67 establishments fall between 20% and 29% (Rama, 1998).

The current administration of the ANEP has adopted four guiding principles to transform the educational system (Rama, 1998; Uruguay—Ministerio de Educación y Cultura, 1996):

1. The consolidation of social equity,
2. The appreciation of teacher professionalism and training,
3. The improvement of educational quality, and
4. The strengthening of institutional management.

The consolidation of social equity effort directs services and compensatory actions to underprivileged children. The ANEP seeks to extend public preschool services to 95% of the 5 year-old population and to conduct an outreach program to incorporate 85% of 12 to 14 year-olds into the first cycle of secondary schooling. The poorest students receive more hours of instruction, including “full-time” schooling. They also have access to a comprehensive school meal program.

The appreciation of teacher professionalism effort strives to double the graduation rates of primary school teachers and triple those of secondary school instructors by the end of 1999. Approximately 90% of the elementary school teacher corps and 4,300 non-certified high school instructors will receive in-service professional development training. Teacher salaries were planned to undergo increases of 13% in 1996, 10% in 1997, 15% in 1998 and 18% in 1999. In actuality, teacher salaries did rise above the yearly inflation rates, but did not reach the goals originally contemplated. Nonetheless, education was the only social sector that received an appropriation to increase salaries
and its general operating budget in August 1998.

The educational quality enhancement effort focused on the widespread distribution of textbooks, instructional materials and pedagogical resources to public establishments. Curricular programs at the secondary level are also undergoing an in-depth review and renovation. In addition, the ANEP finances school-based projects to address specific needs within educational communities. Finally, the government has launched a program, “All Children Can Learn,” to reduce primary school repetition rates. This program consists of a series of integrated social activities that endeavor to facilitate the access and permanence of children in schools, to strengthen the coordination between preschool and primary education, to enhance teacher training and to use textbooks as “an instrument for open learning” (Rama, 1998).

The strengthening of institutional management effort encompasses specialized training for school principals as well as the creation of computerized systems to assist administrators in their functions. Rural schools with fewer than ten students are being consolidated in order to reduce wastage and promote a more efficient use of resources.

These four initiatives are funded by a 22% increase in the education sector appropriation. The 1996-2000 budget has grown by US$ 75 million from the 1991-1995 budget, to US$ 430 million. The government of Uruguay also receives substantial aid from the international donor community to implement these reforms. The Inter-American Development Bank and the World Bank have lent US$ 140 million for the modernization of the educational system. The Project for the Improvement of the Quality of Primary Education (MECAEP), (Note 3) funded by the World Bank, has contributed to the construction of preschools, the in-service training of elementary school teachers, and the provision of textbooks and pedagogical resources. It also supports the Unidad de Medición de Resultados Educativos (UMRE), the agency responsible for assessing educational quality at the primary level. The Project for the Improvement of the Quality of Basic Education and the Instruction and Training of Teachers (MESyFOD), funded by the Inter-American Development Bank, has supported the creation of five regional teacher training centers, the in-service development of high school instructors, and the maintenance of secondary school infrastructure. In addition, MESyFOD conducted the national assessment of student achievement at the secondary level in 1999.

2. Student assessment practices in Uruguay (Note 4)

A. The measurement of student achievement

Initial experiences with student assessment

Between 1990 and 1994, the United Nations' Economic Commission for Latin America and the Caribbean (CEPAL) conducted a series of studies requisitioned by the National Administration of Public Education. These studies were based on two examinations administered in 1990 to a small sample of 4th and 9th grade students in language and mathematics. CEPAL also collected socioeconomic and background information from parents, teachers and principals. The purpose of these tests was to explore the conditions of basic and secondary education in Uruguay (Comisión Económica para América Latina y el Caribe, 1994; 1993; 1992; 1991; 1990).

The primary school evaluation revealed that on average students could respond correctly to 58% of the questions (Comisión Económica para América Latina y el Caribe, 1991). The results from the secondary school evaluation were significantly inferior. Less than 22% of public school students reached an adequate level of
proficiency in mathematics or language, as opposed to over 50% in the private sector. The mathematics test showed that “students learn very little in the courses of the Unique Basic Cycle.” The language scores exposed that “the probability of success of the great majority of public establishments is so low that failure is almost certain” (Comisión Económica para América Latina y el Caribe, 1992: 90, 122).

The reports produced by CEPAL, however, abstained from making curt accounts or generic descriptions of student outcomes. Rather, test scores were the starting point for in-depth analyses of the impact of socioeconomic variables on student learning. Predictably, CEPAL found that low-income children tend to have lower levels of academic attainment. After an exhaustive review of the effect of various sociocultural indicators on school performance, the CEPAL underscored that maternal educational level is the best predictor of student achievement (Ravela, 1997b).

The research agenda of this study also included the identification of schools that, despite serving disadvantaged populations, have attained high levels of academic performance. These educational establishments were denominated “exemplary schools.” The CEPAL carried out a qualitative investigation of these schools and posited that four factors explain scholastic excellence in underprivileged environments:

1. the ability of the principal to assume a leadership role in the school as well as in its community,
2. the knowledge and experience of the classroom teacher combined with satisfaction and commitment to his/her work,
3. a dynamic pedagogical culture within the teacher cadre, and
4. the existence of significant bonds between the educational establishment and parents (Ravela, 1997a).

Finally, the CEPAL emphasized that low test scores were symptomatic of a systemic crisis in the education sector:

The reason for the results is not the fault of educational establishments or their authorities. ... They are the outcome of a prolonged social process, during a prolonged historical period, during which the quality of education ceased to be a priority as an objective of State action (Comisión Económica para América Latina y el Caribe, 1992: 123).

In other words, the deterioration of educational quality was ascribed to a lack of commitment from the central State to make adequate investments in schooling services. According to this report, the reversal of this situation would follow from the initiative of the national government towards promoting policies and programs that support the labor of teachers and principals.

The construction of a national assessment system

It could be said that Uruguay does not have an institutionalized national assessment system. UMRE, the unit responsible for the measurement of academic achievement at the primary education level, is not a formal “line agency” of the National Administration of Public Education. It is an ad hoc unit initially constituted to implement the evaluation sub-component of the MECAEP Project financed by the World Bank. UMRE must abide by the directives of the Central Board Council, but it is exonerated from following certain civil service regulations. Similarly, the secondary education evaluation was developed autonomously within the framework of the MESyFOD Project, funded by the Inter-American Development Bank. Although there are plans to make student assessment a permanent entity within the governmental
organizational structure, the appraisal of academic performance currently operates from quasi-independent transitory agencies. This situation has given the evaluation of student achievement a certain degree of independence and freedom—in relation to its organization, operation and personnel selection—by means of its ability to proceed outside the strict channels that regulate public offices. On the other hand, and as will be described in a later section, this “extra-official” character has generated concern among certain sectors of the educational community, and particularly among the school inspectorate, who perceive UMRE as a parallel entity, alien to them.

The systematic and periodic measurement of schooling outcomes was not an initiative of the Uruguayan government. It was a conditional clause for the appropriation of the MECAEP World Bank loan (Interview UGN1). Although initially greeted with some resistance, the Uruguayan government eventually welcomed the creation of an evaluation unit (Interview UGN34). Germán Rama, who became Director of the ANEP in 1995, had been responsible for the design and implementation of the CEPAL study on student achievement mentioned above. Under his leadership, the Central Board Council decreed a resolution in March 1996 stipulating that “one of the prioritized lines of action of this Council is the implementation of assessment systems of [student] learning … with the objective to appraise the performance of this Organism and the quality of service it provides to the population” (Uruguay—Administración Nacional de Educación Pública, 1996b).

UMRE has been in operation since 1994. Pilot tests for a 3rd and 6th grade evaluation were conducted late that year, with the intention to launch the first national assessment in 1995. When Dr. Rama assumed control of the ANEP in mid-1995, however, he replaced the technical leadership of UMRE and resolved to postpone the exam for one year. The national assessment underwent an important reformation. First, the ANEP would evaluate all public and private school students in 6th and 9th grades, the terminal years of the primary and secondary educational levels, every three years. Second, the test would veer from appraising curricular contents to measuring skills and competencies (such as reading comprehension or problem resolution). Third, the evaluation would incorporate a detailed sociocultural survey to be completed by parents, teachers and principals. Fourth, UMRE would seek feedback about its mission and operations from the various stakeholders involved in the provision of schooling services. Fifth, governmental authorities committed to maintaining secrecy about individual school test results. The ANEP guaranteed that only aggregate data would be made public (UMRE, 1996e).

UMRE is constituted by 3 full-time and 5 part-time professionals. It is responsible for the design, implementation, analysis, and devolution of results of the primary education assessment. From practically its inception, public and private school authorities as well as policy makers, supervisors and teachers were consulted about the development of instruments, test administration practices, and the uses of assessment results. The government also held regular informative workshops and produced several publications to raise awareness about the objectives of collecting student data (UMRE, 1996e).

UMRE devoted significant effort to securing support and building consensus for the national assessment across the gamut of educational actors. In 1996, an “Advisory Group” was consolidated to review the work of UMRE and promote cooperative participation. This committee is composed of national and regional representatives from the Council of Primary Education, the supervisory cadre, teacher training institutions, the Technical-Pedagogical Assembly, the Association of Private Education Establishments, the Uruguayan Association of Catholic Education, and the Uruguayan Federation of Teachers (the national teachers' union).

UMRE administered the first standardized evaluation in mathematics and language to all 6th grade students in 1996. Rural schools with fewer than six pupils in the sixth grade classroom were exempted from participation. Absenteeism rates were below 3.5% of the total enrollment for the mathematics test and 6.2% for the language test. In addition, educators and parents were required to complete socioeconomic background surveys. The rate of parental response to this survey was 98.5% (UMRE, 1998c).

The exams consisted of multiple-choice items and open-ended questions. Teachers and supervisors participated in the formulation of test items, but technical staff from UMRE ultimately devised the exam. (Note 5) Independent proctors monitored the administration of the assessment. Students were allotted one hour and thirty minutes to complete the test, but those who required additional time to finish were allowed to do so (UMRE, 1996f). UMRE was responsible for correcting the exams and analyzing the results. Forty days after the application of the test and prior to the culmination of the academic year, schools received an individualized confidential report with aggregate school results item by item. The socioeconomic background surveys served as a basis to categorize schools into five categories, from very unfavorable to very favorable contexts. Student outcomes were compared to the national average, the departmental/regional average, and that of schools that serve students from similar socioeconomic conditions. Educational establishments also obtained two technical manuals to interpret results. In the following academic year, educators received a second confidential report with a sociocultural profile of their school, based on background questionnaire data. UMRE also produced methodological guides with pedagogical suggestions and recommendations to redress weaknesses identified in mathematics and language (UMRE, 1997b; 1997c; 1997d; 1997f; 1997g; 1996c; 1996d; 1996g).

UMRE tailored several reports for the supervisory cadre. School inspectors participated in workshops where they received a regional profile of local schools and a “socioacademic map” that classified educational establishments under their oversight in terms of achievement levels and socioeconomic context. These instruments would allow supervisors to identify exemplary schools that exhibited high test scores in spite of being resource poor. They were also meant for targeting compensatory interventions to low-performing educational establishments.

UMRE results

The national assessment of 6th grade students showed that 57.1% of students were able to respond correctly to more than 60% of the language test. The success rate in mathematics was considerably lower. Only 34.6% of students were able to answer over 60% of the questions satisfactorily. The percentage of students who failed to reach the 60% “adequacy level” on both tests was 37.9%.

The first official report of results for public dissemination highlighted the role of contextual variables in the acquisition of knowledge. Students were classified into four categories according to their sociocultural context. Sociocultural context was defined in terms of maternal educational level. Schools from “very favorable” contexts were characterized as those with over 50% of students whose mothers completed at least secondary education. Schools from “very unfavorable” contexts were characterized as those where at least one out of two mothers had received only a primary education, and at most one out of ten mothers had received a secondary education.
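Read operationally, the rule just described is a threshold test on the distribution of maternal education within each school, and the resulting band then fixes the comparison group against which a school's average scores are reported. The short Python sketch below is only an illustration of that logic under stated assumptions: the two outer cut-offs come from the paragraph above, while the intermediate bands, the function name and the input format are hypothetical and are not drawn from UMRE's published procedure.

# Illustrative sketch only; not UMRE's actual categorization procedure.
def classify_school_context(mother_education_levels):
    """Classify a school's sociocultural context from the highest schooling
    completed by its students' mothers ('primary', 'secondary' or 'tertiary')."""
    n = len(mother_education_levels)
    if n == 0:
        raise ValueError("no background-survey responses for this school")
    secondary_or_more = sum(level in ("secondary", "tertiary")
                            for level in mother_education_levels) / n
    primary_only = sum(level == "primary" for level in mother_education_levels) / n

    if secondary_or_more > 0.50:
        # Per the text: over 50% of mothers completed at least secondary education.
        return "very favorable"
    if primary_only >= 0.50 and secondary_or_more <= 0.10:
        # Per the text: at least half primary only, at most one in ten with secondary.
        return "very unfavorable"
    # The article gives no cut-offs for the middle bands; 0.30 is an arbitrary split.
    return "medium high" if secondary_or_more >= 0.30 else "medium low"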

As the CEPAL studies had demonstrated earlier, students from underprivileged backgrounds scored significantly below students from more affluent families (see Tables 1 and 2). While over 85% of children from “very favorable” contexts answered at least 60% of the language test correctly, less than 40% of students from “very unfavorable” contexts attained the same level of achievement. In mathematics, the gap between high- and low-income children widened.

Table 1
Percentage of Students by Performance Level in Mathematics and School Sociocultural Context

Performance level | Very Favorable | Medium High | Medium Low | Very Unfavorable | Total
Highly Satisfactory (scores above 80%) | 21.0% | 8.4% | 3.4% | 2.0% | 6.8%
Satisfactory (scores 60% to 80%) | 45.6% | 35.3% | 23.2% | 15.7% | 27.8%
Unsatisfactory (scores 30% to 60%) | 30.6% | 49.7% | 60.7% | 64.4% | 54.5%
Very unsatisfactory (scores below 30%) | 2.8% | 6.7% | 12.7% | 17.9% | 10.9%
Total | 100.0% | 100.0% | 100.0% | 100.0% | 100.0%

Source: UMRE (1996g), p. 10.

Table 2
Percentage of Students by Performance Level in Language and School Sociocultural Context

Performance level | Very Favorable | Medium High | Medium Low | Very Unfavorable | Total
Highly Satisfactory (scores above 80%) | 41.9% | 19.5% | 9.8% | 5.0% | 15.8%
Satisfactory (scores 60% to 80%) | 43.3% | 48.1% | 40.9% | 32.8% | 41.3%
Unsatisfactory (scores 30% to 60%) | 14.0% | 29.7% | 43.2% | 52.7% | 37.7%
Very unsatisfactory (scores below 30%) | 0.8% | 2.8% | 6.1% | 9.5% | 5.2%
Total | 100.0% | 100.0% | 100.0% | 100.0% | 100.0%
Source: UMRE (1996g), p. 10.

UMRE produced a second report exploring the relationship between sociocultural factors and student achievement. This study categorized the Uruguayan educational system into five subsystems according to geographical and sociocultural variables. It revealed that private schools in Montevideo generally attracted students with the highest maternal educational levels, followed by, in decreasing order of maternal educational level, private schools in the interior, public schools in Montevideo, public schools in the interior, and rural schools. School performance in these subsystems was closely correlated to sociocultural context, with the exception of rural schools, which evinced academic achievement levels slightly greater than expected for their low sociocultural context (UMRE, 1997f). More importantly, this report provided proof that academic achievement levels were not directly tied to the public or private nature of schooling, but rather to the sociocultural composition of the student body. In other words, the average scores of public schools from very favorable contexts were similar to those of their private counterparts within this context. The outcomes of private schools that served underprivileged populations were also analogous to those of public schools that assisted students from very unfavorable contexts. (Note 6)

A third national report was released late in 1997, providing a meticulous institutional profile of educational establishments. This document was based on the background surveys provided by principals, teachers and parents. It depicted the attributes of building facilities, school materials, class size, years of experience of principals, teacher training, pedagogical approaches favored, staff turnover, parental involvement, and student self-esteem (UMRE, 1997g). As in previous inquiries, the analysis gravitated around the relationship between sociocultural context and schooling conditions.

Overall, the Uruguayan government emphasized consistently throughout its public reports the role played by contextual factors in student learning. Average student scores, like all comparisons between geographic regions or between the public and the private sectors, were presented in direct relation to the sociocultural level in which learning took place. School-level data was kept rigorously confidential.

Other assessment activities

In addition to the sixth grade assessment, the Uruguayan government has undertaken two other evaluation exercises. First, the government administered an experimental assessment to a stratified sample of 3rd grade classrooms late in 1998. This test was available to other educational establishments outside the controlled sample for self-administration on a voluntary basis. The Central Board Council, however, exhorted all educational establishments to take part in this initiative (UMRE, 1998a). The purpose of this evaluation was to appraise student competencies at the midpoint of their primary schooling. It also sought to signal to teachers the competencies pupils ought to master by the third grade and to provide them with an early-warning system to reformulate programmatic contents and pedagogical strategies (UMRE, 1997a).

The exam consisted of open-ended questions that integrated concepts from a variety of disciplines (mathematics, language, social studies, natural sciences, moral education, art) without compartmentalizing them into different spheres of knowledge. In response to teachers' demands for greater participation in the formulation of the test, UMRE established working groups with educators selected by the supervisory cadre, the regional Technical-Pedagogical Assemblies, and the associations of private independent and private Catholic schools. These working groups identified curricular areas to be
11 of 40evaluated and collaborated in the development of te st items. An informational document providing detaile d information about the proposed testing scheme and objectives was drafted and distr ibuted to all teachers and school inspectors. UMRE later requested teachers to respon d to an opinion survey regarding the assessment instrument and competencies to be evalua ted. Ninety two percent of respondents declared that the test was “adequate” a nd there was complete agreement about the competencies selected (UMRE, 1998b). As i n the 6 th grade assessment, the measurement instrument included background surveys for parents, teachers and principals in order to obtain data regarding the co nditions in which student learning took place. Every educational establishment received a report with national aggregate averages by competencies (reading comprehension, resolution of problems, processing information). Test scores were also broken down by socioeconomic context (rural, very favorable, favorable, medium, unfavorable and very unfavorable). A supplementary report detailed average background information (mat ernal educational level, home overcrowding, books in the house, preschool trainin g) tabulated by sociocultural context. Schools that did not participate in the controlled sample received as well a standardized correction manual so that they could tally their ow n in-house results and compare them to the official national average scores. Secondly, the 6 th grade cohort evaluated in 1996 was re-tested in 19 99 as students completed their 9 th grade. MESyFOD, the project responsible for the ad ministration of the test, espoused a methodology similar to that im plemented by UMRE. The evaluation team sought to conduct informational sessions with supervisors, private and public school instructors, the Technical-Pedagogical Assem blies (ATD) and the teachers' union to gain their support. MESyFOD also intended to est ablish an advisory group conformed by representatives from every sector of the educati onal system that would review its operations. At the time the data collection for thi s study was conducted, it was unclear whether MESyFOD would be able to build consensus fo r the evaluation, especially from the ATDs and the Federacin Nacional de Profesores de Enseanza Secu ndaria the national secondary school teachers' union. Secondar y school teachers had adopted a more contentious stance towards the central governm ent's reform initiatives than primary school educators. ATD representatives had refused i n the past to collaborate in projects spearheaded by MESyFOD (Interviews UGN3, UGN3b). (N ote 7) The MESyFOD team, however, concedes that th e national experience with UMRE had greatly eased their work nonetheless. In most i nstances, educational establishments offered little resistance. They had not questioned the government's rationale for conducting this initiative nor were they concerned about being penalized for poor performance. Our undertaking has been facilitated due to the fac t that MECAEP has been very careful about the confidentiality of test resu lts, about the prompt devolution of scores, about the provision of indivi dualized reports to each educational center. They took a series of precautio ns that, for instance, have encouraged private schools to open their doors. … T he realities of secondary education are not the same as those of th e primary level, and there's still all the prejudices about standardized evaluations, but we're going along (Interview UGN3). 
The assessment involved approximately 40,000 studen ts. It appraised achievement in


12 of 40language, mathematics, social studies and natural s ciences. Tests were administered by independent proctors and corrected centrally by MES yFOD. B. The uses of assessment data The findings uncovered by the first nationa l measurement of student achievement are aimed at three distinct audiences: (a) the cent ral government, (b) the school inspectorate, and (c) teachers and principals. Pare nts are informed indirectly about the general conditions of schooling through the press. A few schools, mostly in the private sector, have taken the initiative to publicize thei r scores to the families they serve. The central government The national evaluation of student learning has as its official mandate: to produce information about the extent to which primary school graduates have been able to develop the skills and fundamental understandings in Language and Mathematics that eve ry Uruguayan child ought to have incorporated regardless of his social origin, economic condition, or local context (UMRE, 1996b: 1). This mission statement underscores the diag nostic objectives of assessment. “To have this information available,” claims the ANEP, “is crucial to recuperate the democratizing role of the national educational syst em.” Equity considerations lie at the heart of the central government's involvement in th e measurement of academic outcomes. The ANEP has relied on data gathered by UMR E primarily to guide and inform compensatory policies. There are three autonomous a gencies within the national government that are consumers of information genera ted by UMRE: (a) the Council of Primary Education (CEP), (b) the MECAEP project (wh ich is administered independently from the CEP), and (c) the Planning A rea of the ANEP, a unit that depends directly from the Central Board Council. The MECAEP project has been the most active patron of assessment data. On one hand, MECAEP has played a key role in promoting ref lection among educators regarding the results of the first national evaluat ion. Technical discussions about the meanings of UMRE's findings have become a standard feature of institutional planning or professional development workshops organized for school inspectors and principals (Uruguay—ANEP-MECAEP, 1997). On the other hand, tes t scores and UMRE's classification of schools according to sociocultura l context guide many of the initiatives undertaken by MECAEP. For instance, MECAEP disburse s US$ 3,000 government grants for school-based projects. The selection pro cess takes into account how these projects may address shortcomings identified by the UMRE evaluation. Moreover, priority is awarded to schools from “unfavorable” s ociocultural environments (Uruguay—ANEP-CODICEN, 1998). Sociocultural context as defined by UMRE, has also become a salient criterion for the allocation of resources. The official press release detailing the outcomes of the first evaluation to t he general public, for example, announced that MECAEP earmarked US$ 1 million to th e purchase of pedagogical materials, targeting specifically 400 schools from unfavorable contexts (Uruguay—Administracin Nacional de Educacin Pbli ca, 1996a). "The Council [of Primary Education] permane ntly solicits information from UMRE," states a senior government official. “We are interested in learning about the


13 of 40strengths and weaknesses in language and math achie vement, as well as about the relationship between school and family variables” ( Interview UGN6). In practice, although the CEP's school inspectorate has been an important end user of test data, the central CEP office has given at best limited applic ation to the UMRE results. School test scores have been used as educational quality indica tors for the program “All Children Can Learn.” This initiative strives to reduce repet ition rates below the 20% mark in 160 schools through a comprehensive set of activities t hat include teacher training, providing health care services, reaching out to parents, and supplying textbooks (Uruguay—ANEP-CODICEN, 1998). Achievement levels ha ve not been a parameter for bringing schools into the program, but test outcome s are occasionally used to tailor specific remedial actions in some establishments. O utside this initiative, the Council of Primary Education does not rely on UMRE data for ot her purposes. This has been a source of disappointment for some UMRE officials (I nterviews UGN1, UGN2). Finally, the Planning Area of the ANEP has depended on UMRE's school socioeconomic data for several of its own activitie s as well. In 1998, it conducted a research project on variables associated with prima ry education repetition rates (rea de Planeamiento de ANEP, 1998). This study demonstrate d a close relationship between sociocultural context and the likelihood that stude nts will be held back in the first and second grades. In addition, school background infor mation has been “a fundamental referent” in the identification of establishments t hat could benefit from recent government initiatives, such as in-school meals, sc hool infrastructure maintenance, or classroom construction (Interview UGN7). It is expe cted that once the MECAEP project comes to its conclusion, UMRE will become part of t he Planning Area of the ANEP. (Note 8)UMRE's own policy initiatives The Council of Primary Education maintains that UMRE's role “is bounded to describing what happens” and “providing statistical data,” so that, in turn, this knowledge can serve “the relevant organisms to make pertinent decisions” (Interview UGN6). In practice, UMRE has been more than just an informationgathering agency. It has been intimately involved in the design and prom otion of educational policies for schools from “very unfavorable” contexts. UMRE, with support from regional Institutes for Teacher Training, developed a Saturday workshop series for 541 urban primary scho ols serving underprivileged communities (approximately 40% of all public establ ishments). Participation in this four-month seminar was voluntary, but in order to q ualify, at least half of a school's professional staff must have agreed to participate. Teachers were remunerated for the time they dedicated to this venture with a monthly monetary bonus equivalent to 30% of the average teacher salary. Furthermore, UMRE established a fund to fin ance propositions that could enhance educational quality. Teacher training institutes re ceived $1,000 awards to foster “the accumulation of knowledge about [student] learning in unfavorable environments and the implementation of professional development acti vities in teacher training institutions around these themes” (UMRE, 1997e: 1). Low-income s chools could solicit $1,000 grants for the implementation of intervention proje cts destined to improve achievement levels in that educational community. 
The resources made available, however, would only allow for 50 school awards altogether. Lastly, UMRE, in collaboration with the Pro gram for the Strengthening of the Social Area (FAS) from the Office of Planning and B udget, conducted a qualitative research project in 12 schools from unfavorable soc iocultural contexts. Eight of these


14 of 40establishments excelled in the first national evalu ation. The purpose of this study was to uncover the attributes of those establishments that inspired high attainment levels in underprivileged children. In particular, the dimens ions explored were: (a) institutional characteristics, (b) pedagogical focus, and (c) lin kages to the family and surrounding community. This study has become the basis for a co mprehensive pedagogical proposal for “full-time” schooling to be implemented in 10% of public educational establishments serving the poorest children in the nation (Uruguay —ANEP-MECAEP, 1997). School supervisors The school inspectorate is organized hierar chically from the national-central level to the departmental-regional level to the local-zon al level. Although theoretically organized in a decentralized fashion (Macedo, 1995) school supervisors abide closely by the mandates established centrally at the Techni cal Inspection unit of the Primary Education Council (World Bank, 1994). The supervisory cadre has a long tradition of evaluative activities at the school level. Schools are required to self-design and self -administer initial, mid-year and final exams in mathematics and language at all grade leve ls in order to appraise academic attainment. Inspectors must report on student test scores and specify the percentage of students that can master specific competencies, suc h as oral expression, orthography, reading comprehension, production of a text, resolu tion of algorithms, or recognition of geometrical figures. (Note 9) In addition, inspectors are instructed to c onduct their own institutional assessments in order to look beyond academic achievement as “th e only objective testimonial of the level and quality” of educational services (see, fo r example, Uruguay—ANEP-CEP-Inspeccin Tcnica, 1991a; 1991b; 1991c). They collect data on a wide a variety of measures related to educational quality, including student attitudinal qualities (respect, selfconfidence, tolerance), a bsenteeism rates, repetition rates, classroom pedagogical approaches, availability of d idactic materials, in-service professional development opportunities, and extent of parental involvement (see, for instance, Inspeccin Departamental de Montevideo, 1 998). Supervisors produce a comprehensive school profile on the basis of this i nformation and elaborate in conjunction with school authorities a strategic pla n to address the shortcomings identified in this process. The national assessment conducted by UMRE s ummed itself to the battery of school diagnostic information available to the insp ectorate. UMRE elaborated reports tailored for the supervisory cadre categorizing sch ools by sociocultural context and performance level (UMRE, 1998c). Supervisors also h ad access to the scores of the schools under their tutelage. UMRE developed a seri es of workshops to familiarize inspectors with the results of this standardized ev aluation and suggest potential courses of action that they may take to enhance educational quality. Overall, the inspectorate gives high marks to UMRE. They underscore that it “has been extremely useful” (Interview UGN16) and has sp urred a transformation throughout the educational system at various levels. We discovered that, it is important to have these d ata at the national level. In second place, this information is not only usefu l for the [educational] system, but for schools themselves. There were cert ain guarantees respected of all operations conducted. 
[The assessment] is no t assigning blame in the face of potential deficits or anything like it. It is simply an objective measure that goes beyond [curricular] contents, and looks at much broader


15 of 40processes. … In general terms, everybody is conscio us that this is something valuable (Interview UGN9). The assessment was a starting point to begi n to understand the weaknesses in schooling, particularly for low-income children. Fu rthermore, it paved the way for the adoption of specific remedial actions to address th ese shortcomings. This mass evaluation of [student] achievement has p ut on the table quite clearly what all teachers have been perceiving for many years: how little children in situations of social exclusion learn. T he evaluation took into consideration the educational level of the mother, home crowding, or the number of children in the family. [This systematize d information] gave us, at the educational system level, some tools to corr ect in part this situation of low student achievement by updating teachers … and proposing useful strategies in the areas of psychology, language and mathematics. From this point of view, it served an important professional upgrading role throughout the nation. It allowed many teachers to connect [wi th their students], because many knew that things were going poorly but it wasn't clear the reason why. It was useful to find new pathways (Int erview UGN10). Supervisors praise the technical reports and pedago gical recommendations put forward by UMRE. They are described as “filled with proposa ls for action” and “based on solid theoretical foundations” (Interview UGN15). “For me [UMRE] has been very advantageous because of the exchange of materials. Their contributions are very helpful … Really, they have been a great technical support” (Interview UGN13). The national assessment has also served as a model towards a new educational paradigm. Traditionally, educators have emphasized memorization drills of curricular contents. The UMRE test, instead, moved away from a ppraising curricular contents to assessing competencies. A supervisor suggests that the UMRE test “took place precisely at a time when other pedagogical changes were takin g place, and UMRE was able to appropriate itself of all this … and motivate a reelaboration of [educational] processes” (Interview UGN23). Inspector 1. [UMRE] moved us. It put us into contact with [new] literature, with another modality of evaluation that in turn im plied another modality of [curricular] planning (Interview UGN19).Inspector 2 The results obliged us to think about the way cur ricular proposals were being implemented in educational est ablishments and how children were learning. The failure of students … s uggested that perhaps it was necessary to reformulate the educational projec t (Interview UGN14). The inspectorate has played a crucial role in bringing the lessons from the first national evaluation into the classroom. Across regi ons, the supervisory cadre was required to organize in commissions to reflect upon student outcomes and devise plans of action that responded directly to the needs iden tified. These sessions focused on “the role and mission of the inspector” as a catalyst fo r change (Interview UGN12). The departmental inspector asked [us] to conduct a study, an analysis of the results, and see what we, as a departmental inspect orate, could do. I was recently reviewing this, and we had accorded to wor k with institutional


16 of 40projects … Every supervisor, following these genera l guidelines, could request for funds to implement an intervention proj ect in reference to the [UMRE] test results (Interview UGN30). Inspectors were encouraged to adapt the gui delines outlined in departmental commissions to the social realities of the establis hments they oversaw. In certain localities, supervisors organized 2to 3-month sem inars “to support educators with the findings of new research, and a theoretical framewo rk” that delved not only on how students learn but what is relevant learning (Inter view UGN13). In most districts, the favored approach has been to intercede directly wit h school administrators. “We work on specific proposals with our principals, who in t urn pour this effort into institutional projects developed together with their teachers” (I nterview UGN21). Lower scoring schools have received preferential attention over h igher achieving establishments. There is a growing sense that UMRE has imbu ed the educational system with a reflexive and renovating spirit. Regardless of the actual transformations that may have occurred as product of the first national evaluatio n, supervisors concur that UMRE has been responsible for bringing to the fore a nationa l dialogue on the effectiveness of educational services and practices. Personally, I perceive that there have been changes Changes in the good sense. There has been an evolution, in theory and i n practice. There is a theoretical discussion about [educational] issues, which gets translated into daily activities. … I have never seen such quick ch ange. I believe this is positive (Interview UGN20). Despite this strong endorsement to the work and outcomes of UMRE, inspectors do express reserve towards the national evaluation sys tem. First, they underscore that the measurement of student achievement is not a new act ivity in the Uruguayan educational landscape “We have always evaluated,” attests one s upervisor unequivocally (Interviews UGN22). Inspector 1. In terms of evaluation, I believe that teachers hav e been working a lot previously on this subject. And so ha ve inspectors. Yes, I share with others that the [UMRE] materials we rece ived have triggered reflection among educators, but I believe that we h ave been working continuously on evaluation (Interview UGN17).Inspector 2 I suggest that it is not new to evaluate. [The UM RE assessment] is not new nor is it the only kind of e valuation. Of course, this was an evaluation at the macro level and by an exte rnal agent to the school. But we have never stopped evaluating within schools because this is inherent to teachers' practices: evaluating, planni ng, and researching (Interview UGN15). Second, the supervisory cadre is concerned about the lack of coordination between the central Technical Inspection and the national a ssessment. Although all levels of the inspectorate (national, departmental and zonal) are represented in UMRE's advisory council, some supervisors protest that there has no t been sufficient participation or communication between the two agencies. There is a need to polish certain instances [of par ticipation] so that they are


truly effective. Sometimes it is not enough to say that we are participating, that we want to participate. It is necessary that these spaces be created. The possibility is not always present. … The will has been there, but the spaces are not created so that we can actually share our opinions (Interview UGN12).

Ultimately, the inspectorate is wary of the overlap between UMRE's functions and their own. Supervisors stress that the national evaluation does not supersede their role in the education sector. “I believe [the UMRE evaluation] was a new thing for the educational system, but under no circumstances does it preclude the other kind of evaluation that we have been conducting. They are complementary” (Interview UGN15). Some suggest that UMRE is an external agency that has unfairly arrogated their jurisdiction.

The fact is that UMRE belongs to an organism that is called MECAEP and that is parallel to the normative system. It is alien to the Primary Education Council and to the [educational] system. Even though one may value some of the actions that they perform, we can't stop feeling this way. It is not an evaluation generated within the Primary Education Council. It comes out of an external organism. I believe this is one of the issues that produces great aggravation (Interview UGN10).

Others remark that UMRE has been unabashedly displacing them with an agenda of which they claimed to have no knowledge. “[UMRE] was coming above us. Sometimes we didn't even know what they were doing” (Interview UGN16). And yet others claim that UMRE oversteps the separation of responsibilities between the autonomous councils of the National Administration of Public Education (Interview UGN12).

Inspector 1. Over the entire evaluative history in our country, the ones that always performed a pedagogical review, a study, were the supervisory cadre and the Primary Education Council. Presently, that review is being done externally. We now wonder repeatedly, as inspectors, to what extent it is valid that somebody else comes along, with other possibilities, with other mechanisms, with more people, to do what we are doing. The measurement performed by UMRE is parallel to the functions of this deconcentrated authority (Interview UGN13).

Inspector 2. The issue is that [UMRE and the inspectorate] each have their own lines of action. The inspectorate has a very clear agenda. But these lines of action get intercepted. Supposedly, UMRE ought to be an advisory or collaborative board in support of our activities. But if their actions are intercepting ours, or we are being displaced by UMRE, then that is where things are starting to become unwound (Interview UGN16).

In summary, supervisors object to the fact that UMRE is an agency external to the inspectorate with comparable functions. They resent that UMRE has had the ability to act independently, the authority to command the attention of educational establishments and the resources to implement remedial activities directly. To some extent, UMRE has come to embody a potential threat to the supervisory cadre. In a few schools, teachers even give credence to the rumor that the supervisory cadre will disappear or that it will be restructured. These criticisms notwithstanding, the general consensus is that UMRE
has been a positive asset and ought to continue the work that it has begun. “A system that does not evaluate itself cannot improve,” remarks an inspector (Interview UGN17). According to the supervisory cadre, it is UMRE's organizational structure and its relationship to the Primary Education Council that, in their eyes, beg to be redefined.

Principals and teachers

In hindsight, teachers and principals believe that the first national evaluation was an important experience. Private and public schools, as well as low- and high-income establishments, concur that the UMRE assessment “was very useful, because it helped us to see where our flaws were, what we can do about them, and how we can change” (Interview UES2).

At its inception, teachers were suspicious of the UMRE test (Interview UGN9). Some expressed concern about whether student performance would be a means to appraise their own professional performance. Others feared that if their students did not attain high marks, they might be transferred to another grade (UMRE, 1996a). The Association of Teachers of Montevideo (ADEMU) expressed its rejection of and opposition to UMRE. ADEMU protested that this was a test devised by an entity external to the Primary Education Council and supported by international donor agencies. “The economic expenditure that [the evaluation] supposes,” the teachers' union announced in a newspaper communiqué, “does not conform to the austerity criteria that govern the education budget” (El País, 1995). The Uruguayan Federation of Teachers (FUM) also declared deep reservations towards the national assessment.

In the second semester of 1996 and just prior to the measurement, the teachers' union picked up the debate [on the UMRE evaluation]. We reiterated certain existing reservations, about its expense and the degree of dependency on the World Bank's orientations. New elements of concern were also incorporated, like … the possibility of using the results to categorize schools, to provide differentiated salaries to teachers according to test scores, to stigmatize a certain group of teachers or schools. Also, that it may favor the private sector in some way or other to the extent that it was predictable that public schools would have worse results than private ones. Another series of criticisms was directed at the pertinence of the instruments and the appropriateness of administering one instrument to measure processes in different social realities. Finally, there were concerns about the operational organization in itself: who was going to apply the tests, the access teachers would have, which guarantees existed about the formulation of the tests, the trustworthiness of correction criteria. The criticisms varied from highly ideological considerations, to reserve and distrust, to concerns about the everyday operations of the classroom. There was a wide scope of opinions (Interview UGN37).

Over time, these misgivings were assuaged. Although ADEMU remained defiant toward the first national evaluation and encouraged educational establishments to forestall the entrance of exam proctors, teachers and principals collaborated with this governmental initiative. The Uruguayan Federation of Teachers recognizes that UMRE's “open attitude and desire to consult with the teachers' unions and technical-pedagogical assemblies” led to their participation in the Advisory Group and cooperation with the national test (Interview UGN37).

An instructor from a rural area recounts that “at the beginning, teachers were not invested [in the evaluation], but during the past year, people started to
talk positively about it” (Interview UES13). A representative from the Association of Private Education Establishments describes a similar experience:

When UMRE appeared, we had a brick in each hand. I was ready to kill them. I had all my reasons against them ready. Little by little they convinced us. Now, after all that has happened and as we get more results, they convince us even more. It is OK that the test is obligatory. It has been a valuable experience (Interview UEM32).

The sense of trust and confidence garnered by UMRE among the teacher cadre can be attributed to four factors:

1. strict confidentiality of test results,
2. prompt devolution of student outcomes to school authorities,
3. contextualization of test scores by sociocultural background, and
4. abstention from holding teachers directly accountable for academic attainment.

Private school principal. Teachers [initially] felt on the spot. There was talk … that instructors who did not reach certain scores would be removed from office, that there was going to be a public ranking of schools, that this was an attempt to regulate teachers. The people from UMRE were quite clear in explaining what the objectives of the test were. But nobody believed them. Everybody feared that behind this there was something that somehow would harm teachers. … It is now clear that they kept their word, that it was useful, that it helped us to review things, that two years later we are still working with the results (Interview UEM33).

Public school instructor. Teachers feared that their school would be identified in some manner. And if the school was identified, so would their classroom. And from the classroom, the teacher [would be recognized] … But the data were confidential. Only we got to know the scores. And the schools were later categorized according to their environment (Interview UEM16).

The national assessment has taken place within an education reform context that has espoused “teacher-friendly policies.” “The appreciation of teacher professionalism and training” has been one of the four pillars of the reform (Rama, 1998). Real average teacher salaries have also risen progressively and consistently starting in 1993, after a period of decline between 1988 and 1992 (Domingo, 1998). (Note 10) This general setting might have contributed to generating a positive disposition among teachers towards the objectives of UMRE and a sense of trust that the evaluation had not been established to monitor their performance or increase their productivity.

Moreover, the attention of educators has not been focused on student achievement measures exclusively. The sociocultural data collected by UMRE were featured as a salient explanatory factor behind student performance. School background information has become a key factor in accounting for the level of academic achievement attained and an important consideration in the design of relevant remedial actions.

The system of evaluation also conducted a family survey that took into consideration the role of the home in the educational process. We need to take into account that children only spend four hours at school, and twenty at home. The role of the family is fundamental in terms of the contributions
that it can dispense to reaffirm educational processes (Interview UET21).

Teachers report that test scores were the subject of repeated discussion and reflection sessions among school inspectors, principals, and the teacher cadre. The organization of and participation in these initiatives were mandated by the central government.

The following year, in 1997, when we came back to school, we were required to study the results of the UMRE evaluation, point by point, during our 'administrative days.' Then, we had to draw joint conclusions. It was an obligation to read them. [The order] came to the school in the form of an [official] act (Interview UES15).

The outcomes of the 6th grade assessment were the starting point of a process of pedagogical reflection for a wide range of public and private schools.

Medium-income school. On the basis of the [exam results], we developed a plan for the following year. For instance, the discussion over problem resolution was very important for us in order to go deeper into this issue, to work more on reasoning. I don't know if this took us further away from the [official curricular] program, but … Also, we've been working on the language [curriculum] in teacher meetings. … In these sessions we analyzed some of the test items (Interview UET4).

Medium-low income school. [UMRE] identified those competencies that experience the greatest problems. … We studied the results and worked together with other teachers. We presented the findings in teacher meetings, and discussed the pros and the cons. We devised our [classroom] diagnostic tests at the beginning of the year on the basis of the test outcomes in order to give teachers the opportunity to continue working on these competencies (Interview UES7).

Low-income school. The contents and approach [of the UMRE test] challenged a great deal of ideas that we had. When we saw the exam and what they were after, we came to realize that we were working wrong, that we were working differently, that we were behind, that we were traditional. … The results and the design of the test (which was a very good proposition) led teachers to realize everything that we lacked. From here on, we started to review everything, not because we did well, but because we could have done even much better (Interview UEM18).

Rural school. [The UMRE test] does not evaluate for the sake of evaluation, to just get some numbers back. It is meant to improve [our future practices], and to provide feedback. These are problem areas that require hard work and a different approach (Interview UET10).

Private school. I believe this was a very positive experience. It allows teachers to question if they are working well, along the lines they should be working, or if their approach is satisfactory (Interview UES21).

In some establishments, instructors aligned the course curricula according to the


21 of 40competencies measured by the UMRE test. Others desc ribe that the evaluation triggered greater coordination between grade curricula. And i n yet some others, they allude to the design of specific institutional projects to streng then curricular objectives where students scored poorly. Low-income school I liked the approach of the [UMRE] test. It was a n interesting proposition. The following year, we pla nned [our curriculum] on the basis of the approach forwarded by the test. We worked together with another sixth grade teacher on reasoning, geometry, numbers. We did all this basing ourselves on the UMRE test. Last year t here wasn't an evaluation, but we administered the '96 exams at th e end of the school year. We got a completely different result. In mathematic s, it was very good. In language, it was low, but not as low as in the prev ious year. We even used the same methodology. You could not ask questions t o a classmate or the teacher (Interview UEM3).Rural school I think that [the UMRE exam] was highly positive to shake teachers up a bit. It led us to question ourselves about many competencies and [curricular] areas that, perhaps, we were not d eveloping well. Throughout the school cycle, students do not receiv e the same type of education. We might have missed a few steps. These concepts may not have been grasped at the right time and kids drag this h andicap into the sixth grade. So the teacher covers the sixth grade curric ulum, but oftentimes students do not have clear the concepts or processe s necessary to sustain these new concepts. This is all very positive, so t hat we can all reflect. We are all responsible for specific areas. We have to make sure that students learn certain topics so that the teacher that follo ws can continue to build upon them (Interview UET11).Low-income school: Principal We observed that we needed to start all over again in language, particularly in reading comprehe nsion. One of the factors that exerted incidence on this question was the lac k of books at home. … So we developed a project. Instructor Yes, we developed a project that sought to overcome the current deficits. We called it “A V egetable Garden to Learn.” Through this project we are trying to addre ss the problems detected in the evaluations. ... We find unsatisfactory or i nsufficient levels in competencies such as production of texts (… which c omes to 52%), also algorithms (52%) and problem solving (48%). Those a re the competencies with the lowest scores. Hence, we are trying to fin d solutions to those problems. At the same time, we see the need to cont inue working on discipline and the formation of good habits. The da ta from UMRE were particularly useful here. The study showed that we had a 47.6% of aggressiveness and misconduct, and lack of motivati on or interest in a 29.9%. Through our little great project “A Vegetabl e Garden to Learn” we are trying to bring parents into the school and int egrate them. Our school is from an unfavorable sociocultural context, and one of the problems that affects much of our functioning is that parents are not involved in student learning (Interviews UES9, UES10). The impact of the UMRE test on academic pra ctices can be appraised most overtly by how quickly it has become a standard for in-scho ol evaluative practices. Public and


private schools that cater to children from diverse communities report that they have modeled their own student assessments after the UMRE test. Some establishments have photocopied the UMRE test and readministered it. Others have prepared a different test with a comparable methodological approach.

Private school teacher. Some teachers used the [UMRE] test again [the following year]. It was like a reapplication. We've also used it as a model for other tests (Interview UES21).

Public school teacher. I came to this school in 1997. I am the sixth grade teacher. Last year we administered something similar [to the UMRE test]. It was prepared here, together with other sixth grade teachers. … We started to evaluate like UMRE. If the proposal is good, let's do it! I liked the narrative and argumentative text parts of the test in particular (Interview UES8).

Without entering into the discussion regarding the appropriateness or desirability of standardized evaluative practices in the classroom, it is apparent that UMRE's assessment experience reveals the influence nationwide examinations may exert on schooling practices, even in cases where these assessments do not involve high-stakes testing. Uruguayan teachers adopted the evaluative approach proposed by UMRE despite the fact that there were no incentive mechanisms or penalties openly associated with this test. It is also opportune to highlight that teachers did not experience this alignment as an imposition by the central government or as a restriction of their pedagogical autonomy. They welcomed this methodology, finding it interesting and innovative.

Educators underscore that the type of evaluation proposed by UMRE epitomizes a novel pedagogical approach. Teachers find the emphasis on skill areas and problem solving particularly attractive. On the other hand, they recognize that they lack the know-how to implement it properly. That is, the methodological guidelines forwarded by UMRE are at best an initial referent; in order to be truly effective, they ought to be complemented with specific training.

Instructor 1. The methodological guides say “this should not be like this,” but they don't explain how we should do it.

Instructor 2. There have been radical changes. We studied all our lives one way, under a certain methodology. Suddenly, and especially in reading and writing, everything changes.

Instructor 1. The explanations are very theoretical. Experts prepare these materials, but they remain up there, in theoretical issues. They are not very practical, or clear about how to apply them.

Instructor 2. [They] first have to come to terms with the fact that we are not mathematicians or linguists (Interviews UES14, UES15).

The difficulties experienced in implementing change in classroom practices, according to teachers and principals, have centered around two broad predicaments: (a) lack of capacity, and (b) institutional and organizational impediments. These obstacles afflict the public sector more acutely than the private sector, and low-income more than high-income contexts.

Moreover, teachers lack the institutional space and time to master new techniques or ponder educational practices. There are few opportunities for in-service training, team curricular planning and professional development. A notable exception
was the seminar organized by UMRE for urban establishments from unfavorable sociocultural contexts.

There are establishments that take into consideration the UMRE data, but there are also establishments that do not take advantage of [this information], not because they do not want to, but because they lack the institutional space for teachers to meet. There is no time for instructors to come together and reflect. It is all left to the good will of teachers to benefit from the results. … This is an obstacle. There is an enormous quantity of information, but oftentimes it is wasted. It does not reach the teacher as it should (Interview UGN12).

Public establishments also undergo frequent and dramatic staff turnover every few years. This has been a standard feature of the Uruguayan educational system. Educators are assigned permanent posts that they periodically vacate to fill in for temporary, more desirable positions. This shift causes a ripple effect, encouraging another educator to leave her current post to fill in for the position now open. This permanent flux of school staff interrupts medium-term institutional processes and hinders educators from becoming intimately acquainted with local educational and social conditions.

[In 1996,] I was the sixth grade teacher. It was my first year in the school. That year every teacher in the school was new. We had no knowledge of those kids. And neither did the principal, who had been assigned to the school the year before. It took us a year and a half to get to know the school integrally. The only original thing that remained in the school were the students (Interview UES18).

Educators have also professed some objections to UMRE's instruments and methodology. Classroom teachers, and especially those who work with low-income children, criticize the first national evaluation primarily on three counts. First, they object that UMRE depends upon the same instrument to evaluate disparate social realities. It is conceived as intrinsically “unfair” that children from underprivileged backgrounds must face the same exigencies as children that have access to plentiful resources. (Note 11) Second, they protest that exams were administered by outside proctors. The presence of an unknown person in the classroom allegedly distressed and distracted students.

What I objected to was that the classroom teachers could not be the exam proctors. They did not trust us. The job of the proctors was only to distribute the tests, and we could have done that perfectly well. … There was too much formality, and children are not used to it. … And that had a negative impact. … Children were neither at ease nor comfortable in that environment, and that was truly detrimental for them (Interview UET5).

Third, educators claim that unfamiliarity with a multiple-choice methodology encouraged students to guess answers or select responses randomly.

In spite of these reservations, UMRE has managed to establish itself quite quickly within the Uruguayan educational landscape. This is a remarkable achievement given that the evaluation system is barely a few years old. The words of a trade union leader capture this sentiment persuasively:
I believe that at the [educational] system level, [UMRE] furnishes very valuable and interesting information. Although with some difficulties, it has been effectively incorporated into the school culture. The results are valued. The lack of discussion about the application of the third grade assessment immediately demonstrates that it has been incorporated into the school dynamic (Interview UGN37).

Teachers concur that participation in this experience has been beneficial. The UMRE test has, at worst, successfully fostered a dialogue about classroom practices and, in the best case scenarios, stimulated a renewal in pedagogic approaches.

I speak sincerely. Sometimes, when teachers have many years of experience, we find that we must take on other activities outside school. The poor economic conditions oblige us to search for other activities so that we can live with dignity. Hence, suddenly we fossilize in certain aspects, certain methodologies. This test allowed us to see that we can evaluate in a different way. It has become a model. And it gave us bibliography so that we can continue along the path paved by UMRE (Interview UET21).

The first national evaluation has become a model of how to emphasize competencies rather than straight curricular contents. Many educators, in fact, argue that UMRE has taken the lead in educational matters, leaving the old official curricular designs to recede into the background and prompting teachers to challenge long-held assumptions.

Our [curricular] program says Venn diagrams, it says operations, it says reasoning, it says application of knowledge, it says grammar, it says written expression, it says oral expression, it says reading. That is how our programs are currently structured. In the [UMRE] test, it said something else: mother tongue, reflection on language, text production. In the program, it says composition, it does not say written expression. Argumentative text is nowhere. In other words, the program is not what was evaluated. … The program talks about sentence grammar, … it talks about subject and predicate, but [UMRE] measured it as contextual grammar. … We were convinced that we were teaching, but we had not realized that what we had in front was [expected of us too]. With UMRE, we came to realize that not everything that we did was right, that students were not quite responsible [for their shortcomings], that we needed to change behaviors (Interview UET7).

In summary, the assessment of educational quality in Uruguay went beyond a mere description of the conditions of schooling throughout the country. It was decidedly a call to action.

3. National assessment and the character of the Uruguayan nation-state

A. Assessment, rationality and State legitimacy
Assessment for rational decision-making.

The UMRE assessments have been designed as a recurrent diagnostic instrument of the characteristics of the Uruguayan education sector. “The evaluation of student learning … is conceived as a systemic evaluation for feedback purposes” (UMRE, 1997a: 6). The main objective of UMRE is to supply educational actors—policymakers, school inspectors, principals and teachers—with relevant and updated information about student academic performance and the sociocultural variables that may condition it. This information will promote educational quality and equity through two channels. First, it identifies the strengths and shortcomings in education provision. Second, it sets the stage for school actors and government officials to take the necessary steps to correct deficiencies in the efficiency and distribution of educational services on the basis of systematically collected and objective data.

What we endeavor is to produce information regarding … which skills [students] have mastered and which ones they have not, what pedagogical and institutional strategies have succeeded in instilling fundamental learnings in students from the neediest sectors and, finally, where it is still necessary to invest and provide technical assistance to attain a more democratic educational system that benefits all Uruguayan children without socioeconomic distinctions (UMRE, 1996b: 1-2).

The national government has employed assessment outcomes to shape remediation policies and direct technical and economic resources to those segments of the population in greatest need. Student achievement measures and sociocultural context considerations have played a modest role in the allocation of didactic materials, technical assistance, and funds for school-based projects. The central State, however, has prioritized socioeconomic variables over strict performance standards for redistributive purposes.

UMRE expects to bring about a renovation in pedagogical practices and classroom activities on the basis of the data it collects. Specifically, the assessment system propounds the following objectives:

To make information available about [student] competency levels in areas considered to be fundamental; [and]

To provide that information to teachers so that they can search for pedagogical alternatives that may revert situations prior to the exit of students from the primary educational system (UMRE, 1997a: 5).

Teachers and principals have been formally instructed to review the findings of the first national evaluation and devise compensatory strategies in response to them. The supervisory cadre has been closely involved in this process too, particularly in schools from unfavorable sociocultural contexts.

Assessment and State legitimacy

UMRE has consistently reported and analyzed student achievement outcomes in relation to socioeconomic measures. The first national report underscores the link between test results and background variables (UMRE, 1996g). The second national report is exclusively devoted to the impact of socioeconomic factors on academic performance (UMRE, 1997f). In other words, in Uruguay, the concepts of educational quality and equity are inextricably intertwined. The national evaluation system embodies
another conduit for the central government to fulfill its obligation to reduce the gap between the privileged and underprivileged sectors of society.

[I]t is considered that having information about fundamental skill levels is crucial to recuperate the democratizing role of education. The results obtained in the first national evaluation corroborate that strong inequalities in the quality of learning opportunities exist among students from social environments with great deficits. Although it is known that this is due to a multiplicity of factors, oftentimes external to the educational system, we assume our responsibility for the permanent improvement of the quality of learning. In socially disadvantaged sectors, the mediating function of the school becomes all the more necessary in order to contribute to the personal and social development of children (UMRE, 1997a: 6).

The contextualization of average test scores has become standard practice not just in official documentation, but in the collective mind of educators throughout the country as well. Educational establishments are keenly aware of their own location within the “socioacademic map” and have learned to interpret test results in relation to the social conditions in which the school is inserted.

What is ultimately fundamental? To evaluate [student] linguistic and mathematical competencies, and to specify their family contexts. We have to see what the incidence of the [family] background is [on student achievement] (Interview UET14).

The identification of UMRE with the struggle for educational equity has been instrumental in the legitimation of the State's evaluative activities. The collection of student achievement data has validated the reform initiatives of the national government by providing scientific proof of the erosion in the quality of educational services while furnishing rational-technical justifications for the pursuit of these compensatory measures. But perhaps more importantly, UMRE has bolstered the image of the central State as an interventionist agency supporting and tending to the neediest sectors of the population. As a teachers' union leader attests,

[UMRE] ended up inspiring satisfaction. That is, it supplied schools with a depiction of their [academic] situation cross-referenced to sociocultural variables, repositioning results in terms of their contexts. This allows for a type of public stance that is congruous with the trade union's habitual position. Isn't it true? [It refers to] the degree of predetermination and conditioning faced by children as they enter the school. … In short, there was a national test and there were results of that test that did not merit objections (Interview UGN37).

There are two additional factors that have ratified the validity of the assessment instrument and, ergo, the evaluation of educational quality as a legitimate State activity. First, the national evaluation apparatus has been construed as the fruit of a consensual process that has incorporated all of the actors in the educational system, including central government officials, regional and local school inspectors, teacher representatives from the Technical-Pedagogical Assemblies, trade union leaders, and private sector delegates. Second, the State has secured the support of educators by largely circumscribing their liability over test outcomes. “There is not going to be an
index finger accusing anybody,” declared Germán Rama, the National Director of the ANEP, upon the dissemination of test results (El Observador, 1996).

Obviously, the deterioration [of the educational system] was not the unique or principal responsibility of teachers. A multiplicity of factors external to the educational system has been in operation for this to occur: the mass expansion of education, the deterioration of the quality of life of families, the retraction in educational investments during the military regime, etc. However, it is necessary to recognize that there are variables internal to the system that affect the quality of student learning: the pertinence of pedagogical strategies, the relevance of the curriculum, the modalities and expectations inherent in academic evaluations, the fact that schools from the poorest areas are the gateway to the teaching profession, among others (UMRE, 1996b: 1, bold in the original).

The circumscription of teacher liability was accomplished in two ways. First, by showcasing background variables as explanatory factors of academic attainment. “Student learning,” UMRE (1997e: 2) attests, “is strongly stratified as a function of the sociocultural context within which each school operates.” And secondly, by the central State acknowledging accountability over the conditions in schooling services. As established earlier, the national government accepts “its responsibility for the permanent improvement of the quality of learning” (UMRE, 1997a: 6, my italics).

The premise that the assessment of academic achievement legitimizes the central State potentially encompasses within itself a paradox. On one hand, evaluation endorses State action by making public its commitment and responsibility over educational processes and outcomes. On the other hand, the measurement of student learning implies a high risk: that poor test performance may provide irrevocable evidence of governmental inefficiency in educational service provision. Thus, if the central State is directly accountable for schooling processes and outcomes, doesn't evaluation jeopardize State legitimacy by calling attention to the deficiencies in schooling?

Sociological institutional theorists posit that assessment is primarily a symbolic activity (Meyer and Rowan, 1978). Its main objective, according to this paradigm, is not to produce results or provide relevant data for a diagnosis of the conditions of the education sector, but rather to appear that it does. That is, assessment strives to imbue the policy-making process with a guise of scientific rationality. The measurement of academic performance is foremost a legitimizing mechanism of State action by associating the policy-making process with scientific analysis.

Institutional sociologists underscore that attention to test scores may have a deleterious effect by uncovering inefficiencies within the educational system. Consequently, the relationship between assessment and legitimacy depends upon a loose coupling between evaluative processes and outcomes. In other words, assessment plays predominantly a figurative role, where the act of evaluating has greater salience than the findings it may uncover. This disjunction blurs the inconsistencies between educational goals and the existing conditions of schooling. In summary, institutional sociologists profess that assessment systems prescribe officially acceptable standards of behavior and operation that uphold State action. On the other hand, these principles that educational establishments professedly embrace are in fact decoupled from the actual organization of schooling.

What do we observe in the Uruguayan case? The central government has reported aggregate test results from the UMRE evaluation at the national level. Student
achievement data were not broken down by department or educational establishment. This practice differs significantly from evaluative experiences in other countries in the region that report testing outcomes by school or by region. Although withholding individual school data may indeed hide inconsistencies in educational service provision, it does not absolve the central government from liability over test outcomes. On the contrary, shielding individual school variability from public view makes the central State the sole publicly accountable agent for educational quality. This strategy would appear to contradict the predictions of sociological institutional theorists. The Uruguayan government's approach of giving ample dissemination to test results and advocating reflection over student outcomes, within a context where the central State has accepted responsibility for the quality of educational services, could give way to a crisis of legitimacy for the central government.

National test scores in the first national evaluation were, at best, substandard. Over 65% of students scored unsatisfactorily in mathematics and 43% performed poorly in language. (Note 12) Despite this inferior record, and contrary to common wisdom, UMRE did not delegitimate central State action. The central State, as predicted by sociological institutionalism, shifted the focus of attention from student outcomes to the role of sociocultural variables in academic achievement.

Assessment data fostered a national debate about the impact of socioeconomic forces on educational services. Evidence of the decay of the education sector was primarily a backdrop to champion governmental compensatory initiatives and vindicate the participation of the central State in social policies. The central government could afford to expose the deterioration in schooling because the root causes of the present educational landscape preceded the current administration. These had already been documented in detail in the student achievement studies conducted by CEPAL in 1990 (Comisión Económica para América Latina y el Caribe, 1993; 1991; 1990).

Moreover, assessment data demonstrated that, controlling for sociocultural context, the performance of public sector schools is equivalent to that of their private sector counterparts.

[W]hen we take into consideration the sociocultural context within which schools carry out their activities, results vary: public schools that operate in the most favorable contexts obtain results as good as private schools in the same contexts. At the other extreme, rural schools obtain results similar to and sometimes even better than urban establishments from contexts equally unfavorable (UMRE, 1997f: 5).

Hence, UMRE asserted the value of public education and, consequently, of State-run educational service provision.

Then, if assessment does not jeopardize State legitimation, are evaluation practices in Uruguay an instance of a loosely coupled system as predicted by sociological institutionalism? That is, is the measurement of student achievement primarily a symbolic activity where evaluative processes are of greater consequence than their outcomes? Interview and observational data collected for this research study suggest otherwise. In fact, school actors manifest that there is significant coincidence between State mandates around the UMRE evaluation and actual school behavior. In other words, there is evidence that central State action has successfully elicited organizational alignments.

Teachers, principals and supervisors alike express a high level of familiarity with assessment policies. In most cases, they have largely complied with regulations to
review and analyze test results. Furthermore, educators concur that this assessment has triggered reflection and some renovation in educational practices. As documented in the section above, some schools have devised institutional projects in response to the findings of the evaluation. Other educational establishments have modeled their classroom and evaluative activities after UMRE's appraisal, focusing, for example, on competencies rather than on curricular contents.

The degree of influence of the UMRE evaluation on educational practices stands out given that this is a low-stakes test. There are no incentives tied to performance standards. Neither are educational establishments liable to the public or to the government for student scores. Similarly, a comparison between UMRE's appraisal and other school-based diagnostic evaluative exercises confirms that the former has had quite a distinct impact on classroom activities.

Low-income school instructor. When we conducted our own evaluations, we tested concepts. The type of evaluation of UMRE, it makes you think and balance things out. It leads you to wonder what lies behind [a question]. It is evaluating the process itself. And it is providing feedback to our work. That is what we need to do … We need to change (Interview UET15).

Two factors can account for this budding transformation in the classroom brought about by the national assessment. First, the evaluation was built and designed with the support and participation of the education community at large. This process has fostered among educational actors a sense of appropriation and commitment to the work of UMRE. Second, UMRE accompanied its evaluative activities with in-service training workshops for teachers, principals and inspectors. Professional development has catalyzed the adoption and implementation of novel curricular and pedagogical propositions.

In summary, the Uruguayan central State is responsible and accountable for the conditions of the educational system. Assessment may potentially delegitimate State action by underscoring the weaknesses in the education sector. In spite of the shortcomings in schooling services exposed by UMRE, the national government did not suffer a crisis of legitimacy (Weiler, 1990). In fact, the central State was able to rally a wide base of support behind this initiative. As sociological institutionalism predicts, the central State shifted the focus of public attention from testing outcomes to a comprehensive policy initiative addressing the socioeconomic wants that condition student learning. This displacement, however, did not necessarily decouple assessment from schooling practices. This is particularly striking given that the national evaluation was not designed as a high-stakes test for students, teachers or principals. The UMRE evaluation acted as a conduit to channel the might of the State apparatus behind a pedagogical and curricular transformation.

B. Assessment and State ideology

Uruguay has a long-standing tradition of public support of social sector activities. It has the highest per capita spending on social sectors among Latin American countries. Social expenditures comprise approximately 50% of total government expenditures (World Bank, 1994). The State has been an ardent defender of public education and a champion of the conception of the Estado docente—the State as teacher (Fernández, 1997). The Uruguayan education reform program has leaned on two principles: (a) the pursuit


30 of 40and defense of basic social entitlements, and (b) t he resolute participation of the State in the attainment of these entitlements through social promotion and redistributive policies. “The history of Uruguay shows that if you want to c hange qualitatively a social sector, it must originate from a strong State presence,” remar ks a high-ranking government official. “It is unimaginable to think of education reform without the State being an important protagonist” (Interview UGN7). At an historical junction when the Keynesia n Welfare State has been pronounced to be “in terminal decline” (Jessop, 1993: 34), the AN EP frames its vision for central State action in the education sector within this very par adigm. Renato Opertti, the National Coordinator for the Planning Area of the ANEP, port rays the current efforts to transform the educational system along this vein. The [education] Reform is rooted on a vindication o f the Welfare State, in its objectives as well as in its contents; indirect ly, it defies reform programs steered by the idea of an auditing and regulating S tate that “delegates” onto the market the direct provision of services (Opertt i, 1997: 146). Santiago Gonzlez Cravino, another high-ran king government official, tempers this model of State action, while reaffirming the irrevo cable duty of the national government to support the neediest sectors of the population. In order to attend to disadvantaged people, we need an Interventionist State. In order to favor and sustain the middle class, it is essential, sometimes, the intervention of the State. But the emphasis ought t o lie in giving it a more positive and active role, using the private sector as a motivating instrument (Gonzlez Cravino, 1995: 10). The education reform program, an initiative born in the context of “budgetary limitations” and “commitments and conditions genera ted by international organizations,” has been target of harsh criticisms from those that believe that the central State has relinquished its historic role. This “State” has had no incidence in overcoming the sociocultural deficiencies of increasing student cohorts. Neither have “compensatory” or “focalization” policies demonstrated any ability to surmount … the true causes of pauperization, marginality and social exc lusion (Pallares, 1998: 64). President Sanguinetti, however, has staunchly defen ded his agenda for the transformation of the educational system as a “new form of humanist liberalism based precisely on the promotion of equity” ( El Pas 1997). UMRE has evolved and operated within this f ramework of State-societal relations. Hence, the assessment system has sought to align it s activities with a model of governmental action that promotes the production an d distribution of social well-being. The Uruguayan education reform is statist in its de fense of public education. The [UMRE] evaluation is very much linked to this. It is an attempt to promote social policies, to provide services. It is not symptomatic of a retracting [State] (Interview UGN3). The national evaluation, as already documen ted earlier in this article, has stressed


31 of 40the utilization of student achievement information in support of remediation programs intended for disadvantaged communities. As test res ults have come to light, the central government has assumed responsibility for the condi tions of schooling and voiced an institutional commitment to enact a policy agenda t o address the shortcomings identified. In this sense, the national evaluation has proceeded in the spirit of social accountability and under the currency of social equalization. The construction of UMRE in these terms has been a deliberate choice. The World Bank, who currently finances UMRE activities, had o riginally proposed an evaluation system based on a consumer accountability paradigm. The assessment would have operated under a different logic where parents, as consumers of educational services, rely on test results to select an educational estab lishment for their children ( ltimas Noticias 1996). When the administration of Germn Rama too k charge of the ANEP in 1995, however, there was a change in strategy and U MRE was shaped after the assessment model that Rama had developed earlier at CEPAL (Comisin Econmica para Amrica Latina y el Caribe, 1991) The association between the assessment syst em and the World Bank has inspired some mistrust regarding the credibility of the mode l of State-societal relations espoused by UMRE. In fact, this partnership has threatened t o interfere with the legitimacy of the national appraisal. The Uruguayan Federation of Tea chers and the Technical-Pedagogical assemblies have expressed opp osition to the evaluation of student achievement “because of its international p erspective,” associated with neoliberal policies that seek to reduce governmenta l intervention in the provision of social services (Interviews UGN6, UGN37). The [Central Board Council] has deteriorated the au tonomy of the [Primary Education Council] by assigning functions to a para llel organization (MECAEP). [MECAEP] operates with resources conditio ned by international loans and imposes EDUCATIONAL policie s that do not respond to the needs forwarded by the NATIONAL TEAC HER CADRE (Asamblea Nacional Tcnico Docente, 1998: 30, caps in the original). The national government, in turn, underscor es its independence from the multilateral organization and reaffirms to the publ ic opinion its defense of public intervention in the social sectors. “We are not dom inated by [the World Bank],” asserts Claudio Williman, the vice president of the Central Board Council, “We are an underdeveloped country where State involvement is v ital. … Education is a competency of the State” ( El Diario 1996). C. Assessment and State control The Uruguayan educational system is structu red in a greatly centralized and hierarchical fashion. All decisions—from administra tive matters to curricular frameworks—are determined in Montevideo and uniform ly enforced throughout the country. “Teachers in Uruguay behave like an army,” remarks a government official. “If you give them an order, they will follow it” (Inter view UGN3). There are extremely limited instances of organizational decentralizatio n or institutional autonomy (Fernndez, 1997). World Bank report ascribes to this “extreme ” concentration of power a profoundly deleterious effect. The highly centralized public primary education sys tem hinders undertaking the required changes to achieve greater sectoral ef ficiency, equity, and


quality. Centralization has restricted teachers and local managerial authority and initiative, reduced teacher-pupil interaction, discouraged personal growth and professional advancement, and limited the extent to which managerial staff and teacher opinions in pedagogical and administrative matters are solicited and recognized by those in charge of their workplace. On the other hand, centralization has overburdened policymakers and higher level staff with routine tasks and decisions, depriving them from having a more long-term strategic and prospective approach to the sector. The 19 [Departmental Inspectorates] are more concerned with transmitting centrally adopted policies and guidelines and collecting data on behalf of ANEP's central offices than with enforcing activities to enhance the quality of education, for which they are ill equipped and trained (World Bank, 1994: 11).

Uruguayan scholars concur that the educational system may benefit from greater flexibility and autonomy in its governance (Pallares, 1998; Fernández, 1997; Macedo, 1995). An initial step in this direction has been the disbursement of small grants to educational establishments for the implementation of school-based initiatives that can enhance educational quality (Uruguay—ANEP-CODICEN, 1998).

Undoubtedly, the national assessment supports the concentration of authority at the central level. As Hans Weiler (1993: 76) proposes, “evaluation is not merely the gathering and dissemination of information; it also has something to do with the authoritative interpretation of standards of knowledge and is endowed with a considerable amount of force, both real and symbolic.” UMRE reinforces curricular mandates pronounced by the ANEP. It also fosters the alignment of school practices with centralized prescriptions. A government informant even claims that UMRE was an attempt by the central State to exert greater control over the flow of information on academic achievement after the release of the highly critical CEPAL studies (Interview UGN3).

On the other hand, the organization and implementation of the national evaluation defy this depiction of closed centralized control. UMRE has dedicated great effort to the incorporation of an ample array of voices and opinions into this process. It has steadily encouraged the systematic and continuous participation of all levels of civil society. The UMRE Advisory Group consists of representatives from the public and the private sectors, as well as central, departmental and local jurisdictions. Teachers, principals and supervisors have been repeatedly consulted on a wide variety of topics, from the design of the curricular matrix to be appraised to the development of test items.

UMRE's experience serves as a model of centralized governance sustained and enriched by democratic cooperation. The involvement of the Technical-Pedagogical assemblies and the teachers' union in the national assessment is living proof that even unpopular policies may garner the consent of reticent social actors in an environment that nurtures open and effective dialogue.

4. Concluding remarks

UMRE incarnates a model of social—as opposed to consumer—accountability where the central State must respond for the conditions of schooling. In this paradigm, the national government not only functions as a guarantor of educational quality and equity, but it also upholds its obligation as provider of educational services. The


33 of 40evaluation of student performance is an avenue to d efend the role of public education as an equalizing social force and reaffirm the central government's support to the neediest sectors of the population. However, student performance measures, as a lready expressed repeatedly, may potentially exert a destabilizing role by highlight ing deficiencies in educational service provision. The central State averts the potential c risis of legitimation (Weiler, 1990) by shifting the character of assessment from the measu rement of student outcomes to the remediation of the ills in student learning. The conceptualization of education as a gov ernmental responsibility has largely insulated the assessment process from finger-pointi ng or assigning blame. It is not teachers or schools that are being tested, but the educational system as a whole. This approach has generated the potential for educators to identify with and participate in evaluative activities. Democratic participation, in turn, buttresses the legitimacy of the assessment scheme. UMRE has spurred the beginnings of a curric ular and pedagogical transformation throughout the Uruguayan educational landscape. Thi s is a promising first step for an evaluation system in its formative years. As new da ta are collected and UMRE consolidates its role within the education sector, the central State will face a new challenge: It will have to account for the effectiv eness of its own policies in reducing existing inequalities. NotesI would like to thank the Unidad de Medicin de Res ultados Educativos for their support in this research project. I am indebted to Patricia Arregui, Martin Carnoy, John Meyer, Kathleen Morrison, Karen Mundy, Pedro R avela and Michel Welmond for helpful comments and suggestions. All r emaining errors are my own. This research project was supported by a summe r fellowship from the Center for Latin American Studies at Stanford University a nd a Spencer Foundation Research Training grant. A Spanish version of this article has been published by the Working Group of Standards and Evaluation of GR ADE-PREAL and can be accessed in Adobe Acrobat format at http://www.grade.org.pe/gtee-preal/docs/Benveniste. pdf 1. This analysis excludes tertiary education. 2. In 1998, the ANEP signed a loan agreement with the World Bank for an additional US$ 28 million for the second phase of the MECAEP p roject. 3. This section draws largely from personal interviews conducted with government officials involved in the design and implementation of the primary and secondary national assessment systems. 4. In the 1998 evaluation, teachers were invited to pa rticipate in the formulation of test items. 5. Correlation coefficients and standard errors were n ot provided by the source document. 6. A government informant explains the reasons behind secondary teachers' more contentious attitude in this manner: The secondary education teacher cadre is very diffe rent to primary school educators. The latter is a professionalized group. One hundred percent of [primary school] teachers obtain their d egrees. They all went to normal institutes. They all have the title hanging somewhere 7.


34 of 40at home. Hence, they have a positional culture that is more homogeneous. In secondary schooling, only 30% of th e people teaching have specific preparation for being a 'pro fessor.' There are university professionals, university students .... Thus, the heterogeneity is much greater. ... Secondly, second ary teachers have adopted a "let’s see" attitude towards the educatio n reform. Primary teachers were "calmer," more easy going, less oppos ition. That is why MECAEP, and more specifically UMRE, has been able t o secure an active collaboration, inclusive of the teachers’ un ions and the ATD, the Technical-Pedagogical Assembly. In the case of secondary schools, the unions were more in opposition from th e get go, more combative because the education reform was deeper. The ATD is also more politicized. The ATD leaders have emphasized t heir own ATD position over the stance of the [teachers'] union ( Interview UGN3). There is currently some uncertainty regarding the t ransfer of UMRE from the MECAEP project to the ANEP due to potential changes in the organizational and institutional structure of the evaluation system. 8. The characteristics of these evaluations vary from school to school. Thus, average student test scores are not comparable across schoo ls. 9. Real average teacher salaries, however, are still s lightly below their 1988 level nonetheless. 10. This stance contradicts another argument that point s at the inherent inequity of holding different expectations for students from di ssimilar sociocultural contexts, and particularly of holding lower expectations for children from lower-income backgrounds. The challenge would be not to "veil" t he differences among social groups, but rather to introduce the necessary compe nsatory measures so that all students, regardless of their sociocultural context can equally reach high achievement levels or national standards. 11. Unsatisfactory test performance was defined by UMRE as inferior to 60% of correct answers. 12.Referencesde Planeamiento de ANEP (1998). Incidencia de los Factores Socioculturales en la Repeticin Escolar Montevideo: Consejo Directivo Central, ANEP. Asamblea Nacional Tcnico Docente (1998). Resoluciones Montevideo: mimeo. Comisin Econmica para Amrica Latina y el Caribe (1990). Enseanza Primaria y Ciclo Bsico de Educacin Media en el Uruguay Montevideo: CEPAL. Comisin Econmica para Amrica Latina y el Caribe (1991). Que Aprenden y Quienes Aprenden en las Escuelas de Uruguay: Los Contextos Sociales e Institucionales de xitos y Fracasos Montevideo: CEPAL. Comisin Econmica para Amrica Latina y el Caribe (1992). Aprenden los Estudiantes? El Ciclo Bsico de Educacin Media Montevideo: CEPAL. Comisin Econmica para Amrica Latina y el Caribe (1993). Escuelas Productoras de Conocimientos en los Contextos Socioculturales Ms Desfavorables Montevideo:


35 of 40CEPAL.Comisin Econmica para Amrica Latina y el Caribe (1994). Los Bachilleres Uruguayos. Quienes Son, Que Aprendieron y Que Opina n Montevideo: CEPAL. Domingo, R. (1998). Evolucin Salario Docente Promedio Montevideo: Universidad de la Repblica, mimeo.El Diario (1996). Estado No Puede Entregar la Educacin a Manos Priva das Montevideo, November 6-7.El Observador (1996). Los Resultados No son Satisfactorios Montevideo, November 29.El Pas (1995). ADEMU Informa Montevideo, November 14. El Pas (1997). Sanguinetti Ratific Apoyo para la Reforma Educativ a Montevideo, July 30.El Pas (1998). Dinero para Universalizar la Enseanza Preescolar Montevideo, October 6.Fernndez, T. (1997). “Educacin post-vareliana: Ev idencia para nuevas prioridades.” Revista Educacin 6: 63-78. Gonzlez Cravino, S. (1995). "La reforma de los ser vicios sociales del Estado: Una visin global." In Ministerio de Educacin y Cultur a, (Ed.) La Gestin en la Educacin, Poltica Educativa, Administracin y Financiamiento Montevideo: Oficina de Planeamiento y Presupuesto, FAS, 6-10.Gonzlez Rissotto, R. (1997). Sistema Educativo Nacional de Uruguay: 1993 Montevideo: Organizacin de Estados Iberoamericanos para la Educacin, la Ciencia y la Cultura.Inspeccin Departamental de Montevideo (1998). Evaluacin Montevideo: ANEP-CEPInspeccin Departamental de Montevideo, m imeo. Jessop, B. (1993). “Towards a Schumpeterian Workfar e State? Preliminary remarks on Post-Fordist Political Economy.” Studies in Political Economy 40(Spring): 7-39. Macedo, B. (1995). "Descentralizacin de la Educaci n Bsica: Las Experiencias de Chile y Uruguay." In U. C. d. U. D. A. Larraaga, ( Ed.) Problemas y Desafos en la Educacin Bsica Montevideo: Oficina de Planeamiento y Presupuesto FAS, 27-29. Meyer, J. and B. Rowan (1978). "The structure of ed ucational organizations." In M. Meyer, (Ed.) Environments and Organizations San Francisco: Jossey-Bass, 78-109. Opertti, R. (1997). “La reforma educativa: Reinvind icacin del Estado Benefactor.” Cuadernos del CLAEH 22(78-79): 139-159. Pallares, R. (1998). "Fundamentos y consideraciones para una poltica alternativa en la educacin media." In F. V. Tras, (Ed.) Reforma Educativa: Anlisis Crtico y


Rama, G. (1998). La Reforma Educativa en Uruguay. Montevideo: Administración Nacional de Educación Pública, mimeo.

Ravela, P. (1997a). "Escuelas productoras de conocimientos en los contextos socioculturales más desfavorables." In Organización de Estados Iberoamericanos (Ed.), Cumbre Iberoamericana: Memoria, Parte II. Aspectos Cualitativos y Cuantitativos en la Evaluación Educativa: Una Aproximación al Rendimiento Escolar. Buenos Aires: OEI, 99-125.

Ravela, P. (1997b). "La búsqueda de escuelas productoras de conocimientos en el marco de la evaluación nacional de aprendizajes en Uruguay." In Organización de Estados Iberoamericanos (Ed.), Cumbre Iberoamericana: Memoria, Parte I. Aspectos Cualitativos y Cuantitativos en la Evaluación Educativa: Una Aproximación al Rendimiento Escolar. Buenos Aires: OEI, 113-135.

Últimas Noticias (1996). El Banco Mundial Sugiere que ANEP Reduzca sus Controles. Montevideo, November 5.

UMRE (1996a). Acta No. 8. Reunión Grupo de Consulta. Montevideo: UMRE, mimeo.

UMRE (1996b). Manual del Aplicador. Montevideo: Administración Nacional de Educación Pública.

UMRE (1996c). Material Informativo para Docentes: III. Manual de Interpretación Prueba de Lengua Materna. Montevideo: Administración Nacional de Educación Pública.

UMRE (1996d). Material Informativo para Docentes: III. Manual de Interpretación Prueba de Matemática. Montevideo: Administración Nacional de Educación Pública.

UMRE (1996e). Material Informativo para Maestros y Directores: I. Fundamentos. Montevideo: Administración Nacional de Educación Pública.

UMRE (1996f). Material Informativo para Maestros y Directores: II. Aspectos Organizativos. Montevideo: Administración Nacional de Educación Pública.

UMRE (1996g). Primer Informe de Difusión Pública de Resultados. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997a). Fundamentos y Objetivos de la Evaluación Muestral de Aprendizajes en 3er Año Educación Primaria. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997b). Lenguaje: Especificaciones y Sugerencias Didácticas. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997c). Matemática: Especificaciones y Sugerencias Didácticas. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997d). Primer Informe sobre Producción Escrita. Montevideo: Administración Nacional de Educación Pública.


UMRE (1997e). Programa de Mejoramiento de los Aprendizajes en las Escuelas Públicas Urbanas de Contextos Desfavorables. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997f). Segundo Informe de Difusión Pública de Resultados. Montevideo: Administración Nacional de Educación Pública.

UMRE (1997g). Tercer Informe de Difusión Pública de Resultados. Montevideo: Administración Nacional de Educación Pública.

UMRE (1998a). Acta No. 2. Reunión Grupo de Consulta. Montevideo: UMRE, mimeo.

UMRE (1998b). Evaluación de Aprendizajes en Tercer Año de Educación Primaria. Boletín Informativo para Maestros. Montevideo: Administración Nacional de Educación Pública.

UMRE (1998c). Uruguay—Unidad de Medición de Resultados Educativos. Montevideo: mimeo.

Uruguay—Administración Nacional de Educación Pública (1996a). Nota de Prensa. Primeros Resultados de la Evaluación Nacional de Aprendizajes en 6to Año de Enseñanza Primaria. Montevideo: Consejo Directivo Central, mimeo.

Uruguay—Administración Nacional de Educación Pública (1996b). Resolución, Marzo de 1996. Montevideo: Consejo Directivo Central, mimeo.

Uruguay—ANEP-CEP-Inspección Técnica (1991a). Circular No. 3. Montevideo: ANEP-CEP-Inspección Técnica, mimeo.

Uruguay—ANEP-CEP-Inspección Técnica (1991b). Circular No. 26. Montevideo: ANEP-CEP-Inspección Técnica, mimeo.

Uruguay—ANEP-CEP-Inspección Técnica (1991c). Circular No. 30. Montevideo: ANEP-CEP-Inspección Técnica, mimeo.

Uruguay—ANEP-CODICEN (1998). 1998: La Educación Uruguaya. Situación y Perspectivas. Montevideo: Consejo Directivo Central, ANEP.

Uruguay—ANEP-MECAEP (1997). Informe de Actividades No. 3, 1997, 1er Semestre. Montevideo: ANEP-MECAEP.

Uruguay—Ministerio de Educación y Cultura (1996). Desarrollo de la Educación: Informe Nacional de Uruguay. Montevideo: Ministerio de Educación y Cultura, Dirección de Educación.

Uruguay—Ministerio de Educación y Cultura (1997). Anuario Estadístico de Educación 1996. Montevideo: Ministerio de Educación y Cultura, Dirección de Educación.

Weiler, H. (1990). "Decentralisation in educational governance: An exercise in contradiction?" In M. Granheim, M. Kogan and U. Lundgren (Eds.), Evaluation as Policymaking: Introducing Evaluation into a National Decentralised Educational System. London: Jessica Kingsley Publishers, 42-65.


Weiler, H. (1993). "Control versus legitimation: The politics of ambivalence." In J. Hannaway and M. Carnoy (Eds.), Decentralization and School Improvement: Can We Fulfill the Promise? San Francisco: Jossey-Bass Publishers, 55-83.

World Bank (1994). Uruguay, Basic Education Quality Improvement Project. Washington, DC: World Bank.

About the Author

Luis Benveniste
The World Bank, 1818 H Street, NW, Room MC8-414B
Washington, DC 20433
Tel.: 202.473.8495
Fax: 202.614.0935
Lbenveniste@Worldbank.Org
Email: luis_benveniste@post.harvard.edu

Luis Benveniste is an education specialist at the World Bank. His recent research work has focused on the politics of student achievement testing in Argentina, Chile and Uruguay. He has also co-authored articles on multigrade schooling and school accountability mechanisms in public and private schools. He has worked on a wide variety of international education projects in Latin America, Western Africa and East Asia.

Copyright 2000 by the Education Policy Analysis Archives

The World Wide Web address for the Education Policy Analysis Archives is epaa.asu.edu

General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, glass@asu.edu, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-0211 (602-965-9644). The Commentary Editor is Casey D. Cobb: casey.cobb@unh.edu.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
Greg Camilli, Rutgers University
John Covaleskie, Northern Michigan University
Alan Davis, University of Colorado, Denver
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing


Richard Garlikov, hmwkhelp@scott.net
Thomas F. Green, Syracuse University
Alison I. Griffith, York University
Arlen Gullickson, Western Michigan University
Ernest R. House, University of Colorado
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Calgary
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Dewayne Matthews, Western Interstate Commission for Higher Education
William McInerney, Purdue University
Mary McKeown-Moak, MGT of America (Austin, TX)
Les McLean, University of Toronto
Susan Bobbitt Nolen, University of Washington
Anne L. Pemberton, apembert@pen.k12.va.us
Hugh G. Petrie, SUNY Buffalo
Richard C. Richardson, New York University
Anthony G. Rud Jr., Purdue University
Dennis Sayers, Ann Leavenworth Center for Accelerated Learning
Jay D. Scribner, University of Texas at Austin
Michael Scriven, scriven@aol.com
Robert E. Stake, University of Illinois—UC
Robert Stonehill, U.S. Department of Education
David D. Williams, Brigham Young University

EPAA Spanish Language Editorial Board

Associate Editor for Spanish Language: Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México, roberto@servidor.unam.mx

Adrián Acosta (México), Universidad de Guadalajara, adrianacosta@compuserve.com
J. Félix Angulo Rasco (Spain), Universidad de Cádiz, felix.angulo@uca.es
Teresa Bracho (México), Centro de Investigación y Docencia Económica-CIDE, bracho@dis1.cide.mx
Alejandro Canales (México), Universidad Nacional Autónoma de México, canalesa@servidor.unam.mx
Ursula Casanova (U.S.A.), Arizona State University, casanova@asu.edu
José Contreras Domingo, Universitat de Barcelona, Jose.Contreras@doe.d5.ub.es


Erwin Epstein (U.S.A.), Loyola University of Chicago, Eepstein@luc.edu
Josué González (U.S.A.), Arizona State University, josue@asu.edu
Rollin Kent (México), Departamento de Investigación Educativa-DIE/CINVESTAV, rkent@gemtel.com.mx, kentr@data.net.mx
María Beatriz Luce (Brazil), Universidad Federal de Rio Grande do Sul-UFRGS, lucemb@orion.ufrgs.br
Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México, javiermr@servidor.unam.mx
Marcela Mollis (Argentina), Universidad de Buenos Aires, mmollis@filo.uba.ar
Humberto Muñoz García (México), Universidad Nacional Autónoma de México, humberto@servidor.unam.mx
Ángel Ignacio Pérez Gómez (Spain), Universidad de Málaga, aiperez@uma.es
Daniel Schugurensky (Argentina-Canadá), OISE/UT, Canada, dschugurensky@oise.utoronto.ca
Simon Schwartzman (Brazil), Fundação Instituto Brasileiro de Geografia e Estatística, simon@openlink.com.br
Jurjo Torres Santomé (Spain), Universidad de A Coruña, jurjo@udc.es
Carlos Alberto Torres (U.S.A.), University of California, Los Angeles, torres@gseisucla.edu