Educational policy analysis archives

Material Information

Title: Educational policy analysis archives
Publisher: Arizona State University ; University of South Florida
Place of Publication: Tempe, Ariz. ; Tampa, Fla.
Publication Date: June 28, 2004

Subjects / Keywords:
Education -- Research -- Periodicals ( lcsh )
non-fiction ( marcgt )
serial ( sobekcm )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
E11-00378 ( USFLDC DOI )
e11.378 ( USFLDC Handle )

Full Text


A peer-reviewed scholarly journal. Editor: Gene V Glass, College of Education, Arizona State University. Copyright is retained by the first or sole author, who grants right of first publication to the EDUCATION POLICY ANALYSIS ARCHIVES. EPAA is a project of the Education Policy Studies Laboratory. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Volume 12, Number 29 -- June 28, 2004 -- ISSN 1068-2341

High-Stakes Testing in the Warm Heart of Africa:
The Challenges and Successes of the Malawi National Examinations Board

Elias Chakwera
University of Massachusetts Amherst and Domasi College

Dafter Khembo
University of Massachusetts Amherst and Malawi National Examinations Board

Stephen G. Sireci
University of Massachusetts Amherst

Citation: Chakwera, E., Khembo, D., & Sireci, S. (2004, June 28). High-stakes testing in the warm heart of Africa: The challenges and successes of the Malawi National Examinations Board. Education Policy Analysis Archives, 12(29). Retrieved [Date] from http://epaa

In the United States, tests are held to high standards of quality. In developing countries such as Malawi, psychometricians must meet these same high standards while also dealing with several additional pressures, such as widespread cheating, test administration difficulties caused by challenging landscapes and poor resources, difficulties in reliably scoring performance assessments, and extreme scrutiny from political parties and the popular press. The purposes of this paper are to (a) familiarize the measurement community in the US with Malawi's assessment programs, (b) discuss some of the unique challenges inherent in such a program, (c) compare testing conditions and test administration formats between Malawi and the US, and (d) provide suggestions for improving large-scale testing in countries such as the US and Malawi. By learning how a small country instituted and supports its current testing programs, a broader perspective on resolving current measurement problems throughout the world will emerge.

Malawi is a small landlocked country in Africa, south of the Equator, covering an area of 118,484 square kilometers, of which 20% is water. The country is bordered to the North and North-East by the Republic of Tanzania and to the East, South, and South-West by the Republic of Mozambique. The Republic of Zambia forms the Western border.

Malawi gained independence from Britain in 1964 and operated as a one-party state until 1994, when a multiparty government was elected. The population of Malawi is estimated at 11 million people. About 46% of the population consists of children and youth less than 15 years of age. The literacy level is estimated at 40% of the adult population (29% female and 48% male). Because of these poor levels of literacy, poverty has been rampant. This situation prompted the new government to introduce Free Primary Education (FPE) in 1994 as a tool for alleviating poverty. This innovation received overwhelming support from the public: enrolment in primary schools increased from 1.8 million to about 3 million pupils. This meant an increased demand for resources that support learning, including assessment.

In this paper, we provide an overview of the national testing systems that support FPE in Malawi.
The psychometric, logistic, and political factors affecting this system are discussed, as are the similarities and differences between educational testing in Malawi and in the United States. We begin with a description of the Malawi National Examinations Board.

A Brief History of the Malawi National Examinations Board

In 1969, the Malawi parliament enacted a law that created the Malawi Certificate Examination Board (MCE Board). This Board was charged with the responsibility of developing and administering the Malawi Certificate of Education (MCE) examination in conjunction with the Associated Examining Board (AEB) of the UK. The first such examination was administered in 1972. Prior to 1972, school leavers in Malawi were taking the Cambridge Overseas School Leaving Examination from the UK.

Seven years later, the MCE Board became the Malawi Certificate Examinations and Testing Board (MCE and TB). The MCE and TB continued to administer the MCE examinations with the AEB until 1989, when the handover was completed.


Following an evaluation of examinations in Malawi in 1984, it was decided that all public examinations should be developed and administered by one central authority. Consequently, in 1987, parliament approved legislation merging the examinations section of the Ministry of Education with the MCE and TB, thus forming the Malawi National Examinations Board (MANEB), which currently operates the major educational testing programs in Malawi. In addition, MANEB took over the responsibility of developing and administering Teacher Certificate Examinations and Craft Examinations for technical schools.

Malawi's Education System

The Malawi education system consists of three levels: primary, secondary, and tertiary. The primary education level is an eight-year cycle running from Standards (grades) 1 to 8. Standard 8 is the equivalent of Grade 8 in the United States. At the end of Standard 8, pupils take the Primary School Leaving Certificate Examination (PSLCE).

Secondary education lasts for four years, running from Form 1 to Form 4, which are the equivalents of Grades 9 to 12 in the U.S. Two national examinations are administered at this level: the Junior Certificate Examination (JCE) at the end of junior secondary in Form 2, and the Malawi School Certificate Examination (MSCE) at the end of senior secondary in Form 4.

Tertiary education is usually four years, particularly at the university level, although there are other tertiary educational institutions that offer courses and programs of less than four years. Teacher training for primary school teachers is usually two years, while technical training may last for four years or less depending on the field of specialization. Access to tertiary education is still very limited because of the scarcity of places at that level.

A description of these major educational testing programs follows.
Table 1 presents a brief summary of these programs, including the grade at which each examination is administered, the purpose of the test, the number of students sitting for each test, and the passing percentages for the most recent administration for which data are available.

Table 1
Summary of Malawi's Major Educational Assessments

Exam   Grade Administered  Purpose                                        # Examinees 2001  % Pass 2001
PSLCE  8                   Secondary school entrance                      161,786           26.33*
JCE    10                  10th grade exit; basic employment certificate  82,530            57.21
MSCE   12                  High school exit; postsecondary admissions     61,856            18.01

*This represents the proportion of PSLCE examinees selected to secondary school.

Major Testing Programs in Malawi

MANEB develops and administers three major national school examinations: the PSLCE, JCE, and MSCE. A brief description of these examination programs is provided below.

The PSLCE

The PSLCE terminates the primary cycle. Its results are used for certification and for selection into Form 1 of secondary education. The results are reported in letter grades A-F, where A denotes excellent performance and F a fail. Five subjects are offered at this level: English, Mathematics, Primary Science, Chichewa (a local language), and Social Studies. For selection purposes, students are ranked within their districts. Each district is allocated a certain number of Form 1 places in national secondary schools. The district quota depends on the proportion of candidates in the district relative to the national total. The remaining candidates are considered for places in District Secondary Schools and Community Day Secondary Schools (CDSSs). At each selection level, boys and girls are considered separately to ensure gender equity (i.e., within-group norming). A single merit list would result in boys getting a disproportionate number of secondary school places, since they generally perform better than girls.

For many people, the certification aspect of the PSLCE is not as important as its selection function, because the certificate can no longer be used for employment purposes, as is the case with the MSCE. Therefore, pupils are under pressure to perform well enough to be selected into secondary education. A longitudinal sampling of the numbers of students taking the PSLCE and the numbers passing it is presented in Table 2. As these data show, the demand for secondary education has always outstripped the available places.
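The quota-and-ranking selection procedure just described can be sketched in a few lines of Python. This is only an illustrative reading of the text, not MANEB's actual algorithm: the district names and scores are hypothetical, and the even split of a district's places between girls and boys is our simplifying assumption (the paper says only that the sexes are ranked separately).

```python
def allocate_district_places(candidates_by_district, total_places):
    """Split the national Form 1 places across districts in proportion
    to each district's share of the national candidate total."""
    national_total = sum(candidates_by_district.values())
    return {
        district: round(total_places * n / national_total)
        for district, n in candidates_by_district.items()
    }

def select_within_district(candidates, places):
    """Rank girls and boys on separate merit lists (within-group
    norming) and fill the district's places from each list.
    The 50/50 split of places is an assumption for illustration."""
    selected = []
    for sex in ("F", "M"):
        merit_list = sorted(
            (c for c in candidates if c["sex"] == sex),
            key=lambda c: c["score"],
            reverse=True,
        )
        selected.extend(merit_list[: places // 2])
    return selected
```

Under the proportional rule, a hypothetical district supplying 12,000 of 20,000 national candidates would receive 60 of 100 national places, and its top-ranked girls would be selected even if some unselected boys scored higher.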
Table 2
Standard 8 - Form 1 Transition

Year  PSLCE Entry  # Passing  % Selected
1977  47,317       4,854      10.3
1987  95,631       6,894      7.2
1997  128,379      9,170      7.1
2001  161,786      42,600*    26.3

*CDSSs were instituted in 1998. This figure includes the 5% of students who went to national secondary schools and the 21% who went to CDSSs.

Formal education for those who fail to get into national secondary school effectively stops at Standard 8. Before 1999, some pupils received secondary school tuition through Distance Education Centers (DECs), which have since been turned into Community Day Secondary Schools (CDSSs). The increased transition rate in 2001 reflects the inclusion of students who go to CDSSs. However, the quality of learning at CDSSs is considered inferior to that of the conventional secondary schools in terms of materials and the number and quality of teachers. Private secondary schools also provide secondary education, but they are too expensive for most parents, and the majority of them are not well resourced. Competition therefore remains high for places in the conventional secondary schools, where the government subsidizes tuition and the schools are better resourced than the private schools and CDSSs. This is what makes the PSLCE a high-stakes examination: it determines one's opportunity for higher, better, and affordable education.

JCE

The JCE is administered after two years of secondary education. Originally this examination was meant to assess skills and knowledge leading to gainful employment and to further education in senior secondary school. Twenty-two subjects are offered for this examination. Candidates must pass at least six of them, including English, to qualify for a certificate and proceed to Form 3 (11th grade). The examination results are shown in Table 3.

Table 3
JCE Results

Year  Entry   # Passing  % Passing
1998  74,122  51,878     70.0
1999  69,148  63,133     91.3
2001  82,530  47,218     57.21

In 1997 the government phased out the JCE as a minimum requirement for entry into the civil service. However, due to intense job competition, it is still used as a hiring criterion for some blue-collar jobs. Furthermore, a student must pass the JCE before sitting for the MSCE.
The competition for Form 3 places is no longer stiff, since there are equal numbers of places in the junior and senior secondary sections, and so all who pass the JCE are automatically promoted to Form 3.

MSCE

The MSCE, which is equivalent to the high school diploma in the US, is administered at the end of secondary education. The examination results are used for certification (i.e., certifying successful completion of secondary education) and for selection into the university and other tertiary institutions. A total of 21 subjects are offered at this level. Each subject is graded on a nine-point scale using the following standards:

Grades 1-2 for distinction;
Grades 3-6 for credit;
Grades 7-8 for general pass; and
Grade 9 for fail.

To qualify for a certificate, a candidate must pass any combination of at least six subjects including English, and at least one of the grades must be a credit pass. An MSCE certificate can also be awarded if a candidate passes five subjects including English, at least three of which are credit passes.

The grading process for the MSCE makes the following assumptions:

The examinations are equivalent across years in terms of difficulty level, content covered, and skills examined;
The test administration conditions are uniform from year to year;
The student cohorts taking the examination each year are randomly equivalent.

For candidates to be considered for selection into the university, they must have earned credit or distinction on at least six exams, one of which must be English. A pass with at least a credit grade in English ensures that candidates have adequate communication skills to participate fully in college lectures. Currently, the University of Malawi admits only 0.3% of secondary school leavers, which illustrates the stiff competition. In addition, the MSCE certificate has become the minimum qualification for gainful employment. Because of these two functions -- selection and certification -- there is a lot of pressure on students to pass the examination, making it extremely high-stakes.

Over the last decade there have been declining trends in pass rates on the MSCE, as shown in Table 4. This trend has been attributed to several factors. One factor is indiscipline due to some students' misinterpretation of their newly found democracy, human rights, and freedom (Malunga et al., 2000).
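The certificate rules above are mechanical enough to state as a short code sketch. This is our reading of the two qualification routes, not MANEB's official algorithm: the function name and dictionary layout are invented for illustration, a "pass" is read as grade 8 or better, and a "credit pass" as grade 6 or better.

```python
def qualifies_for_msce(grades):
    """grades: dict mapping subject name -> grade (1 best ... 9 fail).

    Grade bands, per the scale above: 1-2 distinction, 3-6 credit,
    7-8 general pass, 9 fail. Returns True if the candidate earns an
    MSCE certificate under either route described in the text."""
    passed = {subj: g for subj, g in grades.items() if g <= 8}
    credits = sum(1 for g in passed.values() if g <= 6)  # credit or better
    if "English" not in passed:
        return False
    # Route 1: at least six passes including English, one at credit level.
    if len(passed) >= 6 and credits >= 1:
        return True
    # Route 2: five passes including English, three at credit level.
    if len(passed) >= 5 and credits >= 3:
        return True
    return False
```

For example, under this reading a candidate with English at grade 5, two other credit passes, and two general passes qualifies via the five-subject route even without a sixth pass.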
Kuthemba-Mwale, Hauya and Tizifa (1996), for example, observed "a general lack of interest among students to do academic school work." Other factors that contribute to poor examination results include increasing numbers of students without a corresponding increase in instructional resources, and inadequate and under-qualified teachers. As Malunga et al. reported, there are only 4,998 secondary school teachers instead of the 12,000 required by the system. In addition, it was also observed that 67.2% of the teachers were under-qualified.

Table 4
MSCE Pass Rates 1992-99

Year   MSCE Entry  # Passing  Pass (%)
1992   10,753      5,653      44.4
1993   13,254      7,123      46.7
1994   16,264      7,871      43.1
1995*  23,219      7,421      29.4
1996   24,213      8,036      30.7
1997   26,543      6,740      23.6
1998   35,438      6,329      17.9
1999   36,732      5,536      14.3
2001   61,856      11,143     18.0

*Private secondary schools opened in 1995, which may explain the large increase in the number of students tested in this year. Other increases are harder to explain, but may be due in part to students who failed the exam in previous years sitting again for the exam.

Practical, Political, and Psychometric Issues Confronted by MANEB

MANEB administers the three national examinations described above, besides other responsibilities such as the development and administration of Malawi Craft Examinations and teacher certification examinations. In all these examinations, the numbers of candidates and examination centers have been increasing every year. This has resulted in a number of administrative challenges, which we describe next.

Dealing with Limited Resources

From the three tables above, it is apparent that MANEB's volume of work and expenditure increase every year. MANEB's funding comes largely from a government subvention, whose annual increase does not match the increased costs of administering examinations, especially in view of increasing inflation. The major areas of expenditure, given the increasing number of examination centers scattered throughout the country, are delivery and collection of examination materials to and from all centers, scoring, and invigilation (supervision of examinations in the centers). In explaining the delay in releasing the 2001 JCE and MSCE examination results, MANEB's Executive Director made reference to inadequate funding as a major cause of some of the problems facing examination administration (The Nation, 2002a). For instance, the scoring process was disrupted by persistent strikes by the scorers, who were demanding more money from MANEB (The Nation, 2002b). It is now being proposed that examinations should be administered much earlier during the school year to allow adequate time for processing the results.
The likely consequence of this proposal is that it will be difficult for schools to adequately cover the syllabi before the examinations are administered.

As a way of reducing delivery and collection costs, MANEB has introduced Examination Distribution Points, from which surrounding schools collect examination materials on a daily basis.

Examination Security Concerns

One of the major security concerns regarding examinations is leakage. In some centers, examination envelopes have been intentionally opened before the specified time and their contents exposed for the benefit of candidates. In extreme cases, the prematurely exposed examination papers have been duplicated and sold to candidates. Such a practice led to the cancellation of the 2000 examinations, and another set of examination papers had to be developed and distributed. In an attempt to deal with this problem, MANEB established Examination Distribution Centers for the storage of examination materials, which are guarded by police officers.

Another area of concern is cheating. Cheating takes many forms, including impersonation; the giving of extra time; substitution, where a candidate's script is replaced by one prepared by a more competent person; referring to books; copying from one another; copying from a common source; and teachers dictating answers to the class. As a way of curbing cheating, MANEB carries out spot checks during examinations, but these are done to a limited extent due to shortages of personnel, vehicles, and finances. MANEB also provides civic education to the general public about the dangers of examination malpractice, since in some cases cheating involves the general public. Sometimes MANEB applies sanctions such as nullification of results, deregistration of examination centers, withholding of results, and prosecution of the culprits if examination regulations have been infringed. In addition, MANEB uses an external invigilation system whereby a teacher from a different school invigilates examinations; the head teacher of the school is the overall supervisor of examinations at the school. MANEB also prints examination papers outside the country to curb the possibility of leakage originating from MANEB offices.

Dealing with Public and Political Pressure

MANEB works under considerable pressure because of the high-stakes nature of its examinations.
There are many groups that directly influence the way in which MANEB operates. For example, the Ministry of Education, which directs all MANEB's activities, requires timely release of examination results so that the school calendar is not disturbed. For MANEB to administer all examinations and process the results within a single school year, some exams must be administered well before the end of the school year to allow time for processing the results. Consequently, the examinations are likely to test material that has not yet been covered in classes. This causes anxiety for both the examinees and their teachers.

Another significant problem faced by MANEB is cash flow. MANEB's cooperating partners, such as invigilators, supervisors, and scorers, want to be paid promptly for the work they do. For some time MANEB has not been able to make prompt payments due to the unavailability of funds. This has soured the relationship between MANEB and its partners, who sometimes wait for up to two years before they are paid.

Another problem MANEB must deal with is score challenges. When examinees and their guardians do not agree with the examination results, they request a re-scoring of the exam. Given that the majority of MANEB exams involve constructed-response items that are scored subjectively, score challenges and re-scoring of exams are time-consuming and expensive.

Like many educational testing programs in the United States, MANEB is also a target for criticism in the popular press. Quite often, the press reports on the tension between MANEB and its cooperating partners, and it often highlights the negative aspects. For example, commenting on the delay in releasing the 2001 examination results, The Nation newspaper reported:

    MANEB and its parent ministry should take responsibility for the inconvenience that has been created and take necessary remedial action. Is it really impossible to conduct incident-free examinations whose results are released in good time? We believe it is possible and MANEB can only justify its existence by doing no less. (The Nation, 2002a)

In 1999, negative coverage of examination results in the press prompted the State President to institute a commission of inquiry into the causes of poor MSCE results (Malunga et al., 2000). The opposition parties took advantage of the poor results to criticize government education policies.

Measurement Issues

Curricular Validity and Teaching to the Test

The examinations in Malawi are so important that they have assumed a "gate-keeping" role in the system. Because of this importance, the examinations exert considerable influence on what goes on in schools. Although the curriculum has generally incorporated the cognitive, psychomotor, and affective domains, the examinations focus mainly on the cognitive domain. With so much emphasis on passing examinations, it is not surprising that instruction has become examination oriented. Thus, the curricular validity of Malawian exams is a contentious issue.

MANEB is aware of the demands of the curriculum but is unable to meet them because of inadequate resources. In some subjects the number of examination papers was reduced, and practical work in some subjects was scaled down to cut costs.
For example, assessment by the project method (Note 1) had to be discontinued. This resulted in a mismatch between the examinations and course objectives, because only selected parts of the curriculum are assessed, and therefore taught.

When projects were removed from assessment to cut the costs of project inspection and scoring, schools no longer felt the need to teach by the project method, even though it remained an important part of the national curriculum. If teachers are to cover the whole curriculum, then examinations must cover the curriculum. By not covering some parts of the curriculum, the examinations limit the scope of instruction.

Scoring Free-Response Items

Most MANEB exams use free-response items. Because of the large numbers of candidates, the items are scored only once, with 10% re-scored by the Chief Examiners who supervise the scoring exercise. This raises the problem of examination reliability. For each of the three major exams, over 800,000 student papers are scored. Scoring each exam takes about four weeks and involves considerable human and monetary resources.

The move by MANEB towards objective assessment using multiple-choice examinations was met with strong resistance from the general public, who felt that multiple-choice examinations would dilute the quality of education. As a way of improving the reliability of the examination scores, MANEB has put in place a number of measures, such as the training of scorers, a pre-scoring exercise, standardization of scoring, script checking, and data entry verification. All these measures are meant to ensure that no errors are made during the scoring of scripts and the processing of examination results. However, even with this rigorous error-searching process, some errors go undetected and are discovered only at the re-scoring stage, and only if such a request is made. Candidates' requests for re-scoring are attended to only on payment of a re-scoring fee.

High-Stakes Educational Tests in Malawi and the U.S.: Similarities and Differences

The preceding sections outlined the major issues confronted by measurement professionals in Malawi. Interestingly, most of these issues are policy-oriented or deal with the practical problems involved in test administration. Many of these issues are also confronted by measurement professionals in the U.S., but U.S. psychometricians appear to be more focused on technical issues, particularly those related to the reliability and validity of test scores. In this section, we discuss the similarities and differences between testing in Malawi and the U.S. with respect to both psychometric and educational policy issues.

Testing and Educational Reform: An Important Area of Commonality

It is interesting to note that educational reform movements in both Malawi and the U.S. use standardized tests as the primary mechanism for accountability and certification goals. Almost all states within the U.S. have a state-mandated assessment system (Linn, 2000), which is used to evaluate school districts, schools, teachers, and students. In many states, such as Massachusetts, state-mandated tests are also used (a) to encourage teachers to align their instruction with state curriculum frameworks and (b) for certification functions such as granting high school diplomas.

The Malawi national examination system has also been at the heart of its educational reform movement. For many schools where instructional resources are scarce or nonexistent, the syllabi associated with MANEB tests represent significant instructional resources for teachers. However, it is interesting to note that the reform movement in Malawi is a national movement, instituted by national laws, and the tests are developed and administered by a national testing agency. This situation is quite different from that in the U.S., where efforts to create nationally mandated tests continually fail. States want the authority to decide what is taught and what is tested, and so even efforts to institute the Voluntary National Test have been met with resistance. Only tests associated with the National Assessment of Educational Progress (NAEP) have been accepted by the states, perhaps because no student-level data are reported, the effect of these tests on state curricula is minimal, and there are no stakes at all for the students who take them. Thus, although both countries use tests as the primary data source in their educational accountability and certification systems, the difference between "local" and national control is striking.

It is also interesting to note that "teaching to the test" is seen as a significant issue in both countries. In both Malawi and the U.S., there are critics who see mandated testing as a weakening of the curriculum, while others praise this practice as an effective means of improving instruction. It appears that the use of high-stakes tests to improve classroom instruction has supporters and detractors on both continents.

High Stakes Versus Really High Stakes

High-stakes testing receives a great deal of attention in the popular press and educational policy journals within the U.S. The two most common issues are the appropriateness of admissions tests for making postsecondary admissions decisions and the appropriateness of using standardized tests for awarding high school diplomas. Relatively poor performance on the SAT or ACT can certainly inhibit a student's chances of getting accepted into a postsecondary institution, particularly the institution of her or his choice. Also, not receiving a high school diploma due to failing an exit exam has serious, negative consequences for students in the U.S. But the stakes associated with the PSLCE and the MSCE in Malawi are much higher.
With respect to postsecondary admissions in the U.S., the community college system is available to students who cannot get into a four-year college, and most of these schools have open enrollment policies that do not require admissions test scores. More importantly, there is not a huge discrepancy between the number of seats available for postsecondary education and the number of students who seek it. With respect to high school graduation and postsecondary admission, the U.S. offers a multitude of well-paying jobs that do not require high school or college degrees. Furthermore, second-chance programs such as the Tests of General Educational Development and those found in adult basic education provide opportunities for adults who did not complete high school to earn a high school diploma later in life and continue their education.

The situation in Malawi is very different. Students who do not pass the PSLCE do not even make it into secondary school. Even for those who do pass, there are limited spaces in the national secondary schools and CDSSs. Last year, only about 5% of primary school students were placed into the coveted national secondary schools, and only 20% more were placed into the lower-quality CDSSs. Most of the other 75% of students will never have the opportunity to pass the JCE and MSCE and be able to compete for the best jobs in Malawi. For many Malawians, passing the JCE and the MSCE makes the difference between a life of self-sufficiency and a life of poverty. Passing the MSCE makes numerous career options possible that cannot be attained through other routes. For example, the national government requires an MSCE certificate for civil service employment. Therefore, "high-stakes testing" has a more pronounced meaning in Malawi. The national educational tests are the sole criteria for academic certification, and the stakes associated with the tests include starvation versus prosperity.

Equity Issues in Assessment

In the U.S., the equity issues associated with educational testing most commonly involve ensuring or evaluating test fairness with respect to (a) racial, ethnic, or linguistic minority groups; (b) females and males; and (c) individuals with disabilities. In Malawi, only sex differences in test performance receive significant attention from researchers, politicians, and the popular press.

At first, an outsider may think equity issues associated with ethnicity are not relevant to Malawi because all citizens are African. However, although the ethnic composition of the country is much more homogeneous than that of the U.S., there are still significant differences with respect to tribal origins, religion, and language. Test bias with respect to these groups has not been extensively studied, however. Systematic study of differences across linguistic groups also has not been conducted, which is unfortunate since most Malawians primarily speak Chichewa, even though English is the country's official language (two other languages, Tumbuka and Yao, are also the native tongues of hundreds of thousands of Malawians).

The issue of accommodating tests for individuals with disabilities has received much less attention in Malawi than in the U.S. There is no acknowledgement of students with learning disabilities in the educational system, and so granting extended time on tests to such students, which is common in the U.S., is not even on the radar screen. However, MANEB does make Braille tests available to students with visual disabilities and provides 1/6 additional time for such students to take the tests.

In the U.S., equity issues are at the forefront of educational assessment policy debates.
When achievement differences are found across racial/ethnic groups on educational tests, researchers, lawmakers, and policy analysts are often divided about what should be done. Claims of test bias against minority groups have led to the abandonment of some educational tests, but affirmative action practices (e.g., using different standards for selecting minority and non-minority candidates) have not stood up to legal scrutiny (Green & Sireci, 1999; Sireci & Green, 2000). These policy issues have led psychometricians and other researchers to focus much of their research on issues of adverse impact and test bias. For example, studies of differential predictive validity and differential item functioning are common in the U.S., but are practically non-existent in Malawi.

An interesting difference between Malawi and the U.S. with respect to equity in assessment is the way they handle sex differences on educational tests. In the U.S., within-group norming (Note 2) practices have been outlawed for organizations that receive federal funds, which include virtually all accredited educational institutions and all governmental agencies. Thus, adjusting for performance differences across males and females is not conducted. In Malawi, performance differences between females and males on educational tests are more pronounced. Furthermore, the proportions of females at secondary and postsecondary schools are well below those of males. Thus, colleges and universities struggle to admit qualified females. To address educational opportunity differences across the sexes, Malawi secondary schools, colleges, and universities rank females and males separately, so that the highest-ranking females will be accepted over males who may have scored higher on a test. Thus, given the same equity issue, the two countries made completely opposite policy decisions. This difference stems not so much from philosophical differences in assessment or admissions equity as from differences in the numbers of women remaining in school after the primary grades.

Use of Item Formats

As described above, MANEB exams use predominantly constructed-response items. Multiple-choice items are used on some exams, but the public perception is that such items dumb down the curriculum and are not effective for measuring important academic knowledge and skills. These criticisms have also been raised in the U.S., but the psychometric community has worked hard to educate the public about the benefits of multiple-choice items (e.g., increased score reliability and content coverage, measurement of higher-level skills, reduced scoring costs, and reduced testing time) as well as the limitations of constructed-response items (Note 3) (lack of content coverage, task specificity, reduced reliability, higher scoring costs). In the U.S., the majority of educational tests use either only multiple-choice items or a combination of multiple-choice and constructed-response items. These practices reflect a desire to ensure adequate levels of score reliability and content validity while keeping down scoring costs.
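The reliability rationale for using many short items rather than a few long tasks can be made concrete with the Spearman-Brown prophecy formula, which predicts how score reliability changes when a test is lengthened or shortened with comparable items. The numbers below are invented for illustration and do not describe MANEB or any U.S. program:

```python
def spearman_brown(reliability: float, k: float) -> float:
    """Predicted score reliability when a test is lengthened by a
    factor of k with comparable items (Spearman-Brown prophecy formula)."""
    return (k * reliability) / (1 + (k - 1) * reliability)


# Illustrative only: a short section with reliability 0.60, tripled in
# length, is predicted to reach about 0.82; halving it (k = 0.5), as
# replacing many items with one extended task effectively does, drops
# predicted reliability to about 0.43.
print(round(spearman_brown(0.60, 3), 2))    # -> 0.82
print(round(spearman_brown(0.60, 0.5), 2))  # -> 0.43
```

This is the trade-off the text describes: constructed-response forms buy construct representation at the cost of effective test length, and with it reliability.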
In Malawi, construct representation is emphasized at the expense of score reliability, testing time, score reporting time, and scoring costs.

Computer-Based Testing

Another striking difference between educational assessment in Malawi and the U.S. is the amount of attention paid to computer-based testing (CBT). In the U.S., almost all testing programs are moving towards computerized administration of their tests or are considering the use of computers in improving their assessment systems (Zenisky & Sireci, 2002). Conferences within the educational measurement community feature programs dominated by CBT issues such as computerized-adaptive testing, innovative item types, and automated scoring of constructed-response items. These topics are not receiving considerable attention in Malawi, primarily due to the lack of computer resources within the country.

Measurement Community

Another huge difference between the U.S. and Malawi is the presence of a significant educational measurement community. In the U.S., there are thousands of measurement professionals who meet and interact regularly. For example, there are approximately 3,000 members of the National Council on Measurement in Education (NCME) and even more members of the Measurement, Evaluation, and Statistics Division of the American Educational Research Association (AERA). The Psychometric Society and the measurement and statistics division of the American Psychological Association (APA) also provide national forums for measurement professionals.

In Malawi, the measurement community is much younger and much smaller. For example, there is no Malawi equivalent of the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 1999). However, in 2001, a grant from USAID to the University of Massachusetts Amherst (UMASS) established a program to build educational measurement expertise within Malawi. Currently, nine measurement professionals from Malawi are receiving doctoral or master's degrees in psychometrics from UMASS, and an educational measurement program is being reinforced at the University of Malawi. This development should bring measurement practices and research areas across the two countries closer together in the future.

Technical Versus Practical Measurement Issues

In addition to differences in the attention paid to differential predictive validity and differential item functioning, there are also significant differences between Malawi and the U.S. with respect to the issues that receive the most attention as part of the normal operating procedures of testing agencies. For example, in the U.S., procedures for scaling educational tests are widely researched, and item response theory (IRT) is a common procedure for scaling educational tests. Furthermore, tests administered in different years are typically equated onto a common scale to ensure differences in test difficulty are taken into account when monitoring student progress and awarding credentials. In Malawi, IRT is not used at all, and tests are not equated across years. Instead, different test forms are assumed to be equivalent in content and difficulty.
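The kind of equating taken for granted in the U.S. can be quite simple in its linear form. The sketch below shows mean-sigma linear equating, under the assumption that the reference and new forms share a common set of anchor items; the function and all numbers are our own illustration, not an operational procedure of MANEB or any U.S. program:

```python
from statistics import mean, pstdev


def linear_equate(new_scores, ref_anchor, new_anchor):
    """Map scores on a new test form onto the reference form's scale
    using mean-sigma linear equating on common anchor-item scores.

    Each score x is transformed via:
        y = mu_ref + (sigma_ref / sigma_new) * (x - mu_new)
    """
    mu_r, sd_r = mean(ref_anchor), pstdev(ref_anchor)
    mu_n, sd_n = mean(new_anchor), pstdev(new_anchor)
    slope = sd_r / sd_n
    return [mu_r + slope * (x - mu_n) for x in new_scores]


# Illustrative: the new form's anchor ran 2 points harder on average,
# so equated scores are shifted up onto the reference scale.
ref_anchor = [10, 12, 14, 16, 18]
new_anchor = [8, 10, 12, 14, 16]
print(linear_equate([12], ref_anchor, new_anchor))  # -> [14.0]
```

Without a step like this, a harder form in one year simply produces lower pass rates, which is exactly the risk the text attributes to assuming forms are equivalent.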
Another significant difference is in the procedures used to set standards on educational tests. In the U.S., standard setting is one of the busiest areas of research, and new methods appear continuously (e.g., Cizek, 2000). In Malawi, standard setting is conducted in a less systematic fashion, drawing from subjective estimates of test difficulty and student cohort differences.

Due to the higher stakes, more limited resources, and limited technical expertise, the measurement issues that get the most attention in Malawi are more logistical. Reducing cheating is a significant issue, since it is widespread and represents a significant threat to the validity of exam scores. Developing, administering, and scoring the exams essentially exhausts the personnel and financial resources of MANEB, and so there is little time and few resources to conduct research on test validation.

Conclusions: Testing Collegiality Around the World

This paper illustrates how different countries deal with common measurement issues, as well as those that are unique to their own situation. Many of the practical problems in measurement are universal, and so much can be learned from what other countries are doing. For example, the U.S. can learn from Malawi about successful implementation of large-scale performance assessment and about alternative strategies for achieving equity in test-based admissions decisions. Malawi can learn technical measurement solutions to problems such as scaling, equating, standard setting, and item and test bias research.

By building international collegiality within the measurement community, we will be better positioned to help each other tackle our significant measurement problems. For example, there is much that could be done in both countries to build computerized systems for test delivery that could reduce cheating and test administration costs. Also, measurement programs in the U.S. could do more to reach out and train professionals in Malawi. This expertise could then be extended to other countries in Africa through the measurement program at the University of Malawi and through similar programs that could be developed in other countries. Quality educational systems need quality assessments. Through the process of building measurement expertise in developing countries such as Malawi, we can help these countries improve their educational systems.

Notes

This research was entirely collaborative and the order of the authors is alphabetical.

Correspondence concerning this article should be addressed to Stephen G. Sireci, Center for Educational Assessment, School of Education, University of Massachusetts, Amherst, MA 01003-4140. E-mail correspondence may be sent to

1. Assessment by project refers to embedded assessments where students complete hands-on projects throughout the school year that were graded by MANEB. One assessment that was cancelled was an agricultural project where MANEB officials visited farm sites and evaluated students' agricultural projects.

2. In within-group norming, candidates within a group (say, male or female) are rank-ordered with respect to everyone else in the group. Then, the ranks of the candidates are treated as if they were interchangeable.
For example, the highest-ranking female would be considered equivalent to the highest-ranking male, even if their scores on a test were very different.

3. See Dunbar, Koretz, and Hoover (1991), Linn and Burton (1994), and Wainer and Thissen (1998) for empirical studies of the advantages and disadvantages of these different item formats.

References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Chimwenje, C. (1995). Secondary school curriculum review in Malawi: Implications for assessment. Paper presented at the 13th annual conference of the Association for Educational Assessment in Africa, Pretoria, South Africa.

Chimwenje, C., & Khembo, D. J. (1994). The role of examinations in educational change: Malawi's experience with examination reform. Paper presented at the 12th annual conference of the Association for Educational Assessment in Africa, Accra, Ghana.

Cizek, G. J. (Ed.). (2000). Standard setting: Concepts, methods, and perspectives. Mahwah, NJ: Lawrence Erlbaum.

Dunbar, S. B., Koretz, D. M., & Hoover, H. D. (1991). Quality control in the development and use of performance assessments. Applied Measurement in Education, 4, 289-303.

Green, P. C., & Sireci, S. G. (1999). Legal and psychometric issues in testing students with disabilities. Journal of Special Education Leadership, 12(2), 21-29.

Kuthemba Mwale, J. B., Hauya, R., & Tizifa, J. (1996). Secondary school discipline study. Zomba: Centre for Educational Research and Training.

Linn, R. L. (2000). Assessments and accountability. Educational Researcher, 29(2), 4-16.

Linn, R. L., & Burton, E. (1994). Performance-based assessment: Implications of task specificity. Educational Measurement: Issues and Practice, 13(1), 5-8, 15.

Malunga, L. B., et al. (2000). Presidential Commission of Inquiry into the Malawi School Certificate of Education (MSCE) examination results. Lilongwe.

Ministry of Education. (1998). Education statistics. Lilongwe.

Ministry of Education. (1999). Education statistics. Lilongwe.

Sireci, S. G., & Green, P. C. (2000). Legal and psychometric criteria for evaluating teacher certification tests. Educational Measurement: Issues and Practice, 19(1), 22-31, 34.

The Nation. (2002a, February 13). Delay in exam results create more problems.

The Nation. (2002b, March 1). Release of exam results still cloudy.

Zenisky, A. L., & Sireci, S. G. (2002). Technological innovations in large-scale assessment. Applied Measurement in Education, 15, 337-362.

About the Authors

Elias W. J. Chakwera is Deputy Principal of Domasi College of Education in Malawi, where he teaches courses in testing, measurement, and evaluation. He is also a doctoral student at the University of Massachusetts Amherst.
His areas of expertise and interest include evaluation of educational assessments, teacher education and development, and training teachers through distance education. His recent research activities include studies on content validity, test score generalizability, consequential validity, and teacher upgrading through distance education in Malawi.

Dafter J. Khembo is currently a doctoral student in testing and measurement at the University of Massachusetts Amherst. He received his B.Ed. from the University of Malawi in 1986 and his MA (Education) from the University of London Institute of Education in 1991. He is presently an employee of the Malawi National Examinations Board, where he works as a Research & Test Development Officer. His areas of interest include standard setting, differential item functioning, and test score equating.

Stephen G. Sireci is Associate Professor of Education and Co-Director of the Center for Educational Assessment at the University of Massachusetts Amherst, USA. His areas of expertise include educational test development and the evaluation of educational assessments. His most recent research activities
include evaluating content validity, test bias, differential item functioning, and the comparability of different language versions of tests and questionnaires. His vita can be accessed at http://www-unix.oit.umass.edu/~sireci.

Editor: Gene V Glass, Arizona State University. Production Assistant: Chris Murrell, Arizona State University. Commentary Editor: Casey D. Cobb.
EPAA is published by the Education Policy Studies Laboratory, Arizona State University.

