Education Policy Analysis Archives
Volume 7 Number 23, August 1, 1999. ISSN 1068-2341

A peer-reviewed scholarly electronic journal. Editor: Gene V Glass, College of Education, Arizona State University. Copyright 1999, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Academic Program Approval and Review Practices in the United States and Selected Foreign Countries

Don G. Creamer
Steven M. Janosik
Virginia Polytechnic Institute and State University

Abstract

This report outlines general and specific processes for both program approval and program review practices found in 50 states and eight foreign countries and regions. Models that depict these procedures are defined, and the strengths and weaknesses of each are discussed. Alternatives to current practice by state agencies in the U.S. are described that might provide for greater decentralization of these practices while maintaining institutional accountability.

Introduction

Responding to multiple challenges in the governance and coordination of higher education, state agencies increasingly are examining their structures for carrying out their mandates. Writing for the Education Commission of the States (ECS), McGuinness (1997) predicted that changes would be necessary to correct some structures that were designed for earlier times. Challenges such as the integration of technology into delivery systems for higher education, market pressures, instability in state government leadership, and growing political involvement in state coordination and governance are among the most compelling forces that make these changes likely.
One of the most common responsibilities of state coordinating or governing agencies for higher education is academic program approval and program review. Program approval refers to the process for approval of new academic programs by state higher education agencies or boards and generally is done to curb unnecessary duplication of programs among public institutions. Program review refers to the process of critique of existing academic programs and generally is seen as a strategy for quality and productivity improvements. According to Barak (1991), 45 state agencies undertake some form of program approval, and 34 state agencies review at least some existing programs (some reviews are conducted at the state level).

Concerns for quality and for accountability in higher education are increasing in state governments and agencies, putting pressure on traditional statewide coordination functions of these agencies such as program approval and review, budgeting, planning, and monitoring quality. Increasingly, state agencies are seeking ways to decentralize certain functions while, at the same time, increasing accountability. McGuinness (1997) predicted that states would turn to two quality assurance mechanisms to accommodate these trends. First, he saw a shift in reliance from regulatory controls to incentives to ensure public interests. Second, he surmised more coordinated tactics among state and federal governments, accrediting agencies, institutional governing boards, and disciplinary and professional organizations.

This study was undertaken at the request of the State Council of Higher Education for Virginia (SCHEV), which sought policy alternatives to its current academic program approval and review practices to enable a more decentralized approach to the process in an environment of greater accountability.
The study sought to provide these policy alternatives by first collecting baseline information about current academic program approval and review practices in all 50 states and in selected foreign countries and, second, by formulating policy alternatives based upon a synthesis of these findings and upon reasoned judgment about such practices in higher education.

Methodology

All 50 states in the U.S. and eight foreign countries and regions, including Australia, Canada, England, Germany (Lower Saxony), Hong Kong, the Netherlands, New Zealand, and Scotland, were selected for study. Data for this study were obtained in two forms: (a) documents obtained from web sites or direct mail and (b) interview results with academic officers of state agencies (no interviews were conducted with agency representatives from foreign countries or regions). Documents were analyzed to illuminate their academic program approval and review policies and practices. Semi-structured interviews with agency academic officers were conducted to determine perceived strengths and weaknesses of current practices and future plans for change in these procedures.

Useful data in one or both forms were obtained from all foreign countries and regions and from 46 states. Information from these sources varied in content. Some written policies were explicit and detailed; others were vague and confined in scope. Likewise, interview results varied from expansive and illuminating to narrow and sketchy. Most data were instructive, however, clearly revealing current practice. Inquiries about future plans for change were met with limited success. Either the officers did not know what changes might occur in their agencies or they were cautious in their comments for a variety of reasons.

Information from these sources was analyzed for patterns in responses. These synthesized patterns of practice were used to report the findings from the study.
Discovered patterns were more normative among program approval practices than among program review practices; therefore, variations in review practices also were synthesized from the data and reported as substratum patterns.

Findings

Findings are presented in four parts: (a) generalized findings about program approval processes, (b) generalized findings about program review processes, (c) summary of program approval and review practices, and (d) generalized program approval and review findings from foreign countries and regions. Generalized findings about program approval and program review are followed next by steps in the processes and, finally, by strengths and weaknesses of the generalized model presented. Distinctive features of the practices from foreign countries and regions follow their generalized findings.

Program Approval Practices in the States

Practices regarding state agency program approval can be summarized and displayed in a generalized model. This model is depicted in Figure 1 and shows widely accepted practices at the institution and at the state agency level, where multiple decisions and actions are possible.
Figure 1. Typical State Approval Process

Notes:
1. Early screening is used by some states to save time and resources. Good proposals are helped; poor proposals are discouraged.
2. The process used by state agencies may include internal review by staff, external reviewers, or peer reviews. Criteria include need, demand, duplication, cost, ability to deliver, etc.
3. Some states will aid institutions to revise their proposals even after a program has been disapproved.
4. Review at this stage comes as part of a conditional approval or as part of the criteria for full approval. It may involve rigorous review and include accreditation agencies, outside consultants, or agency staff.

When proposals are disapproved at the state agency level, some agencies may help institutions improve their proposals and encourage them to resubmit. When proposals are approved at the state agency level, some agencies schedule a subsequent review as part of the approval process, while others grant automatic continuation unless a review is triggered by productivity concerns.

Steps in Program Approval by State Agencies
Program approval procedures by state agencies generally followed these steps:

1. Institution determines the feasibility of its intent to plan a new program.
2. Institution notifies the state agency of intent.
3. Institution prepares a draft proposal containing a brief statement identifying the program and addressing the following issues:
   - Relation to institutional mission, strategic plan, goals, and objectives;
   - Projected source of resources (reallocation, external funds, request for new dollars);
   - Student need;
   - Relationship to other programs in the system and region.
4. The state agency distributes the proposal to other affected institutions to elicit comments and recommendations.
5. State agency staff comments and makes recommendations on the draft proposal.
6. Institution submits the full proposal addressing some or all of the following issues:
   - Centrality to institutional mission and planning
   - Need for the proposed program
     - Societal need
     - Occupational need
     - Student availability and demand (enrollment level)
   - Reasonableness of program duplication, if any (not including general education programs)
   - Adequacy of curriculum design and related learning outcomes
   - Adequacy of resources to support the program
     - Adequacy of finances
     - Faculty resources
     - Library resources
     - Student affairs services
     - Physical facilities and instructional equipment
   - Adequacy of program administration
   - Adequacy of the plan for evaluation and assessment of the program
   - Diversity plan for increasing the number of students from underrepresented populations
   - Accreditation (Is there a recognized accreditation agency for the program? Will accreditation be pursued?)
   - Use of technology
7. The full proposal is reviewed by one or a combination of appropriate governance bodies, external consultants, and/or program review committees consisting of representatives of the proposing unit, state agency staff, and/or external experts in the area.
8. State agency takes action to:
   - Approve (provisional approval or full approval)
   - Disapprove
   - Defer
9. If provisionally approved, the institution will address the issues raised by the state agency. The state agency reviews the program after a relatively short period (e.g., one year).
10. If fully approved, the institution will develop and implement the program.
11. If disapproved, the institution may have the right to appeal.
12. After the graduation of the first class of the new program, the program may receive an in-depth comprehensive review.
13. Change to current program status.

Summary of Strengths and Weaknesses of Program Approval Processes by State Agencies

Certain strengths and weaknesses in current practice were identified as follows:

Strengths of Current Practice
- Tends to improve the quality of the academic program
- Increases interinstitutional communication and collaboration
- Incorporates future assessment criteria and accountability measures
- Ensures demand and need
- Reduces duplication
- Conserves resources
- Stresses application of state planning priorities

Weaknesses of Current Practice
- Reduces autonomy of the institutions
- Can delay the initiation of needed academic programs
- Decision making may be politicized or arbitrary
- Staffing requirements may be excessive

Generalized Patterns of Program Review Practices in States

Not all states conduct program reviews. Where they are conducted, as occurs in a majority of states, they follow differentiated or even idiosyncratic patterns. Though practiced in a variety of approaches, program review procedures in state agencies can be normatively represented in a conceptual scheme. This arrangement is depicted in Figure 2.
Figure 2. State Agency Academic Program Review Process Model

Notes:
1. Programs are selected for review on a cyclical or "triggered" basis. Cyclical patterns are based on varying recurring time frames. "Triggered" reviews occur in response to results from productivity measures or interest on the part of the state legislature.
2. As programs are selected, some states use a peer review process as a precursor to the full review process. This process helps in the data collection phase.
3. Program reviews take a variety of forms. They may be done in conjunction with self-studies or accreditation visits. The institution, state agency, or external consultants may conduct them.
4. May be formal or informal. Programs that are approved conditionally are usually given a specific period of time to correct shortcomings. The programs are monitored, and additional reviews may be conducted to determine the program's fate.
5. May lead to modification, consolidation, or elimination.

This conceptual scheme suggests three generalizations about academic program review processes. First, some external agent such as the state legislature or state agency selects programs for review. The review may be triggered by a concern such as productivity or mission-related matters. Second, institutions are requested to take certain actions such as conducting self-studies of program effectiveness. Third, state agencies take certain actions such as forming agency review committees or other structures, which may include internal or external consultants and/or representatives from accrediting agencies, to determine a program's approval status. Reviews, where conducted, often are focused on disciplines (or discipline clusters) or on broad categories such as degree-level programs.

Academic program review practices can be placed in one of three general approaches:

Independent Institutional Review. In this approach, the state agency delegates the authority to conduct program reviews to the institution. The state agency does not exercise any supervision or audit of the processes (e.g., Michigan, Minnesota, Nevada, and New Jersey).

Interdependent Institutional Review. In this approach, the institution conducts the program review on a regular basis but does so under the guidance and audit of the state agency. The institution determines the review processes and criteria to be used consistent with the context and characteristics of the institution. The institution submits its program review report to the state agency according to an annual or cyclical state-determined plan. Program review reports conducted in this manner often include:
- Descriptive program information,
- Year of last program review,
- Documentation of continuing need,
- Assessment information related to expected student learning outcomes and the achievement of the program's objectives,
- Plans to improve the quality and productivity of the program, and
- Program productivity indicators.

Based on the information that the institution provides, the state agency will make recommendations to modify, consolidate, or eliminate the program(s) (e.g., Hawaii, Kansas, and Montana).

State-Mandated Review. In this approach, the state agency determines the procedures and criteria of the program review and conducts or commissions the review of the selected programs within the state system.
The state agency staff will participate in the review process. Systemwide (lateral) program review of similar programs within the state may be carried out at the same time, as can be seen in Illinois. The state agency also may conduct post-audit reviews of new programs following the graduation of the first class using pre-determined criteria (e.g., Georgia and North Dakota).

Variations on these program review approaches include the use of productivity reviews (normally triggered by evidence of below-standard efficacy) and cyclical reviews. When productivity reviews are incorporated into the process, productivity indicators (such as credit hours, course enrollments, number of majors, number of degrees awarded, cost, and related information) are examined annually as reported by the institution. The state agency identifies low-productivity and/or duplicative programs and takes action based on its determinations (e.g., Virginia and New Hampshire). Sometimes when reviews are triggered in this manner, the state agency reviews all similar programs in the state (e.g., Montana). When cyclical reviews are conducted, all programs are examined on some pre-determined schedule, such as once every 3, 5, 7, or 10 years (e.g., South Carolina and Illinois).

External consultants may be used as a complement to any of the generalized approaches to program review. External consultants form an advisory committee to participate directly in the program review process. On-site visitations may be performed.
Most states require the use of external consultants. External consultants may be selected from several groups of experts:

1. External evaluators: Qualified professionals selected from in-state or out-of-state to provide objectivity and expertise.
2. Representatives from peer institutions with similar programs: Selected from similar institutions with similar programs to permit informed exchange and to establish comparable standards (e.g., Georgia and Wisconsin).
3. Accreditation agencies: Representatives from specialized and regional accreditation bodies recognized by the state agency may be used in the reviews (e.g., Montana and Georgia).
4. Representatives from state agencies of elementary and secondary education: Selected to achieve better linkage among the different educational levels.
5. Local lay people and other interested parties: Selected to address societal and occupational needs.

The consultants and/or representatives comment upon the quality of the program, resources available to the program, outcomes of the program, program costs, and other factors. An external review report is provided on the findings, and each institution may have the opportunity to review the report and make comments. The final report and comments of the institution are reviewed by the state agency, where further action may be taken.

The generalized academic program review approaches may occur in combination with one another and may be combined with the use of external consultants. Some of these combinations may be described as follows:

Example 1 features interdependent and state-mandated reviews with the use of external consultants (e.g., Arizona, Wisconsin, and Idaho).

Example 2 features interdependent review and the use of consultants (e.g., Washington and Georgia).

Example 3 features independent review and the use of external consultants (e.g., Michigan, Minnesota, Nevada, and New Jersey).
Example 4 features state-mandated review characterized by productivity review approaches or cyclical state-mandated reviews in combination with the use of external consultants (e.g., Virginia and West Virginia).

Example 5 features independent review under state agency guidelines (e.g., New Hampshire).

Summary of Strengths and Weaknesses of Program Review Processes by States

Strengths and weaknesses of program review practices may vary according to the model or approach chosen; however, they may be characterized generally as follows:

Strengths of Program Review Practices
- Provides an on-going quality assurance check
- Even when done on an irregular basis, the process serves as an incentive to ensure quality at the institutional level
- When outside reviewers are used, a greater measure of objectivity can be obtained

Weaknesses of Program Review Practices
- Institutions may focus on the review process and do little with the results
- Reviews are not done with great enough frequency to provide real quality control
- Process is time consuming
- Process is expensive

Summary of Program Approval and Review Practices by States

Program approval and program review can be seen as part of the integrated components of quality assurance practices within a state system of higher education. In this view, program approval is the initial and authorizing stage of program quality assurance, and program review is a continuation and revalidation of the approval process. The objectives of program approval and program review are the same: ensure mission compatibility, maintain academic standards, assure continuing improvement of academic programs, and guarantee accountability of academic programs. Issues in both program approval and program review also are the same: mission compatibility, need, program structure, availability of resources (financing, faculty and staff, facilities, technology, etc.), and quality assurance.

Program approval and program review processes can be both internal and external; that is, they can be carried out within the institutions themselves and/or by external agents. External agents may include the state agency, external consultants, peer institutions, accreditation agencies, and other interested parties.

Internal program approval and program review can best safeguard the institution's autonomy, integrate the processes with institutional self-improvement efforts, be more flexible, and boost the morale of the faculty and administrators of institutions. However, internal program approval tends not to provide sufficient stimulation and motivation for improvement.
External program approval and review procedures stand apart from internal program operating processes; they exercise outside monitoring, challenge existing program development notions, ensure maximum objectivity and expertise, and encourage the exchange of good practices. However, external review approaches may intrude on institutional autonomy and bring extra financial and reporting burdens to the institutions.

Distinctions between program approval and review practices for undergraduate and graduate programs cannot be clearly drawn from this study. Some states clearly are more concerned with one level of academic program than the other, but no systematic pattern in these concerns was evident from the data.

Program Approval and Review Practices by Foreign Countries and Regions

Program approval and review practices of Australia, Canada, England, Germany (Lower Saxony), Hong Kong, the Netherlands, New Zealand, and Scotland are summarized as a single practice. In these international practices, program approval and program review often are intertwined and together are called quality assurance.

Quality assurance approaches in international locations are similar to practices in the United States in many respects. Three general models are evident:

1. Self-regulating (regulation by the institution or provider of the educational program), as seen in Canada, where universities have the authority and responsibility for quality assurance.
2. Externally regulated (regulation by an external agency), as seen in Australia. The federal government of Australia plays a direct and intrusive role in educational policy.
3. A combination of the two (mixed or collaborative regulation), as seen in most of the countries and regions, such as England, Scotland, the Netherlands, Hong Kong, Germany (Lower Saxony), and New Zealand, though the degree of external control varies to a great extent. For example, in England and Scotland, quality assurance is more government-driven than in the Netherlands, where the institutions are delegated more autonomy. This approach features institutional self-evaluation and cyclical review conducted by a quality assurance agency.

Distinctive Features of Program Approval and Review Practices in Foreign Countries and Regions

- Institutional self-regulation (self-study) is combined with external quality assurance agency review or audit. The quality assurance agency ensures that the institutions implement their own quality assurance procedures effectively.
- The institution may either design its own quality assurance procedures or adopt a formal quality assurance policy determined by the quality assurance agency or by the government. Adopting the formal quality assurance policy helps to emphasize system priorities and ensures consistency and comprehensiveness of the comments and judgment of external reviewers across the system.
- External reviewers (assessors) play a very important role in ensuring objectivity and expertise, promoting the exchange of good practices, and responding to the needs of society. In some countries, external reviewers are drawn from foreign countries (e.g., Hong Kong and the Netherlands), from industry (e.g., the Netherlands), and from local lay people (e.g., Hong Kong). External reviewers may receive training from the quality assurance agency before visiting institutions under review (e.g., Scotland).
- In some countries, quality assurance initiatives are very extensive, including an assessment of institutional teaching and learning practices of all academic programs and an assessment of the research skills and training of junior academic staff (e.g., United Kingdom and Germany).
- Quality assurance results are scored (e.g., United Kingdom), ranked (e.g., the Netherlands), or published (e.g., United Kingdom) in some countries. Decision making, such as funding and program elimination, is based on these scores or ranks.
- On-site visits involve meetings with groups of faculty, students, administrative staff, and those responsible for running support services. Time is spent in direct observation of teaching and learning.
- To reduce the administrative burden, participants are encouraged to share proposals, databases, and trend analyses electronically (e.g., New Zealand).

Discussion

According to Barak (1998), more than half of the 50 states are considering deregulation or decentralization of program approval and review practices. This study confirmed a widespread interest in finding alternatives to current practices that still meet statutory or policy requirements.

Academic program approval and review procedures generally are conducted to address program quality and program productivity at the institutional level. Statewide concerns include access and capacity, quality, occupational supply and demand, and program costs and institutional productivity. Interest in decentralization or deregulation of academic program approval and review policies in a context of accountability was evident in state agencies, though most demonstrated this interest only in their future plans.

Evidence from this study suggested that overall program approval and review practice in 50 states and eight foreign countries and regions can be distilled into three conceptual models for practice:

State Regulatory Model. A centralized model for quality control characterized by development and application of centralized regulatory requirements for program approval and review by a state-level agency.

Collaboration Model. A consolidated model for institution and state agency cooperation characterized by jointly developed and administered program approval and review procedures by institution and state agency.

Accreditation Model. A decentralized, standards-based model characterized by the development and application of standards and guidelines for program approval and review and by cyclical audit by state and consulting agents from outside the institution.

The State Regulatory Model and Collaboration Model are derived from practices in the 50 states. The Accreditation Model primarily is used in foreign countries and regions, though aspects of accreditation are used in program review practices in this country. These models can be depicted along a continuum of state control as shown in Figure 3.

Figure 3. Relationship of Program Evaluation Models to State Agency Control

Analysis of information from this study was used to formulate two suggested alternative models for consideration by SCHEV: the Quality Assurance Audit Model (see Figure 4) and the Modified Collaboration Model (see Figures 5 and 6).

The Quality Assurance Audit Model is a decentralized model of program approval and review characterized by:
Figure 4. Quality Assurance Audit Model of Program Approval and Review
- Delegation of appropriate state agency authority to institutional governing boards,
- Development and application of institutional-level quality assurance policies and procedures (referring to policies and practices that include quality, duplication, and productivity issues), and
- Cyclical or triggered state-level audit of these policies and procedures.

Figure 5. Modified Institution/State Collaboration Model

The Modified Collaboration Model is a centralized model of program approval and review characterized by:
- Shared institution and state-level oversight authority,
- Institutional-level program approval by classification according to mission relatedness (within mission, related to mission, outside of mission) and the requirement for new resources, and
- Cyclical reviews by a state-level agency (for example, at five-year intervals) depending upon classification of initial approval.
Figure 6. Program Approval and Review Status within the Modified Institution/State Collaboration Model

The degree of centralization among the five models, those that represent current practice and those proposed as alternatives, can be depicted as shown in Figure 7 on the continuum of state agency control.

Both alternative models are attractive for different reasons relative to state interests in deregulation or decentralization of program approval and review practices. The Quality Assurance Audit Model places the agency in a policy/coordination role that enables the agency staff to provide broad oversight for the process of quality assurance. The state agency would be integrally involved in process development and management but would leave the implementation of the process to its respective institutions.
Figure 7. Level of State Agency Control in Five Program Evaluation Models

The most apparent disadvantage of the Quality Assurance Audit Model is that too much authority and control are delegated to the institutions (although this runs counter to stated interests in deregulation or decentralization). However, using a periodic system-wide audit of program offerings noting year-to-year changes might serve as an excellent way to monitor institutional activity. Self-study reports and accreditation visits, processes already in place in most public institutions, would provide additional information on institutional decision making in the area of program approval and review.

The Modified Collaboration Model is attractive because it stratifies the approval and review process based on two critical factors: mission and cost. The model prescribes that additional attention be given to programs that require supplementary resources and fall outside an institution's current mission, the areas of greatest risk to the institution and the state. At the same time, however, institutions building new mission-related programs by reallocating existing resources receive additional control and authority. The disadvantage in this process is that risk-taking and innovation may be reduced if institutions act to avoid the more rigorous reviews that come with programs that may fall outside their current mission or require new resources.

Implications

State agencies that wish to modify their current academic program approval and program review practices to accomplish goals of deregulation or decentralization in an environment of accountability may find policy alternatives suitable to accomplish the goal. Most current practices are reasonably well portrayed in either the State Regulatory Model or the Collaboration Model.
As currently practiced, however, neither of these models accomplishes the goals of deregulation or decentralization very well.

Two alternative models to current practice were developed as part of this study. These new models, when appropriately constructed on policies consistent with the applicable statutory requirements, can release state agencies from burdensome practices without relinquishing responsibility or diminishing accountability. Both the Quality Assurance Audit Model and the Modified Collaboration Model may serve this purpose, although clearly the Quality Assurance Audit Model moves the agencies further from current practice than does the Modified Collaboration Model.
State coordinating and governing boards across the country are struggling to find new, more effective ways of dealing with program approval and program review. This synthesis of current practice, along with the two alternative models suggested here, may prove helpful as these discussions continue.

Notes

1. Special thanks are due to Virginia Tech doctoral students Chunmei Zhao, Michael Perry, and Miya Simpson, who assisted in all phases of the research for this project.

2. A copy of the complete study report and a complete bibliography of materials used for the original study is available at: http://epaa.asu.edu/epaa/v7n23/v7n23.pdf

References

Barak, R. J. (1991). Program review and new program approval of state education boards (Report to State Higher Education Executive Officers). Denver, CO: SHEEO.

Barak, R. J. (September, 1998). Personal communication.

Creamer, D. G., Janosik, S. M., Zhao, C., Simpson, M., & Perry, M. (1998). Academic program approval and review practices in states and selected foreign countries (Special Study Report to the State Council of Higher Education for Virginia). Blacksburg, VA: Virginia Tech.

McGuinness, A. C., Jr. (1997). Essay: The functions and evolution of state coordination and governance in postsecondary education. In Education Commission of the States, 1997 state postsecondary education structures sourcebook: State coordinating and governing boards (pp. 1-48). Denver, CO: Education Commission of the States.

About the Authors

Don G. Creamer
email@example.com

Don G. Creamer is Professor and Coordinator of Higher Education and Student Affairs in the Department of Educational Leadership and Policy Studies at Virginia Polytechnic Institute and State University (Virginia Tech). He is Director of the Educational Policy Institute of Virginia Tech.

Steven M. Janosik

Steven M.
Janosik is Associate Professor and Senior Policy Analyst in the Department of Educational Leadership and Policy Studies at Virginia Polytechnic Institute and State University (Virginia Tech). He is Associate Director of the Educational Policy Institute of Virginia Tech and recently served as Deputy Secretary of Education for the Commonwealth of Virginia.