
Educational policy analysis archives


Material Information

Title:
Educational policy analysis archives
Physical Description:
Serial
Language:
English
Creator:
Arizona State University
University of South Florida
Publisher:
Arizona State University
University of South Florida.
Place of Publication:
Tempe, Ariz
Tampa, Fla
Publication Date:
August 27, 2000

Subjects

Subjects / Keywords:
Education -- Research -- Periodicals   ( lcsh )
Genre:
non-fiction   ( marcgt )
serial   ( sobekcm )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00188
usfldc handle - e11.188
System ID:
SFS0024511:00188




Full Text
MARC Record

Identifier (024): E11-00188
Title (245): Educational policy analysis archives. Vol. 8, no. 44 (August 27, 2000).
Publication (260): Tempe, Ariz. : Arizona State University ; Tampa, Fla. : University of South Florida, August 27, 2000.
Contents (505): Information needs in the 21st century : will ERIC be ready? / Lawrence M. Rudner.
Subject (650): Education -- Research -- Periodicals.
Corporate names (710): Arizona State University; University of South Florida.
Host item (773): Education Policy Analysis Archives (EPAA)
Electronic location (856): http://digital.lib.usf.edu/?e11.188

MODS Record

Host item ISSN: 1068-2341
Part: Volume 8, Issue 44; August 27, 2000 (date issued, ISO 8601: 2000-08-27)
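The records above are the library's MARC and MODS metadata for this issue, which are distributed in XML. As a minimal, hypothetical sketch of how such a MARCXML record can be read programmatically, the following Python uses only the standard library to pull out the 245 title statement and the 856 electronic location. The XML fragment is reconstructed from the fields shown above for illustration only; it is not the file actually served by the library.

```python
# Minimal sketch: extract a few fields from a MARCXML record.
# The XML below is reconstructed from the catalog record shown above;
# it is an illustrative fragment, not the actual file served by the library.
import xml.etree.ElementTree as ET

MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="245" ind1="0" ind2="0">
    <subfield code="a">Educational policy analysis archives.</subfield>
    <subfield code="n">Vol. 8, no. 44 (August 27, 2000).</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2="0">
    <subfield code="u">http://digital.lib.usf.edu/?e11.188</subfield>
  </datafield>
</record>"""

NS = {"marc": "http://www.loc.gov/MARC21/slim"}

def subfields(record, tag):
    """Return {code: value} for the first datafield with the given tag."""
    field = record.find(f"marc:datafield[@tag='{tag}']", NS)
    if field is None:
        return {}
    return {sf.get("code"): (sf.text or "").strip()
            for sf in field.findall("marc:subfield", NS)}

record = ET.fromstring(MARCXML)
title = subfields(record, "245")          # {'a': '...', 'n': '...'}
link = subfields(record, "856").get("u")  # electronic location

print(" ".join(title.values()))
print(link)
```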



Volume 8 Number 44    August 27, 2000    ISSN 1068-2341

A peer-reviewed scholarly electronic journal
Editor: Gene V Glass, College of Education, Arizona State University

Copyright 2000, the EDUCATION POLICY ANALYSIS ARCHIVES. Permission is hereby granted to copy any article if EPAA is credited and copies are not sold. Articles appearing in EPAA are abstracted in the Current Index to Journals in Education by the ERIC Clearinghouse on Assessment and Evaluation and are permanently archived in Resources in Education.

Information Needs in the 21st Century: Will ERIC Be Ready?

Lawrence M. Rudner
ERIC Clearinghouse on Assessment and Evaluation
University of Maryland, College Park

Abstract

Ubiquitous for 35 years, the Educational Resources Information Center (ERIC) is known for its database and, more recently, for its range of web-based information services. I contend that federal policy with regard to ERIC must change and that ERIC will need massive restructuring in order to continue to meet the information needs of the education community. Five arguments are presented and justified: 1) ERIC is the most widely known and used educational resource of the US Department of Education; 2) senior OERI and Department of Education officials have consistently undervalued, neglected, and underfunded the project; 3) ERIC's success is due largely to information analysis and dissemination activities beyond ERIC's contracted scope; 4) information needs have changed dramatically in the past few years and ERIC cannot keep up with the demands given its current resources; and 5) the ERIC database itself needs to be examined and probably redesigned.

Introduction

The Educational Resources Information Center (ERIC) has been the most visible source for education information since its inception in 1966. As a system of 16 clearinghouses and 3 support contractors, ERIC collects, abstracts, and indexes education materials for the ERIC database; responds to requests for information in subject-specific areas; and produces special print and electronic publications on current research, programs, and practices. As we enter the 21st century and the Information Age, the question to ask is: "Will ERIC be ready?" Taking a hard look at what ERIC has been and what ERIC is today relative to user information needs, I conclude that ERIC will need massive restructuring in order to continue to meet the information needs of the education community. I base my conclusion on five basic arguments:

1. ERIC is the most widely known and used educational resource of the US Department of Education.
2. While ERIC staff, including Office of Educational Research and Improvement monitors, have long appreciated ERIC, senior OERI and Department of Education officials have consistently undervalued, neglected, and underfunded the project.
3. ERIC's success is due largely to information analysis and dissemination activities that go beyond ERIC's contracted scope.
4. Information needs have changed dramatically in the past few years and ERIC cannot keep up with the demands given its current resources.
5. The ERIC database itself needs to be examined and probably redesigned.

In this article, I justify these arguments. In my summary, I look at the federal role in education and conclude that unless ERIC is restructured, the U.S. Department of Education will fragment the nation's already frail educational information infrastructure. Educational research and practice will lose because neither will be able to readily build on past findings.

ERIC is the most widely known and used educational resource of the U.S. Department of Education

In its early years, ERIC was primarily an archive of the education literature. Its main activity was the development of its databases, Resources in Education (RIE) and the Current Index to Journals in Education (CIJE). Its primary users were researchers; the primary mode of access was through expert intermediaries, typically reference librarians. While these two databases continue to be a major cornerstone for all clearinghouses, the rapid advancement of information technology has prompted ERIC to evolve into a much more powerful and useful resource. With the explosive growth of the Internet and CD-ROM products, ERIC as a system is now widely recognized as the central source for educational information.

Table 1
Popularity Rank of OERI ERIC, Regional Laboratory, and Research Center websites as rated by Alexa, 8/10/2000

ERIC Clearinghouses
  Reading, English, & Communication: 1,891
  Information & Technology: 5,630
  Assessment & Evaluation: 9,512
  Urban Education: 58,764
  Social Studies/Social Science Education: 67,902
  Disabilities & Gifted Education: 97,825
  Community Colleges: 99,033
  Elementary & Early Childhood Education: 157,034
  Teaching & Teacher Education: 181,268
  Educational Management: 209,587
  Higher Education: 210,740
  Adult, Career, & Vocational Education: 271,478
  Science, Mathematics, & Environmental Education: 298,924
  mean: 128,430; median: 99,033

OERI R&D Laboratories
  Northwest Regional Education Laboratory: 41,021
  Mid-continent Regional Education Laboratory: 41,620
  North Central Regional Education Laboratory: 42,519
  Southwest Educational Development Laboratory: 82,744
  WestEd: 102,588
  Appalachia Regional Education Laboratory: 167,103
  Southeastern Regional Vision for Educators: 220,079
  Northeast & Island Regional Education Laboratory: 411,025
  Pacific Region Education Laboratory: 1,020,475
  mean: 236,575; median: 102,588

OERI Centers
  Center for the Study of Teaching & Policy: 18,886
  National Center for Early Development & Learning: 52,336
  Center for Improvement of Early Reading Achievement: 171,770
  National Center for the Study of Adult Learning & Literacy: 200,195
  Center for Research on the Education of Students Placed At-Risk: 210,687
  National Center for Improving Student Learning & Achievement in Mathematics & Science: 218,904
  Center for Research on Evaluation, Standards, & Student Testing: 357,558
  National Research Center on the Gifted & Talented: 402,967
  National Research & Development Center on English Learning & Achievement: 545,177
  Center for Research on Education, Diversity, & Excellence: 782,396
  mean: 296,088; median: 214,795

Rankings by Alexa are based on page visits by Alexa users. With millions of users, Alexa claims to have the largest, most geographically and demographically diverse sample of overall web usage currently available. Organizations that do not have their own domain name are not ranked and are not shown in the table.

* ERIC/Early Childhood and ERIC/Information & Technology operate multiple websites with multiple domain names. Shown are just the rankings for the main clearinghouse website. More than half of AEL page visits are from the ERIC/Rural Clearinghouse.

Department of Education officials have consistently undervalued, neglected, and underfunded the ERIC program

This is a bold statement. It reflects 19 years of personal observation. I preface my remarks with a recognition that senior Department of Education officials arrive with large agendas and limited time. ERIC is a program that appears to be working and not causing problems. Hence it is a program that doesn't require much attention. However, ERIC has suffered both from efforts to politicize it and from benign neglect.

One of the first OERI Assistant Secretaries formed an ERIC Recompetition Design Panel involving government and non-government representatives. Inserting politics rather than informed judgment, that Assistant Secretary then claimed that the panel advocated changes that were part of his agenda and that had nothing to do with the deliberations of the Design Panel. Historically, Assistant Secretaries and other senior U.S. Department of Education officers had so many misconceptions that the Director of the ERIC program authored a paper entitled "Myths and Realities about ERIC" (Stonehill, 1995). ERIC has received few invitations to participate in various OERI panels and advisory meetings. Until recently, the ERIC program office within OERI has been severely understaffed.

For the past 10 years, the federal government has spent approximately $9 million yearly on ERIC. The funding goes to pay for the clearinghouses, a central processing facility, GPO printing of Resources in Education, and ACCESS ERIC, which serves as a contact point for the ERIC system and produces many reports previously produced by the central processing facility. Most users think we have a much bigger budget. During ERIC's lifetime, federal support for education nearly quadrupled (Hoffman, 1995). In constant dollars, however, funding for ERIC is now less than one-half what it was 20 years ago. In the last ERIC recompetition, clearinghouses were each level funded while required to provide support for AskERIC and to devote $30,000 toward web development.

Notably absent are funds for research and development. Until this year, the US Department of Education's Office for Educational Research and Improvement has spent zero dollars for study and systematic evaluation of its most visible project. In FY 2000, four papers were commissioned at $10,000 each. When one considers that ERIC has been level funded for 20 years and that virtually no money has been allocated for research and evaluation in support of the ERIC project, ERIC's accomplishments appear even more amazing. Credit goes to the ERIC Directors for being in tune with their content areas and to the ERIC program office for gently guiding ERIC without the benefit of hard data. However, the assumptions that have guided ERIC so well in the past no longer hold. Information needs have changed dramatically and, more than ever, the ERIC program office needs to be guided by data rather than by intuition and to have the benefit of adequate resources to allocate.

ERIC has always taken pride in its ability to leverage resources.

The ERIC Document Reproduction Service, which prepares microfiche of ERIC documents and distributes paper and electronic copies on demand, is a no-cost-to-the-government contract. It is paid for by standing orders for ERIC microfiche, fees collected for on-demand paper and electronic copies, and, more recently, subscriptions to the on-line, on-demand file. Central processing and quality control for the Current Index to Journals in Education was handled by Oryx Press at no charge to the government in exchange for the right to print CIJE. The private sector disseminated the ERIC database by mounting it as part of electronic information services (e.g., Dialog, BRS) or CD-ROM. Again, these activities occur at no cost to the government.

Consistent with this minimal funding level, the scope of work for the individual clearinghouses has changed little over the past 30 years. Clearinghouses are charged with:

Acquiring documents
Selecting documents for the ERIC database
Preparing citations (about 1,500-3,000 per clearinghouse each year)
Preparing Digests (about 10 per clearinghouse each year)
Preparing major publications (about 2 books per clearinghouse each year)
Giving workshops (about 2 per clearinghouse each year)
Responding to user questions

The Request for Proposals used to compete the ERIC Clearinghouses has not changed significantly in the past 20 years. In fact, the scopes of work for the individual clearinghouses have not changed. In the 1970s, career and adult education were hot topics. Approximately 12% of the documents put into the ERIC database during that time were put in by the ERIC Clearinghouse on Adult, Career, and Vocational Education. This was more than twice the average of the other clearinghouses. Despite today's interest in bilingual education, assessment, higher education, and reform, the ERIC Clearinghouse on Adult, Career, and Vocational Education continues to be contractually obligated to supply some 12% of the ERIC documents, while the clearinghouses responsible for these other topics contribute at the same levels they did 25 years ago, an average of approximately 6.0% (see Table 2). The activities of the ERIC clearinghouses should be guided by the ebb and flow of contemporary issues, contributions to knowledge, and user demand. They should not be basically static for 30 years.

Table 2
Distribution of RIE entries by Clearinghouse over time

Clearinghouse       1976/1980   1990/1998   p-ratio
Elem Ed                 4.2%        7.3%      1.73
Reading                 7.2%        9.5%      1.32
Foreign Lang            4.5%        5.8%      1.28
Test Measure            5.0%        6.0%      1.20
Cmmnity Col             4.1%        4.7%      1.16
Disab/Gifted            6.2%        6.5%      1.05
Higher Ed               7.2%        7.5%      1.04
Inform Reso             7.1%        7.2%      1.02
Social Stud             4.9%        4.9%      1.01
Teacher Ed              5.4%        5.3%      0.97
Educ Manage             6.9%        6.7%      0.96
Career/Adult Ed        12.4%       11.4%      0.92
Counsel Guid            5.5%        4.7%      0.86
Rural Sch               4.1%        3.4%      0.84
Urban Sch               5.1%        4.2%      0.82
Science Math            5.9%        4.6%      0.79

p-ratios between .80 and 1.25 indicate that the percentages are practically equivalent.

A final example of ERIC's apparent failure to be appreciated within the Department of Education has to do with the creation of an Internet presence. When it became clear that educators at all levels were expecting to see federally produced documents on the Internet, OERI provided supplemental funding to its Regional Labs to post their materials. The Labs responded with wonderful web pages, great collections of useful material. The ERIC Clearinghouses did not get any of this supplemental funding. ERIC's web presence is mostly the result of dedicated professionals staying up late at night. The irony is that the Labs and Centers receive a great deal of funding to disseminate their own research, yet, as shown in Table 1, ERIC websites have been much more effective. As the national education dissemination system (Mathtech, 1998a), ERIC is responsible for disseminating all quality material related to education and, even without sufficient funds, has been far more successful in serving the education community. I argue later that ERIC cannot maintain that level of service any longer.

Part of the problem stems from the nature of the program. ERIC is best known for its archiving of educational materials. ERIC gathers the literature and prepares the microfiche. From one point of view, ERIC is a fairly uninteresting project. It doesn't provide research breakthroughs. It does not generate headlines. It does not provide political mileage. It is not known outside of education and information science. Further, it appears to do its job adequately at the current funding level.

What senior Department of Education officials apparently have not appreciated is that, to be a quality archive, ERIC had to be a quality information center.

ERIC has established formal relationships with every major organization that produces and consumes educational resources and information. To build these relationships, ERIC has to be an appreciated provider of information services.

ERIC's success is due largely to many marginal activities beyond ERIC's contracted scope

The success of ERIC is clearly not due solely to its efforts to gather papers and build a database. Rather, ERIC's success is due, to a great extent, to its value-added services. ERIC excels at identifying what will be helpful to its clients, identifying what is relevant and of high quality, and organizing and presenting information. In other words, ERIC is successful because it blends information science with subject matter expertise.

Some ERIC activities that are beyond the basic scope of clearinghouse work are:

Mounting and maintaining the ERIC database on the web
Most responses to Frequently Asked Questions
Pathfinders
Newsletters
Journals (print and electronic)
Newsletter and journal columns
Workshops (beyond the first 2 each year)
All printing activities
All research activities
Bookstores
Major publications and books (beyond the first 2 each year)
Development of lesson plans
Compilations of reference materials
Writing state-of-the-art search software for the web
Test Locator
Most web activities beyond simply establishing an Internet presence

The magnitude of these out-of-contract activities is evident in the wide range of on-line services offered at ERIC Clearinghouse websites, especially the more popular ERIC websites: those of the Reading, Information Resources, Assessment, Social Studies, Urban, and Disabilities/Gifted Clearinghouses. These are massive websites with many special features. However, they are marginal relative to what could be accomplished with a concerted, well-planned, and well-supported effort. Lynch (2000) points out that ERIC needs to be concerned with database services in addition to database building. The Clearinghouses undertake these activities because this is what is necessary to be a viable clearinghouse. The time to create these products comes as volunteer time, either contributed by individuals or by their host institutions. Several ERIC Clearinghouses actually view the ERIC contract as a franchise license (Colker, 2000) and put a great deal of effort into selling and making money from books with the ERIC label. They then use this money to support the necessary Clearinghouse efforts not adequately funded by the government. Senior Department staff appear to be oblivious to these activities. They are paying primarily for the creation of the database; to them, everything else appears to be viewed as tangential. The Directors, however, view these activities as critical to clearinghouse success.

Information needs have changed dramatically in the past few years

For thirty-five years, the ERIC database has been built around well-established information science principles. Abstracts are developed following a set of standards. Citations draw upon authority lists so publication types, journals, and organizations are always presented the same way. The ERIC Thesaurus is used to identify appropriate major and minor descriptors. The ERIC procedures manual takes more than a foot of shelf space. The quality of the ERIC database in terms of its structure is well appreciated in the information science community.

About 10 years ago, most ERIC searching was conducted by expert intermediaries. Reference librarians familiar with the ERIC database and trained in information retrieval would conduct searches rather than the end user. Once information needs were clearly identified, the intermediary would often present a highly relevant set of references. In my experience, I usually received 30 to 100 citations that were of potential interest. I would then spend hours in the library looking up and obtaining appropriate articles. The process would take weeks.

That type of searching has changed. Today the end user conducts his or her own search. When reference services are provided, the end user is often given 10 to 15 potentially relevant citations. End users today would like to obtain the most current information and they want it immediately. ERIC has responded by now offering the full text of RIE documents since 1994, on demand (for more information, read about the E*subscribe program at www.edrs.com). Efforts are underway to make ERIC more timely.

To underscore that information needs have changed, let me ask a set of questions. Which would you prefer to search?

a. National Academy of Science full text of their books on-line
b. OCLC First Search of full-text journals
c. ERIC (abstracts only)

Twenty years ago, there were few options. Five years ago, ERIC was still basically the only education database. University Microfilms International (UMI) provided access to most of the journal articles in ERIC. The ERIC Document Reproduction Service provided access to the documents in RIE. Today, there are multiple education databases. For most people, the first preference will be high-quality materials they can get immediately. OCLC, EBSCOHost, JSTOR, CatchWord, the American Psychological Association, and others are creating fee-based databases linked to the full text of peer-reviewed articles. ERIC's CIJE database has no such set of links, and UMI no longer provides reprint services. However, documents in ERIC's RIE database that were prepared in 1994 and later are now available on demand, on-line. Should ERIC continue to abstract journal articles if it can't make them readily available?

Which would you prefer?

a. Packages with an introduction to an issue and carefully selected full-text resources
b. An annotated bibliography
c. Search for yourself

Obtaining an answer to an education question is often not a trivial task. The literature is full of high- and low-quality articles; it is often difficult to identify potentially relevant articles, let alone key articles.

Ten years ago, there were few information analysis packages, and those that existed were often difficult to find. A lengthy annotated bibliography was considered a great starting tool. Today, there is a growing number of expertly prepared responses to Frequently Asked Questions. These make excellent starting points when one is interested in searching a topic. Today, any FAQ is a blessing. In five years, however, the demand will be for quality FAQs. In a watch-dog role, the researchers in the content area will want to be sure novices are led to the best resources. Novices will want the best resources. Quality FAQs, with expert introductions to each topic's special problems and key references identified, require reference librarians working in conjunction with subject experts, as well as peer review and periodic updating. Today's ERIC can develop some FAQs, but not enough, not at the quality ERIC is capable of, and not with the ongoing maintenance FAQs require.

You need to make a policy decision. Which do you prefer?

a. Carefully edited briefing papers presenting all sides of an issue
b. A selected collection of abstracts that summarize papers
c. A large collection of abstracts that summarize papers
d. Short abstracts that indicate without summarizing

This question illustrates several points. First, a search of the ERIC database may be the end product desired by researchers, but it is generally a long way from the information desired by policymakers. Researchers may be willing to wade through indicative abstracts. Unless the policymaker has the luxury of time and is a researcher, the policymaker would prefer informative abstracts that summarize a paper. Ten years ago, the policymaker would have been happy with a large collection of informative abstracts, or better yet, a carefully selected collection.

Today, when information is required, the need is for greater depth and for immediate answers, or at least viewpoints. ERIC's Digest series fills that role nicely. Some 80,000 Digests are distributed each month by www.ed.gov and ericae.net. But will Digests be adequate, let alone optimal, five years from now? I don't think so. The clearinghouses are told to budget approximately $1,200 for each Digest title. This amount does not provide the resources for an analysis of policy decisions, for the commissioning of papers, or even for assuring that the Digests are of the highest possible quality. While the education community has been very supportive of the ERIC Digest series and most expert authors are willing to volunteer to write Digests, something that is designed to introduce topics and possibly help guide decision making should not be funded at the lowest possible level.

Which do you prefer to help you search for resources?

a. An expert in your field who is also an expert reference librarian
b. An expert librarian to search for you
c. A graduate student to search for you
d. Search for yourself

Ten years ago, one often used an expert librarian to help locate resources. There was often some tension, as the expert librarian often did not have the subject-matter expertise. With the growth of on-line services, such as Dialog and the Internet, many have searched for themselves and have become frustrated (Rudner, 2000). The Clearinghouses now provide on-line reference services in response to those needs.

In theory, we have subject-matter experts within the ERIC system, and they respond with a set of relevant ERIC and Internet resources. In many ways, this has been a major success. Most patrons have been delighted with the service. However, ERIC cannot provide reference services as it does for the next five years. The clearinghouses are told to budget approximately $10.00 to respond to questions, and it typically takes 30 to 45 minutes to provide a response. At this rate, most questions are answered by junior staff and graduate students. At that funding level, we cannot provide the quality and systematic evaluation that we would like and patrons should receive. The problem will get worse, as the number of questions is increasing rapidly each year and the current ERIC contracts only allow for minor increments.

You are a researcher or practitioner. Which do you prefer?

a. Search a carefully constructed pathfinder of the best resources
b. Search the entire Internet by yourself

Of course, ten years ago, the Internet was not an option. Perhaps last year, many were content to search the Internet themselves. But the Internet has become massive and overwhelming. Using the major search engines often yields many irrelevant links. Typically, the user enters a word or two and the engines provide a crude ranking and relevancy match based on all the text appearing on each web page. Improvements in this area will be marginal at best. An alternative is a carefully constructed pathfinder that identifies, organizes, and annotates resources within a given field. The Argus Corporation (www.clearinghouse.net) maintains an impressive list of such pathfinders. Many ERIC Clearinghouses have developed such tools, and they are well received. But pathfinders must be maintained. URLs change; new resources become available; the pathfinder categories need to evolve; and resources should be continuously evaluated. Five years from now, the Clearinghouses will not be able to maintain their pathfinders as volunteer activities, given increasing demand and the sheer growth in the knowledge base.

The ERIC database itself needs to be examined and probably redesigned

The ERIC system has always sought to be a comprehensive database by including virtually everything that has been written about education. The idea was that if the database is comprehensive, then with the right search strategy, a person could find everything that is important to them. With constant level funding, however, the reality is that ERIC is no longer comprehensive. Several education-related journals are not routinely put into the database. Acquisition of conference papers is often not aggressive. Many high-quality state and federal reports do not get into the database.

There is a real question whether the mix of documents being put into the ERIC database is optimal. To address this question, I looked at the demand and supply of ERIC citations. On the demand side, I analyzed characteristics of two datasets: 1) 56,073 ERIC citations retrieved by web patrons of the ERIC Clearinghouse on Assessment and Evaluation during three days in September 1999, and 2) all 35,433 documents ordered from the ERIC Document Reproduction Service in 1999. I looked at the target audience, publication type, clearinghouse codes, descriptors, and publication years within each of the ERIC citations. I evaluated demand in terms of the absolute number and percent of retrieved citations with the addressed characteristics.

I evaluated supply using the percent of documents in the ERIC database from 1985 with the addressed characteristics. Supply for the first data set included both CIJE and RIE documents; for the second data set, just RIE documents.

A major problem with retrieval percentage as a demand indicator is that it is heavily influenced by supply. If nearly all the documents in the database were of a certain type, for example, then we would expect nearly all the retrieved documents to be of that type. To gauge the relationship of demand and supply, I computed a probability ratio by dividing the percent of retrieved documents with the addressed characteristic by the percent of documents in the ERIC database with that characteristic. A ratio of 1.0 would indicate that supply exactly equals demand. A ratio greater than 1.25 is accepted as indicating that there is greater demand than supply. A ratio less than .80 indicates that the supply is greater than demand. Because the sample sizes are so big, all ratios are significantly different from 1.000. One should concentrate on practical significance. Table 3 shows supply and demand by target audience; Table 4 shows supply and demand by publication type.

This evaluation of supply and demand is in terms of quantity, not quality. While there may not be many documents of a certain type in the database, the few that are in the database may address the patron questions and completely meet the demand. Further, low demand does not necessarily indicate that a document type should not be sought. Demand may be low because patrons don't know that a certain type of document may be in the database. Other documents, such as publications from the National Center for Educational Statistics, should be archived and hence belong in the database even if they are in low demand. Nevertheless, ERIC acquisitions need to be rethought.

Table 3
Supply and Demand of ERIC Citations by Target Audience

                      On-line citations              Reproduced documents
Audience           Demand   Supply   Ratio        Demand   Supply   Ratio
Community            0.7%     0.5%    1.49          1.6%     0.7%    2.43
Practitioners       50.2%    18.3%    2.75         43.2%    18.9%    2.29
Counselors           0.3%     0.4%    0.91          0.8%     0.5%    1.56
Parents              1.3%     0.7%    1.79          2.5%     1.6%    1.54
Support Staff        0.1%     0.1%    0.41          0.1%     0.1%    1.21
Administrators       3.2%     3.8%    0.84          4.4%     3.9%    1.13
Researchers          2.5%     5.1%    0.49          2.2%     2.1%    1.07
Students             1.3%     1.6%    0.81          2.9%     2.7%    1.06
Teachers            14.6%     9.9%    1.48         11.0%    11.4%    0.97
Policymakers         2.3%     2.8%    0.83          3.0%     3.3%    0.92

p-ratios between .80 and 1.25 indicate that the percentages are practically equivalent.

Table 4
Supply and Demand of ERIC Citations and Documents by Publication Type

                            On-line citations            Reproduced documents
Publication Type         Demand   Supply   Ratio       Demand   Supply   Ratio
ERIC Product               0.9%     0.9%    1.03         3.8%     2.3%    1.68
Thesis                     0.6%     0.3%    2.27         1.4%     0.8%    1.65
Review Literature          9.5%     7.5%    1.26        10.5%     6.4%    1.64
Dissertation               0.4%     0.3%    1.29         0.9%     0.6%    1.42
Research Report           31.4%    30.6%    1.02        30.7%    25.9%    1.19
Conference Paper           9.5%    12.6%    0.76        31.7%    28.5%    1.11
Practicum Paper            0.5%     0.4%    1.50         1.3%     1.2%    1.09
Position Paper            14.4%    19.1%    0.75         9.5%     9.7%    0.98
Test, Questionnaire        2.1%     2.7%    0.75         6.0%     6.4%    0.93
Evaluative Report         11.4%     8.6%    1.33        10.0%    11.5%    0.87
Project Description       18.7%    20.9%    0.90        13.3%    16.8%    0.79
Bibliography               1.2%     1.7%    0.69         1.7%     2.2%    0.76
Non-clssrm Material        9.2%     7.5%    1.22         8.5%    11.3%    0.75
General Report             1.1%     2.3%    0.48         0.8%     1.1%    0.70
Teaching Guide             8.9%     9.2%    0.97         5.8%     8.7%    0.67
Confer. Proceedings        0.6%     1.1%    0.52         1.3%     2.2%    0.59
Historical Material        0.5%     1.2%    0.38         0.5%     1.0%    0.53
Directory                  0.3%     0.6%    0.51         0.6%     1.2%    0.50
General Reference          0.1%     0.2%    0.65         0.1%     0.3%    0.47
Legal Material             0.6%     1.3%    0.45         0.7%     1.7%    0.40
Statistical Material       1.0%     2.2%    0.47         1.8%     4.6%    0.40
Instructional Material     0.5%     1.5%    0.36         0.8%     2.1%    0.39
Book                       4.2%     2.1%    2.05         1.7%     8.0%    0.21
Audiovisual Material       0.1%     0.1%    0.53         0.0%     0.3%    0.09

p-ratios between .80 and 1.25 indicate that the percentages are practically equivalent.

Based on this analysis, the most popular types of documents are those flagged as written for practitioners and teachers; demand for these types of documents exceeds the supply in the database. Documents written expressly for researchers are also in demand; however, there appears to be an adequate supply of such documents. There is very little demand, however, for historical materials, directories, general reference material, legal material, and audio-visual material. Of special interest is that there is very little demand for instructional material. Right now, patrons do not come to ERIC in search of materials to use in their classrooms. Yet a significant portion of documents are selected for inclusion in the database on the grounds that a teacher may find the materials useful. The data suggest that ERIC should either market the availability of these types of documents or put much less effort into their acquisition.

Another read of these data is that demand exceeds supply for comprehensive materials such as literature reviews, books, theses, and dissertations, as well as evaluative materials. One reviewer pointed out that ERIC needs a better policy with regard to books. On one hand, there are databases for books, and one could flood the database with textbooks. On the other hand, books providing insights into policy issues and books summarizing scholarly research are sorely needed and are not adequately being identified by ERIC.

I noted earlier that the scopes of work for the ERIC Clearinghouses have not changed significantly in the past 25 years. As shown in Table 5, this lack of change may be becoming problematic. Five clearinghouses are putting in significantly more documents than people seem to be demanding. Further, these clearinghouses supply about one-third of the documents in the ERIC database yet account for only one-fifth of the demand. This is not to say that the mix of documents in the ERIC database should be determined by demand, but rather that the mix of clearinghouse activities needs to be periodically re-examined.

The ERIC database is composed of a documents database, RIE, and a journal article database, CIJE. While the documents in RIE are not peer reviewed, the RIE database has many advantages. It serves as a pre-print service for many papers originally presented at conferences. It serves as an archive for on-line journals, such as Education Policy Analysis Archives. And it contains state and federally produced reports. Most importantly, ERIC can make most of these documents available, either through the microfiche collection or, for documents submitted after 1994, on-line. Thus, people can search the RIE database and usually obtain the documents.

The same is not true for CIJE. Patrons finding articles in CIJE need to go to an academic library or, if the article is in one of a limited number of journals, order the document through a reprint service. Thus, CIJE presents additional work for the patron, and there are alternatives. As mentioned earlier, OCLC, EBSCO, and the American Psychological Association provide on-line access to a growing number of journal articles. H.W. Wilson's Education Abstracts database covers many of the journals covered by CIJE.

Perhaps ERIC should drop CIJE in light of these other databases, or perhaps index only those journals it can archive in RIE.

Table 5
Supply and Demand of ERIC Citations and Documents by Clearinghouse

                       On-line citations            Reproduced documents
Clearinghouse        Demand   Supply   Ratio      Demand   Supply   Ratio
Ed Manage              9.6%     6.4%    1.52        9.2%     6.7%    1.38
Teacher Ed             7.7%     5.0%    1.53        6.5%     5.3%    1.24
Disab/Gifted          16.3%     8.2%    1.99        7.5%     6.5%    1.16
Early Child            9.6%     5.6%    1.71        7.9%     7.3%    1.09
Reading                9.4%     8.2%    1.15       10.3%     9.5%    1.08
Assessment             4.8%     4.6%    1.04        6.4%     6.0%    1.07
Commn Col              1.7%     2.8%    0.59        4.7%     4.7%    1.00
Urban                  4.4%     4.0%    1.10        4.1%     4.2%    0.98
Counsel                6.1%     6.4%    0.96        4.6%     4.7%    0.97
Foreign Lang           3.2%     5.0%    0.65        5.2%     5.8%    0.89
Rural                  3.6%     3.0%    1.18        2.9%     3.4%    0.87
Sci Math               5.7%     7.5%    0.76        3.2%     4.6%    0.70
Higher Ed              3.8%     7.4%    0.51        5.2%     7.5%    0.69
Info Resou             4.9%     8.0%    0.61        4.9%     7.2%    0.68
Career/Adult Ed        5.1%    10.0%    0.51        6.4%    11.4%    0.56
Soc Stud               4.0%     6.1%    0.65        2.8%     4.9%    0.56

p-ratios between .80 and 1.25 indicate that the percentages are practically equivalent.

Summary and Conclusions

ERIC's value lies in its ability to make educational information relevant to a wide range of consumers. ERIC does this by identifying resources, organizing information, applying information science, using literature, synthesizing information, developing new information tools, and developing special information products. While building the database has been its central activity, the most visible and useful ERIC accomplishments are not part of the core ERIC contract. They do, however, stem from the database and the process of building the database.

I have argued that ERIC will not be able to provide its current level of services much longer because demand is outpacing institutional and personal capacity. If ERIC maintains the low levels of service the government currently funds, without any effort to redirect and expand resources to meet demonstrated need, the education community will lose. ERIC is the information infrastructure for American education. While operating at a fraction of its capacity, it has effectively provided access to the wide range of information and information services produced across the country. The need to build this education information infrastructure is increasing. Perhaps more than ever, the education community needs to use information to inform decision-making at all levels. The daily instructional activities of America's 3,000,000 elementary and secondary school teachers should be guided by sound educational practices. Administrators and policymakers should benefit from the management decisions made by their colleagues. Research is a cumulative science and should be built on the methods and findings of other researchers, with built-in mechanisms for dissemination and feedback from practitioners.

The need to build and maintain the education information infrastructure exists, and the responsibility falls squarely on the U.S. Department of Education. Historically, there have been two criteria for determining the appropriateness of government interventions (programs):

1. limit the intervention of all governments to undertaking only those activities whose purposes are unattainable in the desired amount or quality through private action and where the public benefits equal or exceed the public costs of production, and
2. remand the public intervention to the lowest level (local, state, federal, or some combination) where the function can be effectively performed (Mathtech, 1998b).

By these criteria, providing information to the education community is clearly an appropriate federal role. Federal involvement in this area prevents needless duplication of effort, can assure better quality, can assure a range of products, and is cost effective.

ERIC could be doing a great deal more in its quest to provide information to the education community. I have mentioned several things ERIC is not doing:

systematically gathering and analyzing patron satisfaction information
systematically analyzing queries and search strategies to identify user community training needs and topics of interest
designing benchmarks and systematically evaluating and improving the quality of reference services
producing management resources to be shared across the 16 clearinghouses
gathering and analyzing high-quality usage statistics
vigorously pursuing acquisitions
vigorously acquiring and cataloging web resources
providing access to the journal literature
marketing and disseminating itself to a broader audience
preparing articles about the project

I have also mentioned some things ERIC is doing, but should do more of:

developing a wide range of content-oriented training material
disseminating information about itself
establishing on-line electronic journals
creating access to full-text documents
posting quality materials on the Internet as they are acquired
providing more syntheses and information products

ERIC has amply demonstrated the need to infuse information science into the various educational subject matter disciplines, and its ability to do so. ERIC needs to expand if it is to institutionalize its current level of service and respond well to the information requests of the 21st century. Properly funding the volunteer activities will allow for more concentrated effort and inevitably higher quality and usability. Just as educational practice and advances should be based on research, ERIC also needs a program of research into ways of being more responsive to user needs.

The ERIC of today is confronted with a vastly different user base, mode of access, mix of services, and set of demands. No, ERIC is not ready for this new environment. It has the ability, but not the resources and not the guidance. In my view, this will hurt not only the research community but, more importantly, teachers and practitioners, who have neither the time, desire, nor ability to sift through today's overwhelming volumes of potential resources.

Notes

1. Based on a paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA, April 24-28, 2000.
2. This study did NOT receive any funding from the U.S. Department of Education.

Endorsements

The Directors of the following ERIC Clearinghouses have indicated that they concur with most, but not necessarily all, of the points raised in this article:

ERIC Clearinghouse on Higher Education,
ERIC Clearinghouse on Counseling and Student Services,
ERIC Clearinghouse on Educational Management,
ERIC Clearinghouse on Elementary and Early Childhood Education,
ERIC Clearinghouse on Languages and Linguistics,
ERIC Clearinghouse on Urban Education,
ERIC Clearinghouse on Reading, English and Communication,
ERIC Clearinghouse on Disabilities and Gifted Education,
ERIC Clearinghouse on Information and Technology,
ERIC Clearinghouse on Social Studies/Social Science Education,
ERIC Clearinghouse on Community Colleges, and
ERIC Clearinghouse on Rural Education and Small Schools.

I am the Director of the ERIC Clearinghouse on Assessment and Evaluation.

References

Colker, Laura J. (2000). Reminiscences from the Field: The Continuing Story of ERIC. Springfield, VA: Dyntel Corporation.

Eisenberg, Mike; Henson, Jane; Howley, Craig; Cawley, Nancy; Ramirez, Bruce; Rothenberg, Dianne (1997). Rising Expectations: A Framework for ERIC's Future in the National Library of Education. Report of the ERIC Operations Framework Task Force. (ERIC Document Reproduction Service No. ED410969)

ERIC Annual Report 1999. Available online: http://www.accesseric.org/resources/annual99/index.html

Hoffman, Charlene M. (1995). Federal Support for Education: Fiscal Years 1980 to 1995. Washington, DC: National Center for Education Statistics. (ERIC Document Reproduction Service No. ED392853)

Lynch, Clifford A. (2000). Technology and the ERIC System: New Opportunities and New Impacts. Discussion draft for the March 8, 2000 ERIC Directors' Meeting.

Mathtech (1998a). Shaping the Future of Educational Research, Development, and Communication. OERI Reauthorization Working Papers prepared for the National Educational Research Policy and Priorities Board. (ERIC Document Reproduction Service No. ED428105)

Mathtech (1998b). A Review of the Federal Role and the Department of Education Structure. OERI Reauthorization Working Papers prepared for the National Educational Research Policy and Priorities Board. (ERIC Document Reproduction Service No. ED428105)

Rudner, Lawrence M. (1997). ERIC and NLE: Past, Present, and Future. Paper prepared for the National Educational Research Policy and Priorities Board, May 30, 1997.

Rudner, Lawrence M. (2000). Who Is Going to Mine Digital Library Resources? And How? D-Lib Magazine, 6(5), May 2000. On-line: http://www.dlib.org/dlib/may00/rudner/05rudner.html

Stalford, Charles B.; Stern, Joyce D. (1990). Major Results of a Survey on the Use of Educational R&D Resources by School Districts. Paper presented at the Annual Meeting of the American Educational Research Association (Boston, MA, April 16-20, 1990). (ERIC Document Reproduction Service No. ED329212)

Stonehill, Robert M. (1992). "Myths and Realities About ERIC." ERIC Digest. Washington, DC: Office of Educational Research and Improvement, National Center for Education Statistics. (ERIC Document Reproduction Service No. ED345756)

Trester, Delmer J. (1981). ERIC: The First Fifteen Years: 1964-1979. Columbus, OH: SMEAC Information Reference Center.

About the Author

Lawrence M. Rudner
Director, ERIC Clearinghouse on Assessment and Evaluation
Department of Measurement, Statistics and Evaluation
1129 Shriver Laboratory
University of Maryland, College Park
College Park, MD 20742
Email: rudner@ericae.net

Lawrence Rudner is with the College of Library and Information Services, University of Maryland, College Park. He has been involved in quantitative analysis for over 30 years, having served as a university professor, a branch chief in the U.S. Department of Education, and a classroom teacher. For the past 12 years, he has been the Director of the ERIC Clearinghouse on Assessment and Evaluation, an information service sponsored by the National Library of Education, U.S. Department of Education. Dr. Rudner holds a Ph.D. in Educational Psychology (1977), an MBA in Finance (1991), and lifetime teaching certificates from two states.

Copyright 2000 by the Education Policy Analysis Archives

The World Wide Web address for the Education Policy Analysis Archives is epaa.asu.edu

General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, glass@asu.edu, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-0211 (602-965-9644). The Commentary Editor is Casey D. Cobb: casey.cobb@unh.edu.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
Greg Camilli, Rutgers University
John Covaleskie, Northern Michigan University
Alan Davis, University of Colorado, Denver
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Richard Garlikov, hmwkhelp@scott.net
Thomas F. Green, Syracuse University
Alison I. Griffith, York University
Arlen Gullickson, Western Michigan University
Ernest R. House, University of Colorado
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Calgary
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Dewayne Matthews, Western Interstate Commission for Higher Education

William McInerney, Purdue University
Mary McKeown-Moak, MGT of America (Austin, TX)
Les McLean, University of Toronto
Susan Bobbitt Nolen, University of Washington
Anne L. Pemberton, apembert@pen.k12.va.us
Hugh G. Petrie, SUNY Buffalo
Richard C. Richardson, New York University
Anthony G. Rud Jr., Purdue University
Dennis Sayers, Ann Leavenworth Center for Accelerated Learning
Jay D. Scribner, University of Texas at Austin
Michael Scriven, scriven@aol.com
Robert E. Stake, University of Illinois, Urbana-Champaign
Robert Stonehill, U.S. Department of Education
David D. Williams, Brigham Young University

EPAA Spanish Language Editorial Board

Associate Editor for Spanish Language
Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México
roberto@servidor.unam.mx

Adrián Acosta (México), Universidad de Guadalajara
adrianacosta@compuserve.com

J. Félix Angulo Rasco (Spain), Universidad de Cádiz
felix.angulo@uca.es

Teresa Bracho (México), Centro de Investigación y Docencia Económica-CIDE
bracho@dis1.cide.mx

Alejandro Canales (México), Universidad Nacional Autónoma de México
canalesa@servidor.unam.mx

Ursula Casanova (U.S.A.), Arizona State University
casanova@asu.edu

José Contreras Domingo, Universitat de Barcelona
Jose.Contreras@doe.d5.ub.es

Erwin Epstein (U.S.A.), Loyola University of Chicago
Eepstein@luc.edu

Josué González (U.S.A.), Arizona State University
josue@asu.edu

Rollin Kent (México), Departamento de Investigación Educativa-DIE/CINVESTAV
rkent@gemtel.com.mx, kentr@data.net.mx

María Beatriz Luce (Brazil), Universidad Federal de Rio Grande do Sul-UFRGS
lucemb@orion.ufrgs.br

Javier Mendoza Rojas (México), Universidad Nacional Autónoma de México
javiermr@servidor.unam.mx

Marcela Mollis (Argentina), Universidad de Buenos Aires
mmollis@filo.uba.ar

Humberto Muñoz García (México), Universidad Nacional Autónoma de México
humberto@servidor.unam.mx

Angel Ignacio Pérez Gómez (Spain), Universidad de Málaga
aiperez@uma.es

Daniel Schugurensky (Argentina-Canadá), OISE/UT, Canada
dschugurensky@oise.utoronto.ca

Simon Schwartzman (Brazil), Fundação Instituto Brasileiro de Geografia e Estatística
simon@openlink.com.br

Jurjo Torres Santomé (Spain), Universidad de A Coruña
jurjo@udc.es

Carlos Alberto Torres (U.S.A.), University of California, Los Angeles
torres@gseis.ucla.edu