
Educational policy analysis archives


Material Information

Title:
Educational policy analysis archives
Physical Description:
Serial
Language:
English
Creator:
Arizona State University
University of South Florida
Publisher:
Arizona State University
University of South Florida.
Place of Publication:
Tempe, Ariz
Tampa, Fla
Publication Date:
July 22, 2004
Subjects

Subjects / Keywords:
Education -- Research -- Periodicals   ( lcsh )
Genre:
non-fiction   ( marcgt )
serial   ( sobekcm )

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E11-00383
usfldc handle - e11.383
System ID:
SFS0024511:00382


Full Text



Education Policy Analysis Archives

A peer-reviewed scholarly journal
Editor: Gene V Glass, College of Education, Arizona State University

Copyright is retained by the first or sole author, who grants right of first publication to the Education Policy Analysis Archives. EPAA is a project of the Education Policy Studies Laboratory. Articles are indexed in the Directory of Open Access Journals (http://www.doaj.org).

Volume 12 Number 34   July 22, 2004   ISSN 1068-2341

Southern Association of Colleges and Schools Accreditation: Impact on Elementary Student Performance

Darlene Y. Bruner
University of South Florida

Lance Lamar Brantley
Ware County Public Schools (GA)

Citation: Bruner, D. Y. & Brantley, L. L. (2004, July 22). Southern Association of Colleges and Schools accreditation: Impact on elementary student performance. Education Policy Analysis Archives, 12(34). Retrieved [date] from http://epaa.asu.edu/epaa/v12n34/.

Abstract

Currently, 848 Georgia public elementary schools that house third and fifth grades in the same building use Southern Association of Colleges and Schools (SACS) accreditation as a school improvement model. The purpose of this investigation was to determine whether elementary schools that are SACS accredited increased their levels of academic achievement at a higher rate over a five-year period than elementary schools that were not SACS accredited, as measured by the Iowa Tests of Basic Skills (ITBS). Independent variables included accreditation status, socioeconomic status (SES) of schools, and baseline scores of academic achievement. Dependent variables included mathematics and reading achievement scores. Statistically significant differences were found when examining the SES and baseline scores of the elementary schools: SACS accredited elementary schools had higher SES and higher baseline scores in third- and fifth-grade mathematics and reading. However, the multiple regression model indicated no statistically significant differences in gain scores between SACS accredited and non-SACS accredited elementary schools in third- and fifth-grade mathematics and reading achievement during the five-year period examined in this study.

Throughout the history of education, schools have been reformed, restructured, and re-cultured to meet societal needs.

Schools are struggling to meet the demands of high-stakes testing and to identify interventions that can improve student performance while, at the same time, facing the challenge of educating a growing at-risk population. Programs and services are coming under scrutiny as schools attempt to meet the achievement levels set by their legislatures.

Georgia, like many other states, works to answer the national call for school improvement. The Quality Basic Education (QBE) initiative became law in 1986 and sought to reform Georgia schools and hold them accountable for student achievement (Elmore, 1992). This law requires the publication of school and district performance on standardized tests. The QBE law was the stimulus for the development of the Georgia Quality Core Curriculum (QCC) objectives, an attempt to standardize the curricula in Georgia schools (Elmore, 1992). House Bill 1187 (Smith et al., 2000), known as Georgia's A Plus Education Reform Act, placed emphasis on ending social promotion, training teachers in technology skills, funding a school nurse in every school, lowering class size in an attempt to increase student achievement, and increasing teacher accountability. Schools are given a letter grade based on student performance, and trained school improvement specialists offer assistance to schools that receive a failing grade (Smith et al., 2000). Throughout these educational reforms, SACS has attempted to restructure schools to meet the accountability demands (Miller, 1998).

The Southern Association of Colleges and Schools (SACS) consists of the Commission of Elementary and Middle Schools (founded in 1965), the Commission of Secondary and Middle Schools (founded in 1912), and the Commission of Colleges (founded in 1919). The central purpose of SACS is the improvement of education in the southern United States through the process of accreditation. Accreditation is a voluntary process of evaluation concerned with improving educational quality and assuring the public that members of accredited institutions meet established standards. The SACS school improvement process embraces the concepts of shared governance (Perry, Brown, & McIntyre, 1993) and the school improvement process espoused by Lezotte and Jacoby (1990). Specifically, a quality school improvement process for elementary and middle schools, according to the bylaws of SACS, involves three phases: planning, peer review, and implementation for continuous improvement (Miller, 1998).

The planning phase usually takes 12 to 18 months according to the Commission of Elementary and Middle Schools (1999). In this phase, schools collaboratively develop a profile of the school (socioeconomic status, race and gender data, etc.).

The school stakeholders then develop a shared instructional covenant that includes the vision, mission, and beliefs for the school. This shared vision gives direction and determines long-term goals for the schools (Sunoo, 1996). Theoretically, the instructional covenant helps to increase student achievement by focusing all aspects of the school toward a common instructional purpose (Allen & Calhoun, 1998). During this phase, educators analyze instructional and organizational effectiveness and develop an action plan that is based on data collected from the school and addresses its specific needs. In the latter stages of the planning phase, stakeholders must implement the action plan while documenting progress and making modifications to the plan as needed.

In the peer review phase, teachers, counselors, and administrators from other SACS accredited schools comprise the peer review team. The team is trained in the SACS school improvement process. The focus of the peer review team is to provide the school with an assessment of the action plans, the implementation process, and the effectiveness of the school improvement planning process, and to determine if benchmarks are being met. A written SACS report is prepared with the team's recommendations for the host school (SACS Proceedings, 2000). The school's improvement is based on "a continuous and sustained phase of implementation, monitoring, and revisions of the action plan for school improvement" (Commission of Elementary and Middle Schools, 1999, p. 105).

The final phase is implementation and includes preparation, effective monitoring, and communication by reporting. School stakeholders must review the recommendations of the peer review team and their goals and objectives to ascertain that they are measurable and attainable. The stakeholders achieve effective monitoring when there is evidence of increased student performance and documented changes to the action plans as new needs arise based on the performance results (Commission of Elementary and Middle Schools, 1999).

Andrews (1999) conducted a quantitative research investigation comparing student achievement over time between students in SACS accredited elementary schools and those attending non-SACS accredited schools. The researcher matched baseline scores from the year before accreditation with similar schools that were not accredited, and comparisons were conducted between mean scores over a three-year period. The study found no statistically significant difference between SACS accredited elementary schools and non-SACS accredited schools, suggesting that SACS accreditation is not an effective model for improving student achievement (Andrews, 1999). Andrews (1999) suggested that another investigation was needed that examined a five-year period because the SACS school improvement process requires a peer review every five years.

The purpose of this study was to determine whether elementary schools that are SACS accredited increase their levels of academic achievement at a higher rate over a five-year period than do elementary schools that are not SACS accredited, as measured by the third- and fifth-grade Iowa Tests of Basic Skills (ITBS). Standardized test scores were examined in the areas of reading and mathematics over a five-year period because SACS accreditation is a five-year process. There were two research questions:

1) Is there a differential gain in reading achievement over time for students enrolled in elementary schools that attain Southern Association of Colleges and Schools accreditation?

2) Is there a differential gain in mathematics achievement over time for students enrolled in elementary schools that attain Southern Association of Colleges and Schools accreditation?

Methods

Sample Population

Elementary schools in this study are defined as public schools in Georgia that are listed by the Council for School Performance (1999) and contain both Grades 3 and 5, as determined by the Georgia Public Education Directory (Georgia Department of Education, 2001). There were 217 non-SACS accredited schools that met the criterion of housing both third and fifth grades (control group). The Southern Association of Colleges and Schools provided a list of all elementary schools in the state of Georgia that were accredited in 1996. There were 41 elementary schools, housing both third and fifth grades, in the state of Georgia and accredited by SACS in 1996 (treatment group).

To determine the socioeconomic status of the schools in this investigation, data obtained from the Council for School Performance were reviewed. These data divide all schools in the state of Georgia into 13 clusters based on factors such as socioeconomic status, race, and locale. Schools in clusters 1-3 are considered high socioeconomic status, clusters 4-8 medium, and clusters 9-13 low. The socioeconomic status of the 41 SACS elementary schools was 70% high, 17.5% medium, and 12.5% low. One of the SACS accredited schools housed only Grades K-2 and was eliminated from the study because it did not meet the requirement of having third- and fifth-grade mathematics and reading scores. The sample of SACS accredited schools used in the final analysis of mathematics and reading achievement scores was n = 18 because 22 of the SACS accredited schools did not have baseline ITBS scores due to the Georgia Legislature passing laws for consolidation of schools in small districts.

Instrumentation

Achievement data were collected from archival sources using the total reading and total mathematics Normal Curve Equivalent (NCE) scores on the ITBS. The ITBS was the only standardized test administered to every third- and fifth-grade student in Georgia for the years studied. The ITBS produces scores that have high validity and reliability coefficients (Impara & Plake, 1998). Published by the Riverside Publishing Company and authored by Hoover, Hieronymus, Frisbie, and Dunbar, the ITBS is a norm-referenced test administered from kindergarten through Grade 8. ITBS scores are reported as raw scores, percentile ranks, grade equivalent (GE) scores, and scaled scores. Composite reliabilities and core total reliabilities are all above 0.90 (Impara & Plake, 1998). Equivalent-forms reliability estimates are in the acceptable range, and norm-referenced scores have internal consistency reliability estimates (i.e., K-R 20) for the subscale scores between 0.85 and 0.92 (Impara & Plake, 1998).

The ITBS is a timed test; students are given 50 minutes to complete the reading portion and 50 minutes to complete the mathematics segment. The reading section is presented in multiple-choice format and contains questions on reading comprehension and vocabulary. A multiple-choice design is also used in the mathematics section, which contains questions on computation and reasoning/logical skills. The score sheet is graded electronically, and local boards of education keep records of the scores. The Georgia Elementary School Report Card reports the data for each school in the state.

Additionally, a survey was designed to determine whether educators in non-SACS and SACS accredited elementary schools were implementing similar school improvement strategies. The survey was reviewed by a panel of four experts in education to increase its validity. The survey was sent to a random sample of principals (n = 100) in non-SACS elementary schools and to all 40 elementary schools that were SACS accredited in 1996.

Research Design

The research design of this study was causal-comparative, comparing NCE scores on the ITBS. NCE scores are commonly used in research studies because they allow the data to be algebraically manipulated (Huck, 2000). The causal-comparative design is effective when two similar groups are compared; however, it also has weaknesses, such as a lack of control and a lack of manipulation (Huck, 2000).

Variables

The independent variables were type of school (SACS accredited or non-SACS), socioeconomic status of the schools, and the 1995 baseline NCE score on the ITBS before the treatment. The dependent variables were the total reading and total mathematics sections of the ITBS.

Treatments

Differences existed between SACS and non-SACS accredited elementary schools in the state of Georgia. The SACS accredited schools had slightly higher SES than their non-SACS peers. SACS accredited schools implemented the SACS school improvement model, which emphasizes shared governance, planning, and implementation of an action plan with monitoring of student progress and peer reviews every five years. Elementary schools that are not accredited do not have peer reviews.

Data Collection

School report cards, provided by the Georgia Department of Education, listed the National Percentile Rank (NPR), which was converted to an NCE score for each school in total reading and total mathematics as measured by the ITBS. The NCE scores were used to determine gains in student achievement. Surveys were sent to all 40 schools that were SACS accredited in 1996 and to a random sample of principals (n = 100) in non-SACS elementary schools. The response rate was 57.5% for the SACS accredited schools and 63% for the non-SACS accredited schools.

Data Analysis

Achievement test scores from 1995, 1996, 1997, 1998, 1999, and 2000 were analyzed to test the research hypotheses. To test the first hypothesis, an independent samples t test and a multiple regression comparison of academic gains over the five-year period of this study were conducted, with type of school, the 1995 baseline ITBS NCE scores, and socioeconomic status as independent variables and the total reading achievement score as the dependent variable. Similarly, the second hypothesis was examined with the same tests and independent variables but with the total mathematics achievement score as the dependent variable. Statistical tests were completed using the Statistical Package for the Social Sciences [SPSS] (SPSS, 1999).

The survey of schools was analyzed for common themes and differences in school improvement strategies used in Georgia elementary schools. The survey was also examined to determine whether some of the same improvement strategies were being used by both SACS and non-SACS elementary schools. Results were reported as frequencies and percentages.

Results

The univariate analysis presented in Table 1 indicates that 226 elementary schools were used for analysis of third-grade reading and mathematics scores, while 227 elementary schools were used for fifth-grade reading and mathematics scores. NCE scores increased more in third-grade mathematics (M = 3.74, SD = 8.84) and fifth-grade mathematics (M = 2.10, SD = 7.86) than in reading.

Table 1
Sample Size, Means, and Standard Deviations of NCE Gain Scores

                                n      M      SD
Third-grade reading gains      226    1.08    7.81
Third-grade math gains         226    3.74    8.84
Fifth-grade reading gains      227    0.52    6.51
Fifth-grade math gains         227    2.10    7.86

Note: n = sample size; M = mean; SD = standard deviation.

The bivariate analysis of the independent variable, accreditation status, predicting the dependent variables (third- and fifth-grade reading and mathematics NCE gain scores) is shown in Table 2. The sample of SACS accredited schools is 18. An independent samples t test was used to compare SACS accredited elementary schools to non-SACS accredited elementary schools. Students in SACS accredited schools achieved higher gain scores in third-grade reading and mathematics and in fifth-grade mathematics than did their non-SACS counterparts; non-SACS elementary schools achieved higher gain scores in fifth-grade reading. Statistically significant effects were found in both third- and fifth-grade mathematics, but not in reading. Students in SACS accredited elementary schools had a mean NCE increase of 10.17, and students in non-SACS elementary schools had a mean NCE increase of 3.18, on the third-grade mathematics portion of the ITBS.

Students in SACS accredited elementary schools had a mean increase of 7.89 and students in non-SACS elementary schools had a mean increase of 1.60 on the fifth-grade mathematics portion of the ITBS.

An independent samples t test was also used to compare baseline NCE scores of the SACS accredited elementary schools to those of the non-SACS accredited elementary schools. The 1995 ITBS NCE scores were used as the baseline scores because the treatment, i.e., accreditation status, occurred during the 1996 school year. Students in SACS accredited elementary schools had higher baseline NCE scores in third-grade reading, third-grade mathematics, fifth-grade reading, and fifth-grade mathematics than their non-SACS counterparts. Statistically significant effects were found in both third- and fifth-grade reading, but not in mathematics for either grade. Students in SACS accredited elementary schools had an NCE baseline score of 55.17 and students in non-SACS elementary schools had an NCE baseline score of 48.72 on the third-grade reading portion of the ITBS. Students in SACS accredited elementary schools had an NCE baseline score of 55.50 and students in non-SACS elementary schools had an NCE baseline score of 49.32 on the fifth-grade reading portion of the ITBS.

Table 2
Sample Sizes, Means, Standard Deviations, and Level of Significance for the Bivariate Analysis

                          SACS                     Non-SACS                 Sig.
Dependent variable        n     M       SD         n      M      SD         p
Third-grade reading       18    1.39    4.92       208    1.06   8.02       > .05
Third-grade math          18    10.17   4.91       208    3.18   8.89       < .01
Fifth-grade reading       18    -0.56   4.27       209    0.61   6.67       > .05
Fifth-grade math          18    7.89    5.55       209    1.60   7.84       < .01

The first research question sought to determine whether there is a differential gain in reading achievement over time for students enrolled in elementary schools that attain Southern Association of Colleges and Schools accreditation. Table 3 presents a multivariate analysis of the variables in the model for third- and fifth-grade total reading.

The analysis of third- and fifth-grade reading scores revealed statistically significant differences associated with the socioeconomic status of the elementary schools. The unstandardized beta coefficients indicated that the higher the school's socioeconomic status, the lower the increase in NCE scores over the five-year period. A statistically significant difference in reading scores was also found when analyzing the baseline scores for the year before accreditation occurred. The unstandardized beta coefficients of -0.76 for third-grade reading and -0.71 for fifth-grade reading indicated that the higher the baseline NCE score, the lower the increase after a five-year period.

The multiple regression model indicated no statistically significant differences between SACS accredited and non-SACS accredited elementary schools in third- and fifth-grade reading. The negative coefficient relating SACS accreditation to performance on the state-mandated assessments meant that non-SACS schools experienced greater improvement in scores on the reading assessments than their SACS accredited peers. The effects for each of the two coefficients were small, and neither attained statistical significance. The effect size for the multiple regression model was large for both third-grade reading (.52) and fifth-grade reading (.46) as determined by Cohen's (1988) criteria (see the computational note following Table 3).

Table 3
Multiple Regression Model for Third- and Fifth-Grade Reading

            Third-grade reading            Fifth-grade reading
Variables   b       p        SE            b       p        SE
Constant    48      < .01    3.12          42      < .01    3.14
SES         -.16    < .01    .02           -.10    < .01    .02
SACS        -1.3    > .05    1.50          -1.0    > .05    1.32
Baseline    -.76    < .01    .05           -.71    < .01    .05
R square            .52                            .46
n                   226                            227

Note: b = unstandardized beta coefficient; p = level of significance; SE = standard error.

The second research question that guided the research was: Is there a differential gain in mathematics achievement over time for students enrolled in elementary schools that attain Southern Association of Colleges and Schools accreditation? Table 4 presents a multivariate analysis of the variables in the model for third- and fifth-grade total mathematics.

Analysis of third- and fifth-grade mathematics scores revealed statistically significant differences associated with the socioeconomic status of the elementary schools. The unstandardized beta coefficients of -0.21 for third-grade mathematics and -0.15 for fifth-grade mathematics indicated that the higher the school's socioeconomic status, the lower the increase in NCE scores over the five-year period. A statistically significant difference was also found when analyzing the baseline scores for the year before accreditation occurred. The unstandardized beta coefficient for fifth-grade mathematics indicated that the higher the baseline NCE score, the lower the increase after a five-year period.

The multiple regression model indicated no statistically significant differences between SACS accredited and non-SACS accredited elementary schools in third- and fifth-grade mathematics. The coefficients relating SACS accreditation to performance on the state-mandated assessments were negative for third grade but positive for fifth grade. The negative coefficient meant that non-SACS schools experienced more improvement in scores on the third-grade mathematics assessment than their SACS accredited peers; the positive coefficient meant that SACS accredited schools experienced a greater increase in scores on the fifth-grade mathematics assessment than their non-SACS peers. The effects for each of the two coefficients were small, and neither attained statistical significance. The effect size for the multiple regression model was large for both third- and fifth-grade mathematics as determined by Cohen's (1988) criteria.

Table 4
Multiple Regression Model for Third- and Fifth-Grade Mathematics

            Third-grade mathematics        Fifth-grade mathematics
Variables   b       p        SE            b       p        SE
Constant    54      < .01    3.42          49      < .01    3.71
SES         -.21    < .01    .02           -.15    < .01    .02
SACS        -.70    > .05    1.68          .63     > .05    1.63
Baseline    -.75    < .01    .06           -.75    < .01    .06
R square            .53                            .45
n                   226                            227

Note: b = unstandardized beta coefficient; p = level of significance; SE = standard error.

Surveys

The survey was sent to all 40 elementary principals working in schools that were SACS accredited in 1996 and to a random sample of elementary school principals (n = 100) working in non-SACS elementary schools. The response rate was 57.5% for SACS accredited elementary schools and 63% for non-SACS accredited elementary schools.

The researchers found that both SACS accredited and non-SACS accredited elementary schools use some of the same strategies for improving their schools. All participants in the study had school improvement teams consisting of administrators, teachers, parents, and members of the community. Twenty-six percent of SACS accredited elementary schools include students on their school improvement teams, while 41% of non-SACS elementary schools include students on their teams. All SACS accredited elementary schools that participated in this study have vision, mission, and belief statements about their school; 85% of non-SACS elementary schools have these statements. Both SACS and non-SACS elementary schools in the state of Georgia that completed the survey implement action plans for school improvement that contain goals and objectives for the next school year.

The last question of the survey sought to determine the plans and strategies that principals would implement the next school year once they received test scores from the current year's assessment. Some principals who completed the survey did not answer the question concerning standardized test scores, while others indicated that they did not have enough training to analyze standardized test scores effectively. Three major themes emerged from the analysis of the participant responses to the last question.

The first theme was that teams use standardized test data to determine whether current school improvement goals are being met at desired levels. School improvement team members disaggregate test scores by grade and subject, and some schools use a checklist to determine if each grade level and subject met the school's improvement goals. The second theme was that school improvement team members determine strengths and weaknesses of the school based on the results of standardized tests. Brainstorming sessions occur among team members to improve the weaknesses and emphasize the strengths of the school, and administrators use test score data to make organizational changes that they hope will increase student achievement. The third, and last, theme was that administrators use standardized test scores to prepare appropriate staff development activities for teachers.

Principals who responded to this survey revealed that staff development for the upcoming school year is always based on the parts of the standardized test on which their school scores the poorest.

Discussion and Conclusions

In drawing our conclusions, we must acknowledge that this research was limited by potential threats to internal and external validity, and the findings presented should be interpreted with caution. Internal validity threats included treatment interaction, mortality, maturation, and testing, while threats to external validity were population and ecological (Gay & Airasian, 2000). The level of implementation of the SACS school improvement process may also be a limitation.

The Southern Association of Colleges and Schools is a private, nonprofit, voluntary organization founded in 1895 in Atlanta, Georgia. The Association is comprised of the Commission on Colleges, the Commission on Secondary and Middle Schools, and the Commission on Elementary and Middle Schools. The three commissions carry out their missions with considerable autonomy, developing their own standards and procedures and governing themselves by a delegate assembly. The SACS website for elementary and middle school accreditation states that SACS uses a research-based and proven process that focuses on improving student performance. The process requires a school to involve its stakeholders in decision making and to conduct a continuous cycle of school improvement activities. The SACS school improvement model contains many features described in educational research for successfully restructuring schools. The process focuses on planning, developing a shared vision and mission for the school, communicating with stakeholders, establishing benchmarks for student achievement, providing peer review for outside feedback, and implementing new strategies (Commission of Elementary and Middle Schools, 1999).

In the last 10 years, states have been legislating planning processes whereby schools develop school improvement plans that will assist them in accomplishing the outcomes established by their respective state boards of education. Kansas, Florida, New Jersey, Maryland, and others require each school to develop school improvement plans and to submit these plans for board approval. A review of the improvement process of SACS and of states that require school improvement plans reveals similarities in the elements of the process and the products produced by the schools.

Our survey of principals reinforces that both SACS and non-SACS accredited elementary schools implemented shared governance, and both had developed shared vision, mission, and belief statements that guided improvement efforts. The increased demand for accountability for student achievement by the states has provoked non-SACS accredited schools to examine their strengths and weaknesses and to make plans for improvement, as is required of SACS accredited schools. Results of our study indicated that both mathematics and reading scores increased over a five-year period at about the same rate for both types of schools when controlling for baseline scores and socioeconomic status.

This study supports Andrews' (1999) finding of no statistically significant differences between SACS accredited and non-SACS accredited elementary schools as measured by standardized achievement tests in reading and mathematics. Andrews conducted an investigation over a three-year period comparing the achievement of students in SACS accredited and non-SACS accredited elementary schools. Other than Andrews, most research regarding SACS accreditation has been limited to historical studies, case studies, and interviews. The focus of the earlier investigations was to determine the perceptions of teachers and principals about school morale; they did not attempt to show a relationship between accreditation and increased student achievement.

SACS dues for 2002 were $300 annually. In addition, schools are responsible for the cost of peer reviews, materials, and other resources necessary for the review process. Over a five-year period, the average cost to an elementary school for materials and dues ranges from $1,500 to $3,500, depending on school size. Some school districts pay dues as a district, but usually any other cost incurred in the accreditation process is borne by the school. With budget cutbacks and no statistically significant differences, elementary administrators might want to consider whether the SACS process is feasible for the school, especially if they are in states that require basically the same school improvement planning process as SACS. Monies could instead be used for staff development activities that improve and/or align the curriculum (thereby improving student achievement), provide diversity training on working with students and parents, or help teachers integrate technology in the classroom.

The financial aspect aside, in many areas of the South it would be politically incorrect to drop accreditation. Part of the appeal of accrediting organizations is the status of being an accredited school and of having all accredited schools in the district. SACS asserts that accreditation is a recognized endorsement of quality and ensures that schools are focusing on improving student performance.

Our survey revealed that most school leaders use standardized test scores to determine whether current school improvement goals are being met, to ascertain the strengths and weaknesses of students, and to prepare professional development activities for teachers. Georgia's A Plus Education Reform Act (Smith et al., 2000) holds teachers and administrators accountable for student performance based on standardized test scores. Survey data in our study revealed that school leaders need additional training in analyzing and disaggregating test data. County- and school-level staff development needs to assist educators in working with data and in using data to make decisions concerning students and the curriculum.

Although this investigation did not reveal a statistically significant difference between SACS and non-SACS elementary schools, further research is needed concerning the effectiveness of the SACS school improvement process. SACS accreditation occurs throughout the entire southeastern part of the United States and at all levels of education (elementary, middle, and high schools) as well as at colleges and universities. Future research should include other states and analyze achievement at other educational levels. Further research should also incorporate qualitative components such as personal interviews and focus groups. Researchers could then investigate the perceptions of teachers, students, administrators, and parents in SACS accredited schools, which may reveal successful school improvement strategies that could be shared among schools.

References

Allen, L. & Calhoun, E. (1998). School-wide action research: Findings from six years of study. Phi Delta Kappan, 79, 706-710.

Andrews, S. (1999). The effects of school restructuring on third and fifth grade achievement scores. Unpublished dissertation, Auburn University, 1999. Dissertation Abstracts International, 60(05), 1401. (UMI No. 9931081)

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). New York: John Wiley.

Commission on Elementary and Middle Schools. (1999). The quality school improvement process. Decatur, GA: Southern Association of Colleges and Schools.

Elmore, R. (1992). Why restructuring alone won't improve teaching. Educational Leadership, 49(7), 44-49.

Gay, L. R. & Airasian, P. W. (2000). Educational research: Competencies for analysis and application (6th ed.). Englewood Cliffs, NJ: Prentice Hall.

Georgia Department of Education. (2001). Georgia public education directory. Retrieved September 21, 2001, from http://accountablility.doe.k12.ga.us/Report00/default.html

Huck, S. (2000). Reading statistics and research (3rd ed.). New York: Addison Wesley Longman.

Impara, J. & Plake, B. (1998). The thirteenth mental measurements yearbook. Lincoln, NE: University of Nebraska.

Lezotte, L. & Jacoby, B. (1990). A guide to school improvement based on effective schools.

Miller, J. (1998). A centennial history of the Southern Association of Colleges and Schools. Decatur, GA: Southern Association of Colleges and Schools.

Perry, C., Brown, D., & McIntyre, W. (1993). Teachers respond to the shared decision-making opportunity. Education, 114, 605-608.

Southern Association of Colleges and Schools. (2003). Retrieved September 3, 2003, from http://www.sacs.org/pub/elem/faq/eans01.htm

SACS Proceedings. (2000). Proceedings. Atlanta, GA: Southern Association of Colleges and Schools.

Smith, C., Dukes, W., Murphy, T., Jamieson, M., Porter, D., & Taylor, M. (2000). Georgia House of Representatives: HB 1187 - A Plus Education Reform Act of 2000. Retrieved August 18, 2002, from http://www.doe.k12.ga.us/communications/releases/hb1187.html

SPSS Inc. (1999). SPSS 9.0 for Windows [Computer software]. Chicago, IL: SPSS.

Sunoo, B. (1996). Weighing the merits of vision and mission statements. Personnel Journal, 75(4), 20-21.

About the Authors

Darlene Y. Bruner
University of South Florida
dbruner@tempest.coedu.usf.edu
dbruner@tampabay.rr.com

Darlene Y. Bruner is an Associate Professor in the Department of Educational Leadership & Policy Studies at the University of South Florida. She teaches courses in leadership and curriculum, and her research interests concern the work culture of schools, curriculum issues, teacher leadership, and the principalship.

Lance Lamar Brantley
Ware County Public Schools
Lbrantley@ware.k12.ga.us

Dr. Lance Brantley is a graduate of Valdosta State University and is currently principal of William Heights Elementary in Waycross, GA.

The World Wide Web address for the Education Policy Analysis Archives is http://epaa.asu.edu

Editor: Gene V Glass, Arizona State University
Production Assistant: Chris Murrell, Arizona State University

General questions about appropriateness of topics or particular articles may be addressed to the Editor, Gene V Glass, glass@asu.edu, or reach him at College of Education, Arizona State University, Tempe, AZ 85287-2411. The Commentary Editor is Casey D. Cobb: casey.cobb@uconn.edu.

EPAA Editorial Board

Michael W. Apple, University of Wisconsin
David C. Berliner, Arizona State University
Greg Camilli, Rutgers University
Linda Darling-Hammond, Stanford University
Sherman Dorn, University of South Florida
Mark E. Fetler, California Commission on Teacher Credentialing
Gustavo E. Fischman, Arizona State University
Richard Garlikov, Birmingham, Alabama
Thomas F. Green, Syracuse University
Aimee Howley, Ohio University
Craig B. Howley, Appalachia Educational Laboratory
William Hunter, University of Ontario Institute of Technology
Patricia Fey Jarvis, Seattle, Washington
Daniel Kallós, Umeå University
Benjamin Levin, University of Manitoba
Thomas Mauhs-Pugh, Green Mountain College
Les McLean, University of Toronto
Heinrich Mintrop, University of California, Los Angeles
Michele Moses, Arizona State University
Gary Orfield, Harvard University
Anthony G. Rud Jr., Purdue University
Jay Paredes Scribner, University of Missouri
Michael Scriven, University of Auckland
Lorrie A. Shepard, University of Colorado, Boulder
Robert E. Stake, University of Illinois, Urbana-Champaign
Kevin Welner, University of Colorado, Boulder
Terrence G. Wiley, Arizona State University
John Willinsky, University of British Columbia

EPAA Spanish & Portuguese Language Editorial Board

Associate Editors
Gustavo E. Fischman, Arizona State University
Pablo Gentili, Laboratório de Políticas Públicas, Universidade do Estado do Rio de Janeiro

Founding Associate Editor for Spanish Language (1998-2003)
Roberto Rodríguez Gómez, Universidad Nacional Autónoma de México

Argentina
Alejandra Birgin, Ministerio de Educación, Argentina
Mónica Pini, Universidad Nacional de San Martín, Argentina
Mariano Narodowski, Universidad Torcuato Di Tella, Argentina
Daniel Suarez, Laboratorio de Politicas Publicas-Universidad de Buenos Aires, Argentina
Marcela Mollis (1998-2003), Universidad de Buenos Aires

Brasil
Gaudêncio Frigotto, Professor da Faculdade de Educação e do Programa de Pós-Graduação em Educação da Universidade Federal Fluminense, Brasil
Vanilda Paiva
Lilian do Valle, Universidade Estadual do Rio de Janeiro, Brasil
Romualdo Portella do Oliveira, Universidade de São Paulo, Brasil
Roberto Leher, Universidade Estadual do Rio de Janeiro, Brasil
Dalila Andrade de Oliveira, Universidade Federal de Minas Gerais, Belo Horizonte, Brasil
Nilma Lino Gomes, Universidade Federal de Minas Gerais, Belo Horizonte
Iolanda de Oliveira, Faculdade de Educação da Universidade Federal Fluminense, Brasil
Walter Kohan, Universidade Estadual do Rio de Janeiro, Brasil
María Beatriz Luce (1998-2003), Universidad Federal de Rio Grande do Sul-UFRGS
Simon Schwartzman (1998-2003), American Institutes for Research, Brazil

Canadá
Daniel Schugurensky, Ontario Institute for Studies in Education, University of Toronto, Canada

Chile
Claudio Almonacid Avila, Universidad Metropolitana de Ciencias de la Educación, Chile
María Loreto Egaña, Programa Interdisciplinario de Investigación en Educación (PIIE), Chile

España
José Gimeno Sacristán, Catedrático en el Departamento de Didáctica y Organización Escolar de la Universidad de Valencia, España
Mariano Fernández Enguita, Catedrático de Sociología en la Universidad de Salamanca, España
Miguel Pereira, Catedrático, Universidad de Granada, España
Jurjo Torres Santomé, Universidad de A Coruña
Ángel Ignacio Pérez Gómez, Universidad de Málaga
J. Félix Angulo Rasco (1998-2003), Universidad de Cádiz
José Contreras Domingo (1998-2003), Universitat de Barcelona

México
Hugo Aboites, Universidad Autónoma Metropolitana-Xochimilco, México
Susan Street, Centro de Investigaciones y Estudios Superiores en Antropología Social Occidente, Guadalajara, México
Adrián Acosta, Universidad de Guadalajara
Teresa Bracho, Centro de Investigación y Docencia Económica-CIDE
Alejandro Canales, Universidad Nacional Autónoma de México
Rollin Kent, Universidad Autónoma de Puebla, Puebla, México
Javier Mendoza Rojas (1998-2003), Universidad Nacional Autónoma de México
Humberto Muñoz García (1998-2003), Universidad Nacional Autónoma de México

Perú
Sigfredo Chiroque, Instituto de Pedagogía Popular, Perú

Grover Pango, Coordinador General del Foro Latinoamericano de Políticas Educativas, Perú

Portugal
Antonio Teodoro, Director da Licenciatura de Ciências da Educação e do Mestrado, Universidade Lusófona de Humanidades e Tecnologias, Lisboa, Portugal

USA
Pia Lindquist Wong, California State University, Sacramento, California
Nelly P. Stromquist, University of Southern California, Los Angeles, California
Diana Rhoten, Social Science Research Council, New York, New York
Daniel C. Levy, University at Albany, SUNY, Albany, New York
Ursula Casanova, Arizona State University, Tempe, Arizona
Erwin Epstein, Loyola University, Chicago, Illinois
Carlos A. Torres, University of California, Los Angeles
Josué González (1998-2003), Arizona State University, Tempe, Arizona

