USF Libraries
USF Digital Collections

Implementing school-wide positive behavior support


Material Information

Title:
Implementing school-wide positive behavior support: exploring the influence of socio-cultural, academic, behavioral, and implementation process variables
Physical Description:
Book
Language:
English
Creator:
Cohen, Rachel Mara
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:
2006

Subjects

Subjects / Keywords:
System-wide behavioral interventions
Team functioning
Demographic factors
Indicators
Treatment integrity
Dissertations, Academic -- Interdisciplinary Education -- Doctoral -- USF
Genre:
bibliography (marcgt)
theses (marcgt)
non-fiction (marcgt)

Notes

Abstract:
ABSTRACT: This study evaluated the influence of academic, behavioral, and socio-cultural variables on the implementation of Schoolwide Positive Behavior Support (SWPBS), a system intended to improve discipline in school buildings. The number of schools implementing SWPBS has been increasing dramatically over the years as school violence continues to rise and solutions are needed to improve school climate. This study examined the relationship between three categories of variables and the level of implementation of SWPBS in three multiple regression analyses. The categories were school demographic variables (i.e., ethnicity, socio-economic status, teacher:student ratio, percentage of teachers who are out-of-field), severity of need for change (suspensions, office referrals, percentage of students below grade level in reading), and team process variables (coaching, team functioning, administrative support). Of these variables, team functioning was the only one found to be significantly related to implementation. A second component of the study involved collecting data on factors that were enablers of or barriers to the implementation of SWPBS. Two hundred thirty-six school personnel completed a survey, the Schoolwide Implementation Factor Survey (SWIF). A factor analysis of the survey yielded three factors: school, staff, and students; principal; and assistant principal. All three factors were found to have a high Cronbach's alpha for internal consistency. There were significant differences among schools with high, middle, and low levels of implementation on all of these factors, with respondents from high-implementing schools scoring the highest on all factors and respondents from low-implementing schools scoring the lowest. The survey item rated as the most helpful in the implementation process was "Expectations and rules that are clearly defined," while the item rated as the most problematic was "Adequate funding for PBS." Overall, the results highlighted the complexity of implementing a system-wide change.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2006.
Bibliography:
Includes bibliographical references.
System Details:
System requirements: World Wide Web browser and PDF reader.
System Details:
Mode of access: World Wide Web.
Statement of Responsibility:
by Rachel Mara Cohen.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 264 pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001796802
oclc - 156876556
usfldc doi - E14-SFE0001607
usfldc handle - e14.1607
System ID:
SFS0025925:00001




Full Text

Implementing School-Wide Positive Behavior Support: Exploring the Influence of Socio-Cultural, Academic, Behavioral, and Implementation Process Variables

by

Rachel Mara Cohen

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Department of Psychological and Social Foundations, College of Education, University of South Florida

Major Professor: Michael Curtis, Ph.D.
Kathy Bradley-Klug, Ph.D.
Lou Carey, Ph.D.
Don Kincaid, Ed.D.
Kathleen Armstrong, Ph.D.

Date of Approval: April 20, 2006

Keywords: system-wide behavioral interventions, team functioning, demographic factors, indicators, treatment integrity

Copyright 2006, Rachel Cohen

Dedication

To anyone who is currently working toward a goal:

"One can always push oneself a little bit beyond what only yesterday was thought to be the absolute limit of one's endurance." –Golda Meir

Acknowledgements

The completion of this dissertation would not have been possible without the guidance and support of many individuals. I would first like to thank Dr. Michael Curtis for the endless hours he spent providing valuable guidance and feedback. I would like to thank Dr. Lou Carey for sharing her wisdom in the research and evaluation processes, Dr. Kathy Bradley-Klug for her mentorship throughout the completion of my doctoral studies, Dr. Kathi Armstrong for her support over the years, and Dr. Don Kincaid for his leadership and opportunities with the Positive Behavior Support Project, as well as Karen Childs for her ideas and hard work on the evaluation of the PBS project.

There are many individuals in my life who also have provided endless support in the completion of this dissertation. I would first like to thank my Mom for giving me that extra push every time I needed it, my Dad for helping me keep things in perspective, and my sister for being there when I needed her. I would also like to thank Erin Ax for helping me narrow down a dissertation topic and Kahlila Mack for the countless hours spent at coffee shops working toward the completion of our dissertations.

Table of Contents

List of Tables  vi
List of Figures  viii
Abstract  ix

Chapter I: Introduction  1
  Purpose of Study  5
  Theoretical Framework  6
  School-Wide Positive Behavior Support (SWPBS)  6
    Overview  6
  Research Questions and Hypotheses  9
  Operational Definitions of the Variables  11
  Significance of Study  15

Chapter II: Literature Review  17
  Trends in Educational Innovations  18
  Failure of Implementation of Innovation  19
  Systems Approach to Innovation  21
    Systems Perspective  21
    Systems Change Approach  21
  Overview of Schoolwide Positive Behavior Support  24
  Organizational Change/Implementation Process for SWPBS  24
    Planning for Change  25
      Anticipate the Need for Change  25
      Develop Relationships and Obtain Commitment  26
      Involve Stakeholders and Conduct a Needs Assessment  27
      Establishing Organizational Policies to Support Change  27
    Developing a Plan  28
    Implement the Plan  30
      Securing Resources  30
      Implementing Strategies  31
    Evaluate the Plan  34
      Evaluate Student Outcomes  35
    Summary  37
  Implementation Factors  37
    General Implementation Factors  38
    Support Variables  39
      District/Administrative Support  39
      Team Functioning  40
      Coaching  40
      Summary  44
    Need for Change Indicators  44
      Office Discipline Referrals (ODR)  46
      Suspensions  47
      Academic Performance  49
      Summary  50
    Socio-Cultural Variables  50
      Student Behavioral Indicators  51
      Student Academic Indicators  55
      School Variables  56
      Teacher Variables  57
      Summary  59
    Summary  60
  Influence of Factors on Implementation  60

Chapter III: Methods  64
  Setting  64
    Overview of SWPBS  64
    Recruitment to SWPBS  65
    Coaches  66
    SWPBS Teams  67
    Training  67
  Participants  71
  Instrumentation  74
    School-Wide Benchmarks of Quality  74
      Instrument Content  74
      Instrument Development  74
      Instrument Administration and Scoring  75
      Psychometric Properties  76
    School-Wide PBS Implementation Factors Survey (SWIF)  80
      Instrument Content  80
      Instrument Development  80
      Scoring and Administration  82
      Psychometric Properties  82
    Team Process Survey  82
    Coach's Self-Assessment  83
  Research Questions  84
    Research Question One  84
      Purpose  84
      Design  85
    Research Question Two  85
      Purpose  85
      Design  86
    Research Question Three  86
      Purpose  86
      Design  86
    Research Question Four  88
      Purpose  88
      Design  88
    Research Question Five  89
      Purpose and Design  89
    Research Question Six  89
      Purpose  89
      Design  89
    Research Question Seven  90
      Purpose  90
      Design  90
  Data Collection Procedures  90
    Florida Department of Education School Indicators Report Database  91
    Mid-Year FLPBS Evaluation Database  92
    End-of-Year Evaluation Database  93
    Survey Monkey Database (SWIF)  93
  Summary  95

Chapter IV: Results  96
  Research Questions  96
    Research Question One  96
      A Priori Power Analysis  97
      Preliminary Analyses  97
      Analysis of Variance  98
    Research Question Two  99
      Descriptive Statistics  99
      Preliminary Analysis  100
      Multiple Regression  104
        Student Variables  104
        School Building Variables  104
    Research Question Three  106
      Descriptive Statistics  106
      Preliminary Analysis  108
      Multiple Regression  110
    Research Question Four  111
      Descriptive Statistics  112
      Preliminary Analysis  112
      Multiple Regression  114
    Research Question Five  116
      Sample  116
      Test-Retest Reliability  117
      Factor Analysis  118
        A Priori Power Analysis  118
        Factor Extraction  119
        Factor Interpretation  120
    Research Question Six  122
      Descriptive Statistics  123
      Analysis  123
    Research Question Seven  126
      Quantitative Item Analysis  126
      Qualitative Item Analysis  128

Chapter V: Discussion  130
  Overview of SWPBS Implementation  131
    SWPBS Implementation Components  131
      Benchmarks of Quality  131
      School-Wide Implementation Factor Survey  133
    High vs. Low Implementing Schools  134
    Trends in Implementation  136
  Implementation Process  137
    Team Functioning  138
    Administrative Support  139
    Coaching  141
    Technical Assistance  142
  Implementation by School Characteristics  142
    Demographics  143
      School Level  143
      Student and School Building Variables  143
    Academic and Behavioral Indicators  145
  Limitations and Considerations  146
    Internal Validity  146
      Social Desirability  146
      Instrumentation  148
      History of Events  148
    External Validity  148
      Population Validity  148
      Variables  158
  Implications  152

References  155

Appendices
  Appendix A: Benchmarks of Quality Scoring Form  176
  Appendix B: Benchmarks of Quality Scoring Guide  179
  Appendix C: Benchmarks of Quality Team Rating Form  194
  Appendix D: Team Process Survey  196
  Appendix E: Coach Self-Assessment Survey  198
  Appendix F: General Implementation Factors  199
  Appendix G: District Readiness Checklist  203
  Appendix H: District Planning Process Form  205
  Appendix I: School Readiness Checklist  210
  Appendix J: Comparison of Schools that Returned the BoQ Survey (R) and Schools that Did Not Return the BoQ Survey (NR)  212
  Appendix K: Average Scores for Florida Schools  214
  Appendix L: SWIF Survey  215
  Appendix M: Summary of SWIF Feedback Comments  237
  Appendix N: Research Questions  241
  Appendix O: SWIF Emails  244
  Appendix P: Descriptive Data for Demographic Variables  247
  Appendix Q: Descriptive Data for Team Functioning (TF), Administrative Support (AS), and Coach Self-Efficacy (CSE) Scores  248
  Appendix R: Descriptive Data for ISS, OSS, BGLR, ODR  249
  Appendix S: Promax Rotation of Three-Factor Solution for SWIF Items: Structure Coefficients  250
  Appendix T: Promax Rotation of Three-Factor Solution for SWIF Items: Pattern Coefficients  252
  Appendix U: SWIF Item Correlations with Factor (rFx) and Alpha if Deleted from Factor  254
  Appendix V: SWIF Item Analysis by High, Middle, and Low Implementing Schools  256
  Appendix W: SWIF Item Rankings by Mean Score  257
  Appendix X: SWIF Item Response Frequencies for All Respondents  261
  Appendix Y: SWIF Item Means and Standard Deviations by Category  264
  Appendix Z: Means and Standard Deviations for Overall Score and Subscales by Categories  267
  Appendix AA: Content Analysis of Open-Ended Responses on SWIF Survey  270
  Appendix AB: Content Analysis of Open-Ended Responses on SWIF Survey (Final)  274

About the Author  End Page

List of Tables

Table 1. Models for Organizational Change  23
Table 2. Summary of BoQ Scores by Type of School and Year of Implementation  72
Table 3. Analysis of Variance for Implementation by Type of School  72
Table 4. Analysis of Variance for Implementation by Elementary and Middle Schools  72
Table 5. Sample Size by Year of Implementation  73
Table 6. Means, Standard Deviations, and Cronbach's Alpha Coefficients for the BoQ Subscales  78
Table 7. Mean Total Points for Each BoQ Subscale  78
Table 8. Source of Data for Variables  92
Table 9. Analysis of Variance for Implementation Level by Year of Implementation  99
Table 10. Skewness and Kurtosis Values for Socio-Cultural Factors  101
Table 11. Intercorrelations Among Socio-Cultural Factors  103
Table 12. Multiple Regression Analysis Results for Student Variables Predicting Implementation  105
Table 13. Multiple Regression Analysis Results for School Building Variables Predicting Implementation  106
Table 14. Skewness and Kurtosis Values for Implementation Process Factors  109
Table 15. Intercorrelations Among Team Functioning (TF), Administrative Support (AS), and Coach's Self-Efficacy (CSE)  110
Table 16. Multiple Regression Analysis for Implementation Process Factors Predicting Implementation  111
Table 17. Skewness and Kurtosis Values for Academic and Behavioral Indicators  113
Table 18. Intercorrelations Among Academic and Behavioral Indicators  114
Table 19. Multiple Regression Analysis for Academic and Behavioral Indicators Predicting Implementation  115
Table 20. Test-Retest Results for SWIF Survey  117
Table 21. Component Correlations Before Grouping Items  120
Table 22. Factor Correlations After Grouping Items  120
Table 23. Descriptive Statistics and Cronbach's Alpha for Three Factors  122
Table 24. Instrument Means and Standard Deviations for the Low, Middle, and High Implementing Groups  124
Table 25. Interpretation Guide for SWIF Item, Subscale, and Total Scores  124
Table 26. Results of Kruskal-Wallis Test  125
Table 27. Results of Mann-Whitney U Test  125

List of Figures

Figure 1. Three-tiered approach to PBS interventions  7
Figure 2. Using the SET to compare the SWPBS implementation of four cohorts of schools at different stages of implementation from 1998 to 1999  33
Figure 3. Diagram of Kurt Lewin's Force Field Model, recreated from Harvey and Brown  38
Figure 4. Histogram of BoQ scores  77
Figure 5. Scatter plot of standardized residuals and standardized predictor values for BoQ scores  102
Figure 6. Scree plot for factor analysis  119

Implementing School-Wide Positive Behavior Support: Exploring the Influence of Socio-Cultural, Academic, Behavioral, and Implementation Process Variables

Rachel Cohen

ABSTRACT

This study evaluated the influence of academic, behavioral, and socio-cultural variables on the implementation of Schoolwide Positive Behavior Support (SWPBS), a system intended to improve discipline in school buildings. The number of schools implementing SWPBS has been increasing dramatically over the years as school violence continues to rise and solutions are needed to improve school climate. This study examined the relationship between three categories of variables and the level of implementation of SWPBS in three multiple regression analyses. The categories were school demographic variables (i.e., ethnicity, socio-economic status, teacher:student ratio, percentage of teachers who are out-of-field), severity of need for change (suspensions, office referrals, percentage of students below grade level in reading), and team process variables (coaching, team functioning, administrative support). Of these variables, team functioning was the only one found to be significantly related to implementation. A second component of the study involved collecting data on factors that were enablers of or barriers to the implementation of SWPBS. Two hundred thirty-six school personnel completed a survey, the Schoolwide Implementation Factor Survey (SWIF). A factor analysis of the survey yielded three factors: school, staff, and students; principal; and assistant principal. All three factors were found to have a high Cronbach's alpha for internal consistency. There were significant differences among schools with high, middle, and low levels of implementation on all of these factors, with respondents from high-implementing schools scoring the highest on all factors and respondents from low-implementing schools scoring the lowest. The survey item rated as the most helpful in the implementation process was "Expectations and rules that are clearly defined," while the item rated as the most problematic was "Adequate funding for PBS." Overall, the results highlighted the complexity of implementing a system-wide change.

Chapter 1
Introduction

Foreign competition, widespread criticism of the education system, and the increasing heterogeneity of the student population have created a demand for educational reform across the United States. The impact of foreign competition on the education system can be traced to Russia's launching of Sputnik in 1957 (Dow, 1991). Literary criticism of the education system can be illustrated by Jonathan Kozol's 1967 claim that schools "destroy the minds and hearts of our children" (p. 4). In addition, students are more racially, ethnically, economically, and linguistically diverse than they were 30 years ago (Hargreaves, 1997; Lewis & Newcomer, 2002; National Research Council, 2002; United States Department of Education [USDOE], 2002, 2003). As educational practices fail to meet societal standards, education reformers generate solutions to accommodate the changing needs of American students and the demands of foreign competition. To meet these demands, educational trends have run the gamut from multicultural education to service learning and back-to-basics education. Current trends include accountability, high standards for all students, and choice in the delivery of education (Ellis, 2001; Hall & Hord, 2001; National Research Council, 2002; OSEP, 2004). The continual demand to develop and adopt new trends, paired with the continual demand to improve educational practices, leads one to question whether it is the innovations that fail to improve the school system or the schools' failure to implement the innovations.

As many of these trends were developed from research-based practices (Ellis, 2001), it is important to consider whether or not these programs have been implemented as intended. When programs are not implemented at full strength, the treatment outcome is weakened, as indicated by a meta-analysis conducted by Lipsey (1992) of 443 juvenile delinquency prevention and treatment studies. The author found that programs conducted by the researcher had stronger implementation and thus larger effects than did those conducted by subsequent researchers. Similarly, Gottfredson, Gottfredson, and Skroban (1998) did not obtain the anticipated level of implementation or behavioral results in a district-wide school reform effort to decrease student behavior that led to dropping out of high school. To monitor implementation, the researchers established specific implementation standards prior to the intervention (e.g., 85% of students must complete 82% of assignments; half the students must get one hour of tutoring per week); however, they found that the program implementation did not meet these goals. Specifically, the individual components of the program were not implemented to the same degree as were the original components developed by other researchers. The original empirically-based programs had included training and on-going consultation by the researcher, but without these elements, treatment integrity was reduced.

If programs are not being implemented at full strength, it is crucial that they include a measure of implementation to determine whether the outcomes or lack of outcomes can be attributed to the program or to the degree of implementation. It is important to consider the difference between outcome studies of programs that are closely monitored by the researcher and outcome studies of programs implemented by other persons in the field. Measures of implementation will provide researchers with a yardstick with which to determine how much program implementation is necessary to elicit positive results.

In addition to measuring implementation, the reasons that schools fail or succeed in implementing innovations as intended must be explored. The possible reasons that innovations are not implemented as intended will be discussed first. Innovations may fail if they are not needed or are not appropriate. If the innovation does not fit the needs of the school (Ellis, 2001) or if people are not concerned with the problem to be addressed by the innovation (Hall & Hord, 2001), they will not be motivated to implement the innovation and will not do so. There also may be competing initiatives or systems already in place (Knoff, 2002; Grimes & Tilly, 1996; OSEP, 2004). The system hosting the innovation may lack support from administrators and policy makers. There may be a lack of communication, training, on-site coaching, and/or time to implement the innovation as well (Hall & Hord). Innovations also can fail if the people who must implement them lack an understanding of the rationale, lack a commitment to the new procedures (Fullan, 1997), or lack an understanding of a systems perspective and a systems change approach (Curtis & Stollar, 2002; Schmuck & Runkel, 1994; Senge, Kleiner, Roberts, Ross, & Smith, 1994).

As a systems perspective is necessary to successfully implement innovations, there are seven categories of system-wide factors or issues that are described in the literature as facilitating schools in the successful implementation of innovations. These include (1) disseminating knowledge about the innovation (e.g., Harvey & Brown, 2001; Sparks, 1988), (2) providing resources to implement the innovation (e.g., Gottfredson, Gottfredson, & Hybl, 1993; Reimer, Wacker, & Koeppl, 1987; Witt, Martens, & Elliot, 1984), (3) soliciting input from staff, parents, and students (e.g., Chapman & Hofweber, 2000; Hall & Hord; Ikeda, Tilly, Stumme, Volmer, & Allison, 1996), (4) providing on-going training (e.g., Curtis & Stollar, 2002; Grimes & Tilly; Knoff, 2002), (5) integrating the innovation into the current system (e.g., Nersesian, Todd, Lehmann, & Watson, 2000; Ponti, Zins, & Graden, 1988), (6) evaluating the innovation (e.g., Chapman & Hofweber; Taylor-Green & Kartub), and (7) providing support for the innovation (e.g., Grimes & Tilly, 1996; Hall & Hord, 2001; Lewis, Sugai, & Colvin, 1998; Nakasato, 2000; Sadler, 2000). Specific to support, administrative support (e.g., Taylor-Green & Kartub, 2000), effective team functioning (e.g., Hall & Hord, 2001), and coaching have been found to increase the implementation of interventions (Joyce & Showers, 1982; Lewis & Newcomer, 2002; Noell, Witt, Gilbertson, Ranier, & Freeland, 1997; Witt, Noell, LaFleur, & Mortenson, 1997; Mortenson & Witt, 1998).

These factors have been described as key issues in the implementation of systems change projects. Many of these factors, however, were derived from anecdotal accounts or "lessons learned" from implementing innovations (e.g., Chapman & Hofweber, 2000; Taylor-Green & Kartub, 2000; Taylor-Greene et al., 1997) and were not systematically explored. Additionally, many of these factors resulted from case studies (e.g., Nakasato, 2000; Taylor-Green & Kartub, 2000), small-n studies (Gottfredson, Gottfredson, & Hybl, 1993; Nersesian, Todd, Lehmann, & Watson, 2000), or interventions with individual teachers rather than school-wide interventions (Reimer, Wacker, & Koeppl, 1987; Witt, Martens, & Elliot, 1984; Joyce & Showers, 1982; Witt, Noell, LaFleur, & Mortenson, 1997). There is a paucity of large-scale studies that quantitatively describe the relationship between these variables and the implementation of school-wide interventions. While qualitative reports can inform practice, quantitative approaches are needed to describe the relationship of these variables in a large number of schools in which innovations are being implemented.

Purpose of Study

The purpose of this study is to quantitatively identify factors that relate to the implementation of a specific school-wide innovation intended to improve the behavioral climate of schools, School-Wide Positive Behavior Support (SWPBS). SWPBS is the implementation of "procedures and processes intended for all students, staff, and settings. [It] must have a building-wide team that oversees all development, implementation, modification, and evaluation activities" (Florida Positive Behavior Support, 2004, slide 26). As relatively little is known about factors that affect the success or failure of the implementation of PBS (Metzler et al., 2001; Kincaid et al., 2002), this study intends to identify which factors influence implementation and to determine whether schools with high and low levels of implementation differ on these factors. As a need also exists to evaluate data trends relative to SES, location, size, and diversity (Sugai, Sprague, Horner, & Walker, 2000), socio-cultural factors, behavioral indicators, and academic indicators that existed prior to implementation also will be explored with regard to the implementation of SWPBS.

Theoretical Framework

The theoretical framework of this study is a systems perspective, the "ability to understand how the various component parts of a system, the system itself, and the surrounding systems or environment influence one another" (Curtis & Stollar, 2002, p. 225). In this study, the components of the system to be examined include implementation process variables, barriers and enablers to implementation, socio-cultural factors, and behavioral and academic indicators of success. The study is intended to describe the influence of these variables and their interactions on the implementation of SWPBS.

School-Wide Positive Behavior Support (SWPBS)

Overview

School-Wide Positive Behavior Support (SWPBS) is part of a larger initiative called Positive Behavior Support (PBS). The term "Positive Behavior Support" (PBS) will be used, although the literature refers to the same model as Effective Behavioral Support (EBS) or Positive Behavioral Interventions and Supports (PBIS). By definition, PBS is a "systems approach to enhancing the capacity of schools to adopt and sustain the use of effective practices for all students" (Lewis & Sugai, 1999, p. 4) and is based on the principles of behavioral science, empirically-based and practical interventions, social values for the individual, and a systems perspective (Sugai et al., 1999). Implementation of PBS involves three tiers of disciplinary interventions that increase in intensity from primary or universal interventions to secondary and tertiary interventions (see Figure 1). In this model, the intensity of the intervention matches the intensity of the problem behavior (Lewis & Sugai, 1999; Nelson, 2000; OSEP, 2004; Sugai et al.; Taylor-Greene, 1997). Students who do not respond to primary interventions are provided with secondary interventions, and students who do not respond to secondary interventions are provided with tertiary interventions. Thus, the implementation of primary interventions targeting all students and settings should increase the accuracy of selecting students for secondary and tertiary levels of interventions. While a comprehensive PBS system includes all three intervention tiers, the emphasis of this study will remain on implementation at the primary level, or School-Wide Positive Behavior Support.

[Figure 1. Three-tiered approach to PBS interventions (OSEP, 2004, p. 17). The figure depicts a continuum of school-wide instructional and positive behavior support: primary prevention (school-/classroom-wide systems for all students, staff, and settings; ~80% of students), secondary prevention (specialized group systems for students with at-risk behavior; ~15%), and tertiary prevention (specialized individualized systems for students with high-risk behavior; ~5%).]
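To make the tier proportions concrete, consider a back-of-envelope illustration; the percentages come from Figure 1, while the enrollment of 1,000 students is a hypothetical figure chosen for easy arithmetic, not a number from this study:

\[ 0.80 \times 1000 = 800 \text{ students (universal supports)}, \quad 0.15 \times 1000 = 150 \text{ (secondary)}, \quad 0.05 \times 1000 = 50 \text{ (tertiary)} \]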

Implementation of SWPBS

To successfully implement SWPBS, schools must commit to the SWPBS process, develop a team, conduct a needs assessment, and train the team in the elements of SWPBS. SWPBS is not one intervention but rather a combination of evidence-based disciplinary practices comprised of six elements: (1) a positively-stated purpose statement, (2) school-wide expectations, (3) procedures for teaching school-wide expectations, (4) a continuum of procedures for encouraging school-wide expectations, (5) a continuum of procedures for discouraging violations of school-wide expectations, and (6) procedures for monitoring the impact of SWPBS (Lewis & Sugai, 1999).

Because a key element of SWPBS involves monitoring program impact and collecting data to make decisions about effective practices, evaluators must collect data to determine the impact of SWPBS on the school. While the majority of program evaluations have yielded significant decreases in disciplinary actions, many authors have acknowledged that only limited measures of treatment integrity were used (Eber, Lewis-Palmer, & Pacchiano, 2001; Metzler, Biglan, Rusby, & Sprague, 2001; Scott, 2001). Treatment integrity is a measure of the degree to which an intervention is implemented as intended (Gresham, 1989). Therefore, the present study measures the implementation of SWPBS and differentiates the characteristics of schools that were successful and unsuccessful in implementing SWPBS.

Research Questions and Hypotheses

This study is designed to answer the following questions. Hypotheses are proposed for each question. Operational definitions for key variables are presented following the research questions.

Research Question One: Are there differences in the perceived levels of implementation of School-Wide Positive Behavior Support between schools in their first, second, and third year of the implementation process?

Hypothesis: As interventions/programs often take three to five years to implement, it is hypothesized that schools that have been involved in SWPBS longer will have a higher level of perceived implementation.

Research Question Two: What is the relationship between socio-cultural school factors (i.e., socio-economic status, ethnicity, school size, teacher:student ratio, student stability, percentage of students with a disability, percentage of teachers with an advanced degree, percentage of out-of-field teachers) and perceived level of School-Wide Positive Behavior Support implementation?

Hypothesis: As SWPBS is individualized for each school, it is hypothesized that socio-cultural variables should not greatly influence the level of SWPBS implementation.

Research Question Three: What is the relationship between implementation process factors (i.e., effective team functioning, administrative support, and coach's self-efficacy) and perceived level of School-Wide Positive Behavior Support implementation?

Hypothesis: As research indicates that the implementation process variables of administrative support, positive team functioning, and coach's self-efficacy are necessary for successful implementation, it is hypothesized that the presence of these variables will predict a higher level of implementation.

Research Question Four: What is the relationship between the level of need for School-Wide Positive Behavior Support, as measured by the percentage of students who received an in-school suspension (ISS), out-of-school suspension (OSS), or office discipline referral (ODR) or the percentage of students who were below grade level in reading during the baseline year, and the perceived level of School-Wide Positive Behavior Support implementation?

Hypothesis: It is hypothesized that schools that had a higher need for PBS, as indicated by a higher percentage of students with an ISS, OSS, or ODR and students below grade level, should have a higher level of implementation, as these schools may have been more motivated to make changes in the school and thus invest more effort in the implementation process.

Research Question Five: What are the reliability, validity, and factor structure of the School-Wide Positive Behavior Support Implementation Factors Survey (SWIF), an instrument intended to measure the degree to which various factors influence implementation?

Hypothesis: As this instrument was carefully constructed based on the principles of instrument development, the SWIF should have good reliability and validity and a good factor structure.

Research Question Six: Is there a difference between schools classified as having a high level of School-Wide Positive Behavior Support implementation and schools classified as having a low level of School-Wide Positive Behavior Support implementation on the factor scores of the SWIF survey?

Hypothesis: High implementers will have a higher total score on the factors of the SWIF, indicating a higher degree of helpfulness of these factors, than will low implementers.

Research Question Seven: Which items are perceived by coaches and team members as most helpful in the implementation of School-Wide Positive Behavior Support, and which items are perceived as most problematic?

Hypothesis: There is no hypothesis for this question because it is exploratory.

Operational Definitions of the Variables

Year of implementation indicates the number of years that a school has been engaged in the implementation of SWPBS. Schools are either in their first, second, or third year of implementation. Schools in their first year were trained in the summer of 2004; schools in their second year, in the summer of 2003; and schools in their third year, in the summer of 2002.

Team rating of implementation level was derived by grouping schools by their scores on the Benchmarks of Quality (BoQ) instrument (see Appendices A, B, C): schools with scores in the top 1/3 were classified as "high implementers," schools in the middle 1/3 as "middle implementers," and schools with scores in the bottom 1/3 as "low implementers."

Socio-economic status (SES) is represented by the percentage of students eligible for free/reduced-price lunch during the 2003/2004 school year. This is derived by dividing the total number of students eligible for free or reduced lunch price by the school enrollment.

Ethnicity is represented by the percentage of non-white students in the school during the 2003/2004 school year. A higher score indicates a more diverse student population.

School size is the total number of students enrolled in the school as measured during the fall survey period in October of the 2003/2004 school year.

Teacher:student ratio is the total number of students in the school as measured during the fall survey period in October 2003 divided by the total number of instructional staff. The total number of instructional staff was derived by multiplying the total number of school staff by the percentage of instructional staff.

The percentage of students with a disability (% disability) was derived from the October 2003 membership count of students with a primary exceptionality who are identified as having a disability in accordance with the requirements of the Florida Department of Education.

The percentage of teachers holding an advanced degree (% advanced degree) includes teachers with a master's, specialist's, or doctoral degree during the 2003/2004 school year. A teacher is defined as a professional who is paid on the instructional salary schedule of a Florida school district.
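Expressed as formulas, the two derived quantities above are as follows; this is only a restatement of the definitions already given, and the symbols are ours rather than the author's:

\[ \text{SES} = 100 \times \frac{N_{\text{free/reduced-lunch eligible}}}{N_{\text{enrolled}}} \qquad \text{teacher:student ratio} = \frac{N_{\text{enrolled}}}{N_{\text{staff}} \times p_{\text{instructional}}} \]

where \(N_{\text{enrolled}}\) is the October 2003 enrollment, \(N_{\text{staff}}\) is the total number of school staff, and \(p_{\text{instructional}}\) is the proportion of staff classified as instructional.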

Percentage of out-of-field teachers (% out-of-field) is defined as the percentage of courses in core academic subjects taught by classroom teachers who were teaching out of field during the 2003/2004 school year. Core academic courses are English, reading, language arts, mathematics, science, foreign languages, civics, government, economics, arts, history, and geography.

Stability rate is defined as the percentage of students from the October 2003 membership count who were still present in the same school for the membership count in February 2004.

Team functioning score (TF) is derived from the average total score of all team members on each school team on Items 1-7, 9, and 10 of the Team Process Survey (see Appendix D). Item scores range from 1 (strongly disagree) to 5 (strongly agree). Total scores range from 9 to 45.

Perceived district/administrative support (AS) is derived from the average total score of all team members on each school team on Items 11, 12, 13, 14, and 15 of the Team Process Survey. Item scores range from 1 (strongly disagree) to 5 (strongly agree). Total scores range from 5 to 25.

Coach's self-efficacy (CSE) is the coach's rating of his/her own skills related to implementing the SWPBS process, including data use, team processes, and implementation. Coach self-efficacy is measured by the eight items on the Coach Self-Assessment rating form (see Appendix E). Each item is worth between one and three points. A score of one indicates that the coach is learning the skill, two indicates that the coach is building the skill but is not yet fluent, and three indicates that the coach is fluent or has mastered the skill. Total possible scores range from 8 to 24. A higher score indicates that the coach believes he/she has higher fluency in these skills.

In-school suspensions (ISS) is the percentage of students from the total enrollment who served in-school suspensions during the 180-day school year occurring one year prior to SWPBS implementation. If a student was suspended more than once, he/she is counted only once.

Out-of-school suspensions (OSS) is the percentage of students from the total enrollment who served out-of-school suspensions during the 180-day school year occurring one year prior to SWPBS implementation. If a student was suspended more than once, he/she is counted only once.

Office discipline referrals (ODR) indicates the total number of ODRs reported during the 180-day school year occurring one year prior to SWPBS implementation. An ODR is defined as any written documentation that a student violated a school expectation or rule and was sent to the office. Each school reports the total number of ODRs per year to the Florida Department of Education in a category called "Incidents of Crime and Violence." These are incidents that occurred on school grounds, on school vehicles, or at school-sponsored events. The incidents are reported in six categories: (1) violent acts against persons, (2) possession of alcohol, tobacco, and other drugs, (3) property offenses, (4) fighting and harassment, (5) weapons possession, and (6) other nonviolent offenses and disorderly conduct.

Percentage of students who are below grade level in reading (BGLR) was derived as the average percentage of students across grades who scored below a level three on the reading portion of the FCAT during the 180-day school year occurring one year prior to SWPBS implementation. The percentages of students in each grade who scored a level one or a level two were added together for each grade, and these percentages were then averaged across all the grades in each school.

The School-Wide Positive Behavior Support Implementation Factors Survey (SWIF) is an instrument intended to measure the degree to which various factors influence implementation. It was developed following a nominal group process during which PBS implementers generated a list of barriers and enablers to the implementation of SWPBS. This list was developed into factors for the instrument. Each item within the factors was rated on a Likert scale ranging from "problematic" to "helpful." The item means were compared and ranked to determine the most important barriers and enablers. Additionally, respondents were asked to provide additional factors that were not included in the survey. These responses were tallied.

Significance of Study

The findings from this study will help practitioners identify the characteristics of schools that are best suited for the implementation of SWPBS. Practitioners also can identify the characteristics of schools that have lower levels of implementation of SWPBS. This identification is important because these schools may need modified program components to succeed in implementation. In addition, identifying influential process variables will help SWPBS trainers create strategies to better train school personnel to be more successful in SWPBS implementation. Ultimately, improving the implementation of SWPBS will increase positive outcomes for all students.
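As a compact restatement of the BGLR derivation defined above (the summation notation is ours, not the author's):

\[ \text{BGLR} = \frac{1}{G} \sum_{g=1}^{G} \left( p_{g}^{(1)} + p_{g}^{(2)} \right) \]

where \(G\) is the number of grades in the school and \(p_{g}^{(k)}\) is the percentage of students in grade \(g\) who scored at FCAT reading level \(k\). For example, with hypothetical percentages, a school whose three tested grades had combined level-one-and-two percentages of 30, 40, and 50 would have BGLR = (30 + 40 + 50) / 3 = 40.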

Chapter II
Literature Review

Russia's launching of Sputnik in 1957 created a fear in the American people that they were being surpassed by the Russians in education (Dow, 1991). In response, Americans scrambled to improve their education system as President Lyndon B. Johnson launched a war on poverty and initiated programs, such as Head Start, to improve educational opportunities for children from low-income households. Around the same time, American authors began writing compelling documentaries that exposed the inequities of the education system. Authors like John Holt (1964) and Ivan Illich (1971) claimed that schools were failing to provide even an adequate education for the neediest students. In 1967, Jonathan Kozol claimed that schools "destroy the minds and hearts of our children," and then in 1991 claimed that attempted reforms had only provided "a 'more' efficient ghetto school" (p. 4). With such widespread criticisms of the education system, coupled with foreign competition, school reform has become, and will continue to be, an integral part of the culture of American education.

Another reason that education reform has become part of school culture is the need to accommodate an increasingly heterogeneous student population (Hargreaves, 1997; Lewis & Newcomer, 2002; National Research Council, 2002). Our school-age population is more racially, ethnically, economically, and linguistically diverse than it was 30 years ago (United States Department of Education [USDOE], 2002, 2003). There are more than twice as many children who do not speak English at home today as compared to 1979 (USDOE, 2003). There are increasing numbers of children who are diagnosed with behavioral and emotional disorders, exposed to toxic substances, infected with HIV, and considered homeless (Knitzer, 1993; Office of Special Education Programs [OSEP], 2004; Stevens & Price, 1992). In addition, there are fewer family resources, with two-parent households decreasing from 83% in 1976 to 68% in 2001 (USDOE, 2003). With this changing student population, educational practices that were effective in the past may not be effective today, and solutions are needed to accommodate the increasingly diverse needs of American students.

Trends in Educational Innovations

To provide more effective practices, advances in research and technology continuously emerge, creating trends in education. Educational trends have addressed a wide range of issues such as multicultural education, cooperative learning, service learning, values clarification, human relations training, open schools, competency-based education, peace education, back to the basics, and bilingual education (Ellis, 2001; Hall & Hord, 2001). The majority of these trends represented a "fix the parts" approach because they each targeted only one specific aspect of the school; in contrast, current trends resemble more of a "fix the school" or "fix the system" approach because they either use school improvement teams to improve school functioning or propose a complete restructuring of school components (Sashkin & Egermeier, 1993). Current trends include accountability, a focus on student outcomes, high standards for all students, local flexibility, and choice in the delivery of education (e.g., the No Child Left Behind Act of 2001) (National Research Council, 2002; OSEP, 2004). The continual development and adoption of new trends and innovations suggests that educators continue to search for innovations to improve the education system (Ellis). With the ongoing criticism of the educational system, however, the question is whether it is the innovations that fail to "fix" the schools' problems or the schools' failures to implement the innovations.

Failure of Implementation of Innovation

The failure to implement innovations as intended, rather than the innovations themselves, is often the reason that innovations fail (Gresham, 1989). Studies that include measures of implementation or treatment integrity often fail to demonstrate a high level of integrity in implementation (Gottfredson et al., 1998), and innovation programs are not implemented as comprehensively as were the programs in the original empirically based studies (Silvia & Thorne, 1997, cited in Gottfredson et al.). Not surprisingly, the degree of treatment integrity influences the degree of treatment outcomes (Gottfredson et al.; Gresham, 1989).

There are many reasons that implementation may fail. The implementation of an innovation may fail if it is not perceived to be responsive to a need or is not properly integrated into the school system. If school personnel do not believe the innovation will respond to the needs of their school (Ellis, 2001) or if they are not concerned with the problem the innovation is intended to address (Hall & Hord, 2001), they will not be motivated to implement the innovation and will not do so. Even if the initiative is needed, there may be competing initiatives or systems already in place (OSEP, 2004). When there are competing initiatives, Knoff (2002) suggests that organizations should replace the old system with the new system, while Grimes and Tilly (1996) suggest that the new system should coexist with the old system in a temporary dual system. The former suggestion, however, could create resistance to change, while the latter could create additional work for the people in the system; either result could be mistaken for the failure of the new system instead of the failure to effectively implement it.

Failure of the new system or initiative also can result from the absence of systemic support from persons in key leadership positions and policy makers. If an initiative is not followed by continuous communication, ongoing training, on-site coaching, and time for implementation, it is not likely to succeed (Hall & Hord, 2001). Conversely, initiatives that are mandated by legislation, even though supported, can fail if the people who must implement them lack an understanding of the rationale and a commitment to the new procedures (Fullan, 1997). An understanding of the rationale and a commitment to the innovation are necessary because initiatives and school-wide interventions involve a complex series of events that require high-level skills and thinking (Fullan). Because change is complex, many innovations fail because the implementers lack a systems perspective (Curtis & Stollar, 2002; Schmuck & Runkel, 1994; Senge, Kleiner, Roberts, Ross, & Smith, 1994). Implementers must, therefore, adopt a systems perspective and a systems change approach to account for factors such as school personnel, administration, current policies, and level of commitment to the new procedures.

Systems Approach to Innovation

Systems Perspective

A systems perspective is the "ability to understand how the various component parts of a system, the system itself, and the surrounding systems or environment influence one another" (Curtis & Stollar, 2001, p. 225). A school is considered a system because it is "an orderly combination of two or more individuals whose interaction is intended to produce a desired outcome" (Curtis & Stollar, p. 224). Guided by a common mission, school systems are comprised of many subsystems, including the clients, students, human resources or program implementers, the building, district, home, community, and the organizational system. All of these subsystems must be considered in the change process (Knoff, 2002).

Systems Change Approach

Considering a systems perspective is the first step in a systems change approach; understanding the steps in the change process is the second. There are several models of organizational change. This section will focus on three models for change offered by Harvey and Brown (2001), Curtis and Stollar (2002), and Valentine (1991). The models share four stages: planning for change, developing a plan, implementing the plan, and evaluating the plan. Within each stage, there are specific components. Most of the specific components have been described by all three sets of authors; however, some are unique to individual models. The similar and unique components of each model are presented in Table 1.

The models for organizational change present a heuristic approach for understanding the change process needed to effectively implement school-based initiatives. Application of the change process, however, can be best understood through an example. To demonstrate the application of the change process, the steps required to implement a school-wide initiative called Schoolwide Positive Behavior Support (SWPBS) will be described. Identification of organizational factors that predict successful implementation of SWPBS was the focus of this study. To provide a context for this study, the remaining sections will provide an overview of SWPBS, a description of the organizational change process used to promote SWPBS implementation, and the potential factors that can influence implementation of this initiative.

The purpose of Schoolwide Positive Behavior Support (SWPBS) is to improve the climate of schools using system-wide positive behavioral interventions. SWPBS is a component of a larger general initiative called Positive Behavior Support. The term "Positive Behavior Support" (PBS) will be used in this review; however, the literature also uses terms such as "Effective Behavioral Support" (EBS) and "Positive Behavioral Interventions and Supports" (PBIS) interchangeably with "Positive Behavior Support." By definition, PBS is a "systems approach to enhancing the capacity of schools to adopt and sustain the use of effective practices for all students" (Lewis & Sugai, 1999, p. 4). PBS emerged in the mid- to late 1980s as an alternative to aversive interventions for students with disabilities who had self-injurious or aggressive behavior (Durand & Carr, 1985). The intent of this program was to emphasize individuals' quality of life and abilities over their disability (Dunlap, 2004; Sugai et al., 1999). PBS reflects principles from the behavioral sciences, empirically-based and practical interventions, social values for the individual, and a systems perspective (Sugai et al., 1999).

Table 1
Models for Organizational Change (Curtis & Stollar, 2002; Harvey & Brown, 2001; Valentine, 1991)

Planning for change:
  Anticipate the need for change (all three models)
  Develop relationship with key personnel and obtain commitment (all three models)
  Involve stakeholders (two of the models)
  Conduct diagnosis/needs assessment (all three models)
  Establish policies for organization (one model)

Developing a plan:
  Develop mission, goals, and objectives (all three models)
  Develop strategies and techniques (all three models)
  Select goal-focused strategies (two of the models)

Implementing the plan:
  Secure resources (one model)
  Ensure staff possess planning/problem-solving skills (one model)
  Implement strategies (all three models)

Evaluating the plan:
  Monitor progress (all three models)
  Revise areas that need improvement (all three models)
  Evaluate outcomes (all three models)
  Recycle process when appropriate (two of the models)

Note. Parenthetical counts indicate how many of the three models describe each component.

PAGE 37

24 Overview of Schoolwide Positive Behavior Support Implementation of the PBS model involves three tiers of behavioral interventions (i.e., primary or universal, secondary, and tertiary); the intensity of the interventions is intended to match the intensity of the problem behavior (Lewis & Sugai, 1999; Nelson, 2000; O SEP, 2004; Sugai et al., 1999; TaylorGreene, 1997). In the PBS model, Sugai et al. contend that when universal school-wide strategies (e.g., posti ng behavioral expectations, teaching expectations, rewarding students who m eet expectations) are consistently applied for all students across all school settings, 80-90% of the students will demonstrate appropriate behavior. Secondary support interventions, such as specialized group-based strategies (e.g., social skills training), will be necessary for the 5-15% of students who do not respond to universal strategies, and tertiary interventions or individualized strategies (e.g., self-monitoring) will be necessary for the additional 1-5% of students who do not respond to universal or secondary interventions (Lewis & Sugai). (See Fi gure 1 for a represent ation of the PBS model.) While a comprehensive PBS syste m includes the primary, secondary, and tertiary levels, the inclusion of all thr ee levels in this literature review would be expansive and not directly relevant to the purposes of this study. The emphasis of this research will be on the pr imary level of behavioral interventions, or Schoolwide Positive Behavior Support. Organizational Change/ Implementation Process for SWPBS The following sections will explain t he application of the stages of the organizational change models described earlier as they apply to the

PAGE 38

25 implementation process for SWPBS (See Table 1). The implementation process for SWPBS was compiled from multiple sources on the SWPBS implementation process (e.g., George, Harrower, & K noster, 2003; Lewis & Sugai, 1999; OSEP, 2004) and reflects the following sequenc e of stages: planning for change, developing a plan, implem enting the plan, and ev aluating the plan. Planning for Change Anticipate the need for change. The need for SWPBS is supported in the literature on school violence and disciplinar y problems. A survey of a nationally representative sample of 1,000 teachers and 1,180 students in grades 3 through 12 found that while most teachers felt safe at schools, 11% had been the victims of violence on school property; while 50% of students felt safe, 23% reported being victims of violence, and 22% were somewhat or very worried about being hurt at school (Leitman & Bi nns, 1993). With statistics like these, it is not surprising that school violence continues to be rated by the public as the top problem or concern in schools (Ros e & Gallup, 2004; Mayer & Leone, 1999). In addition to violence, many schools have an abundance of office discipline referrals (ODR), suspensions, and expulsions for violating school rules and need to improve their discipline syst ems. For example, one school is depicted to have had one to five percent of the students representing over 50% of the office discipline referrals (T aylor-Greene et al., 1997); another high school with an enrollment of appr oximately 1400 students had accumulated over 2000 ODRs from September th rough February, and an urban middle school of only 600 students accumulated over 2000 ODRs in one year (Sugai et al., 1999). If


If greater than 10% of the student population engages in repeated disruptive behavior, is chronically absent or tardy, does not complete work on time, or violates the same rule repeatedly, it is typically the discipline system that needs to change (Florida Positive Behavior Support [FLPBS], 2003-2004).

Many efforts to remediate system-wide discipline problems have been reactive (OSEP, 2004). Paradoxically, research shows that the more effort is needed to run a "secure building" using reactive methods (e.g., metal detectors, personnel interventions), the more victimization occurs and the less safe the students feel (Mayer & Leone, 1999, p. 4). Such reactive or aversive strategies may immediately reduce a problem, but the reductions are temporary and problem behaviors often reoccur (OSEP, 2004). In contrast, proactive approaches that emphasize teaching expectations and rewarding positive behavior are effective for the majority of students (Sugai et al., 1999). The more that a "system of law" is maintained, in which students know the rules and consequences for misbehavior and know that the rules are applied fairly, the less victimization and disorder occur in school buildings (Mayer & Leone, p. 4). Findings of this nature have supported the need for a shift from a reactive to a proactive approach to discipline, such as SWPBS.


Develop relationships and obtain commitment. As SWPBS is a complex and time-intensive process, implementation requires a high level of commitment from schools. Schools must meet three requirements before the implementation of SWPBS will be initiated: (1) SWPBS must be included as one of the top three school improvement priorities, (2) the school must agree to collect school performance data, and (3) the school must have or find a source of on-site technical assistance. Without this commitment, SWPBS is less likely to be successful (Lewis & Sugai, 1999).

Involve stakeholders and conduct a needs assessment. To involve stakeholders in the change process, Peshak George et al. (2003) recommend obtaining a commitment from at least 80% of the faculty, staff, and administration to decrease problem behaviors. Once commitment is obtained, a representative team of school staff is formed (Peshak George et al.). Teams are asked to conduct a needs assessment by reviewing their current discipline data to determine trends in problem behavior (FLPBS, 2003-2004).

Establishing organizational policies to support change. Initiatives are more likely to succeed when they are supported by higher-level policy or mandates (Hall & Hord, 2001). PBS has been supported by legislation since the 1997 reauthorization of the Individuals with Disabilities Education Act (IDEA) and subsequently by the Individuals with Disabilities Education Improvement Act of 2004 (IDEIA). The law requires that schools now consider the use of PBS during the development of an Individualized Education Plan (IEP) or when individual students are facing disciplinary action due to behavior. Despite the term "consider," PBS is presumed to be the intervention of choice, as it is the only intervention explicitly recommended by IDEA (Turnbull, Wilcox, Stowe, & Turnbull, 2001). Furthermore, the use of PBS with all students is recommended as best practice by IDEA (Turnbull, Wilcox, Stowe, & Turnbull).


If schools are required to use PBS with students with disabilities, school personnel must have a functional understanding of PBS; this requires personnel training in the implementation of PBS across settings.

Developing a Plan

Successful implementation of SWPBS requires the development of a mission statement, goals, objectives, and strategies for implementation. SWPBS is not one specific intervention but rather a combination of six elements: a statement of purpose, school-wide expectations, procedures for teaching school-wide expectations, a continuum of procedures for encouraging school-wide expectations, a continuum of procedures for discouraging violations of expectations, and procedures for monitoring the impact of school-wide PBS (Lewis & Sugai, 1999). The development of an implementation plan, therefore, should include the development of all six elements. As teams develop an implementation plan, they prioritize strategies for implementing the plan through the development of an action plan (Peshak George et al., 2003).

Schools must first develop a positively stated mission statement, expectations for student behavior, and lessons to teach the expectations. The mission statement should be brief, consider all the intended outcomes of PBS, and encompass all students, staff, and settings (Lewis & Sugai, 1999) (e.g., "The mission is to inspire lifelong learners through quality learning experiences and strategies for success"). Once the mission statement is defined, school staff should develop three to five school-wide expectations that are clearly, positively, and broadly stated (e.g., Be Ready to Learn, Be Safe, Be Respectful, Be Responsible).


To create a context for each expectation, specific rules should be established for each setting (e.g., classroom, hallways, cafeteria), and there should be no more than two rules for each expectation (Lewis & Sugai). For example, classroom rules for "Be Respectful" could be "Listen to others" and "Keep hands to self." Once the expectations and rules are developed, they should be explicitly taught to the students in the natural setting (Colvin, Kameenui, & Sugai, 1993; Lewis, Sugai, & Colvin, 1998; Nelson, Colvin, & Smith, 1996). For example, teachers should demonstrate "keeping hands to self" and should provide examples and nonexamples of this rule.

The development of expectations and lesson plans for teaching the expectations must be supplemented with a plan to enforce the expectations. To encourage behaviors that adhere to the school-wide expectations, there should be a continuum of procedures for rewarding these behaviors (Proctor & Morgan, 1991; Witt & Elliott, 1982). The reward system should begin with tangible, external, frequent, and predictable rewards and should shift to rewards that are social, internal, infrequent, and unpredictable (Lewis & Sugai). For example, at the beginning of the year a teacher may give students one ticket to purchase something at the school store each time they demonstrate appropriate listening skills. As the year progresses, the teacher may begin to intermittently reward the whole class with five-minute breaks when they demonstrate listening skills for a whole instructional period.
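The expectations-by-setting structure described above lends itself to a simple representation. The sketch below encodes a hypothetical rule matrix and checks the guideline of no more than two rules per expectation in each setting; the expectations and rules shown are invented examples, not a prescribed set.

    # Hypothetical expectations-by-setting rule matrix for a SWPBS plan, with
    # a check of the two-rules-per-expectation guideline (Lewis & Sugai, 1999).
    rules = {
        "classroom": {
            "Be Respectful": ["Listen to others", "Keep hands to self"],
            "Be Safe": ["Walk at all times", "Stay seated", "Push in your chair"],  # deliberately one rule too many
        },
        "cafeteria": {
            "Be Responsible": ["Clean up your area"],
        },
    }

    MAX_RULES_PER_EXPECTATION = 2

    for setting, expectations in rules.items():
        for expectation, rule_list in expectations.items():
            if len(rule_list) > MAX_RULES_PER_EXPECTATION:
                print(f"Too many rules for '{expectation}' in the {setting}")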


Schools also must develop a continuum of procedures to discourage violations of school-wide expectations. A systematic response to disruptive behavior has been found to decrease office discipline referrals (Nelson, Martella, & Benita, 1998). The procedures for violations of rules and the descriptions of the violations must be clearly stated and must be implemented consistently across the school. More severe violations (e.g., drugs, violence) should result in more severe consequences (Lewis & Sugai), such as suspension, while less severe violations (e.g., breaking dress code, tardiness) should lend themselves to less severe consequences, such as an office discipline referral. When school disciplinary procedures include a continuum of responses to violations of the expectations, and when schools develop, teach, and reward expectations, the initial tenets of SWPBS are in place.

The last element is developing procedures for monitoring the impact of school-wide PBS. Often, office discipline referral (ODR) data guide decisions about the effectiveness of disciplinary procedures and the need for interventions. To use these data effectively for decision making, regular entry, review, and analysis of ODRs are necessary to monitor monthly trends in discipline. ODR data include the number of ODRs per day, the cumulative number of ODRs over time, ODRs by location, ODRs by consequence, and ODRs by student (Lewis & Sugai, 1999). These data are used to facilitate implementation of interventions and as outcome measures.
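As a concrete illustration of these summaries, the sketch below derives each of them from a hypothetical referral log using pandas. The column names (date, location, consequence, student) are assumptions about how such a log might be organized, not a standard schema.

    # Illustrative ODR summaries from a hypothetical referral log, one row per
    # referral. Column names and entries are invented for the sketch.
    import pandas as pd

    odr = pd.DataFrame({
        "date": pd.to_datetime(["2004-09-01", "2004-09-01", "2004-09-02", "2004-09-03"]),
        "location": ["classroom", "cafeteria", "classroom", "hallway"],
        "consequence": ["warning", "detention", "conference", "suspension"],
        "student": ["s01", "s02", "s01", "s03"],
    })

    per_day = odr.groupby("date").size()          # ODRs per day
    cumulative = per_day.cumsum()                 # cumulative ODRs over time
    by_location = odr["location"].value_counts()  # ODRs by location
    by_consequence = odr["consequence"].value_counts()
    by_student = odr["student"].value_counts()    # students with repeat referrals surface here
    print(cumulative)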


Implement the Plan

Securing resources. Implementing the plan includes securing resources for implementation, training staff in planning/problem-solving skills, and implementing strategies. Securing resources includes both financial resources, such as federal, state, or district funding for implementation, and personnel resources, such as time and energy. Training includes teaching the SWPBS procedures to all administration, faculty, and staff (Peshak George et al., 2003).

Implementing strategies. Full implementation of school-wide innovations often takes between three and five years (Chapman & Hofweber, 2000; Hall & Hord, 2001; Taylor-Greene et al., 1997); the entry and acceptance phase alone can take up to two or three years (Zins et al., 1988). However, OSEP (2004) reports that, for PBS to be effective, it must be implemented with high accuracy and sustained for 5 to 10 years. It is important, therefore, to study patterns of implementation of SWPBS to learn when the highest and lowest levels of implementation tend to occur. To date, there have been very few published studies describing trends in implementation of SWPBS; the majority of these data are likely to be found in end-of-year reports by state projects. The following section, therefore, provides a review of the limited data that were available on the implementation of SWPBS.

While there are limited tools to measure the implementation of SWPBS, implementation studies often use the Schoolwide Evaluation Tool (SET; Lewis-Palmer, Irvin, Sugai, & Boland, 2004), a research-based observation and interview instrument. The SET includes seven subscales: expectations defined, behavioral expectations taught, on-going system for rewarding behavioral expectations, system for responding to behavioral violations, monitoring and decision-making, management, and district-level support.


As part of the validation of this instrument, Lewis-Palmer et al. measured the implementation of SWPBS in 13 schools at time one (before SWPBS training) and time two (6 to 24 months following training). The authors found a significant increase in implementation from time one to time two, with the average score increasing from 47.9% to 83.6%. These data indicate that implementation does increase over time; however, the authors do not provide information regarding differences in SET scores for schools at different time periods following training (e.g., 6 months compared to 12 months).

More differentiated information on the level of implementation over time is presented in a study by Nersesian, Todd, Lehmann, and Watson (2000), in which the authors reviewed the levels of Schoolwide PBS implementation in three cohorts of schools from 1998 to 1999. Each cohort reflected a different stage of implementation: one cohort of schools was in year one in 1998 and year two in 1999; one cohort was in year two in 1998 and year three in 1999; and one cohort was in year three in 1998 and year four in 1999. An additional cohort was measured only in 1999 and was not included because there was no second-year comparison. The researchers used the Schoolwide Evaluation Tool (SET) to measure implementation. See Figure 2 for a visual representation of the data. Two trends are noted in the data. First, schools that had been implementing PBS for a longer period of time had a higher level of implementation than those that had been implementing for less time. Second, all schools except one, regardless of the year of implementation, had an increase in their SET score from 1998 to 1999. Therefore, these data indicate that implementation increases over a four-year period.


[Figure 2: bar graph titled "SW-PBS Implementation from 1998 to 1999," plotting level of implementation (0-100) in 1998 and 1999 for schools in years 1-2, 2-3, and 3-4 of implementation.]

Figure 2. Using the SET to Compare the SWPBS Implementation of Four Cohorts of Schools at Different Stages of Implementation from 1998 to 1999. Note. The data on this graph were extrapolated from data provided in Figure 1 in Nersesian, Todd, Lehmann, and Watson, 2000, p. 246.

Another set of data was examined from the Illinois PBIS Project's 2002-2003 End of Year Report (Eber et al., 2003-2004). The project members explored whether schools that were implementing PBIS at the 80/80 criterion level on the SET sustained their implementation over time. The 80/80 criterion level indicates an overall score of 80% or above and a score of 80% or above on the teaching expectations subscale.
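Stated as a computation, the criterion is a simple conjunction of two conditions. The sketch below is a minimal illustration, assuming SET results have already been expressed as percentages by the scoring protocol.

    # Minimal sketch of the Illinois 80/80 criterion: overall SET score of at
    # least 80% and a teaching-expectations subscale score of at least 80%.
    def meets_80_80(overall_pct: float, teaching_pct: float) -> bool:
        return overall_pct >= 80.0 and teaching_pct >= 80.0

    print(meets_80_80(83.6, 85.0))  # True
    print(meets_80_80(83.6, 72.0))  # False: overall is high, but teaching lags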


In a review of schools that completed the SET in 2000-01, 2001-02, or 2002-03, 85% to 100% of those schools were still implementing at the 80/80 level in 2002-03, one, two, or three years later. More specifically, 100% of the six schools that completed the SET in 2000-01 were still implementing at the 80/80 level three years later; 91% of the 20 schools that completed the SET in 2001-02 were still implementing at the 80/80 level two years later; and 85% of the 73 schools that completed the SET in 2002-03 were still implementing at the 80/80 level one year later. These data indicate that 80/80 schools typically sustain implementation for at least three years. It is important to note that this review included only schools that met the 80/80 level, and more research is needed to track implementation in schools that did not meet these criteria to determine if they also improve over time (Eber et al.).

These studies indicate that implementation of SWPBS increases over time, from six months (Lewis-Palmer et al., 2004) to four years after initiation (Nersesian, Todd, Lehmann, & Watson, 2000), and that schools scoring at the 80/80 level on the SET tend to sustain implementation for up to three years (Eber et al., 2003-2004). These findings coincide with the general finding that school-wide innovations take between three and five years to implement (Hall & Hord, 2001). As implementation is a lengthy process, sustaining the program requires mechanisms to monitor and evaluate progress.

Evaluate the Plan

Evaluating the plan involves both monitoring progress and evaluating the outcomes, with the intention of revising any areas that need improvement or recycling the implementation process when needed. This component of the organizational change models is consistent with Stufflebeam's model for evaluation, the Context Input Process Product model (CIPP; 1971; 1999), which encourages the measurement of the process involved in implementing the program as well as the products, or outcomes, of the program. For SWPBS, monitoring progress involves both measuring and monitoring implementation over time and evaluating outcomes.


The monitoring of implementation was discussed in the previous section; evaluating outcomes will be discussed in this section.

Evaluate student outcomes. For the evaluation of student outcomes, many studies have used discipline referral rates, suspension rates, and satisfaction reports to evaluate the overall effectiveness of SWPBS (Lewis & Newcomer, 2002; Taylor-Greene et al., 1997). Many schools have found an overall decrease in the number of discipline referrals one and two years after SWPBS implementation (Eber, Lewis-Palmer, & Pacchiano, 2001). Specifically, there have been significant decreases in disruption and fighting in the classroom and schoolyard (McCurdy, Mannella, & Eldridge, 2003) and a decrease in referrals for harassment (Metzler, Biglan, Rusby, & Sprague, 2001). There also have been decreases in the number of days of out-of-school suspension (Scott, 2001) and in the number of suspensions per day (Eber et al.). Schools reported more time to focus on individual interventions (Eber et al.), and staff ratings of satisfaction with PBS were moderate to high (McCurdy et al.).

While the majority of the program evaluations yielded significant findings, some authors acknowledged that no measure of treatment fidelity was included (Scott, 2001). Other authors acknowledged that implementation data were collected during year one but not during subsequent years (Metzler et al., 2001). Another study used the number of teachers involved with the project as a measure of implementation (Eber et al., 2001) but did not measure implementation of program components.


There is a paucity of research that examines the outcomes of SWPBS in relation to the level of implementation. As noted earlier, the degree of treatment integrity influences the degree of treatment outcomes (Gottfredson et al., 1998; Gresham, 1989). One report from the Illinois PBIS project indicated that schools implementing PBIS at the 80/80 criterion level had a significantly higher percentage of children reading at grade level; however, the authors did not determine whether this difference existed before implementation (Eber, Lewandowski, Horner, & Sugai, 2003-2004). The project also found that schools that reached the 80/80 implementation criterion on the SET demonstrated a decrease in their office discipline referral (ODR) rate, and the number of students with more than two ODRs per year decreased significantly. McCurdy et al. (2003) also concluded that the decrease in office discipline referrals in one school could be associated with the implementation of SWPBS because the school scored 82% on the SET, indicating a moderately high level of implementation. More studies such as these that examine the association between implementation and outcomes are needed, and these studies also must include schools that are not implementing SWPBS at a high level.

It is important to note that many of these examples were case studies, and the researchers were typically involved and able to informally monitor the implementation of SWPBS. As SWPBS expands, there is a need for more objective, standardized measures of implementation to compare the outcomes of the SWPBS initiative across the nation.


There also is a need to understand the factors that influence implementation, so that the identification of these factors can help to improve the initiative.

Summary

The above section provided an example of using organizational change principles for implementing a school-wide innovation. An organizational change model is necessary to consider the influence of factors such as school personnel, administration, current policies, and the level of commitment to the new procedures.

Implementation Factors

Relatively little is known about the factors influencing the implementation of PBS (Metzler et al., 2001) or of other interventions developed through consultation (Noell & Witt, 1999). Kincaid et al. (2002) suggest a need for future research to identify the issues that affect the success or failure of the implementation of PBS, particularly in the schools. There is a need to evaluate data trends in SES, location, and size for diverse schools (Sugai, Sprague, Horner, & Walker, 2000). This next section will review research that describes implementation factors for SWPBS or similar programs. The discussion of factors will be organized into three sections: general implementation variables, need-for-change indicators, and socio-cultural variables.

General Implementation Factors

The larger the impact a change will have on an organizational culture, the more resistance there will be to the change (Harvey & Brown, 2001).


As the behavior of organizations is neither static nor stable, there are forces that promote change (enablers or driving forces) and forces that hinder change (barriers or restraining forces). Kurt Lewin described this phenomenon in his force field analysis model, depicted in Figure 3. Change occurs when the driving forces, or enablers, are stronger than the restraining forces, or barriers (Harvey & Brown).

[Figure 3: diagram showing restraining forces (barriers) pressing down on a quasi-stationary equilibrium and driving forces (enablers) pressing up.]

Figure 3. Diagram of Kurt Lewin's Force Field Model, recreated from Harvey and Brown (2001, p. 139, Figure 5.4).
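The model's core claim can be expressed as a comparison of summed force strengths. The sketch below is a toy illustration of how a team might tally enablers against barriers; the forces and their 1-5 ratings are invented for the example.

    # Toy force field analysis with invented forces, each rated 1 (weak) to
    # 5 (strong). Per Lewin's model, change is predicted when driving forces
    # outweigh restraining forces; otherwise the equilibrium holds.
    driving = {"administrative support": 4, "staff buy-in": 3, "coaching": 3}
    restraining = {"limited funding": 4, "competing initiatives": 2, "staff turnover": 3}

    net = sum(driving.values()) - sum(restraining.values())
    print("change likely" if net > 0 else "equilibrium holds")  # net = +1 here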


There were seven categories of factors that have been reported in the literature as promoting or hindering implementation: (1) knowledge about the innovation (e.g., Harvey & Brown, 2001; Sparks, 1988), (2) resources to implement the innovation (e.g., Gottfredson, Gottfredson, & Hybl, 1993; Reimers, Wacker, & Koeppl, 1987; Witt, Martens, & Elliott, 1984), (3) input from staff, parents, and students (e.g., Chapman & Hofweber, 2000; Hall & Hord; Ikeda, Tilly, Stumme, Volmer, & Allison, 1996), (4) on-going training (e.g., Curtis & Stollar, 2002; Grimes & Tilly; Knoff, 2002), (5) integration of the innovation into the current system (e.g., Nersesian, Todd, Lehmann, & Watson, 2000; Ponti, Zins, & Graden, 1988), (6) evaluation of the innovation (e.g., Chapman & Hofweber; Taylor-Greene & Kartub), and (7) support for the innovation (e.g., Grimes & Tilly, 1996; Hall & Hord, 2001; Lewis, Sugai, & Colvin, 1998; Nakasato, 2000; Sadler, 2000). See Appendix F for specific factors and references. All of these factors were found in anecdotal or qualitative reports describing the "lessons learned" following the implementation of an intervention and comprise individual factors that have been labeled by the authors as helpful or problematic in implementation. More research is needed, however, that quantitatively explores their relationship to implementation and determines which categories or variables are the most influential. The variables that have been described more extensively in the literature are those related to support for the intervention: administrative support, team process, and coaching. They will be described in the next section on support variables.

Support Variables

District/administrative support. There is very clear evidence from the literature on school innovations, consultation, and reform that district- and building-level administrative support are essential components of the initiation and sustainability of successful program implementation (Chapman & Hofweber, 2000; Curtis & Stollar, 2002; Gottfredson, Gottfredson, & Hybl, 1993; Knoff, 2002; Lewis et al., 1998; Lohrmann-O'Rourke et al., 2000; Nersesian et al., 2000; Noell & Witt, 1999; Sadler, 2000; Taylor-Greene & Kartub, 2000). Administrative support can include financial support, teacher release time, and/or staff development (Taylor-Greene & Kartub). Administrators can be the key to success in organizational change because they are typically the gatekeepers: they make decisions, give permission for initiatives, and are able to distribute resources (Curtis & Stollar).


Based on their experience and research, Hall and Hord (2001) found that, above all, the "hero principal," an energetic, enthusiastic person committed to improving outcomes for children, was the key to success. When leaders promote commitment to and support for a new direction, staff are more empowered to change than when leaders are passive and reluctant to begin (Grimes & Tilly, 1996). Teachers who feel supported are more committed and effective than those who do not feel supported (Rosenholtz, 1989). Smylie (1992) reported that having an open, collaborative, and supportive relationship with the principal positively influenced teachers' willingness to participate in a district-wide effort to include them on decision-making councils.

While support for staff is important, administrators need support as well to implement and sustain comprehensive interventions (Rosenberg & Jackman, 2003). They can receive support from statewide projects, from the innovation team, and from school consultants or coaches.

Team functioning. The stakeholders, or the members affected by change, must be included in decision-making for implementation to be effective (Curtis & Stollar, 2002; Harvey & Brown, 2001). Stakeholders typically include teachers, support staff, students, and parents. One way to include stakeholders is by developing an implementation team. Team efforts are required to make change work (Hall & Hord, 2001), and teamwork has been reported as an essential component of the implementation of SWPBS (Lohrmann-O'Rourke et al., 2000; Taylor-Greene & Kartub, 2001). SWPBS implementation is facilitated by a unified team with a common goal (Lewis et al., 1998).


A unified team includes the elements of role differentiation, goal clarity, open planning, accuracy of information transfer, continual interaction, continuity, collegiality, combining of effort, positive attitude, and complementarity (Hall & Hord). Groups' abilities to do work are based on their abilities to coordinate and blend their skills and attitudes, and the success of an organizational development intervention depends on the ability of the recipients to work together to achieve their intervention goal (Schmuck & Runkel, 1994).

Coaching. Coaching has been defined as "on-site assistance for a teacher who is attempting to apply a new skill" (Neubert, 1988, p. 7) or as "the provision of companionship, the giving of technical feedback, and the analysis of appreciation" (Joyce & Showers, 1982, p. 3). These definitions emphasize the improvement of coachees' skills in a classroom or school. The coach helps the coachee gain autonomy in using the new skill by observing, giving feedback, and providing support (Neubert). SWPBS uses the term "coach" to indicate a person who is assisting a team in the implementation of the intervention, but the terms "technical assistance," "consultant," and "facilitator" also have been used in the literature.

Coaching or technical assistance from someone with expertise, such as a school psychologist, is necessary for implementation of SWPBS (Lewis et al., 1998). Research has found that interventions are not implemented with treatment integrity unless a consultant is continuously involved (Lewis & Newcomer, 2002; Noell, Witt, Gilbertson, Ranier, & Freeland, 1997; Witt, Noell, LaFleur, & Mortenson, 1997; Mortenson & Witt, 1998).


Hall and Hord (2001) also purport that facilitators are necessary to monitor interventions and refocus the change process in organizations.

Much of the coaching research focuses specifically on teachers rather than on school teams or organizations; however, the research with teachers is valuable, as teachers often implement the intervention. Research shows that teachers who are coached use newly acquired skills from a training or in-service program more frequently, more appropriately, and for a longer time than teachers who are not coached (Baker & Showers, 1984, cited in Neubert, 1988). Without coaching, teachers rapidly lose the skills they recently acquired, and few teachers achieve transfer of training into their classrooms (Joyce & Showers, 1982). For example, only 10% of teachers who attended a training without follow-up support transferred the new strategies to their classrooms, while 80% of teachers who received the training and coaching transferred the new strategies to their classrooms (Showers, 1984, cited in Showers, 1990).

Because coaching is a crucial component of treatment integrity, it is important to investigate the skills and knowledge necessary to be an effective coach. The level of skills and knowledge possessed by the consultant is positively related to the client's attainment of treatment goals: consultants with more skills and knowledge facilitate a higher attainment of treatment goals for their clients than do consultants with fewer skills and less knowledge (Lepage et al., 2004).


Consultant skills that are necessary specifically to build and sustain SWPBS are mastery of the universal SWPBS elements, fluency in basic behavior principles, and the abilities to train others on these strategies, establish a school-wide data collection system, provide technical assistance to the team, and build communication between the team and the school. If the consultant possesses these skills, he or she is better able to help teams implement and accomplish the goals of PBS (Lewis & Newcomer, 2002).

In addition to skills in building and sustaining programs, coaches also must have confidence in their ability to implement these skills, or "coaching efficacy." The term "coaching efficacy" is borrowed from the literature in sports psychology and is defined as the "extent to which coaches believe they have the capacity to affect the learning and performance of their athletes" (Feltz, Chase, Moritz, & Sullivan, 1999, p. 765), or coachees. Coaches with higher self-efficacy used more behaviors that were effective (e.g., praise and encouragement), had a higher percentage of wins, used less instructional and organizational behavior, and had higher player satisfaction than coaches with lower self-efficacy (Feltz et al.). To understand what influenced coaches' self-efficacy, Feltz et al. surveyed 189 coaches and found that past experience, the perception of the client's ability, perceived community support, and past success were related to a coach's belief that he or she can influence the performance of the athletes. Relating this information to teachers, coaches with more experience and greater expectations for their teachers are more likely to promote better performance and outcomes in the teachers with whom they work. In addition, the more support the coach has in the community, the more effective he or she will be with the teacher. Therefore, both a coach's skills and a coach's self-assessment of his or her skills can influence the performance and outcomes of those being coached.


Summary

In conclusion, there are seven areas that can influence the successful implementation of organizational change. Specifically, administrative support, teamwork, and coaching have been described in the literature as key implementation variables. It is important to consider and include these variables when implementing school-wide programs, and it also is important to conduct research to determine which variables are the most influential on successful program implementation.

Need for Change Indicators

In line with Lewin's force field model, change occurs when the driving forces, or enablers, are stronger than the restraining forces, or barriers (Harvey & Brown, 2001), creating a need for change. "The need for change arises when there is an imbalance between the direction in which the system is going and the direction in which the superintendent and the school board want the system to go" (Valentine, 1991, p. 65). Organizations must anticipate this need for change before any program can or should be implemented (Harvey & Brown; Schmuck & Runkel, 1994), and organizational change/interventions must be clearly linked to the needs of the organization (Ponti et al., 1988). Schools with a higher need for change, however, may be overwhelmed with the basic problems of student misbehavior and lack the capacity to effectively implement innovations (Gottfredson et al., 1998).


While there is a limited amount of research on the link between the need for organizational change and treatment acceptability or integrity, there is research on the impact of the severity of symptoms on treatment acceptability. While Wickstrom et al. (1998) did not find a correlation between severity, acceptability, and integrity, Reimers, Wacker, and Koeppl (1987) reviewed the earlier literature and found that most studies indicated that the more severe the problem, the higher teachers or respondents rated the acceptability of treatment. In addition, Elliott, Witt, Galvin, and Peterson (1984) found that teachers are more likely to accept more complex interventions when the problem is more severe. In a more recent study of an actual intervention by Sheridan, Eagle, Cowan, and Mickelson (2001), symptom severity was found to significantly predict treatment outcomes (as measured by effect sizes of behavioral outcomes) following intensive consultation sessions between consultants and teachers. This was true only for younger students, however; paradoxically, older students with less severe symptoms had larger effect sizes. Both findings make sense, as problem severity may motivate teachers to implement interventions to decrease the problem, but less severe problems are easier to improve. Therefore, it seems that problem severity does influence acceptability and outcomes, but it has not been found to do so in every case.

While research indicates an impact of severity of need, the question of whether the severity of need for change facilitates or hampers the organizational change process remains unanswered. There is research, however, on several indicators of the need for change and their influence on school climate or culture.


Severity of need for change in a school system could be measured by the number and rate of disciplinary actions and by the academic performance of the students. Specifically, the measures that will be reviewed include office discipline referral rates, suspension rates, and academic achievement in a school. The relationship between each indicator and school climate will be explored further.

Office discipline referrals (ODRs). An office discipline referral is an unobtrusive, indirect, yet stable measure of teacher-reported student behavior (Lewis-Palmer, Sugai, & Larson, 1999; McCurdy et al., 2003; Wright & Dusek, 1998). ODRs are relatively standardized (Wright & Dusek, 1998) and at a minimum contain the student's name, the date, the reason for referral, and the administrative decision. They are a more sensitive measure than suspension, detention, or expulsion data (Sugai et al., 2000), and the most common reason for referrals is disobedience, followed by conduct, disrespect, and fighting. Ironically, the least common reasons for referrals are serious offenses: possession of a weapon, vandalism, or setting fires (Skiba, Peterson, & Williams, 1997).

ODRs can describe school-wide behavior and have been found to be a valid way to measure behavioral climate, school discipline problems, the effectiveness of school-wide interventions, and differing needs across schools (Irvin, Tobin, Sprague, Sugai, & Vincent, 2004; Lewis & Newcomer, 2002; Metzler et al., 2001; Nelson, Benner, Reid, Epstein, & Currin, 2001; Putnam, Luiselli, Handler, & Jefferson, 2003; Tobin, Sugai, & Colvin, 1996; Wright & Dusek, 1998). On the other hand, a limitation of using ODRs as a measure is that each school defines problem behaviors differently and has different procedures regarding the administration of ODRs (Sugai et al., 2000).


Other limitations include teacher bias, different degrees of distribution due to individual teacher tolerance, and the lack of direct observation of the behavior or objectivity (Wright & Dusek, 1998). Despite the limitations, a high number of referrals can indicate that current discipline procedures are not clearly defined or need to be improved. A high number of ODRs can indicate a reportedly negative and reactive environment (Taylor-Greene & Kartub, 2000) that needs to be changed.

The need for change in a school is apparent if the number of students with at least one referral is higher than the average percentage found in research studies. Sugai et al. (2000) provide averages from their research: 21% in elementary schools and 47.6% in middle/high schools. It is important to note that these data were derived from groups of fewer than 11 schools, but Skiba et al. (1997) also found an average of 41% for 19 middle schools. With these numbers as guidelines, above-average percentages of students receiving a referral can be used as a red flag for an ineffective discipline system.
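Expressed as a screening computation, this red-flag guideline compares a school's referral rate against the research averages just cited. The sketch below is a minimal illustration; the enrollment and referral figures are invented.

    # Red-flag screen: percentage of students with at least one ODR compared
    # against the research averages cited above (Sugai et al., 2000).
    BENCHMARKS = {"elementary": 21.0, "middle/high": 47.6}

    def referral_red_flag(students_with_odr: int, enrollment: int, level: str) -> bool:
        pct = 100.0 * students_with_odr / enrollment
        return pct > BENCHMARKS[level]

    print(referral_red_flag(180, 600, "elementary"))   # 30.0% > 21.0% -> True
    print(referral_red_flag(250, 600, "middle/high"))  # 41.7% < 47.6% -> False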


Suspensions. Suspensions are typically a consequence of office discipline referrals and can be either an in-school punishment, known as in-school suspension (ISS), or removal from the school building for a specified number of days, known as out-of-school suspension (OSS). Skiba et al. (1997) found that suspensions account for 33% of the consequences given for office referrals. In-school suspensions are most commonly given as consequences for disruptive behaviors, tardiness or cutting classes, disrespect for rules or people, cutting detention, using profanity, loss of "self-control," and smoking cigarettes; out-of-school suspensions are most commonly given as consequences for fighting, theft, and bringing weapons to school. The most common referral that results in suspension is fighting (Skiba et al.), and two-thirds of suspensions are for non-serious offenses (Edelman, Beck, & Smith, 1975). More extreme or repeated violations of a school's behavior code, such as drug or alcohol abuse, endangering others, committing felonies, hitting an adult or teacher, and arson, most commonly result in expulsion (Rose, 1982).

So far, there is little research supporting suspensions as an effective means to decrease repeated problem behavior. Tobin, Sugai, and Colvin (1996) found that the more referrals students received, the more likely they were to receive a suspension. Furthermore, students with high rates of referrals that resulted in a suspension during the first school term had more referrals in the future than students who did not receive a suspension (Tobin et al.). Additionally, it was found that 43% of students who were suspended once subsequently were suspended at least one other time, and 24% were suspended multiple times (Commission for Positive Change in Oakland Public Schools [CPCOPS], 1992). However, Morgan-D'Atrio, Northup, LaFleur, and Spera (1996) found that middle and high school students with recurrent suspensions in different schools were a heterogeneous group with a range of externalizing, internalizing, and academic problems and skill deficits, and they did not have significantly more academic or social skills deficits than students without a high rate of suspensions.


As different types of students can receive suspensions, it is important to note that suspensions are influenced by teacher attitudes, discipline procedures, and school governance (Wu, Pink, Crain, & Moles, 1982). Suspensions can be representative of a school's discipline practices, and Lewis and Newcomer (2002) recommend using suspension rates as outcome measures of SWPBS. Therefore, the percentage of students receiving suspensions can be indicative of a school's climate and of the effectiveness of its disciplinary policies and procedures.

Academic performance. In conjunction with improving a school's climate, increased academic achievement is the ultimate goal of any school reform agenda (Hall & Hord, 2001). The USDOE (2004a) reported that 37% of 4th graders and 26% of 8th graders were reading below grade level in 2003, and that 26% of 12th graders were reading below grade level in 2002. The Adequate Yearly Progress standards included in the No Child Left Behind (2001) legislation require the restructuring of schools that continue to have a significant percentage of students below grade level for six to seven years (The Education Trust, 2004). To attain this goal, Reading First grants are provided by the USDOE to states to enable all students to become successful readers by the end of Grade 3 (North Central Regional Educational Laboratory [NCREL], 2005). As these initiatives currently emphasize reading, schools are scrambling to improve the reading scores of their students.

Schools with a higher percentage of students below grade level are likely to have more behavior problems.


Children who are below grade level in academic subjects will have more difficulty with academic tasks, leading to more time off task, less academic engaged time, and more problem behaviors in the school (Gunter, Denny, Jack, Shores, & Nelson, 1993; Lee, Sugai, & Horner, 1999; Nelson, 2000). Therefore, it is often necessary to increase the appropriate behavior of students with behavior problems in addition to providing academic interventions. For example, improving the appropriate behavior of students with severe behavior problems improved their academic performance as measured by the number of assignments completed (Witt & Elliott, 1982). It thus is important to emphasize both academic and behavior interventions simultaneously in a school, and SWPBS is recommended for schools with academic performance below national, state, or local expectation levels (FLPBS, 2003-2004).

Summary. This section explored whether the severity of need for an intervention increases or decreases the acceptability and implementation of the intervention. The link between the severity of symptoms and treatment acceptability was demonstrated for interventions, and the severity of need for a school-wide intervention was explored using office discipline referrals, suspensions, and academic status. More research is needed to determine whether these indicators influence the level of implementation of school-wide programs.

Socio-cultural Variables

When discussing variables that indicate a need for an intervention, it is important to consider their interaction with student, teacher, and school variables and to consider how the interaction of these variables influences the implementation of school-wide programs.


Variables such as SES, ethnicity, student mobility, school size, teacher:student ratio, teacher education, and teacher quality can influence the school-wide implementation of programs. While research on the specific influence of these factors on the implementation of school-wide programs is limited, there is an expansive body of research exploring the inter-relationships among these variables and the relationships between these variables and the behavioral and academic indicators of need for change.

Student behavioral indicators. The variables of socio-economic status and ethnicity will be discussed in terms of their relationship to behavioral indicators. For ethnicity, the preferred terms are white and other races (Office of Management and Budget [OMB], 1995). For socio-economic status (SES), the research refers to either the level of poverty or free or reduced lunch status as measures; consequently, these terms will be used interchangeably. It is important to note that the United States Department of Education (2002) defines schools with a high poverty rate as those having greater than 30% of students on free and reduced lunch and schools with a low poverty rate as those having less than 15%; a high minority enrollment is defined as greater than 50%, and a low minority enrollment as less than 10%.

The variables of socioeconomic and minority status are combined in this section because much of the research on disciplinary practices in schools describes similar results for these two variables. One reason may be that minority students attend schools with both a high concentration of other minority students and a high concentration of students on free or reduced lunch.


Approximately 30% of Black and Hispanic 4th graders attend a school with a 90% minority population (USDOE, 2004, The Condition), and 47% of Black students and 51% of Hispanic students attend high-poverty schools. In comparison, only 5% of White students attend high-poverty schools (USDOE).

Several studies have found that students from lower income households and from minority groups (except for Native American students) received more disciplinary actions (i.e., referrals, suspensions) than students from high income households and students in the majority group (Skiba et al., 1997; Townsend, Thomas, Witty, & Lee, 1996; Wu et al., 1982). For example, in one study, African-American students accounted for 53% of all suspensions but comprised only 28% of the student population (CPCOPS, 1992). Additionally, the consequences for violations of school rules and expectations have been found to differ for students of different racial groups. A review of the consequences in a large Florida district that uses corporal punishment found that White students received a higher rate of in-school suspensions and a lower rate of corporal punishment, while Black students received a lower rate of in-school suspensions and a higher rate of corporal punishment. The authors inferred that the results for Black and White students were related to race because they were not related to the severity of the punishment, as White students had a higher rate of defiance, fighting, and bothering others, which were the behaviors most typically resulting in corporal punishment. In contrast, McFadden, Marsh, Price, and Hwang (1992) found that the rates of in-school suspension, out-of-school suspension, and corporal punishment for Hispanic students were commensurate with their representation in the population.


This research indicates a link between the socioeconomic and minority status of individual students and the number of disciplinary violations; however, the research for school populations is less clear.

Earlier research from a large city in Britain found that the number of suspensions in a school did not correlate with the school's SES, size, or absentee rate (Galloway, 1976), while more recent research has found that the percentage of students receiving free and reduced lunch and the percentage of Black students in a school were positively correlated with a school's suspension rate (Raffaele Mendez, Knoff, & Ferron, 2002). However, research by McCarthy and Hoge (1987) found other predictors of punishment levels to be more salient. They did not find age, race, sex, SES, or home situation to be significant predictors of punishment levels; instead, they found teachers' perception of students' behavior, knowledge of recent academic performance, and past record of behavior sanctions to be statistically significant. The past record was the greatest predictor, followed by teachers' perceptions of past behavior and then by academic records. Therefore, minority and low socioeconomic status can be predictors of a higher rate of disciplinary actions, but the relationship may be influenced by these other factors as well.

Another factor to consider when reviewing research on minorities and discipline in schools is that some minority students, such as African-Americans, may perceive school rules as meaningless and controlling. Townsend (2000) observed teachers reprimanding African-American students for slouching even though the students were academically engaged.


When these African-American students receive a disciplinary action for something that may be meaningless to them, they may act confrontational. It is important, therefore, that school rules and expectations offer "meaningful codes that will influence students' quality of life" (Townsend, p. 385). Delpit (1995) calls attention to the fact that the rules in the education system reflect the culture of power, or the White culture. By rules, she refers to both the explicit rules and the informal rules of walking, talking, dressing, and acting. Explicitly telling these rules to minority students makes adhering to the rules easier for them. Because a critical element of PBS is the teaching and rewarding of expectations and rules, this program may be very effective for minority students.

Another finding relevant to this area of research is that programs targeting low-income schools may fail if they do not involve families and communities. One example is the failure of a five-year community program intended to improve the academic and health outcomes of disadvantaged urban youth and the social and economic conditions of the neighborhood; the failure was attributed not to the service system or the institutional changes but to the failure to include the home and community (Annie E. Casey, 1995). To support this point, high schools that included parents in a school-wide discipline plan had lower OSS rates than schools that did not (Raffaele Mendez, Knoff, & Ferron, 2002). Therefore, the inclusion of families in school-wide programs, particularly in schools with a low socioeconomic population, cannot be ignored; neither can socioeconomic and minority status be ignored when examining research on the implementation of school-wide discipline programs.


Student academic indicators. Donahue, Finnegan, Lutkus, Allen, and Campbell (2001) found that higher percentages of Black (63%) and Hispanic (58%) fourth-grade students were below grade level in reading than were White (22%) and Asian (27%) students. Linking other variables to academics, Raffaele Mendez, Knoff, and Ferron (2002) found that the percentage of students on free or reduced lunch, the percentage of Black students, and the mobility rate of students (the rate at which students moved to another school) were all negatively related to standardized achievement variables at the elementary and secondary levels. Johnson and Lindbald (1991) found that students with a higher mobility rate scored lower on a standardized assessment of achievement than peers who had not moved. It is not surprising that student mobility was found to be linked to achievement, as Martin (2004) found that highly mobile families in a Midwestern city also tended to live in areas with a lower median income.

Liechty (1996) found a significant positive relationship between mobility rates and teacher ratings of student behavior and between mobility rates and student achievement. The data for this study were derived from a review of academic and attendance data from the cumulative records of fourth-grade students from eight elementary schools in a medium-sized urban district. Teachers additionally completed a behavior checklist (i.e., the Teacher's Report Form) for each student in their class; the checklist measured adaptive functioning in the classroom. Mobility was defined as relocation to the school from another school. Therefore, mobility can account for a portion of the variance in academic and behavior problems.


From a program implementation standpoint, a high mobility rate would affect program implementation if the program procedures are taught only at the beginning of the year; students who enroll later would miss that training component. This stresses the importance of schools incorporating procedures for new students into their disciplinary plans.

School variables. It is important to consider school variables, such as school size and student:teacher ratio. To provide a framework for the research, the national average school size is 441 students for elementary schools, 612 students for middle schools, and 753 students for high schools (USDOE, 2004b). Schmuck and Runkel (1994) have found that large secondary schools often do not show readiness for change or readiness to participate in organizational development. Rose (1988) found that as the size of the school increased, the use of out-of-school suspensions and expulsions reported by principals increased. Therefore, a large school size may increase a school's need for SWPBS but also may inhibit the school's readiness to change.

With average class sizes across the country of 21.1 for public elementary schools and 23.6 for public secondary schools (NCES, 2003), elementary school students in classrooms with a smaller teacher:student ratio (i.e., 15-20 students per teacher) had greater academic gains than students in classrooms with a larger ratio (Molnar et al., 1999). While all students made gains, the gains were strongest for African-American students.


Teachers reported that these changes were mediated by a reduction or elimination of discipline problems, an increase in attention to individual needs, and student-centered learning activities. Teachers unanimously agreed that this academic gain was possible because they devoted less of their time to disruptive behavior and managing the classroom; in fact, they reported that classroom discipline problems were nearly eliminated (Molnar et al.; Finn & Achilles, 1999; Nye, 1999). Smaller math class sizes in middle and high schools also were found to result in less time spent on discipline and administrative tasks and more time spent on review of academic subjects (Betts & Skolnik, 1999). This research indicates that a smaller teacher:student ratio allows for more instructional time because there are fewer disciplinary problems to handle. Therefore, since student:teacher ratio is linked to fewer disciplinary problems, it is important to determine if this variable influences the implementation of SWPBS, which focuses on decreasing the amount of time spent on disciplinary problems in a school.

Teacher variables. According to a national poll conducted by Recruiting New Teachers, Inc., nine out of ten Americans believe that qualified teachers are the best way to raise student achievement (Fideler, Foster, & Schwartz, 2000). Two measures are considered important indicators of teacher quality: the percentage of teachers with an advanced degree and the percentage of out-of-field teachers. It has been reported that 52% of teachers nationally have a bachelor's degree, 42% have a master's degree, about 5% have a specialist's degree, and less than one percent have a doctoral degree (NCES, 2003). An "out-of-field teacher" is defined as a teacher who does not have a major or certification in the subject area that he or she teaches (USDOE, 2003).


The presence of less qualified teachers and more out-of-field teachers can influence both the quality of instruction and the quality of implementation of school-wide programs. In a study of factors related to treatment integrity and outcomes of consultation, consultee education level was positively related to the recording of target behavior, an indicator of treatment integrity (Wickstrom, 1996). This finding indicates that teachers with a higher level of education are more likely to implement an intervention with integrity.

Teacher quality, as measured by teacher education and certification, also is positively related to student achievement (National Center for Education Statistics, 2001). Based on a sample of 900 teachers in Texas, teacher education level (i.e., master's degree) and expertise, as measured by both scores on a licensure exam and experience, accounted for 40% of the variance in students' reading and mathematics achievement scores when socioeconomic status was controlled (Ferguson, 1991). However, schools with a high percentage of students from a low socioeconomic status and minority population tend to have less qualified teachers who are inadequately prepared and fewer teachers with a master's degree (Darling-Hammond & Post, 2000; USDOE, 2001, 2002). Low-income schools have a more difficult time attracting teachers with higher cognitive ability (Ferguson, 1991). In addition, 15-21% of elementary and middle schools with a large low-income population have a higher percentage of teachers with less than three years of experience, as compared to only 8-9% of higher income schools (USDOE, 2001).


Therefore, the presence of less qualified teachers in schools with a lower income and higher minority population will likely affect both student achievement and program implementation.

In addition, schools with a higher percentage of minority students and students on free or reduced lunch were found to have more out-of-field teachers in 1999-2000 (USDOE, 2004a). In low-SES schools, 12% of teachers were found to have emergency certificates and 18% were out of field, as compared to only 1% in higher-SES schools (USDOE, 2001). There are currently shortages of teachers in all subject areas in urban high schools, particularly in math and special education (Fideler et al., 2000), but middle school students are more likely to be taught by out-of-field teachers than are high school students (USDOE, 2004, The Condition). Therefore, students in middle schools and in schools with a low-SES and/or high-minority population are more likely to be taught by less qualified teachers.

Although there is a significant amount of research on the influence of teacher quality on student achievement and of school demographics on teacher quality, there is less research on the impact of teacher quality on program implementation. While there was a link between teacher education and treatment integrity, more research must be conducted in this area.


ratio and disciplinary actions; socio-economic status and academic achievement; ethnicity and academic achievement; socio-economic status and teacher quality; and teacher quality and academic achievement. The interactions of these variables, therefore, must be considered when evaluating both the implementation and impact of school-wide programs.

Summary

This section described the relationship between school-wide programs, such as SWPBS, and enablers and barriers to the implementation of SWPBS: administrative support, coaching, team process variables, the need for an intervention, and socio-cultural variables. Because all of these factors have been found to either directly or indirectly influence implementation, it is important to know the direct impact of these variables on implementation. With this knowledge, PBS trainers and state agencies would be able to determine which factors to emphasize or de-emphasize, both in selecting schools and in the training and support for the selected schools.

Influence of Factors on Implementation

While no literature could be found that examined the influence of general implementation factors, the severity of need for an intervention, demographic factors, and socio-cultural factors on the implementation of SWPBS, Cooper (1998) studied the influence of socio-cultural factors on the implementation of Success for All, a comprehensive school-wide intervention to improve the academic success of all children. The predictors examined in this study were SES, school size, mobility rate, year of implementation, racial


composition, urbanicity, and community size. Poverty level was determined by grouping the percentage of students who received free or reduced lunch into four groups: low, medium, high, and extreme. School size was measured by enrollment figures. Mobility rate was determined by the number of students who transferred during the course of the year. The year of implementation was determined by the number of years in the program. The racial make-up was determined by the percentage of white or non-white students. Urbanicity was determined by the facilitators' indication of urban, suburban, or rural, and community size was determined by the facilitators' rating of the community: inner city, big city, moderate-size city, small town, or other.

Cooper (1998) conducted a multiple regression analysis and found that lower student mobility, higher student attendance rates, and a higher percentage of white students predicted higher implementation of the program in schools. It is unclear, however, whether mobility rate was a positive or negative predictor of implementation because, counterintuitively, the author reported that low-implementing schools had a lower mean mobility rate than the moderate- and high-implementing schools. The non-significant predictors included poverty level, years of implementation, size of school, urbanicity, and size of community.

While interesting, these results should be interpreted with caution, as there is a concern about the measure of implementation: the level of implementation was based on a single question. The facilitators were asked to rate the level of implementation on a five-point scale ranging from "hardly evident, very poorly implemented or not implemented" to "thoughtful, creative,


enthusiastic implementation." Based on this one rating, the researcher divided the sample into high, middle, and low implementation groups. Interestingly, none of the facilitators chose a rating of one or two for level of implementation. It cannot be determined whether all of the sites were implementing the program at a moderate to high level or whether all the ratings were inflated because the facilitators did not want to give themselves a low rating on implementation.

Despite the questionable nature of the implementation rating, Cooper's (1998) second research question will be described, as it is similar to a question that will be examined in this study. Cooper also examined the relationship between implementation and enablers and barriers to implementation. The problem was that the development, administration, and scoring of the instrument used to measure enablers and barriers were not clearly described, nor could a description be found in the articles that Cooper referenced for the methodology (Cooper, Slavin, & Madden, 1997; Cooper, Slavin, & Madden, 1998). The brief description of the instrument indicated that respondents circled all the barriers and enablers that applied to their school; from these 56 items, the author developed a scale and derived nine factors using a factor analysis. Then, controlling for race, attendance, and mobility (the significant predictors), Cooper conducted a MANCOVA to determine if there was a difference between the high, middle, and low implementation groups on the enabler and barrier factors. Based on this analysis, the largest effects distinguishing high implementers from middle and low implementers were a more supportive culture and less program resistance. Again, as the psychometric properties of both instruments are questionable, these


results should be interpreted with caution.

In conclusion, while there were no psychometric properties provided for the instruments in this study, the methods provide a foundation for research linking socio-cultural and process variables to school-wide program implementation. There is a need for measures that provide reliable and valid scores of implementation and of influential implementation factors. Practitioners can then use this information to create more sophisticated strategies to promote successful program implementation and increase positive outcomes for students.


Chapter III

Methods

School-Wide Positive Behavior Support (SWPBS) is a collaborative, team-based approach that uses data-based interventions to build a positive school environment. It is intended to provide effective procedures for all students, staff, and settings and is often implemented in collaboration with classroom, targeted-group, and individual positive behavior support (OSEP, 2004). The purpose of this study is to identify factors that influence the implementation of SWPBS, as well as to determine enablers and barriers to implementation. The intended outcome is to generate information that will be helpful in selecting schools for future implementation and in understanding which factors might be addressed to enhance implementation in less successful schools. This chapter outlines the procedures, instruments, and analyses that were used to answer the research questions.

Setting

Overview of SWPBS

School-Wide Positive Behavior Support (SWPBS) is part of a larger initiative called Positive Behavior Support (PBS). The intention of PBS is to systemically enhance schools' capacity to adopt and sustain effective discipline practices and empirically-based interventions (Lewis & Sugai, 1999). SWPBS is


the application of PBS at the school-wide level for all students. Currently, SWPBS has been implemented in 4,600 schools in the United States (Horner & Sugai, 2006, March), including 161 schools located in 24 of the 67 county school districts in Florida (FLPBS, 2004, Fall).

In Florida, SWPBS is implemented through a statewide project called Florida's Positive Behavior Support Project (FLPBS), which is administratively housed in the Louis de la Parte Florida Mental Health Institute of the University of South Florida in Tampa, Florida. FLPBS "provides training and technical assistance to school districts in the development and implementation of positive behavior supports at the school-wide, classroom, targeted group and individual student levels" (Florida's Positive Behavior Support Project, 2004, p. 1). FLPBS is administered by eight staff members, with each staff member being responsible for coaching and supporting schools in one to five school districts that are implementing SWPBS.

Recruitment to SWPBS

The FLPBS Project staff engages in a multi-step process annually for the purpose of recruiting new schools into the project. Recruitment of school districts can occur either informally or formally. Informally, school districts learn about SWPBS through professional networks, conferences, and the PBS Newsletter. Formally, the FLPBS staff disseminates information packets about SWPBS to all school districts in Florida and, upon request, presents an in-person overview of the project. FLPBS staff members then meet with interested districts to identify a district coordinator and leadership team. The staff member, district coordinator,


and team then complete the District Readiness Checklist for the Leadership Team (see Appendix G). The Checklist guides districts through a series of steps considered necessary to support the implementation of SWPBS in schools; districts must meet each criterion on the checklist before they can invite schools to participate in SWPBS training. As part of the Readiness Checklist process, the SWPBS staff assists the districts in establishing district-wide goals for the project using the District Action Planning Process Form (see Appendix H).

There are various procedures by which individual schools are selected to participate in SWPBS. Some districts select schools through an application process, others strongly encourage specific schools to participate, and still other districts allow all interested schools to participate. Once potential schools are selected by the district, they are required to complete a School Readiness Checklist (see Appendix I) to document their commitment to engage in the SWPBS process. Similar to the District Readiness Checklist, schools must meet all criteria on the checklist before participating in training.

Coaches

Facilitators or coaches are considered necessary to monitor innovations and interventions and to refocus the change process in organizations (Hall & Hord, 2001). Consequently, each SWPBS school is provided with a coach to assist in SWPBS implementation. Districts assign coaches to schools as part of the District Readiness Checklist process and also provide funding for their role as a coach. Coaches are supported by their district coordinator with assistance from the FLPBS staff member assigned to his/her district. Coaches attend an annual


coach's training and implementers' forum to receive guidance for their role as a coach. Their roles and responsibilities include attending all SWPBS team meetings and events, serving as a team facilitator, providing assistance and support to the PBS team, monitoring the progress of PBS school teams, shadowing PBS project staff during visits (FLPBS, 2003-2004), and completing mid-year and end-of-year evaluation reports.

SWPBS Teams

A school-based leadership team is required to lead SWPBS efforts (FLPBS, 2003-2004). The SWPBS team is responsible for planning and conducting implementation procedures and for introducing SWPBS to the school staff. Each participating school selects a team as part of the completion of the School Readiness Checklist. The School Readiness Checklist includes a criterion that the team must include a broad representation of school staff, such as a member(s) from the School Improvement Team, a behavior specialist or member with behavior expertise, an administrator, a school counselor, and general and special education teachers. In addition, teams are asked to document a commitment to meet monthly. The purpose of the monthly meetings is to review and analyze data, develop strategies to address any existing problems, and develop an action plan to implement SWPBS components (FLPBS).

Training

The SWPBS team and coach from each school attend a three-day training session during the summer prior to implementation. Each attendee receives $375 for completing all three days of the training program. The foundation of the three-


day training was developed by George Sugai at the University of Oregon (Sugai, personal communication, 2002). The FLPBS staff further developed the training into a comprehensive presentation that includes 400 research-based slides. The training is presented in a lecture format that is complemented by activities and work periods to develop and complete an action plan for implementation. The complete training program includes nine modules (FLPBS, 2003-2004), which are presented by the SWPBS staff, experienced coaches, and district coordinators. Each module is described briefly below.

The first module addresses team/faculty buy-in (slides 50-74). Because the commitment of faculty is presumed to be a foundation for the successful implementation of SWPBS and the successful decrease of problem behavior, the objective of this module is to teach teams how to acquire at least 80% faculty buy-in to SWPBS. This module provides strategies to secure commitment, such as sharing data with faculty, conducting staff surveys, and developing an "election" process to involve faculty. Trainers also teach teams techniques for presenting the basics of behavioral principles to faculty.

The second module addresses the establishment of a data-based decision-making system (slides 109-208). The objectives for this module are for team members to understand why data collection is important, to be able to operationally define problem behaviors, to develop a referral form and process, to identify whether behaviors are managed in the office (major referrals) or the classroom (minor referrals), to understand how to use a system for collecting and reporting data [e.g., School-Wide Information System (SWIS II)], and to


understand how to use the data for decision-making. FLPBS recommends that data (i.e., major and minor office discipline referrals) be entered daily and analyzed monthly, and that the referral form include the student's name, teacher, grade level, description of the problem behavior, possible motivation for the problem behavior, others involved, the date and time of the incident, its location, and the resulting consequence or administrative decision.

The third module focuses on the development of a school crisis plan (slides 209-226). The objectives of this module are for teams to define crisis incidents, develop a crisis plan, develop strategies to train staff on the crisis plan, and connect the crisis plan with the PBS plan.

The fourth module emphasizes the development of effective consequences (slides 227-248). The objectives are for teams to develop a continuum of effective procedures and consequences for problem behaviors. A behavior problem is defined as a violation of the school-wide expectations that will be described in the next module. Teams must identify where problem behaviors will be managed and develop a hierarchy or flow chart for responding to them. To ensure that the procedures are consistent for all children, the plan should include a list of administrative decisions that coincide with problem behaviors.

The fifth module addresses the development of school-wide expectations and rules (slides 249-276). The objectives of this module are for teams to develop three to five positive school-wide expectations and setting-specific rules that coincide with those expectations. The expectations should be applicable to


all settings, but the rules should be developed for each specific setting (e.g., cafeteria, classrooms). FLPBS recommends the development of a maximum of three to five rules for each setting.

The sixth module emphasizes the development and teaching of lesson plans on the expectations and rules (slides 277-305). The objectives are for teams to identify techniques for teaching expectations and rules, to develop lesson plans for teaching expectations and rules, and to identify techniques for embedding the lesson plans into the curriculum. Sample lesson plans are provided.

The seventh module focuses on the development of a school-wide reward/incentives program (slides 306-325). The objective is to develop a school-wide reward system using the reward system guidelines that are provided. The guidelines include the following: keep the plan simple; include recognition/rewards for students in common areas; encourage expectations during the daily announcements; reward desired behaviors frequently in the beginning; offer rewards contingent on desired behavior; refrain from taking rewards from students after they have earned them; include students in the development of the system; and maintain a ratio of four reinforcements for each correction.

The eighth module trains the team on techniques to implement the plan and to train the faculty, students, and families in the plan (slides 326-364).

The ninth module focuses on the evaluation of PBS efforts (slides 365-385). The objective of this module is for teams to create systems for evaluating the plan


development, implementation, and outcomes.

Participants

The participant pool for this study consisted of the SWPBS school teams (N=161) that had been trained in the implementation of SWPBS since 2002. These schools were located in 24 county school districts. The participating schools attended training during the summer of 2002, 2003, or 2004. The schools trained in 2002 will be called third year schools; those trained in 2003 will be called second year schools; and those trained in 2004 will be called first year schools.

The schools included elementary schools, middle schools, high schools, and alternative/special education centers (see Table 2 for descriptive data for BoQ scores). As SWPBS is individualized according to the needs of each school, the school type should not greatly influence implementation. For the group of schools that returned the BoQ, a one-way analysis of variance was conducted to determine if there was a difference between the BoQ scores of the elementary schools (n=37), middle schools (n=30), high schools (n=9), and center/other types of schools (n=15) in this sample. There was not a significant difference between the BoQ scores of these schools, F(3, 81)=.465, p=.70. As there was a small sample size for the high school and center/other schools, a second ANOVA was conducted between elementary schools and middle schools to ensure the accuracy of the analysis. There was no significant difference in implementation between these schools either, F(1, 65)=1.36, p=.25. See Tables 3 and 4.


Table 2
Summary of BoQ Scores by Type of School and Year of Implementation

                   Sample Size                  Range of Scores
Type               1st   2nd   3rd   All Years  1st     2nd     3rd     All Years
Elementary         24    11    2     37         4-99    45-98   80-89   4-99
Middle             18    11    1     30         26-94   39-97   50      26-97
High               5     3     1     9          36-67   63-79   76      36-79
Center/Other       10    3     2     15         51-92   40-82   77-88   40-92
All Types          57    28    6     91         4-99    39-98   50-89   4-99

                   Mean                            Standard Deviation
Type               1st     2nd     3rd     All     1st     2nd     3rd     All
Elementary         64.46   76.64   84.50   69.16   25.62   14.52   6.36    22.90
Middle             62.28   65.73   50.00   63.13   20.03   16.95   NA      18.53
High               57.00   70.00   76.00   63.44   13.73   18.19   NA      13.14
Center/Other       69.00   66.33   82.50   70.27   15.55   22.94   7.78    16.12
All Types          63.91   70.54   76.67   67.89   21.29   15.99   14.17   19.70

Note. For cells with n=1, the single obtained score is shown in place of a range.

Table 3
Analysis of Variance for Implementation by Type of School

Source           SS         df   MS       F      p
Between groups   561.12     3    187.04   0.47   0.71
Within groups    32560.13   81   401.98
Total            33121.25   84

Table 4
Analysis of Variance for Implementation by Elementary and Middle Schools

Source           SS         df   MS       F      p
Between groups   602.16     1    602.16   1.36   0.25
Within groups    28826.49   65   443.48
Total            29428.66   66
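For readers who wish to reproduce this kind of comparison, the following is a minimal sketch of a one-way ANOVA on BoQ scores grouped by school type, written in Python with SciPy. It is an illustration only; the score lists are hypothetical placeholders, not the study's data, and the original analyses were presumably run in a statistical package such as SPSS.

```python
# Hypothetical sketch of the one-way ANOVA comparing BoQ scores by school
# type; the score lists below are illustrative, not the study's data.
from scipy import stats

elementary = [64, 80, 45, 92, 71]   # BoQ totals for elementary schools
middle     = [62, 55, 78, 66]       # BoQ totals for middle schools
high       = [57, 63, 70]           # BoQ totals for high schools
center     = [69, 82, 51]           # BoQ totals for center/other schools

# f_oneway returns the F statistic and its p-value under the null
# hypothesis that all group means are equal.
f_stat, p_value = stats.f_oneway(elementary, middle, high, center)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```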


The number of schools that returned a BoQ is presented in Table 5 by type (elementary, middle, high, center) and by year of implementation. Descriptive data for the schools that returned a BoQ are presented in Appendix J. Appendix J also includes, for comparison, descriptive data for the schools that were trained in SWPBS but did not return a BoQ, as well as the average scores on the descriptive variables for all Florida schools. For further comparison, the data on Florida schools are presented by elementary, middle, and high school in Appendix K.

To determine if there was a difference between the schools that returned their BoQ and the schools that did not, two one-way between-groups multivariate analyses of variance (MANOVAs) were performed to investigate differences between the responding sample and the non-responding sample on the demographic variables and the severity-of-need variables. For the demographic variables, there was not a statistically significant difference between the responding and the non-responding schools: F(8, 127)=1.927, p=.061, Wilks' Lambda=.89. For the severity-of-need variables, there was not a statistically significant difference between the responding and the non-responding schools: F(4, 92)=1.33, p=.263, Wilks' Lambda=.945.

Table 5
Sample Size by Year of Implementation

                         Year
Type         First (%)   Second (%)   Third (%)   Total
Elementary   24 (64%)    11 (29%)     2 (5%)      37
Middle       18 (60%)    11 (36%)     1 (3%)      30
High         5 (55%)     3 (33%)      1 (11%)     9
Center       10 (66%)    3 (20%)      2 (13%)     15
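As an illustration of this responder/non-responder check, here is a hedged sketch of a one-way MANOVA using Python's statsmodels. The data frame and all column names (returned, enrollment, pct_disability, pct_minority) are invented stand-ins for the study's archival variables, not its actual data.

```python
# Hypothetical sketch of the MANOVA comparing responding vs. non-responding
# schools on several demographic variables; column names are invented.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.DataFrame({
    "returned":       ["yes", "yes", "no", "no", "yes", "no"],
    "enrollment":     [812, 640, 955, 720, 530, 880],
    "pct_disability": [14.2, 18.5, 12.0, 16.3, 19.1, 13.7],
    "pct_minority":   [42.0, 55.3, 38.9, 61.2, 47.5, 52.8],
})

# mv_test() reports Wilks' lambda among other multivariate statistics.
manova = MANOVA.from_formula(
    "enrollment + pct_disability + pct_minority ~ returned", data=df)
print(manova.mv_test())
```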


Instrumentation

School-Wide Benchmarks of Quality (BoQ)

Instrument content. The School-Wide Benchmarks of Quality (BoQ; FLPBS Project, 2005) instrument is a 53-item rating scale that measures the degree to which a school is implementing SWPBS (see Appendices A, B, and C). It was developed as a self-evaluation tool for school teams to use to review their progress towards implementing the critical elements of PBS. The critical elements comprise the ten subscales of the instrument: PBS Team, Faculty Commitment, Effective Discipline Procedures, Data Entry and Analysis, Expectations and Rules, Reward System, Teaching Expectations, Implementation Plans, Crisis Plans, and Evaluation.

Instrument development. The BoQ was developed in four stages: qualitative item development, expert panel review, cognitive interviewing, and a pilot study. For qualitative item development, the SWPBS staff members developed the items based on the training goals and objectives for each module (FLPBS, 2003-2004). An expert panel of 10 key people in the SWPBS field then reviewed and ranked the items in order of importance. These rankings were used to determine the point value of each item. The next stage involved cognitive interviewing, a technique used to find sources of response error in survey questions by asking survey respondents to think aloud while responding to each item (Willis, 1999). The think-aloud technique allows the interviewer to determine whether the respondents are interpreting the items as intended. Three SWPBS coaches were selected from different counties (Polk, Indian River, and Leon) to


participate in cognitive interviewing. They represented different genders and races: a white female, a black female, and a white male. The coaches were trained in the procedures and then asked to "think aloud" while completing the BoQ for their respective schools. The interviewer probed the coaches to clarify any unclear items or responses, which were then reviewed and revised. The revised instrument was then piloted with 10 SWPBS teams. The SWPBS teams completed the instrument for their schools and provided feedback on any unclear or irrelevant items or procedures, which were then revised.

Instrument administration and scoring. There are three BoQ documents: a Benchmarks of Quality Scoring Form (Appendix A), a Benchmarks of Quality Scoring Guide (Appendix B), and a Team Member Rating Form (Appendix C). The coach first completes the Scoring Form using the Scoring Guide. The Scoring Guide provides operational definitions for the items and an explanation of the scoring for each item. The team members then individually complete the Team Member Rating Form, a simplified version of the coach's Scoring Form that does not require the scoring guide. The raters instead indicate whether each item is "not in place," "needs improvement," or "in place." After both the coach and team have independently completed their forms, the coach compares his/her ratings to the team's ratings and discusses any discrepancies with the team. If the team provides a good rationale to increase or decrease an item's score, the coach can change the score. The coach, however, makes the final scoring decisions.
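The coach-team reconciliation step lends itself to a simple tabulation. The sketch below, with invented item labels and ratings, flags BoQ items on which the coach's rating disagrees with the team's most common rating; it merely illustrates the process described above and is not an FLPBS tool.

```python
# Hypothetical sketch: flag BoQ items where the coach's rating differs from
# the team's modal rating. Item names and ratings are invented examples.
import pandas as pd

# Rows = team members, columns = items; values are "not in place",
# "needs improvement", or "in place".
team = pd.DataFrame({
    "item_1": ["in place", "in place", "needs improvement"],
    "item_2": ["not in place", "needs improvement", "needs improvement"],
})
coach = {"item_1": "in place", "item_2": "in place"}

for item in team.columns:
    team_mode = team[item].mode()[0]  # team's most common rating
    if coach[item] != team_mode:
        print(f"{item}: coach says '{coach[item]}', "
              f"team mode is '{team_mode}' -- discuss with the team")
```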


To administer this instrument as described above, SWPBS coaches were trained on its administration during a coach's training session in January 2005. The training included an explanation of the instrument and a practice session with a fictitious school. Coaches who did not attend the training session were instructed to view a training CD that included a PowerPoint presentation of the training.

The BoQ has a total possible score of 100. This score is derived from the 3-8 items in each of the ten subscales. Each item is worth between one and three points; the items are summed to obtain a total score, as well as subscale scores.

Psychometric properties. The Cronbach's alpha coefficient for all the items on the instrument was .96, indicating good internal consistency. All but one of the Cronbach's alpha coefficients for the subscales was greater than .70. Measures of central tendency (mean, median, standard deviation, range) were calculated for the total score, subscale scores, and items for the 91 schools that completed the BoQ for the 2004-2005 school year. The mean was 67.89 with a standard deviation of 19.7, and the median score was 71. The scores ranged from 4 to 99.
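Cronbach's alpha can be computed directly from an items-by-respondents matrix using the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below shows this computation in Python; the item-score array is a hypothetical placeholder, not BoQ data.

```python
# Minimal sketch of Cronbach's alpha; the item-score matrix is invented.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = respondents (schools), columns = scale items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

scores = np.array([
    [2, 3, 1, 2],
    [3, 3, 2, 3],
    [1, 2, 1, 1],
    [2, 2, 2, 3],
])  # 4 hypothetical schools x 4 hypothetical items, each scored 1-3
print(f"alpha = {cronbach_alpha(scores):.2f}")
```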


The distribution of scores is presented in the histogram in Figure 4.

[Figure 4. Histogram of BoQ total scores, n=91. X-axis: BoQ total score (0-100); y-axis: frequency of BoQ scores.]

The mean, standard deviation, and Cronbach's alpha coefficient for each subscale are presented in Table 6. Table 7 provides a ranking of the means for each subscale. In this table, the mean of each subscale was divided by the total possible points to obtain a mean percentage of the possible total. The subscale with the highest mean was "Effective procedures for dealing with discipline" (M=85%, SD=21%), and the subscale with the lowest mean was "Lesson plans for teaching expectations" (M=46%, SD=33%).

To examine intra-rater reliability, or the correlation between ratings from the same respondent at two different points in time, all coaches who completed the BoQ were asked to complete a second BoQ within two weeks of completing it the first time. Seventeen coaches returned a second BoQ, and data were analyzed from these 17 schools. Intra-rater reliability was observed to be r=.978. Additionally, correlations were conducted on each of the subscales from time one to time two. Results ranged from r=.63 to r=.98.
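A sketch of the intra-rater (test-retest) correlation using SciPy's Pearson correlation follows; the two score lists stand in for the coaches' first and second BoQ totals and are invented.

```python
# Hypothetical sketch of the test-retest (intra-rater) reliability check;
# the paired BoQ totals below are invented, not the study's data.
from scipy import stats

boq_time1 = [67, 82, 45, 90, 71, 58, 76]  # first administration
boq_time2 = [65, 85, 48, 91, 70, 60, 74]  # same coaches, two weeks later

r, p = stats.pearsonr(boq_time1, boq_time2)
print(f"intra-rater reliability r = {r:.3f} (p = {p:.4f})")
```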


Table 6
Means, Standard Deviations, and Cronbach's Alpha Coefficients for the BoQ Subscales

    Subscale Name                                       Alpha   Poss. Total   M       SD
 1  PBS Team                                            .43     7             5.91    1.26
 2  Faculty Commitment                                  .75     6             3.37    1.60
 3  Effective Procedures for Dealing with Discipline    .81     12            10.23   2.51
 4  Data Entry & Analysis Plan Established              .74     9             6.45    2.21
 5  Expectations & Rules Developed                      .76     11            8.35    2.53
 6  Reward/Recognition Program Established              .87     17            10.79   4.41
 7  Lesson Plans for Teaching Expectations/Rules        .87     9             4.13    3.00
 8  Implementation Plan                                 .79     13            6.92    3.71
 9  Crisis Plan                                         .83     3             2.51    0.96
10  Evaluation                                          .83     13            8.12    3.44

Note. N=91.

Table 7
Mean Total Points for Each BoQ Subscale

Subscale                                            Mean %
Effective Procedures for Dealing with Discipline    85%
PBS Team                                            84%
Crisis Plan                                         84%
Expectations & Rules Developed                      82%
Data Entry & Analysis Plan Established              72%
Reward/Recognition Program Established              63%
Evaluation                                          62%
Faculty Commitment                                  56%
Implementation Plan                                 53%
Lesson Plans for Teaching Expectations/Rules        46%

Note. N=91. Mean percentage was derived by dividing the subscale mean by the total possible points for the subscale.

To compute inter-rater reliability, or the correlation between scores from two raters at the same point in time, 19 individuals other than the coach who were familiar with the school (e.g., an external coach or district coordinator) completed the Benchmarks within the two-week period in which the coach completed them. The second rater used the team members' ratings and their own


experience with the school to complete the instrument. A Pearson product-moment correlation was conducted on the scores from both individuals, and the results indicated a high correlation of r=.864.

To determine concurrent validity, the relationship between the current instrument and an instrument that has demonstrated good psychometric properties, the School-wide Evaluation Tool (SET; Horner, Todd, Lewis-Palmer, Irvin, Sugai, & Boland, 2004), was examined. The SET is an observation and interview tool that is used extensively to measure SWPBS implementation. The item scores are derived either from interviews with the administrator, team members, staff, and students or from observations of PBS permanent products. Each item on the SET is scored 0, 1, or 2 (0=not implemented, 1=partially implemented, and 2=fully implemented). There are 28 items representing seven key features of SWPBS that comprise the subscales. These seven subscales on the SET are similar to the subscales on the BoQ; however, the SET includes an additional section on district support, and the BoQ includes three additional sections on lesson plans, crisis plans, and evaluation. Overall, the two instruments cover similar content but use a different method for obtaining the responses.

At 29 Florida SWPBS schools that also completed the BoQ, an evaluator from USF completed the SET. The administration of the SET was scheduled approximately within the two-week period in which the BoQ was completed. The implementation of PBS is purported to remain relatively stable over a two-week period, as most teams meet monthly and it is typically at the monthly meetings that the teams make changes to the implementation of PBS. To determine the


validity, the total scores on the BoQ were correlated with the total scores on the SET using a Pearson product-moment correlation, and the results indicated r=.450.

School-Wide PBS Implementation Factors Survey (SWIF)

Instrument content. The purpose of the SWIF (Cohen, Childs, & Kincaid, 2005) (Appendix L) is to determine coaches' and team members' perceptions of the degree to which a range of factors influence implementation of SWPBS. It contains 60 statements describing potential factors that were helpful (enablers) or problematic (barriers) in the implementation of SWPBS. The respondent is asked to rate how problematic or helpful each item has been in the implementation of SWPBS using a five-point Likert rating scale: "problematic," "somewhat problematic," "no influence," "somewhat helpful," or "helpful." There also are open-ended questions in which the respondent is asked to identify two additional factors that have been helpful and two additional factors that have been problematic in implementation. Upon completion, the respondent is asked to provide information about his/her position with the PBS project, position in the school, highest degree attained, field of study, number of years in the current school, type of school, number of years the school has been involved in PBS, and approximate number of students enrolled in the school.

Instrument development. The SWIF was developed in three stages: item generation, expert panel review, and pilot test. The items were generated from the content of a nominal group process (Dunham, 1998) facilitated by the National Implementation Research Network (NIRN) held during the SWPBS


Implementer's Forum in July 2004 in Orlando, FL. The participants in the nominal group process were SWPBS team members representing 30 schools and 13 districts in Florida. These individuals participated in small groups to identify barriers and enablers to successful implementation of School-Wide PBS. Each group was asked, "What have been the barriers (or enablers) to implementing School-Wide Positive Behavior Support in your school or district?" The groups generated a list of barriers and enablers and then rated the top 10 responses. Similar responses across groups were combined into categories and were used to generate items. For example, one category was "having enough time." As this category could refer to several people, the items "teachers having enough time," "coaches having enough time," and "administrators having enough time" were generated.

An expert panel of three professors from the University of South Florida reviewed the items and provided suggestions for their improvement. The panel included experts in measurement, organizational development, and School-Wide Positive Behavior Support. There were two rounds of pilot testing to ensure the clarity of the directions, items, and administration. Four PBS coaches were asked to complete the survey online and provide feedback on the directions, format, structure, and items of the survey. The coaches were provided with a paper copy of the survey, which they used to edit the items. The paper copies of the survey were reviewed, and the editorial changes were made to the survey. Four more coaches or people familiar with PBS were asked to provide additional feedback on the strengths and problems with the directions, formatting, items,


and online administration. The feedback is summarized in Appendix M. Further revisions were then made with this feedback for the final version of the instrument.

Scoring and administration. The SWIF was administered online. Internet-based surveys are advantageous for convenience samples, such as this one, and for research with organizations that have a list of email addresses for potential participants (Schonlau et al., 2002). To score the SWIF, point values were assigned to each rating on the five-point Likert scale: "problematic" was assigned one point; "somewhat problematic," two points; "no influence," three points; "somewhat helpful," four points; and "helpful," five points.

Psychometric properties. The Cronbach's alpha coefficient for all the items on the instrument was .97, indicating good internal consistency. Measures of central tendency (mean, median, standard deviation, range) for the total, subscale, and item scores are presented in the results for research question seven in Chapter Four. Additional measures of reliability and validity (i.e., test-retest reliability, Cronbach's alpha coefficients for factors, factor analysis) are presented in the results for research question five in Chapter Four.

Team Process Survey

The Team Process Survey is an 18-item evaluation tool intended to measure the PBS team members' perception of their team functioning and effectiveness (see Appendix D for the instrument). The instrument was developed by the FLPBS project with items related to team functioning and


processes. To complete the instrument, all team members were asked to rate the items from strongly disagree (1) to strongly agree (5). There are a total of 90 possible points. An average score for each team was derived to represent each school's overall score. For the psychometric properties of this instrument, a Cronbach's alpha coefficient was calculated for each of the two scales that were derived from this instrument: team functioning (items 1-7, 9, 10) and administrative support (items 11-15). To calculate these coefficients, the items from the entire sample of team members were used (n=577 for team functioning, n=581 for administrative support) rather than the average item score for each team. The Cronbach's alpha coefficient for team functioning was .91, and the coefficient for administrative support was .78.

Coach's Self-Assessment

The Coach's Self-Assessment is an 8-item instrument intended to measure the coach's level of perceived competency, or self-efficacy. The items on the instrument closely match the consultant skills recommended by Lewis and Newcomer (2002) as necessary to build and sustain SWPBS: mastery of universal SWPBS elements, fluency in basic behavior principles, ability to train others in these strategies, establishment of a school-wide data collection system, and ability to provide technical assistance to the team and build communication between team and schools (see Appendix E for the instrument). The Coach's Self-Assessment survey was developed by the FLPBS staff. To complete this instrument, coaches rate each item with a score of one to three. A score of one indicates that the coach is learning the skill, two indicates that the


coach is building the skill but is not fluent, and three indicates that the coach is fluent or has mastered the skill. The total possible score is 24. A higher score indicates that the coach feels he/she is fluent in these skills. For any missing items, the item mean replaced the missing value so the total score would not be skewed by that item. To determine the internal consistency of the items for this sample, a Cronbach's alpha coefficient was calculated using a sample of 79 coaches who returned a complete set of responses. The Cronbach's alpha coefficient was .89 for this sample.

Research Questions

This section presents the purpose, design, variables, and instruments for each of the research questions addressed in this study. The analyses and results for each question will be presented in Chapter IV. A summary of the research questions, hypotheses, variables, analyses, and results is presented in Appendix N.

Research Question One: Are there differences in the perceived levels of implementation of School-Wide Positive Behavior Support between schools in their first, second, and third year of the implementation process?

Purpose

Interventions/programs often take 3-5 years to implement (Hall & Hord, 2001); thus, it is expected that schools that have been involved in SWPBS for a longer period of time will have a higher level of implementation. To determine whether implementation level increases over the years for SWPBS, the perceived levels of implementation of schools in their first, second, and third years of


implementation were compared.

Design

A correlational design (Fraenkel & Wallen, 2003) was used for the analysis because the independent variable (year of implementation) was not manipulated and the grouping occurred prior to the study. The dependent variable in this question is the total score on the Benchmarks of Quality (BoQ) instrument, which indicates a school's level of implementation. The total score on the BoQ was derived from the sum of the coach's ratings on the BoQ Scoring Form. The unit of measurement is the school.

Research Question Two: What is the relationship between socio-cultural school factors (i.e., socio-economic status, ethnicity, school size, teacher:student ratio, student stability, percentage of students with a disability, percentage of teachers with an advanced degree, percentage of out-of-field teachers) and perceived level of School-Wide Positive Behavior Support implementation?

Purpose

SWPBS is expanding rapidly to schools across the country; therefore, it is important to identify whether different characteristics of schools are related to implementation. The socio-cultural variables selected for this study are those that must be considered to individualize SWPBS implementation. As SWPBS is individualized for each school, it is hypothesized that these demographic variables will not greatly influence the level of SWPBS implementation.

Design

To answer Research Question Two, a correlational design was used


because the independent variables were not manipulated, and both the independent and dependent variables were continuous data. The independent variables included demographic variables obtained from archival data from the Florida Department of Education, including socio-economic status, ethnicity, school size, student:teacher ratio, percentage of students with a disability, stability of students, percentage of teachers with an advanced degree, and percentage of out-of-field teachers. The dependent variable was the total score on the BoQ.

Research Question Three: What is the relationship between implementation process factors (i.e., effective team functioning, administrative support, and coach's self-efficacy) and perceived level of School-Wide Positive Behavior Support implementation?

Purpose

It is important to understand which general PBS factors (i.e., administrative support, team functioning, and coach's self-efficacy) can influence implementation. Since these variables are considered important to PBS implementation, it is hypothesized that higher scores on these variables will predict a higher level of implementation.

Design

Similar to question two, this question was answered using a correlational design (Fraenkel & Wallen, 2003). The independent variables in this question are scores on surveys measuring administrative support, team functioning, and coach's self-assessment. The dependent variable is the score on the


Benchmarks of Quality (BoQ), indicating level of implementation.

The team functioning variable was derived from the relevant items of the Team Process Survey (Appendix D). The team functioning score was the average total score of all team members for each school on items 1-7, 9, and 10. Item scores range from 1 to 5, indicating strongly disagree (1) to strongly agree (5). Total scores range from 9 to 45. Missing items from individual team members' surveys should not affect the score because the score that was used was an average of all the team members' scores.

The administrative support variable was derived from the items of the Team Process Survey related to administrative issues. The administrative support score was the average total score of all team members for each school on items 11-15. Total scores range from 5 to 25. Missing items from individual team members' surveys should not affect the score because the score that was used was an average of all the team members' scores.

The coach self-assessment variable was obtained from the eight items on the Coach Self-Assessment survey (see Appendix E). The total possible score was 24. A higher score indicated that the coach feels he/she is fluent in these skills.
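A minimal sketch of how these school-level scores could be assembled from individual responses, using pandas, follows. The data frame, column names (school, item_1, ...), and values are hypothetical stand-ins, not the Team Process Survey data, and only two items per scale are shown for brevity.

```python
# Hypothetical sketch of deriving school-level team functioning and
# administrative support scores; column names and values are invented.
import pandas as pd

responses = pd.DataFrame({
    "school": ["A", "A", "B", "B", "B"],
    # items 1-7, 9, 10 would feed team functioning; 11-15 admin support.
    "item_1":  [4, 5, 3, 2, 4],
    "item_2":  [5, 4, 3, 3, 4],
    "item_11": [5, 5, 2, 3, 3],
    "item_12": [4, 5, 3, 2, 3],
})

team_items = ["item_1", "item_2"]     # stand-ins for items 1-7, 9, 10
admin_items = ["item_11", "item_12"]  # stand-ins for items 11-15

# Sum each member's items, then average the member totals within a school.
responses["team_total"] = responses[team_items].sum(axis=1)
responses["admin_total"] = responses[admin_items].sum(axis=1)
school_scores = responses.groupby("school")[["team_total", "admin_total"]].mean()
print(school_scores)
```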


Research Question Four: What is the relationship between level of need for School-Wide Positive Behavior Support, as measured by the percentage of students who received an in-school suspension (ISS), out-of-school suspension (OSS), or office discipline referral (ODR), or the percentage of students who were below grade level in reading during the baseline year, and perceived level of School-Wide Positive Behavior Support implementation?

Purpose

Since the severity of need can be predictive of treatment integrity (Witt & Elliot, 2003), it is important to understand whether severity of need for SWPBS predicts implementation. It is important to determine whether PBS is effective for schools with a higher need for SWPBS because there are many schools with significant school-wide discipline problems that are in need of interventions. It is hypothesized that schools observed to have a higher need for SWPBS will have a higher level of implementation because they may be more motivated to alleviate the problem and, thus, to implement the intervention.

Design

Similar to questions two and three, this question was answered using a correlational design (Fraenkel & Wallen, 2003). The independent variables in this question, comprising severity of need, were the percentage of students who received an out-of-school suspension, an in-school suspension, or an office discipline referral (i.e., incidents of crime and violence), and the percentage of students who were below grade level in reading (i.e., a score lower than three on the FCAT). The percentage of students below grade level in reading was


selected because there is currently a strong emphasis on reading in Florida, as Governor Bush has set the goal for 100% of Florida students to be reading at or above grade level by 2012 (Just Read, Florida, 2004). These variables were obtained from archival data from the Florida Department of Education. The dependent variable in this question was the score on the Benchmarks of Quality (BoQ).

Research Question Five: What is the reliability, validity, and factor structure of the School-wide Positive Behavior Support Implementation Factors Survey (SWIF)?

Purpose and Design

Implementation of SWPBS is considered key by many (Metzler, Biglan, Rusby, & Sprague, 2001) to influencing behavioral and academic outcomes; therefore, it is important to understand which factors serve as enablers and barriers to successful implementation. The SWIF survey was designed specifically to determine coaches' and team members' perceptions of the degree to which barriers and enablers influence implementation of SWPBS, and it is intended to yield information useful for targeting future in-services related to SWPBS. An examination of its potential factors and the related reliability and validity of the scores obtained from the items and scales it contains was conducted.
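As a rough illustration of the factor-analytic step, the sketch below fits an exploratory factor analysis with scikit-learn. The 60-item response matrix is random stand-in data, the choice of three factors simply mirrors the number of factors ultimately reported, and this is not the study's actual extraction or rotation procedure, which is reported in Chapter Four.

```python
# Hypothetical sketch of an exploratory factor analysis on Likert-scored
# survey items; the response matrix is random stand-in data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(236, 60)).astype(float)  # 236 x 60 items

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(responses)

# fa.components_ holds the factor loadings (factors x items); items that
# load strongly on the same factor would be grouped into a subscale.
loadings = fa.components_
print(loadings.shape)  # (3, 60)
```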


Research Question Six: Is there a difference between schools classified as having a high level of School-Wide Positive Behavior Support implementation and schools classified as having a low level of School-Wide Positive Behavior Support implementation on the factor scores of the SWIF survey?

Purpose

It is important to understand what factors, if any, differentiate high and low implementers. If the factors represent barriers and enablers, it is hypothesized that high implementers will have a higher total score for enablers and a lower total score for barriers than low implementers.

Design

To answer this question, a quasi-experimental design was used. The independent variable was the level of implementation. Schools with scores on the BoQ greater than one standard deviation above the mean were considered high implementers; schools with scores between one standard deviation below the mean and one standard deviation above the mean were considered middle implementers; and schools with scores one standard deviation or more below the mean were considered low implementers. The dependent variables were the observed factor scores derived from the SWIF.
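The standard-deviation cutoffs in this design translate directly into code. A minimal sketch with invented BoQ totals:

```python
# Hypothetical sketch of classifying schools as high/middle/low implementers
# using the mean +/- 1 SD rule described above; the scores are invented.
import numpy as np

boq = np.array([42.0, 67.0, 88.0, 95.0, 55.0, 71.0, 30.0, 64.0])
mean, sd = boq.mean(), boq.std(ddof=1)

def classify(score: float) -> str:
    if score > mean + sd:
        return "high"
    if score < mean - sd:
        return "low"
    return "middle"

for score in boq:
    print(f"BoQ {score:5.1f} -> {classify(score)} implementer")
```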


Research Question Seven: Which items are perceived by coaches and team members as the most helpful in the implementation of School-Wide Positive Behavior Support, and which items are perceived as the most problematic?

Purpose

It is important to understand which factors serve as enablers and barriers to successful implementation. If these factors are identified, then they can be incorporated into training to help schools overcome barriers, and the enablers can be used to enhance implementation. This question provides PBS trainers with information on the most significant enablers and barriers.

Design

A mixed-method design was used to address this question, as both quantitative and qualitative information were needed. The quantitative information was obtained from the scores on the SWIF items, and the qualitative information was obtained through the open-ended questions on the SWIF survey.

Data Collection Procedures

All data used in this study were obtained from four sources of archival data: the Florida Department of Education (DOE) School Indicators Report, the FLPBS project mid-year evaluation report, the FLPBS end-of-year evaluation report, and the online Survey Monkey database with the results of the SWIF survey. Permission was obtained from the Institutional Review Board (IRB) to use data from these sources. The data collection and the creation of a database from each of these sources are described below. See Table 8 for a summary of the data source for each variable.

Florida Department of Education School Indicators Report Database

Each year, all Florida schools report demographic data to the Florida DOE following a survey period in October and again at the end of the school year. The


Florida DOE (2003) posts this demographic information on its website, and the information can be downloaded into an Excel spreadsheet. The following data on the schools involved in the SWPBS project were retrieved from the website for this study: school size (number of students and staff), percentage of students with a disability, percentage of teachers with an advanced degree, the number of in-school and out-of-school suspensions, the incidents of crime and violence (ODRs), and the percentage of students below grade level in reading.

Table 8
Source of Data for Variables

Source          Database                  Variables
Florida DOE     School Indicators Report  School size
                                          % students with a disability
                                          % teachers with advanced degree
                                          ISS
                                          OSS
                                          ODRs
                                          % students below grade level in reading
                                          Ethnicity data
                                          Free and reduced lunch data
FLPBS Project   Mid-Year evaluation       Team Process Survey
                                          Coach's Self-Assessment Survey
FLPBS Project   End-of-Year evaluation    BoQ survey
FLPBS Project   Survey Monkey             SWIF survey

Note. ISS = In-School Suspensions; OSS = Out-of-School Suspensions; ODR = Office Discipline Referrals.

Mid-Year FLPBS Evaluation Database

To create this database, a packet containing mid-year evaluation forms was sent to each SWPBS coach in November 2004 and was due in December 2004. The packet included a school profile report, the Team Process Survey, a team update, and the Coach's Self-Assessment. Districts and/or FLPBS


compensated each coach with $125 for completing and returning the entire evaluation packet. The responses to the instruments were entered into a database. From this database, Team Process Survey data, Coach's Self-Assessment data, ethnicity data, and SES data were used in this study.

End-of-Year FLPBS Evaluation Database

To create this database, a packet containing end-of-year evaluation forms was sent to each SWPBS coach in March 2005 and was due by the last week of the school year (which varied by school). Districts and/or FLPBS compensated the coaches with $125 for completing and returning the entire evaluation packet. The packet included another school profile report, a team update survey, and the Benchmarks of Quality (BoQ) forms. The data from these instruments were entered into a database. From this database, the responses to the BoQ were used for this study.

Survey Monkey Database (SWIF)

An online survey, the School-wide Implementation Factors Survey (SWIF), was created to determine the factors that most influenced the implementation of SWPBS. The survey was posted on a website that hosts surveys called Survey Monkey (2003-2004). The procedures to obtain participants for the survey followed the general guidelines proposed in Dillman's (1978) Total Design Method for surveys. Dillman suggests that the initial mail-out date should be early in the week and should not be near a holiday. The first email was sent on a Wednesday of a typical week to all coaches and district coordinators with a web link to access the survey on Survey Monkey and directions to send the survey to


the rest of their team members. The participants were directed to the FLPBS website (http://flpbs.fmhi.usf.edu), where a pop-up window and embedded link provided access to the survey. As follow-up mailings help to increase the response rate (Dillman), two generic reminder emails were sent one and three weeks later thanking those who had participated and reminding those who had not yet participated to complete the survey. Seven weeks later, a specific follow-up email was sent to those who had not yet completed the survey. This email included the components recommended by Dillman: a tie to previous communication, recognition of the importance of the survey, an explanation of why completion of the survey is important, the usefulness of the study, the importance of recipients to the study's usefulness, a reminder, and a note of appreciation (see Appendix O for a copy of all the emails). The survey was available online from May 18, 2005 to July 15, 2005. The responses were electronically entered into a database.

The database was downloaded into Excel for use in this study. Prior to conducting the item analysis for this question, the responses from the SWIF survey were downloaded and converted into numerical values. A response of "problematic" was converted to a one; "somewhat problematic" to a two; "no influence" to a three; "somewhat helpful" to a four; and "helpful" to a five. Therefore, a higher average item score indicates that respondents rated the item as helpful in the process, while a lower average item score indicates that respondents rated the item as problematic.

The database was then reviewed for errors. For the "number of years that the school has been involved with PBS," the responses were reviewed for


consistency. If a respondent selected a choice that disagreed with the majority of the other respondents from his/her school, the response was changed to match the remainder of the responses. If there were equal numbers of two different responses for one school, the FLPBS Project database was consulted to determine the actual year of training and implementation for that school.

Some of the items allowed a fill-in option labeled "other." These responses also were reviewed. If an open-ended response in the "other" category matched another type of response, it was recoded with that number. For example, for "highest degree," two respondents selected "other" and wrote "MSW." This was recoded as a "3" for master's degree.

Summary

This chapter described the setting of the study, participants, instruments, data collection procedures, and research questions that comprised the methods of this study. The results from these research questions are presented in Chapter Four.
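Before turning to the results, the data-preparation steps just described can be made concrete. Below is a minimal sketch of the label-to-number conversion and the majority-rule consistency cleanup; the column names and responses are invented, and this illustrates the procedure rather than the study's actual scripts.

```python
# Hypothetical sketch of SWIF data preparation: recode Likert labels to
# points and harmonize "years in PBS" within each school by majority rule.
import pandas as pd

likert_points = {
    "problematic": 1, "somewhat problematic": 2, "no influence": 3,
    "somewhat helpful": 4, "helpful": 5,
}

df = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B"],
    "item_1": ["helpful", "somewhat helpful", "helpful",
               "problematic", "no influence"],
    "years_in_pbs": [2, 2, 1, 3, 3],  # the stray "1" disagrees with school A
})

df["item_1_score"] = df["item_1"].map(likert_points)

# Replace each respondent's "years in PBS" with the school's modal answer.
df["years_in_pbs"] = df.groupby("school")["years_in_pbs"].transform(
    lambda s: s.mode()[0])
print(df)
```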


Chapter IV

Results

This study represents an effort to identify and understand factors that influence the implementation of School-Wide Positive Behavior Support (SWPBS), including socio-cultural, academic, and behavioral indicators. This chapter provides a description of the participating schools from which data were collected, as well as the results of the data analyses conducted to answer each of the research questions.

Research Questions

In this section, each research question will be presented, followed by an explanation of the data analysis used to answer that question and then the results of that analysis. A summary of the research questions, hypotheses, variables, and results is presented in Appendix N.

Research Question One: Are there differences in the perceived levels of implementation of School-Wide Positive Behavior Support between schools in their first, second, and third year of the implementation process?

To answer this question, an analysis of variance (ANOVA) was completed. The a priori power analysis, preliminary analyses, and results of the ANOVA procedure are described below.

A priori power analysis

To determine whether Benchmarks of Quality (BoQ) scores differed for schools in different years of implementation, a one-way analysis of variance


(ANOVA) was conducted with BoQ score as the dependent variable and year of implementation as the independent variable. The means and standard deviations for first, second, and third year schools were 63.91 (21.20), 70.54 (15.49), and 76.67 (14.17), respectively. There was a sufficient sample size of first and second year schools (n=57 first year schools and n=28 second year schools) to conduct the analysis; however, there were only six schools in the third year sample. Due to the small number of third year schools, it could not be determined whether their mean was representative of all third year SWPBS schools or only of this small sample of schools. These schools, therefore, were not included in this ANOVA, nor were they included in subsequent analyses. The data for the third year schools are reported descriptively.

Preliminary analyses

Three assumptions had to be met before conducting the ANOVA: independence, normality, and homogeneity of variances. First, the observations (i.e., the BoQ data) had to be independent of one another. This assumption was met because the teams implemented SWPBS independently at each of their own school sites. In addition, they also completed the instruments independently and were not influenced by other schools.

Second, the samples of scores should represent a normal distribution. To evaluate normality, the distributions of the total sample of BoQ scores and the samples of BoQ scores by year of implementation were plotted. Skewness and kurtosis values also were derived. The distribution of the total sample of BoQ scores was slightly negatively skewed (-0.898) and slightly peaked with a


kurtosis value of 0.800; however, the distribution appeared reasonably normal. Similarly, the distributions of BoQ scores for the first and second year implementing schools were slightly negatively skewed (-0.46 and -0.83) and slightly peaked, with kurtosis values of -0.39 and 0.51, respectively; these scores also were approximately normally distributed. Despite the presence of a negative skew, it was concluded that it was appropriate to conduct the analysis because many scales used in social sciences research tend to be positively or negatively skewed (Pallant, 2005).

Third, there must be homogeneity of variances, indicating that the samples were obtained from populations with equal variances. This assumption was evaluated using the Levene test for equality of variances, which yielded a statistic of 1.845 (2, 88) that was not significant (p = .164). A nonsignificant value for this statistic indicates that the assumption of homogeneity of variances was met (Pallant, 2005).

Analysis of Variance

Since the sample size was sufficient and the assumptions were met, an ANOVA was conducted to determine whether a significant difference existed between schools in their first and second year of implementation at the alpha level of .05. The results are reported in Table 9 and indicated that there was no significant difference between the BoQ scores of schools in their first year of implementation (n = 57) and those of schools in their second year of implementation (n = 28), F(1, 83) = 2.12, p = .149.
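For readers who wish to reproduce this style of analysis, the sketch below shows how the Levene test and one-way ANOVA described above could be run in Python with SciPy. It is an illustration only, not the study's actual analysis; the score arrays are hypothetical stand-ins for the first and second year BoQ scores.

```python
# Minimal sketch of the RQ1 analysis: Levene's test for homogeneity of
# variances, followed by a one-way ANOVA on BoQ scores by year of
# implementation. The arrays below are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
boq_year1 = rng.normal(63.91, 21.20, size=57)  # first year schools (n = 57)
boq_year2 = rng.normal(70.54, 15.49, size=28)  # second year schools (n = 28)

# Levene's test: a nonsignificant p-value supports equal variances.
lev_stat, lev_p = stats.levene(boq_year1, boq_year2)
print(f"Levene: W = {lev_stat:.3f}, p = {lev_p:.3f}")

# One-way ANOVA: tests whether the group means differ.
f_stat, f_p = stats.f_oneway(boq_year1, boq_year2)
print(f"ANOVA: F = {f_stat:.2f}, p = {f_p:.3f}")
```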


Because there was no difference, these two groups of schools were combined to answer research questions two, three, four, and six.

Table 9

Analysis of Variance for Implementation Level by Year of Implementation

Source            SS         df    MS       F      p
Between groups      823.72    1    823.72   2.12   0.15
Within groups     32297.53   83    389.13
Total             33121.25   84

Note. n = 57 first year schools; n = 28 second year schools.

Research Question Two: What is the relationship between socio-cultural school factors (i.e., socio-economic status and ethnicity of student body, school size, teacher: student ratio, student stability, percentage of students with a disability, percentage of teachers with an advanced degree, percentage of out-of-field teachers) and perceived level of School-Wide Positive Behavioral Support implementation?

To answer this question, a multiple regression analysis was completed. The descriptive statistics for each variable will be presented first, followed by the results of the preliminary and multiple regression analyses.

Descriptive Statistics

The descriptive data (i.e., sample size, range, mean, standard deviation) for the demographic variables are presented by school level and year of implementation in Appendix P. A total of 73 elementary, middle, and high schools for which demographic data were available also returned the completed Benchmarks of Quality instruments. Demographic data were not available for the centers or alternative education elementary and middle schools because of the frequent change in student population at these schools. Of these 73 schools, four


schools were in the third year of SWPBS implementation. As explained for research question one, data from third year schools are presented only in descriptive form and were not included in the analysis. As a result, data for 69 schools (46 first year schools and 23 second year schools) were included in the analysis. The data for these 69 schools were combined for purposes of the analysis, as determined in research question one.

Based on Stevens' (1996) recommendation that data for 15 subjects be used for each predictor variable, the sample size of 69 schools in this study was considered sufficient for examination of four predictors relating to implementation of SWPBS. Since there were eight variables initially included in this question, the assumptions were reviewed to determine if any variables should be eliminated, combined, or included in a separate analysis.

Preliminary Analysis

The data were first reviewed for outliers in the predictor variables using the Cook's distance (CD) method. No data points were considered influential to the analysis based on the criterion set by Cook and Weisberg (1982) of a Cook's distance value greater than one.
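The sketch below illustrates one way to run the outlier and normality screening described above in Python with statsmodels and SciPy. It is not the study's actual analysis; the variable names and data are hypothetical, with `X` standing in for the eight demographic predictors and `y` for the BoQ scores.

```python
# Minimal sketch of Cook's distance screening plus skewness/kurtosis checks.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import OLSInfluence
from scipy import stats

rng = np.random.default_rng(1)
X = rng.normal(size=(69, 8))          # placeholder predictor matrix
y = rng.normal(70, 15, size=69)       # placeholder BoQ scores

model = sm.OLS(y, sm.add_constant(X)).fit()
cooks_d = OLSInfluence(model).cooks_distance[0]
print("Influential points (CD > 1):", np.where(cooks_d > 1)[0])

# Skewness and excess kurtosis for each predictor (both ~0 for a normal).
for j in range(X.shape[1]):
    print(f"var {j}: skew = {stats.skew(X[:, j]):.2f}, "
          f"kurtosis = {stats.kurtosis(X[:, j]):.2f}")
```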


The data then were reviewed to determine whether they met the assumptions for use of a multiple regression analysis: normality, linearity, homoscedasticity, and independence of residuals. The distributions for all variables were plotted to examine normality. Kurtosis and skewness values also were examined for the eight variables. Most distributions displayed normality (see Table 10); however, the distribution for the variable of student stability was negatively skewed (skewness = -5.50) and had a kurtosis value of 30.04. It was determined that data from two high schools were causing the negative skew because these two schools had stability rates of less than 15%. The remainder of the schools had stability rates of greater than 85%. These two high schools were vocational schools, which explains their low stability rate. These data were considered valid and were retained in the analysis. One of these two schools also had a high percentage of students with disabilities (50%), while the remainder of the schools had less than 30% of students with disabilities. As these data points were valid and this analysis is robust to skewness (Pallant, 2005), these values were not removed.

Table 10

Skewness and Kurtosis Values for Socio-Cultural Factors

Factor                             Skewness   Kurtosis
FRL                                   0.16      -0.09
Ethnicity (% non-white)               0.73       0.50
School size                           1.20       2.15
Teacher: student ratio               -1.05       0.73
% Students with a disability          2.76      13.84
Stability of students                -5.50      30.04
% Teachers with advanced degree      -0.19      -0.38
% Out-of-field teachers               1.48       2.30

Note. These values are for the sample of instruments received from first and second year implementing schools that had no missing values. FRL = Free and/or Reduced Lunch.

To review for homoscedasticity of errors, the standardized residuals for each variable were plotted against the BoQ scores. Homoscedasticity indicates that scores are centered and have equal variances (Stevens, 1986). The plot revealed that most residuals were centered around zero (see Figure 5).


Figure 5. Scatter plot of standardized residuals and standardized predicted values of BoQ scores.

To evaluate linearity, the correlation between the predicted DV scores and the errors of prediction was depicted in a scatterplot. A straight-line relationship was displayed, which indicated linearity. The correlations between pairs of predictors for this question were then reviewed for high intercorrelations, which can result in multicollinearity. Multicollinearity can limit the size of R because confounding effects of the predictors can reduce the interpretability of R (Stevens, 1986). The intercorrelations for the variables in this question are presented in Table 11. Tabachnick and Fidell (2001) recommend correlations between .3 and .7. All but one correlation fell within this range. The correlation between percentage of students eligible for free and reduced lunch (FRL) and the percentage of non-


white students in the school was r = .82, p < .001. As this value exceeded the recommended range, two other indicators of multicollinearity were assessed: the Tolerance and Variance Inflation Factor (VIF) values. These values describe the degree of variability in a given variable that is not accounted for by the other variables. Data sets that exceed the commonly used cut-off points (i.e., < 0.1 for Tolerance and > 10 for VIF) indicate multicollinearity (Pallant, 2005). The VIF and Tolerance values in this sample did not exceed the cut-off points. The second highest correlation was found between school size and teacher: student ratio (r = .63, p < .001). School size also had a positive correlation with teacher: student ratio, as well as with stability, and a negative correlation with percentage of students with a disability. The FRL variable was significantly correlated (r = .25, p < .05) with percentage of classes taught by an out-of-field teacher.

Table 11

Intercorrelations Among Socio-Cultural Factors

Factor                             1      2      3      4       5       6       7      8      9
1. FRL                           1.00   .82**  -.15   -.20     .23     .10    -.23    .25*  -.17
2. Ethnicity                            1.00    .04   -.14     .08     .00    -.14    .23   -.16
3. School size                                 1.00    .63**  -.31**   .25*    .17    .24    .03
4. Teacher: student ratio                             1.00    -.38**   .42**   .12    .05   -.02
5. % Students with a disability                               1.00    -.61**  -.24    .05   -.02
6. Stability of students                                              1.00     .19    .10    .09
7. % Teachers-advanced degree                                                 1.00    .27*   .18
8. % Out-of-field teachers                                                           1.00    .04
9. BoQ score                                                                                1.00

Note. ** p < .01. * p < .05. N = 69. FRL = Free and/or Reduced Lunch.
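The Tolerance and VIF check described above is straightforward to reproduce; the sketch below shows one way to do it with statsmodels. The predictor matrix is a hypothetical placeholder for the eight socio-cultural factors, and Tolerance is computed from VIF directly, since the two are reciprocals.

```python
# Minimal sketch of the Tolerance / VIF multicollinearity screen.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
X = rng.normal(size=(69, 8))              # placeholder predictors
exog = sm.add_constant(X)                 # VIF assumes the constant is present

for j in range(1, exog.shape[1]):         # skip the constant column
    vif = variance_inflation_factor(exog, j)
    tolerance = 1.0 / vif                 # Tolerance is simply 1 / VIF
    # Rule of thumb from the text: VIF > 10 or Tolerance < 0.1 flags trouble.
    print(f"predictor {j}: VIF = {vif:.2f}, Tolerance = {tolerance:.2f}")
```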


Multiple Regression Analysis

All of the required assumptions were met; however, the sample size was not adequate to conduct a single multiple regression analysis that included all eight variables. The examination of the assumptions did not reveal any variables that should be eliminated. Consequently, the eight variables were regrouped into student variables (i.e., ethnicity, free and reduced lunch status, percentage of students with a disability, student stability) and school building variables (i.e., school size, teacher: student ratio, percentage of classes taught by an out-of-field teacher, percentage of teachers with an advanced degree). Two multiple regression analyses were conducted instead of one to examine the strength of relationships between the independent variables and the dependent variable. The standard procedure was used to enter the variables into the multiple regression equation because there was no hypothesis regarding the best predictor variables within the set. The predictors were entered simultaneously into the analysis to determine their independent and joint influence (Pallant, 2005).
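A simultaneous-entry regression of this kind can be sketched as follows with statsmodels; this is an illustration of the student-variables model described below, not the study's actual analysis, and the column names and data are hypothetical.

```python
# Minimal sketch of a simultaneous-entry multiple regression predicting
# BoQ scores from four student variables.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
data = pd.DataFrame({
    "frl": rng.uniform(0, 100, 69),        # % free/reduced lunch
    "ethnicity": rng.uniform(0, 100, 69),  # % non-white students
    "disability": rng.uniform(0, 50, 69),  # % students with a disability
    "stability": rng.uniform(60, 100, 69), # student stability rate
    "boq": rng.normal(70, 15, 69),         # BoQ implementation score
})

X = sm.add_constant(data[["frl", "ethnicity", "disability", "stability"]])
model = sm.OLS(data["boq"], X).fit()

# R, R-squared, the overall F test, and the effect size R^2 / (1 - R^2).
r2 = model.rsquared
print(f"R = {np.sqrt(r2):.3f}, R^2 = {r2:.3f}")
print(f"F({int(model.df_model)}, {int(model.df_resid)}) = {model.fvalue:.3f}, "
      f"p = {model.f_pvalue:.3f}")
print(f"effect size = {r2 / (1 - r2):.3f}")
```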


Student variables. The student variables included ethnicity, free and reduced lunch status, percentage of students with a disability, and student stability. The Coefficient of Multiple Correlation, R, was calculated to indicate the strength of the relationship between the predictor variables and the criterion variable. The R value was found to be .226. The Coefficient of Determination, R², indicates the proportion of unique and shared variability explained by all of the variables. This value was found to be .051 and was not statistically significant, F(4, 64) = .862, p = .492. This model explained a nonsignificant amount of variance in the BoQ scores. The proportion of unexplained variability, 1 - R², was found to be high, .949. The effect size, calculated using the formula R² / (1 - R²) = .051 / .949, was .05.

A review of the standardized coefficients indicated that none of the predictor variables made a significant contribution to the dependent variable. The squared semipartial correlations, which indicate the unique contribution of each variable to the total R² when the contribution of the other predictor variables is partialed out, were all extremely small. These data are presented in Table 12.

Table 12

Multiple Regression Analysis Results for Student Variables Predicting Implementation

Variable                        B      SE     β      Squared semipartial correlation
FRL                           -.28    .28   -.25     .02
Ethnicity (% non-white)        .03    .23    .03     .00
% Students with a disability   .55    .63    .16     .01
Stability of students          .30    .25    .21     .02

Note. Adjusted R² = -.008. n = 69 schools. None of the beta values were statistically significant. FRL = Free and/or Reduced Lunch.

School building variables. The school building variables included school size, teacher: student ratio, percentage of classes taught by an out-of-field teacher, and the percentage of teachers with an advanced degree. The Coefficient of Multiple Correlation, R, was calculated to indicate the strength of the relationship between these predictor variables and the criterion variable. The R value was calculated to be .190. The Coefficient of Determination, R², was .036 and was not statistically significant, F(4, 64) = .599, p = .664. This indicated that the model did not account for a significant amount of the variance in the BoQ scores. The proportion of unexplained variability, 1 - R², was high, .964.


The effect size, calculated using the formula R² / (1 - R²), was .04. A review of the standardized coefficients indicated that none of the variables had a significant influence on the dependent variable and that the squared semipartial correlations were close to zero (see Table 13 for beta values and squared semipartial correlations).

Table 13

Multiple Regression Analysis Results for School Building Variables Predicting Implementation

Variable                          B      SE     β      Squared semipartial correlation
School size                      .00     .00    .05    .00
Teacher: student ratio          -.51    1.03   -.08    .00
% Teachers with advanced degree  .43     .30    .19    .03
% Out-of-field teachers         -.05     .34   -.02    .00

Note. R² = .036. n = 69 schools. None of the beta values were significant.

Research Question Three: What is the relationship between implementation process factors [i.e., team functioning (TF), administrative support (AS), and coach's self-efficacy (CSE)] and perceived level of School-Wide Positive Behavioral Support implementation?

To answer this question, a multiple regression procedure was used. The descriptive statistics for each variable will be presented first, followed by the results of the preliminary and multiple regression analyses.

Descriptive Statistics

The data were first reviewed for respondents who had not completed all items on the instruments because the total score from a respondent with missing items would be deflated. Six cases from the coach's self-assessment and one case from administrative support were excluded due to missing items. The descriptive data for the remaining cases are presented in Appendix Q.


The data are presented for all respondents, as well as for respondents grouped by school level and by year of implementation. The overall means and standard deviations were found to be 37.09 (3.95) for team functioning, 20.03 (2.17) for administrative support, and 18.69 (3.66) for coach's self-assessment. The means and standard deviations for the subgroups (i.e., type, year) were all similar to the mean and standard deviation for the whole group on all variables, with the exception of the mean for coach's self-assessment in schools in the third year of implementation, which was higher than the total mean (M = 23.67, SD = 0.58).

As determined in research question one, data for schools in years one and two of implementation were combined in this analysis because the difference between the scores for the two groups was not statistically significant. Once again, data for schools in the third year of implementation were used for descriptive purposes only. Therefore, the remainder of the analyses included the combined data for year one and two implementing schools. As some SWPBS teams did not return complete evaluation packets or left some items blank, different sample sizes resulted for the three variables. There were 98 teams that returned a completed Team Process Evaluation (i.e., Team Functioning and Administrative Support variables) (87% return rate), and 78 teams that returned a completed Coach's Self-Assessment (69% return rate). However, of those schools that returned the instruments, 79 schools had both Team Functioning scores and a BoQ, 78 schools had both Administrative Support scores and a BoQ, and 59 schools had both Coach's Self-Assessment scores and a BoQ.


Thus, these schools were the sample for the multiple regression analysis. This sample size was considered sufficient for analyzing three predictor variables based on Stevens' (1996) recommendation of 15 subjects per predictor variable, which means 45 subjects were required for this analysis.

Preliminary Analysis

The data first were reviewed for outliers in the predictor variables using the Mahalanobis distance and Cook's distance (CD) methods. No data points were considered influential to the analysis according to the criteria set by Cook and Weisberg (1982) (i.e., CD > 1) or by Tabachnick and Fidell (1996) (i.e., Mahalanobis distance > 16.27 for three independent variables). No cases were considered outliers.

The data then were reviewed to determine whether they met the assumptions required for the use of multiple regression analysis: normality, linearity, homoscedasticity, and independence of residuals. The distributions for all variables were plotted to examine the normality of the distribution. Kurtosis and skewness values were examined for the three variables. All three distributions displayed approximate normality (see Table 14); however, the distribution for coach's self-assessment was negatively skewed (skewness = -0.514). As this analysis is robust to skewness (Pallant, 2005), and the normal probability plot did not reveal any major deviations from normality for any of the scales, this assumption was considered to be met.


Table 14

Skewness and Kurtosis Values for Implementation Process Factors

Factor                     Skewness   Kurtosis
Team Functioning            -0.426     -0.082
Administrative Support      -0.276     -0.302
Coach's Self-Assessment     -0.514      0.189

Note. These values are for first and second year implementing schools that had no missing values on their instruments.

To evaluate linearity, the correlation of each of the predicted DV scores and the errors of prediction was depicted in a scatterplot. The graphs displayed a straight-line relationship. To review for homoscedasticity of errors, a histogram of the standardized residuals for each variable with the BoQ scores was plotted. Homoscedasticity indicates that scores are centered and have equal variances (Stevens, 1986). The plot revealed that the residuals were centered around zero, indicating that this assumption was met.

The correlations between the pairs of predictors in this question were then reviewed for high intercorrelations, which can result in multicollinearity. This can limit the size of R, and the confounding effects of the predictors can reduce its interpretability (Stevens, 1986). The intercorrelations for the variables in this question are presented in Table 15. The intercorrelations were reviewed to determine if they fell within Tabachnick and Fidell's (2001) recommended range of .3 to .7. As some of the correlation values exceeded this range, two other indicators of multicollinearity, the Tolerance and Variance Inflation Factor (VIF) values, were assessed. These values describe the degree of variability in a given variable that is not accounted for by the other variables. Data sets that exceed the commonly used cut-off points (i.e., < 0.1 for Tolerance


and > 10 for VIF) demonstrate multicollinearity. The Tolerance values in this sample were all greater than 0.1 (0.39, 0.40, 0.94), and the VIF values were all less than 10 (2.56, 2.49, 1.06) (Pallant, 2005). Although team functioning and administrative support had a correlation greater than .70, there was not a concern about multicollinearity, and the variables were, therefore, retained as separate predictors in this analysis.

Table 15

Intercorrelations Among Team Functioning (TF), Administrative Support (AS), and Coach's Self-Efficacy (CSE)

          1      2      3      4
1. BoQ   1.00   .41    .25    .16
2. TF           1.00   .77*   .24
3. AS                  1.00   .16
4. CSE                        1.00

Note. * p < .05.

Multiple regression

As the sample size was adequate and the assumptions were met, a multiple regression analysis was used to examine the strength of the relationships between the independent variables and the dependent variable. Because there was no hypothesis about which variable was the best predictor, the predictors were entered simultaneously into the analysis to determine their independent and joint influence (Pallant, 2005).

The Coefficient of Multiple Correlation, R, was calculated to indicate the strength of the relationship between the predictor variables and the criterion variable. The R value was .432. The Coefficient of Determination, R², indicates the proportion of unique and shared variability that all of the variables explained. This value was .187 and was statistically significant, F(3, 52) = 3.98,


p = .013, indicating that 18.7% of the variance in the BoQ scores was explained by this model. The proportion of unexplained variability, 1 - R², was .813. The effect size, calculated using the formula R² / (1 - R²), was .23, indicating a small effect size. A review of the standardized coefficients indicated that the variable of Team Functioning had the strongest unique contribution and the only significant value (see Table 16 for values). The prediction equation for this question was:

Z = (.534 × TF) + (-.178 × AS) + (.066 × CSE) + 26.45

The squared semipartial correlations, which indicate the unique contribution of each variable to the total R² when the contributions of the other predictor variables are partialed out, were .11, .01, and .00 for TF, AS, and CSE, respectively.

Table 16

Multiple Regression Analysis for Implementation Process Factors Predicting Implementation

Variable                      B      SE      β       Squared semipartial correlation
Team Functioning (a)         2.73   1.03    0.53*    0.11
Administrative Support (b)  -1.69   1.87   -0.18     0.01
Coach's Self-Efficacy (c)    0.364  0.71    0.06     0.00

Note. R² = .187. (a) n = 79. (b) n = 78. (c) n = 59. * p < .01.

Research Question Four: What is the relationship between level of need for School-Wide Positive Behavioral Support (SWPBS), as measured by the percentage of students who received an in-school suspension (ISS), out-of-school suspension (OSS), or office discipline referral (ODR), or the percentage of students who were below grade level in reading (BGLR) during the baseline year, and perceived level of SWPBS implementation?


To answer this question, a multiple regression analysis procedure was used. Descriptive statistics for each variable are presented in the next section, followed by the results of the preliminary and multiple regression analyses.

Descriptive Statistics

The sample for this question included the schools that both returned completed BoQ instruments as part of the end-of-year evaluation and for which baseline data were available on the FLDOE website, School Indicators Report. FLDOE data were not available for center schools or for schools that were in their first year of operation (and had initiated the implementation of SWPBS during this year). Additionally, schools in their third year of implementation were not included in the multiple regression analysis. Of the 91 first and second year implementation schools that returned the BoQ, 59 (65%) also had ISS, OSS, and ODR data available, and 68 schools (75%) had BGLR data available. This sample size was deemed sufficient for four predictor variables based on Stevens' (1996) recommendation of 15 subjects per predictor variable, requiring 60 subjects for this analysis. The descriptive data (i.e., mean, standard deviation, range) for these schools, as well as for the third year implementation schools, are presented in Appendix R.

Preliminary Analysis

The data were reviewed for outliers using the Mahalanobis distance and Cook's distance methods. The criteria for influential data points in a set of four variables were a CD of greater than one or a Mahalanobis distance value of greater than 18.47 (Tabachnick & Fidell, 1996). There were two data points with a


Mahalanobis distance of 25 and 27, but their CD values were less than one. Therefore, these data points were retained in the analysis.

The data then were reviewed to determine whether they met the assumptions of multiple regression: normality, linearity, homoscedasticity, and independence of residuals. The distributions for all variables also were plotted to examine the normality of the distribution. Kurtosis and skewness values were examined for the four variables. The distributions for ISS, OSS, and BGLR were approximately normally distributed, with all skewness and kurtosis values below 1.5 (see Table 17). The skewness and kurtosis values were notably high for the ODR scores (2.49 and 9.14, respectively). A review of the data indicated two extremely high points (greater than 400) that caused the positive skewness of the distribution. Additionally, over one-half of the schools reported ODR values between 0 and 50, causing the high kurtosis value, or high peak, in the distribution. As these data points were not considered outliers, the results were valid and were used in the analysis.

Table 17

Skewness and Kurtosis Values for Academic and Behavioral Indicators

Indicator   Skewness   Kurtosis
ISS           0.76      -0.61
OSS           1.31       1.80
ODR           2.49       9.14
BGLR          0.34      -0.19

Note. These values are for first and second year implementing schools that had no missing values.


The correlation between the predicted DV scores and the errors of prediction was depicted in a scatterplot to evaluate linearity. The scores displayed a straight-line relationship. A histogram of the standardized residuals for each variable with the BoQ scores was plotted to review for homoscedasticity of errors. The scatterplot of the residuals displayed a roughly rectangular shape with the scores concentrated around the zero point, indicating homoscedasticity.

The correlations between pairs of predictors in this question were then reviewed for high intercorrelations, or multicollinearity. The intercorrelations for the variables in this question are presented in Table 18. As Tabachnick and Fidell (2001) recommend correlations between .3 and .7, all of the correlations among the variables fell approximately within this range. The Tolerance and Variance Inflation Factor (VIF) values were reviewed as well. The Tolerance values in this sample were all greater than 0.1 (0.54, 0.51, 0.76, and 0.81), and the VIF values were less than 10 (1.84, 1.95, 1.32, 1.24) (Pallant, 2005).

Table 18

Intercorrelations Among Academic and Behavioral Indicators

          1      2      3      4      5
1. BoQ    1     -.08   -.14   -.14   -.15
2. ISS           1      .47*   .27*   .32*
3. OSS                  1      .32*   .64*
4. ODR                         1      .42*
5. BGLR                               1

Note. * p < .05.

Multiple Regression Analysis

As the sample size was adequate and the assumptions were met, a multiple regression analysis was conducted to examine the strength of relationships between the independent variables and the dependent variable and to determine the unique and independent influence of these variables (Pallant, 2005). The variables were entered simultaneously into the multiple regression equation because there was no hypothesis regarding the best predictor variable.


The Coefficient of Multiple Correlation, R, was calculated to indicate the strength of the relationship between the predictor variables and the criterion variable. The R value was determined to be .18. The Coefficient of Determination, R², which indicates the proportion of unique and shared variability explained by all variables, was found to be .032 and was not statistically significant, F(4, 54) = 0.453, p = .77. The proportion of unexplained variability, 1 - R², was determined to be .968.

The effect size, calculated using the formula R² / (1 - R²), was .03, which is considered to be very small. A review of the standardized coefficients indicated that none of the variables had a strong unique contribution, and none of the coefficients was significant (see Table 19). The squared semipartial correlations, which indicate the unique contribution of each variable to the total R² with the contributions of the other predictor variables partialed out, were all extremely small (< .008).

Table 19

Multiple Regression Analysis for Academic and Behavioral Indicators Predicting Implementation

Variable    B      SE     β      Squared semipartial correlation
ISS (a)    -.01    .22   -.00    .000
OSS (a)    -.16    .39   -.07    .003
ODR (a)    -.02    .04   -.09    .007
BGLR (b)   -.07    .21   -.06    .002

Note. R² = .032. (a) n = 59. (b) n = 68. None of the beta values were significant.


Research Question Five: What are the reliability, validity, and factor structure of the School-wide Positive Behavior Support Implementation Factors Survey (SWIF)?

To answer this question, the test-retest reliability of the instrument was first determined, and then an exploratory factor analysis was conducted. The sample will be described first, followed by a report of the results of the analyses.

Sample

For the SWIF survey, a total of 1,127 potential participants was possible because there were 161 teams in the schools included in this study, each with six to eight members. It is important to note that although all of these teams were trained in SWPBS, many were no longer implementing SWPBS at the time of data collection, and the FLPBS project does not have access to the teams that are no longer implementing SWPBS. Two hundred and eighty-nine people (26%) responded to the online SWIF survey during the period of data collection, May 15, 2005 to June 30, 2005. Fifty-three of the respondents exited the online survey before completing it, and their data were eliminated. An additional 25 respondents indicated that they did not have a second administrator at their school and did not complete the section about the second administrator. For the purposes of the factor analysis, these 25 respondents were not included in the analysis. Consequently, the responses from the 211 remaining participants were included in this analysis.


Test-Retest Reliability

To determine the test-retest reliability of scores from the instrument, five PBS coaches were asked to complete the SWIF a second time within a two-week period. Reliability was calculated with two methods, one for the total score and one for individual items. The total scores for these respondents were calculated. Percent agreement between total scores for time one and time two was calculated by dividing the lower score by the higher score and multiplying by 100. The average percent agreement was 98%. A tally of the agreements between ratings on individual items was conducted and divided by the total number of items (i.e., 60). The average percentage of items on which respondents indicated the same response at time one and at time two was 86%. See Table 20 for the test-retest reliability scores. As the percent agreement for both the total score and the item scores was greater than 80%, these SWIF scores demonstrated good test-retest reliability.

Table 20

Test-Retest Results for the SWIF Survey

Respondent   Total Score (Time 1)   Total Score (Time 2)   Percent Agreement (Total Score)   Percent Agreement (Items)
School 1            256                   264                       97%                              72%
School 2            273                   278                       98%                              80%
School 3            137                   137                      100%                             100%
School 4            240                   242                       99%                              98%
School 5            227                   235                       97%                              78%
Mean                                                                98%                              86%
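Both percent-agreement calculations described above are simple enough to express directly in code. The sketch below is an illustration only; the ratings are hypothetical stand-ins for one respondent's two administrations of the 60-item SWIF.

```python
# Minimal sketch of the two percent-agreement methods for test-retest
# reliability: one on total scores, one on item-by-item matches.
def total_score_agreement(time1, time2):
    """Lower total divided by higher total, multiplied by 100."""
    t1, t2 = sum(time1), sum(time2)
    return min(t1, t2) / max(t1, t2) * 100

def item_agreement(time1, time2):
    """Percentage of items given the identical response at both times."""
    matches = sum(a == b for a, b in zip(time1, time2))
    return matches / len(time1) * 100

# Example with made-up ratings on the 1-5 scale (60 items):
time1 = [4, 5, 3, 4, 2] * 12
time2 = [4, 5, 3, 5, 2] * 12
print(f"total-score agreement: {total_score_agreement(time1, time2):.0f}%")
print(f"item agreement: {item_agreement(time1, time2):.0f}%")
```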


To determine whether any items lacked good reliability, the number of respondents who gave the same response to each item at both times was tallied. For all items, three or more respondents indicated the same response at time one and time two. Therefore, all items demonstrated good reliability with this sample.

Factor Analysis

A priori power analysis. To assess the factorial validity of the scores from this instrument, an exploratory factor analysis (EFA) was conducted. Prior to performing the EFA, the suitability of the data for factor analysis was assessed by considering the adequacy of the sample size, the factorability of the data, and the item-to-item correlations.

The recommended sample size for an EFA ranges from five individuals for each item within an instrument (Gorsuch, 1983) to a minimum range of 150 to 300 (Tabachnick & Fidell, 2001). Since the SWIF includes 60 items, a sample of 300 participants was recommended for this analysis. Although a sample of only 211 participants was obtained, the size was deemed sufficient based on Tabachnick and Fidell's criterion. To obtain this sample, both team members' and coaches' responses were used. It is important to note that these data are nested, because there are multiple respondents from one school, which violates the independence assumption. It is likely, however, that respondents from the same school will have different opinions on the influence of each variable.

To assess the factorability of the data, Bartlett's Test of Sphericity and the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (KMO) were conducted (Bartlett, 1954; Kaiser, 1974). For Bartlett's test, the chi-square was significant (p < .001), and the KMO index was greater than .6 (KMO = .90). These values suggest that the data have good factorability (Tabachnick & Fidell, 2001).
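As an illustration, the two factorability checks above could be reproduced with the third-party factor_analyzer package, as sketched below. This is an assumption-laden example, not the study's actual analysis: the package, the DataFrame `items`, and the simulated data are all stand-ins.

```python
# Minimal sketch of Bartlett's test and the KMO index for factorability,
# assuming the `factor_analyzer` package is installed.
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

rng = np.random.default_rng(4)
# Placeholder: 211 respondents x 60 correlated Likert-style items.
base = rng.normal(size=(211, 1))
items = pd.DataFrame(base + rng.normal(size=(211, 60)) * 0.8)

chi_square, p_value = calculate_bartlett_sphericity(items)
kmo_per_item, kmo_total = calculate_kmo(items)
print(f"Bartlett: chi2 = {chi_square:.1f}, p = {p_value:.4f}")
print(f"KMO (overall) = {kmo_total:.2f}")   # > .6 suggests factorability
```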


Inspection of the matrix of item-to-item correlations revealed the presence of many coefficients of .30 and above, as recommended (Tabachnick & Fidell, 1996). The average correlation was .37. These results indicated that the data were suitable for conducting a factor analysis.

Factor Extraction

Principal component analysis revealed the presence of 12 components with eigenvalues exceeding 1, explaining 72.48% of the variance. As this criterion has been criticized for retaining too many factors, the scree plot was then inspected. There was a break after the third component (see Figure 6). Using Cattell's scree test (Cattell, 1966), the researcher decided to retain three components for further investigation.

Figure 6. Scree plot for the factor analysis.
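The eigenvalue extraction behind a scree plot like Figure 6 can be sketched with NumPy alone, as below. This is illustrative only; `items` is the same hypothetical respondent-by-item matrix used in the previous sketch, not the SWIF data.

```python
# Minimal sketch: eigenvalues of the item correlation matrix, with a count
# of how many exceed 1 (the Kaiser criterion mentioned in the text).
import numpy as np

rng = np.random.default_rng(5)
base = rng.normal(size=(211, 1))
items = base + rng.normal(size=(211, 60)) * 0.8   # placeholder data

corr = np.corrcoef(items, rowvar=False)           # 60 x 60 correlation matrix
eigenvalues = np.linalg.eigvalsh(corr)[::-1]      # sorted high to low

print("components with eigenvalue > 1:", int((eigenvalues > 1).sum()))
print("variance explained by first three:",
      f"{eigenvalues[:3].sum() / eigenvalues.sum():.1%}")
# In practice one plots `eigenvalues` and looks for the 'elbow'
# (Cattell's scree test) rather than relying on the Kaiser criterion alone.
```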


To aid in the interpretation of these three components, a Promax oblique rotation was selected because the factors were predicted to be correlated. This hypothesis was confirmed; all of the factor correlations were greater than .57 (see Table 21 for factor correlations before the analysis and Table 22 for factor correlations after items were grouped by factor). The three components generated from the rotated solution revealed a number of strong loadings, with most variables loading strongly on only one component. The three-component solution explained a total of 47.48% of the variance, with Component One contributing 35.94% of the variance, Component Two contributing 6.8%, and Component Three contributing 5.0%.

Table 21

Component Correlations Before Grouping Items

           Factor 1   Factor 2   Factor 3
Factor 1     1.00       .56        .63
Factor 2                1.00       .45
Factor 3                           1.00

Table 22

Factor Correlations After Grouping Items

           Factor 1   Factor 2   Factor 3
Factor 1     1.00       .65*       .69*
Factor 2                1.00       .57*
Factor 3                           1.00

Note. N = 211. * p < .01.

Factor Interpretation

The factor pattern and structure matrices were generated to determine the grouping of the items within each of the three factors. The pattern matrix includes the standardized regression coefficients, called the factor loadings, which represent the degree to which each item relates to the factor with the influence of the other items partialed out.


The structure matrix includes correlations between each item and the factor (Stevens, 2002). The items were first categorized by the factor on which they had the highest factor loading and the highest regression coefficient. If there was a conflict between the highest scores, the item was included with the factor that appeared to be the best fit based on the content of the other items in that factor. After grouping the items together by factor, they were reviewed for cohesiveness. Two items (i.e., "Data entered regularly," "Coach's stability") that were grouped with Factor 3 were moved to Factor 1 for a better fit. The factor loadings or regression coefficients for Factor 1 were above .20, and the correlations between the items and the overall factor score in Factor 1 were above .40. Another item, "Discipline referral process," was moved from Factor 3 to Factor 2 for the same reason. See Appendices S and T for the pattern and structure matrices.

To confirm the groupings listed in Appendices S and T, Cronbach's alpha coefficients were calculated for the designated items in each factor and are reported in Appendix U. The coefficients for the three factors were determined to be .95, .93, and .93, respectively. To confirm that the items fit with their designated factor, Cronbach's alpha coefficients were calculated for the factors with each item deleted. There were no cases in which an increase in the coefficient occurred after the deletion of an item. Additionally, individual item scores were correlated with their designated factor scores. These correlations ranged from .37 to .84, with an average of .63, and also are reported in Appendix U.
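The internal-consistency check described above uses the standard Cronbach's alpha formula, which the sketch below implements from scratch. The data are placeholders, with the 36-item matrix standing in for one factor's items.

```python
# Minimal sketch of Cronbach's alpha for one factor's items
# (respondents in rows, items in columns).
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(6)
base = rng.normal(size=(211, 1))
factor_items = base + rng.normal(size=(211, 36)) * 0.7  # e.g., 36 SSR items
print(f"Cronbach's alpha = {cronbach_alpha(factor_items):.2f}")
```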


The first factor was named "Staff, Students, and Resources," or SSR. The second factor was named "Assistant Principal," or AP, and the third factor was named "Principal," or P. See Table 23 for the number of items, possible point totals, means, and standard deviations for each factor. These three factors were used to answer Research Question Seven.

Table 23

Descriptive Statistics and Cronbach's Alpha for the Three Factors

Scale Name                                Alpha   # Items   Possible Total   Mean     SD
1. Staff, Students, and Resources (SSR)    0.95      36          180         137.49   28.53
2. Assistant Principal (AP)                0.93      13           65          55.75   11.14
3. Principal (P)                           0.93      11           55          46.13   10.59

Note. N = 211.

Research Question Six: Is there a difference between schools classified as having a high level of School-Wide Positive Behavioral Support (SWPBS) implementation and schools classified as having a low level of SWPBS implementation on the factor scores of the SWIF survey?

To answer this question, a Multivariate Analysis of Variance (MANOVA) was initially considered; however, the sample size was determined to be insufficient to conduct this analysis. Instead, two non-parametric procedures, the Kruskal-Wallis test and the Mann-Whitney U test, were conducted. First, the method used to split the data will be described, followed by a report of the results of the non-parametric analyses.


Descriptive Statistics

The sample for this question included those schools that returned the BoQ as part of the end-of-year evaluation and also had a coach who completed the SWIF for their school. Thirty-six coaches completed both instruments.

Analysis

The sample was divided into three groups using cut-off scores of one standard deviation above and below the mean. The mean BoQ score for the 36 coaches was 70.08, with a standard deviation of 19.78; the lower cut score was therefore 50, and the higher cut score was 90. The lowest implementing group included scores from 0 up to and including 50; the middle implementing group included scores greater than 50 and up to 90; and the highest implementing group included scores from 90 to 100, including 90. Descriptive statistics for the three groups are presented in Table 24. Table 25 presents a guide for interpreting the scores on the SWIF factors.

To determine whether there was a difference between the low, middle, and high groups on the three factors, the Kruskal-Wallis test was conducted. The Kruskal-Wallis test is a nonparametric analogue of a one-way ANOVA and was selected because the sample did not meet the assumptions needed to conduct an ANOVA; however, the sample met the less stringent assumptions required to conduct the Kruskal-Wallis test (e.g., independent observations) (Pallant, 2005).
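The group split and both nonparametric tests described here could be sketched in Python with SciPy as shown below. This is an illustration under stated assumptions: the BoQ scores are simulated placeholders, and the tests are run on those scores rather than on the SWIF factor scores the study actually compared.

```python
# Minimal sketch of the +/- 1 SD group split, the Kruskal-Wallis test,
# and the Mann-Whitney U follow-up between the high and low groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
boq = rng.normal(70.08, 19.78, size=36).clip(0, 100)

low_cut = boq.mean() - boq.std(ddof=1)    # ~50 in the study
high_cut = boq.mean() + boq.std(ddof=1)   # ~90 in the study
low = boq[boq <= low_cut]
middle = boq[(boq > low_cut) & (boq < high_cut)]
high = boq[boq >= high_cut]

h_stat, h_p = stats.kruskal(low, middle, high)
print(f"Kruskal-Wallis: H = {h_stat:.1f}, p = {h_p:.4f}")

u_stat, u_p = stats.mannwhitneyu(high, low, alternative="two-sided")
print(f"Mann-Whitney: U = {u_stat:.1f}, p = {u_p:.4f}")
```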


The results of the Kruskal-Wallis test suggested a difference in the factor scores across all three groups. The results are presented in Table 26. To determine whether there was a difference between the high and low groups, a Mann-Whitney U test was conducted. The results of the Mann-Whitney U test are presented in Table 27 and suggest a significant difference between the two groups on all three factor scores.

Table 24

Instrument Means and Standard Deviations for the Low, Middle, and High Implementing Groups

Implementation Level   n    BoQ             SWIF Total       SSR              AP Factor       Principal Factor
Low                     8   40.38 (6.48)    182.50 (34.90)   102.63 (26.81)   47.25 (10.12)   32.63 (11.71)
Middle                 23   74.57 (10.07)   245.26 (36.13)   139.26 (25.84)   58.30 (8.82)    47.70 (7.21)
High                    5   97.00 (1.87)    292.00 (6.08)    172.80 (5.50)    64.20 (1.79)    55.00 (0.00)

Note. The possible totals are 100 for the BoQ, 300 for the SWIF, 180 for SSR, 65 for AP, and 55 for Principal.

Table 25

Interpretation Guide for SWIF Item, Subscale, and Total Scores

Scale                             Very Problematic    Problematic to    No Influence    Helpful to
                                  to Problematic      No Influence      to Helpful      Very Helpful
Item                                   1-2                 2-3              3-4             4-5
Principal                             11-22               23-33            34-44           45-55
AP                                    13-26               27-39            40-52           53-65
Staff, Students, and Resources        36-72               73-108          109-144         145-180
Total                                 60-120             121-180          181-240         241-300

To further interpret the data, Kruskal-Wallis tests were conducted on each of the items to determine whether there was a significant difference between the high, middle,


and low implementers on any individual item. All but three items yielded significant differences. The three items that did not yield significant differences were "Adequate funding" (p = .39), "PBS procedures in a handbook" (p = .28), and "Assistant Principal's personal commitment to PBS" (p = .45). Several other items produced trends worth noting. Middle and high implementers had similarly high rankings on "Data entered regularly," "Team recognizes/rewards faculty for participation," and "Coach stability of position," which indicates that the presence of these items in a school may facilitate higher implementation. Low and middle implementers had a similar rank for "Assistant Principal's personal commitment to PBS." See Appendix V for item means and standard deviations for low, middle, and high implementers.

Table 26

Results of Kruskal-Wallis Test

              F1: Staff, Students, and Resources   F2: AP   F3: Principal
Chi-Square                 15.8                      14.3        18.3
df                          2                         2           2
Sig                        .000*                     .001*       .000*

Note. These results evaluated the difference on factor scores between low (n = 8), middle (n = 23), and high (n = 5) implementers. * p < .01.

Table 27

Results of Mann-Whitney U Test

              F1: Staff, Students, and Resources   F2: AP   F3: Principal
Chi-Square                 15.8                      14.3        18.3
df                          2                         2           2
Sig                        .000*                     .001*       .000*

Note. These results evaluated the difference on factor scores between the high (n = 5) and low (n = 8) implementers. * p < .01.


Research Question Seven: Which items are perceived as the most helpful (enablers) in the implementation of School-Wide Positive Behavioral Support (SWPBS) by coaches and team members, and which items are perceived as being the most problematic (barriers) in the implementation of SWPBS by coaches and team members?

Quantitative Item Analysis

To answer this question, quantitative and qualitative item analyses were conducted. First, the mean and standard deviation for each item were derived from the data from all respondents with complete datasets. Complete datasets were available for 236 respondents. Of those, 211 reported that they had a second administrator or Assistant Principal (AP) at their school; therefore, for the items related to the AP, the mean was based on a sample size of 211 rather than 236. The items were then ranked from the highest to the lowest mean and are reported in Appendix W. The item with the highest mean (M = 4.63, SD = 0.90) was "Expectations and rules that are clearly defined," while the item with the lowest mean (M = 3.26, SD = 1.57) was "Adequate funding for PBS."

To corroborate the findings from the item mean rankings, the response frequencies for each item are presented in Appendix X. These figures provide a visual representation of the percentage of respondents who selected each response. The items are ranked from those with the highest percentage of respondents who selected a five to those with the lowest.
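The item-level summary just described (per-item means and standard deviations, ranked from highest to lowest mean) could be computed as in the pandas sketch below. The item names and ratings are hypothetical placeholders, not the SWIF data.

```python
# Minimal sketch of the per-item mean/SD ranking for the survey items.
import numpy as np
import pandas as pd

rng = np.random.default_rng(8)
items = [f"item_{i:02d}" for i in range(1, 61)]
responses = pd.DataFrame(
    rng.integers(1, 6, size=(236, 60)),  # 236 respondents, 1-5 ratings
    columns=items,
)

summary = pd.DataFrame({
    "mean": responses.mean(),
    "sd": responses.std(ddof=1),
}).sort_values("mean", ascending=False)

print(summary.head())   # most helpful items (highest means)
print(summary.tail())   # most problematic items (lowest means)
```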


The second analysis was a comparison of item means to differentiate between the responses of coaches, team members, district personnel, and state project personnel. This item analysis by category is presented in Appendix Y.

The third analysis was a comparison of the means and standard deviations of the overall score and the factor scores between respondents of different categories (i.e., type of school, position with school, position in PBS, highest degree, number of years in current school, and number of years with the PBS project) (see Appendix Z). Review Table 25 for a guide to interpreting the meaning of the scores; the table categorizes the item, subscale, and total scores by the range of scores that best represents the ratings on the SWIF Likert scale (i.e., very problematic to problematic, problematic to no influence, no influence to helpful, helpful to very helpful).

While the sample sizes for the categories of respondents ranged from 2 to 119, there were several interesting trends to note. The highest overall mean for type of school was obtained by the respondents in elementary schools (M = 255.69, SD = 35.90). The highest for position with PBS was obtained by respondents who were team members (M = 242.45, SD = 41.70), and the highest for position in school was obtained by respondents who were office staff (M = 288.67, SD = 7.23); however, there were only three respondents in the last category. The next highest means in the school position category were from Principals (M = 260.18, SD = 26.67) and Assistant Principals (M = 257.39, SD = 26.52). These two groups also had the highest mean scores for the three factor categories of "Staff, Students, and Resources," "Principal," and "Assistant Principal." The respondents whose highest degree was a high school


diploma/some college had an overall mean that was 22-35 points higher than that of any other group (M = 278.40, SD = 27.72), but there were only five respondents in this category. In the years-with-PBS category, schools in years one and two had similar overall scores (M = 240.66, SD = 44.67 and M = 241.45, SD = 39.45, respectively). Finally, the respondents who had been in their current school for two years (M = 258.27, SD = 35.37) and four years (M = 253.27, SD = 34.18) had the highest mean scores.

Qualitative Item Analysis

Analysis of the open-ended questions was conducted by the primary investigator and was validated by a second researcher who worked with the FLPBS project. The responses were divided into the categories from the survey: items that were helpful in the PBS process, items that were problematic, and open-ended responses at the end of the survey. The open-ended responses were intended to provide feedback regarding the SWIF survey rather than the SWPBS process; however, respondents listed additional information about the SWPBS process under this category. These responses were placed in the helpful or problematic category based on their content.

The responses were then reviewed for items that were redundant with the survey. The intent of the open-ended questions was to generate additional problematic and helpful factors, so responses that were already mentioned in the survey were deleted. The content analysis, therefore, represents only new responses and does not necessarily represent the most frequent responses. For example, many respondents mentioned that a committed, motivated team was helpful in the


process. As "committed team" is an item on the survey, these responses were not included in the qualitative analysis.

The problematic and helpful categories were then analyzed separately but with the same procedures. The responses in each category were divided into individual thought units. For example, if a respondent wrote "principal was involved and team meetings went smoothly," the statement was divided into two statements. Next, similar thought units were grouped together. When thought units were nearly identical, they were combined into one statement, and a count was kept of the total number of statements in that category. When the statements could not be combined any further, the categories were named by their theme.

Both researchers independently created a table with the items grouped according to category. The original table created by the researcher is presented in Appendix AA. The primary researcher compared the categories and items to those of the second researcher and made any modifications to the table that would simplify and better organize the items. The final table is presented in Appendix AB, with all changes presented in bold-faced print. Of the final categories and items presented in Appendix AB, the problematic items (barriers) included the categories of hurricanes, team, coach, district, principal, staff, staff training, retraining, teaching expectations, rewards, referral system, and consequences. The helpful items (enablers) included similar categories, with the addition of FLPBS staff, funding, and parents.


Chapter V

Discussion

Intervention studies have changed as researchers have begun to address factors such as implementation process and treatment integrity variables, and treatment adherence, in addition to the examination of intervention outcomes (Walker, 2004). The study of the implementation integrity of interventions is important to the continuing development of effective service-delivery systems in schools. Walker asserts that "perhaps the greatest opportunity for improving understanding of applied interventions lies in the systematic study of the implementation process and careful assessments of a range of variables affecting its quality" (p. 403). This chapter provides a discussion of the findings from the systematic study of the statewide implementation process for Schoolwide Positive Behavior Support (SWPBS) and variables affecting the quality of its implementation. It also provides an overview of crucial SWPBS implementation components that have been identified in the literature, as well as a discussion of implementation trends and differences between high and low implementing schools on these components. Then, components that influence the implementation process (i.e., team functioning, administrative support, and coaching) and the influence of school characteristics on the implementation of SWPBS are examined, followed by a discussion of the limitations and implications of this study.


Overview of SWPBS Implementation

SWPBS Implementation Components

Benchmarks of Quality. This study measured the degree to which SWPBS was implemented in 91 schools across the State of Florida using the Benchmarks of Quality survey (BoQ; FLPBS Project, 2005) to assess implementation. The results from the analysis of BoQ scores provide information regarding the degree to which each component of SWPBS was implemented. The mean BoQ score for all schools indicated that school PBS teams were implementing approximately two-thirds of all SWPBS components. The mean subscale scores ranged from 45% to 85%, with most PBS school teams implementing less than one-half of some components but almost all of other components. A ranking of mean BoQ subscale scores provided information on the components that were most and least pronounced in the schools. The subscales of "Effective procedures for dealing with discipline," "Crisis plan," and "PBS team" yielded three of the highest mean scores. This finding is not surprising because the PBS team, discipline system, and crisis plan are often in place to some degree prior to the initiation of PBS training. Because the development of a team is a prerequisite in Florida for the opportunity of school personnel to attend SWPBS training, this component must initially be in place for all schools prior to implementation. In addition, teams are likely to focus initially on the discipline system because the personnel in their schools tend to be highly concerned about the effectiveness of current discipline practices (Hall & Hord, 2001). Therefore, early expectancy for training teams to modify their


existing discipline system may provide a strategic opportunity to obtain team buy-in during initial training and, subsequently, for teams to obtain staff buy-in during the initial stages of the implementation process.

The components of "Expectations," "Reward system," and "Data analysis system" received the next highest mean scores. These components may have received lower scores than the top three because they were not part of the school culture prior to the introduction of SWPBS. Teams had to develop these components during and after PBS training; therefore, it may have taken teams longer to implement them. Some of the qualitative items on the BoQ indicated that teams that had developed and implemented an effective discipline system found that the system was helpful in identifying the most severe problems and in decreasing referrals. Many respondents, however, reported that staff did not apply the discipline system consistently because they had not been properly trained in its use and because school leaders did not remind or encourage staff to use the system correctly.

The component of "Lesson plans for teaching expectations" yielded the lowest mean score of all subscales. This indicates that teaching behavioral expectations was the last component to be implemented, if it was implemented at all. One possible explanation for this finding is that teaching of the expectations must be implemented by the school staff rather than by the members of the PBS team. Therefore, the teams had less control over the implementation of this component than they did over the development and implementation of the expectations, reward system, and data analysis system. Additionally,


respondents mentioned that they would have liked a sample or pre-made behavioral curriculum to use in their school to facilitate the teaching of expectations. This need could be addressed in future implementation efforts for SWPBS.

"Implementation plan" received the second lowest mean rating even though it is one of the most essential components. The items in this category relate to providing training to both the staff and the students about the purpose and implications of the SWPBS plan. This indicates that, even if the PBS teams created the other components (e.g., discipline procedures, expectations, reward system), they may not have been effectively informing and training their students and staff about these components and may need greater support to integrate them into their school system. Additionally, PBS team members qualitatively indicated a lack of time during the school year to train the staff. Moreover, many PBS team respondents stated that they would have liked more time during the summer to prepare for PBS and to provide staff training.

School-wide Implementation Factor Survey (SWIF). To facilitate the interpretation of rankings of the SWPBS components, the SWIF items with the highest and lowest scores were examined. The mean scores of all items were ranked from highest to lowest. The items with the highest means, indicating that they were rated as "most helpful," were "Expectations and rules that are clearly defined" and several items relating to the commitment of the Assistant Principal (AP) to the SWPBS implementation process. It is possible that the AP, who is often directly involved with discipline, can have a large influence as a


gatekeeper on the implementation process. In contrast, the items with the lowest means were "Adequate funding for PBS" and several items relating to the staff's philosophy toward SWPBS. This indicates that these items were considered the most problematic in the implementation process and were likely some of the hardest components to change in a school environment. First, the PBS team does not likely have control over the funding of its PBS efforts in the school, and team members may lack the skills to fundraise in the community. The finding related to the staff's philosophy is consistent with past research. According to Rogers (2003), about 29% of any group will be skeptical of change, and 17% of a group will resist change altogether. Therefore, the PBS teams encountered a challenge that commonly occurs in systems change.

High vs. Low Implementing Schools

The difference between high, middle, and low implementing schools on the three factors of the SWIF survey and their respective items also was examined. The three factors were Principal; Assistant Principal; and Staff, Students, and Resources. Respondents from high implementing schools had significantly higher scores on all three SWIF factors than respondents from low implementing schools. There also was a clear trend in the data: respondents from low implementing schools had lower scores on the factors than did respondents from middle implementing schools, who in turn had lower scores on the factors than did respondents from high implementing schools. There are two ways to view these findings. First, it is possible that teams that experienced success were reinforced for their success and were motivated to continue


They, therefore, viewed the factors/items on the survey in a more favorable manner because they had been experiencing success with implementation. It also is possible that the more successful teams had members with a more positive attitude toward implementation than the less successful teams. The reason these teams had a higher level of implementation may have been that the team members viewed these factors as means to facilitate implementation instead of viewing them as barriers. Therefore, it is possible either that a team's positive attitude toward the items/factors facilitated its success with implementation or that a team's success with implementation influenced its attitude toward the items/factors related to implementation.

A review of the specific item rankings for each group revealed that middle and high implementing schools scored similarly on "Data entered regularly," "Team recognizes/rewards faculty for participation," and "Coach's stability of position." These three items are, therefore, related to a higher level of implementation and may be key factors in improving implementation. In contrast, the only three items that did not reflect significant differences between the groups were "Adequate funding," "PBS procedures in a handbook," and "Principal's personal commitment to PBS." As noted above, adequate funding received the lowest mean rating for schools at all levels of implementation. All respondents also were consistent in assigning a high rating to "Principal's personal commitment to PBS," indicating that this factor was perceived as being important and as having a strong influence on the implementation of SWPBS.


Trends in Implementation

No significant differences in levels of implementation were found among schools in their first, second, or third year of implementation. There was, however, a trend toward a higher level of implementation in schools that had been implementing SWPBS for a longer period of time. This trend was consistent with results obtained by Nersesian, Todd, Lehmann, and Watson (2000), who found that schools that had been implementing PBS for a longer period of time reflected a higher level of implementation than did those that had been implementing the program for less time. When these authors compared the same schools' implementation over time, they found that the majority of schools demonstrated an increase in their level of implementation from one year to the next. Lewis-Palmer, Irvin, Sugai, and Boland (2004) also found that schools showed a significant increase in level of implementation when evaluated with the Schoolwide Evaluation Tool (SET; Lewis-Palmer, Irvin, Sugai, & Boland, 2004) 6 to 24 months following the initial training. It is important to note, however, that both of these studies included fewer than 15 schools in their samples. More research, therefore, is needed to examine implementation trends over time across a large cohort of schools (e.g., 100 or more).
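To make the year-of-implementation comparison concrete, the sketch below illustrates one reasonable way such a test could be run: a one-way ANOVA on school-level BoQ totals grouped by year of implementation. The scores are hypothetical placeholders, not study data, and the study's exact analysis procedure is not restated here.

# A minimal sketch, assuming hypothetical BoQ totals grouped by cohort.
from scipy import stats

year1 = [52, 61, 58, 70, 65, 55]  # hypothetical year-1 schools
year2 = [60, 66, 59, 72, 68, 63]  # hypothetical year-2 schools
year3 = [71, 69, 75, 80, 66, 73]  # hypothetical year-3 schools

f_stat, p_value = stats.f_oneway(year1, year2, year3)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# Ascending group means with a non-significant p would mirror the pattern
# reported here: an upward trend by year, but no significant difference.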


Although the trend in implementation across schools in the first, second, and third year of implementation was consistent with past research, the lack of a significant difference between schools in their first and second year of implementation was surprising. Due to the non-experimental nature of this study, there are several issues to consider when interpreting these results. First, the FLPBS project continues to evolve, improving over time. For example, the training manual and supportive services provided to districts and schools were modified and improved considerably from the initiation of training in 2002 to 2004. School staff who received training during earlier stages of the project, in 2002 and 2003, received training that was slightly different from that received by those trained in 2004. While it might be assumed that the training and assistance provided to PBS schools have improved over time, this assumption cannot be validated.

Another consideration regarding the absence of a significant difference between schools engaged in the first versus the second year of implementation is that full implementation of school-wide innovations often takes between three and five years (Chapman & Hofweber, 2000; Hall & Hord, 2001; Taylor-Greene et al., 1997). The entry and acceptance phase alone can take up to two or three years (Ponti et al., 1988). As the majority of the schools included in this study were only in their first or second year of implementation, their levels of implementation may not yet have stabilized. OSEP reports that PBS must be implemented with high accuracy and sustained for a period of 5 to 10 years to be effective (OSEP, 2004); therefore, the implementation in these schools should continue to be monitored each year.

Implementation Process

Walker (2001) stressed the importance of careful assessment of a range of variables potentially affecting the quality of PBS implementation. The current study is one of few to quantitatively measure the impact of process variables (i.e., team functioning, administrative support, and coaching) on the implementation of a school-wide program.


The proportion of variance in the implementation of PBS explained by the three implementation process factors addressed in this study was found to be low to moderate. The team functioning variable made the only significant contribution and explained the highest proportion of unique variance among these three variables.
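A predictor's unique variance contribution can be expressed as the drop in R-squared when that predictor is removed from the full regression model (the squared semipartial correlation). The sketch below shows one common way to compute it; the column names and values are hypothetical stand-ins, not the study's dataset.

# A minimal sketch, assuming hypothetical predictor and outcome columns.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "team_functioning": [3.2, 4.1, 2.8, 3.9, 4.5, 3.1, 2.6, 4.0],
    "admin_support":    [4.0, 4.2, 3.5, 3.8, 4.6, 3.0, 2.9, 4.1],
    "coaching":         [3.5, 3.9, 3.0, 4.2, 4.4, 2.8, 3.1, 3.7],
    "boq_total":        [58,  75,  49,  70,  82,  51,  44,  73],
})

predictors = ["team_functioning", "admin_support", "coaching"]
y = df["boq_total"]

# Full model R-squared versus each reduced model's R-squared.
full = sm.OLS(y, sm.add_constant(df[predictors])).fit()
for p in predictors:
    reduced = sm.OLS(y, sm.add_constant(df[[q for q in predictors if q != p]])).fit()
    print(f"{p}: unique R^2 = {full.rsquared - reduced.rsquared:.3f}")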


Each variable is discussed below in relation to past research.

Team Functioning

As noted above, the team functioning variable was statistically significant and had the strongest relationship with implementation compared to the other variables examined in this study. The influence of team functioning was not surprising, since team efforts and teamwork have been identified as essential change agents (Hall & Hord, 2001; Lohrmann-O'Rourke et al., 2000; Taylor-Greene & Kartub, 2000; Lewis et al., 1998). Qualitatively, some teams reported that it was helpful to have parents as members and that the commitment of at least a few core team members who did a significant amount of work facilitated the implementation process. Others stated that it was problematic when only a few team members did all the work, as was turnover in team membership. One possible solution to this problem could be for the team to determine a minimum period of time for which an individual would be expected to serve on the team or for the team to change members after a specified period of time. Another solution would be to offer team members incentives, such as financial incentives, to remain on the team. When there is team turnover, however, it is important to create a system to train the new members of the team.

Given the finding of this study that the effectiveness of the team is an important influence on the implementation process, factors such as these should be carefully addressed during the team development process. The significance of effective team functioning also is consistent with the argument put forth by Schmuck and Runkel (1994) that the success of an organization development intervention depends on the ability of the recipients to work together to achieve their intervention goal.

Administrative Support

The administrative support and coach's self-assessment variables explained very little of the implementation variance. The lack of influence from the administrative support variable was surprising in view of the large body of evidence reported in the literature that district- and building-level administrative support are essential components for the initiation and sustainability of successful program implementation (Chapman & Hofweber, 2000; Curtis & Stollar, 2002; Gottfredson, Gottfredson, & Hybl, 1993; Knoff, 2002; Lewis et al., 1998; Lohrmann-O'Rourke et al., 2000; Nersesian et al., 2000; Noell & Witt, 1999; Sadler, 2000; Taylor-Greene & Kartub, 2000).

One explanation for this finding is that many of the items used to measure administrative support were subjective (e.g., "A school-based administrator is an active member of the team."), and many of the respondents may not have felt comfortable selecting a lower rating for these items because of a concern that their administrator would review them. Another explanation for this finding is that administrative support is a requirement for becoming involved in the PBS project; therefore, the schools involved in the PBS Project already had administrative approval to participate.


On the other hand, both the factor analysis and the item analysis of the SWIF suggested the influence of this variable. Three distinct factors were generated from the factor analysis: (1) Principal, (2) Assistant Principal, and (3) Staff, Students, and External Resources. The identification of these factors supports measuring administrative support as two separate categories: Assistant Principal (AP) and Principal. The items in the AP and Principal factors, particularly the items in the AP category, tended to be rated as more helpful than items included in the other factor. More research is needed relating to both the measurement of this variable and its impact on implementation.
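As an illustration of the kind of factor extraction described above, the sketch below pulls three latent factors from a matrix of survey item responses. It uses scikit-learn's FactorAnalysis on randomly generated placeholder data; the study's actual extraction and rotation choices are not specified in this excerpt.

# A minimal sketch, assuming a placeholder respondent-by-item matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = rng.normal(size=(200, 12))  # 200 hypothetical respondents x 12 items

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(items)

loadings = fa.components_.T  # item-by-factor loading matrix, shape (12, 3)
print(loadings.round(2))     # inspect which items load on which factor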


Qualitatively, many teams reported that good leadership, such as having a leader with good organizational skills, was important and helpful in the implementation of SWPBS at their school. Teams reported that problems resulted when principals were either too controlling or not controlling enough. One team noted that their principal's evaluation by the district was linked to the number of office discipline referrals (ODRs) that the school reported, thus discouraging that principal from engaging in the PBS process because, initially, the PBS process can lead to an increase in the number of ODRs. Factors such as these highlight the importance of considering organizational policies and procedures in terms of their potential influence on the implementation process.

Coaching

The construct of coach self-efficacy derived from the coaches' self-assessment measure was not found to be significantly associated with implementation as measured by the BoQ. The lack of association between the coaches' self-efficacy and PBS implementation was less surprising because the instrument measured each coach's self-ratings of his or her own ability and not his or her actual influence on the team. While these findings suggest that the coach's self-efficacy was not associated with the degree of implementation, it is likely that the coach himself or herself did have an influence on implementation. In fact, many of the qualitative items indicated that having a coach was helpful and that not having a coach was problematic.

One consideration for these findings is that there was a lower response rate for the Coach's Self-Assessment (69%) than for the Team Process Survey (87%). Some coaches assist more than one school but may have completed the survey for only one school. It would be important to devise techniques to improve the response rate for all instruments, perhaps by providing stronger incentives and/or easier processes for completing the tools, such as online surveys.

Very little research has been dedicated to the measurement of team functioning, administrative support, and the influence of the coach on the implementation of organizational changes. This study represents an initial attempt to measure some of the qualitative predictors of implementation that have been reported in the literature; however, a great deal more research is needed that addresses the measurement of these variables.


Technical Assistance

Technical assistance provided by the FLPBS staff was mentioned by ten respondents in the qualitative items as being helpful in implementation. This is consistent with prior research reported in the literature. In an evaluation of the obstacles faced by eight of the Blueprint Violence Prevention programs implemented across 42 sites, quality of technical assistance was identified as one of the most influential factors (Mihalic & Irwin, 2003, cited in Walker, 2004). Rosenberg and Jackman (2003) also purported that administrators, as well as staff, need support to implement and sustain comprehensive interventions. In this study, support was provided by the FLPBS staff in the form of technical assistance. Although the variable of technical assistance was not examined as a factor in the multiple regression analyses completed for this study, the qualitative data indicated its importance.

Implementation and School Characteristics

This section describes the association, or lack of association, between the characteristics of a school and the implementation of SWPBS. The first subsection describes the association between SWPBS implementation and the demographic characteristics of the school. The second subsection describes the association between SWPBS implementation and the degree to which a school experienced discipline and academic problems prior to initially implementing a school-wide initiative.


Demographic Characteristics

School level. Differences in SWPBS implementation were examined according to school level (i.e., elementary, middle, high, and center schools), with no significant differences found between levels. However, elementary schools and center schools each had slightly higher mean BoQ scores than did middle schools and high schools. As noted in an earlier section, there was a trend toward a higher level of implementation in schools that had been implementing SWPBS for a longer period of time. This trend was present in elementary schools and high schools, but not in middle schools and center schools. The sample size for some of these groups, however, was extremely small (e.g., there was only one middle school in the third year of implementation). Similarly, when the scores on the SWIF survey were compared across respondents from each school level, respondents from elementary schools tended to have the highest ratings across all categories, indicating that they perceived the fewest problems in the implementation of SWPBS.

Student and school building variables. None of the student or school building demographic variables was found to have a significant association with the level of SWPBS implementation, indicating that neither the type of school nor the type of students enrolled was associated with SWPBS implementation. This corroborates the finding that there was no difference in implementation between elementary, middle, high, or center schools, suggesting that SWPBS can be implemented across student populations and at different levels of schools.


One study that examined the association between socio-cultural factors and the implementation of a school-wide reform initiative, Success for All (Cooper, 1998), obtained both similar and dissimilar results with regard to the association between these factors and implementation. Cooper reported finding an association between implementation and lower student mobility (stability), higher student attendance rates (not included in this study), and a higher percentage of white students (minority rate); however, he did not find that student SES was associated with implementation. Cooper also did not find that school size was associated with implementation, which is consistent with the findings of the present study.

The results of the current study also do not indicate an association between small class sizes or teacher-student ratio and a higher level of SWPBS implementation. While prior research has found that classroom discipline problems were decreased or nearly eliminated with smaller class sizes (Betts & Shkolnik, 1999; Molnar et al., 1999; Finn & Achilles, 1999; Nye et al., 1999), this study did not find that smaller teacher-student ratios had a positive or negative association with implementation of SWPBS. Smaller class sizes, therefore, may facilitate better classroom discipline, but they do not necessarily facilitate better implementation of SWPBS. It may be important to note that classroom discipline is a class-specific variable, while SWPBS is a school-wide variable.

In conclusion, past research that supports the association of student stability, minority rate, school size, teacher-student ratio, and teacher quality with the implementation of school-wide programs was not corroborated by the findings of this study.


In contrast, research that does not support the association of the variables of socioeconomic status and school size with implementation was corroborated in this study. The lack of association between these socio-cultural factors and SWPBS implementation was promising, as this finding supports the adaptability of this innovation for different student populations and types of schools.

Academic and behavioral indicators. The schools' discipline and academic indicators prior to implementation, or the severity of need, did not explain a significant portion of the variance in the implementation of SWPBS. While the combination of these variables did not explain a significant portion of the variance, there was a relationship between higher scores on each of these variables (e.g., a higher level of problems prior to implementation) and a lower implementation score. In other words, schools with fewer problems prior to implementation had higher implementation scores. The strength of the correlations between each variable and the implementation scores, however, was low (<.15). Therefore, there were relationships between implementation and the academic and behavioral indicators prior to implementation; however, these indicators as a group did not significantly contribute to the variance in implementation.
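The distinction drawn here, that each indicator can correlate weakly with implementation while the block as a whole explains no significant variance, can be checked directly by computing the individual correlations. The sketch below uses fabricated placeholder values, not the study's data, and the indicator names are illustrative.

# A minimal sketch, assuming hypothetical school-level values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 67                            # schools in the analysis
boq = rng.normal(70, 10, size=n)  # hypothetical implementation scores
indicators = {
    "suspensions": rng.normal(size=n),
    "office_referrals": rng.normal(size=n),
    "below_grade_reading": rng.normal(size=n),
}

for name, values in indicators.items():
    r, p = stats.pearsonr(values, boq)
    print(f"{name}: r = {r:+.3f}, p = {p:.3f}")
# Weak correlations (|r| < .15) like those reported above would each
# explain under roughly 2% of the variance in implementation scores.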


As both Harvey and Brown (2000) and Schmuck and Runkel (1994) recommend that organizations must perceive a need for change before any program can or should be implemented, it is possible that it is the perception of a need, and not an actual need as demonstrated by the school's data, that makes the difference. More research is needed that examines both perceptions and indicators of need for change and their influence on the implementation of interventions in organizations.

Limitations and Considerations

The results of this study must be interpreted in light of several threats to internal and external validity. Internal validity is the degree to which extraneous variables are controlled. Threats to internal validity in this study relate to social desirability, instrumentation, and history. External validity is the degree to which the results can be generalized to the general population (Fraenkel & Wallen, 2003). Threats to external validity in this study include ecological or population validity and sample bias.

Internal Validity

Social desirability. As multiple self-report instruments were used in this study, there was the possibility of bias or inflated scores on the instruments due to social desirability, that is, the influence on responses of the respondent's beliefs regarding what is thought to be desirable. This was evident in that all the instruments reflected a negatively skewed distribution, indicating that the respondents tended to select higher ratings on most items to describe themselves or their team. However, this phenomenon is common for scales used in the social sciences (Pallant, 2005) and is unlikely to invalidate the results.

A trend was specifically noted on the SWIF survey. As the SWIF was completed by groups of individuals, comparisons of the mean scores for each item and factor were made according to groups. Individuals tended to respond most positively to items that referred to their own position.


For example, principals and assistant principals rated the "Principal" and "Assistant Principal" factors of the SWIF higher than did other school personnel. However, they also rated the items in the "Staff, Students, and Resources" factor higher than did other school personnel. Principals and assistant principals, therefore, selected higher ratings for their schools on all items and categories of the SWIF than did other school personnel. Likewise, district personnel rated the items related to district personnel higher than team members did, but rated these same items lower than the coaches did. For two out of the three coach items, the coaches rated the items higher than did the other respondents, but they rated the item of "availability" lower than the team members did.

In contrast, respondents from the Florida State Project assigned lower ratings, as reflected in a lower mean score, than the other respondents did for most of the items, indicating that they believed these items were problematic for the schools. The respondents from the state project staff may have rated the items as more problematic because they were more familiar with the exact implementation protocol and more aware of deviations from it than were the other respondents. In other words, they may have used a higher standard to judge the degree to which each item impacted a school's implementation. While there were only five respondents from the state project, this pattern was consistent across the majority of the items. Findings such as these, which indicate differences in the responses from different groups of people, underscore the importance of obtaining multiple perspectives on surveys related to schools or, perhaps even more importantly, of identifying data-based strategies for measuring variables rather than relying on perceptions and beliefs.


Instrumentation. Instrumentation often is considered a threat to internal validity. The coaches were trained in administration of the BoQ in January, 2005. Exposure to the items on the instrument may have increased their likelihood of addressing factors associated with these items in the months following the training.

History of events. The history of events in Florida also could have influenced SWPBS implementation. At the beginning of the 2004-2005 school year, hurricanes occurred in several counties, causing severe damage and the closure of schools for several days to several weeks. This disruption may have influenced the implementation of SWPBS in the parts of the state that were affected. In fact, 13 respondents on the SWIF reported hurricanes as a factor that was problematic to the implementation of SWPBS.

External Validity

Population validity. Population validity is the extent to which the results from this population can be generalized to the larger population in all settings, contexts, and conditions (Fraenkel & Wallen, 2003). Since this study was conducted with schools in Florida, a comparison of the demographics of the Florida schools with the general population is useful in interpreting the data. This sample included schools with slightly higher rates of students living in poverty and of students who were members of a minority group than would be expected in the general school population across the United States. In this sample, none of the schools had a low poverty rate, and 90% of the schools (60 out of 67) had a high poverty rate. The USDOE (2002) defines schools with greater than 30% of students on free and reduced lunch as having a high poverty rate and schools with less than 15% of students on free and reduced lunch as having a low poverty rate. Similarly, this sample had a slightly higher percentage of students who were members of minority groups than would be expected in the U.S. population. Low minority student enrollment is defined as less than 10% of total student enrollment, whereas high minority student enrollment is defined as greater than 50% of total student enrollment (USDOE, 2002). In this sample, only one school was considered to have a low minority student enrollment rate, while 34% of the schools were considered to have a high minority student enrollment.
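The USDOE (2002) cutoffs cited above amount to simple threshold rules. The helper functions below restate them in code for clarity; the function names are illustrative shorthand, not USDOE terminology.

# A minimal sketch of the USDOE (2002) classification thresholds.
def poverty_level(pct_free_reduced_lunch: float) -> str:
    """Classify a school by percent of students on free/reduced lunch."""
    if pct_free_reduced_lunch > 30:
        return "high poverty"
    if pct_free_reduced_lunch < 15:
        return "low poverty"
    return "neither high nor low"

def minority_enrollment_level(pct_minority: float) -> str:
    """Classify a school by percent minority student enrollment."""
    if pct_minority > 50:
        return "high minority enrollment"
    if pct_minority < 10:
        return "low minority enrollment"
    return "neither high nor low"

print(poverty_level(42))             # -> "high poverty"
print(minority_enrollment_level(8))  # -> "low minority enrollment"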


The descriptive data for academic and behavior indicators for the sample in this study were compared to data for schools involved with the FLPBS project that did not return their instruments to determine whether there was bias in the sample. For the schools included in this study, no significant differences were found in the percentage of in-school suspensions (ISS), out-of-school suspensions (OSS), office discipline referrals (ODR), or percentage of students below grade level in reading (BGLR) between the schools from which a BoQ was received and those from which a BoQ was not received. The data for the academic and behavior indicators for this sample also were reviewed and, where possible, compared to available national data to determine whether these data were reasonable and representative of the national student population. The average number of ODRs was in the sixties, which seemed to be a very low estimate for one school year compared to national data. Additionally, the ODR distribution had a very high kurtosis value, indicating a peak in the distribution at the low end. This is a result of many of the sample schools reporting very low totals (<20) for ODR.
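A distribution like the one described, with many very low ODR totals and a few extreme highs, can be checked for shape directly. The sketch below computes skewness and excess kurtosis on hypothetical counts chosen only to mimic that pattern.

# A minimal sketch, assuming hypothetical per-school ODR totals.
from scipy.stats import kurtosis, skew

odr_totals = [5, 8, 12, 15, 9, 11, 60, 65, 70, 55, 420, 450]
print(f"skewness = {skew(odr_totals):.2f}")             # positive: long right tail
print(f"excess kurtosis = {kurtosis(odr_totals):.2f}")  # large: sharp peak, heavy tails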


It cannot be determined whether these totals are accurate or whether schools tended either to have poor systems for collecting these data or to underreport their data to the Florida Department of Education. On the other hand, some schools had extremely high values for ODR (>400), and it is not clear whether these are errors in reporting, accurate and honest reports, or reports from schools that use a different basis for data collection (e.g., schools that include very minor incidents in their ODR reports). The average percentage of students who were below grade level in reading in this sample was approximately 50%. These data were higher than the reported national averages (i.e., 37% of 4th graders and 26% of 8th graders were reading below grade level in 2003, and 26% of 12th graders were reading below grade level in 2002; USDOE, 2004a).

Unfortunately, the characteristics of those schools that did not return their evaluation materials could not be evaluated, but many of them are likely to be implementing SWPBS at a lower level than those that returned the evaluation materials. The bottom 33% of this sample, therefore, may not accurately represent the schools that are implementing at the lowest level. FLPBS staff reported a misperception among schools implementing at a low level that they did not need to return their evaluation materials because their implementation level was too low. Additionally, the actual completion of evaluation materials by personnel in these schools very likely would have been more challenging because they may not have had a team in place or their team may not have met on a regular basis.


Access to the information requested, therefore, would have been challenging for these teams, and many of the schools that were implementing at a very low level are not accounted for in this sample. As one purpose of this study was to understand the reasons for low implementation levels, it is important to develop procedures to obtain information from low implementing schools and to encourage them to return their evaluation materials.

Variables

Another limitation of this study was the inability to obtain data relating to some variables because the data were not available on a state-wide level. For example, the variable of teacher stability may be associated with implementation, but schools do not keep a systematic record of information relating to this variable. It also was not possible to obtain data for center/other schools, as the FLDOE did not collect these data due to the frequent changes in their student populations.

The variables also were restricted by the methods of data collection used by the state department of education. For example, OSS was represented as the percentage of students who had received one or more suspensions during the current school year. There was no method to collect data relating to students who had received two or more suspensions in one year, even though this information would have been valuable for the purposes of this study.


Another example is the inability to separate ODR data according to major versus minor infractions. This distinction would have provided valuable information to the study; however, schools are inconsistent in their data collection relative to minor referrals; therefore, these data were not available. Finally, socio-cultural data were obtained from the school year prior to the one in which PBS implementation was measured. Despite these limitations, the data that were collected represent a valid source of information, and the results should be considered one interpretation of the data that exist relating to the implementation of school-wide innovations.

Implications

The findings of this study suggest several implications for implementing school-wide innovations in educational settings. This study highlights the trend that schools that have been engaged in the implementation of SWPBS for a longer period of time demonstrated higher levels of implementation. This trend should be shared with schools to provide them with the motivation to continue implementing SWPBS for more than one year, even if implementation is lower in the first year than they had anticipated or hoped. These trends should continue to be documented with the goal of creating an implementation blueprint of the typical implementation patterns actually experienced in schools.

This study also suggests several ways to improve the SWPBS implementation process in schools. First, this study demonstrates the importance of team functioning to implementation. State projects and coaches, therefore, should emphasize team functioning by providing both training on team functioning and continuous feedback to the team about its functioning.


Second, it is important to secure the commitment of the assistant principal in the process. Teams also should focus on training school personnel on the critical components of SWPBS and on using the training as an opportunity to influence the philosophy of the staff regarding SWPBS. Teams that rewarded staff for participating in the PBS process obtained positive results. Finally, two components rated as the most helpful in the SWPBS implementation process were having clear expectations and regular data entry. While clear expectations can be developed during the initial training, coaches and technical assistance staff should assist schools in developing an effective system for data entry and should provide continuous feedback to teams that have not incorporated this component into their school culture.

This study highlights the need for future research on the measurement of levels of implementation and on the link between implementation and outcomes of SWPBS. To conduct this research, techniques to increase the involvement of non-responding schools in future research and evaluation must be considered. Implementation trends in SWPBS across large cohorts of schools and multiple years also must be carefully evaluated. The variables addressed in this study, as well as the variable of technical assistance, should be included in future quantitative investigations of implementation. With only 5% of 1,200 intervention studies reporting the collection of implementation data (Durlak, 1997), this study represents one of the first empirical efforts to examine carefully implementation trends in a large number of schools. Similar purposes and methods should be incorporated into future research.


References

Annie E. Casey Foundation. (1995). The path of most resistance: Reflections on lessons learned from New Futures. Baltimore: Author.

Bartlett, M.S. (1954). A note on the multiplying factors for various chi square approximations. Journal of the Royal Statistical Society, Series B, 16, 296-298.

Betts, J.R., & Shkolnik, J.L. (1999). The behavioral effects of variations in class size: The case of math teachers. Educational Evaluation and Policy Analysis, 21, 193-213.

Chapman, D., & Hofweber, C. (2000). Effective behavior support in British Columbia. Journal of Positive Behavior Interventions, 2, 235-237.

Cohen, R., Childs, K., & Kincaid, D. (2005). School-wide Positive Behavior Support Implementation Factors Survey (SWIF). Unpublished instrument, University of South Florida.

Colvin, G., Kameenui, E.J., & Sugai, G. (1993). Reconceptualizing behavior management and school-wide discipline in general education. Education and Treatment of Children, 16, 361-381.

Commission for Positive Change in the Oakland Public Schools (CPCOPS). (1992). Keeping children in schools: Sounding the alarm on suspensions. (ERIC Document Reproduction Service No. 350 680).

Cook, R.D., & Weisberg, S. (1982). Residuals and influence in regression. New York: Chapman and Hall.


Cooper, R., Slavin, R.E., & Madden, N.A. (1997). Success for All: Exploring the technical, normative, political, and socio-cultural dimensions of scaling up (Report No. 16). Baltimore: Johns Hopkins University, Center for Research on the Education of Students Placed At Risk.

Cooper, R., Slavin, R.E., & Madden, N.A. (1998). Success for All: Improving the quality of implementation of whole-school change through the use of a national reform network. Education and Urban Society, 30, 385-408.

Curtis, M.J., & Stollar, S. (2002). Best practices in system-level change. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 223-234). Washington, DC: National Association of School Psychologists.

Darling-Hammond, L., & Post, L. (2000). Inequality in teaching and schooling: Supporting high quality teaching and leadership in low income schools. In R.D. Kahlenberg (Ed.), A nation at risk: Preserving public education as an engine for social mobility (pp. 127-168). New York: The Century Foundation Press.

Delpit, L. (1995). Power and pedagogy in educating other people's children. In L. Delpit (Ed.), Other people's children: Cultural conflict in the classroom. New York: New Press.

Dillman, D.A. (1978). Mail and telephone surveys: The total design method. New York: John Wiley & Sons.


Donahue, P.L., Finnegan, R.J., Lutkus, A.D., Allen, N.L., & Campbell, J.R. (2001). The nation's report card: Fourth grade reading 2000. Washington, DC: U.S. Department of Education, National Center for Education Statistics, Office of Educational Research and Improvement.

Dow, P. (1991). Schoolhouse politics: Lessons from the Sputnik era. Boston: Harvard University Press.

Dunham, R. (1998). Organizational behavior. Retrieved May 16, 2005, from http://instruction.bus.wisc.edu/obdemo/readings/ngt.html

Dunlap, G. (2002). Critical features of positive behavior support. Association for Positive Behavior Support Newsletter, 1, 1-4.

Durand, V.M., & Carr, E.G. (1985). Self-injurious behavior: Motivating conditions and guidelines for treatment. School Psychology Review, 14, 171-176.

Eber, L., Lewandowski, H., Horner, R., & Sugai, G. (2003-2004). Illinois Positive Behavioral Interventions and Support Project 2003-2004 progress report: A collaborative effort between the Center on Positive Behavioral Interventions and Supports. Illinois: Author.

Eber, L., Lewis-Palmer, T., & Pacchiano, D. (2001, February). School-wide positive behavior systems: Improving school environments for all students including those with EBD. Paper presented at the 14th Annual Research Conference, Tampa, FL.

Edelman, M., Beck, R., & Smith, P. (1975). School suspensions: Are they helping children? Cambridge, MA: Children's Defense Fund.


Elliot, S.N., Witt, J.C., Galvin, G., & Peterson, R. (1984). Acceptability of positive and reductive interventions: Factors that influence teachers' decisions. Journal of School Psychology, 22, 353-360.

Ellis, A.K. (2001). Research on educational innovations (3rd ed.). Larchmont, NY: Eye on Education.

Feltz, D.L., Chase, M.A., Moritz, S.E., & Sullivan, P.J. (1999). A conceptual model of coaching efficacy: Preliminary investigation and instrument development. Journal of Educational Psychology, 91, 765-776.

Ferguson, R.R. (1991). Paying for public education: New evidence of how and why money matters. Harvard Journal of Legislation, 28, 465-498.

Fideler, E.F., Foster, E.D., & Schwartz, S. (2000). The urban teacher challenge: Teacher demand and supply in the great city schools. Retrieved February 6, 2005, from http://www.cgcs.org/pdfs/utc.pdf

Finn, J.D., & Achilles, C.M. (1999). Tennessee's class size study: Findings, implications, misconceptions. Educational Evaluation and Policy Analysis, 21, 97-109.

Florida Department of Education. (2003). School indicators report. Retrieved April 30, 2005, from http://data.fldoe.org/fsir/

Florida Department of Education. (2004, October). Education Information and Accountability Services: Enrollment size of Florida's public schools (Series 2005-07F). Retrieved February 28, 2005, from http://www.firn.edu/doe/eias/eiaspubs/pdf/enroll.pdf


Florida's Positive Behavior Support Project (FLPBS). (2003-2004). Team training on school-wide positive behavior support. Unpublished training manuscript, University of South Florida.

Florida's Positive Behavior Support Project (FLPBS). (2004, Fall). Director's note: The state of PBS. Positive Outlook, 6, 1-2.

Florida's Positive Behavior Support Project (FLPBS). (2005). School-wide Benchmarks of Quality (BoQ). Unpublished evaluation instrument, University of South Florida.

Fraenkel, J.R., & Wallen, N.E. (2003). How to design and evaluate research in education. New York: McGraw-Hill Higher Education.

Fuchs, D., & Fuchs, L.S. (1989). Exploring effective and efficient prereferral interventions: A component analysis of behavioral consultation. School Psychology Review, 18, 260-283.

Fullan, M. (1997). The complexity of the change process. In M. Fullan (Ed.), The challenge of school change (pp. 33-56). Arlington Heights, IL: Skylight Professional Development.

Galloway, D. (1976). Size of school, socio-economic hardship, suspension rates and persistent unjustified absence from school. British Journal of Educational Psychology, 46, 40-47.

George, H.P., Harrower, J.K., & Knoster, T. (2003). School-wide prevention and early intervention: A process for establishing a system of school-wide behavior support. Preventing School Failure, 47, 170-176.


Glass, G.V., & Hopkins, K.D. (1996). Statistical methods in education and psychology. Needham Heights, MA: Allyn & Bacon.

Gorsuch, R.L. (1983). Factor analysis. Hillsdale, NJ: Lawrence Erlbaum.

Gottfredson, D.C., Gottfredson, G.D., & Hybl, L.G. (1993). Managing adolescent behavior: A multiyear, multischool study. American Educational Research Journal, 30, 179-215.

Gottfredson, D.C., Gottfredson, G.D., & Skroban, S. (1998). Can prevention work where it is needed most? Evaluation Review, 22, 315-340.

Gresham, F.M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18, 37-50.

Gresham, F.M., & Kendell, G.K. (1987). School consultation research: Methodological critique and future research directions. School Psychology Review, 16, 306-316.

Grimes, J., & Tilly III, W.D. (1996). Policy and process: Means to lasting educational change. School Psychology Review, 25, 465-476.

Gunter, P.L., Denny, R.K., Jack, S.L., Shores, R.E., & Nelson, C.M. (1993). Aversive stimuli in academic interactions between students with serious emotional disturbance and their teachers. Behavioral Disorders, 18, 265-274.

Hall, G.E., & Hord, S.M. (2001). Implementing change: Patterns, principles, and potholes. Needham Heights, MA: Allyn & Bacon.


Hargreaves, A. (1997). Rethinking educational change. In M. Fullan (Ed.), The challenge of school change (pp. 1-32). Arlington Heights, IL: Skylight Professional Development.

Harvey, D., & Brown, D.R. (2001). An experiential approach to organization development (6th ed.). Upper Saddle River, NJ: Prentice-Hall.

Holt, J.C. (1964). How children fail. New York: Dell Publishing Co.

Horner, R.H., & Sugai, G. (2006, March). Taking school-wide PBS to scale: District and state-level implementation. Paper presented at the Annual Association for Positive Behavior Support Conference, Reno, NV.

Horner, R.H., Todd, A.W., Lewis-Palmer, T., Irvin, L.K., Sugai, G., & Boland, J.B. (2004). The School-wide Evaluation Tool (SET): A research instrument for assessing school-wide positive behavior support. Journal of Positive Behavior Interventions, 6, 3-12.

Illich, I. (1971). Deschooling society. New York: Harper and Row.

Irvin, L.K., Tobin, T.J., Sprague, J.R., Sugai, G., & Vincent, C.G. (2004). Validity of office discipline referral measures as indices of school-wide behavioral status and effects of school-wide behavioral interventions. Journal of Positive Behavior Interventions, 6, 131-147.

Johnson, R., & Lindbald, A.H. (1991). Effect of mobility on academic performance of sixth grade students. Perceptual and Motor Skills, 92, 547-552.

Joyce, B., & Showers, B. (1982). The coaching of teaching. Educational Leadership, 40, 4-10.


Just Read, Florida. Retrieved March 19, 2005, from http://www.justreadflorida.com/

Kaiser, H. (1974). An index of factorial simplicity. Psychometrika, 39, 31-36.

Kazdin, A.E. (1979). Unobtrusive measures in behavioral assessment. Journal of Applied Behavior Analysis, 10, 141-150.

Kincaid, D., Knoster, T., Harrower, J.K., Shannon, P., & Bustamante, S. (2002). Measuring the impact of positive behavior support. Journal of Positive Behavior Interventions, 4, 109-117.

Knitzer, J. (1993). Children's mental health policy: Challenging the future. Journal of Emotional and Behavioral Disorders, 1, 8-16.

Knoff, H.M. (2002). Best practices in facilitating school reform, organizational change, and strategic planning. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology IV (pp. 235-252). Washington, DC: National Association of School Psychologists.

Kozol, J. (1967). Death at an early age. Boston: Houghton Mifflin.

Kozol, J. (1991). Savage inequalities: Children in America's schools. New York: Harper Collins Publishers, Inc.

Kerlinger, F.N., & Pedhazur, E.J. (1973). Multiple regression in behavioral research. New York: Holt, Rinehart, and Winston.

Lee, Y-Y., Sugai, G., & Horner, R.H. (1999). Using an instructional intervention to reduce problem and off-task behaviors. Journal of Positive Behavior Interventions, 1, 195-204.


Leitman, R., & Binns, K. (1993). The American teacher 1993: Violence in America's schools (survey conducted for Metropolitan Life Insurance). New York: Louis Harris and Associates.

Lepage, K., Kratochwill, T.R., & Elliot, S.N. (1994). Competency-based behavior consultation training: An evaluation of consultant outcomes, treatment effects, and consumer satisfaction. School Psychology Quarterly, 19, 1-28.

Lewis, T.J., & Newcomer, L.L. (2002). Examining the efficacy of school-based consultation: Recommendations for improving outcomes. Child and Family Behavior Therapy, 24, 165-181.

Lewis, T.J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive school-wide management. Focus on Exceptional Children, 31, 1-17.

Lewis-Palmer, T., Sugai, G., & Larson, S. (1999). Using data to guide decisions about program implementation and effectiveness: An overview and applied example. Effective School Practices, 17, 47-53.

Liechty, S.J. (1996). The effects of mobility on fourth grade students' achievement, attendance, and behavior. Dissertation Abstracts International Section A: Humanities & Social Sciences, 56(10-A), 3890. (UMI No. 95008-061)


Lipsey, M.W. (1992). Juvenile delinquency treatment: A meta-analytic inquiry into the variability of effects. In T.D. Cook, H. Cooper, D.S. Cordray, H. Hartmann, L.V. Hedges, R.V. Light, T.A. Louis, & F. Mosteller (Eds.), Meta-analysis for explanation. Beverly Hills, CA: Sage.

Lohrmann-O'Rourke, S., Knoster, T., Sabatine, K., Smith, D., Horvath, B., & Llewellyn, G. (2000). School-wide application of PBS in the Bangor Area School District. Journal of Positive Behavior Interventions, 2, 238-240.

Martin, A.J. (2004). An ecological investigation of social systems and student mobility: Policy implications for school practices. Dissertation Abstracts International Section A: Humanities and Social Sciences, 64(10-A), 3867. (UMI No. 99007-075)

Mayer, M.J., & Leone, P.E. (1999). A structural analysis of school violence and disruption: Implications for creating safer schools. Education and Treatment of Children, 22, 333-356.

McCarthy, J.D., & Hoge, D.R. (1987). The social construction of school punishment: Racial disadvantage out of universalistic process. Social Forces, 6, 1101-1120.

McCurdy, B.L., Mannella, M.C., & Eldridge, N. (2003). Positive behavior support in urban schools: Can we prevent the escalation of antisocial behavior? Journal of Positive Behavior Interventions, 5, 158-179.

McFadden, A.C., Marsh II, G.E., Price, B.J., & Hwang, Y. (1992). A study of race and gender bias in the punishment of school children. Education and Treatment of Children, 15, 140-146.


Metzler, C.W., Biglan, A., & Rusby, J.C. (2001). Evaluation of a comprehensive behavior management program to improve school-wide positive behavior support. Education and Treatment of Children, 24, 448-479.

Molnar, A., Smith, P., Zahorik, J., Palmer, A., Halbach, A., & Ehrle, K. (1999). Evaluating the SAGE program: A pilot program in targeted pupil-teacher reduction in Wisconsin. Educational Evaluation and Policy Analysis, 21, 165-177.

Moncher, F.J., & Prinz, R.J. (1991). Treatment fidelity in outcome studies. Clinical Psychology Review, 11, 247-266.

Morgan-D'Atrio, C., Northup, J., LaFleur, L., & Spera, S. (1996). Toward prescriptive alternatives to suspensions. Behavioral Disorders, 21, 190-200.

National Center for Education Statistics (NCES). (2001). Number of newly hired public school teachers needed for 11 years from 1998-99 to 2008-9, by continuation rate used and teacher total assumption. Washington, DC: U.S. Department of Education.

National Center for Education Statistics (NCES). (2003). Highest degree earned, number of years teaching experience, and average class size for teachers in public elementary and secondary schools, by state: 1999-2000. Retrieved March 18, 2005, from https://nces.ed.gov/programs/digest/d03/tables/dt068.asp

National Research Council (NRC). (2002). Minority students in special and gifted education. Washington, DC: National Academy Press.


North Central Regional Educational Laboratory (NCREL). Resources and information for state education agencies with Reading First grants. Retrieved March 19, 2005, from http://www.ncrel.org/rf/

Nelson, J.R. (2000). Educating students with emotional and behavioral disabilities in the 21st century: Looking through windows, opening doors. Education and Treatment of Children, 23, 204-220.

Nelson, J.R., Benner, G.J., Reid, R.C., Epstein, M.H., & Currin, D. (2002). The convergent validity of office discipline referrals with the CBCL-TRF. Journal of Emotional and Behavioral Disorders, 10, 181-188.

Nelson, J.R., Colvin, G., & Smith, D.J. (1996). The effects of setting clear standards on students' social behavior in common areas of the school. The Journal of At-Risk Issues, Summer/Fall, 10-18.

Nelson, J.R., Martella, R., & Galand, B. (1998). The effects of teaching school expectations and establishing a consistent consequence on formal office disciplinary actions. Journal of Emotional and Behavioral Disorders, 6, 153-161.

Nersesian, M., Todd, A.W., Lehmann, J., & Watson, J. (2000). School-wide behavior support through district-level system change. Journal of Positive Behavior Interventions, 4, 244-247.

No Child Left Behind Act of 2001, H.R. 1, 107th Congress (2001).


Noell, G.H., & Witt, J.C. (1999). When does consultation lead to intervention implementation? Critical issues for research and practice. Journal of Special Education, 33, 29-35.

Noell, G.H., Witt, J.C., Gilbertson, D.N., Ranier, D.D., & Freeland, J.T. (1997). Increasing teacher intervention implementation in general education settings through consultation and performance feedback. School Psychology Quarterly, 12, 77-88.

Neubert, G.A. (1988). Improving teaching through learning. Bloomington, IN: Phi Delta Kappa Educational Foundation.

Nunnally, J. (1978). Psychometric theory. New York: McGraw-Hill.

Nye, B., Hedges, L.V., & Konstantopoulos, S. (1999). The long-term effects of small classes: A five year follow up of the Tennessee class size experiment. Educational Evaluation and Policy Analysis, 21, 127-142.

Office of Management and Budget (OMB). (1995). Standards for the classification of federal data on race and ethnicity. Retrieved from http://www.whitehouse.gov/omb/fedreg/race-ethnicity.html

Office of Special Education Programs (OSEP). (2004). School-wide positive behavior support: Implementers' blueprint and self-assessment. University of Oregon: Center on Positive Behavioral Interventions and Supports.

Park, C., & Dudycha, A. (1974). A cross validation approach to sample size determination for regression models. Journal of the American Statistical Association, 69, 214-218.


Ponti, C.R., Zins, J.E., & Graden, J.L. (1988). Implementing a consultation-based service delivery system to decrease referrals for special education: A case study of organizational considerations. School Psychology Review, 17, 89-100.

Proctor, M.A., & Morgon, D. (1991). Effectiveness of a response cost procedure on the disruptive classroom behavior of adolescents with behavior problems. School Psychology Review, 20, 97-109.

Putnam, R.F., Luiselli, J.K., Handler, M.W., & Jefferson, G.L. (2003). Evaluating student discipline practices in a public school through behavioral assessment of office referrals. Behavior Modification, 27, 505-523.

Raffaele Mendez, L.M., Knoff, H.M., & Ferron, J.M. (2002). School demographic variables and out-of-school suspension rates: A quantitative and qualitative analysis of a large, ethnically diverse school district. Psychology in the Schools, 39, 259-277.

Reimers, T.M., Wacker, D.P., & Koeppl, G. (1987). Acceptability of behavioral interventions: A review of the literature. School Psychology Review, 16, 212-227.

Robbins, J.R., & Gutkin, T.B. (1994). Consultee and client remedial and preventative outcomes following consultation: Some mixed empirical results and directions for future research. Journal of Educational and Psychological Consultation, 3, 149-167.

Rogers, E.M. (2003). Diffusion of innovations (5th ed.). New York: Free Press.


Rose, T. (1988). Current disciplinary practices with handicapped students: Suspensions and expulsions. Exceptional Children, 55, 230-239.

Rose, L.C., & Gallup, A.M. (2004). The 36th annual Phi Delta Kappa/Gallup poll of the public's attitudes toward the public schools. Phi Delta Kappan, 86. http://www.pdkintl.org/kappan/k0409pol.htm

Rosenberg, M.S., & Jackman, L.A. (2003). Development, implementation, and sustainability of comprehensive school-wide behavior management systems. Intervention in School and Clinic, 39, 10-21.

Rosenholtz, S. (1989). Teachers' workplace: The social organizations of schools. New York: Longman.

Sadler, C. (2000). Effective behavior support implementation at the district level: Tigard-Tualatin school district. Journal of Positive Behavior Interventions, 4, 241-243.

Sashkin, M., & Egermeier, J. (1993). School change models and processes: A review and synthesis of research and practice. Washington, DC: U.S. Department of Education.

Schmuck, R.A., & Runkel, P.J. (1994). The handbook of organizational development in schools and colleges (4th ed.). Prospect Heights, IL: Waveland Press.

Scott, T.M. (2001). A schoolwide example of positive behavior support. Journal of Positive Behavior Interventions, 3, 88-94.


Senge, P., Kleiner, A., Roberts, C., Ross, R.B., & Smith, B.J. (1994). The fifth discipline fieldbook: Strategies and tools for building a learning organization. New York: Doubleday.

Sheridan, S.M., Eagle, J.W., Cowan, R.J., & Mickelson, W. (2001). The effects of conjoint behavioral consultation: Results of a 4-year investigation. Journal of School Psychology, 39, 361-385.

Sheridan, S.M., Welch, M., & Orme, S.F. (1996). Is consultation effective? A review of outcome research. Remedial and Special Education, 17, 341-354.

Showers, B. (1990). Aiming for superior classroom instruction for all children: A comprehensive staff development model. Remedial and Special Education, 11, 35-39.

Skiba, R.J., Peterson, R.L., & Williams, T. (1997). Office referrals and suspension: Disciplinary intervention in middle schools. Education and Treatment of Children, 20, 295-315.

Smylie, M.A. (1992). Teacher participation in school decision making: Assessing willingness to participate. Educational Evaluation and Policy Analysis, 14, 53-67.

Stevens, J. (1986). Applied multivariate statistics for the social sciences. Hillsdale, NJ: Lawrence Erlbaum Associates.

Stevens, J. (1996). Applied multivariate statistics for the social sciences (3rd ed.). Mahwah, NJ: Lawrence Erlbaum Associates.


Stevens, J. (2002). Applied multivariate statistics for the social sciences (4th ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.

Stevens, L.J., & Price, M. (1992). Meeting the challenge of educating children at risk. Phi Delta Kappan, 74, 18-23.

Stokes, T.F., & Baer, D.M. (1977). An implicit technology of generalization. Journal of Applied Behavior Analysis, 10, 349-367.

Sugai, G., Horner, R.H., Dunlap, G., Hieneman, M., Lewis, T.J., Nelson, C.M., Scott, T., Liaupsin, C., Sailor, W., Turnbull, A.P., Turnbull, H.R., Wickham, D., Ruef, M., & Wilcox, B. (1999). Applying positive behavioral support and functional behavioral assessment in schools (Technical Assistance Guide 1, Version 1.4.4). University of Oregon: Center on Positive Behavioral Interventions and Supports.

Sugai, G., Sprague, J.R., Horner, R.H., & Walker, H.M. (2000). Preventing school violence: The use of office discipline referrals to assess and monitor school-wide discipline interventions. Journal of Emotional and Behavioral Disorders, 8, 94-107.

Tabachnick, B.G., & Fidell, L.S. (2001). Using multivariate statistics (4th ed.). New York: Harper Collins.

Taylor-Greene, S.J., Brown, D., Nelson, L., Longton, J., Gassman, T., Cohen, J., Swartz, J., Horner, R.H., Sugai, G., & Hall, S. (1997). School-wide behavioral support: Starting the year off right. Journal of Behavioral Education, 7, 99-112.


Taylor-Greene, S.J., & Kartub, D.T. (2000). Durable implementation of school-wide behavior support: The High Five program. Journal of Positive Behavior Interventions, 2, 233-235.

The Education Trust. (2004). The ABC's of "AYP": Raising achievement for all students. Retrieved March 18, 2005, from http://www2.edtrust.org/NR/rdonlyres/9C974109-4A70-4F5E-A07F-6DC90D656F0F/0/ABCAYP.pdf

Tobin, T., Sugai, G., & Colvin, G. (1996). Patterns in middle school discipline records. Journal of Emotional and Behavioral Disorders, 4, 82-94.

Townsend, B. (2000). The disproportionate discipline of African-American learners: Reducing school suspensions and expulsions. Exceptional Children, 66, 381-391.

Townsend, B., Thomas, D., Witty, J., & Lee, R. (1996). Diversity and school restructuring: Creating partnerships in a world of difference. Teacher Education and Special Education, 19, 102-118.

Turnbull III, H.R., Wilcox, B.L., Stowe, M., & Turnbull, A.P. (2001). IDEA requirements for use of PBS: Guidelines for responsible agencies. Journal of Positive Behavior Interventions, 3, 11-18.

United States Department of Education. (2001). The longitudinal evaluation of school change and performance in Title I schools: Final report. Washington, DC: U.S. Government Printing Office.


United States Department of Education. (2002). The condition of education 2002 (NCES 2002-025). Washington, DC: U.S. Government Printing Office.

United States Department of Education. (2003). The condition of education 2003 (NCES 2003-067). Washington, DC: U.S. Government Printing Office.

United States Department of Education. (2004a). The condition of education 2004 (NCES 2004-077). Washington, DC: U.S. Government Printing Office.

United States Department of Education. (2004b). Education statistics quarterly, 5(4). National Center for Education Statistics.

Valentine, E.P. (1991). Strategic management in education: A focus on strategic planning. Needham Heights, MA: Allyn and Bacon.

Walker, H.M. (2004). Commentary: Use of evidence-based interventions in schools: Where we've been, where we are, and where we need to go. School Psychology Review, 33, 398-407.

Watson, T.S., Sterling, H.E., & McDade, A. (1997). Demythifying behavioral consultation. School Psychology Review, 26, 467-474.

Wickstrom, K.F. (1996). A study of the relationship among teacher, process, and outcome variables within school-based consultation. Dissertation Abstracts International Section A: Humanities & Social Sciences, 56(11-A), 4331. (UMI No. 95010-209)

Wickstrom, K.F., Jones, K.M., LaFleur, L.H., & Witt, J.C. (1998). An analysis of treatment integrity in school-based behavioral consultation. School Psychology Quarterly, 13, 141-154.

Witt, J.C., & Elliot, S.N. (1982). The response cost lottery: A time-efficient and effective classroom intervention. Journal of School Psychology, 20, 155-161.

Witt, J.C., Martens, B.K., & Elliot, S.N. (1984). Factors affecting teachers' judgments of the acceptability of behavioral interventions: Time involvement, behavior severity, and type of intervention. Behavior Therapy, 15, 204-209.

Witt, J.C., Noell, G.H., LaFleur, L.H., & Mortenson, B.P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30, 693-696.

Wright, J.A., & Dusek, J.B. (1998). Compiling school base rates for disruptive behaviors from student disciplinary referral data. School Psychology Review, 27, 138-147.

Wu, S.-C., Pink, W., Crain, R., & Moles, O. (1982). Student suspension: A critical reappraisal. The Urban Review, 14, 245-303.

Appendices

Appendix A: Benchmarks of Quality Scoring Form

School-wide Benchmarks of Quality: SCORING FORM

School Name: ___________________________________________ District: __________________________
Coach's Name: __________________________________________ Date: __________________________

STEP 1: Coach uses the Scoring Guide to determine the appropriate point value. Circle ONE response.
STEP 2: Indicate your team's most frequent response. Write the response in column 2 (in place = ++, needs improvement = +, or not in place = -). If there is a tie, report the higher score.
STEP 3: Place a check next to any item where there is a discrepancy between your rating and the team's rating. Document the discrepancies on page 3.

Critical Elements (STEP 1 point options in parentheses; STEP 2: ++, +, or -; STEP 3: check if discrepant)

PBS Team
1. Team has broad representation (1 0)
2. Team has administrative support (3 2 1 0)
3. Team has regular meetings (at least monthly) (2 1 0)
4. Team has established a clear mission/purpose (1 0)

Faculty Commitment
5. Faculty are aware of behavior problems across campus (regular data sharing) (2 1 0)
6. Faculty involved in establishing and reviewing goals (2 1 0)
7. Faculty feedback obtained throughout year (2 1 0)

Effective Procedures for Dealing with Discipline
8. Discipline process described in narrative format or depicted in graphic format (2 1 0)
9. Process includes documentation procedures (1 0)
10. Discipline referral form includes information useful in decision making (2 1 0)
11. Behaviors defined (3 2 1 0)
12. Major/minor behaviors are clearly identified/understood (2 1 0)
13. Suggested array of appropriate responses to minor (non office-managed) problem behaviors (1 0)
14. Suggested array of appropriate responses to major (office-managed) problem behaviors (1 0)

Data Entry & Analysis Plan Established
15. Data system to collect and analyze ODR data (3 2 1 0)
16. Additional data collected (attendance, grades, faculty attendance, surveys) (1 0)
17. Data entered weekly (minimum) (1 0)
18. Data analyzed monthly (minimum) (2 1 0)
19. Data shared with team and faculty monthly (minimum) (2 1 0)

Expectations & Rules Developed
20. 3-5 positively stated school-wide expectations posted around school (3 2 1 0)
21. Expectations apply to both students and staff (3 2 1 0)
22. Rules developed and posted for specific settings (where problems are prevalent) (2 1 0)
23. Rules are linked to expectations (1 0)
24. Staff feedback/involvement in expectations/rule development (2 1 0)

Appendix A: (Continued)

Reward/Recognition Program Established
25. A system of rewards has elements that are implemented consistently across campus (3 2 1 0)
26. A variety of methods are used to reward students (2 1 0)
27. Rewards are linked to expectations (3 2 1 0)
28. Rewards are varied to maintain student interest (2 1 0)
29. System includes opportunities for naturally occurring reinforcement (1 0)
30. Ratios of reinforcement to corrections are high (3 2 1 0)
31. Students are involved in identifying/developing incentives (1 0)
32. The system includes incentives for staff/faculty (2 1 0)

Lesson Plans for Teaching Expectations/Rules
33. A behavioral curriculum includes concept and skill level instruction (2 1 0)
34. Lessons include examples and non-examples (1 0)
35. Lessons use a variety of teaching strategies (2 1 0)
36. Lessons are embedded into subject area curriculum (2 1 0)
37. Faculty/staff and students are involved in development & delivery of lesson plans (1 0)
38. Strategies to reinforce the lessons with families/community are developed and implemented (1 0)

Implementation Plan
39. Develop, schedule and deliver plans to teach staff the discipline and data system (2 1 0)
40. Develop, schedule and deliver plans to teach staff the lesson plans for teaching students (2 1 0)
41. Develop, schedule and deliver plans for teaching students expectations/rules/rewards (3 2 1 0)
42. Booster sessions for students and staff are planned, scheduled, and delivered (2 1 0)
43. Schedule for rewards/incentives for the year is planned (1 0)
44. Plans for orienting incoming staff and students are developed and implemented (2 1 0)
45. Plans for involving families/community are developed & implemented (1 0)

Crisis Plan
46. Faculty/staff are taught how to respond to crisis situations (1 0)
47. Responding to crisis situations is rehearsed (1 0)
48. Procedures for crisis situations are readily accessible (1 0)

Evaluation
49. Students and staff are surveyed about PBS (2 1 0)
50. Students and staff can identify expectations and rules (2 1 0)
51. Staff use discipline system/documentation appropriately (3 2 1 0)
52. Staff use reward system appropriately (3 2 1 0)
53. Outcomes (behavior problems, attendance, morale) are documented and used to evaluate PBS plan (3 2 1 0)

TOTAL: ________

Appendix A: (Continued)

Benchmarks of Quality TEAM SUMMARY

School: _______________________ Date: _________ Total Benchmarks Score: _______

Areas of Discrepancy
Item # | Team Response | Coach's Score | Scoring Guide Description

If a team discussion of an area of discrepancy reveals information that was previously unknown to the coach and would justify a different score on any item (based upon the Scoring Guide), adjust the benchmark item(s) and total scores.

Areas of Strength
Critical Element | Description of Areas of Strength

Areas in Need of Development
Critical Element | Description of Areas in Need of Development

Appendix B: Benchmarks of Quality Scoring Guide

Completing the Benchmarks of Quality for School-wide Positive Behavior Support (SWPBS)

When & Why
The Benchmarks of Quality for School-wide Positive Behavior Support should be completed in the spring of each school year (Mar/Apr/May). The Benchmarks are used by teams to identify areas of success and areas for improvement, and by the PBS Project to identify model PBS schools.

Procedures for Completing

Step 1 - Coach's Scoring
The coach will use his or her best judgment, based on personal experience with the school and on the descriptions and exemplars in the Benchmarks of Quality Scoring Guide, to score each of the 53 items on the Benchmarks of Quality Scoring Form (pp. 1-2). Do not leave any items blank.

Step 2 - Team Member Rating
The coach will give the Benchmarks of Quality Team Member Rating Form to each SWPBS team member to be completed independently and returned to the coach upon completion. Members should be instructed to rate each of the 53 items according to whether the component is "In Place," "Needs Improvement," or "Not in Place." Some of the items relate to product and process development, others to action items; in order to be rated as "In Place," the item must be developed and implemented (where applicable). Coaches will collect and tally responses and record on the Benchmarks of Quality Scoring Form the team's most frequent response, using ++ for "In Place," + for "Needs Improvement," and - for "Not in Place."

Step 3 - Team Report
The coach will then complete the Team Summary on p. 3 of the Benchmarks of Quality Scoring Form, recording areas of discrepancy, strength, and weakness.

Discrepancies
If there were any items for which the team's most frequent rating varied from the coach's rating based upon the Scoring Guide, the descriptions and exemplars from the guide should be shared with the team. If, upon sharing areas of discrepancy, the coach realizes that there is new information that according to the Scoring Guide would result in a different score, the item and the adjusted final score should be recorded on the Scoring Form.

Step 4 - Reporting Back to Team
After adjusting for discrepancies and completing the remainder of the Benchmarks of Quality Scoring Form, the coach will report back to the team. Using the Team Report page of the Benchmarks of Quality Scoring Form, the coach will lead the team through a discussion of the identified areas of strength (high ratings) and weakness (low ratings). This information should be conveyed as constructive feedback to assist with action planning.

Step 5 - Reporting to District Coordinator
The coach will forward copies of the Benchmarks of Quality Scoring Form and all of the Team Member Rating Forms to the district coordinator. Based upon the results of the Benchmarks, a PBS faculty member may contact the coach to determine if the school is interested in being considered for "model school" status. Potential "model schools" must agree to participate in on-site follow-up assessments.
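The tally described in Step 2 is mechanical enough to illustrate in code. The following is a minimal Python sketch, not part of the original instrument; the function name and data layout are hypothetical. It returns the team's most frequent rating for one item, breaking frequency ties toward the higher rating, as the Scoring Form directions require ("If there is a tie, report the higher score").

    from collections import Counter

    # Ratings ordered from lowest to highest:
    # "-" (Not in Place), "+" (Needs Improvement), "++" (In Place).
    RATING_ORDER = {"-": 0, "+": 1, "++": 2}

    def most_frequent_rating(responses):
        """Return the team's modal rating for one benchmark item,
        resolving frequency ties in favor of the higher rating."""
        counts = Counter(responses)
        return max(counts, key=lambda r: (counts[r], RATING_ORDER[r]))

    # Four team members rate an item; "+" and "++" tie at two votes
    # each, so the recorded response is the higher one, "++".
    print(most_frequent_rating(["++", "+", "++", "+"]))  # -> ++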

BENCHMARKS OF QUALITY SCORING GUIDE

1. Team has broad representation
1 point: Includes all of the following: SAC team member, administrator (i.e., principal, assistant principal, or dean), regular education teacher, special education teacher, member with behavior expertise, and a coach/district-level representative.
0 points: Some groups are not represented on the team.

2. Team has administrative support
3 points: Administrator(s) attended training, play an active role in the PBS process, actively communicate their commitment, attend all team meetings, and support the decisions of the PBS team.
2 points: Administrator(s) support the process but do not attend all meetings or take as active a role as the rest of the team.
1 point: Administrator(s) support the process but attend only a few meetings or do not take as active a role as the rest of the team.
0 points: Administrator(s) do not actively support the PBS process.

3. Team has regular meetings (at least monthly)
2 points: Team meets monthly (minimum of 9 one-hour meetings each school year).
1 point: Team meetings are not consistent (5-8 monthly meetings each school year).
0 points: Team seldom meets (fewer than five monthly meetings during the school year).

4. Team has established a clear mission/purpose
1 point: Team has a written purpose/mission statement for the PBS team (commonly completed on the cover sheet of the action plan).
0 points: No mission statement/purpose is written for the team.

5. Faculty are aware of behavior problems across campus (regular data sharing)
2 points: Data regarding school-wide behavior is shared with faculty monthly (minimum of 8 times per year).
1 point: Data regarding school-wide behavior is occasionally shared with faculty (3-7 times per year).
0 points: Data is not regularly shared with faculty. Faculty may be given an update 0-2 times per year.

6. Faculty involved in establishing and reviewing goals
2 points: Most faculty participate in establishing PBS goals (i.e., surveys, "dream," "PATH") on at least an annual basis.
1 point: Some of the faculty participate in establishing PBS goals (i.e., surveys, "dream," "PATH") on at least an annual basis.
0 points: Faculty do not participate in establishing PBS goals.

7. Faculty feedback obtained throughout year
2 points: Faculty are given opportunities to provide feedback, to offer suggestions, and to make choices in every step of the PBS process (via staff surveys, voting process, suggestion box, etc.). Nothing is implemented without the approval of a majority of the faculty.
1 point: Faculty are given some opportunities to provide feedback, to offer suggestions, and to make some choices during the PBS process. However, the team also makes decisions without input from staff.
0 points: Faculty are rarely given the opportunity to participate in the PBS process (fewer than 2 times per school year).

8. Discipline process described in narrative format or depicted in graphic format
2 points: Team has established clear, written procedures that lay out the process for handling both major and minor discipline incidents (includes crisis situations).
1 point: Team has established clear, written procedures that lay out the process for handling both major and minor discipline incidents (does not include crisis situations).
0 points: Team has not established clear, written procedures for discipline incidents, and/or there is no differentiation between major and minor incidents.

9. Process includes documentation procedures
1 point: There is a documentation procedure to track both major and minor behavior incidents (i.e., form, database entry, file in room, etc.).
0 points: There is not a documentation procedure to track both major and minor behavior incidents (i.e., form, database entry, file in room, etc.).

10. Discipline referral form includes information useful in decision making
2 points: Information on the referral form includes ALL of the required fields: student's name, date, time of incident, grade level, referring staff, location of incident, gender, problem behavior, possible motivation, others involved, and administrative decision.
1 point: The referral form includes all of the required fields but also includes unnecessary information that is not used to make decisions and may cause confusion.
0 points: The referral form lacks one or more of the required fields or does not exist.

11. Behaviors defined
3 points: Written documentation exists that includes clear definitions of all behaviors listed.
2 points: All of the behaviors are defined, but some of the definitions are unclear.
1 point: Not all behaviors are defined, or some definitions are unclear.
0 points: No written documentation of definitions exists.

12. Major/minor behaviors are clearly identified/understood
2 points: Most staff are clear about which behaviors are staff-managed and which are sent to the office (i.e., appropriate use of office referrals). Those behaviors are clearly defined, differentiated, and documented.
1 point: Some staff are unclear about which behaviors are staff-managed and which are sent to the office (i.e., appropriate use of office referrals), or no documentation exists.
0 points: Specific major/minor behaviors are not clearly defined, differentiated, or documented.

13. Suggested array of appropriate responses to minor (non office-managed) problem behaviors
1 point: There is evidence that most staff are aware of and use an array of appropriate responses to minor behavior problems.
0 points: There is evidence that few staff are aware of or use an array of appropriate responses to minor behavior problems.

14. Suggested array of appropriate responses to major (office-managed) problem behaviors
1 point: There is evidence that all administrative staff are aware of and use an array of predetermined appropriate responses to major behavior problems.
0 points: There is evidence that some administrative staff are not aware of, or do not follow, an array of predetermined appropriate responses to major behavior problems.

15. Data system to collect and analyze ODR data
3 points: The database can quickly output data in graph format and allows the team access to ALL of the following information: average referrals per day per month, by location, by problem behavior, by time of day, by student, and comparisons between years.
2 points: ALL of the information can be obtained from the database (average referrals per day per month, by location, by problem behavior, by time of day, by student, and comparisons between years), though it may not be in graph format, may require more staff time to pull the information, or may require staff time to make sense of the data.
1 point: Only partial information can be obtained (lacking either the number of referrals per day per month, location, problem behavior, time of day, student, or comparisons of patterns between years).
0 points: The data system is not able to provide any of the necessary information the team needs to make school-wide decisions.

16. Additional data collected (attendance, grades, faculty attendance, surveys)
1 point: The team collects and considers data other than discipline data to help determine progress and successes (i.e., attendance, grades, faculty attendance, school surveys, etc.).
0 points: The team does not collect or consider data other than discipline data to help determine progress and successes (i.e., attendance, grades, faculty attendance, school surveys, etc.).

17. Data entered weekly (minimum)
1 point: Data is typically entered at least weekly.
0 points: Data is not entered at least weekly.

18. Data analyzed monthly (minimum)
2 points: Data is printed, analyzed, and put into graph format or another easy-to-understand format by a member of the team monthly (minimum).
1 point: Data is printed, analyzed, and put into graph format or another easy-to-understand format by a team member less than once a month.
0 points: Data is not analyzed.

19. Data shared with team and faculty monthly (minimum)
2 points: Data is shared with the PBS team and faculty at least once a month.
1 point: Data is shared with the PBS team and faculty less than once a month.
0 points: Data is not reviewed each month by the PBS team and shared with faculty.

20. 3-5 positively stated school-wide expectations posted around school
3 points: 3-5 positively stated school-wide expectations are visibly posted around the school. Areas posted include the classroom and a minimum of 3 other school settings (i.e., cafeteria, hallway, front office, etc.).
2 points: 3-5 positively stated expectations are visibly posted in most important areas (i.e., classroom, cafeteria, hallway), but one area may be missed.
1 point: 3-5 positively stated expectations are not clearly visible in common areas.
0 points: Expectations are not posted, or the team has either too few or too many expectations.

21. Expectations apply to both students and staff
3 points: PBS team has communicated that expectations apply to all students and all staff.
2 points: PBS team has expectations that apply to all students AND all staff but hasn't specifically communicated that they apply to staff as well as students.
1 point: Expectations refer only to student behavior.
0 points: There are no expectations.

22. Rules developed and posted for specific settings (where problems are prevalent)
2 points: Rules are posted in all of the most problematic areas in the school.
1 point: Rules are posted in some, but not all, of the most problematic areas of the school.
0 points: Rules are not posted in any of the most problematic areas of the school.

23. Rules are linked to expectations
1 point: When taught or enforced, staff consistently link the rules with the school-wide expectations.
0 points: When taught or enforced, staff do not consistently link the rules with the school-wide expectations, and/or rules are taught or enforced separately from expectations.

24. Staff feedback/involvement in expectations/rule development
2 points: Most staff were involved in providing feedback/input into the development of the school-wide expectations and rules (i.e., survey, feedback, initial brainstorming session, election process, etc.).
1 point: Some staff were involved in providing feedback/input into the development of the school-wide expectations and rules.
0 points: Staff were not involved in providing feedback/input into the development of the school-wide expectations and rules.

25. A system of rewards has elements that are implemented consistently across campus
3 points: The reward system guidelines and procedures are implemented consistently across campus. Almost all members of the school are participating appropriately (at least 90% participation).
2 points: The reward system guidelines and procedures are implemented consistently across campus. However, some staff choose not to participate, or participation does not follow the established criteria (at least 75% participation).
1 point: The reward system guidelines and procedures are not implemented consistently because several staff choose not to participate or participation does not follow the established criteria (at least 50% participation).
0 points: There is no identifiable reward system, or a large percentage of staff are not participating (less than 50% participation).

26. A variety of methods are used to reward students
2 points: The school uses a variety of methods to reward students (e.g., cashing in tokens/points). There should be opportunities that include tangible items, praise/recognition, and social activities/events. Students with few or many tokens/points have equal opportunities to cash them in for rewards; however, larger rewards are given to those earning more tokens/points.
1 point: The school uses a variety of methods to reward students, but students do not have access to a variety of rewards in a consistent and timely manner.
0 points: The school uses only one set method to reward students (i.e., tangibles only), or there are no opportunities for children to cash in tokens or select their reward. Only students that meet the quotas actually get rewarded; students with fewer tokens cannot cash in tokens for a smaller reward.

27. Rewards are linked to expectations
3 points: Rewards are provided for behaviors that are identified in the rules/expectations, and staff verbalize the appropriate behavior when giving rewards.
2 points: Rewards are provided for behaviors that are identified in the rules/expectations, and staff sometimes verbalize appropriate behaviors when giving rewards.
1 point: Rewards are provided for behaviors that are identified in the rules/expectations, but staff rarely verbalize appropriate behaviors when giving rewards.
0 points: Rewards are provided for behaviors that are not identified in the rules and expectations.

28. Rewards are varied to maintain student interest
2 points: The rewards are varied throughout the year and reflect students' interests (e.g., consider the students' age, culture, gender, and ability level to maintain student interest).
1 point: The rewards are varied throughout the school year but may not reflect students' interests.
0 points: The rewards are not varied throughout the school year and do not reflect students' interests.

29. System includes opportunities for naturally occurring reinforcement
1 point: Students often get natural rewards, such as praise and recognition for academic performance, that are not part of the planned reward system.
0 points: Students rarely get natural rewards, such as praise and recognition for academic performance, that are not part of the planned reward system.

30. Ratios of reinforcement to corrections are high
3 points: Ratios of teacher reinforcement of appropriate behavior to correction of inappropriate behavior are high (e.g., 4:1).
2 points: Ratios of teacher reinforcement of appropriate behavior to correction of inappropriate behavior are moderate (e.g., 2:1).
1 point: Ratios of teacher reinforcement of appropriate behavior to correction of inappropriate behavior are about the same (e.g., 1:1).
0 points: Ratios of teacher reinforcement of appropriate behavior to correction of inappropriate behavior are low (e.g., 1:4).

31. Students are involved in identifying/developing incentives
1 point: Students are often involved in identifying/developing incentives.
0 points: Students are rarely involved in identifying/developing incentives.

32. The system includes incentives for staff/faculty
2 points: The system includes incentives for staff/faculty, and they are delivered consistently.
1 point: The system includes incentives for staff/faculty, but they are not delivered consistently.
0 points: The system does not include incentives for staff/faculty.

33. A behavioral curriculum includes concept and skill level instruction
2 points: Lesson plans are developed and used to teach rules and expectations.
1 point: Lesson plans were developed and used to teach rules but not expectations, or vice versa.
0 points: Lesson plans have not been developed or used to teach rules or expectations.

34. Lessons include examples and non-examples
1 point: Lesson plans include both examples of appropriate behavior and examples of inappropriate behavior.
0 points: Lesson plans give no specific examples or non-examples, or there are no lesson plans.

35. Lessons use a variety of teaching strategies
2 points: Lesson plans are taught using at least 3 different teaching strategies (i.e., modeling, role-playing, videotaping).
1 point: Lesson plans have been introduced using fewer than 3 teaching strategies.
0 points: Lesson plans have not been taught or do not exist.

36. Lessons are embedded into subject area curriculum
2 points: Nearly all teachers embed behavior teaching into subject area curriculum on a daily basis.
1 point: About 50% of teachers embed behavior teaching into subject area curriculum, or embed behavior teaching fewer than 3 times per week.
0 points: Less than 50% of all teachers embed behavior teaching into subject area curriculum, or teachers only occasionally remember to include behavior teaching in subject areas.

37. Faculty/staff and students are involved in development & delivery of lesson plans
1 point: Faculty, staff, and students are involved in the development and delivery of lesson plans to teach behavior expectations and rules for specific settings.
0 points: Faculty, staff, and students are not involved in the development and delivery of lesson plans to teach behavior expectations and rules for specific settings.

38. Strategies to reinforce the lessons with families/community are developed and implemented
1 point: The PBS plan includes strategies to reinforce lessons with families and the community (i.e., after-school programs teach expectations, newsletters with tips for meeting expectations at home).
0 points: The PBS plan does not include strategies to be used by families and the community.

39. Develop, schedule, and deliver plans to teach staff the discipline and data system
2 points: The team scheduled time to present and train faculty and staff on the discipline procedures and data system, including checks for accuracy of information or comprehension. Training included all components: referral process (flowchart), definitions of problem behaviors, explanation of major vs. minor forms, and how the data will be used to guide the team in decision making.
1 point: The team scheduled time to present and train faculty and staff on the discipline procedures and data system, but there were no checks for accuracy of information or comprehension, OR training did not include all components (i.e., referral process (flowchart), definitions of problem behaviors, explanation of major vs. minor forms, and how the data will be used to guide the team in decision making).
0 points: Staff were either not trained or were given the information without formal introduction and explanation.

40. Develop, schedule, and deliver plans to teach staff the lesson plans for teaching students
2 points: The team scheduled time to present and train faculty and staff on lesson plans to teach students expectations and rules, including checks for accuracy of information or comprehension. Training included all components: plans to introduce the expectations and rules to all students, explanation of how and when to use formal lesson plans, and how to embed behavior teaching into the daily curriculum.
1 point: The team scheduled time to present and train faculty and staff on lesson plans to teach students expectations and rules, but there were no checks for accuracy of information or comprehension, OR training did not include all components (plans to introduce the expectations and rules to all students, explanation of how and when to use formal lesson plans, and how to embed behavior teaching into the daily curriculum).
0 points: Staff were either not trained or were given the information without formal introduction and explanation.

41. Develop, schedule, and deliver plans for teaching students expectations, rules, & rewards
3 points: Students are introduced/taught all of the following: school expectations, rules for specific settings, and the reward system guidelines.
2 points: Students are introduced/taught two (2) of the following: school expectations, rules for specific settings, and the reward system guidelines.
1 point: Students are introduced/taught only one (1) of the following: school expectations, rules for specific settings, and the reward system guidelines.
0 points: Students are not introduced/taught any of the following: school expectations, rules for specific settings, and the reward system guidelines.

42. Booster sessions for students and staff are planned, scheduled, and implemented
2 points: Booster sessions are planned and delivered to reteach staff/students at least once in the year, and additionally at times when the data suggest problems (an increase in discipline referrals per day per month or a high number of referrals in a specified area). Expectations and rules are reviewed with students regularly (at least once per week).
1 point: Booster sessions are not utilized fully. For example: booster sessions are held for students but not staff; booster sessions are held for staff but not students; or booster sessions are not held, but rules and expectations are reviewed at least weekly with students.
0 points: Booster sessions for students and staff are not scheduled/planned. Expectations and rules are reviewed with students once a month or less.

43. Schedule for rewards/incentives for the year is planned
1 point: There is a clear plan for the type and frequency of rewards/incentives to be delivered throughout the year.
0 points: There is no plan for the type and frequency of rewards/incentives to be delivered throughout the year.

44. Plans for orienting incoming staff and students are developed and implemented
2 points: Team has planned for and carries out the introduction of School-wide PBS and the training of new staff and students throughout the school year.
1 point: Team has planned for the introduction of School-wide PBS and the training of either new students or new staff, but does not include plans for training both, OR the team has plans but has not implemented them.
0 points: Team has not planned for the introduction of School-wide PBS and the training of new staff or students.

45. Plans for involving families/community are developed and implemented
1 point: Team has planned for the introduction and ongoing involvement of school-wide PBS to families/community (i.e., newsletter, brochure, PTA, open house, team member, etc.).
0 points: Team has not introduced school-wide PBS to families/community.

46. Faculty/staff are taught how to respond to crisis situations
1 point: Faculty and staff are taught how to personally respond to crisis situations and have written information (i.e., manual) on the district crisis plan.
0 points: Faculty and staff are not taught how to personally respond to crisis situations and/or have no written information (i.e., manual) on the district crisis plan.

47. Responding to crisis situations is rehearsed
1 point: Faculty and staff are given opportunities during the school year to practice responding to crisis situations.
0 points: Faculty and staff do not practice responding to crisis situations.

48. Procedures for crisis situations are readily accessible
1 point: Faculty and staff have ready access to, and know where to find, procedures for dealing with crisis situations.
0 points: Faculty and staff do not have ready access to, or do not know where to find, procedures for dealing with crisis situations.

49. Students and staff are surveyed about PBS
2 points: Students and staff are surveyed at least annually (i.e., items on a climate survey or a specially developed PBS plan survey), and the information is used to address the PBS plan.
1 point: Students and staff are surveyed at least annually (i.e., items on a climate survey or a specially developed PBS plan survey), but the information is not used to address the PBS plan.
0 points: Students and staff are not surveyed.

50. Students and staff can identify expectations and rules
2 points: Almost all students and staff (at least 90%) can identify the school-wide expectations and rules for specific settings (can be identified through surveys, random interviews, etc.).
1 point: Many students and staff (at least 50%) can identify the school-wide expectations and rules for specific settings.
0 points: Few students and staff (less than 50%) can identify the expectations and rules for specific settings, OR evaluations are not conducted.

51. Staff use discipline system/documentation appropriately
3 points: Almost all staff (at least 90%) know the procedures for responding to inappropriate behavior, use forms as intended, and fill them out correctly (can be identified by reviewing completed forms, staff surveys, etc.).
2 points: Many of the staff (at least 75%) know the procedures for responding to inappropriate behavior, use forms as intended, and fill them out correctly.
1 point: Some of the staff (at least 50%) know the procedures for responding to inappropriate behavior, use forms as intended, and fill them out correctly.
0 points: Few staff (less than 50%) know the procedures for responding to inappropriate behavior, use forms as intended, and fill them out correctly, OR evaluations are not conducted.

52. Staff use reward system appropriately
3 points: Almost all staff (at least 90%) understand the identified guidelines for the reward system and are using the reward system appropriately (can be identified by reviewing reward token distribution, surveys, etc.).
2 points: Many of the staff (at least 75%) understand the identified guidelines for the reward system and are using the reward system appropriately.
1 point: Some of the staff (at least 50%) understand the identified guidelines for the reward system and are using the reward system appropriately.
0 points: Few staff (less than 50%) understand and use the identified guidelines for the reward system, OR evaluations are not conducted at least yearly or do not assess staff knowledge and use of the reward system.

53. Outcomes (behavior problems, attendance, morale) are documented and used to evaluate PBS plan
3 points: There is a plan for collecting data to evaluate PBS outcomes; most data is collected as scheduled; and data is used to evaluate the PBS plan.
2 points: There is a plan for collecting data to evaluate PBS outcomes; some of the scheduled data has been collected; and data is used to evaluate the PBS plan.
1 point: There is a plan for collecting data to evaluate PBS outcomes; however, nothing has been collected to date.
0 points: There is no plan for collecting data to evaluate PBS outcomes.
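Because the point ceiling differs from item to item, the Total Benchmarks Score is easiest to check mechanically. The following Python sketch is illustrative and not part of the original instrument: it encodes each item's maximum from the STEP 1 options on the Scoring Form in Appendix A and sums a coach's 53 scores. As transcribed here, the 53 maxima sum to 100, so a fully implemented school would score 100.

    # Maximum point value per benchmark item, read from the STEP 1
    # options in Appendix A ("3 2 1 0" -> 3, "2 1 0" -> 2, "1 0" -> 1).
    MAX_POINTS = {
        1: 1, 2: 3, 3: 2, 4: 1, 5: 2, 6: 2, 7: 2, 8: 2, 9: 1, 10: 2,
        11: 3, 12: 2, 13: 1, 14: 1, 15: 3, 16: 1, 17: 1, 18: 2, 19: 2,
        20: 3, 21: 3, 22: 2, 23: 1, 24: 2, 25: 3, 26: 2, 27: 3, 28: 2,
        29: 1, 30: 3, 31: 1, 32: 2, 33: 2, 34: 1, 35: 2, 36: 2, 37: 1,
        38: 1, 39: 2, 40: 2, 41: 3, 42: 2, 43: 1, 44: 2, 45: 1, 46: 1,
        47: 1, 48: 1, 49: 2, 50: 2, 51: 3, 52: 3, 53: 3,
    }

    def total_benchmarks_score(item_scores):
        """Sum a coach's 53 item scores after validating each against
        the item's allowed range (0 up to its Scoring Form maximum)."""
        total = 0
        for item, score in item_scores.items():
            limit = MAX_POINTS[item]
            if not 0 <= score <= limit:
                raise ValueError(f"item {item}: score {score} not in 0-{limit}")
            total += score
        return total

    # A full-marks profile: the maxima transcribed above total 100.
    print(total_benchmarks_score(dict(MAX_POINTS)))  # -> 100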

Appendix C: Benchmarks of Quality Team Rating Form Sample

School-wide Benchmarks of Quality: TEAM MEMBER RATING FORM

Directions: For each benchmark, place a check in the box that most accurately describes your progress: In Place (++), Needs Improvement (+), or Not In Place (-).

PBS Team
1. Team has broad representation
2. Team has administrative support
3. Team has regular meetings (at least monthly)
4. Team has established a clear mission/purpose

Faculty Commitment
5. Faculty are aware of behavior problems across campus (regular data sharing)
6. Faculty involved in establishing and reviewing goals
7. Faculty feedback obtained throughout year

Effective Procedures for Dealing with Discipline
8. Discipline process described in narrative format or depicted in graphic format
9. Process includes documentation procedures
10. Discipline referral form includes information useful in decision making
11. Behaviors defined
12. Major/minor behaviors are clearly identified/understood
13. Suggested array of appropriate responses to minor (non office-managed) problem behaviors
14. Suggested array of appropriate responses to major (office-managed) problem behaviors

Data Entry & Analysis Plan Established
15. Data system to collect and analyze ODR data
16. Additional data collected (attendance, grades, faculty attendance, surveys)
17. Data entered weekly (minimum)
18. Data analyzed monthly (minimum)
19. Data shared with team and faculty monthly (minimum)

Expectations & Rules Developed
20. 3-5 positively stated school-wide expectations posted around school
21. Expectations apply to both students and staff
22. Rules developed and posted for specific settings (where problems are prevalent)
23. Rules are linked to expectations
24. Staff feedback/involvement in expectations/rule development

Appendix D: Team Process Survey

Date: ______________ Name of School: ______________________________

The following items relate to the functioning and effectiveness of the PBS team throughout the year. Please rate each item on the following scale:

Strongly Disagree = 1, Disagree = 2, Not Sure = 3, Agree = 4, Strongly Agree = 5, Not Applicable = NA

1. The team shares common goals.
2. The team has a common vision for the school.
3. All team members actively participate effectively in the process.
4. Each team member's goals for the school are recognized throughout the process and planning.
5. Team members are comfortable sharing their thoughts/concerns.
6. The team is able to resolve conflicts effectively.
7. The team facilitators are effective in guiding the team through the PBS process.
8. Family support for the team and school has increased since program implementation.
9. Team members are accomplishing goals within the identified timelines.
10. The team is able to agree on strategies identified for the school.
11. A school-based administrator is an active member of the team.
12. District-based administration is available for team support.
13. The degree of local control over settings and resources is adequate to support the process.
14. Systems issues in the school or district do not impede the team structure and functioning.
15. School policies and procedures support the PBS process.

16. The agencies that agreed to work with the team to meet the school's needs continue to be involved.
17. There has been an increase in the number of community entities that support the school.
18. My vision for a positive future for this school has improved.
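One simple convention for summarizing a completed Team Process Survey is to average the 1-5 ratings while excluding NA items from both the numerator and the denominator. The Python sketch below is illustrative only; the survey itself does not prescribe a scoring rule, and the function name is hypothetical.

    def team_process_mean(ratings):
        """Average one respondent's 1-5 Likert ratings, skipping items
        marked "NA" rather than counting them as zeros."""
        scored = [r for r in ratings if r != "NA"]
        if not scored:
            return None  # every item was rated Not Applicable
        return sum(scored) / len(scored)

    # A respondent who answers 4, 5, NA, 3 averages 4.0 over the
    # three scorable items.
    print(team_process_mean([4, 5, "NA", 3]))  # -> 4.0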

Appendix E: Coach Self-Assessment Survey

School-wide PBS Coach's Self-Assessment

Coach Name: __________________ District: ________________ Date: _____________

Rate each skill: 1 = learning, 2 = building but not fluent, 3 = fluent/mastered

DATA
1. Understand and use the school behavior data system
2. Supporting the team in use of data to make decisions

TEAM
1. Facilitating team meetings
2. Assisting teams in problem solving

IMPLEMENTATION
1. Understand the critical elements of School-wide PBS
2. Understand the basic principles of behavior
3. Know or have resources to identify effective strategies for reducing problem behaviors
4. Know or have resources to identify models and examples of effective school-wide strategies

OTHER ISSUES

Appendix F: Enablers and Barriers to SWPBS Implementation

Knowledge
Enablers:
- Information about initiative (Harvey & Brown, 2001)
- Philosophical acceptance (Grimes & Tilly, 1996; Sparks, 1988)
- Perceived and actual effectiveness (Gresham, 1989)
Barriers:
- Threat to job security, power, or social network (Harvey & Brown, 2001; Grimes & Tilly, 1996)

Resources
Enablers:
- Resources (Harvey & Brown, 2001; Grimes & Tilly, 1996; Gresham, 1989)
- Handbook with all the procedures (Gottfredson, Gottfredson, & Hybl, 1993)
Barriers:
- Time required to implement (Witt, Martens, & Elliot, 1984; Reimer, Wacker, & Koeppl, 1987; Gresham, 1989)

Appendix F: (Continued)

Support
Enablers:
- Support (e.g., administrative) (Harvey & Brown, 2001; Grimes & Tilly, 1996; Ponti, Zins, & Graden, 1988)
- Grant funding (Sadler, 2000)
- Full-time EBS coordinator (Nakasato, 2000; Nersesian, Todd, Lehmann, & Watson, 2000)
- Teamwork (Taylor-Greene & Kartub, 2001; Lohrmann-O'Rourke, Knoster, Sabatine, Smith, Horvath, & Llewellyn, 2000)
- Having a coach (Lewis, Sugai, & Colvin, 1998; Hall & Hord, 2001)
Barriers:
- Weak leadership at district and school level (Gottfredson, Gottfredson, & Skroban, 1998)
- Change of administration (Gottfredson, Gottfredson, & Skroban, 1998)

Input
Enablers:
- Constant staff input (Ikeda, Tilly, Stumme, Volmer, & Allison, 1996)
- Including parents and community (Lohrmann-O'Rourke, Knoster, Sabatine, Smith, Horvath, & Llewellyn, 2000)
Barriers:
- Lack of buy-in (Lewis-Palmer, Flannery, Sugai, & Eber, 2002)
- Failure to involve key personnel (Gottfredson, Gottfredson, & Skroban, 1998)

Appendix F: (Continued)

Input (cont.)
Enablers:
- Including students (Lohrmann-O'Rourke, Knoster, Sabatine, Smith, Horvath, & Llewellyn, 2000)
- Collaboration with other agencies (Hall & Hord, 2001; Chapman & Hofweber, 2000; Nersesian, Todd, Lehmann, & Watson, 2000; Ikeda, Tilly, Stumme, Volmer, & Allison, 1996)

Training
Enablers:
- Assessment of training needs (Knoff, 2002)
- On-going professional development (Chapman & Hofweber, 2000)
- Training in problem solving (Curtis & Stollar, 2002)
- Strategic planning for 5-7 years (Grimes & Tilly, 2001; Chapman & Hofweber, 2000)
Barriers:
- Lack of personnel training in consultation (Ponti, Zins, & Graden, 1988)
- Staff consistency (Lewis-Palmer, Flannery, Sugai, & Eber, 2002)

Integration
Enablers:
- Including PBS in School Improvement Plan (Taylor-Greene & Kartub, 2000)

Appendix F: (Continued)

Integration (cont.)
Enablers:
- Integration with other initiatives (Nersesian, Todd, Lehmann, & Watson, 2000; Ponti, Zins, & Graden, 1988)

Evaluation
Enablers:
- Inclusion of evaluation (Chapman & Hofweber, 2000)
Barriers:
- Getting staff to collect and analyze data (Lewis-Palmer, Flannery, Sugai, & Eber, 2002; Taylor-Greene & Kartub, 2000)

Appendix G: District Readiness Checklist

School-wide Positive Behavior Support: District Readiness Checklist for Leadership Team (2-sided)

District: _____________________________________ Date: ________________________ Contact Person: _____________________________________

Items to Complete Prior to School-wide PBS Training (circle YES or NO for each item and provide the documents/evidence requested)

YES NO 1. A district representative has been identified as the PBS District Coordinator (i.e., lead contact) for all PBS initiatives within your district. List district representative and provide contact information (name, title, address, phone, cell, fax, e-mail):

YES NO 2. District administrators have participated in an awareness presentation summarizing Florida's PBS Project and the School-wide PBS process. List date(s) of presentation, location(s), and name of presenter(s):

YES NO 3. A district Positive Behavior Support (PBS) team is formed and has broad representation (including regular and exceptional student education, student support services, personnel preparation, curriculum and instruction, management information systems, safe and drug-free schools, school improvement, transportation, etc.). List team members and identify roles:

YES NO 4. District PBS team commits to attend a portion of the school-wide training and participate in annual or bi-annual update meetings to discuss progress to date. Describe when you meet or plan to meet (days, location, and time) throughout the school year:

YES NO 5. District PBS team has participated in and completed a needs assessment and action plan facilitated by Florida's PBS Project. Provide copy of action plan and list date of completion:

YES NO 6. PBS coaches (facilitators) have been identified by the PBS District Coordinator to receive additional training and actively participate in the school-wide initiatives (may overlap with District PBS team). List PBS coaches and roles:

YES NO 7. District has allocated/secured funding to support the school-wide initiatives in their respective schools (e.g., School Improvement, Safe and Drug Free Schools, other school/community resources). Identify funding source(s) that will be utilized:

YES NO 8. School-wide discipline (i.e., school climate, safety, behavior, etc.) is identified as one of the top district goals. Attach a copy of district goals or letter of support from the Superintendent's office.

Appendix G: (Continued)

YES NO 9. The district will provide a letter to participating school principals reminding them of the training dates, requirements of attendance, stipend requirements, items needed at training, etc. Attach a copy of the letter.

YES NO 10. Following training, the district will provide a letter to participating school principals on the importance of data collection and the need for daily use of their database system, and will encourage participation of team members in ongoing training opportunities. Attach a copy of the letter of support disseminated to administrators.

YES NO 11. The district is aware that SWIS III is a school-based discipline data system that is not intended to, and not capable of, replacing the current district database. Confirm: Yes OR No. List current discipline data system utilized in your district:

YES NO N/A 12. If your school district agrees to adopt SWIS III for participating schools, then the district agrees to provide the participating schools computer access to the Internet and at least Netscape 6 or Internet Explorer 5. Confirm available Internet access: Netscape ____ OR Internet Explorer ____ (Please remember that SWIS training is OPTIONAL and follows successful completion of school-wide training.)

YES NO N/A 13. If your school district agrees to adopt SWIS III for participating schools, then the district will provide time for a person from your MIS department to develop the query statements necessary for SWIS compatibility with your current district database. List MIS person and provide contact information: (Please remember that SWIS training is OPTIONAL and follows successful completion of school-wide training.)

YES NO 14. The district agrees to allow the participating schools to revise/utilize a discipline referral form and problem behavior definitions, and to develop a coherent discipline referral process, in order to enhance data-based decision making on campuses. Confirm: Yes OR No

Appendix H: District Planning Process Form

District: ________________________ Completion Date: ________________________

Team Members Present: ________________________________________________________________
________________________________________________________________________________

District Action Planning

Purpose: To assist the district team in developing an annual, comprehensive plan for initiating, supporting, and evaluating the School-wide Positive Behavior Support (PBS) efforts for all schools in the district. The plan will help determine which district personnel, representing various service areas, are needed to build and maintain PBS as a priority for schools within the district. The plan will also determine the persons who will be identified as PBS coaches and who will be directly responsible for regularly monitoring individual school teams' progress. Additionally, the plan allows the district team to plan for resources (time, funding, etc.) to support implementing school teams. To conclude the plan, the team will generate goals for expanding Positive Behavior Supports within the district for the coming school year.

**Red = Recommended for first-year districts or districts adding fewer than 3 schools a year.
**Blue = Recommended for second-year districts or districts adding more than 3 schools a year. May be considered for districts that have been implementing at a slower pace for several years.

Appendix H: (Continued)

INITIAL COMMITMENT (mark each item: Completed, In Progress, or Not Addressed)

a. What is the level of interest regarding School-wide Positive Behavior Support in the district?
b. How was the interest generated?
c. Has the district team and/or school administrators received overview information on the PBS process?
d. Is school-wide discipline identified as a top district goal? If so, how?
e. How many schools will receive training this year?
f. Why and how are these schools selected for training?
g. Are there other initiatives or issues that might impact (positively or negatively) the support of School-wide Positive Behavior Support by the School Board or Superintendent?
h. Will the established PBS Leadership Team agree to meet and action plan at a minimum annually?
i. Are all necessary people members of the district team (now and later)?

Appendix H: (Continued)

COORDINATING SUPPORT (mark each item: Completed, In Progress, or Not Addressed)

a. Which team member has been selected to be the District Coordinator of PBS? SEE DEFINITION
b. Will the chosen district coordinator be given the time to attend the 3-day training with the school teams?
c. Will this person be provided with sufficient support (time, resources, etc.) to make the PBS process work at the selected schools and expand efforts across the district? How?
d. Does the leadership team have the authority to commit specific resources to school teams?
e. How will the Leadership Team determine who will be appropriate school team coaches? SEE DEFINITION
f. How will the team set aside time for team coaches to meet as a group a minimum of once per quarter?
g. How will the team ensure that coaches will attend the 3-day training with their assigned school team?
h. How will the team ensure that coaches will attend the annual coaches training presented by the University of South Florida?
i. How will the team free up coaches' time so they may attend school teams' monthly PBS meetings or various school-wide events?
j. How will the team provide funding to support/sustain school-wide efforts at multiple school sites over time?

Appendix H: (Continued)

EVALUATING PROGRESS (mark each item: Completed, In Progress, or Not Addressed)

a. Is each of the identified schools' current database useful for data-based decision making?
b. What is the name of the current data collection system in the district?
c. Will the district allow schools to use an alternate data collection system? What if that system is web-based?
d. How do schools get their data (process and format)?
e. How and when will SWPBS information/data be shared with other necessary persons (e.g., State PBS Project, Superintendent, School Board members)?
f. Are there a variety of channels of communication to inform and receive feedback from all impacted by SWPBS?
g. What, if any, tools are schools required to use yearly to assess climate/safety?
h. Will the District Coordinator require all implementing school teams to complete and turn in the Benchmarks of Quality? How? When?
i. How will the District Coordinator ensure that coaches are monitoring school teams' action plans and progress on completing stated goals?
j. Will District Coordinators ensure coaches are assisting school teams in completing annual staff satisfaction surveys?
k. Will District Coordinators ensure that coaches are monitoring the regular/valid input of discipline data? How?
l. How and when will coaches be evaluated?
m. How and when will data be shared with the District Coordinator?

Appendix H: (Continued)

GOALS / NEXT STEPS (mark each item: Completed, In Progress, or Not Addressed)

a. How do you plan to support classrooms, target groups, or individuals through the PBS process?
b. What social skill initiatives (i.e., anger management, conflict resolution, verbal de-escalation, etc.) does the district have in place to support targeted group interventions?
c. Do you have representation on your leadership team from Professional Development?
d. What are the plans for expanding PBS efforts in the next year? For the next 3 years?

Appendix I: School Readiness Checklist

School-wide Positive Behavior Support: Training Readiness Checklist for Individual Schools

School: ____________________________________________________ District: ________________________ Date: _________________

Items to Complete Prior to School-wide PBS Training (circle YES or NO for each item and provide the documents/evidence requested)

YES NO 1. A school improvement plan exists that includes school-wide discipline (i.e., behavior, school safety, school climate) as one of the top school goals. Attach a recent copy of your School Improvement Plan and School Mission Statement.

YES NO 2. A Positive Behavior Support (PBS) team is formed and has broad representation (including some School Improvement Team members, a behavior specialist or team member with behavioral expertise, administrator(s), guidance counselor, and regular and special education teachers). List team members and roles:

YES NO 3. The principal or AP who is responsible for making discipline decisions is an active participant on the PBS team and agrees to attend all 3 days of school-wide training. List participating principal(s):

YES NO 4. The principal commits to School-wide PBS and is aware that PBS is a 3-5 year process that may require ongoing training and/or revisions of the school's PBS plan. Please provide principal signature(s):

YES NO 5. The PBS team commits to meet at least once a month to analyze and problem-solve school-wide data. Describe when you meet or plan to meet (days, location, and time) throughout the school year:

YES NO 6. The PBS team has reached consensus and completed the PBS Initial Benchmarks of Quality. Attach a recent copy of the completed Initial Benchmarks of Quality.

YES NO 7. Your entire faculty, including your PBS team, participated in an awareness presentation on School-wide PBS. Indicate date of presentation and presenter name(s):

YES NO 8. The majority of your faculty, staff, and administration are interested in implementing School-wide PBS. Attach recent assessment/survey disseminated and results (i.e., percentage or range of faculty committed):

YES NO 9. The school has allocated/secured funding from the district to support its school-wide initiatives. Identify funding source:

YES NO 10. An individual at the district level has been identified as the lead district contact or PBS District Coordinator. Lead district contact:

YES NO 11. PBS coaches or facilitators have been identified by the District Coordinator to receive additional training and actively participate in the school-wide initiatives. List the PBS coach, with title, who will be supporting your PBS team:

Appendix I: (Continued)

School: ____________________________________ District: _____________________________________ Date: _______________________

Items to Complete Prior to SWIS Training (circle YES or NO for each item and provide the documents/evidence requested)

YES NO 12. The school uses an office discipline referral form and problem behavior definitions that are compatible with SWIS. Attach a final copy developed during the school-wide training.

YES NO 13. The school has a coherent office discipline referral process. Attach a final copy developed during the school-wide training.

YES NO 14. Data entry time is allocated and scheduled to ensure that office referral data will be current to within a week at all times. Describe this process on campus:

YES NO 15. Three people within the school are identified to receive a 2+ hour training on the use of SWIS. List individuals and their roles:

YES NO 16. The school has computer access to the Internet, and at least Netscape 6 or Internet Explorer 5. Confirm available Internet access: Netscape OR Internet Explorer

YES NO 17. The school agrees to ongoing training for the team receiving SWIS data on uses of SWIS information for data-based decision making. Confirm: Yes OR No

YES NO 18. The school district agrees to allow the PBS coaches or facilitators to work with the school personnel on data collection and decision-making procedures. List PBS coach(es) who will work with your school team:

YES NO 19. The school agrees to continue to input data into the district database until SWIS compatibility with the district database is completed. This may require the school to double-enter its discipline data in the meantime. Confirm: Yes OR No

Appendix J: Comparison of Schools that Returned the BoQ Survey (R) and Schools that Did Not Return the BoQ Survey (NR)

                                        N              M (SD)
Categories                       All  NR  R    All               NR                R                 FL
Socio-Cultural Factors
% Students eligible for FRL(a)   136  63  73   55.90 (21.25)     58.45 (23.92)     53.69 (18.52)     50.80
Ethnicity (% non-white)          136  63  73   46.05 (23.61)     45.69 (26.53)     46.36 (20.95)     50.20
School size(b) - Elementary       66  31  35   608.03 (183.94)   592.13 (182.58)   622.11 (186.66)   741.00
School size(b) - Middle           48  20  28   1042.06 (390.37)  935.75 (482.70)   1118.00 (294.88)  1069.00
School size(b) - High             22  12  10   1418.54 (957.94)  1276.75 (861.04)  1588.70 (1084.47) 1565.00
Teacher: student ratio           136  63  73   15.25 (3.56)      14.56 (3.83)      15.84 (3.21)      NA
% Students with a disability     136  63  73   17.12 (5.69)      17.43 (5.72)      16.82 (5.69)      14.43
Stability of students            136  63  73   90.70 (13.57)     90.21 (13.30)     91.13 (13.88)     93.40
% Teachers with adv. degree      136  63  73   32.43 (10.57)     34.39 (12.12)     30.73 (8.77)      35.10
% Out-of-field teachers          136  63  73   7.68 (9.78)       9.04 (11.48)      6.51 (7.92)       5.60
Severity of Need Factors
% Students with ISS              102  39  63   12.57 (14.19)     10.54 (13.69)     13.82 (14.46)     12.00
% Students with OSS              102  39  63   11.54 (9.69)      9.72 (8.25)       12.68 (10.38)     10.40
% Students with ODR              102  39  63   63.25 (80.66)     57.77 (82.48)     66.63 (79.99)     NA
% Students who are BGLR          132  60  72   49.37 (17.32)     49.69 (18.94)     49.10 (15.99)     48.63

(a) FRL = Free and Reduced Lunch. (b) Data were not available for center schools on any of these variables. FL = statewide Florida average (see Appendix K).
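For readers who want to probe NR/R differences in a table like this, a two-sample test can be run directly from the reported summary statistics. The Python sketch below is purely illustrative (the dissertation's own analyses are not reproduced here): it applies SciPy's ttest_ind_from_stats to the free/reduced-lunch row, using Welch's correction for unequal variances.

    from scipy.stats import ttest_ind_from_stats

    # Compare NR and R schools on % students eligible for free/reduced
    # lunch, using the means, SDs, and Ns from the first table row.
    t_stat, p_value = ttest_ind_from_stats(
        mean1=58.45, std1=23.92, nobs1=63,   # NR schools
        mean2=53.69, std2=18.52, nobs2=73,   # R schools
        equal_var=False,                     # Welch's t-test
    )
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")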


Appendix K: Average Scores for Florida Schools

Categories                          Elementary  Middle   High     All
Socio-Cultural Factors
Free and Reduced Lunch Status       53.50       48.00    NA       50.80
Ethnicity (% non-white)             NA          NA       NA       50.20
School Size                         741.00      1069.00  1565.00  1125
Teacher: student ratio              NA          NA       NA       NA
% Students with a disability        15.40       15.00    12.90    14.43
Stability of students               94.30       93.90    92.00    93.40
% Teachers with advanced degree     35.10       35.10    35.10    35.10
% Out-of-field teachers             5.60        5.60     5.60     5.60
Severity of Need Factors
% Students with ISS                 1.80        17.50    16.70    12.00
% Students with OSS                 3.00        15.40    12.80    10.40
% Students with ODR                 NA          NA       NA       NA
% Students who are BGLR*            NA          NA       NA       48.63

Note. *BGLR data were available by grade: 3rd (35%), 4th (30%), 5th (41%), 6th (46%), 7th (47%), 8th (56%), 9th (68%), 10th (66%).


Appendix L: SWIF Survey

SWPBS Implementation Factors (SWIF) Survey

1. Welcome to the survey
Thank you for taking the time to complete the School-Wide Positive Behavior Support Implementation Factors Survey (SWIF). The survey should take about 10 minutes to complete.

2. Introduction to Survey
This survey was developed to determine the degree that various factors affect implementation of School-wide or Universal Positive Behavior Support (SWPBS). These factors were generated by school teams, coaches, facilitators, and district coordinators as either HELPFUL or PROBLEMATIC to the implementation of PBS. The survey was developed for use by Florida's schools and districts; however, implementers from any state or district are welcome to complete the survey. While this survey refers to the School-Wide Positive Behavior Support (SWPBS) process, most school personnel use the term PBS to refer to this process; therefore, the term PBS will be used throughout the survey instead of SWPBS.


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

3. Organization of survey
The survey will include items in the following categories:
   PBS elements
   Administrators
   School staff
   PBS team
   PBS coach
   Students
   Resources

4. Directions for Survey
You will be presented with factors in each of the categories. Please determine the degree that each factor has been HELPFUL or PROBLEMATIC in the PBS implementation process, or indicate whether the factor has had no impact. The choices for each item are:
   Problematic
   Somewhat Problematic
   Somewhat Helpful
   Helpful
   No Influence

5. Definition of Helpful and Problematic
Use the following definitions to help you determine whether each item has been helpful or problematic.
Definition of HELPFUL: Any factor that has promoted or assisted in PBS implementation.
Definition of PROBLEMATIC: Any factor that has hampered, delayed, or prevented PBS implementation.


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

7. Demographic Information
Please complete the following information before you begin the survey. This information will be confidential and will not be reported to your school or team.
1. District
2. School name (will not be revealed in analysis). Please use your experience with this school to answer the items in this survey.

BEGIN SURVEY
Please complete the entire survey and indicate an answer for each item based on your experience with the school you named on the previous page. If you would like to add comments for individual items, space will be provided at the end of the survey. Thank you.


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

10. PBS Elements
1. School has: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Expectations and rules that are clearly defined
   A reward system that works
   A discipline referral process that works
   Consequences for problem behavior that are consistent and effective
   A School Improvement Plan (SIP) that includes PBS


Appendix L: (Continued)

2. Discipline data are: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Entered regularly
   Reviewed regularly
   Used to make decisions
   Shared with faculty regularly

SWPBS Implementation Factors (SWIF) Survey

12. Second Administrator
1. Do you have a second administrator (e.g., Assistant/Vice Principal, Dean) at your school?
   Yes
   No


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

13. Second Administrator
Determine the degree that these items related to a SECOND ADMINISTRATOR (e.g., AP/VP, dean) have been problematic, helpful, or had no influence on your school's implementation of PBS.
Note. If there is more than one second administrator, select the one who was the most involved with PBS.
1. Second administrator's: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Personal commitment to PBS
   Amount of time he/she is involved with PBS implementation
   Availability to attend PBS meetings
   Providing input about PBS implementation
   Stability from year to year (e.g., continuity of person in position)
2. Second administrator's willingness to:


   (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Teach or model PBS expectations
   Reward students for meeting PBS expectations
   Follow discipline procedures consistently
   Allow PBS team to train staff in PBS
   Allow PBS team to train students in PBS

Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

14. School Staff
Determine the degree that the items related to the SCHOOL STAFF (e.g., general education teachers, special education teachers, special area teachers, support staff) have been problematic, helpful, or had no influence on your school's implementation of PBS.
1. School staff's: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Amount of time available for PBS implementation


   Philosophy towards discipline/behavior
   Belief about the effectiveness of PBS
   Input about PBS (e.g., surveys/discussions)
   Stability year to year (i.e., teacher population)
2. School staff's consistency in: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Teaching expectations
   Rewarding students for meeting expectations
   Following discipline procedures


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

15. PBS Team
Determine the degree that the items related to the PBS TEAM have been problematic, helpful, or had no influence on your school's implementation of PBS.
1. The PBS team is: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Representative of the school staff
   Cohesive
   Committed
   Able to meet regularly
   Available for PBS-related activities and events (e.g., time to plan, time to participate)


Appendix L: (Continued)

2. The PBS team: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Shares/publicizes outcomes that demonstrate success (e.g., decrease in referrals)
   Recognizes/rewards faculty for participation
   Integrates PBS into school initiatives


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

16. PBS Coach
Determine the degree that the items related to the PBS COACH have been problematic, helpful, or had no influence on your school's implementation of SW-PBS.
1. PBS coach's: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Availability for PBS implementation (e.g., time)
   Guidance with process
   Stability of position (e.g., same person in position of coach)


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

17. Students
Determine the degree that the items related to the STUDENTS have been problematic, helpful, or had no influence on your school's implementation of SW-PBS.
1. Students': (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Response to rewards and activities
   Input about PBS (e.g., surveys/informal discussions)
   Stability year to year (i.e., student population)


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

18. Resources
Determine the degree that the items related to the following RESOURCES have been problematic, helpful, or had no influence on your school's implementation of PBS.
1. Support from or collaboration with: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   District personnel
   Other PBS teams
   Superintendent
   Parents
   Community agencies
2. Availability of: (rate each item as Problematic, Somewhat Problematic, Somewhat Helpful, Helpful, or No Influence)
   Staff PBS training by school PBS team
   Student training in PBS
   Adequate funding for PBS
   PBS procedures in a handbook


Appendix L: (Continued)

SWPBS Implementation Factors (SWIF) Survey

19. Additional Factors
1. Please list 2 additional factors that you believe have been PROBLEMATIC to PBS implementation at your school:

SWPBS Implementation Factors (SWIF) Survey

24. End of Survey


Appendix M: Summary of SWIF Feedback Comments

Category: Survey directions (e.g., clarity, parsimony)
Strengths:
   Clear.
   I went through the survey and, in general, I think the directions are pretty clear and the items seem to be relevant.
Problems/Suggestions:
   Do you mention the $100 drawing at the beginning? I only remember seeing it at the end.
   I think I would put the confidentiality information before the rating descriptions.
   I wasn't exactly clear on whether the goal of the survey was to rate a specific school on these factors or to identify factors across sites. It may be that those completing it will understand because of their experience with SW PBS.
   Will all respondents be school-based or might some (like me) be from outside the system? If district or other personnel will be responding, there needs to be a way for them to opt out of identifying their school.
   It probably isn't necessary, but it might be a nice extra if the survey instructions let people know that their responses will be saved if they need to go back to previous screens.

Category: Survey questions (e.g., wording, formatting)
Strengths:
   Great. The items seem pretty clear to me.
Problems/Suggestions:
   "School Team: Available for PBS implementation (e.g., time)" was confusing. Obviously they are available for implementation, since that is what the team does. Perhaps it should read "School Team: Gives adequate amount of time needed for implementation," or something like that? And for section #11/question 2, just add "teaching expectations throughout the school year." Some people will agree that they taught the expectations the first week of school, but they won't talk about them again.
   The only thing is to go back and check whether or not SWPBS has a hyphen. In the directions it does not, but in the directions for questions 11-15 there is a hyphen. In questions 9, 16, and 17 it's PBS without the school-wide. I think just keep it consistent.
   I'm assuming that everyone completing this survey will fully understand the terms SW/Universal PBS and the roles played by the PBS coach and team. If not, they should be defined.
   There is an area of the survey that focuses on the degree to which certain features of SWPBS are in place at the school. It seems like this should come first (which would help to define SWPBS), and the rating for this part might be different, maybe based on the extent to which the things are evident?


Category: Survey questions (e.g., wording, formatting) (Continued)
Problems/Suggestions:
   It might be helpful to include a "Does not apply" option for the answers. For instance, the school I had in mind was a first-year (ever) school, so the stability of the principal was not an issue.
   On item 12, how is "Having a PBS team that is...able to meet regularly" different from having a team that is "Available for PBS implementation"? An example might help clarify this.
   On item 14 (students), the last item needs the word "from" added to it (i.e., "Stability from year to year"). Also, you might consider adding a descriptor onto the example (e.g., student population growth/matriculation/etc.).
   Item 15, number 2: consider re-wording to "Availability of staff training by school's PBS team."
   The issue raised in my second comment came up again for me on item 16. What if the expectations are only partially defined, or the reward system is partially effective? How will survey users be able to evaluate the degree to which they've accomplished each of these items?

Category: Response format (e.g., clarity, ease of use, logic of question)
Strengths:
   Great.
Problems/Suggestions:
   If the purpose is to rate factors generally, it seems to me that a different rating scale might make more sense (e.g., from very important to very unimportant in the success of SWPBS). If the helpful/problematic scale is to be retained, you might want to modify it to very helpful, somewhat helpful, neither helpful nor problematic, somewhat problematic, very problematic, so that it is in order of effect and equally weighted on both sides.
   It might be helpful to have an example item or two (maybe a positive and a negative example) for survey users. The wording of each item is clear (e.g., "the principal's commitment to PBS"), but having to think about it in terms of something not being there was a little confusing at first (e.g., if the principal is NOT committed to PBS, it would be VERY problematic, but it took me a minute to get there).
   As I mentioned, I think the content sounds very familiar and is right on track. However, I'm not sure about how it is organized. Because you have it divided by role at the school (principal, teachers, PBS team), the content seems redundant in some areas. You also have a kind of "is"/"does" separation. Trying to parse this out was hard for me, but this may just be me.
   On item 17, it might encourage respondents to put two comments down if you structured the answer space just a little bit more (e.g., having a numbered space for each item: 1a., 1b.).


Category: Using Survey Monkey (e.g., ease, logistics, clarity, availability)
Problems/Suggestions:
   When I had to choose the survey, I noticed there were 2 (possibly 3) that said "SWIF: School Wide Implementation Factors." You said to click on the one that said "Florida Only," but I almost didn't because it said it was "closed" next to it. I wonder if that will make anyone else second-guess the directions and think to choose the other one instead?
   It would be REALLY cool if Survey Monkey offered some kind of indicator to let you know how close you were to being done (survey is 25% complete, 60% complete, etc.). Time-pressed school personnel might appreciate this.

Category: Other Comments
   Pretty simple to use.
   I would have liked to know how long it would take (e.g., 5-10 minutes).
   Very professional, comprehensive - looks GREAT!!!!!!!
   Highest Degree Attained: can you add S.S.P., which is the Specialist in School Psychology, or not make the Specialist degree specific to an Educational Specialist only? Didn't know if you had any control over this.
   If possible, it might be better for respondents to be able to make comments per area, when things will be fresh in their minds. If they have to wait until the end of the survey, you will get less.


Appendix N: Research Questions

Research Question 1: Is there a difference in the perceived level of SW-PBS implementation between schools in their first, second, and third year of SW-PBS?
IV: Year of implementation
DV: BOQ total score
Analysis: One-way analysis of variance (ANOVA)
Results: N = 57 first-year schools; N = 28 second-year schools. Using an alpha of .05, there was no significant difference between these two groups of schools on the BOQ scores, F(1, 38) = 2.12, p = .149.

Research Question 2: What is the relationship between demographic school factors (demographics) and the perceived level of SW-PBS implementation?
IVs: Student variables: 1. SES; 2. Ethnicity; 3. % of students with a disability; 4. Stability of students. School building variables: 1. School size; 2. Teacher:student ratio; 3. % of teachers with an advanced degree; 4. % of out-of-field teachers
DV: BOQ total score
Analysis: Multiple regression; sample size for analysis: N = 69 schools
Results: Student variables: R² = .051; β = -.25 for SES, β = .03 for ethnicity, β = .16 for % students with disabilities, β = .21 for stability. There were no significant values. School building variables: R² = .036; β = .05 for school size, β = -.08 for teacher:student ratio, β = .19 for % teachers with an advanced degree, β = -.02 for % out-of-field teachers. There were no significant values.

Research Question 3: What is the relationship between the implementation process factors and the perceived level of SW-PBS implementation?
IVs: 1. Perceived district/administrative support (AS); 2. Team functioning (TF); 3. Coach's self-assessment (CSA)
DV: BOQ total score
Analysis: Multiple regression; sample sizes for analysis: N = 79 (TF), N = 78 (AS), N = 59 (CSA)


Results: R² = .187; β = 0.53* for TF, β = -0.18 for AS, β = 0.06 for CSA. *p < .05.

Research Question 4: What is the relationship between the severity of need for SW-PBS, as measured by baseline-year academic and behavioral indicators, and the perceived level of SW-PBS implementation?
IVs: 1. In-school suspensions (ISS); 2. Out-of-school suspensions (OSS); 3. Office discipline referrals (ODR); 4. % students below grade level in reading (BGLR)
DV: BOQ total score
Analysis: Multiple regression; sample sizes for analysis: N = 59 (ISS, OSS, ODR), N = 68 (BGLR)
Results: R² = .032; β = 0.00 for ISS, β = -0.07 for OSS, β = -0.09 for ODR, β = -0.06 for BGLR. There were no significant values.

Research Question 5: What is the reliability and validity of the SWIF survey, a measure of enablers of and barriers to PBS implementation?
Analysis: Exploratory factor analysis with promax (oblique) rotation; test-retest reliability (N = 5)
Results: Test-retest: total score agreement = 98%; agreement between items at T1 and T2 = 86%. Factor analysis: a three-factor solution explained 47.5% of the total variance. The factors were: 1. Staff, Students, and External Agents (36 items, alpha = .95); 2. Assistant Principal (13 items, alpha = .93); 3. Principal (11 items, alpha = .93).


Research Question 6: Is there a difference between high and low implementers on the factors of the SWIF survey?
IV: Level of implementation
DV: Scores on observed factors from the SWIF survey
Analysis: Kruskal-Wallis test; Mann-Whitney test. Sample sizes for this analysis: low implementers (BOQ 0-50), n = 8; middle implementers (BOQ 50.1-90), n = 23; high implementers (BOQ 90.1-100), n = 5
Results: The Kruskal-Wallis tests for differences among high, middle, and low implementers on F1, F2, and F3 led to the same conclusions as the Mann-Whitney tests for differences between high and low implementers on F1, F2, and F3. All tests were significant: F1, χ²(2) = 15.8, p < .001; F2, χ²(2) = 14.3, p < .001; F3, χ²(2) = 18.3, p < .001.

Research Question 7: Which items are perceived as the most helpful and most problematic in the implementation of PBS by coaches and team members?
IVs: Items on the SWIF survey
DVs: Ratings for each item on the SWIF survey; additional factors from open-ended questions
Analysis: Item analysis, including descriptive statistics and item rankings; qualitative item analysis
Results: Highest-rated items: 1. Expectations/rules clearly defined; 2. Reward system that works; 3. Discipline referral system that works. Lowest-rated items: 1. PBS procedures in handbook; 2. Adequate funding; 3. Student training in PBS.
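For readers who want to reproduce analyses of this kind on their own data, the sketch below shows how the two central procedures in this table, the multiple regression for Research Question 3 and the Kruskal-Wallis test for Research Question 6, could be run in Python with statsmodels and scipy. It is a minimal illustration under assumed data, not the study's original analysis code; the file name and column names (boq_total, team_functioning, admin_support, coach_self_assessment, implementation_level, swif_factor1) are hypothetical.

    # Minimal sketch (not the study's original code) of two analyses from
    # Appendix N. Assumes a DataFrame with one row per school and the
    # hypothetical columns named below.
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy import stats

    df = pd.read_csv("swpbs_schools.csv")  # hypothetical data file

    # RQ3: multiple regression of BoQ total score on the three team-process
    # variables (TF, AS, CSA). statsmodels reports unstandardized
    # coefficients; z-score the variables first to obtain betas.
    model = smf.ols(
        "boq_total ~ team_functioning + admin_support + coach_self_assessment",
        data=df,
    ).fit()
    print(model.summary())

    # RQ6: Kruskal-Wallis test comparing a SWIF factor score across low,
    # middle, and high implementers (a nonparametric one-way comparison).
    groups = [
        g["swif_factor1"].dropna()
        for _, g in df.groupby("implementation_level")
    ]
    h_stat, p_value = stats.kruskal(*groups)
    print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

The Kruskal-Wallis test is the natural choice here because the group sizes are small and unequal (n = 8, 23, and 5), which makes the normality assumptions of a parametric ANOVA hard to defend.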


Appendix O: SWIF Emails

Dear District Coordinator or Coach:

To help improve your PBS trainings and technical assistance, the PBS Project needs your input on the factors that have most impacted PBS implementation. We would like every PBS team member to complete a BRIEF online survey (10 minutes) called the School-wide PBS Implementation Factor (SWIF) survey. Please complete the survey yourself and email ALL members of the PBS teams in your school or district. (A sample email is provided in the attachment.) The link and pop-up window for the survey are on the PBS website: http://flpbs.fmhi.usf.edu

All respondents who complete the survey by May 31 will be entered into a drawing for $100.

Thank you,
Florida PBS Project Staff

Note. All survey results will remain confidential and there will be no identifying information published with the results.

Figure O1. First email to inform participants about the SWIF survey.

Dear District Coordinator or Coach:

To those who have already completed the survey, thank you very much for your feedback. If you haven't yet completed the survey, please take a few minutes in the next few days to complete it. Also, please remind your team members to do the same. We greatly appreciate your feedback. Here is the information about the survey:

To help improve your PBS trainings and technical assistance, the PBS Project needs your input on the factors that have most impacted PBS implementation. We would like every PBS team member to complete a BRIEF online survey (10 minutes) called the School-wide PBS Implementation Factor (SWIF) survey. Please complete the survey yourself and email ALL members of the PBS teams in your school or district. (A sample email is provided in the attachment.) The link and pop-up window for the survey are on the PBS website: http://flpbs.fmhi.usf.edu

All respondents who complete the survey by May 31 will be entered into a drawing for $100.

Thank you,
Florida PBS Project Staff

Note. All survey results will remain confidential and there will be no identifying information published with the results.

Figure O2. Reminder to participants about the SWIF survey.


Appendix O: (Continued)

Dear District Coordinator or Coach:

So far, we've had a great response to the survey and appreciate your input. We know many schools are out for the summer, but for those who are still around, we need about 100 more respondents by June 30th so the results can represent the opinions of 1/3 of the PBS participants. Here is the information about the survey:

To help improve your PBS trainings and technical assistance, the PBS Project needs your input on the factors that have most impacted PBS implementation. We would like every PBS team member to complete a BRIEF online survey (10 minutes) called the School-wide PBS Implementation Factor (SWIF) survey. Please complete the survey yourself and email ALL members of the PBS teams in your school or district. (A sample email is provided in the attachment.) The link and pop-up window for the survey are on the PBS website: http://flpbs.fmhi.usf.edu

All respondents who have completed the survey by June 30th will be entered into another drawing for $50.

Thank you,
Florida PBS Project Staff

Note. All survey results will remain confidential and there will be no identifying information published with the results.

Figure O3. Second reminder to participants about the SWIF survey.

Dear Participants:

I am writing to you about the School Wide Implementation Factor (SWIF) survey. We have not yet received your completed survey. We understand that the SWIF survey was not a part of the original evaluation requirements; however, we have gained such useful information from the SWIF that we would now like to be able to examine the SWIF results in conjunction with the Benchmarks of Quality (BoQ) results. As you have completed a BoQ for your school, School X, your completion of the SWIF for this school is very important to us.

The large number of questionnaires returned has been very encouraging. This is the first web-based survey on the factors that influence PBS implementation. The results will be very useful to the Florida PBS Project in designing training and support mechanisms for the districts and schools involved in the PBS project. It is for these reasons that I am sending this email reminder to complete the SWIF survey for your school. I encourage you to complete the survey by next Monday, July 18, 2005.

I am including the link that will allow you to access the survey. A pop-up window will provide access to the survey, or there is a link on the first page of this website: http://flpbs.fmhi.usf.edu

Thank you,
Rachel Cohen & Florida PBS Project Staff

Please contact me at rcohen@fmhi.usf.edu if you have any further questions. If you have already completed the SWIF, please disregard this message. Thank you.

Figure O4. Final reminder to participants about the SWIF survey.


Appendix P: Descriptive Data for Demographic Variables

Sample Size
Elementary: 35; Middle: 28; High: 10; Center/Other: 0. First year: 46; Second year: 23; Third year: 4. Total: 73.

Range
Category     FRL(a)     Ethnicity(b)  School size  Teacher:student ratio  % Students with a disability  Stability of students  % AD(c) teachers  % OOF(d) teachers
Elementary   18.8-94.2  10.6-93.2     52-910       8.7-19.9               8.6-27.3                      88.2-96.8              12.8-50.0         0.0-35.9
Middle       24.3-96.5  6.7-99.2      258-1783     12.3-20.5              9.1-26.9                      86.2-97.3              16.1-48.7         0.0-28.8
High         16.9-55.0  12.5-75.7     84-2905      6.0-19.6               8.4-49.7                      8.3-96.5               11.1-44.3         0.0-16.4
First (04)   18.8-96.5  6.7-99.2      52-2452      7.6-20.5               8.4-49.7                      12.2-97.3              11.1-50.0         0.0-35.9
Second (03)  16.9-77.1  12.5-75.7     84-2737      6.0-18.8               12.3-25.0                     8.3-97.2               12.8-48.7         0.0-18.2
Third (02)   20.4-79.5  33.4-82.0     392-2905     9.3-19.0               13.8-19.4                     89.6-95.0              27.4-36.8         0.0-18.6
All          16.9-96.5  6.7-99.2      52-2905      6.0-20.5               8.4-49.7                      8.3-97.3               11.1-50.0         0.0-35.9

Mean (SD)
Category     FRL            Ethnicity      School size        Teacher:student ratio  % Students with a disability  Stability of students  % AD teachers  % OOF teachers
Elementary   57.67 (16.21)  48.78 (19.20)  622.11 (186.67)    14.94 (3.19)           15.95 (3.87)                  93.64 (2.24)           28.73 (9.16)   4.65 (7.72)
Middle       55.26 (19.31)  45.36 (23.95)  1118.00 (294.88)   17.22 (1.77)           17.24 (4.35)                  93.30 (2.78)           32.73 (7.91)   8.99 (8.20)
High         35.40 (13.90)  40.74 (18.35)  1588.70 (1084.47)  15.16 (5.02)           18.72 (11.76)                 76.23 (34.82)          32.18 (8.94)   6.06 (6.50)
First (04)   57.12 (18.54)  50.36 (21.91)  890.17 (492.50)    15.91 (3.27)           17.24 (6.77)                  91.38 (12.21)          30.48 (9.08)   6.03 (8.76)
Second (03)  46.83 (15.86)  37.91 (16.66)  994.52 (575.07)    15.78 (3.04)           16.18 (3.24)                  90.46 (18.00)          30.87 (8.94)   6.75 (5.96)
Third (02)   53.83 (26.39)  49.00 (22.29)  1285.75 (1139.85)  15.49 (4.41)           15.68 (2.54)                  92.08 (2.22)           32.80 (4.19)   10.63 (8.30)
All          53.69 (18.53)  46.36 (20.95)  944.73 (562.08)    15.84 (3.21)           16.82 (5.69)                  91.13 (13.88)          30.73 (8.77)   6.51 (7.92)

Note. (a) FRL = % students eligible for free or reduced lunch. (b) Ethnicity refers to the percentage of non-white students enrolled in the school. (c) % AD = % teachers with an advanced degree (i.e., Masters, Specialist, Doctoral). (d) % OOF = % classes taught by out-of-field teachers.


Appendix Q: Descriptive Data for Team Functioning (TF), Administrative Support (AS), and Coach Self-Efficacy (CSE) Scores

                     N              Range                                  Mean (SD)
Category      TF   AS   CSE   TF         AS         CSE          TF            AS            CSE
Type
Elementary    44   44   33    26.1-44.0  15.8-25.0  11.0-24.0    37.70 (3.80)  20.31 (2.29)  18.67 (3.04)
Middle        31   31   26    27.7-43.0  14.7-23.8  8.0-24.0     37.20 (3.70)  20.03 (2.09)  18.73 (4.23)
High          10   9    7     27.4-42.0  17.0-21.7  9.0-24.0     35.86 (4.88)  19.80 (1.83)  17.57 (5.16)
Center/Other  18   18   15    29.7-45.0  15.2-24.0  13.0-24.0    36.14 (4.14)  19.45 (2.18)  19.20 (3.34)
Year
First         54   54   44    26.1-44.0  14.7-19.3  8.0-24.0     36.70 (4.08)  19.92 (2.22)  18.55 (3.81)
Second        44   44   34    30.0-45.0  16.0-24.0  13.0-24.0    37.45 (3.64)  20.14 (1.96)  18.44 (3.36)
Third         5    4    3     28.3-42.0  15.8-25.0  23.0-24.0    38.05 (5.59)  20.30 (1.96)  23.67 (0.58)
All           103  102  81    26.1-45.0  14.7-25.0  8.0-24.0     37.09 (3.95)  20.03 (2.17)  18.69 (3.66)

Note. The total possible points are 45 for TF, 25 for AS, and 24 for CSE.


Appendix R: Descriptive Data for ISS, OSS, BGLR, and ODR

ISS
Type        N    Range      M      SD
Elementary  29   0.0-22.6   2.80   5.13
Middle      26   0.1-45.6   24.09  12.10
High        8    0.0-50.4   20.39  16.76
First       45   0.0-50.4   11.92  14.68
Second      14   0.0-36.0   21.11  12.87
Third       4    0.2-24.0   9.70   10.90
All         63   0.0-50.4   13.82  14.46

OSS
Type        N    Range      M      SD
Elementary  29   0.0-49.3   7.29   9.32
Middle      26   0.2-38.9   16.72  8.81
High        8    4.8-40.4   17.15  12.20
First       45   0.0-49.3   13.02  10.90
Second      14   0.0-29.8   12.82  9.76
Third       4    0.7-14.5   8.38   6.72
All         63   0.0-49.3   12.68  10.38

BGLR
Type        N    Range        M      SD
Elementary  29   12.33-89.00  41.70  14.36
Middle      26   25.00-82.67  53.40  14.17
High        8    33.0-80.5    66.25  14.71
First       45   12.3-89.0    49.29  17.80
Second      23   19.67-80.50  47.62  13.29
Third       4    51.3-64.0    55.50  5.85
All         72   12.3-89.0    49.10  15.99

ODR
Type        N    Range    M        SD
Elementary  29   0-188    26.41    4.91
Middle      26   16-466   108.42a  95.68a
High        8    10-173   76.63b   51.02b
First       45   0-466    72.73    90.59
Second      14   0-123    47.73    38.79
Third       4    2-136    65.25    55.31
All         63   0-466    66.63    79.99

Note. There were no data available for Center/Other schools. Two medians were included for data that had a very large standard deviation: (a) the median was 93.0; (b) the median was 62.5.


Appendix S: Promax Rotation of Three-Factor Solution for SWIF Items: Structure Coefficients

Item  F1  F2  F3  h²
A reward system that works  0.59  0.35  0.47  0.37
Data entered regularly  0.48  0.18  0.65  0.46
Data reviewed  0.63  0.22  0.57  0.48
Data used to make decisions  0.68  0.33  0.58  0.50
Data shared regularly  0.69  0.34  0.57  0.51
Staff: Amount of time available for PBS implementation  0.73  0.46  0.56  0.55
Staff: Philosophy towards discipline/behavior  0.72  0.46  0.44  0.52
Staff: Belief about the effectiveness of PBS  0.72  0.43  0.47  0.52
Staff: Input about PBS  0.72  0.38  0.45  0.53
Staff: Stability year to year  0.54  0.38  0.34  0.30
Staff: Teaching expectations  0.66  0.47  0.39  0.45
Staff: Rewarding students for meeting expectations  0.71  0.43  0.40  0.51
Staff: Following discipline procedures  0.64  0.48  0.41  0.43
Team: Representative of the school staff  0.48  0.37  0.38  0.25
Team: Cohesive  0.55  0.45  0.33  0.33
Team: Committed  0.59  0.43  0.36  0.37
Team: Able to meet regularly  0.59  0.51  0.43  0.40
Team: Available for PBS-related activities and events  0.74  0.48  0.43  0.56
Team: Shares/publicizes outcomes that demonstrate success  0.81  0.44  0.51  0.66
Team: Recognizes/rewards faculty for participation  0.63  0.42  0.34  0.40
Team: Integrates PBS into school initiatives  0.75  0.45  0.50  0.56
Coach: Availability for PBS implementation  0.60  0.25  0.51  0.40
Coach: Guidance with process  0.60  0.32  0.56  0.42
Coach: Stability of position  0.48  0.23  0.55  0.34
Student: Response to rewards and activities  0.62  0.34  0.41  0.39
Student: Input about PBS  0.63  0.40  0.28  0.43
Student: Stability year to year  0.59  0.35  0.37  0.35
District personnel  0.38  0.30  0.32  0.16
Other PBS teams  0.40  0.38  0.21  0.20
Superintendent  0.42  0.24  0.29  0.18
Parents  0.63  0.36  0.42  0.40
Community agencies  0.53  0.31  0.35  0.29
Staff PBS training by school PBS team  0.63  0.34  0.37  0.39
Student training in PBS  0.68  0.39  0.43  0.47
Adequate funding for PBS  0.40  0.14  0.27  0.18
PBS procedures in a handbook  0.50  0.27  0.29  0.26


Appendix S: (Continued)

Item  F1  F2  F3  h²
Expectations and rules that are clearly defined  0.47  0.56  0.50  0.39
A discipline referral process that works  0.45  0.50  0.51  0.35
Consequences for problem behavior that are consistent and effective  0.48  0.55  0.44  0.36
AP: Personal commitment to PBS  0.46  0.87  0.37  0.76
AP: Amount of time he/she is involved with PBS implementation  0.44  0.88  0.32  0.79
AP: Availability to attend PBS meetings  0.40  0.80  0.26  0.66
AP: Providing input about PBS implementation  0.48  0.88  0.34  0.78
AP: Stability from year to year  0.40  0.55  0.40  0.33
AP: Teach or model PBS expectations  0.49  0.87  0.37  0.75
AP: Reward students for meeting PBS expectations  0.50  0.80  0.34  0.65
AP: Follow discipline procedures consistently  0.50  0.79  0.42  0.62
AP: Allow PBS team to train staff in PBS  0.47  0.77  0.42  0.60
AP: Allow PBS team to train students in PBS  0.51  0.79  0.46  0.64
A School Improvement Plan (SIP) that includes PBS  0.51  0.40  0.62  0.42
Principal: Personal commitment  0.49  0.35  0.88  0.79
Principal: Amount of time he/she is involved with PBS implementation  0.49  0.33  0.87  0.76
Principal: Availability to attend PBS meetings  0.48  0.32  0.81  0.66
Principal: Input about PBS implementation  0.56  0.34  0.86  0.75
Principal: Stability from year to year  0.45  0.37  0.70  0.49
Principal: Teach or model PBS expectations  0.45  0.44  0.81  0.67
Principal: Reward students for meeting PBS expectations  0.46  0.36  0.70  0.49
Principal: Follow discipline procedures consistently  0.44  0.51  0.68  0.52
Principal: Allow PBS team to train staff in PBS  0.47  0.47  0.58  0.40
Principal: Allow PBS team to train students in PBS  0.55  0.55  0.63  0.49

Note. These values represent item-to-factor correlations for each factor. N = 211.


Appendix T: Promax Rotation of Three-Factor Solution for SWIF Items: Pattern Coefficients

Item  F1  F2  F3  h²
A reward system that works  0.49  0.00  0.16  0.368
Data entered regularly  0.21  -0.22  0.62  0.462
Data reviewed  0.56  -0.24  0.33  0.48
Data used to make decisions  0.57  -0.11  0.26  0.501
Data shared regularly  0.59  -0.10  0.24  0.512
Staff: Amount of time available for PBS implementation  0.61  0.05  0.15  0.55
Staff: Philosophy towards discipline/behavior  0.69  0.09  -0.04  0.524
Staff: Belief about the effectiveness of PBS  0.69  0.03  0.02  0.518
Staff: Input about PBS  0.76  -0.04  -0.01  0.525
Staff: Stability year to year  0.49  0.11  -0.02  0.301
Staff: Teaching expectations  0.61  0.15  -0.06  0.452
Staff: Rewarding students for meeting expectations  0.73  0.06  -0.09  0.506
Staff: Following discipline procedures  0.54  0.18  -0.01  0.429
Team: Representative of the school staff  0.34  0.13  0.11  0.251
Team: Cohesive  0.47  0.21  -0.06  0.331
Team: Committed  0.54  0.14  -0.05  0.365
Team: Able to meet regularly  0.42  0.25  0.05  0.396
Team: Available for PBS-related activities and events  0.74  0.10  -0.08  0.562
Team: Shares/publicizes outcomes that demonstrate success  0.82  -0.03  0.01  0.655
Team: Recognizes/rewards faculty for participation  0.63  0.10  -0.11  0.404
Team: Integrates PBS into school initiatives  0.70  0.03  0.05  0.563
Coach: Availability for PBS implementation  0.53  -0.16  0.25  0.403
Coach: Guidance with process  0.45  -0.07  0.30  0.419
Coach: Stability of position  0.27  -0.11  0.43  0.341
Student: Response to rewards and activities  0.61  -0.03  0.04  0.389
Student: Input about PBS  0.71  0.09  -0.21  0.425
Student: Stability year to year  0.58  0.03  -0.01  0.35
District personnel  0.25  0.11  0.12  0.163
Other PBS teams  0.35  0.24  -0.12  0.204
Superintendent  0.40  0.00  0.04  0.18
Parents  0.60  0.00  0.04  0.396
Community agencies  0.52  0.00  0.03  0.285
Staff PBS training by school PBS team  0.66  -0.02  -0.03  0.392
Student training in PBS  0.68  0.00  0.00  0.466
Adequate funding for PBS  0.46  -0.14  0.04  0.175
PBS procedures in a handbook  0.54  -0.02  -0.04  0.254


Appendix T: (Continued)

Item  F1  F2  F3  h²
Expectations and rules that are clearly defined  0.06  0.40  0.28  0.39
A discipline referral process that works  0.07  0.31  0.32  0.35
Consequences for problem behavior that are consistent and effective  0.14  0.39  0.18  0.364
AP: Personal commitment to PBS  -0.04  0.89  -0.01  0.758
AP: Amount of time he/she is involved with PBS implementation  -0.06  0.94  -0.06  0.786
AP: Availability to attend PBS meetings  0.00  0.86  -0.13  0.658
AP: Providing input about PBS implementation  0.01  0.91  -0.07  0.779
AP: Stability from year to year  0.03  0.45  0.17  0.327
AP: Teach or model PBS expectations  0.01  0.87  -0.03  0.752
AP: Reward students for meeting PBS expectations  0.10  0.77  -0.07  0.645
AP: Follow discipline procedures consistently  0.05  0.73  0.06  0.624
AP: Allow PBS team to train staff in PBS  -0.01  0.73  0.10  0.598
AP: Allow PBS team to train students in PBS  0.03  0.73  0.12  0.643
A School Improvement Plan (SIP) that includes PBS  0.15  0.09  0.49  0.415
Principal: Personal commitment  -0.11  -0.02  0.96  0.785
Principal: Amount of time he/she is involved with PBS implementation  -0.07  -0.05  0.94  0.758
Principal: Availability to attend PBS meetings  -0.04  -0.05  0.85  0.659
Principal: Input about PBS implementation  0.05  -0.08  0.87  0.752
Principal: Stability from year to year  -0.03  0.08  0.68  0.494
Principal: Teach or model PBS expectations  -0.17  0.15  0.85  0.671
Principal: Reward students for meeting PBS expectations  0.01  0.06  0.66  0.489
Principal: Follow discipline procedures consistently  -0.12  0.30  0.62  0.523
Principal: Allow PBS team to train staff in PBS  0.05  0.25  0.44  0.396
Principal: Allow PBS team to train students in PBS  0.11  0.30  0.43  0.489

Note. These values represent standardized regression coefficients for each factor. N = 211.
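Because promax is an oblique rotation, the structure coefficients in Appendix S and the pattern coefficients above are linked through the factor intercorrelation matrix: structure = pattern × Φ. The sketch below illustrates that relationship in Python for two example items; the Φ values are illustrative placeholders, since the study's actual factor correlations are not reported in these tables.

    # Illustration (with assumed numbers) of how structure coefficients
    # (Appendix S) follow from pattern coefficients (Appendix T) under an
    # oblique rotation: S = P @ Phi.
    import numpy as np

    # Pattern loadings for two example items (rows) on the three factors.
    pattern = np.array([
        [0.49, 0.00, 0.16],    # "A reward system that works"
        [-0.11, -0.02, 0.96],  # "Principal: Personal commitment"
    ])

    # Hypothetical factor intercorrelation matrix (not from the study).
    phi = np.array([
        [1.00, 0.55, 0.60],
        [0.55, 1.00, 0.45],
        [0.60, 0.45, 1.00],
    ])

    structure = pattern @ phi  # item-to-factor correlations
    print(structure.round(2))

With these placeholder correlations, the computed rows come out close to the corresponding Appendix S entries, which is the expected behavior when the factors are substantially correlated.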


Appendix U: SWIF Item Correlations with Factor (rFx) and Alpha if Deleted from Factor

Item  rF1  Alpha if item deleted
A reward system that works  0.56  0.95
Data entered regularly  0.49  0.95
Data reviewed  0.62  0.95
Data used to make decisions  0.66  0.95
Data shared regularly  0.67  0.95
Staff: Amount of time available for PBS implementation  0.72  0.95
Staff: Philosophy towards discipline/behavior  0.69  0.95
Staff: Belief about the effectiveness of PBS  0.69  0.95
Staff: Input about PBS  0.68  0.95
Staff: Stability year to year  0.50  0.95
Staff: Teaching expectations  0.62  0.95
Staff: Rewarding students for meeting expectations  0.67  0.95
Staff: Following discipline procedures  0.61  0.95
Team: Representative of the school staff  0.46  0.95
Team: Cohesive  0.52  0.95
Team: Committed  0.56  0.95
Team: Able to meet regularly  0.57  0.95
Team: Available for PBS-related activities and events  0.71  0.95
Team: Shares/publicizes outcomes that demonstrate success  0.78  0.95
Team: Recognizes/rewards faculty for participation  0.58  0.95
Team: Integrates PBS into school initiatives  0.72  0.95
Coach: Availability for PBS implementation  0.59  0.95
Coach: Guidance with process  0.61  0.95
Coach: Stability of position  0.49  0.95
Student: Response to rewards and activities  0.58  0.95
Student: Input about PBS  0.58  0.95
Student: Stability year to year  0.55  0.95
District personnel  0.38  0.95
Other PBS teams  0.39  0.95
Superintendent  0.41  0.95
Parents  0.61  0.95
Community agencies  0.51  0.95
Staff PBS training by school PBS team  0.59  0.95
Student training in PBS  0.65  0.95
Adequate funding for PBS  0.37  0.95
PBS procedures in a handbook  0.46  0.95


Appendix U: (Continued)

Item  rF2  Alpha if item deleted
Expectations and rules that are clearly defined  0.56  0.93
A discipline referral process that works  0.53  0.93
Consequences for problem behavior that are consistent and effective  0.57  0.93
AP: Personal commitment to PBS  0.84  0.92
AP: Amount of time he/she is involved with PBS implementation  0.83  0.92
AP: Availability to attend PBS meetings  0.72  0.93
AP: Providing input about PBS implementation  0.84  0.92
AP: Stability from year to year  0.49  0.93
AP: Teach or model PBS expectations  0.81  0.92
AP: Reward students for meeting PBS expectations  0.75  0.92
AP: Follow discipline procedures consistently  0.75  0.92
AP: Allow PBS team to train staff in PBS  0.70  0.93
AP: Allow PBS team to train students in PBS  0.74  0.92

Item  rF3  Alpha if item deleted
A School Improvement Plan (SIP) that includes PBS  0.55  0.93
Principal: Personal commitment  0.84  0.92
Principal: Amount of time he/she is involved with PBS implementation  0.82  0.92
Principal: Availability to attend PBS meetings  0.77  0.92
Principal: Input about PBS implementation  0.82  0.92
Principal: Stability from year to year  0.66  0.92
Principal: Teach or model PBS expectations  0.76  0.92
Principal: Reward students for meeting PBS expectations  0.65  0.92
Principal: Follow discipline procedures consistently  0.66  0.92
Principal: Allow PBS team to train staff in PBS  0.60  0.93
Principal: Allow PBS team to train students in PBS  0.65  0.92
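The "alpha if item deleted" statistics above can be reproduced from raw item scores with a few lines of code. The sketch below is a generic implementation of Cronbach's alpha and its leave-one-item-out variant, using simulated ratings as stand-in data; it is not the study's analysis code.

    # Generic sketch of Cronbach's alpha and "alpha if item deleted"
    # (Appendix U), using simulated 1-5 ratings as stand-in data.
    import numpy as np

    def cronbach_alpha(items):
        """alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def alpha_if_deleted(items):
        """Recompute alpha with each item removed in turn."""
        k = items.shape[1]
        return np.array(
            [cronbach_alpha(np.delete(items, i, axis=1)) for i in range(k)]
        )

    # Simulated example: 211 respondents x 13 items (the size of the
    # Assistant Principal factor).
    rng = np.random.default_rng(0)
    scores = rng.integers(1, 6, size=(211, 13)).astype(float)
    print(round(cronbach_alpha(scores), 2))
    print(alpha_if_deleted(scores).round(2))

A high alpha that barely moves when any single item is removed, as in the tables above, indicates that no one item is carrying or degrading the factor's internal consistency.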


Appendix V: SWIF Item Analysis by High, Middle, and Low Implementing Schools

Item  Low M (SD)  Middle M (SD)  High M (SD)
Expectations and rules that are clearly defined  3.50 (1.41)  4.70 (0.88)  5.00 (0.00)
A reward system that works  3.25 (1.28)  4.43 (1.04)  5.00 (0.00)
A discipline referral process that works  2.75 (1.49)  4.35 (0.88)  4.60 (0.89)
Consequences for problem behavior that are consistent and effective  2.50 (1.69)  4.13 (0.97)  4.60 (0.89)
A School Improvement Plan (SIP) that includes PBS  2.38 (1.19)  4.35 (0.93)  5.00 (0.00)
Data: Entered regularly  3.38 (1.60)  4.52 (1.16)  4.60 (0.89)
Data: Reviewed  3.13 (1.64)  4.00 (1.48)  5.00 (0.00)
Data: Used to make decisions  2.88 (1.64)  3.91 (1.44)  5.00 (0.00)
Data: Shared regularly  2.75 (1.75)  3.35 (1.56)  5.00 (0.00)
Principal: Personal commitment  3.50 (1.60)  4.52 (0.90)  5.00 (0.00)
Principal: Amount of time he/she is involved with PBS implementation  2.88 (1.81)  4.09 (1.31)  5.00 (0.00)
Principal: Availability to attend PBS meetings  3.13 (1.81)  3.83 (1.53)  5.00 (0.00)
Principal: Input about PBS implementation  3.00 (1.69)  4.00 (1.38)  5.00 (0.00)
Principal: Stability from year to year (e.g., continuity of person in principal position)  2.88 (1.36)  4.35 (1.27)  5.00 (0.00)
Principal: Teach or model PBS expectations  2.88 (1.55)  4.52 (0.90)  5.00 (0.00)
Principal: Reward students for meeting PBS expectations  2.75 (1.67)  4.74 (0.69)  5.00 (0.00)
Principal: Follow discipline procedures consistently  2.50 (1.31)  4.43 (1.04)  5.00 (0.00)
Principal: Allow PBS team to train staff in PBS  3.88 (1.55)  4.57 (0.79)  5.00 (0.00)
Principal: Allow PBS team to train students in PBS  2.88 (1.25)  4.30 (1.06)  5.00 (0.00)
AP: Personal commitment to PBS  4.50 (1.07)  4.57 (0.90)  5.00 (0.00)
AP: Amount of time he/she is involved with PBS implementation  4.25 (0.71)  4.48 (0.90)  5.00 (0.00)
AP: Availability to attend PBS meetings  3.63 (1.41)  4.52 (0.90)  5.00 (0.00)
AP: Providing input about PBS implementation  3.63 (1.41)  4.48 (0.95)  5.00 (0.00)
AP: Stability from year to year (e.g., continuity of person in position)  3.13 (1.46)  4.00 (1.48)  5.00 (0.00)
AP: Teach or model PBS expectations  4.13 (0.99)  4.65 (0.88)  5.00 (0.00)
AP: Reward students for meeting PBS expectations  3.75 (1.16)  4.78 (0.67)  5.00 (0.00)
AP: Follow discipline procedures consistently  3.88 (1.25)  4.35 (1.27)  5.00 (0.00)
AP: Allow PBS team to train staff in PBS  4.13 (1.13)  4.65 (0.78)  5.00 (0.00)
AP: Allow PBS team to train students in PBS  3.50 (1.07)  4.65 (0.78)  5.00 (0.00)


Appendix V: (Continued)

Item  Low M (SD)  Middle M (SD)  High M (SD)
Staff: Amount of time available for PBS implementation  2.25 (1.16)  3.48 (1.38)  5.00 (0.00)
Staff: Philosophy towards discipline/behavior  2.25 (1.49)  3.43 (1.53)  5.00 (0.00)
Staff: Belief about the effectiveness of PBS  2.38 (1.51)  3.87 (1.36)  4.80 (0.45)
Staff: Input about PBS (e.g., surveys/informal discussions)  2.50 (1.41)  4.00 (1.41)  4.80 (0.45)
Staff: Stability year to year (i.e., teacher population)  2.38 (1.19)  3.39 (1.62)  4.60 (0.55)
Staff: Teaching expectations  2.13 (1.55)  3.57 (1.38)  5.00 (0.00)
Staff: Rewarding students for meeting expectations  2.50 (1.31)  3.78 (1.20)  5.00 (0.00)
Staff: Following discipline procedures  2.00 (1.07)  3.61 (1.27)  4.80 (0.45)
Team: Representative of the school staff  4.25 (1.04)  4.65 (0.88)  5.00 (0.00)
Team: Cohesive  3.63 (1.41)  4.04 (1.30)  5.00 (0.00)
Team: Committed  3.25 (1.39)  4.22 (1.13)  5.00 (0.00)
Team: Able to meet regularly  3.13 (1.81)  4.13 (1.42)  4.80 (0.45)
Team: Available for PBS-related activities and events (e.g., time to plan, time to participate)  2.63 (1.51)  4.00 (1.17)  5.00 (0.00)
Team: Shares/publicizes outcomes that demonstrate success (e.g., decrease in referrals)  2.25 (1.49)  3.57 (1.44)  5.00 (0.00)
Team: Recognizes/rewards faculty for participation  2.13 (1.55)  3.65 (1.30)  4.00 (0.71)
Team: Integrates PBS into school initiatives  2.38 (1.69)  3.78 (1.48)  5.00 (0.00)
Coach: Availability for PBS implementation (e.g., time)  2.25 (1.16)  4.65 (0.71)  4.40 (1.34)
Coach: Guidance with process  3.38 (1.19)  4.61 (0.72)  4.40 (1.34)
Coach: Stability of position (e.g., same person in position of coach)  3.13 (1.55)  5.00 (0.00)  5.00 (0.00)
Student: Response to rewards and activities  3.13 (1.25)  4.52 (0.73)  5.00 (0.00)
Student: Input about PBS (e.g., surveys/informal discussions)  2.25 (0.89)  3.30 (1.18)  5.00 (0.00)
Student: Stability year to year (i.e., student population)  2.25 (0.89)  3.48 (1.41)  5.00 (0.00)
District personnel  4.13 (0.64)  4.43 (0.95)  5.00 (0.00)
Other PBS teams  3.75 (0.89)  4.13 (0.97)  4.80 (0.45)
Superintendent  3.50 (0.76)  4.04 (0.98)  4.60 (0.55)
Parents  2.88 (0.35)  3.30 (1.15)  4.80 (0.45)
Community agencies  3.00 (0.00)  3.52 (0.85)  4.60 (0.55)
Staff PBS training by school PBS team  3.00 (1.69)  3.52 (1.50)  5.00 (0.00)
Student training in PBS  2.50 (1.31)  2.78 (1.38)  4.80 (0.45)


Appendix V: (Continued)

Item  Low M (SD)  Middle M (SD)  High M (SD)
Adequate funding for PBS  3.00 (1.41)  2.87 (1.55)  3.80 (1.64)
PBS procedures in a handbook  3.13 (1.36)  3.70 (1.43)  4.20 (1.30)
Factor 1  102.63 (26.81)  139.26 (25.85)  172.80 (5.50)
Factor 2  47.25 (10.12)  58.30 (8.82)  64.20 (1.79)
Factor 3  32.63 (11.71)  47.70 (7.21)  55.00 (0.00)
Total Score  182.50 (34.90)  245.26 (36.13)  292.00 (6.08)


Appendix W: SWIF Item Rankings by Mean Score

Category  Item  M  SD
Element  Expectations and rules that are clearly defined  4.63  0.90
Team  Representative of the school staff  4.60  0.90
AP*  Allow PBS team to train staff in PBS  4.47  0.98
AP*  Personal commitment to PBS  4.47  1.04
AP*  Reward students for meeting PBS expectations  4.47  1.00
Principal  Allow PBS team to train staff in PBS  4.44  1.07
AP*  Teach or model PBS expectations  4.44  1.04
Team  Cohesive  4.39  1.06
Coach  Stability of position  4.35  1.14
Principal  Reward students for meeting PBS expectations  4.34  1.18
Principal  Teach or model PBS expectations  4.33  1.18
Principal  Personal commitment  4.31  1.22
AP*  Amount of time he/she is involved with PBS implementation  4.31  1.16
AP*  Allow PBS team to train students in PBS  4.31  1.09
Team  Committed  4.31  1.13
Resource  A School Improvement Plan (SIP) that includes PBS  4.30  1.14
AP*  Providing input about PBS implementation  4.29  1.17
Student  Student response to rewards and activities  4.28  1.08
Principal  Allow PBS team to train students in PBS  4.28  1.15
Coach  Guidance with process  4.27  1.12
AP*  Follow discipline procedures consistently  4.26  1.25
AP*  Availability to attend PBS meetings  4.23  1.24
Element  A reward system that works  4.22  1.20
Team  Able to meet regularly  4.21  1.27
Data  Entered regularly  4.20  1.33
Resource  District personnel  4.18  1.06
Team  Integrates PBS into school initiatives  4.14  1.26
Coach  Availability for PBS implementation  4.12  1.27
Principal  Follow discipline procedures consistently  4.11  1.36
Principal  Input about PBS implementation  4.11  1.34
Principal  Stability from year to year  4.10  1.35
AP*  Stability from year to year  4.09  1.30
Data  Reviewed  4.08  1.34
Element  A discipline referral process that works  4.03  1.29
Principal  Amount of time he/she is involved with PBS implementation  4.01  1.36
Data  Used for decisions  3.95  1.34
Staff  Staff PBS training by school PBS team  3.89  1.30
Resource  Other PBS teams  3.87  1.11
Principal  Availability to attend PBS meetings  3.87  1.42
Team  Available for PBS-related activities and events  3.85  1.41
Element  Consequences for problem behavior that are consistent and effective  3.83  1.40
Team  Shares/publicizes outcomes that demonstrate success  3.82  1.42
Resource  PBS procedures in a handbook  3.80  1.41
Staff  Input about PBS  3.72  1.30
Staff  Stability year to year  3.62  1.36


Appendix W: (Continued)

Category  Item  M  SD
Student  Stability year to year  3.57  1.33
Data  Shared regularly  3.56  1.53
Resource  Superintendent  3.53  1.05
Staff  Teaching expectations  3.51  1.42
Staff  Amount of time available for PBS implementation  3.51  1.38
Resource  Community agencies  3.51  1.02
Staff  Rewarding students for meeting expectations  3.48  1.41
Resource  Parents  3.48  1.06
Student  Training in PBS  3.48  1.40
Team  Recognizes/rewards faculty for participation  3.48  1.49
Staff  Philosophy towards discipline/behavior  3.47  1.44
Staff  Belief about the effectiveness of PBS  3.44  1.42
Staff  Following discipline procedures  3.40  1.41
Student  Input about PBS  3.36  1.34
Resource  Adequate funding for PBS  3.26  1.57

Note. N = 236. *N = 211. Item scores range from 1-5.
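The ranking above is a straightforward item analysis: compute each item's mean and standard deviation across respondents, then sort by the mean. A minimal sketch of that computation follows, assuming the responses live in a CSV with one hypothetical column per SWIF item:

    # Minimal sketch of the item ranking in Appendix W. The file and
    # column names are hypothetical.
    import pandas as pd

    responses = pd.read_csv("swif_responses.csv")  # respondents x items, 1-5

    rankings = (
        responses.agg(["mean", "std"])  # M and SD for each item
        .T
        .sort_values("mean", ascending=False)
    )
    print(rankings.head(3))  # most helpful items
    print(rankings.tail(3))  # most problematic items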


Appendix X: SWIF Item Response Frequencies for All Respondents

[Stacked bar chart: for each SWIF item, the percentage of respondents (0%-100%) selecting each response option, 1 through 5. Items shown on this panel: P: Student training; Team: Committed; AP: Time involved; AP: Input; SIP; AP: Student training; AP: Follows discipline; P: Teach; Team: Cohesive; Data entered; P: Rewards students; Coach: Stability; P: Commitment; AP: Teach; AP: Rewards students; P: Staff training; AP: Staff training; AP: Commitment; Team: Representative; Expectations.]


Appendix X: (Continued)

[Stacked bar chart, continued: percentage of respondents who selected each response option (1-5). Items shown on this panel: Consequences; Team: Shares outcomes; Team: Available; P: Availability; Data for decisions; District personnel; Referral process; Team: Integrates PBS; P: Time involved; Coach: Availability; Student: Response; Coach: Guidance; AP: Stability; Data reviewed; Reward system; P: Input; P: Follows discipline; P: Stability; AP: Availability; Team: Meets.]


Appendix X: (Continued)

[Stacked bar chart, continued: percentage of respondents who selected each response option (1-5). Items shown on this panel: Community agencies; Parents; Superintendent; Student: Input; Staff: Follow discipline; Staff: Belief; Staff: Time; Student training; Funding; Student: Stability; Staff: Reward; Staff: Teach; Staff: Philosophy; Staff: Input; Team: Recognizes staff; Staff: Stability; Other PBS teams; Data shared; Staff training; PBS handbook.]


Appendix Y: SWIF Item Means and Standard Deviations by Category

Item  Coach (n=47)  Team Member (n=144)  State Project (n=5)  District (n=7)  Other (n=4)  [each cell M (SD)]
Expectations and rules that are clearly defined  4.47 (1.10)  4.67 (0.85)  4.20 (1.79)  5.00 (0.00)  4.50 (0.58)
A reward system that works  4.13 (1.28)  4.23 (1.17)  4.20 (1.79)  4.86 (0.38)  4.75 (0.50)
A discipline referral process that works  4.02 (1.29)  3.99 (1.32)  3.80 (1.64)  4.43 (1.13)  5.00 (0.00)
Consequences for problem behavior that are consistent and effective  3.79 (1.38)  3.83 (1.40)  3.40 (1.82)  4.29 (1.50)  4.25 (0.50)
A School Improvement Plan (SIP) that includes PBS  4.04 (1.32)  4.40 (1.02)  4.20 (1.79)  4.14 (1.46)  4.75 (0.50)
Data: Entered regularly  4.36 (1.28)  4.31 (1.24)  4.20 (1.79)  3.86 (1.68)  4.75 (0.50)
Data: Reviewed  3.96 (1.49)  4.19 (1.21)  4.20 (1.79)  4.43 (1.51)  5.00 (0.00)
Data: Used for decisions  3.81 (1.51)  4.04 (1.24)  3.60 (1.95)  4.43 (1.51)  5.00 (0.00)
Data: Shared regularly  3.55 (1.59)  3.65 (1.48)  3.80 (1.64)  3.86 (1.68)  3.75 (1.26)
Principal: Personal commitment  4.36 (1.13)  4.32 (1.20)  4.20 (1.79)  4.00 (1.73)  5.00 (0.00)
Principal: Amount of time he/she is involved with PBS implementation  3.94 (1.47)  4.00 (1.35)  4.00 (1.73)  3.86 (1.68)  4.75 (0.50)
Principal: Availability to attend PBS meetings  3.79 (1.55)  3.89 (1.38)  3.60 (1.95)  4.43 (1.51)  4.50 (0.58)
Principal: Input about PBS implementation  3.96 (1.47)  4.10 (1.32)  4.00 (1.73)  4.43 (1.51)  4.75 (0.50)
Principal: Stability from year to year  3.98 (1.48)  4.20 (1.25)  4.20 (1.79)  3.57 (1.81)  5.00 (0.00)
Principal: Teach or model PBS expectations  4.30 (1.23)  4.33 (1.18)  4.00 (1.73)  4.29 (1.50)  4.75 (0.50)
Principal: Reward students for meeting PBS expectations  4.40 (1.19)  4.30 (1.18)  4.20 (1.79)  4.71 (0.49)  5.00 (0.00)
Principal: Follow discipline procedures consistently  4.15 (1.30)  4.08 (1.40)  4.00 (1.73)  4.43 (1.13)  4.00 (1.41)
Principal: Allow PBS team to train staff in PBS  4.38 (1.15)  4.50 (1.00)  4.00 (1.73)  4.86 (0.38)  4.25 (0.96)
Principal: Allow PBS team to train students in PBS  4.06 (1.29)  4.37 (1.05)  4.00 (1.73)  4.57 (0.79)  4.00 (0.82)
AP: Personal commitment to PBS  4.51 (1.00)  4.55 (0.94)  4.00 (1.73)  4.14 (1.46)  4.25 (0.96)
AP: Amount of time he/she is involved with PBS implementation  4.40 (0.95)  4.36 (1.18)  3.80 (1.64)  4.00 (1.41)  4.00 (1.15)
AP: Availability to attend PBS meetings  4.30 (1.10)  4.30 (1.21)  3.40 (1.82)  3.57 (1.81)  4.00 (1.15)
AP: Providing input about PBS implementation  4.30 (1.12)  4.36 (1.14)  3.40 (1.52)  4.00 (1.41)  4.00 (1.15)
AP: Stability from year to year  3.91 (1.44)  4.19 (1.23)  3.80 (1.64)  3.71 (1.70)  3.50 (1.73)
AP: Teach or model PBS expectations  4.55 (0.90)  4.47 (1.01)  3.80 (1.64)  4.29 (1.50)  4.25 (0.96)
AP: Reward students for meeting PBS expectations  4.51 (0.91)  4.50 (1.00)  4.00 (1.73)  4.71 (0.49)  4.25 (0.96)
AP: Follow discipline procedures consistently  4.36 (1.15)  4.29 (1.23)  3.60 (1.52)  4.14 (1.46)  4.50 (0.58)


Appendix Y: (Continued)

Item  Coach (n=47)  Team Member (n=144)  State Project (n=5)  District (n=7)  Other (n=4)  [each cell M (SD)]
AP: Allow PBS team to train staff in PBS  4.55 (0.88)  4.52 (0.94)  3.80 (1.79)  4.57 (0.79)  4.00 (1.15)
AP: Allow PBS team to train students in PBS  4.45 (0.95)  4.33 (1.09)  3.80 (1.79)  4.29 (0.95)  3.75 (0.96)
Staff: Amount of time available for PBS implementation  3.34 (1.39)  3.56 (1.41)  3.20 (1.64)  3.86 (1.35)  2.75 (0.96)
Staff: Philosophy towards discipline/behavior  3.19 (1.57)  3.56 (1.40)  3.00 (1.41)  3.43 (1.99)  2.25 (0.50)
Staff: Belief about the effectiveness of PBS  3.49 (1.44)  3.42 (1.42)  3.00 (1.41)  3.57 (1.81)  2.25 (0.50)
Staff: Input about PBS  3.64 (1.45)  3.67 (1.28)  3.40 (1.34)  4.29 (1.11)  4.25 (0.96)
Staff: Stability year to year  3.32 (1.53)  3.78 (1.22)  3.80 (1.64)  3.00 (1.91)  3.50 (1.29)
Staff: Teaching expectations  3.28 (1.56)  3.57 (1.39)  2.60 (1.34)  3.29 (1.89)  3.25 (1.50)
Staff: Rewarding students for meeting expectations  3.40 (1.39)  3.46 (1.43)  3.20 (1.64)  3.57 (1.81)  3.00 (1.15)
Staff: Following discipline procedures  3.28 (1.42)  3.47 (1.38)  3.00 (1.41)  3.43 (1.99)  2.75 (1.50)
Team: Representative of the school staff  4.60 (0.88)  4.60 (0.92)  4.20 (1.79)  4.57 (1.13)  4.50 (1.00)
Team: Cohesive  4.21 (1.21)  4.43 (1.04)  4.00 (1.73)  4.43 (1.13)  4.75 (0.50)
Team: Committed  4.21 (1.16)  4.33 (1.15)  3.80 (1.64)  4.43 (1.13)  4.75 (0.50)
Team: Able to meet regularly  4.06 (1.37)  4.23 (1.28)  4.20 (1.79)  4.43 (1.13)  4.50 (0.58)
Team: Available for PBS-related activities and events  3.62 (1.39)  3.86 (1.45)  4.00 (1.73)  4.00 (1.41)  4.50 (0.58)
Team: Shares/publicizes outcomes that demonstrate success  3.34 (1.51)  4.01 (1.36)  3.00 (1.87)  4.00 (1.73)  3.75 (1.26)
Team: Recognizes/rewards faculty for participation  3.26 (1.50)  3.58 (1.49)  2.60 (1.34)  4.00 (1.41)  4.00 (1.41)
Team: Integrates PBS into school initiatives  3.64 (1.54)  4.32 (1.09)  3.40 (1.82)  4.29 (1.50)  4.00 (1.41)
Coach: Availability for PBS implementation  4.09 (1.33)  4.19 (1.19)  3.40 (2.19)  4.14 (1.46)  4.50 (1.00)
Coach: Guidance with process  4.38 (0.95)  4.27 (1.14)  3.40 (1.82)  4.29 (1.50)  4.50 (1.00)
Coach: Stability of position  4.66 (0.94)  4.32 (1.13)  2.80 (2.05)  4.14 (1.46)  4.25 (0.96)
Student: Response to rewards and activities  4.23 (1.03)  4.33 (1.06)  4.20 (1.79)  4.29 (1.11)  4.25 (0.50)
Student: Input about PBS  3.28 (1.26)  3.39 (1.37)  2.60 (1.34)  3.71 (1.38)  3.00 (1.15)
Student: Stability year to year  3.38 (1.42)  3.74 (1.23)  3.40 (1.82)  3.14 (1.57)  3.75 (1.26)
District personnel  4.57 (0.77)  4.08 (1.06)  3.40 (2.19)  4.43 (1.51)  3.50 (1.00)
Other PBS teams  4.23 (0.91)  3.76 (1.17)  3.00 (2.00)  4.29 (0.49)  3.75 (0.96)
Superintendent  3.98 (0.94)  3.40 (1.04)  2.40 (1.34)  3.29 (1.38)  3.50 (1.00)


Appendix Y: (Continued)

Item  Coach (n=47)  Team Member (n=144)  State Project (n=5)  District (n=7)  Other (n=4)  [each cell M (SD)]
Parents  3.38 (1.05)  3.56 (1.05)  3.00 (1.58)  3.14 (1.21)  3.50 (1.00)
Community agencies  3.47 (0.86)  3.55 (1.06)  2.40 (1.34)  3.29 (1.11)  3.50 (1.00)
Staff PBS training by school PBS team  3.60 (1.56)  3.97 (1.21)  3.40 (1.82)  4.00 (1.00)  3.25 (0.96)
Student training in PBS  2.96 (1.44)  3.64 (1.35)  3.60 (1.52)  3.00 (1.63)  3.00 (1.63)
Adequate funding for PBS  2.98 (1.48)  3.33 (1.59)  2.60 (2.19)  3.86 (1.68)  3.25 (1.71)
PBS procedures in a handbook  3.60 (1.44)  3.80 (1.40)  3.80 (1.64)  3.00 (1.63)  4.25 (0.96)


Appendix Z: Means and Standard Deviations for Overall Score and Subscales by Categories

Category / Choices  N  Staff, Students, and Resources  Assistant Principal  Principal  Overall Score
Type of school
   Elementary  80  148.29 (24.02)  59.28 (7.82)  48.13 (8.84)  255.69 (35.90)
   Middle  63  136.02 (31.53)  56.62 (11.38)  47.02 (10.92)  239.65 (49.17)
   High  23  125.74 (24.80)  51.52 (11.80)  40.09 (13.09)  217.34 (41.67)
   Center/Primary-Intermediate  24  125.54 (21.03)  50.33 (13.56)  45.38 (9.75)  221.25 (36.39)
   District  17  139.88 (31.88)  52.88 (11.23)  44.71 (10.82)  228.47 (46.39)
Position with PBS
   Coach  47  134.47 (29.41)  56.13 (10.82)  45.36 (11.08)  235.96 (44.76)
   PBS Team Member  144  139.60 (26.80)  56.36 (10.51)  46.50 (10.07)  242.45 (41.70)
   State Project  5  121.80 (49.10)  48.80 (20.47)  44.40 (19.07)  215.00 (87.90)
   Other  4  137.75 (5.85)  54.25 (9.00)  50.75 (2.63)  242.75 (8.02)
   District Personnel  7  140.00 (39.56)  55.14 (13.12)  47.29 (12.00)  242.43 (62.95)
Position in school
   Principal  11  145.00 (22.59)  61.82 (5.96)  53.36 (1.80)  260.18 (26.67)
   Assistant Principal/Dean  18  146.50 (20.41)  62.61 (3.94)  48.28 (6.86)  257.39 (26.52)
   General education teacher  50  138.34 (31.32)  55.86 (10.31)  46.04 (11.27)  240.24 (47.62)
   Special education teacher  29  133.62 (23.30)  53.93 (11.47)  46.28 (8.13)  233.83 (39.20)
   Special area teacher  8  142.63 (19.15)  56.50 (8.70)  43.50 (11.94)  242.63 (34.26)
   School Psychologist  10  139.80 (22.83)  51.30 (13.46)  43.80 (9.43)  234.90 (41.10)
   Behavior Analyst  23  132.48 (30.30)  56.70 (11.75)  47.96 (11.05)  237.13 (45.66)
   School Counselor  9  135.56 (30.96)  52.00 (11.42)  46.00 (12.22)  233.56 (49.25)
   Teaching Assistant  2  152.00 (33.94)  59.50 (7.78)  42.50 (17.68)  254.00 (59.40)
   Office staff  3  170.33 (6.11)  64.00 (1.73)  54.33 (0.57)  288.67 (7.23)
   Other (e.g., transportation)  20  137.60 (32.36)  53.45 (13.98)  45.75 (11.44)  236.80 (52.54)
   District Personnel  23  128.22 (31.55)  54.35 (12.04)  41.35 (12.63)  223.91 (49.43)
   School social worker  2  166.50 (7.78)  63.50 (0.71)  55.00 (0.00)  285.00 (8.49)
Highest degree
   High school/Some college  5  167.00 (11.73)  60.80 (7.82)  50.60 (8.73)  278.40 (27.72)
   Associates Degree  3  145.67 (16.26)  57.67 (4.73)  40.33 (11.06)  243.67 (31.50)
   Bachelors Degree  69  136.10 (28.30)  54.50 (11.36)  45.42 (10.34)  236.03 (45.43)
   Masters Degree  103  137.58 (28.39)  56.85 (10.59)  46.97 (10.31)  241.41 (43.01)
   Specialist Degree  19  135.74 (29.65)  54.42 (12.81)  45.16 (10.84)  235.32 (49.34)
   Doctoral Degree  8  143.63 (27.70)  59.13 (8.56)  47.38 (10.89)  250.13 (43.12)


Appendix Z: (Continued)

Category | Choices | N | Staff, Students, and Resources M (SD) | Assistant Principal M (SD) | Principal M (SD) | Overall Score M (SD)
Years implementing PBS | One | 119 | 138.43 (28.09) | 55.87 (11.42) | 46.37 (10.65) | 240.66 (44.67)
Years implementing PBS | Two | 64 | 139.05 (26.39) | 56.52 (9.12) | 45.89 (10.10) | 241.45 (39.45)
Years implementing PBS | Three | 23 | 131.22 (34.02) | 55.26 (12.94) | 47.04 (11.32) | 233.52 (54.11)
Years implementing PBS | Four | 2 | 155.00 (24.04) | 64.50 (0.70) | 50.00 (5.65) | 269.50 (19.09)
Years with school | One | 23 | 129.65 (24.62) | 52.87 (14.92) | 45.57 (10.83) | 228.09 (44.33)
Years with school | Two | 41 | 149.49 (22.81) | 60.42 (7.10) | 48.37 (9.56) | 258.27 (35.37)
Years with school | Three | 31 | 131.00 (27.22) | 54.23 (10.97) | 42.90 (10.97) | 228.13 (43.91)
Years with school | Four | 26 | 146.73 (24.82) | 57.31 (9.84) | 49.23 (7.27) | 253.27 (34.18)
Years with school | Five or more | 86 | 134.59 (30.80) | 55.08 (11.16) | 45.83 (11.21) | 235.51 (47.64)

Note. There are 36 items and 180 possible points for Staff, Students, and Resources; 13 items and 65 possible points for Assistant Principal; 11 items and 55 possible points for Principal; and 60 items and 300 possible points for the Overall Score.
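The note's arithmetic implies a maximum of 5 points per item (36 x 5 = 180, 13 x 5 = 65, 11 x 5 = 55, 60 x 5 = 300). The following is a minimal sketch (not the author's code) of that scoring scheme, assuming each of the 60 SWIF items is rated on a 1-5 scale; the assignment of specific item indices to subscales is hypothetical.

    # Hypothetical scoring sketch for the SWIF subscales described in the note.
    # Assumes 60 items rated 1-5; the index ranges below are illustrative only.

    def score_swif(responses):
        """responses: list of 60 item ratings, each an integer from 1 to 5."""
        subscales = {
            "Staff, Students, and Resources": range(0, 36),  # 36 items, max 180
            "Assistant Principal": range(36, 49),            # 13 items, max 65
            "Principal": range(49, 60),                      # 11 items, max 55
        }
        scores = {name: sum(responses[i] for i in idx)
                  for name, idx in subscales.items()}
        scores["Overall Score"] = sum(scores.values())       # 60 items, max 300
        return scores

    # Example: a respondent who rates every item 5 hits each subscale maximum.
    assert score_swif([5] * 60)["Overall Score"] == 300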


Appendix AA: Content Analysis of Open-Ended Responses on SWIF Survey

Problematic Category

External Situations
  - (13) Hurricanes

Team
  - (2) Lack of shared ownership for team responsibilities (e.g., one to two people did most of the work)
  - (2) Lack of communication to team members and coach about meeting times
  - (2) Team turnover after initial training
  - (2) Having AP as team leader changed the team dynamics: team members were less vocal and willing to share ideas and feelings
  - Trying to do everything/too much at once

Coach
  - Coach's negative attitude toward team
  - Team was never assigned a coach

District
  - Difficult to have district personnel assist schools because of staff shortages
  - Superintendents lacked knowledge of students' needs

Principal
  - Principal was too controlling
  - Principals would not allow school-wide rewards to be integrated into schedule
  - Implementation was not top down
  - Part of the principal's evaluation was based on office referrals. The office referrals do not get processed so they will not count against the principal. The principal just hands out a punishment or has the student write a two-sentence letter of apology.
  - Principal has stated that he/she wishes he/she had not agreed on this program, although faculty wants the plan to be successful.
  - Administrators do not agree on consequences (e.g., number of days for suspension)

Staff
  - (2) Staff only wanted to focus on academics and rewards were not considered academic
  - Staff follow-through did not last the whole year
  - Teachers were on multiple committees and had a hard time finding a time to meet
  - Some faculty were initially skeptical and saw little evidence that PBS worked
  - Staff shortage
  - Shortage of experienced staff
  - No formal defined system of communication with the whole faculty
  - Teachers often stated that behavior should not be bought


Appendix AA: (Continued)

Problematic Category

Staff training
  - No time in school calendar for adequate training of staff
  - More than one booster training a year would be helpful
  - More intense teacher training is needed for not only buy-in, but gaining positive practice as well
  - Trainers at summer training did not make us feel comfortable and stifled our energy with negative attitudes
  - Need more time to organize PBS agenda after training before school starts

Retraining
  - (3) Retraining of new students and staff on expectations, consequences, and rewards was challenging
  - Need to reteach the expectations on a school-wide basis during the school year

Teaching Expectations
  - (2) Lack of behavior curriculum or sample lesson plans
  - So many pressing situations that prevented taking time to recognize good behavior

Rewards
  - (4) Not enough money to keep reward store stocked
  - Not all teachers made their students spend their reward dollars
  - PBS turned mostly into a reward-based system, a token economy
  - Rewards must be more age appropriate (e.g., other than ice cream)

Referral System
  - Forms were confusing at first
  - Consistent use of the minor infraction form was problematic
  - Too much time is taken to deal with minor infractions
  - Discipline system needs to be staffed properly

Consequences
  - Staff awareness of referrals was lacking
  - Staff wrote referrals for every incident listed and did not find out what happened first
  - Variance in rules/discipline within classrooms
  - Students were given too many chances


Appendix AA: (Continued)

Helpful Category

New topics
  - (6) Good effective leadership
  - (5) PBS is wonderful, a great benefit to school, great system and process
  - (3) Visiting a successful program/meeting with another school
  - (2) Putting PBS on school news
  - (2) Expect bigger and better for future
  - Positive attitude
  - Consistent goals
  - Knowing that the PBS process takes a long time to implement takes the pressure off
  - Hurricane Charley in some ways helped because staff was then ready for anything

FLPBS staff
  - (10) FLPBS project support (e.g., specific technical assistance personnel mentioned, staff coming to school)

Team
  - (5) Team is great (e.g., awesome, energetic, enthusiastic, motivated, hard-working, "with-it," tenacious)
  - (3) Commitment from a few core team members
  - (3) Parents on the team
  - PBS team listens to and addresses concerns
  - Team meetings opened up to whole staff
  - PBS team is willing to fix what does not work

Coach
  - (2) Coach was great leader/guide
  - A trained coach is familiar with staff and works well with team members
  - Had two coaches available
  - Monthly coaches meetings for county

District
  - Superintendent believes PBS is important
  - Incorporating county expectations was helpful

Administration
  - Encouraged high visibility of PBS
  - Organizational skills of the assistant principal to put it all together and keep us headed in the right direction

Staff
  - Cohesiveness
  - (3) Desire for improvement in behavior
  - Good communication
  - High expectations of students and staff
  - Staff was willing to try something new
  - PBS helps administration and staff regain control back from the students


Appendix AA: (Continued)

Helpful Category

Training
  - (2) Training in summer was helpful
  - Trainers were excellent
  - Training of the entire staff before the school year begins will be a vital piece to the implementation of this project

Students
  - Kids love PBS
  - Behavior of students has improved
  - Active student advisory committee
  - More motivated

Funding
  - Donations generated from the community and the country following Hurricane
  - Stipends would help
  - $300.00 to use towards tokens and rewards for good behavior

Expectations
  - (4) Expectations and rules posted everywhere
  - Teaching assistants now tell students what they want and not what they don't want

Reward
  - (2) Recognizing the good kids and not just those with behavior problems
  - Store was kept alive
  - PBS student of the month club and rewarding teachers that submit student names for this incentive

Referrals
  - Less major discipline referrals
  - Expectations are repeated everywhere
  - Identification of most frequent misbehavior patterns and most chronically misbehaving students
  - Referrals decreased dramatically

Parents
  - PTO and SAC participation


Appendix AB: Content Analysis of Open-Ended Responses on SWIF Survey-Final

Helpful and Problematic Items by Topic

Hurricanes
  Problematic:
    - (13) Hurricanes

Team
  Helpful:
    - (5) Team is great (e.g., awesome, energetic, enthusiastic, motivated, hard-working, "with-it," tenacious)
    - (3) Commitment from a few core team members
    - (3) Parents on the team
    - PBS team listens to and addresses concerns
    - Team meetings opened up to whole staff
  Problematic:
    - (2) Lack of shared ownership for team responsibilities (e.g., one to two people did most of the work)
    - (2) Lack of communication to team members and coach about meeting times
    - (2) Team turnover after initial training
    - (2) Having AP as team leader changed the team dynamics: team members were less vocal and willing to share ideas and feelings
    - Trying to do everything/too much at once

Coach
  Helpful:
    - (2) Coach was great leader/guide
    - PBS team is willing to fix what does not work
    - A trained coach is familiar with staff and works well with team members
    - Had two coaches available
  Problematic:
    - Coach's negative attitude toward team
    - Team was never assigned a coach

District
  Helpful:
    - (3) Visiting a successful program/meeting with another school
    - Monthly coaches meetings for county
    - Superintendent believes PBS is important
    - Incorporating county expectations was helpful
  Problematic:
    - Difficult to have district personnel assist schools because of staff shortages
    - Superintendents lacked knowledge of students' needs

Principal
  Helpful:
    - (6) Good effective leadership
    - Encouraged high visibility of PBS
    - Organizational skills of the assistant principal to put it all together and keep us headed in the right direction
  Problematic:
    - Principal was too controlling
    - Principals would not allow school-wide rewards to be integrated into schedule
    - Implementation was not top down
    - Part of the principal's evaluation was based on office referrals. The office referrals do not get processed so they will not count against the principal. The principal just hands out a punishment or has the student write a two-sentence letter of apology.


Appendix AB: (Continued)

Staff
  Helpful:
    - (3) Desire for improvement in behavior
    - Cohesiveness
    - Good communication
    - High expectations of students and staff
    - Staff was willing to try something new
    - PBS helps administration and staff regain control back from the students
  Problematic:
    - (2) Staff only wanted to focus on academics and rewards were not considered academic
    - Staff follow-through did not last the whole year
    - Teachers were on multiple committees and had a hard time finding a time to meet
    - Some faculty were initially skeptical and saw little evidence that PBS worked
    - Staff shortage
    - Shortage of experienced staff
    - No formal defined system of communication with the whole faculty
    - Teachers often stated that behavior should not be bought

Staff training
  Helpful:
    - (2) Training in summer was helpful
    - Trainers were excellent
    - Training of the entire staff before the school year begins will be a vital piece to the implementation of this project
  Problematic:
    - No time in school calendar for adequate training of staff
    - More than one booster training a year would be helpful
    - More intense teacher training is needed for not only buy-in, but gaining positive practice as well
    - Trainers at summer training did not make us feel comfortable and stifled our energy with negative attitudes
    - Need more time to organize PBS agenda after training before school starts

Retraining
  Problematic:
    - (3) Retraining of new students and staff on expectations, consequences, and rewards was challenging
    - Need to reteach the expectations on a school-wide basis during the school year

Teaching Expectations
  Helpful:
    - (4) Expectations and rules posted everywhere
    - Teaching assistants now tell students what they want and not what they don't want
  Problematic:
    - (2) Lack of behavior curriculum or sample lesson plans
    - So many pressing situations that prevented taking time to recognize good behavior


Appendix AB: (Continued)

Rewards
  Helpful:
    - (2) Recognizing the good kids and not just those with behavior problems
    - Store was kept alive
    - PBS student of the month club and rewarding teachers that submit student names for this incentive
  Problematic:
    - (4) Not enough money to keep reward store stocked
    - Not all teachers made their students spend their reward dollars
    - PBS turned mostly into a reward-based system, a token economy
    - Rewards must be more age appropriate (e.g., other than ice cream)

Consequences/Referral System Process
  Forms
    Helpful:
      - Less major discipline referrals
      - Expectations are repeated everywhere
      - Identification of most frequent misbehavior patterns and most chronically misbehaving students
    Problematic:
      - Forms were confusing at first
      - Consistent use of the minor infraction form was problematic
      - Too much time is taken to deal with minor infractions
  Structure
    Helpful:
      - Referrals decreased dramatically
    Problematic:
      - Discipline system needs to be staffed properly
      - Staff awareness of referrals was lacking
      - Staff wrote referrals for every incident listed and did not find out what happened first
      - Variance in rules/discipline within classrooms
      - Students were given too many chances

New topics
  Helpful:
    - (5) PBS is wonderful, a great benefit to school, great system and process
    - (2) Expect bigger and better for future
    - Seeing positive outcomes
    - (2) Putting PBS on school news
    - Consistent goals
    - Positive attitude
    - Knowing that the PBS process takes a long time to implement takes the pressure off


Appendix AB: (Continued)

FLPBS staff
  Helpful:
    - (10) FLPBS project support (e.g., specific technical assistance personnel mentioned, staff coming to school)

Students
  Helpful:
    - Kids love PBS
    - Behavior of students has improved
    - Active student advisory committee
    - More motivated

Funding
  Helpful:
    - Donations generated from the community and the country following Hurricane
    - Stipends would help
    - $300.00 to use towards tokens and rewards for good behavior

Parents
  Helpful:
    - PTO and SAC participation


About the Author

Rachel Cohen received a Bachelor's Degree in Psychology from Pennsylvania State University in 2000. She entered the Ph.D. program in School Psychology at the University of South Florida in 2001. Rachel declared doctoral emphases in Systems and Organization and Research and Measurement. While at the University of South Florida, Rachel has coauthored several papers, including a recent publication in School Psychology Review and a future book chapter in Best Practices in School Psychology V, and she has made numerous paper presentations at regional, state, and national conferences. She is currently completing her internship at an APA-approved site in a northern suburb of Chicago, where she will be employed as a School Psychologist the following year.

