USF Libraries
USF Digital Collections

An examination of the implementation of the Second Step program in a public school system


Material Information

Title:
An examination of the implementation of the Second Step program in a public school system
Physical Description:
Book
Language:
English
Creator:
Pedraza, Lynn
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:
2009

Subjects

Subjects / Keywords:
Evidence-based
Fidelity
Implementation
Research-to-practice
Violence prevention
Dissertations, Academic -- Special Education -- Doctoral -- USF   ( lcsh )
Genre:
non-fiction   ( marcgt )

Notes

Summary:
ABSTRACT: As school districts integrate evidence-based prevention programs into their daily regime, they may struggle with implementing these programs with fidelity. This is a multi-method, multi-source, retrospective explanatory study of the implementation factors associated with program installation and partial implementation of an evidence-based violence prevention program, Second Step, in six elementary schools within a large urban school district. The goals of this study were to provide a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how developers and researchers might facilitate the application of research to practice. Schools that identified as implementing Second Step school-wide (Level 1) were matched to schools that identified as implementing in individual classes or grades (Level 2). Matching of paired schools was done through statistical peer grouping, using cluster analysis to identify groups of similar schools; this supported the internal validity of the study by controlling for external variables that might affect implementation factors associated with program installation and partial implementation differently between the schools (Dunavin, 2005). The present study used a variety of data collection methods, including principal, counselor, and teacher interviews, school staff focus groups, an implementation checklist, and document reviews. Propositions and their indicators were proposed, and data were collected to determine the extent to which schools were implementing two of the stages identified by Fixsen et al. (2005): program installation and initial implementation. Raters were trained to rate the responses of the interviewees and focus group participants to test whether responses supported the propositions proposed, contradicted them, or showed no evidence either way. Those scores were averaged, and comparisons were made between matched Level 1 schools, which identified as using the program school-wide, and Level 2 schools, which identified as using it in individual classrooms and grades. T-tests were completed to examine the interview and focus group ratings and the checklist. There were no significant differences between schools implementing school-wide and those implementing in particular classrooms or grades, except for two proposition indicators: there was evidence that school staff received training on the Second Step curriculum, and there was evidence that Second Step was delivered school-wide. However, the t-test results were opposite of what was predicted. Whether a school implemented school-wide or in individual classes or grades, schools were challenged by their competing priorities. Conditions that lead to fidelity in prevention programs were often adapted to better fit the everyday life of the schools. School staff understood the importance of fidelity, but no school provided the program as designed. Staff suggested that with programs designed with flexibility and clear recognition of school culture, they might better be able to implement programs as designed.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2009.
Bibliography:
Includes bibliographical references.
System Details:
Mode of access: World Wide Web.
System requirements: World Wide Web browser and PDF reader.
Statement of Responsibility:
by Lynn Pedraza.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 182 pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 002069289
oclc - 608096086
usfldc doi - E14-SFE0003183
usfldc handle - e14.3183
System ID:
SFS0027499:00001




Full Text
The Examination of the Implementation of the Second Step Program in a Public School System

by

Lynn M. Pedraza

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Department of Special Education, College of Education, University of South Florida

Co-Major Professor: Albert J. Duchnowski, Ph.D.
Co-Major Professor: Krista B. Kutash, Ph.D.
Daphne D. Thomas, Ph.D.
Robert F. Dedrick, Ph.D.

Date of Approval: November 10, 2009

Keywords: evidence-based, fidelity, implementation, research-to-practice, violence prevention

Copyright 2009, Lynn M. Pedraza

Dedication

I would like to dedicate this dissertation to my ten children: Alexandra, Geovanny, Jennifer, Pauline, Sarah, Lucas, Quincy, Melody, Rachel, and Joey, as well as my eleven grandchildren: Johnny, Max, Kayla, Kylie, Joseph, Sharia, Lily, Neveah, Sean, the Boonstra baby and Pedraza baby to be born in 2010, and any future grandchildren. I am forever grateful that each of you claimed my heart and touched my soul. You have taught me so much about life with its many challenges and opportunities. My wish for you and our future generations is the realization that at any age you can achieve your dreams. Remember, no matter what barriers you face, it is your attitude and the path that you take that determines your future. Believe in yourself and reach as high as you want. I love and adore all of you. Thank you for your inspiration.

"Life is 10% what happens to you and 90% how you react to it. Attitude is everything." (Charles Swindoll)

Acknowledgments

I would like to thank my committee chairs and members, Dr. Albert Duchnowski, Dr. Krista Kutash, Dr. Daphne Thomas, and Dr. Robert Dedrick, for their patience and understanding as I juggled my dissertation writing with my responsibilities at work and my family life of many children. I appreciate the opportunities you provided me to learn and grow, and the time you gave to share your knowledge and to provide me assistance and direction.

I want to thank Dr. Katherine Bemis, Kathryn Tafoya, and Julie Medina for being my "Research Team". Kathryn, a second thank you for helping me with my computer challenges and forever being my support. Thanks also to River Dunavin and Dr. Douglas Crist for sharing their knowledge and time, and to Michael Greeson for reminding me what I have done and who I am. Of course, I could not have done this work without the school principals and staff sharing their precious time with me. Thank you.

A very special thank you to Dr. Eleanor Guetzloe. My first encounter with Dr. Guetzloe was at age 19, when I began my third year of college. I wanted to make a difference in the lives of children with social and emotional challenges. She taught me to see the humor and joy in working with this special population. Our paths have crossed many times since that year. In 1999, Dr. Guetzloe encouraged me to return to the University of South Florida for my doctorate. I thought she was crazy because it meant moving a family of nine across the country, but we did it. I am proud to say I have broken the cycle. I am a first-generation high school graduate... and now this! My dream come true. Dr. Guetzloe, thank you for believing in me.

Table of Contents

List of Tables
List of Figures
Abstract

Chapter One: Introduction
    Significance of the Problem
    Why School-based Psycho-social Prevention & Intervention Services?
    Zero Tolerance Policy Implications
    Dropout Implications
    Psycho-social Implications
    Shift to Evidence-based Programming
    Social-Emotional Learning Using a Public Health Approach
    Evidence-based Practice Movement
    Implementation of Evidence-based Programs in Schools
    Second Step
    Statement of the Problem
    Purpose of the Study
    Research Questions
    Operational Definitions

Chapter Two: Literature Review
    Violence in Children's Lives
    Other Factors in Children's Lives
    Evidence-based Prevention and Intervention
    Limitations of Evidence-based Programs
    Why the Focus on Implementation?
    Importance of Implementation in Successful Prevention Programming
    School Involvement in Prevention and Intervention
    Norway Study
    Theoretical Framework
    Second Step Design and Implementation Process
    Research Questions
    Summary

Chapter Three: Methodology
    Overview
    Research Questions
    Statistical Peers for Benchmarking
    Why Study Central City Public Schools?
    Study Design
        Pilot
        Study Propositions
        School Selection Procedures
        School, Principal and Staff Selection
        Statistical Peer Grouping
    Selected Schools
        Schools Alto W & Familia P
        Schools Bueno W & Especial P
        Schools Dia W & Campo P
    Data Collection
        Schools
        Participants
        Research Team
        Protocols
    Data Analysis
        Interview Proposition Ratings
        Interrater Reliability of Ratings
        Interviews & Focus Groups
        Checklist
        Document Review
    Confidentiality
    Study Limitations
    Summary

Chapter Four: Results
    Overview
    Research Questions
        Research Question 1
        Research Question 2
        Research Question 3
        Research Question 4
        Research Question 5
    Document Review
    Summary

Chapter Five: Discussion
    Purpose
    Overview of the Study
    Review of the Method
    Discussion of Findings
        Review of Question 1
        Review of Question 2
        Review of Question 3
        Review of Question 4
        Review of Question 5
    Study Limitations
    Contribution to the Research and Practice
        A Public Health Approach: A Potential Framework
    Recommendations for Future Research
    Summary

References

Appendices
    Appendix A: Email to Principals
    Appendix B: Principal Protocol
    Appendix C: Counselor/Teacher/Key Informant Protocol
    Appendix D: Focus Group Protocol
    Appendix E: Checklist for all Participants to Complete
    Appendix F: Rating Scale
    Appendix G: Document Review

About the Author

List of Tables

Table 1    Contrasting Perspectives in School-Based Mental Health
Table 2    Four Areas and Propositions Describing Implementation Components
Table 3    Risk Factors that Affect Youth Violence
Table 4    Reasons for Studying and Monitoring Implementation
Table 5    Stages of Implementation
Table 6    Central City Public Schools Student Risk Factors
Table 7    Area and Propositions Describing Early Stage Implementation Activities
Table 8    School Association with Statistical Peers and Level
Table 9    Statistical Peers Comparisons
Table 10   2005 Student Demographics
Table 11   Participant Demographics & Implementation
Table 12   Intraclass Correlations of Proposition Indicators and Proposition Aspect Scores
Table 13   Difference between Paired Level 1 and Level 2 Schools on Proposition A and Indicators
Table 14   Mean Comparisons of Paired Level 1 and 2 Schools for Proposition A and Indicators
Table 15   Difference between Paired Level 1 and Level 2 Schools on Implementation Steps Related to Research Question 1
Table 16   Checklist Comparisons of Level 1 vs. Level 2 Schools on Training
Table 17   Difference between Paired Level 1 and Level 2 Schools on Proposition B and Indicators
Table 18   Mean Comparisons of Paired Level 1 and 2 Schools for Proposition B and Indicators
Table 19   Difference between Paired Level 1 and Level 2 Schools on Proposition D and Indicators
Table 20   Mean Comparisons of Paired Level 1 and 2 Schools for Proposition D and Indicators
Table 21   Checklist Comparisons of Level 1 vs. Level 2 Schools on Barriers and Facilitators
Table 22   Checklist Comparisons of Level 1 vs. Level 2 Schools on Difference
Table 23   Difference between Paired Level 1 and Level 2 Schools on Proposition C and Indicators
Table 24   Mean Comparisons of Paired Level 1 and 2 Schools for Proposition C and Indicators
Table 25   Checklist Comparisons of Level 1 vs. Level 2 Schools on Differences
Table 26   Feasibility Checklist

List of Figures

Figure 1.  Profile of CCPS students
Figure 2.  School selection procedures
Figure 3.  School, principal and staff selection
Figure 4.  Rating Scale

An Examination of the Implementation of the Second Step Program in a Public School System

Lynn M. Pedraza

ABSTRACT

As school districts integrate evidence-based prevention programs into their daily regime, they may struggle with implementing these programs with fidelity. This is a multi-method, multi-source, retrospective explanatory study of the implementation factors associated with program installation and partial implementation of an evidence-based violence prevention program, Second Step, in six elementary schools within a large urban school district. The goals of this study were to provide a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how developers and researchers might facilitate the application of research to practice.

Schools that identified as implementing Second Step school-wide (Level 1) were matched to schools that identified as implementing in individual classes or grades (Level 2). Matching of paired schools was done through statistical peer grouping, using cluster analysis to identify groups of similar schools; this supported the internal validity of the study by controlling for external variables that might affect implementation factors associated with program installation and partial implementation differently between the schools (Dunavin, 2005).

This study used a variety of data collection methods, including principal, counselor, and teacher interviews, school staff focus groups, an implementation checklist, and document reviews. Propositions and their indicators were proposed. Data were collected to determine the extent to which schools were implementing two of the stages identified by Fixsen et al. (2005): program installation and initial implementation.

Raters were trained to rate the responses of the interviewees and focus group participants to test whether responses supported the propositions proposed, contradicted them, or showed no evidence either way. Those scores were averaged, and comparisons were made between matched Level 1 schools, which identified as using the program school-wide, and Level 2 schools, which identified as using it in individual classrooms and grades. T-tests were completed to examine the interview and focus group ratings and the checklist.

There were no significant differences between schools implementing school-wide and those implementing in particular classrooms or grades, except for two proposition indicators. There was evidence that school staff received training on the Second Step curriculum and there was evidence that Second Step was delivered school-wide. However, the t-test results were opposite of what was predicted.

Whether a school implemented school-wide or in individual classes or grades, schools were challenged by their competing priorities. Conditions that lead to fidelity in prevention programs were often adapted to better fit the everyday life of the schools. School staff understood the importance of fidelity, but no school provided the program as designed. Staff suggested that with programs designed with flexibility and clear recognition of school culture, they might better be able to implement programs as designed.
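The matched-pairs comparison described above can be made concrete with a short sketch. The code below is illustrative only: the school names follow the study's pseudonyms, but the rating values, the pairings, and the use of SciPy are assumptions added for demonstration, not the dissertation's data or analysis code.

    # Illustrative sketch: averaged rater scores on one proposition indicator
    # (+1 = supports the proposition, 0 = no evidence, -1 = against),
    # one value per school. Values and pairings are hypothetical.
    from statistics import mean
    from scipy import stats

    level1 = {"Alto W": 0.7, "Bueno W": 0.4, "Dia W": 0.6}          # school-wide
    level2 = {"Familia P": 0.5, "Especial P": 0.1, "Campo P": 0.3}  # class/grade
    pairs = [("Alto W", "Familia P"), ("Bueno W", "Especial P"), ("Dia W", "Campo P")]

    l1 = [level1[a] for a, _ in pairs]
    l2 = [level2[b] for _, b in pairs]

    # Paired t-test across the matched statistical-peer pairs.
    t_stat, p_value = stats.ttest_rel(l1, l2)
    print(f"Level 1 mean = {mean(l1):.2f}, Level 2 mean = {mean(l2):.2f}")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

With only three matched pairs, such a test has very little power, which mirrors the small-sample caveat inherent in a six-school study.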

Chapter One: Introduction

Significance of the Problem

Schools are in the best position to help young people, and the adults they become, to live healthier, longer, more satisfying, and more productive lives (Carnegie Council on Adolescent Development's Task Force on Education of Young Adolescents, 1989). And schools are the only setting with access to large numbers of children and youth throughout their developmental years. This unique access creates an ideal setting for reducing at-risk behaviors through prevention and intervention programs (DeFriese, Crossland, Pearson, & Sullivan, 1990; Gottfredson, Fink, Skroban, & Gottfredson, 1997; Kolbe, Collins, & Cortese, 1997). School districts are often considered the natural resource to support the needs of children and their families and, consequently, are placed in the position of both educator and social savior (Greenberg, Weissberg, O'Brien, Zins, Fredericks, & Resnik, 2003; Prevention 2000, 2000).

In this study, pseudonyms have been given to the state, county, city, school district, and individual schools to protect confidentiality. The state will be referred to as Manzano, the county as Sandia County, and the city as Central City. The district will be referred to as Central City Public Schools, and the six schools in this study will be referred to as Alto W, Bueno W, Dia W, Familia P, Manzano P, and Campo P.

Why School-Based Psycho-Social Prevention & Intervention Services?

There is no question that students face multiple psycho-social barriers that are not being addressed adequately within the public sector. With limited resources and access to children and youth, communities look to schools to provide psycho-social prevention and intervention practices and programs as one of the solutions to poor health and social dynamics, putting even more pressure on today's teachers. This increased pressure to reach children and youth places schools in the awkward position of being the de facto system of care for children with mental health problems. For example, in the Great Smoky Mountains Study of Youth, 70-80% of children who received services for mental health problems were seen by school providers such as counselors and nurses (Burns et al., 1995). Yet the level of skill or competence these practitioners bring to delivering the services is usually unknown. There is a similar problem when school staff select evidence-based practices or programs for their schools: there is little information on best practice for the factors associated with program installation and initial implementation of the particular practice or problem, nor research on whether core components are being implemented as planned.

With easy access to children and youth and a long history of schools providing mental health and support services to students, mental health professionals are now advocating for more school-based mental health services (Kutash, Duchnowski, & Lynn, 2006), including prevention, and for more accountability for the type of services provided, with a recent emphasis on fidelity of implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).

Topics that are traditionally either public health or safety concerns have become common practice in school districts, requiring schools to integrate social and emotional learning into already packed school days. The hope is that addressing barriers to learning will close the achievement gap among those students most at risk of failure.

Today, many social and emotional learning opportunities are provided through federally sanctioned evidence-based programs and practices. One concern, however, is that school districts are not achieving the same results that the program developers and researchers did in their research. A number of factors could contribute to this discrepancy. For instance, the program developer and researcher may not have considered the multitude of conflicts that schools must navigate each day, such as daily schedules, testing schedules, state and district curriculum standards, and other requirements, so the schools may find the feasibility of implementation limited. Additionally, teachers might like the program curriculum but find it does not meet specific student learning styles. Teachers then provide differentiated instruction for those individual students, a form of adaptation encouraged in the education literature. This innovation (Fixsen et al., 2005) could be a desirable adaptation unless it deviates too much from the evidence-based practice itself. Further challenging the implementation may be an inability to replicate the supports provided in the original research, a lack of understanding of the importance of fidelity to the program, or loss of district support for the program (Fixsen et al., 2005).

Zero Tolerance Policy Implications

Zero-tolerance policies were created because of what appeared to be an increase in school violence during the 1990s. As the media focused on violence in schools, the pressure on legislators to remove weapons from schools culminated in the enactment of the Gun-Free Schools Act. This law made Elementary and Secondary Education Act (ESEA) funds contingent on state enactment of a zero-tolerance law, with the goal of producing gun-free schools. Some states decided to apply zero tolerance to as many disciplinary infractions as they could in an effort to remove violators and standardize discipline. An unintended consequence for those states was an increase in the number of students expelled or suspended.

Central City Public Schools' (CCPS) zero-tolerance policy has resulted in significant increases in students expelled or suspended from school, indicating the need for early prevention services and support to reduce the likelihood of high-end, negative consequences such as suspension or expulsion. Of 6,595 suspensions in the 2005-2006 year, the most common reasons were disruptive behavior (1,702), fighting (1,429), and defiance of the school principal (1,114). Males were twice as likely to be suspended as females. By ethnicity, Hispanic students accounted for 64% of suspensions, although they comprise only 54% of district enrollment. Anglo students had the second highest number of suspensions with 22%, while representing 34% of district enrollment (Heath, 2006).
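One way to read these suspension figures is as composition ratios, a group's share of suspensions divided by its share of enrollment, where a value above 1.0 indicates over-representation. The short sketch below reworks the reported percentages this way; it is an illustration added here, not a calculation from the dissertation.

    # Composition ratios from the reported figures (Heath, 2006):
    # share of suspensions divided by share of district enrollment.
    groups = {
        "Hispanic": (0.64, 0.54),  # (suspension share, enrollment share)
        "Anglo": (0.22, 0.34),
    }
    for name, (susp_share, enroll_share) in groups.items():
        ratio = susp_share / enroll_share
        print(f"{name}: {ratio:.2f}")  # Hispanic ~1.19 (over), Anglo ~0.65 (under)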

Dropout Implications

Sandia County has the second highest school dropout rate of any county in Manzano. Freshmen entering school in 2001, or the cohort of 2005, had a 52.50% graduation rate, with 20% of that cohort dropping out of school. The other students can be accounted for by expulsion (0.10%), continuing in CCPS (8.50%), transferring to another district (18.60%), or death (0.20%) (Graduating in CCPS, 2005).
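As an arithmetic check, these reported outcomes should account for essentially the whole cohort; the sketch below, added here for illustration, sums the published percentages.

    # Accounting check for the 2005 cohort outcomes (Graduating in CCPS, 2005).
    outcomes = {
        "graduated": 52.50,
        "dropped out": 20.00,
        "expelled": 0.10,
        "continuing in CCPS": 8.50,
        "transferred": 18.60,
        "died": 0.20,
    }
    print(f"accounted for: {sum(outcomes.values()):.2f}%")  # 99.90%; ~0.1% unexplained, likely rounding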

Dropping out of high school is related to a number of negative outcomes in adulthood. For example, the average income in 2005 of persons ages 18 through 65 who had not completed high school was approximately $10,000 less per year than that of those who earned high school credentials, including a General Educational Development (GED) certificate (U.S. Department of Commerce, 2005). Furthermore, dropouts are more likely to be unemployed than those with high school credentials or higher educational accomplishment (U.S. Department of Labor, Bureau of Labor Statistics, 2006). In terms of health, dropouts older than age 24 tend to report being in worse health than those who are not dropouts, regardless of income (U.S. Department of Education, National Center for Education Statistics, 2004). Dropouts also make up disproportionately higher percentages of the nation's prison and death row inmates. Estimates from the most recent data available (from 1997 and 1998) indicate that approximately 30% of federal inmates, 40% of state inmates, and 50% of persons on death row are high school dropouts (U.S. Department of Justice, Bureau of Justice Statistics, 2000, 2002), much higher than the general population's dropout rate of about 18% (U.S. Census Bureau, 1998).

Dropout data, and the impact on multiple domains of the lives of individuals not graduating from high school, help to identify dropout as a public health issue, not just an educational concern (Freudenberg & Ruglis, 2007). This broader look at dropout creates an open door for non-educators to research and develop programs and practices for school implementation. It also may complicate the field when others outside of the educational system define how schools should teach prevention practices and programs.

Psycho-Social Implications

Youth in Manzano have significant rates of depression and other mental health issues and diagnoses. Approximately 18,600 children ages 9-17 have severe emotional disturbances, and approximately 47,000 more have other mental health disorders, according to a report by the Technical Assistance Collaborative (2002). The 2002 Sandia County Health Profile found that mental health disorders were among the top five hospital discharge diagnoses in the county.

Another factor often connected to violence is substance abuse. The Sandia County drug-related death rate of 21.0 per 100,000 for 1999-2001 is three times the national rate of 7.0 and represents 356 deaths over a 3-year period. According to data from the Manzano Department of Health (2005), the county also had 53.5 alcohol-related deaths per 100,000. Youth aged 19 and under accounted for 7.2% of all driving under the influence (DUI) arrests in 2002. Manzano is second only to Alaska, with 6.5% of youth age 12 to 17 dependent on alcohol or drugs, compared to the national rate of 4.8%, according to a 2002 report by the Technical Assistance Collaborative (n.d.). The 515 drug violations reported by CCPS during the 2003-2004 school year represent the second highest annual total since 1999-2000. Nearly one-fifth of suspensions in 2003-2004 were due to substance abuse issues.

Shift to Evidence-Based Programming

In 2002, CCPS began shifting from a large array of scattered and unrelated psycho-social prevention programs to a more comprehensive approach. By July 2004, 94 CCPS teachers and staff had basic training in Second Step: A Violence Prevention Program (Crist, 2004). CCPS had been provided materials, and staff were encouraged to implement the program in their classrooms. The district did not require the use of the implementation tools provided by the developer.

In 2005, CCPS won a competitive grant from the U.S. Department of Health and Human Services' Substance Abuse and Mental Health Services Administration (SAMHSA) for a period of 18 months to implement a comprehensive community approach to violence prevention, using Second Step (Osowski, 2007) as the district program while incorporating a more public health model within 12 school communities. The CCPS SAMHSA grant proposed the following three goals: (a) build culturally competent, community-based violence prevention coalitions/neighborhood action teams (NATs); (b) implement an evidence-based violence prevention curriculum in 12 elementary schools; and (c) implement systems change in school policy and procedures to institutionalize proactive, culturally relevant, evidence-based violence prevention initiatives across CCPS and into the larger Central City community.

Social-Emotional Learning Using a Public Health Approach

A public health approach to the integration of prevention can provide a framework that allows for careful consideration of the steps necessary to meet the needs of a school with a high dropout rate. This model is based on four steps: (a) surveillance at the population/community level (What is the problem?), (b) identifying risk and protective factors (What are the causes?), (c) developing and evaluating interventions (What works and for whom?), and (d) implementation monitoring and scaling-up (Is it meeting the intended needs?) (Kutash et al., 2006).

Adelman and Taylor (2006), fearing that too many children are being left behind without support, emphasize the case for school reform that addresses barriers to learning.

According to Adelman and Taylor (2006), 91,000 United States schools in 15,000 districts are implementing prevention and intervention programs. The number of schools implementing prevention and intervention programs highlights how important it is for districts to follow a public health strategic planning approach for potentially better outcomes. Before a school embarks on determining the types and levels of social and emotional learning supports that are necessary, it is prudent for the school community to identify the needs of the student population using a variety of information sources, as indicated in the public health model. The information gathered helps to determine the types and extent of problems, unique cultural and community-specific needs, and risk and protective factors that could mitigate the population's negative outcomes. This early stage assesses the potential match between the school, the practice or program, and community resources, to determine whether or not to proceed with the factors associated with program installation and initial implementation (Fixsen et al., 2005). Using data-driven decision-making, the hallmark of the public health model, if the school community decides to proceed, it can then develop a strategic plan based on the most relevant evidence-based innovations for improved outcomes (Kutash et al., 2006).

Social and emotional learning is widely recognized as a unifying concept for organizing and coordinating school-based psycho-social prevention and intervention programming that focuses on positive youth development, health promotion, prevention of problem behaviors, and student engagement in learning (Devaney, O'Brien, Resnik, Keister, & Weissberg, 2006). This conceptual framework mirrors a public health approach by addressing both the needs of children and youth and schools' responses to those needs (Elias, Zins, et al., 1997; Greenberg et al., 2003). The process is done in the context of supporting academic achievement by addressing the root causes of problem behaviors that are the barriers to student success, and the protective factors that promote resiliency. Use of a Social and Emotional Learning (SEL) framework is linked with improved attendance, behavior, and performance, yet the focus is often fragmented and marginalized (Zins, Bloodworth, Weissberg, & Walberg, 2004). Although few would deny the importance of supporting students' emotional-social development as a means to academic success, the challenges of non-funded mandates, no requirements for support services, and independent support entities, as well as challenges in implementation and maintenance of fidelity to the research, can make for less than optimal outcomes.

SEL consists of three levels of service: (a) curriculum-based programs directed to all children to enhance social and emotional competencies; (b) programs and perspectives intended for special needs children; and (c) programs and perspectives that seek to promote the social and emotional awareness and skills of educators and other school personnel. SEL integration of cognition, affect, and behavior promotes the development of responsible and productive students. A planned, systematic, and evidence-based curriculum provides opportunities for students to model, practice, and apply what they learn in multiple settings (Devaney et al., 2006).

Evidence-Based Practice Movement

Evidence-based practice originated in the medical field, with disciplines such as psychology and education following the medical field's lead in an effort to build quality and accountability into their practices. Today, major efforts to improve academic outcomes for youth by focusing on the psycho-social barriers they face have led to joint efforts by mental health practitioners and educators to adopt and implement evidence-based prevention practices and programs within school settings, often under the umbrella of a social-emotional learning framework. However, like the medical field, which found it challenging to incorporate many of its randomized control study findings into direct practice with patients, school districts struggle with achieving the same outcomes (Pirrie, 2001). Part of this struggle is seen in the contrasting perspectives of the education and mental health systems around school-based mental health (Kutash et al., 2006), as indicated in Table 1. These distinct conceptual framework differences can also be seen in prevention practices and programs.

The terms evidence-based practice and evidence-based program are often used interchangeably, although essentially one leads to the other. Evidence-based practices are skills, techniques, and strategies that can be used by a practitioner; they describe effective core components that are factors associated with fidelity. These core components are then used individually or in combination to create evidence-based programs. In contrast, evidence-based programs are a collection of evidence-based practices based on particular philosophies, values, service delivery, structure, and treatment components. The program combines the needs of program funders with the specific methods for effective treatment, management, and quality control (Fixsen et al., 2005).

Table 1
Contrasting Perspectives in School-Based Mental Health

Overarching Influence
    Education System: Individuals with Disabilities Education Act (IDEA)
    Mental Health System: Diagnostic and Statistical Manual (DSM)

Conceptual Framework
    Education System: Behavior Disorders, Challenging Behavior, Academic Deficits
    Mental Health System: Psychopathology, Abnormal Behavior, Impaired Functioning

Important Theoretical Influences
    Education System: Behaviorism, Social Learning Theory
    Mental Health System: Psychoanalytic Approaches, Behavior Theory, Cognitive Psychology, Developmental Psychology, Biological/Genetic Perspectives, Psychopharmacology

Focus of Intervention
    Education System: Behavior Management, Skill Development, Academic Improvement
    Mental Health System: Insight, Awareness, Improved Functioning

Common Focus (both systems)
    Improving Social and Adaptive Functioning; Importance of and Need to Increase Availability, Access, and Range of Services

(Kutash, Duchnowski, & Lynn, 2006)

National efforts to encourage adoption of evidence-based practices and programs in education cover a wide range of topics and can be seen in health (U.S. Department of Health and Human Services, 1999, 2001), mental health (President's New Freedom Commission on Mental Health, 2003), and education (Nabors, Weist, & Reynolds, 2000; NCLB, 2001). Examples of federal efforts to encourage adoption of evidence-based programs can be seen in the SAMHSA-sponsored Registry of Evidence-based Programs for mental health and in the Department of Education's support program, the What Works Clearinghouse. Additionally, there have been reviews of prevention and intervention programs across problem outcome areas such as substance abuse, teen pregnancy, school dropout, and juvenile delinquency (Dryfoos, 1990; Elias, Gager, & Leon, 1997; Weissberg & Greenberg, 1998) that allow schools to review the success of programs that have more focused support. However, it is only more recently that appropriate implementation stages have been identified as a factor in reaching desired outcomes (Fixsen et al., 2005).

Weiss, Murphy-Graham, Petrosino, and Gandhi (2008) share some possible root causes for the challenges districts face as they try to recreate the expected level of positive outcomes of the federally supported, evidence-based programs. When Weiss et al. (2008) reviewed all the evidence that was used to rate programs, they found that "some of the evidence looked shaky" (p. 38). More specifically, they had concerns about (a) the identity of the evaluator, (b) limited evidence of positive findings, (c) sub-group comparisons, (d) the composition and procedures of the expert panel, (e) lack of belief in evaluation evidence, and (f) "bureaucratic exercise more than influence of research" (p. 38).

Program developers and researchers are beginning to address some of the barriers school districts and others face, such as the large lag time (sometimes up to 20 years) between developing effective practices and programs and using them in the real world (Metz, Espiritu, & Moore, 2007). Other barriers that affect the transition from research to practice, and that may account for the challenges of achieving outcomes similar to the original research, include difficulties with appropriate cultural fit to the community and with implementation processes. These processes may be cumbersome given school schedules, competing time commitments, and a lack of school-level involvement in the early adoption process.

Further confounding the movement to evidence-based practices and programs is the lack of resources necessary to replicate and maintain them with the same rigor as reported in the original research. For example, the NCLB Act mandated prevention programs without providing sufficient funding and without tying the mandate to accountability measures, creating priority dilemmas for school districts. Also hindering the success of these programs are the needs for quality assurance, technical assistance, state certification guidelines, and university sponsorship of coursework on the integration of evidence-based practice into daily school life. One strategy schools use to integrate prevention programs is to incorporate them as part of an SEL framework (Albee & Gullotta, 1997; Devaney et al., 2006).

Implementation of Evidence-Based Programs in Schools

The origins of implementation theory began with diffusion research in 1903 (Communication Theory, n.d.). At that time, the French sociologist Gabriel Tarde plotted the original S-shaped diffusion curve, showing that the rate of adoption, or diffusion, varied. Tarde defined diffusion as the spreading of social or cultural properties from one society or environment to another. Tarde's view was that, with imitation of interventions, social change would occur as part of a universal law of repetition (Kinnumne, 1996).

According to Rogers, Ryan and Gross's 1943 study reinforced Tarde's (1903) work when they identified five categories of farmers who adopted hybrid corn seed, based on the amount of time it took them to use the innovation (innovators, early adopters, early majority, late majority, and laggards), and five major stages in the adoption process (awareness, interest, evaluation, trial, and adoption). As of 1994, 51 years after Ryan and Gross's hybrid corn study, about 5,000 papers about diffusion had been published (Rogers, 1995).
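Tarde's S-shaped diffusion curve is conventionally modeled with a logistic function; the form below is a standard textbook expression added for illustration (the dissertation itself gives no equation). Here F(t) is cumulative adoption at time t, K the eventual adopter population, r the adoption rate, and t_0 the time of peak adoption:

    F(t) = \frac{K}{1 + e^{-r\,(t - t_0)}}

Growth is slow among the few innovators, accelerates through the early and late majority around t_0, and flattens as only the laggards remain.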

Despite its early history, there has been limited research on fidelity of implementation, and most researchers agree that poor implementation of prevention programs leads to poor outcomes (Dusenbury, Brannigan, Falco, & Hansen, 2003). The literature review and analysis done at the National Implementation Research Network (NIRN) by Fixsen et al. (2005) found implementation across domains (e.g., mental health, juvenile justice, education, child welfare) to be most successful when there were conditions that supported the implementation early on. These conditions included: (a) carefully selected practitioners receive coordinated training, coaching, and frequent performance assessments; (b) organizations provide the infrastructure necessary for timely training, skillful supervision and coaching, and regular process and outcome evaluations; (c) communities and consumers are fully involved in the selection and evaluation of programs and practices; and (d) state and federal funding avenues, policies, and regulations create a hospitable environment for implementation and program operations.

Greenberg et al.'s (2005) study of implementation in school-based preventive interventions yielded specific recommendations to researchers about implementation conditions: (a) routinely assess implementation to optimize prevention work in the real-world setting; (b) work with local stakeholders to evaluate implementation fidelity; (c) share information on the program's theory to guide local changes, so that any adaptations are still in keeping with the program theory; (d) use local replication as an opportunity to confirm program theory by assessing whether the intervention is implemented as planned (prescriptive model) and whether the mechanisms of change function as expected (causal model); and (e) examine how variations in the implementation support system and implementer characteristics affect program delivery. They also made recommendations to developers to: (a) provide information about the actual resources (e.g., money, time, and personnel) needed to implement an intervention; (b) communicate and share a common language; and (c) conduct research to understand which components must be delivered exactly as they were developed, which components can be modified, and how to make changes and still achieve positive outcomes.

Today, probably the most notable researchers in the field of implementation are Fixsen et al. (2005), who focus on general implementation. Also noteworthy are Greenberg et al. (2005), Weiss et al. (2008), and Adelman and Taylor (2000), who focus on implementation of prevention programs in schools. The work of these researchers and others is used as a source for this Yin-designed, multi-method explanatory study. The study focuses on the implementation of an evidence-based program, Second Step. More specifically, this study examines four areas and propositions, based on the literature review, that are tested by collecting data that would indicate either support for or against the propositions, or no evidence.

PAGE 28

Table 2

Four Areas and Propositions Describing Implementation Components

Training: Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program.

Time: When implementing the program, sufficient time was allocated for school staff to learn the program components, as well as sufficient time to deliver the program to students.

Implementation Level: When Second Step was implemented school-wide, there was more staff commitment to implementation, more peer-to-peer support, and more adherence to the program model, and staff were more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

Champion: When a school had a designated champion for Second Step, teachers and/or counselors were more likely to implement the program with more adherence to the program model than when there was no champion present.

Second Step

Second Step is a universal violence prevention program that is designed to promote social competence and reduce children's social and emotional problems. It is recognized by at least three national organizations as an evidence-based program. The organizations that reviewed Second Step include the National Registry of Evidence-based Programs and Practices (NREPP) operated by SAMHSA (Schinke, Brounstein, & Gardner, 2002); the Prevention Research Center for the Promotion of Human Development at Penn State University (Greenberg, Domitrovich, & Bumbarger, 2000); and the Collaborative for Academic, Social, and Emotional Learning (CASEL, 2003).

CASEL defines social and emotional learning as the process of acquiring the skills to recognize and manage emotions, develop caring and concern for others, establish positive relationships, make responsible decisions, and handle challenging situations effectively (Devaney et al., 2006). Several skills that are considered essential to healthy social and emotional development and that potentially reduce violence are included in the curriculum. These skills include empathy (Halberstadt, Denham, & Dunsmore, 2001), impulse control and problem solving (Crick & Dodge, 1994), and anger/emotion management (Eisenberg, Fabes, & Losoya, 1997).

The Second Step program is built on Luria's (1961) research, which demonstrated that people can use self-talk to control behaviors, as well as on cognitive-behavioral theory, which grew out of Bandura's (1986) social learning theory. Cognitive-behavioral theory has demonstrated that thoughts affect people's interactions and that the relationships between thoughts and behaviors can be put to practical use (Crick & Dodge, 1994).

Statement of the Problem

As school districts across the country integrate evidence-based violence prevention practices and programs into their daily regime, they may struggle with implementing to the program model and with trying to achieve good outcomes. One problem may be the design of the programs. Feasibility of implementation comes into question when programs are designed with multiple doses and time periods that sometimes exceed the typical class period. For example, Project Alert, designated as an exemplary substance abuse prevention program for middle school students, is designed to be presented in 45-minute periods for 11 weeks, with 3 booster sessions the following year (Weiss et al., 2008).

Many districts integrate the program into their health education class, leaving little time for the multiple other required health education standards to be completed, or they adapt the curriculum to their individual school needs, which may not be considered implementation with fidelity, depending on the developer's design flexibility. Nonetheless, schools are interested in incorporating prevention programs. When available funds are limited, many districts integrate pre-packaged evidence-based programs that are linked to national or state academic standards into their daily regime. However, when mandated responsibilities challenge districts' already stretched time and budgets, the programs are compromised. Further exacerbating the movement is that the science related to implementing these programs with fidelity and good outcomes lags behind (Fixsen et al., 2005), leaving districts with little guidance on the best way to integrate the work with fidelity into the daily life of schools.

Purpose of the Study

School districts across the country struggle to address the gap left by limited health and mental health systems by providing programs and services to mitigate the psychosocial problems their students face. Despite limited resources, education is experiencing a new emphasis on evidence-based prevention programs, yet there is concern that the evidence may not be valid and that the programs may not be feasible. Common to many school districts is the challenge of implementing science to practice in a way that maintains fidelity to the researchers' work and is still adaptable to a school climate. This is a retrospective explanatory study of the factors associated with program installation and initial implementation of an evidence-based violence prevention program, Second Step, in six elementary schools within a large urban school district.

With this information, developers, researchers, and school staff will gain a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how developers and researchers might facilitate the application of research to practice.

The opportunity to participate in an evidence-based violence prevention program with support from the Health and Wellness Department of the CCPS District was shared via email with all elementary school principals. Although Second Step was introduced in 2002, the district received funding through a grant in 2005 for implementation. At that time, 12 schools were selected as part of the grant based on interest, willingness to promote the program among the other principals if they found it effective, and willingness to provide time for training and implementation. The staff of the program developer, Committee for Children, provided a train-the-trainer model to 32 district and school-level staff and basic training to another 600 staff to participate in the program at the schools (Osowski, 2007).

Research Questions

The focus of this study was the factors associated with program installation and initial implementation of Second Step: A Violence Prevention Curriculum, which was chosen as the evidence-based violence prevention program for elementary schools in the CCPS. The research questions related to this study are as follows:

1. To what extent, if any, are there differences in training on understanding purpose and expected outcomes, the curriculum, parent involvement, and being provided sufficient kits between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

2. To what extent, if any, are there differences in time allocation for learning the curriculum, shared planning time, classroom lessons, and review of lessons between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing in individual classes or grades?

3. What strategies do principals perceive to be effective in promoting implementation in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

4. What are the barriers and facilitators of implementation identified by teachers and counselors in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

5. To what extent, if any, are there differences in staff commitment to implementation, peer-to-peer support, adherence to the program model, and staff perception of positive student outcomes between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

Operational Definitions

Most definitions are adapted from Fixsen et al. (2005).

Adaptation of the program: Descriptions or measures of actual modifications that are made in a program to accommodate the context and requirements at an implementation site.

Characteristics of population served: Descriptions or measures of the demographic characteristics of the population actually being served at an implementation site.

Competence: The level of skill shown by a practitioner in delivering an intervention (e.g., appropriate responses to contextual factors such as client variables, particular aspects of the presenting problems, clients' individual life situations, sensitivity of timing, and recognition of opportunities to intervene).

Core components: The most essential and indispensable components of an intervention practice or program (core intervention components), or the most essential and indispensable components of an implementation practice or program (core implementation components).

Costs: Descriptions or measures of the actual costs of providing services to clients at an implementation site (e.g., per diem or per client costs, overall costs, or categories).

Evidence-based practices: Skills, techniques, and strategies that can be used when a practitioner is interacting directly with a consumer. They are sometimes called core intervention components when used in a broader program context.

Evidence-based programs: Organized, multi-faceted interventions that are designed to serve consumers with complex problems. Such programs, for example, may seek to integrate social skills training, family counseling, and educational assistance, where needed, in a comprehensive yet individualized manner, based on a clearly articulated theory of change, identification of the active agents of change, and the specification of necessary organizational supports.

Exploration: A variety of circumstances and events leading the purveyors of a program and champions in a community to make contact and begin exploring the possibility of replicating the program in the community. Individuals get to know one another, information is exchanged and assessed, and activities proceed to the next stage (or not).

Fidelity: Correspondence between the program as implemented and the program as described.

Full implementation: The point at which a program is fully operational, with all the realities of doing business impinging upon the implementation site as the new program staff become skillful and the procedures and processes become routine. Systems integration refers to integration of the new service with the existing services and/or selection, training, coaching, evaluation, and administration. MIS feedback loops and attention to solving ongoing management, funding, and operational issues are notable features of advanced implementation.

Initial implementation: The point at which the program begins to function. Staff are in place, referrals begin to operate, external agents begin to honor their agreements, and individuals begin to receive services.

Innovation: Each implementation site is different, and local factors can lead to novel and effective solutions within the context of the overall program being implemented. It is important to discriminate between innovation (desirable) and program drift (undesirable).

Installation: Once the decision to proceed is made, preparatory activities begin. This may involve arranging the necessary space, equipment, and organizational supports; establishing policies and procedures related to personnel, decision making, and management; securing funding streams; selecting and hiring new staff and redeploying current staff; and so on. These activities occur in advance of actual implementation of the program.

Local adaptation: Descriptions or measures of changes in any aspect of an implementation site in response to identified needs or opportunities within the federal or state system, local community, or host organization.

Manuals of replication/implementation procedures: Descriptions or measures of the extent to which the strategies and methods for successful replication of the program have been codified in written protocols (e.g., site assessment, infrastructure needs, consumer involvement).

Program: A coherent set of clearly described activities and specified linkages among activities designed to produce a set of desired outcomes.

Quality: Providing appropriate supports and implementation that results in positive outcomes.

Program evaluation: Outcome and process measures related to the functioning (e.g., referrals, LOS) of an implementation site or components within an implementation site.

Successful: Curriculum is taught as intended. For Second Step, that means teaching at all grade levels and in all classrooms within a grade level; reinforcing strategies and concepts in daily activities with a consistent message; applying skill steps and modeling in all settings; integrating learning goals throughout the regular curriculum; and familiarizing parents and caregivers to provide support that encourages learning in nonschool settings.

Sustainability: The point at which a new program is no longer new. As the implementation site settles into standard practice, internal and external factors impinge on a program and lead to its demise or continuation. Coping and adapting are notable features of sustainability with respect to continuous training for practitioners and other key staff (such as after turnover), changes in priorities, funding streams within local systems, changes in championship, changes in community or client composition, etc.

Training: Specialized instruction, practice, or activities designed to impart greater knowledge and skill.

Chapter Two

Literature Review

"Lack of success at school is one of the most common factors interfering with the current well-being and future opportunities of children and adolescents" (Adelman & Taylor, 2006, p. xix).

With more schools designated as low-performing based on federal and state accountability measures of sub-populations of students, schools must move from a vision that all children can learn to a vision that enables all children to succeed in school, work, and life (Adelman & Taylor, 2000, 2006; Council of Chief State School Officers, 2002). For years the practice of implementing programs in schools has been left to the discretion of the school site. This has led to discussion on the implementation of evidence-based programs and practices and on the success of addressing the multiple barriers (poverty, violence and substance abuse exposure, etc.) that children and youth face. This study will give readers the opportunity to understand the supportive and challenging conditions school districts face as they implement these practices and programs in the early stages of the implementation process. There is urgency in getting implementation right.

The literature review begins with the status of children's lives in today's society. It moves on to an overview of the practices and programs that schools have adopted to mitigate the barriers to learning and why schools must focus on implementation to achieve desired outcomes.

Violence in Children's Lives

Multiple theories exist regarding why children exhibit violent behaviors; however, across these theories are common themes. One common belief is that some children and youth have a genetic vulnerability, and when poor parenting and school failure interact with that vulnerability, the likelihood of violent outcomes early in their lives increases. Other children and youth engage in less aggressive behaviors related to associations with deviant peers or rebellion, or because the opportunity presented itself (Flannery, 2006). The type and number of risk and protective factors have the potential to change an individual's life. The chances of committing violent acts later in life increase by as much as 40% when children or youth have directly witnessed significant amounts of violence or have been victims of violence themselves. Highly aggressive behavior in childhood is the most significant predictor of future violence (Flannery, 2006).

Home is where many children are exposed to violence. In the United States, an estimated 6 million children are abused or neglected each year, and 40% of all murders of children under age 18 are committed by a family member. In a survey of over 3,700 high school students, nearly 40% of boys and 50% of girls reported they saw someone slapped at home, and nearly 20% reported witnessing a beating at home in the last year. The rate of victimization is high, with 1 in 10 girls reporting being beaten and nearly half reporting being hit. The percentage of children who had witnessed violence ranged from 90% in a New Orleans study, to 45% in Washington, D.C., to nearly half of all third through fifth graders in a southwestern city (Flannery, 2006).

Another form of violence children are often exposed to in the home is media violence. Watching violence in the media may not cause a healthy developing child to commit a violent crime, but children who are at risk or exposed to violence may be predisposed to be more aggressive (Johnson, Cohen, Smailes, Kasen, & Brook, 2002; Flannery, 2006). Table 3 presents risk factors that affect youth violence.

Table 3

Risk Factors that Affect Youth Violence

Individual risk: History of violent victimization or involvement; attention deficits, hyperactivity, or learning disorders; history of early aggressive behavior; involvement with drugs, alcohol, or tobacco; low IQ; poor behavioral control; deficits in social cognitive or information-processing abilities; high emotional distress; history of treatment for emotional problems; antisocial beliefs and attitudes; exposure to violence and conflict in the family.

Family risk: Authoritarian childrearing attitudes; harsh, lax, or inconsistent disciplinary practices; low parental involvement; low parental education and income; parental substance abuse or criminality; poor family functioning; poor monitoring and supervision of children.

Peer/school risk: Association with delinquent peers; involvement in gangs; social rejection by peers; lack of involvement in conventional activities; poor academic performance; low commitment to school and school failure.

Community risk: Diminished economic opportunities; high concentrations of poor residents; high level of transience; high level of family disruption; low level of community participation; socially disorganized neighborhoods.

Note. (DHHS, 2001, 2004; Resnick et al., 2004)

Other Factors in Children's Lives

Health and social development risk factors that children and youth face are greater than ever (Greenberg et al., 2005).

It is estimated that between 12% and 22% of America's youth under age 18 need mental health services (Greenberg et al., 2005). As these children and youth struggle to manage the challenges of growing up, the common behavioral health problems associated with these risk factors impede success in school. Behavioral and emotional disturbances in adolescence are associated with other problems, such as school failure and dropout, teen pregnancy, and affiliation with deviant peers (Durlak & Wells, 1997).

An underlying challenge of tackling the psycho-social barriers that hamper student success is the difference in educational and mental health perspectives as they relate to school-based mental health. Advancing school-based mental health services to meet the social and emotional needs of all children, while achieving the highest academic standards, requires a shared agenda of common terminology and professional perspectives (Kutash et al., 2006). One shared focus for both the education system and the mental health system is programs promoting social and life skills training (Kutash et al.; Rones & Hoagwood, 2000). As school staff analyze data on how students can become more effective learners and consider the broader educational goal of college and career readiness, it is important to make the connections between academic success and social and emotional learning. Recent research suggests that prevention programs can both reduce mental disorders and problem behaviors and promote youth competence (Greenberg et al., 2005).

The connections between risk factors and outcomes that impact children are complex. One child may have multiple risk factors yet seem to be well-adjusted, while another may have a single risk factor and multiple adjustment issues.

The non-linear relationship between risk factors and outcomes suggests that a strategy of mediating multiple factors simultaneously should have a stronger positive outcome than narrowing the focus to single risk factors. Providing prevention efforts that focus on reducing interacting risk factors may have direct effects on diverse outcomes (Coie et al., 1993; Dryfoos, 1990).

Evidence-Based Prevention and Intervention

At its simplest, evidence-based practice refers to "applying the best available research evidence in the provision of health, behavior, and education services to enhance outcomes" (Metz et al., 2007, p. 1). It refers to skills, techniques, and strategies used to reinforce positive behaviors and to facilitate behavior changes. Evidence-based practices are the conditions or components that lead to more comprehensive evidence-based programs. These programs are the organized, often multi-component interventions that target specific populations and are grounded in a sound underlying theory of the causes of and solutions to poor outcomes and problem behaviors. Typically, a rigorous study has demonstrated that the program has a positive impact on targeted outcomes. The term evidence-based program is often used interchangeably with terms such as research-based program, science-based program, blueprint program, model program, promising program, and effective program (Kyler, Bumbarger, & Greenberg, 2005).

Efforts to encourage adoption of evidence-based practices and programs cover a wide range of topics and are reinforced in the science-based research and evaluation literature, which has shown that a number of evidence-based prevention programs help youth avoid risky behaviors (Albee & Gullotta, 1997; Durlak & Wells, 1997; Weissberg & Greenberg, 1998).

Information is available to support school staff in comparing programs labeled as effective prevention and intervention programs based on ecological factors, such as the socioeconomic and cultural environments in which students live, that affect the response to an intervention and ultimately its success (Jaycox et al., 2006). A number of reviews have provided qualitative and quantitative studies of effective programs' acceptability, efficacy, effectiveness, and cost-benefit analysis/cost-effectiveness (Aos, Mayfield, Miller, & Yen, 2006). Weiss et al.'s (2008) work suggests caution in accepting an evidence-based program without exploring how the evidence was determined and whether the program is a good match.

Limitations of Evidence-Based Programs

The exploration process should include an investigation of what the evidence-based program proposes to do to help its population and a determination of whether a particular program can meet the school's needs within the school's parameters. In contrast to the support for evidence-based programming, Weiss et al. (2008) identify three obstacles to successful research-to-practice: (a) shortcomings in research and researchers, (b) shortcomings in policymakers and practitioners, and (c) shortcomings in the links among them that may impede the fidelity of implementation. Common complaints include untrustworthy evidence, unresponsiveness to decision-makers' needs, fragmented data, evidence that fails to produce results or yields contradictory findings (Saunders, 2005), and evaluators who are too responsive to governmental sponsors (Taylor, 2005). As mentioned previously, what may have been evidence-based when the research was conducted may be outdated because of a long time period from research to practice. Also, what works today may not work at a later time and place, with a particular group of individuals, or in particular settings (Mulgar, 2005).

Further compounding the challenges are policymakers who establish unrealistic timelines and expect unrealistic outcomes with limited funding. Policymakers influence the connection of research to practice by requiring evidence-based programs as part of federal grant funding. Weiss et al.'s (2008) research disclosed concern about the practices for determining what programs were listed as model programs by the Department of Education as well as other agencies. More specifically, concerns were raised regarding the source of evaluations, limited positive findings, subgroup comparisons, few long-term follow-ups, selection of the expert panel, lack of belief in the evidence, and the bureaucracy associated with the process of choosing an evidence-based program, all of which may frustrate and confound school implementation success (Weiss et al., 2008).

One concern about the criteria used for model programs raised by Weiss et al. (2008) was that developers did almost all the evaluations of the programs they developed. For example, 18 of the 19 Life Skills Training evaluation reports were done by the developer, which may be a conflict of interest that leads to bias in reporting. Another concern was the limited evidence of positive findings (Weiss et al., 2008). Only a few evaluations were required to achieve the approved classification, limiting the data. For example, Project Alert used six outcome measures, six different substances, three risk levels, and two types of programs for 100 comparisons between a program and control condition. Only two were significant (Ellickson, Bell, & McGuigan, 1993): one in the positive direction and one in the negative direction. Rather than compare the participants in the program to the control group, some studies compared subgroups of participants, which skewed the results. Consequently, if a school tries to determine what went wrong in implementation, comparisons may be difficult if the results had multiple limitations that prevent accurate replication.
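A simple probability check helps show why 2 significant results out of 100 comparisons is weak evidence. The sketch below assumes, purely for illustration, that the 100 comparisons were independent tests at the conventional alpha of .05; that assumption is not stated in the original evaluations, so the numbers are indicative only.

    # Hypothetical back-of-the-envelope check; assumes 100 independent
    # tests at alpha = .05, which the original evaluations may not satisfy.
    from scipy.stats import binom

    n_comparisons = 100
    alpha = 0.05

    expected_by_chance = n_comparisons * alpha           # about 5 false positives
    p_two_or_more = binom.sf(1, n_comparisons, alpha)    # P(2+ significant) ~ 0.96

    print(f"Expected significant results under the null: {expected_by_chance:.1f}")
    print(f"Chance of 2 or more significant results: {p_two_or_more:.2f}")

Under these assumptions, two significant comparisons is no more than chance would predict, which is consistent with the caution urged by Weiss et al. (2008).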

Regardless of whether a program was labeled promising, model, or exemplary by the U.S. Department of Education, few programs showed substantial success at post-test, and the few evaluations that completed long-term follow-up studies after the program ended reported early success that did not last. Additionally, Weiss et al. (2008) noted concern about the selection of the members of the Department of Education Safe and Drug-Free Schools (SDFS) expert panel. Some of the panel members had either developed their own drug abuse prevention program or were part of the decision-making process for other programs.

With few drug abuse prevention studies done on fidelity of implementation under real-world conditions, and a study expressing concern about the validity of the U.S. Department of Education SDFS Expert Panel recommendations for evidence-based programs, it is hard to determine the real challenges for schools regarding implementation. Is it the lack of fidelity to an evidence-based program, the challenge of implementing the conditions that lead to fidelity, or the lack of solid research supporting the need for strict adherence to the program design?

Why the Focus on Implementation?

Previously, schools were identified as the de facto health and mental health system (Burns et al., 1995), and now they are considered "one of the most important settings in which to conduct preventive and wellness promotion interventions" (Greenberg et al., 2005, p. 2). This reality underscores the importance of good research, practices, and programs to mitigate and reduce barriers to learning.

Schools interested in implementing evidence-based prevention programs have an array of research-based options through a series of reports and reviews that summarize the programs. With implementation challenges, limited research, and even less district funding to support the necessary infrastructure to guide schools on the essential components, many districts find they cannot achieve the same levels of technical assistance, support, resources, and prevention expertise as the research trials (Greenberg et al., 2005). These challenges provide compelling reasons for studying and monitoring the factors associated with program installation and initial implementation and the conditions necessary for both.

Table 4

Reasons for Studying and Monitoring Implementation

Effort evaluation: To know what actually happened.

Quality improvement: To provide feedback for continuous quality improvement.

Documentation: To document compliance with legal and ethical guidelines.

Internal validity: To strengthen the conclusions being made about program outcomes.

Program theory: To examine whether the change process occurred as expected.

Process evaluation: To understand the internal dynamics and operation of an intervention program.

Diffusion: To advance knowledge regarding best practices for replicating, maintaining, and diffusing the program.

Evaluation quality: To strengthen the quality of program evaluations by reducing error in the evaluation.

Note. (Greenberg et al., 2005, p. 6)

Although more emphasis has been placed on the development of prevention programs than on replication in real-world settings (Taylor et al., 1999), in recent years there has been a shift to more research on and study of implementation. Fixsen et al. (2005) reviewed implementation research and found that thoughtful and effective implementation strategies were essential to making systemic changes that positively influence the lives of the intended audiences. As presented in Table 5, the principal investigators outline six stages of the implementation process, designed to be purposeful and detailed enough for observers to detect the presence and strength of intervention and implementation activity as well as their outcomes.

Table 5

Stages of Implementation

Exploration & Adoption: An individual, organization, or community understands a need, identifies a program, and assesses the match.

Program Installation: Tasks before implementation, such as crafting new policies, gathering necessary resources, and hiring and training staff.

Initial Implementation: Early stage of implementation; often a time when implementation ends because of the struggles of implementing change in a system.

Full Implementation: Fully operational program, including full staff and full client loads.

Innovation: Refinement and expansion of the program based on local needs; a threat to fidelity.

Sustainability: Supports in place for continuation of the program.

Note. (Fixsen et al., 2005)

Importance of Implementation in Evidence-Based Programming

Exploration and adoption are the components of the first stage of the implementation process. This stage entails understanding the needs of the school, identifying a potential program, and determining whether there is a match (Fixsen et al., 2005). Research has shown that prevention and early intervention targeted to specific developmental stages, to different populations, within different settings, and with effective implementation strategies can prevent many risky adolescent behaviors. This stage lays the groundwork for adopting the program, but not for how to implement it with fidelity. There is less direction on the next stages, program installation and early implementation, which may be more complicated because they involve retrofitting new programs within schools' everyday framework (Fixsen et al., 2005).

Rossi and Freeman (1985) identify three ways that research may not be implemented correctly and might lead to the incorrect conclusion that the intervention does not work and that the problems are more complex. Their research identifies problems with how practitioners implement the programs: (a) no treatment or too little treatment is provided; (b) the wrong treatment is provided; or (c) the treatment is not standard, is uncontrolled, or varies across the target population (Fixsen et al., 2005). Other researchers, like Dobson and Cook (1980), confirm the problem as practitioners not implementing the evidence-based practice or program as intended. However, Weiss et al. (2008) argue that the criteria for the designation of evidence-based may be flawed, giving false hope to school districts trying to achieve the same outcomes.

Research studies of programs provide protocols that may not be easily adaptable to the real-world school setting.

With limited guidance on school-level factors associated with program installation and initial implementation processes and the conditions necessary to implement, school staff may eliminate crucial components of the program as they adapt it to their unique needs. This can be a problem for the success of the program. Research confirms the importance of implementing with fidelity in this initial implementation stage, when the compelling forces of "fear of change, inertia, and investment in the status quo" (Fixsen et al., p. 16) may lead to abandonment of the project.

School Involvement in Prevention and Intervention

The landmark No Child Left Behind (NCLB) Act of 2001 mandates evidence-based practice with evidence-driven progress (Report of the Coalition for Evidence-Based Policy, 2002). More specifically, the U.S. Department of Education now requires that core academic, prevention, and intervention instruction be guided by theory; rigorously evaluated so as to determine that it actually does what it set out to do; replicable; and validated or supported by researchers in the field (National Coordinating Technical Assistance Center for Drug Prevention and School Safety Program Coordinators, 2003, p. 53). Many of the programs are on lists intended to help schools differentiate between nationally available programs that are effective and those with no evaluation base.

Even with the increase in identifying evidence-based prevention and intervention programs, school districts may not use evidence-based programs with fidelity for the reasons already discussed (Ennett, Tobler, Ringwalt, & Flewelling, 2003; Gottfredson & Gottfredson, 2002; Hallfors, Sporer, Pankratz, & Godette, 2000).

Hallfors, Sporer, Pankratz, and Godette's (2000) survey of 81 Safe and Drug-Free Schools district coordinators across 11 states indicates that 59% had selected an evidence-based curriculum for implementation, but only 19% reported that their schools were implementing these programs with fidelity.

As educators struggle to meet the student academic performance requirements of NCLB, they are faced with difficult choices (Adelman & Taylor, 2000; Berends, Bodilly, & Kirby, 2002; Hall & Hord, 2001; Sarason, 2002). Because programming compliance is not measured, many school districts focus on the evaluated components of their work: the core academics. By reducing time for prevention and intervention programs, they limit student success by not realizing the full potential effects of prevention and intervention programs on academic success as well as social and emotional development (Greenberg et al., 2005). Ignoring the link between social-emotional supports and academic success, educators often emphasize academics only.

Further compounding the situation is the issue of fidelity in delivering the programs successfully. More than half of the school districts surveyed had altered the prevention and intervention programs by not delivering components with the intensity that research can provide under controlled circumstances (Hallfors, Pankratz, & Hartman, 2007). Teachers incorporate prevention and intervention programs into their day while maintaining a focus on academic achievement as their basic responsibility. Teachers may adapt the program based on their time and/or training on adaptation to special needs (Prevention 2000, 2000) or because the implementation design has only particular components that meet their needs.

Norway Study

One study of the implementation of Second Step was completed in Norway in 2006. Sixty percent of Norway's primary (elementary) schools adopted Second Step as a way to promote social skills and prevent violence in schools, based on the Norwegian Health Association recommendation that there be a whole-school approach to violence prevention. There were no requirements regarding the implementation process. Larsen and Samdal (2007) studied Norwegian teachers' fidelity in their use of Second Step and their perception of fidelity implementation. Their findings indicated that teachers adapted features of the program to meet the needs of their students. They also made adaptations based on the individual beliefs and experiences of the teacher presenting the program. Teachers who reported implementing with fidelity were in schools that adopted the program for the whole school. Individual teachers who used Second Step tended to use it as a tool for addressing specific situations and conflicts.

Using the definition of fidelity as adherence, adaptation, and the quality of delivery, Larsen and Samdal's (2007) analysis revealed that all of the teachers adapted the program to some extent, with more experienced teachers being more likely to adapt the curriculum. Teachers' reasoning for adaptation included (a) a need for flexibility, (b) more focus on social competence rather than the lesson itself, (c) less structure and repetition, and (d) difficulty in maintaining student engagement.

Some teachers also expressed a need to modify the program to fit their teaching practice, rather than to modify their practice to fit the program, to enable confident delivery of the program and to enable their adaptation to relate to their prior experiences with what works and does not work for their pupils. (Larsen & Samdal, 2007, p. 23)

Theoretical Framework

Expansion of developmental theory to include models from public health, epidemiology, sociology, and developmental psychopathology, combined with ecological analysis, provides a framework for organizing and building the field of prevention and intervention science. This developmental-ecological model can help to frame the layers of influence on behaviors that do not directly involve children and youth but have an impact on their academic success and life.

Second Step's guiding theory is based primarily on cognitive-behavioral theory (Kendall, 1993, 2000), which grew out of Bandura's (1986) social learning theory. With evidence that self-talk can control behaviors (Luria, 1961) and that thoughts affect people's social interactions (Crick & Dodge, 1994), the Second Step program teaches first empathy skills, then responses to social interactions through problem-solving, and finally management of the student's own anger and intense emotions.

Greenberg et al. (2005) proposed a two-step process for a conceptual model for both the development of a program theory and the study of the implementation of school-based prevention and promotion programs. This model was designed to tailor measurement decisions directly to a specific program by articulating the causative and prescriptive assumptions. In the model, the theory-driven evaluation objectives are (a) to utilize the essential components of the theory that underlies a particular program to specify the design of the program evaluation itself, (b) to understand how and why a particular program resulted in certain outcomes, and (c) to use that information as a means to improve program effectiveness (Chen, 1990, 1998; Weiss, 1995).

According to Chen (1990, 1998), to conduct a theory-driven evaluation, an evaluator first must construct a comprehensive program theory addressing two areas:

1. The causative theory describes the how and why of the program: (a) how the program is expected to achieve particular outcomes, (b) the relationship between the intervention and the outcomes, and (c) the impact of mediators or moderators on the intervention effect.

2. The prescriptive theory describes (a) how the program should be implemented or (b) the manner in which daily activities of the program should proceed. This component includes the goals of the program, the guidelines for the type of intervention to be provided, and the context that is necessary for the successful implementation of the intervention.

Greenberg et al. (2005) found that program failure may result from weakness in either the causal or prescriptive aspects of the program theory, such as an inaccurate theory about the mediators and moderators that link interventions with outcomes, or it may be due to a failure to implement the intervention properly.

Second Step Design and Implementation Process

The Second Step curriculum focuses on three skills. The first is empathy, which focuses on identification of emotions and recognition of possible causes of emotions when interacting with others. Next, students learn thoughtful responses to social interactions through neutral problem-solving steps. Last, students learn to manage their own anger and intense emotions.

Second Step Preschool/Kindergarten-Grade 5 was piloted in 1988-1991, with results indicating that the scores for pre- and post-interviews of children who received the program showed significant enhancement of the children's empathy, problem-solving, and anger-management skills compared to students who had not received the program (Moore & Beland, 1992). Second Step Grades 6-8 was piloted in 1989-1990, with participants showing significantly greater gains from pre-test to post-test than control group students. In 1995-1996, the Grades 6-8 program was revised and expanded, with similar results. In addition, in the revised Second Step Middle School/Junior High program, students perceived they had a better ability to handle social situations, as well as a reduction in aggression and antisocial behavior, as compared to the control group.

The Committee for Children (2002), a Seattle, Washington-based organization, appears to have put considerable thought into the implementation process. They have identified conditions that contribute to program fidelity, including training of all staff, time to review and deliver the program, administrator support, and school-wide implementation. Their manual provides resources and information, as well as the tools to provide staff training. Among the materials provided are the following:

1. The theory and research used to create the curriculum.

2. Ways to use the curriculum, including scheduling lessons and specific teaching strategies.

3. Special material for trainers and administrators that includes tools to assist in the initiation and ongoing implementation of the program.

4. Staff Training Modules that include how to use the training video and what to do to prepare for staff training. The modules also include reproducible participant handouts and trainer transparencies.

5. Staff Training Adaptations with age-specific outlines and information about grade-specific videos.

6. Book lists and resources for students, parents, teachers, and trainers, and grade-level samples of the program kits.

Committee for Children's (2002) Second Step Implementation Plan begins with dialogue on the importance of strong sponsorship from key decision makers. The plan details conditions that are necessary for the ultimate effectiveness of the program, which is defined as whether the curriculum was taught as intended. It continues with explanations of training, training models, the importance of classroom observations, and the involvement of non-classroom staff as additional support rather than in place of the classroom teacher. The Trainer's Manual further outlines the administrator's roles and responsibilities, from staff buy-in to evaluation and success celebration. Numerous process materials were developed for school staff to stay on target. Listed below are Second Step tools:

1. Overview Presentation
2. Teacher Follow-Up Survey
3. Trainer's Implementation Assessment
4. Lesson Observation Form
5. Implementation Planning Worksheet
6. Mid-Stream Implementation Checklist
7. Implementation Checklist
8. Lesson-Completion Record
9. Social-Emotional Learning Checklist
10. Student Satisfaction Survey
11. Teacher Follow-Up Survey

Second Step principal investigators believe "the single most important thing an administrator can do to ensure success is to promote consistent, quality implementation" (Committee for Children, 2002, p. 97). They recommend measuring the ongoing daily features of the program to provide a clear picture of how the curriculum actually looks. Examples of ways schools can monitor and document different aspects of effective implementation include the following:

1. Amount of program training to teachers and other staff
2. Number and frequency of lessons children receive
3. Recognition of student use of Second Step skills
4. Staff prompts of skill use outside of lessons
5. Visibility of the program, such as posters throughout the school
6. Outreach to parents

Research Questions

The focus of this study was on two of the factors associated with program installation and initial implementation of Second Step: A Violence Prevention Curriculum, which was chosen as the evidence-based violence prevention program for elementary schools in CCPS. Factors associated with program installation and initial implementation were identified and examined for differences between schools that self-identified as implementing school-wide vs. those that identified as partially implementing. This study's research questions were as follows:

1. To what extent, if any, are there differences in training on understanding purpose and expected outcomes, the curriculum, parent involvement, and being provided sufficient kits between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

2. To what extent, if any, are there differences in time allocation for learning the curriculum, shared planning time, classroom lessons, and review of lessons between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

3. What strategies do principals perceive to be effective in promoting implementation in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

4. What are the barriers and facilitators of implementation identified by teachers and counselors in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

5. To what extent, if any, are there differences in staff commitment to implementation, peer-to-peer support, adherence to the program model, and staff perception of positive student outcomes between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

This study examined the program installation and initial implementation stages of the implementation process of Second Step using Yin's case study method. Yin developed a number of case study designs; this study uses an explanatory approach. These stages are the second and third stages of the implementation process as identified by Fixsen et al. (2005). This will be accomplished by analyzing the evidence that confirms or denies four propositions and their indicators on the CCPS Second Step implementation.
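To make the proposition-testing logic concrete, the sketch below illustrates one way the rating and comparison could be computed: responses are coded as supporting a proposition, against it, or showing no evidence; per-school averages are then compared between matched school-wide (Level 1) and partial-implementation (Level 2) schools. All scores are hypothetical, and the paired t-test is shown only as a plausible choice for matched pairs; the study's actual instruments and analyses are described in the chapters that follow.

    # Hypothetical sketch of the proposition-testing logic.  Raters code
    # each interview or focus group response as +1 (supports the
    # proposition), -1 (against it), or 0 (no evidence); per-school means
    # are then compared between matched Level 1 and Level 2 schools.
    from statistics import mean
    from scipy.stats import ttest_rel

    # Illustrative rater codes for one proposition indicator
    level1 = {"Alto W": [1, 1, 0], "Bueno W": [1, 0, 0], "Dia W": [1, 1, 1]}
    level2 = {"Familia P": [0, 0, -1], "Manzano P": [1, 0, 0], "Campo P": [0, -1, 0]}

    level1_means = [mean(scores) for scores in level1.values()]
    level2_means = [mean(scores) for scores in level2.values()]

    # Paired comparison across the three matched school pairs
    t_stat, p_value = ttest_rel(level1_means, level2_means)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")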

Summary

Schools are the de facto health and mental health system (Burns et al., 1995). More recently, schools have been identified as the best place to provide prevention and intervention programming (Greenberg et al., 2005). Although school districts often do provide evidence-based practices and programs, they have been challenged by the implementation of these programs. Recent studies have provided research-based options for schools, information on implementation strategies, and a better understanding of the components of implementation. School districts are challenged with taking controlled prevention and intervention studies and integrating them into the real-world setting of schools. These challenges include meeting the requirements of NCLB, time constraints, and the plethora of problems students have before they even walk through the school doors. Additionally, there has been professional concern that not all prevention and intervention programs for schools were accurately reviewed, concern about the time lag between research and practice, and concern about whether the intervention programs are culturally relevant.

With competing urgencies in education, when schools provide prevention and intervention programming, it is important to study and monitor their implementation practices to get the best effect for their efforts. Fixsen et al.'s (2005) research found that, no matter the field, implementation strategies had to be thoughtful and effective in order to make the systemic changes the programs were designed to provide.

The task for the education field and the purpose of the present study is to understand the supportive and challenging conditions school districts face as they implement prevention and intervention programs in the real-world setting of schools.

Chapter Three

Methodology

Overview

The goals of this study were to provide a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how developers and researchers might facilitate the application of research to practice. The focus of this implementation study was on the exploration of the differences between schools identifying as implementing Second Step school-wide vs. those identifying as implementing in individual classes or grades in six CCPS elementary schools during the program installation and initial implementation process.

This study used a multi-method, multi-source retrospective study design (Yin, 1989, 1994). It examined conditions that, if present, the research indicates are associated with better implementation. This study tested specific theoretical propositions and also developed case descriptions as outlined by Yin (1989, 1994, 2003).

The second and third stages of the Implementation Theory framework proposed by Fixsen et al. (2005) influenced this study. Within the context of the second and third stages, (a) program installation and (b) initial implementation, this study examined the supportive and limiting conditions schools face as they implement evidence-based programs and effective factors associated with program installation and initial implementation strategies, as well as investigated when, how, and why schools adapt programs.

In this study, pseudonyms have been given to the state, county, city, school district, and individual schools to protect confidentiality. The state is referred to as Manzano, the county as Sandia County, and the city as Central City. The district is referred to as Central City Public Schools. The six schools in this study were named Alto W, Bueno W, Dia W, Familia P, Manzano P, and Campo P. Schools with a W are schools that self-identified as implementing Second Step school-wide, while schools with a P self-identified as implementing Second Step in individual classes or grades.

Second Step, an evidence-based program, is the particular program examined in this study. The program has been recognized by the SAMHSA National Registry of Evidence-based Programs and Practices (NREPP) (Schinke, Brounstein, & Gardner, 2002), as well as by the Prevention Research Center for the Promotion of Human Development at Penn State University (Greenberg, Domitrovich, & Bumbarger, 2000) and the Collaborative for Academic, Social, and Emotional Learning (CASEL, 2003). To support implementation fidelity, the developer of Second Step provided an implementation plan that includes factors important to fidelity, such as (1) how to engage a sponsor or key decision maker, (2) school-wide implementation practices, (3) guidelines on school-level administrator roles, and (4) responsibilities and implementation tools.

The research questions, the study design, the use of statistical peers, and the rationale for studying Central City Public Schools will be reviewed and discussed. The data collection and data analysis processes will be shared. This section will end with the limitations of the study and a summary.

Research Questions

The research questions explored in the present study focused on two of the six stages identified by Fixsen et al. (2005): program installation and initial implementation. Fixsen et al.'s (2005) literature review found evidence that a long-term, multi-level approach is important for successful implementation. There is evidence of related conditions associated with fidelity, such as practice-based practitioner selection, skill-based training, practice-based coaching, practitioner performance evaluation, program evaluation, facilitative administrative practices, and methods for systems interventions. Specifically, the study was designed to answer the following questions:

1. To what extent, if any, are there differences in training on understanding purpose and expected outcomes, the curriculum, parent involvement, and being provided sufficient kits between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

2. To what extent, if any, are there differences in time allocation for learning the curriculum, shared planning time, classroom lessons, and review of lessons between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

3. What strategies do principals perceive to be effective in promoting implementation in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

4. What are the barriers and facilitators of implementation identified by teachers and counselors in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

5. To what extent, if any, are there differences in staff commitment to implementation, peer-to-peer support, adherence to the program model, and staff perception of positive student outcomes between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grades?

Statistical Peers for Benchmarking

Statistical Peers for Benchmarking was created by CCPS as a data-driven strategy that uses statistical cluster analysis to identify groups of similar schools. Peer groups were formed based on schools' percentages of students in the Free and Reduced Price Meals program (FRPM), percent of English Language Learners (ELL), percent of under-performing minorities (i.e., Hispanic, Native American, African American), and percentage of students enrolled at the same school on days 40 and 180 of an academic year as a proxy for student stability. In 2005, there were five categories of elementary statistical peer groups, and each category represents a distinct set of comparison schools. For this study, the three cohorts are part of Groups 2, 3, and 4. Using the Statistical Peers for Benchmarking categories to match schools in this study helps to control for confounding variables, thus protecting internal validity. Schools were not matched on any variables based on principal or staff characteristics.
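The sketch below illustrates the kind of peer grouping this strategy describes. The district's exact clustering algorithm and data are not reported here, so the example uses k-means, one common form of cluster analysis, on hypothetical school covariates; only the four matching variables named above are assumed.

    # Illustrative peer grouping via cluster analysis.  K-means and all
    # numbers are hypothetical; CCPS's actual algorithm is not specified.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Rows = schools; columns = % FRPM, % ELL, % under-performing
    # minorities, % enrolled at the same school on days 40 and 180
    schools = ["A", "B", "C", "D", "E", "F"]
    X = np.array([
        [72.0, 31.0, 68.0, 91.0],
        [45.0, 12.0, 40.0, 95.0],
        [88.0, 40.0, 80.0, 85.0],
        [50.0, 15.0, 45.0, 94.0],
        [70.0, 28.0, 65.0, 90.0],
        [90.0, 45.0, 82.0, 84.0],
    ])

    # Standardize so each covariate contributes comparably; three groups
    # for this toy sample (CCPS formed five elementary peer groups)
    X_std = StandardScaler().fit_transform(X)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)
    for school, label in zip(schools, labels):
        print(school, "-> peer group", label)

Matching a Level 1 school to a Level 2 school within the same cluster is what lets the design treat demographic differences between paired schools as roughly held constant.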

Why Study Central City Public Schools?

Central City Public Schools (CCPS) serves over 89,000 students, approximately one-third of the students in the state of Manzano. The district is located in Sandia County, which has a population of over 500,000. This district is one of the 50 largest in the nation and reflects much of the cultural diversity of the area. Over 67% of students come from minority backgrounds, making CCPS a majority-minority district (CCPS, 2008). A snapshot of the ethnic composition of the CCPS student body is provided in Figure 1.

Figure 1. Profile of CCPS students: Hispanic, 55%; Anglo, 33%; Asian, 3%; African American, 4%; Native American, 5%.

CCPS students are economically diverse. According to the latest (2005) U.S. Census estimates, over 17% of children in Sandia County are living in poverty. This average, while slightly higher than the national rate of 16%, masks more extreme poverty that exists within the district when one considers that 40% of all students in the district qualify for Free and Reduced Price Lunch (FRPL), a common indicator of poverty. In 35 of 84 elementary schools (42%), the FRPL rate is over 70%; in 16 schools (19%), the rate is over 90%.

52 rate is over 90%. These pockets of povert y contribute to Manzano having the third highest poverty rate in the country. As the largest urban community in Manza no, Central City also has the most acts of violence in the state. Manzanos youth risk factors have increased compared to other states since 2004, creating the Ce nters for Disease Control (2 004) ranking of Manzano as last among the 50 states for quality social h ealth. The states combined score of 21.4 out of a possible 100 points reflects poor results in 16 social indicators, including infant mortality, child abuse, percent of children in poverty, teen drug abuse, high school completion, homicides, and alcohol-related traffic deaths. Among the 50 states, Manzano has the highest combined rate of all violen t deaths, including ho micides and suicides (Centers for Disease Control, 2004). In addition, high rates of youth violence ar e reported on the Manzano Youth Risk and Resiliency Survey (Manzano Department of Health, 2005), a nd the Central City Police Department estimated that at least 90 gangs with 7,000 memb ers, operate in the city. Sandia County has the grea test number of referrals to Juvenile Probation and Parole (JPPO) of any county in Manzano. The 9,774 re ferrals in FY 2001 and 8,200 referrals in FY 2002 represented 33% and 30% of the stat ewide totals, respectiv ely. Youth from the county also accounted for 162 commitments to juvenile facilities in FY 2002, which was 34% of the statewide figure (Manzan o Children, Youth, and Family, 2002). Furthermore, youth in Manzano have alarming rates of depression and other mental health issues and diagnoses, and s ubstance use is widespread throughout Central City. Mental health diseases are among th e top five hospital discharge diagnoses in Sandia County (Sandia Count y Health Council, 2002).


The risk factors and supporting data representative of the student body in the Central City Public School District are represented in Table 6. The data indicate a compelling need for prevention and intervention programming to mitigate the plethora of risk factors associated with CCPS students.

Table 6
Central City Public Schools Student Risk Factors

Risk factor: Teen death (homicide/suicide/accidents).
Manzano, Sandia County, and CCPS data: 6th highest among all 50 states in 2005.
Comparison: The state's 2001 violent death rate of 27.9 per 100,000 is 56% higher than the U.S. rate of 17.9.
Data source: KIDS COUNT Data Center (2007); CDC (2007).

Risk factor: CCPS high school violence.
Manzano, Sandia County, and CCPS data: 21% of boys and 13% of girls in physical fights on school property; 25.2% possession of a weapon; 9.8% weapon on school property; 8.5% skipped school because they felt unsafe.
Comparison: National figures are substantially lower at 17% for possession of a weapon and 6.4% for possessing a weapon on school property.
Data source: CDC (2007).

Risk factor: CCPS middle and high school violence.
Manzano, Sandia County, and CCPS data: 33% threatened physical harm to someone; 41% hit someone; 31% victim of physical violence; 28% trouble with police; 24% committed vandalism; 29% fear getting hurt by someone at school.
Data source: CCPS RDA Developmental Assets (2005).


Table 6 (continued)

Risk factor: CCPS middle and high school substance abuse.
Manzano, Sandia County, and CCPS data: 47% had at least one drink of alcohol within a 30-day period; 31% binge drink; 31% use marijuana.
Comparison: The Manzano rate of youth dependence on alcohol and drugs (6.5%) is second only to Alaska and markedly higher than the national rate of 4.8%.
Data source: CCPS RDA Developmental Assets (2005); Technical Assistance Collaborative (2002).

Risk factor: Mental health issues.
Manzano, Sandia County, and CCPS data: 79% of CCPS mid-high school students feel sad or depressed; 32.2% report persistently feeling sad and hopeless; 15.6% say they made a suicide plan and 15% attempted suicide in the previous 12 months; 55.7 suicide deaths per 100,000 for 1998-2000 in Sandia County.
Comparison: Suicide deaths in Manzano are more than four times the nationwide average of 10.7.
Data source: Sandia County Health Profile (2002); CCPS RDA Developmental Assets (2005); Technical Assistance Collaborative (2002).

Study Design

This study used a multi-method, multi-source retrospective case study design (Yin, 1989, 1994). Case study is the preferred strategy for answering "how" and "why" questions such as those posed in this study. Case studies also are advantageous when the investigator has little control over events and the focus is on a real-life context, as in the present circumstances (Yin, 2003). This study tested specific theoretical propositions and also developed case descriptions, as outlined by Yin (2003). Testing theoretical propositions or rival explanations may be preferable to developing case descriptions; however, a case description is appropriate when it can help to identify appropriate causal links to be analyzed or when studying the complexity of implementing a program. For example, when Oakland, California, studied a local public program, city workers found that describing the complexity in terms of the multiple conditions that had to occur for implementation to succeed allowed them to identify (a) an embedded unit of analysis and (b) an overall pattern of complexity that was used in a causal sense to explain why implementation failed (Yin, 2003).

Pilot. Before starting the present study, a pilot study was conducted at two elementary schools. The two schools began using Second Step in 2007 and could not be in the primary study. The purpose of the pilot was to field test the interview and focus group questions developed to elicit responses about the support for or opposition to each proposition. The pilot had a design similar to the current study in that the two schools were matched based on CCPS's statistical peer groupings, both began training during the 2007 school year, and one school identified Second Step implementation as school-wide while the other identified partial implementation of the program.

Study Propositions. The work of researchers including Fixsen et al. (2005), who focus on implementation in a variety of settings, as well as Greenberg, Domitrovich, Graczyk and Zins (2005) and Weiss, Murphy-Graham, Petrosino and Gandhi (2008), who focus on implementation of prevention and intervention programs in schools, guided the development of the following propositions by providing an understanding of factors associated with program installation and initial implementation and the importance of adherence to the developer's implementation process. The propositions focused on four areas: training and resources, time, implementation level, and champion. The propositions are described in Table 7. Four indicators were associated with each proposition, for a total of 16 indicators.

The four propositions were based on the knowledge gained from the literature review, focusing on factors associated with program installation and initial implementation. The indicators and their associated interview questions were developed to elicit responses that would provide evidence either in support of or against the proposition. Interview and focus group questions were designed to engage the participants in accurately describing the factors associated with program installation and initial implementation.

Table 7
Area and Propositions Describing Early Stage Implementation

Area: Training & Resources. Research Question: 1.
Proposition A: Schools that received training in Second Step prior to implementation of the program were provided with implementation tools and support necessary to implement the program.

Area: Time. Research Question: 2.
Proposition B: When implementing the program, time was allocated for school staff to learn the program components, as well as sufficient time to deliver the program to students.


Table 7 (continued)

Area: Implementation Level. Research Questions: 4 and 5.
Proposition C: If Second Step was implemented school-wide, there was more staff commitment to implementation, more peer-to-peer support, and more adherence to the program model. Staff was more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

Area: Champion. Research Question: 3.
Proposition D: When a school had a designated champion for Second Step, teachers and/or counselors were more likely to implement the program than when there was no champion present.

School Selection Procedures. Eight elementary schools were selected for this retrospective study. Two of the schools were part of the pilot. The criteria for site selection of the six schools in the final study included: (a) first implemented in 2005, and (b) either self-identified as whole-school Second Step adoption (Level 1) or self-identified as individual classes or grades Second Step adoption (Level 2) (Figure 2). The year 2005 was chosen for the study because it had the largest cohort of schools that trained during the same time period, thus increasing the pool of potential school participants. Fourteen elementary schools that met the criteria were identified for this study of Second Step implementation by reviewing the CCPS Professional Development database, Second Step district files, and 2008 Counselor Survey documents before the final matching of the six schools.

Principals of schools who trained and met the criteria were contacted by phone and provided a brief introduction to the study. If they showed interest or agreed on-the-spot, a follow-up email (Appendix A) was sent with an informational sheet explaining their role in the study. The Level 1 schools in the study were matched to Level 2 schools with the Statistical Peer for Benchmarking tool designed by CCPS staff (Dunavin, 2005). Statistical peer grouping was used to help support the internal validity of the study by controlling for external variables that might affect factors associated with program installation and initial implementation differentially between the schools.


Figure 2. School selection procedures. Elementary schools implementing Second Step (SS) were identified for possible participation by reviewing the Professional Development database, Second Step district files, and 2008 Counselor Survey documents. Schools that had not implemented SS, or that trained in a year other than 2005, were dropped as not qualifying for the study. Schools that received SS training in 2005 were sorted into those that self-identified as implementing at the whole school level (Level 1) and those that self-identified as implementing in individual classrooms or grade levels (Level 2). Three schools that trained in SS in 2005 at the whole school level (Level 1) were each matched by statistical peer group to a school that implemented SS in 2005 in individual classrooms or grades (Level 2), for a total of six schools in the study.

School, Principal and Staff Selection. Follow-up calls were made to the principals to assure that they received the email and to answer any questions. A meeting or phone conference was set up with principals whose schools met criteria to be in the selection pool and agreed to participate. The meeting focused on (a) more specifics of the study, (b) input on the appropriate staff to interview within the school, and (c) permission to contact identified staff. Schools were matched with their statistical peers.

Figure 3. School, principal and staff selection.


Statistical Peer Grouping. The final schools selected were matched by statistical peer groups, a comparison protocol designed by Dunavin (2005). Using the comparison protocol, schools that self-identified as implementing Second Step using a whole school approach (Level 1) were matched to schools that self-identified as implementing Second Step in only individual classes or grades (Level 2) within the same statistical peer group. Using the Statistical Peers for Benchmarking (Dunavin, 2005) categories to match schools in this study helped to control for confounding variables. This tool provides information on which schools are most alike in terms of student characteristics. It allows already established comparison schools to be easily identified and used for data analysis (Table 8).

Selected Schools

The schools that met all criteria and participated in the study were Alto W (Level 1) and Familia P (Level 2) in Statistical Peer Group 2; Bueno W (Level 1) and Especial P (Level 2) in Statistical Peer Group 3; and Dia W (Level 1) and Campo P (Level 2) in Statistical Peer Group 4.

Table 8
School Association with Statistical Peers and Level

Group 2: Alto W (Level 1); Familia P (Level 2)
Group 3: Bueno W (Level 1); Especial P (Level 2)
Group 4: Dia W (Level 1); Campo P (Level 2)
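The exact clustering routine behind the Statistical Peers for Benchmarking tool is not documented in this study. The sketch below is a minimal illustration of the general approach, assuming k-means clustering on the standardized school-level indicators named earlier; the three-cluster choice and the use of k-means are assumptions for this small example (the district's analysis produced five groups across all of its elementary schools), and the indicator values are those reported for the six study schools in Table 9 below.

```python
# A minimal sketch of statistical peer grouping via cluster analysis.
# k-means on standardized indicators is assumed for illustration; the
# district's actual algorithm may differ.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

schools = ["Alto W", "Familia P", "Bueno W", "Especial P", "Dia W", "Campo P"]
# Columns: % FRL, % ELL, % UPE, % stability (2004-05 values from Table 9)
X = np.array([
    [90, 23, 92, 66],
    [73, 33, 82, 67],
    [72, 24, 81, 76],
    [61, 17, 57, 56],
    [46,  6, 53, 78],
    [54,  8, 41, 67],
], dtype=float)

# Standardize so no single indicator dominates the distance metric.
Xz = StandardScaler().fit_transform(X)

# Assign each school to one of three peer clusters.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xz)
for school, label in zip(schools, labels):
    print(f"{school}: peer cluster {label}")
```

Because cluster labels depend only on the standardized indicator profiles, schools grouped together can then serve as ready-made comparison schools, which is the function the Dunavin (2005) protocol provided here.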


Schools Alto W and Familia P. Alto W and Familia P were part of the cohort of schools that made up Statistical Group 2 (Table 9). This grouping of schools was in the mid to high range of economic need, with stability of students within a school year at more than 61%. Ninety percent of the students at Alto W and 73% of students at Familia P qualified for free or reduced meals. The stability rate for Alto W was 66% and Familia P's was 67%. Stability rate refers to the percent of students enrolled at the same school on days 40 and 180 of an academic year. Under-performing Hispanic, Native American, and African American students constituted 92% of Alto W's students and 82% of Familia P's students. One quarter to one third of the students at these schools were English Language Learners. Proficiency rates in math for Alto W and Familia P were 34% and 24%, respectively. Proficiency rates for reading were higher than rates for mathematics at the two schools, with Alto W at 47% and Familia P at 41%.

Schools Bueno W and Especial P. Bueno W and Especial P were part of the cohort of schools that made up Statistical Group 3 (Table 9). This grouping of schools was in the low to mid range of economic need, with stability of students within a school year at more than 56%. Seventy-two percent of the students at Bueno W and 61% of students at Especial P qualified for free or reduced meals. Under-performing Hispanic, Native American, and African American students constituted 81% of Bueno W's students and 57% of Especial P's students. Twenty-four percent of Bueno W's population were English Language Learners, while 17% of Especial P's population were English Language Learners. Proficiency rates in mathematics for Bueno W and Especial P were 24% and 30%, respectively. Proficiency rates for reading were higher than those for Alto W and Familia P, at 53% (Bueno W) and 56% (Especial P).


Schools Dia W and Campo P. Dia W and Campo P were part of the cohort of schools that made up Statistical Group 4 (Table 9). This grouping of schools was in the mid to high range of economic need, with stability of students within a school year at more than 67%. More specifically, 46% of the students at Dia W and 54% of students at Campo P qualified for free or reduced meals. The stability rate for Dia W was 78% and for Campo P it was 67%. Under-performing Hispanic, Native American, and African American students constituted 53% of Dia W's students and 41% of Campo P's students. Six percent of Dia W's population were English Language Learners, while 8% of Campo P's population were English Language Learners. Proficiency rates in mathematics for Dia W and Campo P were 41% and 48%, respectively. Proficiency rates for reading were 63% (Dia W) and 65% (Campo P). Proficiency rates for math and reading were higher at these two schools than at the previous four schools.

Table 9
Statistical Peers Comparisons

Columns: AYP SBA 03-04; AYP SBA 04-05; % FRL 04-05; % ELL 04-05; % UPE 04-05; % Stability 04-05; % Prof Math SBA 04-05; % Prof Read SBA 04-05.

Statistical Peers Group 2
Alto W: Met; Met; 90; 23; 92; 66; 34; 47
Familia P: Not Met; Not Met; 73; 33; 82; 67; 24; 41


Table 9 (continued; columns as above)

Statistical Peers Group 3
Bueno W: Met; Not Met; 72; 24; 81; 76; 24; 53
Especial P: No data; Met; 61; 17; 57; 56; 30; 56

Statistical Peers Group 4
Dia W: Met; Met; 46; 6; 53; 78; 41; 63
Campo P: Met; Not Met; 54; 8; 41; 67; 48; 65

Note. AYP = Adequate Yearly Progress, SBA = Standards-based Assessment, FRL = Free/Reduced Lunch, ELL = English Language Learners, UPE = Underperforming Ethnicities.

As a whole, the ethnicity breakdown of the district was African American at 3.9%, Asian Pacific at 2.5%, Caucasian at 31.3%, and Native American at 5.0%. Hispanics were the majority minority of the Central City School District at 54.3% of the total population. Table 10 shows the ethnic breakdown of the six schools in the study.

Table 10
2005 Student Demographics

Columns: Enrollment; African American; Asian/Pacific; Caucasian; Hispanic; Native American.
Alto W: 296; 3.0%; 0.0%; 7.1%; 85.8%; 4.1%
Familia P: 559; 1.1%; 0.4%; 14.3%; 83.0%; 1.3%


Table 10 (continued; columns as above)

Bueno W: 377; 2.1%; 0.5%; 15.6%; 77.7%; 4.0%
Especial P: 600; 6.5%; 7.8%; 30.8%; 45.2%; 9.7%
Dia W: 445; 4.3%; 4.7%; 47.4%; 35.5%; 8.1%
Campo P: 392; 6.4%; 6.4%; 52.0%; 30.6%; 4.6%

Each school was a single case, but the study as a whole covered six schools matched by statistical peer groupings and thus qualified as a multiple-case study. School selection maximized matching across the five categories of statistical peer membership in order to examine the relationship between category of statistical peer membership and factors associated with program installation and initial implementation. In this way, the study could provide insight into outcomes that could be linked to factors associated with program installation and initial implementation and not influenced by demographic differences. With this information, developers, researchers, and school staff might gain knowledge as to what is working as more evidence-based programs are introduced to schools, what is challenging for the districts, and what researchers might do to support an easier transition from research to practice.

Data Collection

This case study used a variety of data collection methods, including interviews, focus groups, an implementation checklist, and document reviews. Scripted documents such as budgets, meeting agendas, and minutes were requested and, when available, reviewed to gain information about program plans, staffing, activity levels, and other program characteristics. Semi-structured interviews of principals, counselors, and teachers solicited descriptive information about factors associated with program installation and initial implementation and about perceived supportive and constraining factors. Focus group questions were organized around topics that emerged from the individual interviews. Merton, Fiske and Kendall (1990) suggest that although conversational in nature and open-ended, the interviewing done in focus groups may be used to corroborate certain facts that the researcher thinks have been established.

Schools. A group of six paired CCPS elementary schools that received Second Step training in 2005 was selected for participation. Schools either identified as adopting Second Step school-wide or in individual classes or grades. Schools were matched with their statistical peers based on the protocol established by Dunavin (2005) to control for external variables that might affect factors associated with program installation and initial implementation differently between matched schools. Three matched pairs were established, representing the middle three of the five district-identified statistical peer groupings. The paired schools were Alto W/Familia P, Bueno W/Especial P, and Dia W/Campo P.

Participants. The principal shared the information sheet about the study and asked for volunteers for the interviews and focus groups. Three staff (the principal, a counselor, and a key informant) were invited to be interviewed at each of the six identified schools (Table 11). Interviews were about 45 minutes in length. Notes were taken, and the interviews were recorded on an iPod and later transcribed. Interview and focus group times were negotiated between the Research Team and the individual interviewees and/or focus group participants. There was no compensation provided to participants.

Five principals participated in the interviews. Especial P's principal did not participate, stating that she was not in the school in 2005 and had not observed Second Step at the school, although she did know some staff implemented the program. Attempts to find the former principal were not successful. Campo P's former principal contributed to the knowledge base on the 2005 implementation of Second Step at Campo P, but was not included in the demographics. Six counselors and six key informants were interviewed, for a total of 18 completed interviews. The Principal Protocol is contained in Appendix B. The Counselor, Teacher, and Key Informant Protocol is contained in Appendix C.

Focus groups were held at five schools. One school principal was not successful in recruiting staff to participate in a focus group. Overall, it was challenging to recruit focus group participants. It was anticipated there would be 24 participants, but only 12 agreed to be interviewed. Focus group participants included teachers, counselors, social workers, and educational assistants. These staff were not part of the individual interviews and were identified by the principal as knowledgeable about programs and activities in the school. The focus groups were recorded and transcribed. The Focus Group Protocol is contained in Appendix D.

All participants except for one principal (a total of 29) completed an implementation checklist rating their perception of implementing the various components of the Second Step program. The checklist, adapted from the work of the developers of Second Step, is contained in Appendix E. Additionally, school documents that related to training, program planning, design, administration, and other information (Werner, 2004) were reviewed.

Participants, including principals, teachers, counselors, and social workers from the six elementary schools, all rated factors associated with program installation and initial implementation as important (Table 11). There was a large range of experience amongst the principals. Total educational experience ranged from 10 years to 37 years, while administrative experience ranged from 4 years to 30 years. Three of the principals had administrative experience only in their present school: Bueno W's principal with 13 years, Dia W's principal with 12 years, and Familia P's principal with 5 years. Alto W's principal had 19 years of administrative experience, with 17 at Alto W. Campo P's principal was by far the most experienced, with 30 years of administrative experience in his 37 years in education. He was the principal who had been at his present school for only 3 months. He was familiar with Second Step because his previous school had implemented the program.

On average, the six counselors had been at their school for 7.8 years, with a range of 5 to 14 years. Counselor experience ranged from 5 years to 28 years. Key informants included teachers and one social worker. The social worker had 24 years of educational experience, with 8 years at Dia W. Although preschool was in several of the schools, only one preschool teacher participated in the interviews or focus groups. She had the most educational experience with 32 years, 9.5 of them as a special education preschool teacher. All focus group participants were Pre-K to 2nd grade teachers. Their range of experience was 1.5 years to 33 years.


The three schools that implemented Second Step school-wide were led by the principals with the most experience in education, who also had been administrators in the same school for the longest period of time.

Table 11
Participant Demographics & Implementation

Columns: Yrs in Educ. (Level 1); Yrs in Educ. (Level 2); Yrs in Current School (Level 1); Yrs in Current School (Level 2); Imp. (Level 1); Imp. (Level 2).

Principal (N = 5)
33 (a); 10 (f); 17 (a); 5 (f); 4.5 (a); 5 (f)
25 (b); 21 (e); 18 (b); 3 (e); 5 (b); 4 (e)
24 (d); 37 (c); 12 (d); 0.4 (c); 4.5 (d); 5 (c)
Avg. (Range): 25.8 (10-37); 12.0 (0.4-18); 4.7 (4-5)

Counselor (N = 6)
28 (a); 8 (f); 5 (a); 8 (f); 5 (a); 5 (f)
8 (b); 25 (e); 8 (b); 5 (e); 5 (b); 4 (e)
17 (d); 10 (c); 14 (d); 10 (c); 5 (d); 5 (c)
Avg. (Range): 15.5 (5-28); 7.8 (5-14); 4.8 (4-5)

Key Informant (N = 6)
7 (a); 5 (f); 3 (a); 5 (f); 5 (a); 5 (f)
9 (b); 16 (e); 2 (b); 4 (e); 5 (b); 4 (e)
24 (d); 32 (c); 8 (d); 9.5 (c); 5 (d); 5 (c)
Avg. (Range): 16.3 (7-32); 4.8 (2-9.5); 4.7 (4-5)


Table 11 (continued; columns as above)

Focus Groups (N = 12)
4-28 (a); 1.5-8 (f); 4-13 (a); 1.5-5 (f); 5 (a); 4.5 (f)
25-30 (b); 33 (e); 5-25 (b); 8 (e); 5 (b); 5 (e)
24 (d); 9-20 (c); 8 (d); 7-18 (c); 5 (d); 5 (c)
Avg. (Range): 18 (1.5-33); 8 (1.5-18); 4.8 (4.5-5)

Note. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P. Interviewees were asked to rate the importance of prevention programming (Imp.) on a scale from 1 (not important) to 5 (very important). The former principal of Campo P was interviewed but not included in the demographics. Especial P's principal declined to be interviewed, but her demographic data is included. A dash indicates missing data.

Research Team. Four individuals were recruited to be on the Research Team for the study. They provided support with interviewing and focus groups as well as with rating propositions. All of them completed the appropriate IRB requirements. One rater was a semi-retired professor who conducts local school district evaluations, two had been involved with previous state studies, and the fourth works with data and evaluation as part of her job. Each team member received a handbook with directions on how to conduct interviews and focus groups, a scripted statement to read before the interviews and focus groups, directions on how to rate the responses, and all protocols.


Protocols. Interview protocols were established with a series of questions associated with each proposition indicator. Most questions were open-ended to solicit richer, more in-depth responses. Each question was designed to be answered by three individuals: the principal, the counselor, and a key informant. At one school, only two individuals volunteered for the interview. An additional interview was conducted because the present principal had been at the school for only 3 months. Participants were asked to use a Likert-type scale to indicate their belief about the importance of prevention and intervention programs in the schools.

There were also Focus Group protocols that provided more information for the study from the focus group participants. Similar to the interview protocol, most questions were open-ended. In the focus groups, however, participants were able to complement each other's responses, providing more in-depth answers.

Data Analysis

To help establish the construct validity and reliability of the case study, evidence was examined through (a) use of multiple sources of evidence, (b) creation of a case study database, and (c) maintenance of a chain of evidence (Yin, 2003). This study utilized a multi-method, multi-source approach with adherence to these principles in order to increase its quality.

Four sources of information were analyzed in this study to respond to the research questions and evaluate the proposed propositions and their indicators: (a) interviews, (b) focus groups, (c) implementation checklists, and (d) document review.


Interview Proposition Ratings. The transcripts from the interviews were evaluated by three trained raters of the Research Team to determine the support for each proposition. The team members were asked to read the responses from the six schools and judge the fidelity of the responses to the proposition indicators associated with Second Step. Each rater was given a copy of the transcripts from each school, without school identification, as well as a rating form (Figure 4) for objectively rating the level of support for or against each proposition indicator. Each school was assigned ratings on factors associated with program installation and initial implementation that were used to provide a "score" for the associated propositions.

The rating form was designed so that each school had its own form with the propositions, indicators, and a grid. Raters were to judge whether the data provided were supportive of or against the statement. Proposition indicators were designed to isolate factors associated with program installation and initial implementation of the evidence-based program. Raters were provided the transcripts of the interviews and focus groups. The raters were asked to score the responses to the 16 proposition indicators on a rating range of +3, strongly in support of the proposition indicator, to -3, strongly against the proposition indicator. If there was no evidence for or against, they were to mark zero. The scores were averaged across the participants (principal, counselor, and key informant) to get an average score for each indicator. This score was then summed across the raters to get a single total score of the raters' evaluation of support for, against, or no evidence on the individual indicators (Appendix F).
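As a concrete illustration of this aggregation, the sketch below averages each rater's -3 to +3 ratings over the three interviewees and then sums the per-rater averages to produce one aggregate score per indicator. The array shape mirrors the design (three raters, three interviewees, 16 indicators), but the rating values themselves are randomly generated placeholders, not study data.

```python
# A minimal sketch of the rating aggregation described above;
# the ratings array is hypothetical, not study data.
import numpy as np

rng = np.random.default_rng(0)
# Shape: (raters, interviewees, indicators) = (3, 3, 16), with integer
# ratings from -3 (strongly against) to +3 (strongly in support).
ratings = rng.integers(-3, 4, size=(3, 3, 16))

per_rater_avg = ratings.mean(axis=1)       # average across the 3 interviewees
school_scores = per_rater_avg.sum(axis=0)  # sum across the 3 raters

print(school_scores)  # one aggregate score per proposition indicator
```

Summing rather than averaging across raters only rescales the final score; the relative ordering of the indicators is unchanged.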


School being rated: ______________________ Rater: ____________

Proposition A (Training): Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program.

INSTRUCTIONS: Rate the following parts of the proposition. Please circle your response. If the data support, or are against, the statement, rate the evidence as strong, moderate, or mild by circling +3, +2, +1, -3, -2, or -1. If the data provide no evidence about the statement, circle 0. (Mark the "No evidence" option only if there was NO evidence in the data.)

Parts of Proposition (Indicators), each rated Strong +3 / Moderate +2 / Mild +1 / Strong -3 / Moderate -2 / Mild -1 / No evidence 0:
1. There is evidence that school staff received training on the purpose and expected outcomes of providing Second Step in the schools.
2. There is evidence that school staff received training on the Second Step curriculum.
3. There is evidence that the school staff received training on how to involve families in Second Step.
4. There is evidence that school staff received an adequate number of curriculum kits for appropriate implementation of Second Step.

Figure 4. Rating Scale.

Interrater Reliability of Ratings. Intraclass correlation coefficients (ICC) were calculated to determine reliability of the interview and focus group ratings of the proposition indicators, as well as of the propositions as a whole, as summarized by Fleiss (1986) (Table 12). Interrater reliability was estimated using SPSS 15.0 to calculate the ICC values from a two-way random consistency model as described by McGraw and Wong (1996). This procedure is similar to one developed in a case study analysis examining school reform (Duchnowski, Kutash, & Oliveria, 2004). By convention, an ICC > .70 is considered acceptable interrater reliability, but this depends highly on the researcher's purpose. Another rule of thumb is that an ICC from .41 to .60 indicates moderate interrater reliability, .61 to .80 substantial, and .81 and greater outstanding (Landis & Koch, 1977).

Proposition indicators were examined to gain a deeper understanding of which conditions identified in the propositions contributed to the results of the intraclass correlation of the propositions. Following common precedent, there was weak interrater reliability for the proposition indicators on whether staff received training on the purpose and expected outcomes (A1) and whether the designated Champion articulated the Second Step program (D2), for both the interview and focus group responses. Responses on whether staff received training on the curriculum (A2), shared time to work together for appropriate implementation (B2), and whether higher levels of fidelity were associated with the presence of a Champion (D4) also showed weak interrater reliability for the interview responses. Focus group responses for C3 had the weakest reliability score (0.00). This proposition indicator reads, "There is evidence that staff that used the Second Step implementation tools were more likely to be at a school that implemented Second Step school-wide."

Moderate interrater reliability was achieved in interview responses for proposition indicators B4 (.44) and C1 (.42), on whether staff received shared time to review successes and concerns about implementation and whether there was evidence that the school delivered Second Step school-wide. Focus group responses for proposition indicators B2, B3, B4, C1, C2, and D4 also had moderate interrater reliability.

Eight of the 16 interview proposition indicators ranged from .63 to .80, indicating substantial interrater reliability. Those indicators included: (a) adequate number of curriculum kits (A4); (b) staff received sufficient time to review the program (B1); (c) specific blocks of time were allocated for staff to implement the program (B3); (d) more time for peer-to-peer support when Second Step was implemented school-wide (C2); (e) more likely to use implementation tools when Second Step was implemented school-wide (C3); (f) staff attributed positive outcomes to Second Step (C4); (g) evidence for a designated champion (D1); and (h) the Champion directly ensured the allocation of time and resources to support the program (D3). Two of the focus group responses to the proposition indicators had substantial interrater reliability: A2 and A3, staff receiving training in general and, more specifically, on family involvement.

Two of the interview proposition indicators had outstanding interrater reliability: (a) there is evidence that the school staff received training on how to involve families in Second Step (A3); and (b) there is evidence that staff attributed positive outcomes to Second Step (C4). Five of the focus group proposition indicators ranged from 0.81 to 1.00. The five with outstanding interrater reliability were (a) there is evidence that school staff received an adequate number of kits (A4); (b) there is evidence that school staff received sufficient time to review the Second Step program (B1); (c) there is evidence that staff attributed positive outcomes to Second Step (C4); (d) there is evidence that there was a designated Champion (D1); and (e) there is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program (D3).
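The ICC values in Table 12 were produced with SPSS 15.0. Purely as an illustration of the underlying model, the sketch below computes McGraw and Wong's (1996) two-way, consistency ICCs directly from the ANOVA mean squares; the score matrix is hypothetical (rows are rated schools, columns are the three raters), not study data.

```python
# A minimal sketch of the two-way consistency ICC (McGraw & Wong, 1996)
# used for interrater reliability. Input matrix: n targets x k raters.
import numpy as np

def icc_consistency(x):
    """Return ICC(C,1) and ICC(C,k) for an n x k score matrix."""
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between targets
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols # residual
    ms_r = ss_rows / (n - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    icc_c1 = (ms_r - ms_e) / (ms_r + (k - 1) * ms_e)      # single rater
    icc_ck = (ms_r - ms_e) / ms_r                          # mean of k raters
    return icc_c1, icc_ck

# Hypothetical per-school scores on one indicator from three raters.
scores = np.array([[ 2.0,  1.7,  2.3],
                   [ 0.3,  0.7,  0.0],
                   [-1.0, -0.7, -1.3],
                   [ 1.0,  1.3,  0.7],
                   [ 2.7,  2.3,  3.0],
                   [ 0.0,  0.3, -0.3]])
print(icc_consistency(scores))
```

Because the consistency form removes the between-rater mean square from the error term, systematic leniency or severity differences among raters do not lower the coefficient; only disagreement about the ordering of the schools does.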


Table 12
Intraclass Correlations of Proposition Indicators and Proposition Aspect Scores

Columns: Interviews ICC; Focus Groups ICC.
A1: 0.28; 0.33
A2: 0.35; 0.80 (s)
A3: 0.89 (o); 0.80 (s)
A4: 0.63 (s); 0.95 (o)
B1: 0.72 (s); 0.99 (o)
B2: 0.33; 0.51 (m)
B3: 0.65 (s); 0.50 (m)
B4: 0.44 (m); 0.53 (m)
C1: 0.42 (m); 0.43 (m)
C2: 0.72 (s); 0.42 (m)
C3: 0.79 (s); 0.00
C4: 0.85 (o); 0.81 (o)
D1: 0.78 (s); 1.00 (o)
D2: 0.04; 0.33
D3: 0.67 (s); 0.94 (o)
D4: 0.16; 0.50 (m)

Note. m = moderate interrater reliability; s = substantial interrater reliability; o = outstanding interrater reliability.

Interviews and Focus Groups. The differences between ratings for the paired schools were calculated to compare paired schools implementing school-wide (Level 1) to the schools implementing in individual classes or grades (Level 2). A paired t-test was conducted on the average ratings of the proposition indicators. The t-test was completed on the average scores of propositions and proposition indicators to answer the question, "Is there a difference between paired Level 1 and Level 2 schools on interview propositions and their indicators?"

Focus groups were held at 5 of the 6 schools. The Focus Group Protocols were examined by the Research Team and, similar to the Interview Protocols, were scanned for emerging themes that would support the interviews, checklist results, and artifacts.

Checklist. The checklist was designed to help understand each implementation component's relative level of ease or difficulty of implementation (Appendix E). Twenty-nine participants completed a checklist of the steps of Second Step. They rated each step on a scale of 1 to 5 in terms of how easy or difficult the implementation component was to implement, with 1 the easiest and 5 the most difficult. They could also respond "Don't Know."

The differences between ratings for the paired schools were calculated to compare paired schools implementing school-wide (Level 1) to the schools implementing in individual classes or grades (Level 2), and a t-test was completed on the checklist results. All participants were provided a checklist with a Likert scale of 1 to 5, with 1 the easiest and 5 the most challenging, to rate the ease of implementing the conditions identified by the program developer as leading to successful implementation.

Document Review. Requests were made to all participants for any documents that would provide evidence of the factors associated with the program installation and initial implementation process. Related school documents were visually scanned for any other supporting information. Participants did not have documents available as far back as 2005. Some shared more recent documents as examples.


The information from the four sources (interviews, focus groups, checklist, and document review) was analyzed to confirm or disconfirm the proposed propositions and subsequently answer the research questions proposed for the study, in conjunction with the data within the context of the interview and focus group responses.

Confidentiality

All information was kept in a locked file cabinet and on a secured, password-protected computer. Interviews and focus groups were audio recorded. Only the principal investigator and the research team associated with the study had access to the recordings and their printed versions. The names of the state, county, school district, and schools were changed to protect confidentiality.

All participants were given information about the study prior to participation, and written documentation of informed consent was obtained. Procedures to obtain consent were approved by the University of South Florida Institutional Review Board (IRB) and by the CCPS Research, Accountability, & Development Department Review Board. Confidentiality was maintained throughout the study.

Study Limitations

Second Step was originally selected by staff within the district who oversaw substance abuse and violence prevention and intervention programs. They were assigned to find an evidence-based violence prevention program that was aligned to National Education Standards and whose developer would be willing to work with the district to align the work to Manzano Education Standards. There could be a perception of conflict of interest in this study because the principal investigator has a favorable bias toward Second Step. The principal investigator had final approval on the selection of Second Step and has continued to support its adoption in the district. The principal investigator guarded against this bias and conflict of interest by having interviews and focus groups conducted, when available, by staff not connected directly with the district, and by interviewing only staff not connected directly to evaluation by the CCPS Health and Wellness Department.

Some interviewees may have been uncomfortable with a district-level administrator requesting information from them about the work done in the schools, even though the principal investigator had no evaluation or principal authority over them. The principal investigator guarded against perceived coercion and conflict of interest by ensuring participants' confidentiality and by assuring them that their responses to the questions would not reflect on any performance evaluation.

The scope of this study was limited to school staff. The principal investigator did not request information from parents or students because this was a retrospective study of the factors associated with program installation and initial implementation from the school staff perspective, not of parents' and students' perceptions of the implementation.

There were only six schools, in three matched pairs, in this study. This is a very small sample. Because this is a retrospective study, there had been turnover of staff and principals in some of the selected schools, which limited the pool of staff available for interviews and focus groups. One former principal could not be located for an interview, and the present principal declined the interview because she was not at the school in 2005. At least four focus group participants were requested for each focus group, for a total of 24; there were only 12 staff available, with no participants at Especial P. Four (Alto W, Bueno W, Dia W, and Familia P) of the six schools reported no turnover of principals, while the other two (Especial P and Campo P) reported three different principals from the 2005-2006 school year through the 2008-2009 school year. Alto W, Campo P, Dia W, and Especial P estimated a turnover rate of less than three teachers, while Bueno W and Familia P estimated turnover rates of 18% and 60%, respectively.

Summary

This study was conducted to more completely understand how the factors associated with program installation and initial implementation worked at CCPS by gaining a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how developers and researchers might facilitate the application of research to practice. This was a multi-method, multi-source retrospective design, using both parametric and descriptive qualitative analysis. The focus of the analysis was to explore the early implementation stages of program installation and initial implementation and to determine whether there is a difference between schools that self-identified as implementing in the whole school vs. those self-identifying as implementing in individual classrooms or grades, including the paired schools (Alto W-Familia P, Bueno W-Especial P, Dia W-Campo P).


Chapter Four

Results

Overview

The present study was a multi-method, multi-source, retrospective explanatory study of the factors associated with program installation and initial implementation of an evidence-based violence prevention program, Second Step, in six elementary schools of a large urban school district. The goals of this study were to provide a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how program developers and researchers might facilitate the application of research to practice. The focus of this implementation study was on the exploration of the differences between schools identifying as implementing Second Step in the whole school (Level 1) vs. schools identifying as implementing in individual classrooms or grades (Level 2) in six CCPS elementary schools during the program installation and initial implementation process.

The results of the study are presented in three sections. The first section provides results for the research questions. The second section discusses the analytic procedures and analysis. The final section concludes with a summary.

Research Questions

In the following section, the summarized data and results of analyses are presented to address the research questions.


Fixsen et al.'s (2005) literature review found evidence that a multi-level approach is important for successful implementation. Evidence related to conditions that influence evidence-based programs includes practice-based practitioner selection, skill-based training, practice-based coaching, and facilitative administrative practices. This study examined training, time, implementation level, and champion.

Research Question 1.

1. To what extent, if any, are there differences in training on understanding purpose and expected outcomes, the curriculum, parent involvement, and being provided sufficient kits between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grade levels?

This research question examines the training provided to schools to see if there are any differences associated with schools implementing school-wide (Level 1) vs. schools implementing in individual classrooms and grades (Level 2) (Table 13). The ratings of Proposition A (Training) ranged from 0.14 to 1.39. Schools were clustered around mild support for the proposition on whether schools received the necessary tools and support to implement. The differences between the matched pairs of Level 1 and Level 2 schools ranged from -1.17 to 0.19. The responses of the paired schools Alto W/Familia P and Bueno W/Especial P were very similar. Dia W and Campo P had a noticeable difference of -1.17, with the Level 2 school showing greater support on the training proposition.

The ratings of A1, training related to the purpose and expected outcome, ranged from 1.0 to 2.0, all with mild to moderate support. The differences between Level 1 and Level 2 schools ranged from -1.00 to -0.11, with all Level 2 schools rating greater support for this indicator. Again, the pair Alto W/Familia P was very similar; Bueno W/Especial P and Dia W/Campo P had a wider range. The ratings of A2, training on the curriculum, ranged from -0.45 to 1.78, and the differences ranged from -0.64 to -0.45. All Level 2 schools rated higher on receiving training on the Second Step curriculum; Familia P rated the highest at 1.78. Responses for A3, training on parent involvement, were mixed, from near moderate against to near moderate in support of the indicator, with a range of -1.89 to 1.78. Familia P rated the highest for this indicator, while Dia W rated the lowest. The differences in the paired Level 1 and Level 2 schools ranged from -2.72 to 0.33. Dia W and Campo P had the largest difference, with Campo P staff identifying enough evidence to receive a rating of near moderate support for the parent involvement training. The ratings of A4 ranged from -0.78 to 1.83. Familia P was the only school whose responses did not support the statement that it received adequate curriculum kits. The differences ranged from -0.50 to 2.44, with the largest difference between Alto W/Familia P.

Table 13
Difference between paired Level 1 and Level 2 schools on Proposition A and Indicators

Columns: Level 1 school; Level 2 school; Difference.

Proposition A (Training). Schools that received training in Second Step prior to implementation of the program were provided with implementation tools and support necessary to implement the program.
1.39 (a); 1.20 (f); 0.19
0.14 (b); 0.25 (e); -0.11
0.22 (d); 1.39 (c); -1.17


Table 13 (continued; columns as above)

A1. There is evidence that school staff received training on the purpose and expected outcome of providing Second Step in the schools.
1.89 (a); 2.00 (f); -0.11
1.00 (b); 2.00 (e); -1.00
1.00 (d); 1.83 (c); -0.83

A2. There is evidence that school staff received training on the Second Step curriculum.
1.22 (a); 1.78 (f); -0.56
-0.45 (b); 0.00 (e); -0.45
0.44 (d); 1.08 (c); -0.64

A3. There is evidence that the school staff received training on how to involve families in Second Step.
0.78 (a); 1.78 (f); -1.00
-0.67 (b); -1.00 (e); 0.33
-1.89 (d); 0.83 (c); -2.72

A4. There is evidence that school staff received an adequate number of curriculum kits for appropriate implementation of Second Step.
1.67 (a); -0.78 (f); 2.44
0.67 (b); 0.00 (e); 0.67
1.33 (d); 1.83 (c); -0.50

Note. The range was +3 for strongly in support of the proposition to -3 for strongly against the proposition, with 0 denoting no evidence. A rating of 2 was moderate support for (or -2 moderate against) the proposition; a rating of 1 was mild support for (or -1 mild against) the proposition. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P.

Five paired t-tests (overall for A and the four indicators) were used to compare the ratings for the three pairs of matched schools (Table 14). A2, training on the curriculum, was the only indicator found to be significant, t(2) = -9.99, p = .01. On this indicator, Level 2 schools were rated significantly higher than Level 1 schools. The biggest mean difference was for A3, training on parent involvement (-1.13), but there was a lot of variability in the difference scores (SD = 1.53), and the difference was not statistically significant.
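The significant A2 result can be reproduced directly from the school-level ratings in Table 13. The sketch below runs the same paired t-test on those three matched pairs; the use of scipy here, rather than the statistical package used in the study, is for illustration only.

```python
# A minimal sketch of the paired t-test on the three matched pairs.
# With n = 3 pairs, the test has 2 degrees of freedom, matching the
# t(2) values reported in the text. Ratings are the A2 (curriculum
# training) values from Table 13.
from scipy import stats

level1 = [1.22, -0.45, 0.44]   # Alto W, Bueno W, Dia W
level2 = [1.78,  0.00, 1.08]   # Familia P, Especial P, Campo P

t, p = stats.ttest_rel(level1, level2)
print(f"t(2) = {t:.2f}, p = {p:.3f}")   # t(2) = -9.99, p = 0.010
```

The same procedure, applied indicator by indicator, produces the t and p columns of Tables 14, 18, and 20; with only two degrees of freedom, even sizable mean differences fail to reach significance unless the pair-to-pair variability is very small, as it was for A2 (SD = 0.10).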


Table 14
Mean Comparisons of Paired Level 1 and 2 Schools for Proposition A and Indicators

Columns: Mean Level 1; Mean Level 2; Mean Diff; SD; t; p.
Proposition A (Training): 0.58; 0.95; -0.36; 0.71; -0.88; .47
A1: 1.30; 1.95; -0.65; 0.47; -2.37; .14
A2: 0.40; 0.95; -0.55; 0.10; -9.99; .01*
A3: -0.59; 0.54; -1.13; 1.53; -1.28; .33
A4: 1.22; 0.35; 0.87; 1.49; 1.02; .42

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades. * p < .05.

A comparison was completed on the checklist questions related to training. The ratings ranged from 1.0 to 3.5, and the differences ranged from -1.0 to 3.7 (Table 15). Alto W rated the overview presentation (1.3), the initial one-day staff training (1.8), and the presentation preparation and outline (1.7) as easy. Its paired school, Familia P, rated the overview at 3.3, the one-day training at 3.5, and the presentation and outline at 2.7.


Table 15
Difference between paired Level 1 and Level 2 schools on Implementation Steps related to Research Question 1

Columns: Level 1 school; Level 2 school; Difference.

7. Second Step overview presentation
1.3 (a); 3.3 (f); -2.0
- (b); 1.0 (e); -
3.5 (d); 1.7 (c); 1.8

8. Initial one-day staff training
1.8 (a); 3.5 (f); -1.7
- (b); 1.0 (e); -
5.0 (d); 1.3 (c); 3.7

18. Second Step presentation preparation and outline
1.7 (a); 2.7 (f); -1.0
- (b); 2.5 (e); -
5.0 (d); 1.5 (c); 3.5

Note. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P. The lower the score, the easier the step was to implement. Dashes indicate the respondent did not know.

A t-test was completed to compare the difference between Level 1 and Level 2 schools on the checklist questions related to training. Paired t-tests of the individual items, with means ranging from 2.10 to 3.40, revealed no statistically significant differences (ps > .05).


Table 16
Checklist Comparisons of Level 1 vs. Level 2 Schools on Training

Columns: Mean Level 1; Mean Level 2; Mean Diff; SD; t; p.
Question 7: 2.40; 2.50; -0.10; 2.68; -.05; .97
Question 8: 3.40; 2.40; 1.00; 3.82; .37; .77
Question 18: 3.35; 2.10; 1.25; 3.18; .56; .68

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades.

Research Question 2.

2. To what extent, if any, are there differences in time allocation for learning the curriculum, shared planning time, classroom lessons, and review of lessons between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grade levels?

This research question examines the differences in time allocation between Level 1 and Level 2 schools (Table 17). The ratings of Proposition B (Time) ranged from 0.08 to 0.90. Schools were clustered around mild support for the proposition. The range of difference was 0.42 to 0.82, and there was little variance among the schools.

The ratings of B1, sufficient time to learn the program components and deliver the program, ranged from -1.11 to 1.22. Familia P had the highest rating, while Dia W had the lowest. The range of differences was -1.11 to 1.19, with more than a 1-point difference for every pair when comparing Level 1 to Level 2 schools. Ratings for B2, shared time to work together, ranged from -1.45 to -0.11. All Level 1 schools rated higher on shared time; however, the ratings were mildly against the indicator. Ratings for B3, specific blocks of time for implementation, ranged from 0.67 to 2.50; Alto W (2.44) and Campo P (2.50) had the highest ratings. The differences ranged from -1.17 to 1.22, with a difference of over 1 point for every pair. Ratings for B4, time to review successes and concerns, ranged from -0.22 to 1.42 and varied, with Bueno W indicating no evidence in either direction. The differences ranged from -1.20 to 0.34, with the largest difference between Dia W/Campo P.

Table 17
Difference between paired Level 1 and Level 2 schools on Proposition B and Indicators

Columns: Level 1 school; Level 2 school; Difference.

Proposition B (Time). When implementing the program, time was allocated for school staff to learn the program components, as well as sufficient time to deliver the program to students.
0.67 (a); 0.25 (f); 0.42
0.72 (b); 0.17 (e); 0.55
0.08 (d); 0.90 (c); -0.82

B1. There is evidence that school staff received sufficient time to review the Second Step program.
0.11 (a); 1.22 (f); -1.11
1.11 (b); -0.17 (e); 1.28
-1.11 (d); 0.08 (c); 1.19

B2. There is evidence that school staff received shared time to work together for appropriate implementation of Second Step.
-0.44 (a); -1.45 (f); -1.00
-0.11 (b); -0.83 (e); -0.72
-0.11 (d); -0.42 (c); -0.31

B3. There is evidence that specific blocks of time were allocated for school staff to implement the program.
2.44 (a); 1.45 (f); 1.01
1.89 (b); 0.67 (e); 1.22
1.33 (d); 2.50 (c); -1.17


Table 17 (continued; columns as above)

B4. There is evidence that school staff received shared time to review successes and concerns about Second Step implementation.
0.56 (a); -0.22 (f); 0.34
0.00 (b); 1.00 (e); -1.00
0.22 (d); 1.42 (c); -1.20

Note. The range was +3 for strongly in support of the proposition to -3 for strongly against the proposition, with 0 denoting no evidence. A rating of 2 was moderate support for (or -2 moderate against) the proposition; a rating of 1 was mild support for (or -1 mild against) the proposition. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P.

Five paired t-tests (overall for B and the four indicators) were used to compare the ratings for the three pairs of matched schools (Table 18). Mean differences between schools on the four indicators ranged from -0.47 to 10.91. Paired t-tests of the individual indicators revealed no statistically significant differences (ps > .05). The biggest mean difference was for B2, shared time to work together (10.91), but there was a lot of variability (SD = 27.40), and the difference was not statistically significant.

Table 18
Mean Comparisons of Paired Level 1 and 2 Schools for Proposition B and Indicators

Columns: Mean Level 1; Mean Level 2; Mean Diff; SD; t; p.
Proposition B (Time): 0.50; 0.44; 0.05; 0.76; 0.12; .92
B1: 0.37; 0.38; -0.34; 1.40; -0.42; .72


Table 18 (continued; columns as above)

B2: -3.85; -14.76; 10.91; 27.40; 0.69; .56
B3: 1.89; 1.54; 0.35; 1.31; 0.46; .69
B4: 0.26; 0.73; -0.47; 1.09; -0.75; .53

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades.

Research Question 3.

3. What strategies do principals perceive to be effective in promoting implementation in their schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grade levels?

This research question examines the strategies principals perceive to be effective in promoting implementation in Level 1 and Level 2 schools (Table 19). First, we examined the differences between school ratings on Proposition D and its indicators. Next, a t-test was completed to compare the ratings for the three pairs of matched schools. The ratings were mildly supportive of the proposition that when a school had a designated champion, staff were more likely to implement the program. The ratings ranged from 0.71 to 1.44, with differences ranging from -0.11 to 0.54. Two of the Level 1 schools rated higher than their paired schools; however, there was not much variability between the paired scores.


Table 19
Difference between paired Level 1 and Level 2 schools on Proposition D and Indicators

Columns: Level 1 school; Level 2 school; Difference.

Proposition D (Champion). When a school had a designated champion for Second Step, teachers and/or counselors were more likely to implement the program than when there was no champion present.
0.81 (a); 0.72 (f); 0.09
1.25 (b); 0.71 (e); 0.54
1.33 (d); 1.44 (c); -0.11

D1. There is evidence that there was a designated Champion.
3.00 (a); 1.22 (f); 1.78
2.67 (b); 2.83 (e); -0.16
2.89 (d); 2.25 (c); 0.64

D2. There is evidence that the designated Champion articulated the Second Step program to the entire staff.
1.44 (a); 0.89 (f); 0.55
0.89 (b); 0.00 (e); 0.89
1.11 (d); 1.33 (c); -0.22

D3. There is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program.
-0.56 (a); 1.44 (f); 2.00
1.22 (b); -0.17 (e); 1.39
1.44 (d); 0.92 (c); 0.52

D4. There is evidence that implementation of Second Step with higher levels of fidelity was associated with the presence of a clear Champion.
-0.67 (a); -0.67 (f); 0.00
0.22 (b); 0.17 (e); 0.22
0.11 (d); 0.92 (c); -0.81

Note. The range was +3 for strongly in support of the proposition to -3 for strongly against the proposition, with 0 denoting no evidence. A rating of 2 was moderate support for (or -2 moderate against) the proposition; a rating of 1 was mild support for (or -1 mild against) the proposition. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P.


Five paired t-tests (one overall test for Proposition D and one for each of the four indicators) were used to compare the ratings for the three pairs of matched schools (Table 20). Mean differences between Level 1 and Level 2 schools on the four indicators ranged from -0.25 to 0.75. Paired t-tests of the individual indicators revealed no statistically significant differences (ps > .05).

Table 20

Mean Comparisons of Paired Level 1 and 2 Schools for Proposition D and Indicators

Proposition D & Indicators   Mean Level 1   Mean Level 2   Mean Diff    SD       t      p
Proposition D (Champion)         1.13           0.96          0.17      0.33    0.90   .46
D1                               2.85           2.10          0.75      0.97    1.34   .31
D2                               1.14           0.74          0.41      0.57    1.24   .34
D3                               0.70           0.73         -0.03      1.76   -0.03   .98
D4                              -0.11           0.14         -0.25      0.48   -0.91   .46

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades.

The Alto W principal reinforced the use of Second Step; he "expected to see Second Step in lesson plans." At Familia P, grade-level chairs were responsible for teaching the other teachers in their grade and for working with them to develop curriculum maps. Familia P was the only school that followed the Second Step protocol of having teachers teach the program rather than making the program the responsibility of the counselor, and it was the only school in which the principal participated in the training. Although implementation was voluntary at Familia P, the need to be consistent across the school was emphasized by the principal and the counselor.

Research Question 4

4. What are the barriers and facilitators of implementation identified by teachers and counselors in schools that identify as implementing Second Step school-wide vs. schools that identify as implementing in individual classes or grade levels?

This research question examined what teachers and counselors experienced as barriers to and facilitators of the factors associated with program installation and initial implementation. Participants were asked to rate the steps to implementation identified by the program developer, where 1 denoted a step that was easy to implement and 5 denoted the most difficult. Differences between paired schools ranged from -1.0 to 3.8. The most notable difference in paired school ratings was on lesson plan social skills training: Dia W rated it a 5, while its paired school rated the activity 1.2, a difference of 3.8. Familiarizing parents and caregivers with the program was rated a 5 by both schools in the Bueno W/Especial P pair; Dia W also rated it a 5, but its paired school rated it 3.4. Especial P rated extending learning opportunities by applying skill steps in all settings a 4.5, while its paired school, Bueno W, rated the step 3.7. Understanding the use of Second Step to address identified needs and lesson plan social skills training were identified as easy (1) by the Bueno W/Especial P pair. Awareness of the need for a social skills and violence prevention program was rated a 1 by Bueno W; its partner school rated that step a 2. Overall, schools rated the checklist steps in the range of 1.0 to 2.6 a total of thirty-eight times and in the range of 4 to 5 a total of six times.
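The Difference column in Tables 21 and 25 is simply each Level 1 school's rating minus its matched Level 2 school's rating. As a small illustration, the Python sketch below recomputes the step 21 (lesson plan social skills training) differences from the ratings in Table 21; a None value would stand in for the dashes recorded when a respondent did not know.

    # Paired checklist differences (Level 1 rating minus Level 2 rating)
    # for step 21, lesson plan social skills training; 1 = easiest to
    # implement, 5 = most difficult.
    pairs = [("Alto W", "Familia P"), ("Bueno W", "Especial P"), ("Dia W", "Campo P")]
    ratings = {
        "Alto W": 1.3, "Familia P": 2.0,
        "Bueno W": 1.0, "Especial P": 1.0,
        "Dia W": 5.0, "Campo P": 1.2,
    }

    for level1, level2 in pairs:
        r1, r2 = ratings.get(level1), ratings.get(level2)
        if r1 is None or r2 is None:
            print(f"{level1} / {level2}: respondent did not know")
        else:
            print(f"{level1} / {level2}: difference = {r1 - r2:+.1f}")

Run as-is, this prints -0.7, +0.0, and +3.8, matching the step 21 rows of Table 21 (continued).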


Table 21

Checklist Comparisons of Level 1 vs. Level 2 Schools on Barriers and Facilitators

Checklist Implementation Step                          Level 1 School   Level 2 School   Difference
2. Reinforcing strategies and concepts in daily
activities and using consistent messages
throughout the school                                      1.6 a            2.2 f            0.6
                                                           2.3 b            2.5 e            0.2
                                                           1.7 d            1.9 c           -0.2
3. Extending learning opportunities by applying
skill steps in all settings                                2.8 a            2.0 f            0.8
                                                           3.7 b            4.5 e           -0.8
                                                           2.5 d            1.8 c            0.7
4. Modeling Second Step skills and behaviors in
all interactions                                           1.8 a            1.8 f            0
                                                           4.0 b            2.5 e            1.5
                                                           1.5 d            1.6 c           -0.1
5. Integrating learning goals throughout the
regular curriculum                                         2.6 a            2.4 f            0.2
                                                           4.7 b            4.5 e            0.2
                                                           2.5 d            2.6 c           -0.1
6. Familiarizing parents and caregivers with the
program                                                    3.1 a            3.2 f           -0.1
                                                           5.0 b            5.0 e            0.0
                                                           5.0 d            3.4 c            1.6
10. Involvement of non-classroom staff                     3.3 a            3.7 f           -0.4
                                                           --  b            5.0 e            --
                                                           1.5 d            2.0 c           -0.5
12. Awareness of need for social skills and
violence prevention program                                1.5 a            2.5 f           -1.0
                                                           1.0 b            2.0 e           -1.0
                                                           2.0 d            1.2 c            0.8


Table 21 (continued)

Checklist Implementation Step                          Level 1 School   Level 2 School   Difference
13. Understanding of use of Second Step to address
identified needs                                           2.0 a            2.5 f           -0.5
                                                           1.0 b            1.0 e            0.0
                                                           2.5 d            1.6 c            0.9
21. Lesson plan social skills training                     1.3 a            2.0 f           -0.7
                                                           1.0 b            1.0 e            0.0
                                                           5.0 d            1.2 c            3.8

Note. The Likert scale runs from 1 (easiest to implement) to 5 (most difficult to implement); the lower the score, the easier the step is to implement. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P. Dashes indicate the respondent did not know.

Paired t-tests were completed to compare the differences between Level 1 and Level 2 schools on the checklist questions (Table 22). Paired t-tests of the individual items, with means ranging from 1.87 to 4.37, revealed no statistically significant differences (ps > .05).

Table 22

Checklist Comparisons of Level 1 vs. Level 2 Schools

Checklist Question   Mean Level 1   Mean Level 2   Mean Diff    SD       t      p
2                        1.87           2.20         -0.33      0.23   -2.50   .13
3                        3.00           2.77          0.23      0.90    0.45   .70
4                        2.43           2.00          0.47      0.90    0.90   .46
5                        3.30           3.17          0.10      0.17    1.00   .42
6                        4.37           3.87          0.50      0.95    0.91   .46
10                       2.40           2.85         -0.45      0.07   -9.00   .07
12                       1.50           1.90         -0.40      1.04   -0.67   .57
13                       1.83           1.70          0.13      0.71    0.33   .78
21                       2.43           1.40          1.03      2.42    0.74   .54

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades.

Bueno W's key informant and focus group participants believed the emphasis was on 1st grade, with other grades added as the program expanded; the higher grades received the program only if requested. Staff at both Bueno W and Especial P believed more training would have benefited the program. One staff member from Bueno W explained why more training was important: "If you have a better understanding of why you are doing what you are doing and the impact it can have on children, that's going to make you buy in more. You are more likely to continue using it because you know what the outcomes are going to be." The Especial P counselor indicated, "I did not do everything they recommended because I didn't have time to do everything from every part of the lesson. It was the first thing to go if there was anything else happening."

Research Question 5

5. To what extent, if any, are there differences in staff commitment to implementation, peer-to-peer support, adherence to the program model, and staff perception of positive student outcomes between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by individual classes or grade levels?

This research question examines the differences between Level 1 and Level 2 schools in the areas of staff commitment, peer-to-peer support, adherence to the program model, and the belief that positive outcomes were related to Second Step (Table 23). Ratings on Proposition C (Implementation Level) ranged from -0.34 to 1.11; schools clustered around mild support for the proposition. Differences between the matched Level 1 and Level 2 schools ranged from -0.51 to 0.83.

Ratings on C1, school-wide delivery of Second Step, ranged from -1.11 to 1.56, and all Level 2 schools rated higher than their matched schools. Differences between Level 1 and Level 2 schools ranged from -0.78 to 1.25; the largest difference was between Dia W and Campo P at 1.25, with Campo P rating higher.

Ratings on C2, peer-to-peer support, ranged from -2.00 to 1.22. In the Alto W/Familia P and Dia W/Campo P pairs, the Level 2 schools rated higher than their matched schools, in contrast to the expected outcome. Differences ranged from -1.39 to 1.33; the largest was between Bueno W and Especial P, with Bueno W rating higher.

Ratings on C3, use of implementation tools, ranged from -1.50 to 0.89. Differences between the paired Level 1 and Level 2 schools ranged from 0.64 to 2.17; Bueno W/Especial P had the largest difference at 2.17.

Ratings on C4, staff attributing positive outcomes to Second Step, ranged from 0.67 to 2.67, with Campo P rating highest at 2.67. Differences ranged from -0.78 to -0.17; all Level 2 schools scored higher than their paired schools.


Table 23

Difference Between Paired Level 1 and Level 2 Schools on Proposition C and Indicators

Proposition C and C Indicators                         Level 1 School   Level 2 School   Difference
Proposition C (Implementation Level). If Second
Step was implemented school-wide, there was more
staff commitment to implementation, more
peer-to-peer support, and more adherence to the
program model, and staff were more likely to
attribute positive student outcomes to Second Step
than when Second Step was implemented only in
individual classrooms or grades.                          -0.28 a           1.11 f          0.83
                                                           0.31 b          -0.34 e          0.03
                                                           0.06 d           0.57 c         -0.51
C1. There is evidence that Second Step was
delivered school-wide.                                     0.78 a           1.56 f         -0.78
                                                          -1.11 b          -0.30 e          0.78
                                                          -1.0  d           0.25 c          1.25
C2. There is evidence that when Second Step was
implemented school-wide there were more specific
time blocks allocated for peer-to-peer support.           -2.0  a           1.22 f          1.33
                                                          -0.11 b          -1.50 e         -1.39
                                                          -0.89 d          -0.58 c          0.31
C3. There is evidence that staff who used the
Second Step implementation tools were more likely
to be at a school that implemented Second Step
school-wide.                                              -0.56 a           0.89 f          1.45
                                                           0.67 b          -1.50 e          2.17
                                                           0.22 d          -0.42 c          0.64
C4. There is evidence that staff attributed
positive outcomes to Second Step.                          0.67 a           0.78 f         -0.17
                                                           1.78 b           2.00 e         -0.22
                                                           1.89 d           2.67 c         -0.78

Note. Ratings range from +3 (strongly in support of the proposition) to -3 (strongly against the proposition), with 0 denoting no evidence; 2 is moderately in support (-2 moderately against) and 1 is mildly in support (-1 mildly against). Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P.

Five paired t-tests (one overall test for Proposition C and one for each of the four indicators) were used to compare the ratings for the three pairs of matched schools (Table 24). Mean differences between Level 1 and Level 2 schools on the four indicators ranged from -0.94 to 0.45. C1, implementation of Second Step school-wide, was the only indicator with a statistically significant difference, t(2) = -5.98, p = .03; on this indicator, Level 2 schools rated significantly higher than Level 1 schools. All other paired t-tests of the individual indicators revealed no statistically significant differences (ps > .05).

Table 24

Mean Comparisons of Paired Level 1 and 2 Schools for Proposition C and Indicators

Proposition C & Indicators   Mean Level 1   Mean Level 2   Mean Diff    SD       t      p
Proposition C (Imp. Level)       0.03           0.45          0.75      0.97    1.34   .31
C1                              -0.44           0.49         -0.94      0.27   -5.98   .03*
C2                              -1.00          -0.29         -0.71      2.33   -0.53   .65
C3                               0.11          -0.34          0.45      1.81    0.43   .71
C4                               1.45           1.82         -0.37      0.36   -1.78   .22

Note. Level 1 schools identify as implementing school-wide; Level 2 schools identify as implementing in individual classrooms or grades.
*p < .05.
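The reported C1 result can be sanity-checked directly from the statistic: with three matched pairs the test has 2 degrees of freedom, and the two-tailed p-value follows from the t distribution. A minimal Python check (scipy assumed):

    # Two-tailed p-value implied by the reported C1 statistic, t(2) = -5.98.
    from scipy import stats

    t_stat, df = -5.98, 2
    p_two_tailed = 2 * stats.t.sf(abs(t_stat), df)
    print(f"p = {p_two_tailed:.3f}")  # about 0.027, reported as .03 in Table 24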


A comparison was completed on the checklist question related to securing buy-in from the entire staff (Table 25). Ratings ranged from 1 to 4.5. Alto W, Bueno W, and Campo P rated the step as easier to implement than their paired schools did. Familia P's rating was the highest at 4.5.

Table 25

Checklist Comparisons of Level 1 vs. Level 2 Schools on Differences

Checklist Implementation Step                          Level 1 School   Level 2 School   Difference
11. Securing buy-in from the entire staff                  3.2 a            4.5 f           -1.3
                                                           1.0 b            3.0 e           -2.0
                                                           3.4 d            1.5 c            1.9

Note. The Likert scale runs from 1 (easiest) to 5 (most difficult to implement); the lower the score, the easier the step is to implement. Level 1 schools identify as implementing school-wide: a = Alto W, b = Bueno W, d = Dia W. Level 2 schools identify as implementing in individual classrooms or grades: f = Familia P, e = Especial P, c = Campo P.

Document Review

Very few documents were available to support the responses in the interviews and focus groups. Alto W provided a very detailed mini-grant proposal that discussed the school's strategic plan and the benefits of implementing Second Step, including a training time, blocks of time set aside for the curriculum, use of visuals throughout the school promoting the skills in Second Step, and how the tools of the program could be used to develop a comprehensive, data-driven prevention program. Bueno W provided the school's Guidance Curriculum Plan/Do/Study/Act document; this document addresses first grade only, although the school had identified as implementing in the whole school. The counselor at Campo P provided her schedule indicating Second Step as her curriculum for 1st grade. No other school provided documents to support the data.

Summary

The goals of this study were to provide a better understanding of (a) the factors that support implementation of evidence-based programs in K-12 public schools, (b) the factors that constrain implementation, and (c) how program developers and researchers might facilitate the application of research to practice. The focus of this implementation study was the difference between schools identifying as implementing Second Step school-wide and schools identifying as implementing partially, in six CCPS elementary schools, during the program installation and initial implementation process.

This study used a multi-method, multi-source retrospective explanatory design (Yin, 1989, 1994). It examined factors associated with program installation and initial implementation that, when present, research shows are associated with implementation fidelity. The study tested specific theoretical propositions and also developed case descriptions as outlined by Yin (1989, 1994, 2003).

This case study used a variety of data collection methods, including interviews, focus groups, an implementation checklist, and a document review. Semi-structured interviews of principals, counselors, and teachers solicited descriptive information about factors associated with program installation and initial implementation and about perceived supportive and constraining factors. Focus groups were organized around topics that emerged from the individual interviews.

Seven tasks were completed to address the research questions: (a) review any documents that would support the information provided in the interviews and focus groups, (b) analyze the checklist by examining differences between paired schools, (c) analyze the checklist by completing a t-test, (d) analyze the interview questions by examining differences between paired schools, (e) analyze the interview questions with a t-test comparing paired schools, (f) analyze the comparisons that emerged from the results of the ratings, and (g) analyze any themes that resulted from the focus groups.

Overall, the results of the ratings examination indicated that schools implementing school-wide (Level 1) and schools implementing in individual classes or grades (Level 2) were not consistently implementing the factors associated with program installation and initial implementation. There was little difference in the responses between matched Level 1 and Level 2 schools. The t-test results on the propositions and their indicators were statistically significant only for A2, training on the curriculum, and C1, school-wide implementation of Second Step. No other t-test on the propositions, and no t-test on the checklist responses, was statistically significant. This multi-method, multi-source study yielded little or no support for the research questions and the propositions proposed in the areas of training, time, implementation level, or champion. These findings are discussed at greater length in the next chapter.


Chapter Five

Discussion

Purpose

This study responded to the new emphasis on providing evidence-based prevention and intervention programs in the schools and to the challenge of moving research to practice in a way that maintains fidelity to program design yet remains adaptable to a school's climate. Studying the factors associated with program installation and initial implementation while schools' normal daily activities are occurring provides a different perspective than studies in which researchers bring with them supports and financial incentives that most schools do not have. This chapter reviews the rationale, purpose, and methodology of the present study and discusses the results and limitations. The contributions to research and practice and areas for further research are also addressed.

Overview of the Study

As schools emerge as the de facto health and mental health system, they have also become the most common setting for prevention program implementation, and there is a growing body of evidence-based prevention and intervention programs available to them. Lagging behind the interest in evidence-based practices and programs, but gaining momentum in recent years, is an understanding of the implementation process and its potential impact on successful replication (Fixsen et al., 2005). The focus of this study was the exploration of factors associated with conditions during the program installation and early implementation stages of the implementation process. The answers to the questions addressed in this study have implications for developers, researchers, and the schools that implement the programs.

Review of Method

This retrospective explanatory study used a multi-method, multi-source design that investigated the second and third stages of the implementation framework developed by Fixsen et al. (2005): program installation and initial implementation. The study examined the supportive and limiting conditions schools face as they adopt evidence-based programs and the initial implementation strategies that are used.

The district matched schools based on a statistical cluster analysis that identified groups of similar schools. Peer groups were formed based on the percentages of students in the free/reduced-price lunch program, English language learners, under-performing minorities, and students enrolled at the same school on days 40 and 180 of an academic year. Six schools were paired based on this analysis.

Parametric and qualitative analytic techniques were used to gain a more complete understanding of the connections between evidence-based programming and practice and of how public schools implement such programs. The Yin case study method was used to examine the difference between schools implementing Second Step school-wide (Level 1) and schools implementing in individual classrooms or grades (Level 2) in a large urban school district (Yin, 1989, 1994). Propositions and their indicators were examined based on the differences in ratings of the paired schools in the proposition areas identified: training and resources, time, implementation level, and champion. Additionally, t-tests were completed on the propositions and checklist responses.
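The peer-grouping step can be made concrete with a short sketch. The specific clustering algorithm used by the district is not reported here, so the Python fragment below (scikit-learn assumed) uses k-means as one common choice, applied to hypothetical percentages on the four variables named above; the school labels and values are illustrative, not the district's actual data.

    # Illustrative statistical peer grouping: cluster schools on
    # standardized demographic percentages, then draw matched pairs from
    # within each cluster. All feature values are hypothetical.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    schools = ["School 1", "School 2", "School 3", "School 4", "School 5", "School 6"]
    # Columns: % free/reduced-price lunch, % English language learners,
    # % under-performing minorities, % enrolled on both day 40 and day 180.
    features = np.array([
        [62.0, 18.0, 25.0, 91.0],
        [60.0, 20.0, 27.0, 90.0],
        [35.0,  5.0, 12.0, 95.0],
        [33.0,  7.0, 10.0, 96.0],
        [80.0, 30.0, 40.0, 85.0],
        [78.0, 28.0, 42.0, 86.0],
    ])

    # Standardize so no single percentage dominates the distance metric.
    X = StandardScaler().fit_transform(features)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    for school, label in zip(schools, labels):
        print(f"{school}: peer group {label}")

Schools sharing a peer-group label become candidates for pairing, which is how matched Level 1/Level 2 pairs can be formed while controlling for the demographic variables.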


Discussion of Findings

The present investigation contributed to the empirical and theoretical literature on (a) the factors that support implementation of evidence-based programs; (b) the factors that constrain implementation during the stages of program installation and initial implementation; and (c) how program developers and researchers might facilitate the application of research to practice.

Contrary to the proposed propositions, there was no evidence that differentiated the schools that identified as implementing Second Step school-wide from those that did not. The differences in school ratings in the areas of training and resources, time, implementation level, and champion varied from school to school, with no identified link indicating that schools implementing Second Step school-wide were more likely to support the propositions. In the paired schools, the difference between Level 1 and Level 2 schools varied by proposition, indicator, and pair. Other themes emerged that are worth noting. This section discusses what the results indicated in relation to the research questions.

Review of Question 1

1. To what extent, if any, are there differences in training on understanding purpose and expected outcomes, the curriculum, parent involvement, and being provided sufficient kits between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing in individual classes or grade levels?


The early work of training staff is an opportunity to define and expand treatment and implementation practices in ways that may contribute to more positive implementation outcomes (Fixsen et al., 2005). The expectations of the Second Step developer were that teachers would implement school-wide, all staff would be trained on Second Step, time would be set aside to do the work, staff would follow the implementation plan designed by the program developer, and administrators would be the champions. That was not what happened in the schools in this study.

For the training to happen, counselors presented information on Second Step to the principals, and the principals, in collaboration with the counselors, decided who would be trained. No school provided a full staff training covering the overview, curriculum, adaptation, and parent involvement. All schools trained at least some staff with an overview of Second Step; in all the schools, the counselor provided a brief overview during individual conferences or staff or grade-level meetings. Few individuals participated in the more extensive 2-day training. The only school that mandated training was Alto W. Although the Alto W principal did not see "a dire need for an all out school-wide training," the principal allowed the counselor to invite district staff to do an overview training of Second Step and mandated staff to attend. Alto W's paired school, Familia P, did not mandate training but did provide a plan for training by developing curriculum maps. The Familia P principal attended the overview training with the counselors, and Familia P grade-level lead teachers and the counselors attended a second training designed to prepare them to teach their peers.

The program was designed for teachers to implement in the classroom. That is not what was reported in the schools in this study: counselors provided the program in all cases, the exceptions being Familia P, one teacher at Alto W, and two teachers at Especial P. In some cases the teachers did not stay in the classroom while the counselor provided the program; the time was considered collaboration time for the teachers with their grade-level colleagues. Principals saw this as a win-win situation because they needed time periods for grade levels to work together: counseling, library, physical education, music, and art provided lessons while the teachers met. At Alto W, some teachers stayed in the classroom to support the counselor and others did not. Familia P was the only school where the teachers taught the program, with the counselor going in for support as needed. Bueno W teachers stayed with their classes; one teacher said she felt it was important because the counselor was not used to working with an entire class and, if she stayed, she would be able to support the lessons better. Especial P teachers did not stay. The teachers at Dia W did not stay with the class, but the teachers at its paired school, Campo P, did. The teachers who stayed with the class talked about the importance of supporting the counselor and of having a more system-wide approach to behaviors.

When the paired schools were examined, Alto W and Familia P were the pair that rated highest on training, except on receiving curriculum kits. Familia P was the only school that did not receive an adequate number of kits. This may be the result of the teachers providing the program, and thus more individuals providing lessons at the same time. In contrast, when the counselor provided the lessons, all the schools indicated they implemented on a rotating schedule based on grading periods, so one counselor needed only one curriculum kit.

Campo P's counselor provided an overview of the training, but no other staff were fully trained in the program. She shared, "I did a lot with the program on my own. If I could do it over, I would have grade level staff attend training to learn curriculum."


In the paired school, only Dia W's counselor was trained. The same was true at Bueno W: only the counselor was trained, and the counselors talked about the program in staff meetings. Especial P's two teachers were trained at an overview training, but only one followed through on implementing the program in the class.

Review of Question 2

2. To what extent, if any, are there differences in time allocation for learning the curriculum, shared planning time, classroom lessons, and review of lessons between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by grade level or classroom?

All the schools shared the sentiment in one principal's comment about the challenges: "The time is always a crunch. Every year we get additional responsibilities to try to squeeze in. There is never enough time to do everything we'd like to do." The consequence of multiple priorities was that the time set aside to practice the lessons and plan together was not specific to Second Step; the other priorities took precedence over Second Step. Whether schools were implementing school-wide or in individual classrooms or grades, if the counselor was providing the program, the counselor would set a schedule for when Second Step would be offered in the classrooms. Like the time set aside to plan together, the counselor would be pulled away for other priorities. As Greenberg et al. (2003) expressed, schools are now also the lead on prevention programming for children and youth. It is a balancing act for schools to continue to add programs to their already full day of academics and other priorities. Despite the challenges, all participants in this study rated the importance of implementing an evidence-based program in their school as high. The importance of skill development, rather than crisis response, was emphasized, and there were concerns about the challenges students face today. There was also a struggle between what staff believed was the right thing to do and the priorities they faced every day. A consistent theme across the schools was that there was not enough time and there were competing priorities.

Review of Question 3

3. What strategies do principals perceive to be effective in promoting implementation in schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by grade level or classroom?

All the principals believed a champion for the program was a key element in its success, but they did not believe the champion had to be them; the principals saw their role as supporting the counselor. Except at Familia P, the principals perceived a set rotating schedule as an effective way to implement prevention programs. Alto W's principal believed it was important to align the Second Step curriculum with existing work. In support of prevention planning he said, "A recent trend in education is academic improvement. It is given that regardless of what else goes on at your school, academic improvement will happen, but it doesn't if the school isn't a safe place and people can't learn." At Alto W's paired school, Familia P, the principal said, "We need prevention/intervention programs instead of winging it. We spend more time putting out fires. I think we have to not assume they know these things and we should do everything we can to make sure we provide them with information so that as they are making choices and decisions, they have some information."


Especial P's principal declined to be interviewed. Bueno W's principal said, "We introduced the program by grade level. We studied how behaviors were rising in a certain grade level. We chose to begin with 1st grade because they had the highest behavior numbers in the whole school." The principal went on to explain that the model was designed to be a school-wide teacher implementation, but the school did not use it that way: it would have been too much for teachers to have to train and to make the time in the day to teach the program. She and the counselor felt the teachers had enough to do and did not burden them; they made the decision without input from the teachers.

The principal at Dia W believed it was the counselor's role to integrate the program into the school by working with individual teachers on implementing the program in the classroom. Although the principal did not work directly with the program, she encouraged staff "that no matter what's going on to consider those aspects of the population that we have. We don't have a large at risk population, but we try to be aware of the needs of our students and keep those in mind." Like the Dia W principal, the Campo P principal supported the program based on the counselor's recommendations, and there were specific blocks of time set for implementation.

Review of Question 4

4. What are the barriers and facilitators of implementation identified by teachers and counselors in schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by classes or grade level?


Many of the teachers did not have the level of buy-in to the Second Step program that the counselors did, because they were not part of assessing the potential match between the school and the program. Staff buy-in is an important condition of the exploration stage identified by Fixsen et al. (2005).

All the individuals interviewed at Bueno W indicated it was great that the counselor had taken on the responsibility of the Second Step curriculum. At least one staff member, however, saw a downside when it came to time allocation. She stated, "It was originally designed that she (the counselor) would come in, but emergencies would happen. She only came in three or four times. Her emergencies took precedence. If the teachers were doing it, then it would make it easier to be consistent. If it was able to be consistent, it would work. Because it was just the one person doing it, it made it difficult. We did have the time put into the schedule so it would have worked."

Dia W's counselor did not consistently follow the Second Step curriculum, preferring to use her judgment to focus on particular areas of need. In her training, the trainers emphasized that teachers should teach the program and stressed the importance of fidelity to the program, but she felt this was not realistic. The counselor believed, "Teachers' plates are full and every year they are fuller. Our teachers do not have the time to include a social-emotional curriculum on top of all the other mandated curriculum." Similarly, the Campo P counselor's main concern was that "there is only so much time and there are always so many academic concerns for teachers to address."

Everyone interviewed discussed the importance of implementation with fidelity; however, there was an overwhelming response that it was not feasible in schools because of the competing priorities. One counselor said that if she went strictly by the cards, the students would be bored. Another said that the hammering of empathy wasn't as useful as the problem-solving and how-to-calm-down lessons, so she focused on those. Enough time in the classroom to complete a lesson was not usually available, even with set schedules for going to the classes, although other counselors found they could do two lessons during a classroom visit. One counselor suggested that the developer "might want to consider a modified version that has a more realistic timeline." Another suggested that the developer should "explore adaptations of how it can be implemented. With the ratio of kids we have, I'm not sure that one curriculum can be fully implemented in the way that it was designed. We need opportunities to look at adaptations and flexibility within the curriculum."

Review of Question 5

5. To what extent, if any, are there differences in staff commitment to implementation, peer-to-peer support, adherence to the program model, and staff perception of positive student outcomes between schools that identify as implementing Second Step school-wide vs. schools that identify as implementing by classes or grade level?

The Level 2 schools rated higher than the Level 1 schools on school-wide implementation, although the Level 2 schools self-identified as implementing in individual classrooms and grades. Level 1 schools self-identified as implementing school-wide, but there was no clear evidence that they were doing so. Bueno W rated higher than Especial P on peer-to-peer support and use of implementation tools; in the two other pairs, the Level 2 schools rated higher. All the Level 2 schools rated higher than the Level 1 schools on attributing positive outcomes to Second Step.


Study Limitations

The limitations of this study must be considered in the interpretation of the findings above. The limitations noted in Chapter 3 are expanded here to include those identified during the study. (1) Staff turnover reduced the number of focus group participants: at least 24 participants were expected, but only 12 participated. (2) There were only six schools, forming three matched pairs, in this study. (3) One former principal could not be located for interviewing, and the current principal declined the interview, citing lack of knowledge of the program. (4) Counselors implemented Second Step in the schools, contrary to the program developer's expectation that the targeted users were teachers; the exceptions were Familia P, one teacher at Alto W, and one at Especial P. (5) Schools self-identified as implementing school-wide or in individual classes or grade levels. No definition was provided for "school-wide" or "individual classes or grade levels," and there was no clear evidence to corroborate that a school was implementing school-wide. (6) No participants volunteered for the focus group at Especial P. (7) Schools were feeling the pressure of a new superintendent, new priorities, and new initiatives. Like their concern about time when discussing their experience with Second Step, they were also stressed for time in participating in the study.


Contribution to Research and Practice

Considering the aforementioned limitations, there are lessons to be learned. This study, while explanatory in nature, contributes to the growing literature on implementation of evidence-based programs in school settings. Similar to other studies, this study found that schools are not implementing evidence-based programming with fidelity (Fixsen et al., 2005). Fixsen et al. (2005) found that implementation may be challenged by the inability to replicate the supports provided in the original research, a lack of understanding of the importance of fidelity to the program, or loss of district support for the program. In this study, the participants discussed the importance of fidelity; however, there was tremendous concern about the lack of time to do all that is expected of schools today. Providing awareness of and support for particular social skills development was seen as important, but full implementation was described as unrealistic.

In the literature review in Chapter 2, it was noted that Weiss et al. (2008) addressed concerns that some programs identified as evidence-based by the federal government lack credibility. The schools in this study did not consider the quality of the research behind the program. Understanding that research could help a school decide whether a program would work for it, and researchers should be open to providing the research reviewed by the federal government. In this way, schools have an opportunity to review the challenges and limitations the researchers encountered and may better understand the program's requirements and limitations. With this knowledge, schools may be better able to differentiate between a program that is not a good match and a challenge already identified in the program's research.


A Public Health Approach: A Potential Framework

A public health approach to the integration of prevention is promoted in the literature as a framework for making programming decisions strategically. This approach asks (a) What is the problem? (b) What are the causes? (c) What works for whom? and (d) Is it meeting the intended needs? These questions can drive the framework for implementation and be linked to a strategic plan (Elias, Zins, et al., 1997; Greenberg et al., 2003; Kutash et al., 2006). Based on their literature review, Fixsen et al. (2005) recommended implementing the specific conditions found to be most successful across domains, and Greenberg et al. (2005) recommended conditions important for researchers and developers to provide in supporting schools after a program is adopted. Schools in this study identified the problem, examined the risk and protective factors of their population, and chose an evidence-based program that had protocols to evaluate the interventions and to monitor implementation and scaling up. The approach missed an important step, however: the schools either did not consider the feasibility of the program or dismissed the commitment they had to make to it before implementing.

This study has some similarities to the findings of two Norwegian studies of Second Step (Larsen & Samdal, 2007, 2008). In Norway, Second Step was provided by the Norwegian Health Association; in this study, the school district provided the program. Neither made any specific requirements about implementation of the program, leaving those decisions to the school administrators and their staff. The following areas of the present study parallel the Norwegian findings:

(a) In the Norway study, teachers adapted the program for their own needs as those related to the needs of the students, the features of the program, and the teachers' individual beliefs and experiences. In the present study, counselors adapted the program based on the same rationale.

(b) As the Norway study found for teachers, this study found that counselors formed two distinct groups of users: comprehensive users, who provided lessons weekly or on some other set schedule, and selective users, who used only those parts of the program that related to particular situations and problems. The difference was that in the Norway study full implementation of the components was linked with school-wide implementation, while in this study full implementation occurred both in schools that implemented school-wide and in schools that implemented only in individual classes or grades.

(c) Teachers in the Norway study and counselors in the present study understood the benefit of investing time and resources in social skills training.

(d) Teachers in the Norway study and counselors in the present study were challenged to balance the core curriculum with implementing Second Step. Both groups indicated that at times Second Step would not be implemented because of competing priorities.

(e) One rationale for adaptation of the program in Norway was the teachers' skepticism of the program's cultural values and content. In the present study, counselors adapted the program to meet the needs of minority students and students with special needs.

(f) The program installation results were similar across the studies. Program installation differed among schools, and the time spent on preparation, training, and resources varied.


Larsen and Samdal (2007, 2008) identified that a strong focus on leadership, combined with a strategic plan and school-wide implementation, appeared to be linked to program fidelity. They recommended a broader approach to evaluating the program, one that includes implementation and intervention processes as well as outcomes.

Recommendations for Future Program Implementation and Research

To be more effective, schools may need to gather more information before adopting a program to meet the needs of students. They may need to examine the research behind the evidence-based program to see how it relates to their population, the methods and results that led to the evidence-based designation, and, in this ever-changing world, the lag time between the original study and the present use of the program. Schools should also examine the expectations the program developer considers necessary for implementation fidelity and how those match the present priorities of the school (Andrews & Buettner, 2005; Backer, 2003). Furthermore, if the program is introduced to the schools with the support of a grant, the schools must look at their ability to sustain the program once the funding is gone. One approach is to use the decision trees that Daleiden and Chorpita (2005) developed to inform decisions regarding the appropriate program to meet the needs of the schools; their work provides for the much-needed participation of parents and community in establishing the feasibility of the program within the cultural context of the community. Use of a checklist similar to the one that Andrews and Buettner (2002) developed to address the feasibility issue may prevent future frustrations, in which schools adopt a program and find after implementation that it did not meet their needs (Table 26).
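The checklist in Table 26, below, encodes a staged decision rule: a program can be rated Affordable only if it has first been rated Available, and Feasible only if it also clears every affordability item. The Python sketch below is a hypothetical encoding of that logic, written for illustration only; it abbreviates the item wording and is not part of Andrews and Buettner's published tool.

    # Hypothetical encoding of the Available -> Affordable -> Feasible
    # decision logic of the feasibility checklist (Table 26). Each
    # argument is a list of booleans, one per checklist item in that
    # section of the table.
    def rate_program(available_items, affordable_items, feasible_items):
        available = all(available_items)
        affordable = available and all(affordable_items)
        feasible = affordable and all(feasible_items)
        if feasible:
            return "Feasible"
        if affordable:
            return "Affordable"
        if available:
            return "Available"
        return "Not yet available"

    # Example: all four availability items and all six affordability items
    # are checked, but the program conflicts with current organizational
    # priorities, so it stops at Affordable.
    print(rate_program([True] * 4, [True] * 6, [True, True, True, False, True]))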


Table 26

Feasibility Checklist

Detailed descriptions of implementation procedures are available and understandable.
Training is available when described as a necessary component of the program.
Curriculum materials are available when necessary for implementation.
Any other support materials described as necessary for implementation are available.

If you are able to check off each of these items, the program should be described as Available.

The total costs of program materials are affordable, given our organization's budget.
The total costs of training are affordable, given our organization's budget.
The training time commitment of new or existing personnel is affordable, given our organization's budget.
The implementation time commitment of new or existing personnel is affordable, given our organization's budget.
The time commitment of participants is feasible, given our capacity.
The time commitment of administering the program is feasible.

If you are able to check off ALL of these items, and the program was rated as Available, the program should be described as Affordable.


Table 26 (continued)

The underlying principles of the program being evaluated are consistent with our organization's approach to meeting the needs of high-risk youth.
The approach used in the program being evaluated is consistent with existing policies and procedures currently in place within the organization.
The implementation of this program will not create insurmountable internal political challenges.
The implementation of this program is consistent with the current priorities of the organization.
This program is sustainable, given our organization's structure and funding mechanisms.

If you are able to check off each of these items, and the program was rated both Available and Affordable, the program should be described as Feasible. (Andrews & Buettner, 2002)

Additionally, districts might consider a systematic and systemic approach to providing prevention programming in schools. This might include (1) developing memoranda of understanding or checklists that clarify the roles and responsibilities in agreeing to implement an evidence-based program, (2) rolling the program out to schools in waves, starting with the most receptive schools, and (3) providing ongoing technical support to the schools.

Evidence-based approaches often require commitment to the program developer's implementation model, and if a program is not feasible, it is not likely to be implemented with fidelity. Recommendations to researchers and program developers include:

(a) Understand and be sensitive to the complexity of schools. The demands and priorities placed on schools are at an all-time high. Schools are in the business of education; along with that, they are expected to provide a plethora of supports, on limited budgets, to address the barriers students face.


(b) Aim for realistic dosages and time periods for implementing programs, based on school settings in today's world. A program designed with 11 weeks of activities does not fit into a 6- or 9-week school term, and a 45-minute program does not fit into a 50-minute class period once transition time is considered.

(c) Focus new program development on life and social skills that are transferable to multiple risk factors. Violence prevention and intervention, substance abuse prevention and intervention, bullying prevention and intervention, mediation skills, career guidance, asthma prevention and intervention, diabetes prevention and intervention, and parent support and education are just a few of the many competing demands on schools beyond core academics.

(d) Ensure that financial supports provided by the initial research are sustainable and easily transferable within school district budgets.

(e) Provide access to the program's theory and original research.

(f) Use participatory research as an adjunct to rigorous empirical methods.

(g) Reevaluate existing programs; research could be conducted on which components or practices within a program are evidence-based. As indicated in the literature review, Project Alert, a program on the National Registry of Evidence-based Programs and Practices created by the Substance Abuse and Mental Health Services Administration (SAMHSA), used six outcome measures, six different substances (marijuana, cocaine, etc.), three risk levels, and two types of programs for 100 comparisons between program and control conditions, and only one comparison was significant in the positive direction (Ellickson et al., 1993; Weiss et al., 2008). The Project Alert two-year core curriculum consists of 11 lessons; the program developer suggests that, for fidelity, it should be delivered once a week during the first year, plus 3 booster lessons the following year, at approximately 45 minutes a session. The effort does not seem to be justified by the strength of the evidence.

Summary

Schools have taken on a much greater role than the teaching of our children, which in itself is an enormous task. Schools have become our nation's answer for providing the supports and services that would once have been a public health or family concern. They are recognized as the de facto health and mental health system, and they are also recognized as the most effective and efficient avenue for providing prevention programs. School staff may be ill-equipped to understand, or have time for, all the new priorities placed on them, but they understand the importance of addressing barriers to learning to achieve positive academic results. The dismal results of this study support the need for more research on the challenges and supports school staff face as they respond to the needs of children and youth. Issues of feasibility, fidelity, and adaptability should be explored in future studies, along with outcomes.

As new programs are designed, program developers and researchers working with practitioners in the schools may be able to avoid some of the challenges schools face in implementing programs. A paradigm shift needs to occur: from schools adapting their environments to programs in order to reach implementation fidelity, to researchers and developers designing and adapting their programs to the reality of school environments. Developing new programs, or reviewing and adapting existing ones, in conjunction with school personnel has the potential to increase implementation fidelity in the schools.


References

Adelman, H. S., & Taylor, L. (2000). Looking at school health and a school reform policy through the lens of addressing barriers to learning. Children's Services: Social Policy, Research, and Practice, 3, 117-131.

Adelman, H. S., & Taylor, L. (2006). The school champion's guide to student learning supports. Thousand Oaks, CA: Corwin.

Albee, G. W., & Gullotta, T. (Eds.). (1997). Primary prevention works. Thousand Oaks, CA: Sage.

Andrews, D., & Buettner, C. (2002). Evaluating research and prevention programs for your school. Retrieved June 16, 2008, from http://altedmh.osu.edu/Database/ebdatabase.html

Andrews, D., & Buettner, C. (2002). Evaluating the feasibility of prevention & intervention approaches. Retrieved June 16, 2008, from http://altedmh.osu.edu/documents/evidence&feadibility/feasibility.pdf

Andrews, D., & Buettner, C. (2005). Evidence-based practice: Evaluating supporting evidence. Washington, DC: U.S. Department of Education, Center for School Improvement, Office of Alternative Education.

Aos, S., Mayfield, J., Miller, M., & Yen, W. (2006). Evidence-based treatment of alcohol, drug, and mental health disorders: Potential benefits, costs, and fiscal impacts for Washington State. Olympia: Washington State Institute for Public Policy.

Backer, T. E. (2003). Balancing program fidelity and adaptation for school-based programs. Encino, CA: Human Interaction Institute. Retrieved June 20, 2008, from http://captus.samhsa.gov/northeast/PDF/fidladap_NECAPT_telecon.pdf

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Berends, M., Bodilly, S. J., & Kirby, S. N. (2002). Facing the challenges of whole-school reform: New American schools after a decade. Santa Monica, CA: RAND.


Burns, B. J., Costello, E. J., Angold, A., Tweed, D., Stangle, D., Farmer, E. M. Z., et al. (1995). Children's mental health service use across service sectors. Health Affairs, 14, 148-159.

Carnegie Council on Adolescent Development's Task Force on Education of Young Adolescents. (1989). Turning points: Preparing American youth for the 21st century. Washington, DC: Author.

Centers for Disease Control and Prevention, National Center for Injury Prevention and Control. (2007). Youth violence fact sheet. Atlanta: U.S. Department of Health and Human Services.

Central City Public Schools. (n.d.). Central City Public Schools fast facts. Central City, Manzano: Author. Retrieved July 22, 2008, from http://www.rda.CCPS.edu/RDA/Documents/Main/CCPS_Fast_Facts.pdf

Central City Public Schools Research & Evaluation Team. (2005). Central City Public Schools developmental assets: A profile of our youth. Retrieved July 24, 2008, from http://www.rda.CCPS.edu/RDA/Documents/Publications/05_06/A_Profile_of_Our_Youth.pdf

Chen, H. (1998). Theory-driven evaluations. Advances in Educational Productivity, 7, 15-34.

Chen, H. T. (1990). Theory-driven evaluations. Newbury Park, CA: Sage.

Collaborative for Academic, Social, and Emotional Learning. (2003). Safe and sound: An educational champion's guide to evidence-based social and emotional learning (SEL) programs. Chicago: Author.

Committee for Children. (2002). Second Step: A violence prevention curriculum. Seattle, WA: Author.

Communication theory/Diffusion of innovation theory. (n.d.). Wikibooks. Retrieved May 7, 2008, from en.wikibooks.org/wiki/CommunicationTheory/Diffusion_of_Innovations

Council of Chief State School Officers. (2002). Mission. Retrieved July 21, 2008, from http://www.idealist.org/en/org/136505-56

Crick, N. R., & Dodge, K. A. (1994). A review and reformulation of social information-processing mechanisms in children's social adjustment. Psychological Bulletin, 115, 74-101.


Crist, D. M. (2004). Central City Public Schools health & wellness program outcomes management. Central City, Manzano: Central City Public Schools.

Daleiden, E. L., & Chorpita, B. F. (2005). From data to wisdom: Quality improvement strategies supporting large-scale implementation of evidence-based services. Child & Adolescent Psychiatric Clinics of North America, 14, 329-349.

DeFriese, G. H., Crossland, C. L., Pearson, C. E., & Sullivan, J. (Eds.). (1990). Comprehensive school health programs: Current status and future prospects. Journal of School Health, 60, 127-190.

Devaney, E., O'Brien, M. U., Resnik, H., Keister, S., & Weissberg, R. P. (2006). Sustainable school-wide social and emotional learning implementation guide. Chicago: Collaborative for Academic, Social, and Emotional Learning.

Dobson, L., & Cook, T. (1980). Avoiding Type III error in program evaluation: Results from a field experiment. Evaluation and Program Planning, 3, 269-276.

Dryfoos, J. G. (1990). Adolescents at risk: Prevalence and prevention. New York: Oxford University Press.

Dryfoos, J. G. (1998). Safe passage: Making it through adolescence in a risky society. New York: Oxford University Press.

Duchnowski, A. J., Kutash, K., & Oliveria, B. (2004). Systematically examining school improvement activities including special education. Remedial and Special Education, 25(2), 117-129.

Dunavin, R. (2006). Statistical peers for benchmarking. Central City, Manzano: Central City Public Schools Research & Development Department.

Durlak, J. A., & Wells, A. M. (1997). Primary prevention mental health programs for children and adolescents: A meta-analytic review. American Journal of Community Psychology, 25, 207-214.

Dusenberry, L., Brannigan, R., Falco, M., & Hansen, M. (2003). A review of research on fidelity. Health Education Research, 18(2), 237-256.

Elias, M. J., Gager, P., & Leon, S. (1997). Spreading a warm blanket of prevention over all children: Guidelines for selecting substance abuse and related curricula for use in the schools. Journal of Primary Prevention, 18, 41-69.

Elias, M., Zins, J., Weissberg, R., Frey, K., Greenberg, M., Haynes, N., et al. (1997). Promoting social and emotional learning: Guidelines for educators. Alexandria, VA: Association for Supervision and Curriculum Development.


Ellickson, P. L., Bell, R. M., & McGuigan, K. (1993). Preventing adolescent drug use: Long-term results of a junior high program. American Journal of Public Health, 83, 856-861.

Ennett, S. T., Tobler, N. S., Ringwalt, C. L., & Flewelling, R. L. (1994). How effective is drug abuse resistance education? A meta-analysis of Project DARE outcome evaluations. American Journal of Public Health, 84(9), 1394-1401.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network.

Flannery, D. J. (2006). Violence and mental health in everyday life. Maryland: AltaMira Press.

Fleiss, J. L. (1986). The design and analysis of clinical experiments. New York: John Wiley & Sons.

Freudenberg, N., & Ruglis, J. (2007). Reframing school dropout as a public health issue. Preventing Chronic Disease, 4(4). Retrieved January 8, 2008, from http://www.cdc.gov/pcd/issues/2007/oct/07-0063.htm

Gottfredson, D. C., Fink, C. M., Skroban, S., & Gottfredson, G. D. (1997). Making prevention work. In R. P. Weissberg, T. P. Gullotta, R. L. Hampton, B. A. Ryan, & G. R. Adams (Eds.), Establishing preventive services (pp. 219-252). Thousand Oaks, CA: Sage.

Gottfredson, G. D., & Gottfredson, D. C. (2002). What schools do to prevent problem behavior and promote safe environments. Journal of Educational & Psychological Consultation, 12, 313-344.

Graduating in CCPS: Understanding the cohort report. (2005). Central City, Manzano: Central City Public Schools Research & Accountability Department. Retrieved July 21, 2008, from http://www.rda.CCPS.edu/RDA/Documents/Publications/06_07/Understanding_Grad_%20Cohort_Report.pdf

Greenberg, M. T., Domitrovich, C. E., & Bumbarger, B. (2000). Preventing mental health disorders in school-age children: A review of the effectiveness of prevention programs. Pennsylvania: Prevention Research Center for the Promotion of Human Development, College of Health & Human Development. Retrieved July 21, 2008, from http://www.pdeinfo.state.pa.us/svcs_students/lib/svcs_students/Chapter12PreventingMentalHealthDisorders.pdf

PAGE 138

Greenberg, M. T., Domitrovich, C. E., Graczyk, P. A., & Zins, J. E. (2005, draft). The study of implementation in school-based preventive interventions: Theory, research, and practice (Vol. 3). DHHS Pub. No. (SMA) ___. Rockville, MD: Center for Mental Health Services, Substance Abuse and Mental Health Services Administration.

Greenberg, M. T., Weissberg, R. P., O'Brien, M. U., Zins, J. E., Fredericks, L., Resnik, H., et al. (2003). Enhancing school-based prevention & youth development through coordinated social, emotional, & academic learning. American Psychologist, 58(6/7), 466-474.

Halberstadt, A. G., Denham, S. A., & Dunsmore, J. C. (2001). Affective social competence. Social Development, 10, 79-119.

Hall, G. E., & Hord, S. M. (2001). Implementing change: Patterns, principles, and potholes. Needham Heights, MA: Allyn & Bacon.

Hallfors, D., Pankratz, M., & Hartman, S. (2007). Does federal policy support the use of scientific evidence in school-based prevention programs? Prevention Science, 8(1), 75-87.

Hallfors, D., Sporer, A., Pankratz, M., & Godette, D. (2000). 2000 Drug Free Schools Survey: Report of results. Chapel Hill, NC: University of North Carolina.

Heath, D. (2006). Central City Public Schools safe and quality learning and working environment. Central City, Manzano: Central City Public Schools Research & Accountability Department, Principal investigator. Retrieved July 21, 2008, from http://www.rda.CCPS.edu/RDA/Documents/Publications/06_07/Safe_Quality_Learning_Working_Envir.pdf

Jaycox, L. H., McCaffrey, D. F., Ocampo, B. W., Shelley, G. A., Blake, S. A., Peterson, D. J., et al. (2006). Challenges in the evaluation and implementation of school-based prevention and intervention programs on sensitive topics. American Journal of Evaluation, 27(3), 320-336.

Johnson, J., Cohen, P., Smailes, E., Kasen, S., & Brook, J. (2002). Television viewing and aggressive behavior during adolescence and adulthood: 1977-1992. Developmental Psychology, 39, 201-221.

Kendall, P. C. (1993). Cognitive-behavioral therapies with youth: Guiding theory, current status, and emerging developments. Journal of Consulting and Clinical Psychology, 61(2), 235-247.

Kendall, P. C. (2000). Guiding theory for therapy with children and adolescents. In P. C. Kendall (Ed.), Child and adolescent therapy: Cognitive-behavioral procedures (pp. 3-27). New York: Guilford.

Khan, S., & Van Wynsberghe, R. (2008). Cultivating the under-mined: Cross-case analysis as knowledge mobilization. Qualitative Social Research, 9(1). Retrieved May 5, 2008, from http://nbn-resolving.de/urn:nbn:de:0114-fqs0801348

KIDS COUNT Data Center. (2007). Annie E. Casey Foundation. Retrieved from http://www.kidscount.org/datacenter/index.jsp

Kinnunen, J. (1996). Gabriel Tarde as a founding father of innovation diffusion research. Acta Sociologica, 39(4), 431-442.

Kolbe, L. J., Collins, J., & Cortese, P. (1997). Building the capacity for schools to improve the health of the nation: A call for assistance from psychologists. American Psychologist, 52, 256-265.

Kutash, K., Duchnowski, A. J., & Lynn, N. (2006). School-based mental health: An empirical guide for decision-makers. Tampa, FL: University of South Florida, The Research & Training Center for Children's Mental Health, Louis de la Parte Florida Mental Health Institute.

Kyler, S. J., Bumbarger, B. K., & Greenberg, M. T. (2005). Penn State technical assistance fact sheets: Evidence-based programs. State College, PA: Pennsylvania State University, Prevention Research Center.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33, 159-174.

Larsen, T., & Samdal, O. (2007). Implementing Second Step: Balancing fidelity and program adaptation. Journal of Educational and Psychological Consultation, 17(1), 1-29.

Leaf, P., Alegria, M., Cohen, P., Goodman, S. H., Horwitz, S. M., Hoven, C. W., et al. (1996). Mental health service use in the community and schools: Results from the four-community MECA study. Journal of the American Academy of Child and Adolescent Psychiatry, 35, 889-897.

Luria, A. R. (1961). The role of speech in the regulation of normal and abnormal behaviors. New York: Liveright.

McGraw, K. O., & Wong, S. P. (1996). Forming inferences about some intraclass correlation coefficients. Psychological Methods, 1, 30-46.

Merton, R. K., Fiske, M., & Kendall, P. L. (1990). The focused interview: A manual of problems and procedures (2nd ed.). New York: Free Press.

Metz, A. J. R., Espiritu, R., & Moore, K. A. (2007). What is evidence-based practice? Washington, DC: Child Trends. Retrieved February 2, 2008, from http://www.childtrends.org/Files//Child_Trends-2007_06_04_RB_EBP1.pdf

Moore, B., & Beland, K. (1992). Evaluation of Second Step, Preschool-Kindergarten: A violence prevention curriculum. Seattle, WA: Committee for Children.

Mulgan, G. (2005). Government, knowledge & the business of policy making: The potential and limits of evidence-based policy. Evidence and Policy, 1(2), 215-216.

Nabors, L. A., Weist, M. D., & Reynolds, M. W. (2000). Overcoming challenges in outcome evaluations of school mental health programs. Journal of School Health, 70(5), 206-209.

National Coordinating Technical Assistance Center for Drug Prevention and School Safety Program Coordinators. (2003). Frequently asked questions. Retrieved April 27, 2007, from http://www.k12coordinators.org/faqs.asp

Manzano Department of Health. (2005). Manzano Youth Risk and Resiliency. Central City Public Schools. Retrieved July 23, 2008.

Manzano youth violence in Manzano: An assessment of indicators, policies, resources, and community readiness. (2005). Atlanta: Centers for Disease Control and Prevention ESCAPE (Grant # U17CCU624342-01). Office of Injury Prevention, Injury and Behavioral Epidemiology Bureau, Epidemiology and Response Division, The Violence Free Youth Partnership.

Osowski, M. (2007). Research brief: Steps to violence prevention project. Central City, Manzano: Central City Public Schools Research, Development & Accountability.

Pirrie, A. (2001). Evidence-based practice in education: The best medicine? British Journal of Educational Studies, 49(2), 124-136.

President's New Freedom Commission on Mental Health. (2003). Achieving the promise: Transforming mental health care in America (Final report, DHHS Pub. No. SMA-03-3832). Rockville, MD: U.S. Department of Health and Human Services.

Prevention 2000: Moving effective prevention programs into practice. (2000). St. Michaels, MD: Robert Wood Johnson Foundation.

Principles of substance abuse prevention (DHHS Publication No. SMA 01-3507). (2001). Rockville, MD: National Clearinghouse for Alcohol and Drug Information, Center for Substance Abuse Prevention, SAMHSA.

Report of the Coalition for Evidence-Based Policy. (2002). NCLB Act 2001, Title IX, Part A, 37(A). Principal investigator.

Resnick, M. D., Ireland, M., & Borowsky, I. (2004). Youth violence perpetration: What protects? What predicts? Findings from the National Longitudinal Study of Adolescent Health. Journal of Adolescent Health, 35(424), 1-10.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.

Rones, M., & Hoagwood, K. (2000). School-based mental health services: A research review. Clinical Child and Family Psychology Review, 3(4), 223-241.

Rossi, P. H., & Freeman, H. E. (1985). Evaluation: A systematic approach (3rd ed.). Beverly Hills, CA: Sage.

Ryan, B., & Gross, N. (1943). The diffusion of hybrid seed corn in two Iowa communities. Rural Sociology, 8, 15-24.

Sandia County Health Council. (2002). Community health profile & community health improvement plan. Central City, Manzano: Sandia County Health Council.

Sarason, S. B. (2002). Educational reform: A self-scrutinizing memoir. New York: Teachers College Press.

Satcher, D. (2001). Youth violence: A report of the Surgeon General. Retrieved April 3, 2008, from http://www.surgeongeneral.gov/library/youthviolence/toc.html

Saunders, L. (2005). Research & policy: Reflections on their relationship. Evidence & Policy, 1(3), 383-390.

Schinke, S., Brounstein, P., & Gardner, S. (2002). The empirical base of school-based mental health services. In K. Kutash, A. J. Duchnowski, & N. Lynn (Eds.), School-based mental health: An empirical guide for decision-makers (pp. 35-60). Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, Dept. of Child and Family Studies, Research and Training Center for Children's Mental Health.

Tarde, G. (1903). The laws of imitation (E. C. Parsons, Trans.). New York: Holt.

Taylor, D. (2005). Governing through evidence: Participation & power in policy evaluation. Journal of Social Policy, 34(4), 601-618.

Taylor, L., Nelson, P., & Adelman, H. (1999). Scaling up reforms across a school district. Reading and Writing Quarterly, 15(4), 303-325.

Technical Assistance Collaborative. (n.d.). Behavioral health needs & GCCPS in Manzano. Retrieved July 24, 2008, from http://www.tacinc.org/Pubs/Manzano_needs_gCCPS.htm

U.S. Census Bureau. (1998b). Educational attainment in the United States: March 1997. Washington, DC: Principal investigator.

U.S. Department of Commerce, Census Bureau, Current Population Survey (CPS). (2005).

U.S. Department of Education, National Center for Education Statistics. (2004). The condition of education 2004 (NCES 2004). Washington, DC: U.S. Government Printing Office.

U.S. Department of Health and Human Services. (1999). Understanding substance abuse prevention: Toward the 21st century: A primer on effective programs (DHHS Pub. No. SMA 99-3301). Rockville, MD: Principal investigator.

U.S. Department of Health and Human Services. (2001). Youth violence: A report of the Surgeon General. Rockville, MD: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, National Institutes of Health, National Institute of Mental Health.

U.S. Department of Justice, Bureau of Justice Statistics. (2000). Correctional populations in the United States, 1997 (NCJ). Washington, DC: U.S. Government Printing Office.

U.S. Department of Justice, Bureau of Justice Statistics. (2002). Correctional populations in the United States, 1998 (NCJ). Washington, DC: U.S. Government Printing Office.

U.S. Department of Labor, Bureau of Labor Statistics. (2006). Tabulations. Retrieved March 30, 2007, from http://www.bls.gov/cps/cpsaat7.pdf

Weiss, D. H. (1998). Congressional committees as users of analysis. Journal of Policy Analysis International Journal of Education, 23(2), 137-150.

Weiss, C. H., Murphy-Graham, E., Petrosino, A., & Gandhi, A. G. (2008). The fairy godmother and her warts: Making the dream of evidence-based policy come true. American Journal of Evaluation, 29, 29-47.

Weissberg, R. P., & Greenberg, M. T. (1998). School and community competence-enhancement and prevention programs. In W. Damon (Series Ed.) & I. E. Sigel & K. A. Renninger (Vol. Eds.), Handbook of child psychology: Vol. 4. Child psychology in practice (5th ed., pp. 877-954). New York: Wiley.

Weisz, J. R., Sandler, I. N., Durlak, J. A., & Anton, B. S. (2005). Promoting and protecting youth mental health through evidence-based prevention and treatment. American Psychologist, 60(6), 628-648.

Yin, R. K. (1989, 1994). Case study research: Design and methods. Thousand Oaks, CA: Sage Publications.

Yin, R. K. (1994a). Discovering the future of the case study method in evaluation research. Evaluation Practice, 15, 283-294.

Appendices

Appendix A: Email to Principals

An Examination of the Implementation of Second Step in a Public School System
Lynn Pedraza

My study is on the implementation of Second Step in an urban school district. I will be studying six elementary schools that had staff trained in Second Step in 2005. I am studying the strengths and challenges of implementing evidence-based programs in schools. The study examines what the developers of the programs are doing right and what they can do better to support schools, as well as what schools are doing right during the implementation process and what they can do better. It is not about the outcomes of using the program.

What I need from you and your staff:

1) Three interviews, approximately one hour each. I need the principal, a teacher or counselor who provided Second Step, and what I call a "key informant." The key informant can be either another teacher or counselor who provided Second Step or someone who was there during the first years of implementation.

2) One focus group, approximately one hour long, with at least four individuals who were at the school in 2005 when Second Step was implemented. They could be teachers, counselors, educational assistants, librarians...anyone who knows anything about Second Step from that first year.

3) All participants will be asked to complete a checklist on implementation, approximately fifteen minutes.

Appendix A: (Continued)

What else you may want to know:

1) Your school, your name, and your staff names will not be used. All information in the interviews and focus groups will be confidential.

2) If possible, I would like copies of any documents used for implementation.

3) I will need to complete six schools before the end of the school year. I will need help in setting up two or three days to do the three interviews and one focus group.

4) I have three trained team members to help me. One may accompany me, or two of them may work together to do the interviews or focus groups.

5) Please do not hesitate to ask me to clarify any questions. My personal cell phone number is ___________.

Appendix B: Principal Protocol

Stages: Exploration and Adoption, Program Installation, and Initial Implementation

Introduction page:
1. How long have you been in your field? _______________________________
2. How long have you been in this school? ______________________________
3. What grade did you teach or work with? _____________
4. How many colleagues worked in the same grade level as you did? __________
5. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Complete this section only if this is the first person interviewed at the school. Thinking back from the 2005-2006 school year, when your school was first trained, to this school year, 2008-2009:
6. What has been the turnover rate of principals at your school? ______
7. What has been the approximate turnover rate of teachers at your school? ______
8. What has been the turnover rate of counselors at your school? ______
9. What has been the approximate turnover rate of others at your school? ______

Appendix B: (Continued)

Principal Protocol

Proposition A: (Training) Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program.

Parts of Proposition (Indicators) and Questions for the Principal:

1. There is evidence that school staff received training on the purpose and expected outcomes of providing Second Step in the schools.
T1. Tell me about your experience introducing Second Step in your school.
T1a. Is there something you would have done differently, and if so, why?
T1b. If you did not participate, why not?
T2. How does your school collaborate with other schools in terms of Second Step?
T3. What was the process to engage staff participation?
T3a. How were you involved?
T4. How receptive was staff to the training on the purpose and expected outcomes of the Second Step program?
T4a. What were the influential factors regarding initial staff receptivity?
T4b. How receptive is staff to Second Step now?
T4c. What are the influential factors regarding current staff receptivity?
T5. How well did the training emphasize the importance of implementation fidelity to achieve the program's expected outcomes?
T5a. What tools were given in the training to support fidelity?
T5b. How close to the training model has your school implemented Second Step?

2. There is evidence that school staff received training on the Second Step Curriculum.
T6. Did all staff receive training on the Second Step curriculum?
T6a. If not, who did, and how was that decision made?
T7. How were the Second Step tools for implementation integrated into the training?
T7a. Tell me how staff was supported in using the tools to support and evaluate implementation fidelity for Second Step.
T8. Who provided the training: professional trainers, school district staff, or some other role group?
T8a. What kind of ongoing or follow-up training was offered for Second Step?
T9. To what extent were adaptations to special populations, such as students in special education, discussed in the curriculum training?

Appendix B: (Continued)

T10. Tell me about discussions that occurred regarding which Second Step components were flexible.
T10a. How well did you feel you understood the impact of adapting the components on the success of the program?

3. There is evidence that the school staff received training on how to involve families in Second Step.
T11. What type of training was provided to school staff on parent involvement with Second Step?
T12. What documentation of parental involvement efforts was collected (e.g., letters, guides, parent/teacher conferences, newsletters, program-related posters)?
T13. What methods helped to involve families in Second Step?

4. There is evidence that school staff received an adequate number of curriculum kits.
T14. How ample was the number of curriculum kits and supplies for the staff?
T14a. How did the number of kits influence implementation fidelity?
T15. Did you receive and have an opportunity to review the curriculum kits?
T15a. Were you able to give feedback on the kits?
T15b. What do you think about the kits?
T16. Where are materials located?
T16a. How are they maintained and accounted for?

Appendix B: (Continued)

Proposition B: (Time) When implementing the program, sufficient time was allocated for school staff to review the program components with peers, as well as sufficient time to deliver the program to students.

Parts of Proposition (Indicators) and Questions for the Principal:

1. There is evidence that school staff received sufficient time to review the Second Step program.
TM1. How do you feel about the time allocated for hands-on experiences and practice with program materials?
TM2. Was the time used in a productive way?
TM2a. How so?
TM2b. Is there anything you would have done differently?

2. There is evidence that school staff received shared time to work together for appropriate implementation planning of Second Step.
TM3. How is shared time allocated for staff?
TM3a. Was it adequate for appropriate implementation planning of Second Step?
TM3b. If not, why not?
TM4. How did you encourage staff to address language and cultural needs during shared time?
TM5. What type of specific practice sessions did the staff receive for appropriate implementation of Second Step? An example might be how the program's key concepts can be adapted to students with special needs.
TM6. How was implementation planning time facilitated, allowing for input and shared ideas among staff about Second Step implementation?
TM7. What was your role in curricula presentation planning?
TM7a. How did you prioritize your work when implementing Second Step into your schedule?
TM7b. How did that decision impact Second Step implementation fidelity?

Appendix B: (Continued)

3. There is evidence that specific blocks of time were allocated for school staff to implement the Second Step program.
TM8. How often were there specific blocks of time allocated for program implementation?
TM8a. How adequate was this?
TM8b. If there were few or no blocks of time, how did you allocate specific time for Second Step?
TM9. What types of documentation were developed (calendars, lesson plans, etc.) indicating allocated time?
TM9a. Do you know where they were kept?
TM10. How did you feel about the time allocated to implement the program?

4. There is evidence that school staff received shared time to review successes and concerns about Second Step implementation.
TM11. How was time allocated for colleagues to review successes and concerns about implementation?
TM11a. If there was little or no time allocated, would you have liked to have had time?
TM11b. If you would have liked time, how might you have done it?
TM12. How were concerns about implementation addressed?
TM12a. Will you please give me an example or two?

Appendix B: (Continued)

Proposition C: (Implementation Level) If Second Step was implemented school-wide, there was more peer-to-peer support, more adherence to the program, and staff were more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

Parts of Proposition (Indicators) and Questions for the Principal:

1. There is evidence that Second Step was delivered school-wide.
D1. Were you given information on the importance of staff participation in Second Step training?
D1a. If so, how was this done?
D2. Did your school implement Second Step school-wide, by grade levels, or by individual classrooms?
D2a. How was that decision made?
D2b. What did you think about that decision?

2. There is evidence that if Second Step was implemented school-wide, there were more specific time blocks allocated for peer-to-peer support.
D3. How did you feel about the time staff was allocated for peer-to-peer support around program implementation?
D4. How would you describe the process of implementation of the Second Step curricula?

3. There is evidence that staff who used the Second Step implementation tools were more likely to be at a school that implemented Second Step school-wide.
D5. Which Second Step implementation tools were used at your school?
D6. Tell me about which Second Step implementation tools were flexible and which were required to assess fidelity.
D7. How did the implementation tools help to maintain fidelity of the implementation of Second Step?

Appendix B: (Continued)

4. There is evidence that staff attributed positive outcomes to Second Step.
D8. How did you perceive and experience the Second Step program in relation to your work?
D8a. In other words, what was its relevance to your work?
D8b. How was it meeting the needs of your students?
D9. What types of student behavioral changes did you witness based on your observations during Second Step implementation?
D9a. Any staff behavioral changes?
D10. Overall, what is your perception of the influence Second Step has had on your school?

Appendix B: (Continued)

Proposition D: (Champion) When a school had a designated Champion for Second Step, school staff were more likely to implement the program with higher levels of implementation than when there was no Champion present.

Parts of Proposition (Indicators) and Questions for the Principal:

1. There is evidence that there was a designated Champion.
C1. Who was the Champion, and how was the role of the Champion communicated to the staff?
C1a. How did that go?
C2. What types of interactions did staff have with the Champion?
C3. How did the Champion influence the use of Second Step?
C4. How was Second Step included as part of the school's overall strategic plan?

2. There is evidence that the designated Champion articulated the Second Step program to the entire staff.
C5. How involved was the Champion in the introductory training?
C5a. What level of commitment or buy-in did they appear to have?
C6. How involved was the Champion in the training of curriculum implementation?
C6a. Was adequate time spent in the training?
C7. How involved was the Champion in articulating the Second Step program with the staff?
C7a. How do you know?

3. There is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program.
C8. Describe how the allocation of resources, such as shared prep time, purchase of materials, etc., facilitated the Second Step program in your school, and the importance of this.
C9. Who facilitated the allocation of time and resources to support Second Step?
C10. How did the Champion encourage fidelity to the model?
C11. How did the Champion facilitate and follow up on the implementation and use of Second Step?

Appendix B: (Continued)

4. There is evidence that implementation of Second Step with higher levels of implementation was associated with the presence of a Champion.
C12. What strategies do you believe were the most effective in the implementation process?
C12a. What strategies did not work?
C12b. What contributed to either the success or failure of strategies?
C13. How did your school evaluate the fidelity of the implementation of the Second Step program?

Is there anything you would like to add that you think might be important for others wanting to adopt the program?
Is there someone else we should talk with?

Appendix C: Counselor/Teacher/Key Informant Protocol

Stages: Exploration and Adoption, Program Installation, and Initial Implementation

Introduction page:
1. How long have you been in your field? _______________________________
2. How long have you been in this school? ______________________________
3. What grade did you teach or work with? _____________
4. How many colleagues worked in the same grade level as you did? __________
5. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Complete this section only if this is the first person interviewed at the school. Thinking back from the 2005-2006 school year, when your school was first trained, to this school year, 2008-2009:
6. What has been the turnover rate of principals at your school? ______
7. What has been the approximate turnover rate of teachers at your school? ______
8. What has been the turnover rate of counselors at your school? ______
9. What has been the approximate turnover rate of others at your school? ______

Appendix C: (Continued)

Proposition A: (Training) Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program.

Parts of Proposition (Indicators) and Questions for the Counselor/Teacher/Key Informant:

1. There is evidence that school staff received training on the purpose and expected outcomes of providing Second Step in the schools.
T1. Why was Second Step introduced to your school?
T1a. How was it introduced to your school?
T1b. Is there something you would have done differently, and if so, why?
T2. What was the process to engage staff participation?
T2a. Was your participation in the introductory training mandated or voluntary?
T2b. What was the reason for that decision?
T3. How were the program's key concepts and expected outcomes introduced to you?
T4. How clearly do you think the program objectives were presented?
T5. How well did the training emphasize the importance of implementation fidelity to achieve the program's expected outcomes?
T5a. What tools were given in the training to support fidelity?
T5b. How close to the training model has your school implemented Second Step?
T5c. If you believe your school has not followed the training model closely, why not?

2. There is evidence that school staff received training on the Second Step Curriculum.
T6. Did all staff receive training on the curriculum?
T6a. If not, who did?
T6b. Why do you think that decision was made?
T7. How were the Second Step tools for implementation integrated into the training?
T7a. Tell me how staff were encouraged to use the tools to support and evaluate implementation fidelity for Second Step.
T8. Who provided the training: professional trainers, school district staff, or some other role group?
T8a. What kind of ongoing or follow-up training was offered for Second Step?
T9. To what extent were adaptations to special populations, such as students in special education, discussed in the curriculum training?
T10. How much discussion occurred on which components were flexible?
T10a. How well did you feel you understood the impact of adapting the components on the success of the program?

Appendix C: (Continued)

T11. How would you describe your overall experience with the program training?
T11a. Did all faculty/staff receive training on the Second Step curriculum?
T11b. If not, who did, and how was that decision made?
T12. What role does the training play in the fidelity of implementation of the curriculum?

3. There is evidence that the school staff received training on how to involve families in Second Step.
T12. How would you describe the training provided to school staff on parent involvement with Second Step?
T13. What documentation of parental involvement efforts was collected (e.g., letters, guides, parent/teacher conferences, newsletters, program-related posters)?
T13a. Do you have copies of any documents to share with the study?
T14. What methods helped to involve families in Second Step?
T14a. What role does understanding and encouraging parent involvement have on implementation fidelity?

4. There is evidence that school staff received an adequate number of curriculum kits.
T15. How ample was the number of curriculum kits and supplies for the staff?
T16. Were you able to give feedback on the kits?
T16a. What do you think about the kits?
T17. How was your opinion on the kits solicited, and how was it valued?
T18. Discuss the highlights and weaknesses of the curriculum kits.
T18a. Were materials provided in a timely manner?
T19. Where are materials located?
T19a. How are they maintained and accounted for?

Appendix C: (Continued)

Proposition B: (Time) When implementing the program, sufficient time was allocated for school staff to review the program components with peers, as well as sufficient time to deliver the program to students.

Parts of Proposition (Indicators) and Questions for the Counselor/Teacher/Key Informant:

1. There is evidence that school staff received sufficient time to review the Second Step program.
TM1. How do you feel about the time allocated for hands-on experiences and practice with program materials?
TM2. Was the time used in a productive way?
TM2a. How so?
TM2b. Is there anything you would have done differently?

2. There is evidence that school staff received shared time to work together for appropriate implementation planning of Second Step.
TM3. If you had shared time with colleagues, how did you use the shared time?
TM3a. Was it adequate?
TM4. How did you address language and cultural needs during shared time?
TM5. What type of specific practice sessions did the training provide? An example might be how the program's key concepts can be adapted to students with special needs.
TM6. How was implementation planning time facilitated, allowing for input and shared ideas among staff about Second Step implementation?
TM7. What was your role in curricula presentation planning?
TM7a. How did you prioritize your work when implementing Second Step into your schedule?
TM7b. How did that decision impact Second Step implementation fidelity?

3. There is evidence that specific blocks of time were allocated for school staff to implement the Second Step program.
TM8. How often were there specific blocks of time allocated for program implementation?
TM8a. How adequate was this?
TM8b. If there were few or no blocks of time, how did you allocate specific time for Second Step?
TM9. What types of documentation were developed (calendars, lesson plans, etc.) indicating allocated time?
TM9a. Do you know where they were kept?
TM10. How did you feel about the time allocated to implement the program?

Appendix C: (Continued)

4. There is evidence that school staff received shared time to review successes and concerns about Second Step implementation.
TM11. How was time allocated for colleagues to review successes and concerns about implementation?
TM11a. If there was little or no time allocated, would you have liked to have had time?
TM11b. If you would have liked time, how might you have done it?
TM12. How were concerns about implementation addressed?
TM12a. Will you please give me an example or two?

Appendix C: (Continued)

Proposition C: (Implementation Level) If Second Step was implemented school-wide, there was more peer-to-peer support, more adherence to the program, and staff were more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

Parts of Proposition (Indicators) and Questions for the Counselor/Teacher/Key Informant:

1. There is evidence that Second Step was delivered school-wide.
D1. Were you given information on the importance of staff participation in Second Step training?
D1a. If so, how was this done?
D2. Did your school implement Second Step school-wide, by grade levels, or by individual classrooms?
D2a. How was that decision made?
D2b. What did you think about that decision?

2. There is evidence that if Second Step was implemented school-wide, there were more specific time blocks allocated for peer-to-peer support.
D3. How did you feel about the time staff was allocated for peer-to-peer support around program implementation?
D4. How would you describe the process of implementation of the Second Step curriculum?

3. There is evidence that staff who used the Second Step implementation tools were more likely to be at a school that implemented Second Step school-wide.
D5. Which Second Step implementation tools were used at your school?
D6. Tell me about which Second Step implementation tools were flexible and which were required to assess fidelity.
D7. How did the implementation tools help to maintain fidelity of the implementation of Second Step?

Appendix C: (Continued)

4. There is evidence that staff attributed positive outcomes to Second Step.
D8. How did you perceive and experience the Second Step program in relation to your work?
D8a. In other words, what was its relevance to your work?
D8b. How was it meeting the needs of students at your school?
D9. What types of student behavioral changes did you witness based on your observations during Second Step implementation?
D9a. Any staff behavioral changes?
D10. Overall, what is your perception of the influence Second Step has had on your school?

Appendix C: (Continued)

Proposition D: (Champion) When a school had a designated Champion for Second Step, school staff were more likely to implement the program with higher levels of fidelity than when there was no Champion present.

Parts of Proposition (Indicators) and Questions for the Counselor/Teacher/Key Informant:

1. There is evidence that there was a designated Champion.
C1. Who was the Champion, and how was the role of the Champion communicated to the staff?
C1a. How did that go?
C2. What types of interactions did staff have with the Champion?
C3. How did the Champion influence the use of Second Step?
C4. How was Second Step included as part of the school's overall strategic plan?

2. There is evidence that the designated Champion articulated the Second Step program to the entire staff.
C5. How involved was the Champion in the introductory training?
C5a. What level of commitment or buy-in did they appear to have?
C6. How involved was the Champion in the training of curriculum implementation?
C6a. Was adequate time spent in the training?
C7. How involved was the Champion in articulating the Second Step program with the staff?
C7a. How do you know?

3. There is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program.
C8. Describe how the allocation of resources, such as shared prep time, purchase of materials, etc., facilitated the Second Step program in your school, and the importance of this.
C9. Who facilitated the allocation of time and resources to support Second Step?
C10. How did the Champion encourage fidelity to the model?
C11. How did the Champion facilitate and follow up on the implementation and use of Second Step?

Appendix C: (Continued)

4. There is evidence that implementation of Second Step with higher levels of implementation was associated with the presence of a Champion.
C12. What strategies do you believe were the most effective in the implementation process?
C12a. What strategies did not work?
C12b. What contributed to either the success or failure of strategies?
C13. How did your school evaluate the fidelity of the implementation of the Second Step program?

Is there anything you would like to add that you think might be important for others wanting to adopt the program?
Is there someone else we should talk with?

Appendix D: Focus Group Protocol

Stages: Exploration and Adoption, Program Installation, and Initial Implementation

Introduction page:

Participant: ________________________________________
1. How long have you been in your field? _______________________________
2. How long have you been in this school? ______________________________
3. What grade did you teach or work with? _____________
4. How many colleagues worked in the same grade level as you did? __________
5. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Participant: ________________________________________
6. How long have you been in your field? _______________________________
7. How long have you been in this school? ______________________________
8. What grade did you teach or work with? _____________
9. How many colleagues worked in the same grade level as you did? __________
10. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Participant: ________________________________________
11. How long have you been in your field? _______________________________
12. How long have you been in this school? ______________________________

Appendix D: (Continued)

13. What grade did you teach or work with? _____________
14. How many colleagues worked in the same grade level as you did? __________
15. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Participant: ________________________________________
16. How long have you been in your field? _______________________________
17. How long have you been in this school? ______________________________
18. What grade did you teach or work with? _____________
19. How many colleagues worked in the same grade level as you did? __________
20. How do you rate the importance of implementing prevention programs in the classroom on a scale of 1-5, with 5 being very important? _____ Why? _______

Appendix D: (Continued)

Proposition A: (Training) Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program.

Parts of Proposition (Indicators) and Questions for the Focus Group:

1. There is evidence that school staff received training on the purpose and expected outcomes of providing Second Step in the schools.
T1. Why was Second Step introduced to your school?
T1a. How was it introduced to your school?
T1b. Is there something you would have done differently, and if so, why?
T2. What was the process to engage staff participation?
T3. How were the program's key concepts and expected outcomes introduced to you?
T4. How well did the training emphasize the importance of implementation fidelity to achieve the program's expected outcomes?
T4a. What tools were given in the training to support fidelity?
T4b. How close to the training model has your school implemented Second Step?

Appendix D: (Continued)

T4c. If you believe your school has not followed the training model closely, why not?

2. There is evidence that school staff received training on the Second Step Curriculum.
T5. Did all staff receive training on the curriculum?
T5a. If not, who did?
T5b. Why do you think that decision was made?
T6. How were the Second Step tools for implementation integrated into the training?
T6a. Tell me how staff were encouraged to use the tools to support and evaluate implementation fidelity for Second Step.
T7. To what extent were adaptations to special populations, such as students in special education, discussed in the curriculum training?
T8. How much discussion occurred on which components were flexible?
T8a. How well did you feel you understood the impact of adapting the components on the success of the program?
T8b. Why might it be important to be able to adapt the components?

Appendix D: (Continued)

T9. How would you describe your overall experience with the program training?
T9a. Did all faculty/staff receive training on the Second Step curriculum?
T9b. If not, who did, and how was that decision made?
T10. What role does the training play in the fidelity of implementation of the curriculum?

3. There is evidence that the school staff received training on how to involve families in Second Step.
T11. How would you describe the training provided to school staff on parent involvement with Second Step?
T12. What methods helped to involve families in Second Step?
T12a. What role does understanding and encouraging parent involvement have on implementation fidelity?

4. There is evidence that school staff received an adequate number of curriculum kits.
T13. How ample was the number of curriculum kits and supplies for the staff?

Appendix D: (Continued)

T14. Were you able to give feedback on the kits?
T14a. What do you think about the kits?
T15. Discuss the highlights and weaknesses of the curriculum kits.
T15a. Were materials provided in a timely manner?

Appendix D: (Continued)

Proposition B: (Time) When implementing the program, sufficient time was allocated for school staff to review the program components with peers, as well as sufficient time to deliver the program to students.

Parts of Proposition (Indicators) and Questions for the Focus Group:

1. There is evidence that school staff received sufficient time to review the Second Step program.
TM1. How do you feel about the time allocated for hands-on experiences and practice with program materials?
TM2. Was the time used in a productive way?
TM2a. How so?
TM2b. Is there anything you would have done differently?

2. There is evidence that school staff received shared time to work together for appropriate implementation planning of Second Step.
TM3. If you had shared time with colleagues, how did you use the shared time?
TM3a. Was it adequate?
TM4. How did you address language and cultural needs during shared time?
TM5. What type of specific practice sessions did the training provide? An example might be how the program's key concepts can be adapted to students with special needs.

Appendix D: (Continued)

TM6. How was implementation planning time facilitated, allowing for input and shared ideas among staff about Second Step implementation?
TM7. What was your role in curricula presentation planning?
TM7a. How did you prioritize your work when Second Step was implemented in the classroom?
TM7b. How did that decision impact Second Step implementation fidelity?

3. There is evidence that specific blocks of time were allocated for school staff to implement the Second Step program.
TM8. How often were there specific blocks of time allocated for program implementation?
TM8a. How adequate was this?
TM8b. If there were few or no blocks of time, how did you allocate specific time for Second Step?
TM9. What types of documentation were developed (calendars, lesson plans, etc.) indicating allocated time?
TM9a. Do you know where they were kept?

Appendix D: (Continued)

TM10. How did you feel about the time allocated to implement the program?

4. There is evidence that school staff received shared time to review successes and concerns about Second Step implementation.
TM11. How was time allocated for colleagues to review successes and concerns about implementation?
TM11a. If there was little or no time allocated, would you have liked to have had time?
TM11b. If you would have liked time, how might you have done it?
TM12. How were concerns about implementation addressed?
TM12a. Will you please give me an example or two?

Appendix D: (Continued)

Proposition C: (Implementation Level) If Second Step was implemented school-wide, there was more peer-to-peer support, more adherence to the program, and staff were more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

Parts of Proposition (Indicators) and Questions for the Focus Group:

1. There is evidence that Second Step was delivered school-wide.
D1. Were you given information on the importance of staff participation in Second Step training?
D1a. If so, how was this done?
D2. Did your school implement Second Step school-wide, by grade levels, or by individual classrooms?
D2a. How was that decision made?
D2b. What did you think about that decision?

2. There is evidence that if Second Step was implemented school-wide, there were more specific time blocks allocated for peer-to-peer support.
D3. How did you feel about the time staff was allocated for peer-to-peer support around program implementation?

Appendix D: (Continued)

D4. How would you describe the process of implementation of the Second Step curriculum?

3. There is evidence that staff who used the Second Step implementation tools were more likely to be at a school that implemented Second Step school-wide.
D5. Which Second Step implementation tools were used at your school?
D6. Tell me about which Second Step implementation tools were flexible and which were required to assess fidelity.
D7. How did the implementation tools help to maintain fidelity of the implementation of Second Step?

4. There is evidence that staff attributed positive outcomes to Second Step.
D8. How did you perceive and experience the Second Step program in relation to your work?
D8a. In other words, what was its relevance to your work?
D8b. How was it meeting the needs of students at your school?

Appendix D: (Continued)

D9. What types of student behavioral changes did you witness based on your observations during Second Step implementation?
D9a. Any staff behavioral changes?
D10. Overall, what is your perception of the influence Second Step has had on your school?

Appendix D: (Continued)

Proposition D: (Champion) When a school had a designated Champion for Second Step, school staff were more likely to implement the program with higher levels of implementation than when there was no Champion present.

Parts of Proposition (Indicators) and Questions for the Focus Group:

1. There is evidence that there was a designated Champion.
C1. Who was the Champion, and how was the role of the Champion communicated to the staff?
C1a. How did that go?
C2. What types of interactions did staff have with the Champion?
C3. How did the Champion influence the use of Second Step?
C4. How was Second Step included as part of the school's overall strategic plan?

2. There is evidence that the designated Champion articulated the Second Step program to the entire staff.
C5. How involved was the Champion in the introductory training?
C5a. What level of commitment or buy-in did they appear to have?
C6. How involved was the Champion in the training of curriculum implementation?
C6a. Was adequate time spent in the training?

Appendix D: (Continued)

C7. How involved was the Champion in articulating the Second Step program with the staff?
C7a. How do you know?

3. There is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program.
C8. Describe how the allocation of resources, such as shared prep time, purchase of materials, etc., facilitated the Second Step program in your school, and the importance of this.
C9. Who facilitated the allocation of time and resources to support Second Step?
C10. How did the Champion encourage fidelity to the model?
C11. How did the Champion facilitate and follow up on the implementation and use of Second Step?

4. There is evidence that implementation of Second Step with higher levels of fidelity was associated with the presence of a Champion.
C12. What strategies do you believe were the most effective in the implementation process?
C12a. What strategies did not work?
C12b. What contributed to either the success or failure of strategies?

Appendix D: (Continued)

C13. How did your school evaluate the fidelity of the implementation of the Second Step program?

Is there anything you would like to add that you think might be important for others wanting to adopt the program?
Is there someone else we should talk with?

Appendix E: Checklist for All Participants to Complete

Let's go through some of the steps of Second Step and have you rate each step on a scale of 1-5 in terms of how easy it was to implement. One will be the easiest and 5 will be the most difficult.

A. Implementation Practices (RATING)
1. Teaching SS at the grade level(s) you taught __________
2. Reinforcing strategies and concepts in daily activities & using consistent messages throughout the school __________
3. Extending learning opportunities by applying skill steps in all settings __________
4. Modeling SS skills and behaviors in all interactions __________
5. Integrating learning goals throughout the regular curriculum __________
6. Familiarizing parents and caregivers with the program __________

B. Training Models
1. SS Overview Presentation __________
2. Initial one-day staff training __________
3. Classroom observation __________
4. Involvement of non-classroom staff __________

C. Administrator's Roles and Responsibilities
1. Securing buy-in from entire staff __________
2. Awareness of need for social skills and violence prevention program __________
3. Understanding of use of SS to address identified needs __________

D. Evaluation of Progress __________
E. Needs assessment __________
F. Process evaluation __________
G. Outcome evaluation __________
H. SS presentation preparation & outline __________

I. SS lesson plans
1. Lesson plan breakdown __________
2. Lesson plan timing guidelines __________
3. Lesson plan social skills teaching strategies __________
4. Lesson plan role play tips __________
5. SS suggested scripts __________
6. SS problem-solving steps __________

Appendix F: Rating Scale

School being rated: __________________________________ Rater: ____________

Proposition A: (Training) Schools that received training in Second Step prior to implementation of the program were provided with the appropriate implementation tools and support necessary to implement the program with fidelity.

INSTRUCTIONS: Rate the following parts of the proposition. Please circle your response. If the data support or are against the statement, rate the evidence as strong, moderate, or mild by circling +3, +2, +1, -3, -2, or -1. If the data provide no evidence about the statement, circle 0.

The data provide evidence that SUPPORTS the statement that [one part of the proposition], and the evidence is strong (+3), moderate (+2), or mild (+1).
The data provide evidence that is AGAINST the statement that [one part of the proposition], and the evidence is strong (-3), moderate (-2), or mild (-1).
The data DO NOT provide any evidence about the statement that [one part of the proposition]: circle 0. (NOTE: Mark this option only if there was NO evidence in the data.)

Parts of Proposition (Indicators), each rated Strong +3, Moderate +2, Mild +1, Strong -3, Moderate -2, Mild -1, or No evidence 0:

1. There is evidence that school staff received training on the purpose and expected outcomes of providing Second Step in the schools.
2. There is evidence that school staff received training on the Second Step Curriculum.
3. There is evidence that the school staff received training on how to involve families in Second Step.
4. There is evidence that school staff received an adequate number of curriculum kits for appropriate implementation of Second Step.

Appendix F: (Continued)

School being rated: __________________________________ Rater: ____________

Proposition B: (Time) When implementing the program, sufficient time was allocated for school staff to review the program components with peers, as well as sufficient time to deliver the program to students.

INSTRUCTIONS: Rate the following parts of the proposition. Please circle your response. If the data support or are against the statement, rate the evidence as strong, moderate, or mild by circling +3, +2, +1, -3, -2, or -1. If the data provide no evidence about the statement, circle 0.

The data provide evidence that SUPPORTS the statement that [one part of the proposition], and the evidence is strong (+3), moderate (+2), or mild (+1).
The data provide evidence that is AGAINST the statement that [one part of the proposition], and the evidence is strong (-3), moderate (-2), or mild (-1).
The data DO NOT provide any evidence about the statement that [one part of the proposition]: circle 0. (NOTE: Mark this option only if there was NO evidence in the data.)

Parts of Proposition (Indicators), each rated Strong +3, Moderate +2, Mild +1, Strong -3, Moderate -2, Mild -1, or No evidence 0:

1. There is evidence that school staff received sufficient time to review the Second Step program.
2. There is evidence that school staff received shared time to work together for appropriate implementation of Second Step.
3. There is evidence that specific blocks of time were allocated for school staff to implement the program.
4. There is evidence that school staff received shared time to review successes and concerns about Second Step implementation and outcomes.

Appendix F: (Continued)

School being rated: __________________________________ Rater: ____________

Proposition C: (Implementation Level) If Second Step was implemented school-wide, there was more peer-to-peer support, more adherence to the program, and staff were more likely to attribute positive student outcomes to Second Step than when Second Step was implemented only in individual classrooms or grades.

INSTRUCTIONS: Rate the following parts of the proposition. Please circle your response. If the data support or are against the statement, rate the evidence as strong, moderate, or mild by circling +3, +2, +1, -3, -2, or -1. If the data provide no evidence about the statement, circle 0.

The data provide evidence that SUPPORTS the statement that [one part of the proposition], and the evidence is strong (+3), moderate (+2), or mild (+1).
The data provide evidence that is AGAINST the statement that [one part of the proposition], and the evidence is strong (-3), moderate (-2), or mild (-1).
The data DO NOT provide any evidence about the statement that [one part of the proposition]: circle 0. (NOTE: Mark this option only if there was NO evidence in the data.)

Parts of Proposition (Indicators), each rated Strong +3, Moderate +2, Mild +1, Strong -3, Moderate -2, Mild -1, or No evidence 0:

1. There is evidence that Second Step was delivered school-wide.
2. There is evidence that if Second Step was implemented school-wide, there were more specific time blocks allocated for peer-to-peer support.
3. There is evidence that staff who used the Second Step implementation tools were more likely to be at a school that implemented Second Step school-wide.
4. There is evidence that staff attributed positive outcomes to Second Step.

Appendix F: (Continued)

School being rated: __________________________________ Rater: ____________

Proposition D: (Champion) When a school had a designated Champion for Second Step, teachers and/or counselors were more likely to implement the program with higher levels of implementation than when there was no Champion present.

INSTRUCTIONS: Rate the following parts of the proposition. Please circle your response. If the data support or are against the statement, rate the evidence as strong, moderate, or mild by circling +3, +2, +1, -3, -2, or -1. If the data provide no evidence about the statement, circle 0.

The data provide evidence that SUPPORTS the statement that [one part of the proposition], and the evidence is strong (+3), moderate (+2), or mild (+1).
The data provide evidence that is AGAINST the statement that [one part of the proposition], and the evidence is strong (-3), moderate (-2), or mild (-1).
The data DO NOT provide any evidence about the statement that [one part of the proposition]: circle 0. (NOTE: Mark this option only if there was NO evidence in the data.)

Parts of Proposition (Indicators), each rated Strong +3, Moderate +2, Mild +1, Strong -3, Moderate -2, Mild -1, or No evidence 0:

1. There is evidence that there was a designated Champion.
2. There is evidence that the designated Champion articulated the Second Step program to the entire staff.
3. There is evidence that the Champion directly ensured the allocation of time and resources to support the Second Step program.
4. There is evidence that implementation of Second Step with higher levels of implementation was associated with the presence of a clear Champion.
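As a rough illustration of how completed rating forms might be tallied, the sketch below (in Python) averages one school's indicator ratings across raters using the scale above (+3/+2/+1 for support, -3/-2/-1 for evidence against, 0 for no evidence). The data layout, function name, and example scores are hypothetical and are not taken from the study itself.

# Minimal sketch (hypothetical layout): averaging Appendix F evidence ratings
# across trained raters for one school. Scale: +3/+2/+1 = strong/moderate/mild
# support; -3/-2/-1 = strong/moderate/mild evidence against; 0 = no evidence.

from statistics import mean

VALID_RATINGS = {-3, -2, -1, 0, 1, 2, 3}

def average_indicator_ratings(ratings_by_indicator):
    """Return the mean rating per indicator across raters."""
    averages = {}
    for indicator, scores in ratings_by_indicator.items():
        if any(score not in VALID_RATINGS for score in scores):
            raise ValueError(f"Out-of-scale rating for {indicator}: {scores}")
        averages[indicator] = mean(scores)
    return averages

# Hypothetical example: two raters scoring the four Proposition A indicators.
school_ratings = {
    "A1": [3, 2],   # training on purpose and expected outcomes
    "A2": [2, 2],   # training on the curriculum
    "A3": [0, -1],  # training on family involvement
    "A4": [1, 2],   # adequate number of curriculum kits
}
print(average_indicator_ratings(school_ratings))
# e.g. {'A1': 2.5, 'A2': 2, 'A3': -0.5, 'A4': 1.5}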


Appendix G: Document Review

[The document review forms are reproduced as images in the original.]

About the Author

Lynn Pedraza is a first-generation high school and college graduate. She received both her bachelor's and master's degrees in special education at the University of South Florida, and her education specialist certificate in Educational Leadership at the University of New Mexico. She has been an educator for 33 years. Ms. Pedraza is currently a district administrator, overseeing support services in a large urban school district. She received the U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration (SAMHSA) 2007 Administrator's Award for Advancing School Mental Health. She has been involved in numerous activities locally, statewide, and nationally to advance the work of education and mental health by supporting the needs of children and youth to reach their potential.