USF Libraries
USF Digital Collections

The effects of fluency training on performance, maintenance and generalization of parenting skills


Material Information

Title:
The effects of fluency training on performance, maintenance and generalization of parenting skills
Physical Description:
Book
Language:
English
Creator:
Williams, Gertie
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:

Subjects

Subjects / Keywords:
Parent training
Timed drills
Parenting tools
Frequency
Accuracy
Dissertations, Academic -- Applied Behavior Analysis -- Masters -- USF
Genre:
bibliography (marcgt)
theses (marcgt)
non-fiction (marcgt)

Notes

Abstract:
ABSTRACT: The effects of fluency training on performance, maintenance, and generalization of parent training skills were examined within the context of a classroom and home setting. Three foster parents attended a 24-hour Parenting Tools for Positive Behavior Change (PBC) course. Participants completed timed fluency drills using flash cards to increase learning and performance of PBC tools. A non-concurrent multiple baseline design across participants was used to assess participant performance on flash card drills and PBC tools during in-class, pre-test, and post-test role plays, and in novel situations with children in the home before, during and after the course. Results showed that fluency training had little or no effect on increasing tool performance across all testing phases for all participants, nor were there any changes in frequency and accuracy of fluency trained tools in the home to indicate maintenance and generalization of treatment effects.
Thesis:
Thesis (M.A.)--University of South Florida, 2005.
Bibliography:
Includes bibliographical references.
System Details:
System requirements: World Wide Web browser and PDF reader.
System Details:
Mode of access: World Wide Web.
Statement of Responsibility:
by Gertie Williams.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 67 pages.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001790578
oclc - 144729042
usfldc doi - E14-SFE0001510
usfldc handle - e14.1510
System ID:
SFS0025828:00001




Full Text

PAGE 1

The Effects of Fluency Training on Performance, Maintenance, and Generalization of Parenting Skills

by

Gertie Williams, BA, BCABA

A thesis submitted in partial fulfillment of the requirements for the degree of Master of Arts in Applied Behavior Analysis, College of Graduate Studies, University of South Florida

Major Professor: Jennifer Austin, Ph.D.
Trevor Stokes, Ph.D.
Michael Stoutimore, Ph.D.

Date of Approval: December 9, 2005

Keywords: parent training, timed drills, parenting tools, frequency, accuracy

Copyright 2006, Gertie Williams, BA, BCABA

PAGE 2

Acknowledgements

I would like to express my gratitude to Jennifer Austin, Ph.D., for her guidance and support throughout my thesis process and graduate career. I would also like to express my gratitude to Drs. Stokes and Stoutimore for their feedback and encouragement. Thank you to Dr. Timothy Volmer and the Directorship and staff of the Behavior Analysis Services Program for their support and the opportunity to complete this study. A heartfelt thank you also to my family and friends who provided their time and support during this process.

PAGE 3

Table of Contents

List of Tables ii
List of Figures iii
Abstract iv
Chapter One Introduction 1
Chapter Two Method 14
    Participants and Setting 14
    Institutional Review 15
    Dependent Variables and Measurement 15
    Observer Training 18
    Interobserver Agreement 18
    Procedure 20
        Baseline 20
        Fluency Training 20
    Experimental Design 22
    Social Validity 23
Chapter Three Results 24
Chapter Four Discussion 36
References 43
Appendices 47
    Appendix A: Sample Pre-test and Post-test Role-play Scripts 48
    Appendix B: Interaction Evaluation Questionnaire 52
    Appendix C: Sample Pre/Post Test, In-class Task Analyses and In-Home Task Analysis 54
    Appendix D: Participants' Data Sheet and Graph 62
    Appendix E: Sample Flash Cards 64
    Appendix F: Trainer Script for Fluency Drill 65
    Appendix G: Social Validity Questionnaire 67

PAGE 4

List of Tables

Table 1 Interobserver Agreement Scores 19
Table 2 Mean Percent Accuracy of Performance on Fluency Trained (FT) and Non-Fluency Trained (Non-FT) Tools During Classroom Instruction 30

PAGE 5

List of Figures

Figure 1. Rate of Correct and Incorrect Responses per Minute for Fluency Trained Tools 25
Figure 2. Pre-test, Classroom and Post-test Performance on Fluency and Non-Fluency Trained Tools 27
Figure 3. Frequency of In-Home Tool Performance 32
Figure 4. Accuracy of In-Home Tool Performance 34

PAGE 6

The Effects of Fluency Training on Performance, Maintenance and Generalization of Parenting Skills

Gertie Williams

ABSTRACT

The effects of fluency training on performance, maintenance, and generalization of parent training skills were examined within the context of a classroom and home setting. Three foster parents attended a 24-hour Parenting Tools for Positive Behavior Change (PBC) course. Participants completed timed fluency drills using flash cards to increase learning and performance of PBC tools. A non-concurrent multiple baseline design across participants was used to assess participant performance on flash card drills and PBC tools during in-class, pre-test, and post-test role plays, and in novel situations with children in the home before, during and after the course. Results showed that fluency training had little or no effect on increasing tool performance across all testing phases for all participants, nor were there any changes in frequency and accuracy of fluency trained tools in the home to indicate maintenance and generalization of treatment effects.

PAGE 7

Chapter One
Introduction

Behavior analysis has provided a wealth of empirically validated strategies for improving learning outcomes across a variety of populations. Many of these strategies were developed in basic research laboratories and were introduced or modified in applied settings (Binder & Watkins, 1990). Behavior analytic principles such as reinforcement schedules, behavioral shaping, stimulus fading, discrimination, and other research-based principles have been employed to address problems and challenges in education (Binder & Watkins). The implementation of these principles has led to the development of multiple research-based strategies to improve overall student performance with regard to academic and non-academic behavior, teacher instruction, and measurement systems in educational settings. Such strategies include the use of response cards (e.g., the use of a dry erase board on which students write and display answers to teacher-delivered questions; see Gardner, Heward, & Grossi, 1994), programmed instruction (e.g., where students respond to carefully designed questions and are provided with immediate feedback on performance, thus shaping delivery of future material; see Vargas & Vargas, 1991), and direct instruction (which places an emphasis on faultless communication using modeling and feedback; see Kinder & Carnine, 1991). These strategies have provided educators with multiple means of producing outcomes such as increased response rates, higher grades, and greater retention. Precision teaching (Lindsley, 1990) is another strategy that provides an instructor with the added benefit of frequent measurement of student performance and improved student response rates, in addition to increases in the retention, endurance, and application of learned skills.

PAGE 8

Developed by Ogden Lindsley in the 1960s, precision teaching (PT) is the method of measuring student performance on a frequent (e.g., daily) basis, and subsequently using the data obtained from this measurement to analyze performance and propose instructional and motivational strategies to mediate any failures to learn (West, Young, & Spooner, 1990). West et al. maintain that precision teaching is not so much a method of instruction as it is a precise and systematic method of evaluating instructional tactics and curricula (p. 5). Precision teaching originated in basic research laboratories and developed inductively from research in free operant conditioning (Lindsley, 1992). According to Lindsley, precision teaching involves the application of rate of response and standard cumulative recording strategies to assess learning in the classroom. This method of assessment allows instructors to base educational decisions on changes in continuous self-monitored performance frequencies displayed on standard celeration charts (p. 51). Lindsley suggests that cost-effective and time-efficient learning occurs when performance is counted and charted on a daily basis by learners. With the use of precision teaching, learning is maximized when high performance aims (i.e., pre-set fluency criteria for a skill) are established. This is incredibly important to both the instructor and learner in that the instructor, when provided with frequent feedback, can adjust instructional methods so that the learner is able to achieve learning objectives. This facilitates acquisition, retention, and mastery of instructional material.

There are seven basic elements that make up the framework of precision teaching (West et al., 1990). The first element is "the student knows best" (West et al., 1990, p. 5). This suggests that an instructor, by analyzing the student's behavior, can determine whether learning is occurring or whether the instruction being provided is appropriate and

PAGE 9

effective. A second important element of precision teaching is an emphasis on direct, continuous measurement and daily monitoring and analysis of behavior. Such continuous measurement of behavior allows the instructor to make timely changes in instruction if students are failing to meet dynamic aims (i.e., projected trends; Binder & Watkins, 1990). The third element involves measuring behavior using rate of response, defined as a behavioral event or products per unit of time (Lindsley, 1991, p. 254). According to Lindsley, this provides a more accurate indication of performance than the more traditional percentage-correct measure. Rate of response data are then plotted on a standard celeration chart. Celeration charting is the fourth element of PT; it facilitates easy visual analysis of data with regard to performance trends, and allows both the student and instructor to identify patterns in performance and make necessary changes if needed. A fifth element in the framework of PT is the use of descriptive and functional definitions to describe behavior and behavioral processes. Binder and Watkins (1990) further explain this element by stating that in precision teaching, operational or descriptive definitions of events (e.g., antecedent events and movement cycles) were distinguished from functional definitions (e.g., stimuli, consequences, or responses) of events. Thus, the authors explain, functional relations were determined between behavior and environmental variables. A stimulus was only considered a consequence when it caused a change in the rate of the behavior that preceded it. The sixth element emphasizes continued monitoring and analysis of instructional strategies, with a specific focus on the impact of those strategies on student learning. Continued monitoring facilitates immediate feedback and allows timely changes to the curriculum, thus

PAGE 10

increasing learning. Finally, the seventh element focuses on increasing appropriate behavior instead of placing an emphasis on decreasing or eliminating inappropriate behavior (i.e., errors). According to Binder and Watkins, research has shown that encouraging rapid response rates facilitates greater learning, even when error rates remain relatively high. Although errors are usually corrected, decreasing errors is not the main focus of instruction.

One particularly attractive feature of PT is that it can be easily combined with other curricular approaches (e.g., Direct Instruction, personalized system of instruction, or programmed instruction). For example, Morningside Academy offers a program that teaches basic skills (reading, writing and math) to adult learners. The program incorporates the use of PT with other instructional strategies such as direct instruction and the Tiemann-Markle instructional design (Lindsley, 1992). Performance measures indicate that students who complete this program exhibit performance levels at or above the national eighth-grade literacy standard, and achieve gains that exceed government standards. These outcomes provide support for the utility of PT within an existing curriculum and the efficacy of PT in assessing the appropriateness of curricula. Unfortunately, data on maintenance or generalization of learner skills/performance were not presented. Such data would demonstrate the durability of skills and provide an indication of skill maintenance, two important components in any behavior change procedure.

Instructors that use precision teaching have not only demonstrated increased performance in learners, but have also provided a means by which curriculum choices for learners can be adopted or improved. Lindsley (1990) describes a study in which a

PAGE 11

second grade teacher was able to implement curriculum changes to improve the performance of the children in the classroom. Children in the class wrote answers to simple addition and subtraction problems on a PT worksheet during 1-minute timings. Although the children demonstrated high rates of correct responses, their error learning was poor (i.e., errors occurred at a low rate below the record floor). However, when the curriculum was adjusted and children were required to write answers to a combination of addition, subtraction, and multiplication problems with no prior instruction, both high rates of correct and error learning were observed. The latter task was an example of a curriculum leap up (i.e., an advancement in curriculum to a new learning objective). Lindsley concluded that the leap up in curriculum made the mathematics problems more difficult, but facilitated increased and efficient student learning and made the learning experience for the children more enjoyable. In other words, increases in errors were desirable because they provided children with additional opportunities to respond and learn. The data obtained from this study also showed that PT can be used in the acquisition of skills (e.g., multiplication) not previously taught. Despite promising outcomes in student learning, data on student retention and generalization of skills to other problems (e.g., division problems) were not discussed.

In the same article, Lindsley (1990) discussed another classroom implementation of PT that demonstrated ways in which the most effective curriculum choices for learners could be identified. In this study, elementary school children who were one to four grade levels below their required reading levels were tutored by inner city high school students. Reading materials included multiple vocabulary lists, the local newspaper, various graded readers, and pupil-written stories. Each child received tutoring from the same tutor for 45

PAGE 12

minutes, five days a week. Over a two-week period, each child read from three different materials during three different daily one-minute timings. If the child showed the steepest learning (exhibited by the slope on the standard chart) with one curriculum method (e.g., the newspaper) in comparison to the other two methods (e.g., the graded reader, pupil stories), the former curriculum material would be kept and compared with two other curricula for the following two-week period. The two curricula that resulted in poor learning would then be discarded. Thus, instructors simultaneously explored, identified, and implemented curriculum choices that resulted in increased learning for the children. Unfortunately, this study also failed to discuss student retention of learned material, response rates, or generalization of learned material.

Although precision teaching is largely regarded as a strategy for assessing the appropriateness of an educational program in producing satisfactory learning outcomes, strategies for teaching have evolved as a natural outgrowth of the basic assumptions of precision teaching. One such strategy is fluency training. Binder (1996) defines fluency as competent performance that is characterized by a combination of accuracy plus speed. It is the rate of accurate performance and is typically measured as the number of correct and incorrect responses per minute (Bucklin, Dickinson, & Brethower, 2000). According to Binder (1988), fluency is the true indicator of mastery. If an individual is fluent in a skill, they will be able to retain the learned skill, apply it to previously untrained skills or situations, and perform the skill easily in the presence of distraction. Precision teaching promotes fluency, because the PT framework facilitates the provision of multiple opportunities to respond within a specified time period (West, Young, & Spooner, 1990). There are multiple ways in which fluency may be achieved within the

PAGE 13

PT paradigm. These methods maximize use of learning channels, which are the primary means of input and output of information. Examples of learning channels include hear/say (i.e., learners hear a question or prompt, and say the answer) and see/write (i.e., learners see the question or prompt and mark their responses) (Binder, 2001). Fluency may be built into a curriculum by infusing frequent, short (e.g., 1-minute) timed drills within instruction using varying activities that incorporate the learning channels. Some of these activities may include the use of flash cards, rapid recall exercises in which learners rapidly blurt out information recalled, practice sheets, or choral responding (i.e., groups responding to questions in unison) (Binder, 1993, 2001).

Fluency is characterized by multiple benefits that are summarized by the acronym REAPS; that is, retention, endurance, application and performance standards (Binder, 1996). Retention has occurred when the learner exhibits a high rate of accurate performance post training, after the trained skills have not been practiced over a long period of time. Maintenance has been defined as "the extent to which the learner continues to perform the target behavior after a portion or all of the intervention has been terminated" (Cooper, Heron & Heward, 1987, p. 558). Essentially, retention and maintenance are similar concepts. Both emphasize continued performance of an acquired behavior after an intervention (in this case, fluency practice) has been removed. Application is defined as the learner's ability to transfer skills trained to a new and more complex task. That is, the learner is able to combine component skills and use them to perform more complex (composite) tasks that were not trained. For example, a child is fluent in solving single-digit multiplication problems, and is then able to solve more complex multiplication problems using the component skills previously learned.

PAGE 14

Baer, Wolf, and Risley (1968) define generality as a behavior change that spreads to a variety of related behaviors, or appears in a variety of environments. It is important to note that this spread of behavior change may not have been directly trained or reinforced in the novel environments or with the novel behaviors. Here again, the terms application and generalization are similar, and imply similar outcomes for learning.

Bucklin, Dickinson, and Brethower (2000) conducted a study that compared rates of retention and application of learned skills in participants who had been trained to perform a task fluently with participants who had been trained to perform a task accurately. A total of 29 students were required to learn 10 arbitrary associations between three-letter nonsense syllables and Arabic numerals, and Hebrew symbols and three-letter nonsense syllables (i.e., component skills). Participants were randomly assigned to one of two treatment groups (i.e., accuracy or fluency). All participants studied flash cards with Hebrew symbols on one side and the corresponding nonsense syllable on the other side, as well as flash cards with Arabic numerals on one side and the corresponding nonsense syllable on the other side. Participants completed the accuracy phase of the training when they could write the corresponding nonsense syllable when presented with a specific Hebrew symbol, or the corresponding Arabic numeral when presented with a nonsense syllable, with 100% accuracy for four consecutive untimed trials. This was the only training received by the accuracy group.

After achieving accuracy, the fluency group was provided with training via the use of "See Hebrew Symbol-Write Nonsense Syllable" and "See Nonsense Syllable-Write Arabic Numeral" worksheets (Bucklin et al., 2000, p. 153). Training sessions included five one-minute timings, during which participants completed two types of work

PAGE 15

sheets. Worksheets were randomly presented. Participants were considered fluent when they achieved 100% accuracy for five consecutive timings, with 50 correct responses per minute on Hebrew symbol/nonsense syllable recall exercises and 100 correct responses per minute on nonsense syllable/Arabic numeral worksheets.

After meeting fluency or accuracy criteria, experimenters tested participant application of concepts by asking participants to complete a worksheet of Hebrew symbols written as arithmetic problems and asking them to write answers to the problems in Arabic numerals (i.e., composite skills) during a one-minute timing. Results from the composite test of performance indicated that the fluency group obtained a greater number of correct responses per minute when compared to the accuracy group. Tests of retention of component and composite skills conducted across 16 weeks showed that the fluency group demonstrated greater accuracy and fluency of the above-mentioned skills when compared with the accuracy group. The results of this study provided empirical evidence that fluency training resulted in better application of composite skills and increased retention of component and composite skills when compared to accuracy training alone.

The third part of REAPS is endurance. Endurance is the ability to perform a learned task over an extended time period without exhaustion, in the presence of distraction (Binder, 1996). Binder, Haughton, and Van Eyk (1990) describe endurance as being the same as attention span. According to the authors, if a learner is not fluent in a task, they will be unable to sustain attention to the task for a long period of time, will emit increased error rates, and will show decreased learning rates. Binder et al. (1990) examined the effects of fluency on endurance in a study with 75 students who ranged in grade levels from kindergarten to eighth grade. Students were

PAGE 16

required to practice writing the digits 0 through 9 as fast as they could for various timings (i.e., 15 seconds, 30 seconds, 1 minute, 2 minutes, 4 minutes, 8 minutes or 16 minutes) on different days. Results indicated that students who could write digits at a rate of 70 per minute sustained performance levels during the 16-minute timing. However, students who performed below this rate were unable to sustain performance levels during the 16-minute timing. The results of this study showed that task fluency must be attained if endurance is to be attained. However, the authors did not discuss retention or application of digit writing. This is especially important in younger children, with whom the use of such skills is integral to the development of math ability.

The fourth component of REAPS is performance standards, which refers to the rate and accuracy of performance, or aims, that must be attained for fluency training to be beneficial. Ivarie (1986) investigated the effects of varying proficiency rates on retention of writing behavior. The participants in this study were 120 fourth grade students who had taken the math computation section of the Iowa Test of Basic Skills. Students were then grouped according to their achievement levels on the test. There were three groups: students who exhibited average performance, students who exhibited above average performance, and students who exhibited below average performance. Students in all groups were required to learn and recall relationships between Arabic and Roman numerals and subsequently were exposed to one of two treatment conditions with specified performance aims. One treatment condition required that the students acquire and maintain a proficiency rate of 70 responses per minute with 7 or fewer errors across three consecutive 1-minute timings. The second treatment condition required that students acquire and maintain a proficiency rate of 35 responses per minute with four or

PAGE 17

fewer errors for three consecutive 1-minute timings. Students practiced and were tested on recall and writing of material until they reached the aims specified for their treatment group. After the aim was met, students were provided with an alternative activity so that they did not exceed proficiency rates. Students who did not reach the proficiency rates continued to engage in alternating practice and testing sessions.

Results of the study showed that subjects performing at the higher proficiency rate (70 responses per minute) demonstrated superior recall performance over a 90-day period in comparison to students who performed at the lower proficiency rate (35 responses per minute). The author concluded that even students at the average and below average achievement levels were able to develop high proficiency rates that would facilitate retention. Ivarie (1986) also stated that there was an interaction effect between treatment and achievement levels, meaning that the higher proficiency rate could be attributed to the performance of the average and below average groups. Therefore, she concluded that students who were classified as average and below average in achievement may need to perform a skill at a high proficiency rate to facilitate retention of the skill. The author also noted that as testing continued across the nine-month period, there was some increase in retention for both groups, with the lower proficiency group showing better retention than the high proficiency group. Ivarie (1986) stated that a possible explanation for this change may be attributed to the practice effects of repeated testing, or that a proficiency rate of 35 responses per minute may be easier to maintain. This study did not examine application or endurance of skills learned.

Research on the efficacy of fluency training has been conducted primarily with children or other students in academic settings (Ivarie, 1986; Lindsley, 1990, 1992;

PAGE 18

Spangler & Hawkins, 1975), with a few exceptions (e.g., Binder, 1989, 1996, 2001). Although these studies have demonstrated the efficacy of fluency training within a PT paradigm at improving performance, facilitating frequent measurement and feedback on student performance, and achieving REAPS, there is a relative paucity of stringent empirical investigations (Binder, 1996), especially with regard to expanding fluency training and precision teaching strategies to a variety of populations and settings. One area in which these strategies might prove particularly useful is parent training, where retention of skills and application to real-world situations are critical measures of success.

Parent training curricula employ multiple instructional strategies delivered within classroom settings, such as lecture, discussion, or reading instructional materials, to provide caregivers with behavior management skills (Huang, Chao, Tu, & Yang, 2003; Smith & Barrett, 2002; Venning, Blampied, & France, 2003). It is imperative that skills learned within such training programs maintain and generalize to facilitate durable behavior change. However, much of the parent training literature does not investigate the extent to which parent training delivered within the classroom facilitates maintenance and/or generalization of skills (Muir & Milan, 1982; Kuhn, Lerman, & Vorndran, 2003; Smith & Barrett, 2002; Venning et al., 2003). Considering the benefits of fluency training in facilitating maintenance and generalization of learned skills taught in classroom settings, including such an instructional strategy in classroom-based parent training curricula seems an appropriate and potentially beneficial course of action.

One subset of caregivers with whom fluency training might be particularly beneficial is foster parents. Foster parents are charged with the task of caring for and managing the behavior of dependent children who are placed in their care. These children

PAGE 19

may exhibit a myriad of behavior problems that are the result of varying contingencies within their environment. In these settings, achieving REAPS with regard to parenting skills appears to be crucial. Foster parents must use effective parenting skills proficiently in the presence of distraction, and be able to retain and generalize skills learned to evoke durable behavior change in children.

Considering the benefits of fluency training, it would be prudent to implement such an instructional method within the context of a foster parent training program to facilitate the acquisition of high performance standards, maintenance of performance (i.e., retention) and generalization (i.e., application) of skills from the classroom to the home. This is important not only for the clinical value in providing effective behavioral interventions for children and their caregivers, but also for the potential to make a significant empirical contribution to the science. Therefore, the current study proposes to investigate the effect of fluency training on foster caregiver performance of specific parenting tool skills in a classroom setting and the maintenance and generalization of those skills to novel settings with children in the home.

PAGE 20

Chapter Two
Method

Participants and Setting

Participants were three caregivers, Esther, Danna and Kenneth, who enrolled in the Parenting Tools for Positive Behavior Change (PBC) course sponsored by the Behavior Analysis Services Program (BASP). All participants were licensed foster parents and had a minimum of one child (foster or biological) residing in their home for the duration of the study. Esther was a 55-year-old female who had been a licensed foster parent for approximately ten years. At the time of this study, Esther had one foster child in her home. Danna was a 42-year-old female, and Kenneth was a 39-year-old male. Danna and Kenneth were married and had two biological children and one foster child residing in their home for the duration of the study. Danna and Kenneth had both been foster parents for 6 months prior to participation in the study. Esther participated in Class I while Danna and Kenneth participated in Class II. Esther and Danna completed all pre-test, classroom, and post-test role-plays and a minimum of four in-home observations. However, Kenneth completed only pre-test and classroom testing, and continued in-home observations for only one week after class completion.

The PBC course was taught by the principal investigator of this study. The principal investigator was a board certified associate behavior analyst who had been teaching the PBC course for 4 years. The course instructor was competency trained in the parenting tools (i.e., he/she had independently demonstrated all nine tools with 100% accuracy twice). The course provided caregivers with instruction on nine task-analyzed tools based on principles of applied behavior analysis. These tools provided caregivers

PAGE 21

with specific strategies for managing challenging behavior. The course was taught in a classroom setting for the duration of four weeks. Two sessions were held weekly. The duration of each session was 3 hours. Data were collected both in the classroom and in the homes of the participants.

Institutional Review

A copy of all experimental procedures was submitted to the University of South Florida, the University of Florida, and the Department of Health's Institutional Review Boards for review and approval before the start of the study. Participants reviewed and signed informed consent forms prior to the start of the study.

Dependent Variables and Measurement

The primary dependent variable (DV) was the accuracy with which participants performed specific parenting tools taught during the course. Data on the following PBC tools were collected: Stay close; Give positive consequences; Ignore junk behavior; Stop-redirect-give positive consequences; Pivot; Set expectations; Use a contract; Time out; and Analyze behavior using ABCs. Each DV was defined using the description included in the parenting course materials. DVs were task analyzed and converted to checklists so that both frequency and accuracy scores could be calculated (see below).

First, data were collected in the classroom during pre-test and post-test role-play situations, which were conducted during session one and session eight of the course, respectively. Data also were collected on the performance of tools during classroom role-plays during sessions two through seven. During the Stay close, Ignore junk behavior, Stop-redirect-give positive consequences, and Set expectations situations (Appendix A), a primary trainer assumed the role of a child while a secondary trainer presented the

PAGE 22

caregiver with varying situations, and collected data on caregiver performance on each tool. During the Give positive consequences, Pivot, and Time out role-plays, the secondary trainer presenting the situation assumed the role of a second child and the caregiver was again asked to respond to each situation. Caregivers were not told which tool should be performed for each role-play.

For testing on the Contract tool, caregivers read a passage and composed written responses to questions related to formulating a contract. To measure the ABC tool, caregivers observed and recorded interactions between two behavior analysts performing a role-play. Caregiver responses were then scored using the ABC task analysis. As part of the required course procedures, participants also completed a written pre-test and post-test as prescribed by the PBC curriculum, either before or after completing the role-play pre-test or post-test. No feedback was provided on caregiver performance on the written tests or the pre- and post-test role-plays.

Data also were collected in the home during naturally occurring situations. Before the start of the course, class participants were contacted via telephone to schedule a home visit, as is typically done with caregivers receiving services from the BASP program. A home visit was conducted before data collection to identify potential study participants and allow participants to review and sign informed consent forms. An interaction evaluation questionnaire (see Appendix B) was administered to determine times of day or types of interactions that evoked numerous responses from the child and might necessitate the use of PBC tools. A minimum of three home visits were conducted on various days during the time periods identified in the questionnaire. Home visits were 1-2 hours long. Home observations were conducted in 15-minute intervals, with a two-

PAGE 23

minute break between each interval. These observations provided baseline data on tool performance. Home observations occurred on a weekly basis during the course of the PBC class. After class completion, weekly home visits were conducted during time periods identified during pre-class visits for a minimum of four weeks.

Data for both role-plays and home observations were recorded on task-analyzed tool checklists (see Appendix C), which allowed for discerning both frequency and accuracy of tool use. Performance was reported as percentage accuracy and was calculated by dividing the total number of steps performed accurately by the total number of steps in each tool and multiplying that number by 100. If, during a role-play or in-vivo situation, a caregiver did not have an opportunity to perform all tool steps, accuracy was calculated by dividing the number of steps performed correctly by the number of steps the caregiver had an opportunity to perform. That quotient was then multiplied by 100. The frequency of use of each tool per session was summed.

Participant performance on fluency drills was also measured. Approximately four fluency drills were conducted during each class session. Minimum time intervals between each drill were 15 minutes. Participants recorded the number of correct, incorrect and skipped/passed responses on a data sheet (see Appendix D). Experimenters converted participant scores to rate of response, which were plotted on an equal interval graph. Rate of correct responses per minute was calculated by dividing the number of responses by 60. Rate of errors (i.e., incorrect or passed items) also was calculated with the same formula used to calculate rate of correct responses.
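To make the scoring arithmetic concrete, the following is a minimal sketch in Python of the accuracy and rate calculations described above. It is not part of the original study materials; the function names, variable names, and example numbers are illustrative only, and the per-minute rate simply follows the formula stated in the text (responses divided by 60).

    def percent_accuracy(steps_correct, steps_with_opportunity):
        # Accuracy = (steps performed accurately / steps the caregiver
        # had an opportunity to perform) x 100.
        if steps_with_opportunity == 0:
            return None  # the tool was never attempted in this observation
        return (steps_correct / steps_with_opportunity) * 100

    def rate_per_minute(response_count):
        # Rate of correct (or error) responses per minute, computed as the
        # number of responses divided by 60, per the formula stated above.
        return response_count / 60

    # Illustrative values: 7 of 9 step opportunities performed correctly -> 77.8%;
    # 29 correct flash cards in a timed drill -> 0.48 correct responses per minute.
    print(round(percent_accuracy(7, 9), 1))   # 77.8
    print(round(rate_per_minute(29), 2))      # 0.48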

PAGE 24

Observer Training

Prior to the start of the study, data collectors were trained to collect data on tool performance by observing various videotaped role-play situations exhibiting both correct and incorrect examples of tool use. Data collectors were behavior analysts and behavior assistants who were trainers for the Parenting Tools for Positive Behavior Change course. Role-play situations were scored simultaneously but independently by data collectors during the training sessions. At the end of each vignette, scorers compared their data with that of the researcher, who discussed scored sheets and provided feedback. Training continued until data collectors obtained 90% interobserver agreement (IOA) scores with the researcher across two consecutive practice observations. Approximately three training sessions were required for observers to meet criterion.

Training for fluency drill scoring was conducted during training sessions for tool scoring and during initial fluency drills during the PBC course. Data collectors were verbally instructed on how to record data collected during fluency drills. Data collectors and the researcher recorded data on Esther's fluency drill performance on the Stay close tool for four consecutive trials. Both the researcher and data collector were considered trained after having obtained 100% IOA scores on all four consecutive trials.

Interobserver Agreement

Table 1 displays IOA scores recorded during the study. Classroom role-plays for pre-tests and post-tests were videotaped and interobserver agreement was calculated for 100% of observations. Approximately 45% of home visits were also scored by a second independent observer. Interobserver agreement scores were calculated by dividing the

PAGE 25

number of agreements on each step of a task analysis by the sum of agreements and disagreements, and multiplying that number by 100.

Overall IOA obtained for accuracy data collected during pre-test, post-test and classroom role-plays ranged from 76% to 95% with a mean of 85%. Interobserver agreement scores for frequency of tool use ranged from 94% to 96% with a mean of 95%. The range of accuracy for tool use in the home was 74% to 84% with a mean of 81%.

Interobserver agreement scores for fluency drills were also collected. An observer alternated between participants during each fluency drill and simultaneously but independently recorded responses (correct, incorrect and passed) during each task. Approximately 63% of all data recorded during fluency drills were scored for IOA. Agreement scores were calculated for the number of correct, incorrect and passed flash cards by calculating the sum of each type of response recorded by the participant and the IOA data collector. For each response type, the smaller sum was divided by the larger sum, and that quotient was multiplied by 100 (both agreement calculations are illustrated in the sketch following Table 1). The range of IOA scores for fluency drills was 98% to 99%. Mean observer agreement scores for fluency drills for all three participants were 99%.

Table 1
Interobserver Agreement Scores

Observations                                         Interobserver Agreement Score
Percent accuracy (pre-test, post-test, classroom)    85%
Frequency of tool use (home)                         95%
Accuracy of tool use (home)                          81%
Fluency drill                                        99%
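The sketch below, assuming per-step agreement records and simple per-type response counts (the helper names are hypothetical, not taken from the study materials), illustrates the two agreement formulas described above: per-step IOA for task-analysis checklists and smaller-sum/larger-sum IOA for fluency drill counts.

    def step_ioa(observer_a_steps, observer_b_steps):
        # Task-analysis IOA: agreements / (agreements + disagreements) x 100,
        # computed step by step across the two observers' checklists.
        agreements = sum(a == b for a, b in zip(observer_a_steps, observer_b_steps))
        return agreements / len(observer_a_steps) * 100

    def count_ioa(participant_count, observer_count):
        # Fluency drill IOA for one response type (correct, incorrect, or passed):
        # smaller sum divided by larger sum, multiplied by 100.
        if participant_count == observer_count == 0:
            return 100.0  # neither recorder tallied this response type
        smaller, larger = sorted([participant_count, observer_count])
        return smaller / larger * 100

    # Illustrative values: agreement on 17 of 20 checklist steps -> 85% IOA;
    # participant tallies 24 correct cards, observer tallies 25 -> 96% IOA.
    print(step_ioa(["c"] * 17 + ["i"] * 3, ["c"] * 20))  # 85.0
    print(count_ioa(24, 25))                             # 96.0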

PAGE 26

Procedure

Class sessions were comprised of a mixture of lecture, classroom discussion, fluency drills, and various role-play situations outlined in the PBC training curriculum (a copy of the curriculum can be provided upon request). To control for practice effects, role-play situations presented in class did not include those used in the pre-test and post-test. However, feedback on role-play performance during class was provided by the trainer. To increase consistency of instruction during each session, the primary trainer proceeded with course content and activities as outlined in the PBC trainer's script.

Baseline. During baseline, class content was delivered via instructions from the PBC trainer's manual.

Fluency training. During fluency training, participants were asked to engage in fluency drills comprised of see/say learning channels presented via flash cards (see Appendix E). The content of the flash cards included the following: task-analyzed steps of target tools; child behaviors or situations in multiple settings that required use of specified tools; and identification and practice in stating core concepts presented in the curriculum (e.g., classifying behavior as consequential or inconsequential, identifying open-ended questions and empathy statements). Participants worked in pairs to complete the drills. Before the start of each timed drill, participants were allotted two minutes to study their flash cards. The trainer signaled the beginning and end of timed practice and data collection using a script (Appendix F). Thus, participants had clear prompts that signaled a change in activity. Once participants were cued by the trainer to start the drill, one member of the pair held a flash card with the question facing his/her partner and the answer facing him/her. The second participant answered each question, and the first

PAGE 27

participant stacked the flash cards into three separate piles: correct, incorrect, and passed (i.e., answers not attempted). The first participant recorded the number of correct, incorrect, and passed flash cards on the data sheet. Participants then reversed roles and completed the exercise again. After all participants completed the drill, participants were allotted two minutes to review the cards that they got correct, incorrect or passed.

Performance aims for all fluency drills were established by calculating the range of performance (rate of response) on drills of three competency trained behavior analysts who taught the PBC course. Each behavior analyst was required to have taught a minimum of three PBC courses. Trainers were asked to complete two types of timed fluency drills (practice and experimental). Fluency drills were conducted in the manner outlined in the experimental procedures for participants. The practice drill was conducted once to familiarize trainers with procedures. The experimental drills were conducted twice. The mean range of performance exhibited by each trainer during the experimental drills was recorded. Performance aims for participant fluency drills were set at the following median response rates exhibited by the trainers: Stay close, 0.48; Give positive consequences, 0.46; Ignore junk, 0.48; Pivot, 0.55; Stop-redirect-give positive consequences, 0.5; Set expectations, 0.54; Contract, 0.5; Time out, 0.49; and ABC, 0.52 correct responses per minute.

To ensure compliance with fluency drill procedures, participants were introduced to the fluency drill and methods of data collection during the first session of the course. Timed one-minute practice drills were conducted to familiarize participants with the process of collecting required materials for drills, conducting drills, and collecting and recording data. The content of the practice drills presented a procedure for a simple,

PAGE 28

arbitrary task (i.e., making a peanut butter and jelly sandwich) using the same script used during experimental sessions. The trainer explained and then modeled (a minimum of two times) a practice fluency drill and the data recording process. Flash cards were presented in a specified order. A transparency of the correct responses and a data sheet were posted. Each caregiver classified cards as correct, incorrect or passed, and recorded the data obtained simultaneously with the trainers during the second modeled drill. After modeling, participants performed the practice drills in pairs for a minimum of 10 minutes. Performance aims for training drills were set at the minimum response rate exhibited by trainers (0.33 responses per minute). Caregivers recorded the number of correct, incorrect and passed flash cards on the data sheet provided. Practice continued for approximately 15 minutes and ended when participants recorded scores with 90% accuracy.

To maintain participants' motivation to improve their performance across fluency drills, any participant who improved his or her score by five correct responses over the previous score placed his/her name in a class drawing. At the end of session eight, the drawing was held and the winning participant was given a gift.

Experimental Design

A non-concurrent multiple baseline across participants graph was used to display caregiver performance on fluency drills and the frequency and accuracy of tool performance. Fluency trained tools were counterbalanced between Class I and Class II to identify whether any changes in performance of varying fluency trained tools were the result of fluency training only and not instruction. Esther completed fluency drills on the Give positive consequences, Stay close, Set expectations, Pivot and Use ABC tools. Danna and

PAGE 29

Kenneth completed fluency drills on the Ignore junk, Stop-redirect-give positive consequences, Time out and Contract tools. Esther completed four weeks of one PBC course (Class I), while Danna and Kenneth completed a second four-week course (Class II) beginning four weeks after the first class started.

Social Validity

At the completion of the classroom training, participants were administered a questionnaire with Likert-type scaling to indicate their satisfaction with the training methods used, the utility of and their level of enjoyment or dissatisfaction with fluency drills, whether the training methods used were durable, and the extent to which the methods facilitated tool application within the home (see Appendix G). The questionnaire was administered again at the completion of the four-week in-home follow-up. Information obtained was used to formulate conclusions about caregiver opinion on the utility and efficacy of the training methods used.

PAGE 30

Chapter Three
Results

Figure 1 displays participants' performance on fluency drills. The first panel shows Esther's rate of correct and incorrect responses on fluency drills. As previously indicated, performance aims for rate of correct responses for fluency trained tools in Class I were: Stay close, 0.48; Give positive consequences, 0.46; Pivot, 0.55; Set expectations, 0.54; and Analyze behavior using ABCs, 0.52. Esther exhibited both low rates of correct (mean, 0.106; range 0.07-0.16) and incorrect responses (mean, 0.01; range 0.0-0.03) on all fluency-trained tools. Rate of correct responses for the Stay close tool ranged from 0.05 to 0.1 responses per minute with a mean performance rate of 0.08. Esther's mean rate of correct performance on the Give positive consequences tool was 0.11 (range 0.06-0.16), mean performance on the Pivot tool was 0.16 (range 0.1-0.2), mean performance for the Set expectations tool was 0.09 (range 0.08-0.11), and mean performance for the ABC tool was 0.07 (range 0.05-0.1).

The second and third panels show Danna's and Kenneth's performance on fluency-trained tools. Performance aims for rate of correct responses for fluency trained tools in Class II were: Ignore junk, 0.48; Stop-redirect-give positive consequences, 0.5; Time out, 0.49; Contract, 0.5. Danna exhibited rates of correct responses (mean, 0.37; range 0.32-0.47) just below aims, with low rates of incorrect responses (mean, 0.02; range 0-0.04). Danna's mean rates of correct responses on fluency trained tools were: Ignore junk (mean, 0.34; range 0.18-0.43), Stop-redirect-give positive consequences (mean, 0.32; range 0.18-0.38), Time out (mean, 0.34; range 0.21-0.48) and Contract (mean, 0.47; range 0.35-0.53).

PAGE 31

Figure 1. Mean rate of correct and incorrect responses per minute for fluency trained tools.

PAGE 32

Kenneth's performance is displayed on the third panel in Figure 1. Kenneth also exhibited high rates of correct responses (mean, 0.4; range 0.34-0.5) just below aims, and met the fluency aim for the Contract tool (0.5). His mean rates of correct responses on fluency-trained tools were: Ignore junk (mean, 0.34; range 0.21-0.45) and Stop-redirect-give positive consequences (mean, 0.35; range 0.2-0.45). Kenneth met performance aims for the Time out tool (mean, 0.42; range 0.28-0.6) and Contract tool (mean, 0.5; range 0.45-0.53). His mean rate of incorrect responses was near zero for the Ignore junk (mean, 0.004; range 0-0.01), Stop-redirect-give positive consequences (mean, 0.004; range 0-0.01) and Time out tools (mean, 0.004; range 0-0.01), and zero for the Contract tool.

Figure 2 displays participants' accuracy of performance on pre-test, classroom and post-test role-plays. The top panel displays Esther's performance. Esther was enrolled in Class I, thus she was fluency trained in the Stay close, Give positive consequences, Pivot, Set expectations and Assess behavior using ABC tools. There was an increase in tool accuracy from pre-test to classroom role-plays for both fluency trained and non-fluency trained tools. Post-test scores for the Stay close and ABC tools exceeded classroom role-play scores at 78% and 100% respectively. Post-test scores for the Pivot and Set expectations tools returned to baseline scores of 0% and 42% respectively. Accuracy scores for the Give positive consequences tool in both classroom and post-test role-plays remained stable at 89%.

Non-fluency trained tools during Class I (Esther's class) included the following: Ignore junk, Time out, Contract and Stop-redirect-give positive consequences. There was a decrease in accuracy from classroom role-plays to post-test role-plays on the Ignore

PAGE 33

Figure 2. Pre-test, classroom and post-test performance on fluency and non-fluency trained tools.

PAGE 34

junk (83% to 50%) and Stop-redirect-give positive consequences tools (78% to 11%). Percent accuracy on the Contract tool remained stable at 71% across classroom and post-test role-plays, while performance on the Time out tool increased from 38% to 50% from classroom to post-test. Despite much variability in the data across all tools, mean post-test performance on fluency trained tools (62%) was greater than mean post-test performance on non-fluency trained tools (46%). These results indicate that fluency training may have been effective in producing better learning.

The second panel in Figure 2 displays Danna's performance on pre-test, classroom and post-test role-plays. Danna was enrolled in Class II and therefore received fluency training in the Ignore junk, Time out, Contract, and Stop-redirect-give positive consequences tools. A good deal of variability also was observed across both fluency trained and non-fluency trained tools, but in general, accuracy on all tools increased from pre-test to classroom role-plays. However, in this case, fluency training did not seem to be effective in producing better learning. Performance on both the Stop-redirect-give positive consequences and Time out tools increased from pre-test to post-test, with scores of 89% and 67% respectively. Accuracy scores on the Ignore junk tool across classroom and post-test remained stable at 100%, while scores on the Contract tool decreased from 86% to 57% from classroom to post-test role-plays.

Tools that were not fluency trained in Class II included Stay close, Give positive consequences, Pivot, Set expectations and Assess behavior using ABCs. Performance on the Give positive consequences and Pivot tools remained stable across classroom and post-test role-plays at scores of 89% and 100% respectively. Performance on the Stay close tool returned to baseline levels of 70% at post-test. However, performance on the

PAGE 35

ABC tool increased across the phases of testing from 83% during class to 100% during post-test. Post-test performance on the Set expectations tool decreased below baseline (pre-test) levels to 42%.

The third panel in Figure 2 shows the accuracy percentages for the second participant in Class II, Kenneth. Kenneth completed the pre-test and classroom role-plays, but did not complete post-test role-plays. There were considerable increases in performance across both fluency trained and non-fluency trained tools from pre-test to classroom role-plays, except for the ABC tool. Kenneth received classroom scores of 100%, 67%, 100%, and 78% for the Ignore junk, Time out, Contract, and Stop-redirect-give positive consequences tools respectively. Pre-test scores for these tools were 0%, 6%, 0% and 22%. Kenneth scored 100% accuracy in class for non-fluency trained tools such as Stay close, Give positive consequences and Pivot. His classroom scores for the Set expectations and ABC tools were 77% and 0% respectively.

Table 2 shows participants' mean performance on pre-tests, classroom role-plays and post-tests for fluency trained and non-fluency trained tools. Mean post-test performance on fluency-trained tools for Esther and Danna was 62% and 78% respectively. Mean post-test performance on non-fluency trained tools for Esther and Danna was 46% and 80% respectively. Kenneth attained mean classroom performance scores of 86% and 75% on fluency and non-fluency trained tools respectively.

Figure 3 displays the participants' frequency of tool use in the home. Fluency trained tools are represented by closed data points, while non-fluency trained tools are represented by open data points, x's and plus signs. The top panel shows Esther's data. Esther used the Stay close tool most consistently across phases. However, there was a

PAGE 36

downward trend across phases in the frequency of tool use, especially between class-time (mean, 5.75; range 5 to 8) and post-class observations (mean, 3.57; range 1 to 6). All other tools were used infrequently (Stop-redirect-give positive consequences, Give positive consequences and Ignore junk) or not at all (Pivot, Set expectations, ABC, Contract and Time out).

Table 2
Mean Percent Accuracy of Performance on Fluency Trained (FT) and Non-Fluency Trained (Non-FT) Tools During Classroom Instruction

                 Esther            Danna             Kenneth
Test             FT     Non-FT     FT     Non-FT     FT     Non-FT
Pre-test         22     16         21     35         7      26
Classroom        58     68         79     88         86     75
Post-test        62     46         78     80         N/A    N/A

The second panel in Figure 3 shows Kenneth's frequency of tool performance. Kenneth completed baseline, classroom, and only one week of post-class home visits. Kenneth also used the Stay close tool most consistently, but a downward trend in tool use appeared across phases (baseline mean, 13.25; range 7 to 19; class mean, 8.25; range 6 to 10). All other tools were used infrequently (Give positive consequences) or not at all (Ignore junk, Pivot, Stop-redirect-give positive consequences, Set expectations, Contract, Time out and ABCs).

PAGE 37

The third panel in Figure 3 shows Danna's frequency of tool performance in the home. Danna used the Stay close and Give positive consequences tools most frequently in her home. There was variability in the frequency of demonstration of the Stay close tool during baseline (mean, 10.33; range 5 to 25). Data for this tool were more stable during class time (mean, 6.75; range 3 to 9) and post-class observations (mean, 5; range 3 to 7). There was a slight upward trend in the frequency of demonstration of the Give positive consequences tool across baseline (mean, 1; range 1 to 2) and classroom (mean, 2.25; range 1 to 3) phases. Similar to the other participants, Danna did not use a wide range of tools in the home. The Pivot and Set expectations tools were demonstrated only during baseline and were not demonstrated in other phases.

Figure 4 shows accuracy of tool use in the home. Accuracy scores were calculated by dividing the sum of all accurate tool steps performed for a specific tool by the total number of steps attempted, and multiplying the quotient by 100. The top panel displays Esther's performance. Accuracy for the Stay close tool during baseline began relatively high (58%), but subsequent observations revealed decreases in accuracy during baseline (mean, 23.7%; range 13%-58%). Accuracy during class time (mean, 44%; range 39%-53%) and post-class (mean, 44%; range 36%-56%) was relatively stable. All other tools were used infrequently (Stop-redirect-give positive consequences, 37% and 60%; Give positive consequences, 11%; Ignore junk, 67%) or not at all (Pivot, Set expectations, ABC, Contract and Time out).

The second panel in Figure 4 shows Kenneth's accuracy of performance on the Stay close and Give positive consequences tools. Data for Stay close remained relatively stable across the baseline (mean, 59.7%; range 57%-63%) and classroom (mean, 60.2%,

PAGE 38

30 min session B ase lin e Post Class Class 30 min se ssi on 30 min session Kenneth Danna E st h e r Figure 3. Frequency of in home tool performance. Filled data points represent fluency trained tools, open data points, x and + represent non fluency trained tools. 32

PAGE 39

33 range 46% 67%) phases. Scores for Give positive consequences decreased from 88% accuracy during baseline to 78% during class. Other tools such as Ignore junk, Pivot, Stop-redirect-give positive consequences, Set expectations, Contract, Time out and ABCs were not demonstrated duri ng in home observations. The third panel in Figure 4 shows Danas accuracy of performance on tools used in the home. There was a relatively stable baseline demonstrated for the Stay Close tool (mean, 56%; range 51% 62%). Data were so mewhat more variable during class (mean, 61.75%; range 43%-74%) and after class (m ean, 63.7%; range 52%-92%), but overall accuracy remained relatively unchanged acro ss phases. There was a slight downward trend in the accuracy of the Give positive consequences tool during baseline (mean, 71.75%; range 67%-78%). Accuracy increas ed slightly during class (mean, 82.25%; range 74%-89%) and remained within baseline ra nge in the post class observation (77%). Less frequently used tools such as the Pivot and Set Expectations tools occurred during the baseline phase at 50% and 25 % accuracy resp ectively or not at a ll (Stop-redirect-give positive consequences, Ignore junk, Contract, Time out, Assess behavior using ABCs). Only Esther and Danna completed social validity questionnaires after completing their post-tests. Esther and Danna stated that flash card drills help ed them to remember the concepts taught in class, were enjoyable, facilitated s timulus discrimination for tool use in the home and helped them perform be tter on posttest role-plays. Both Esther and Danna strongly agreed that fluency drills he lped them in using tools correctly with the children in their home. At the end of in-home visits, Esther and Danna were asked to complete the social validity questionnaire given afte r posttest completion once more Esther strongly agreed

PAGE 40

Figure 4. Accuracy of in home tool performance. Filled data points represent fluency trained tools, open data points, x and + represent non fluency trained tools. Baseline Class Esther Kenneth 30 min session 30 min Danna 30 min Post Class 34

PAGE 41

35 but Danna strongly disagreed that fluency training was helpful in retention of concepts learned in class. Esther agreed but Danna strongly di sagreed that fluency drills helped her perform better on posttests and were benefi cial to learning. Both Esther and Danna stated that they did not enjoy using the flas h cards; the drills did not help them to know when to use a specific tool in their home, or know how to use tools correctly with children in their homes.
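As a point of reference for the in-home accuracy values reported above, the stated calculation (accurate steps completed divided by steps attempted, multiplied by 100) can be expressed as a short sketch. This is illustrative only: the data layout, the treatment of N/A steps as unattempted, and the function names are assumptions for this example, not the scoring procedures actually used in the study.

    # Minimal sketch of the in-home accuracy calculation described above.
    # Assumption (not from the thesis): each observed tool use is recorded as a
    # list of step outcomes "Y" (completed), "N" (attempted but not completed),
    # or "NA" (no opportunity); N/A steps are excluded from "steps attempted".
    from statistics import mean

    def tool_accuracy(step_outcomes):
        """Percent accuracy for one tool use: accurate steps / steps attempted * 100."""
        attempted = [s for s in step_outcomes if s in ("Y", "N")]
        if not attempted:
            return None  # tool not really attempted in this observation
        return 100 * sum(s == "Y" for s in attempted) / len(attempted)

    def phase_summary(sessions):
        """Mean and range of per-session accuracy scores within one phase."""
        scores = [a for a in (tool_accuracy(s) for s in sessions) if a is not None]
        return mean(scores), min(scores), max(scores)

    # Hypothetical Stay close data: one list of step outcomes per 30-min session.
    baseline = [
        ["Y", "Y", "N", "N", "NA", "N", "Y", "N", "N", "Y"],
        ["Y", "N", "N", "N", "N", "NA", "N", "N", "Y", "N"],
    ]
    m, lo, hi = phase_summary(baseline)
    print(f"Baseline Stay close accuracy: mean {m:.1f}%, range {lo:.0f}%-{hi:.0f}%")

Aggregating the per-session scores in this way yields the kind of phase means and ranges reported for Figure 4.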


Chapter Four

Discussion

This study investigated the effects of fluency training during classroom instruction on tool performance (learning) during class, and in the home during instruction and after class completion. Study results indicated that fluency training was not effective in increasing classroom performance of tools during posttests. Only two participants performed well on fluency drills; however, these participants did not demonstrate increased performance on fluency trained tools at posttest. Fluency training was also ineffective in increasing accuracy and frequency of fluency trained tools in the home.

One goal of the study was to increase the fluency of participant responses on see/say fluency drills. However, caregiver performance on fluency drills was mixed. Both Kenneth and Danna achieved high response rates that closely approximated or met performance aims. Esther achieved low rates of correct and incorrect responses, at levels considerably below performance aims. It is interesting to note that although Kenneth and Danna performed well on fluency drills, corresponding differentiation between fluency trained and non-fluency trained tool use across dependent variables was not observed. It is difficult to determine conclusively whether see/say fluency drills are the most appropriate instructional technology for increasing performance, because performance results for Danna and Kenneth are mixed. Quite possibly, drills that incorporate the see/do learning channels could be more effective. During class, participants would observe a role-play situation, or read a role-play situation, and perform the tool that was most appropriate in that situation. See/do drills may have provided multiple tool practice opportunities, thus increasing performance. See/do drills may have also resulted in generalization of parenting skills, as caregivers would have had opportunities to practice situations that may occur within the home.

All participants demonstrated an increase in performance from pre-test to classroom role-plays, indicating that the PBC curriculum was effective in facilitating learning. For both Esther and Danna, percent accuracy was higher for non-fluency trained tools than for fluency-trained tools. The opposite effect was observed in data obtained from Kenneth (i.e., classroom performance on fluency-trained tools was higher than classroom performance on non-fluency trained tools). Such mixed results indicate that fluency training had little or no effect on increasing classroom performance on fluency trained tools. It is possible that other variables could have accounted for differences in role-play accuracy, including practice effects, unequal opportunities for demonstration and practice of tools, and varying curriculum activities that may have reviewed specific tool concepts or steps multiple times during classroom instruction.

With regard to posttest role-play scores, Esther demonstrated no consistent improvements. However, Esther's mean post-test performance on fluency-trained tools was higher than that on non-fluency trained tools. The opposite effect occurred for Danna: her mean posttest performance on non-fluency trained tools was higher than her performance on fluency trained tools. This outcome may be attributed to tool complexity. It is possible that the steps of the Stay close, Give positive consequences, Pivot, Set expectations, and ABC tools were less complex than those of the other PBC tools. Therefore, both participants demonstrated increased accuracy on these tools despite fluency training.


Another factor associated with tool complexity is the number of steps contained in each tool. The tool sets for Esther and Danna (fluency trained and non-fluency trained, respectively) that were demonstrated with the most increases in accuracy of performance had 9 or fewer steps, with the exception of the Set expectations tool, which had 14 steps. Both Esther and Danna showed decreased performance from class to posttest for this tool. The same pattern was demonstrated in Kenneth's data; specifically, classroom performance was greatest on tools with the fewest steps. The exception to this is the ABC tool, on which Kenneth scored 0% across all assessments.

A third reason that increased performance on fluency-trained tools did not replicate across participants may relate to the frequency of drills. Fluency drills on specific tools were conducted approximately four times, and only during the session in which the tool was taught. Therefore, there were no additional opportunities for fluency drills on a specific tool once the session was completed. Additional drills during sessions in which a non-fluency trained tool was being taught may have resulted in increased rates of responses on fluency drills, increased performance on role-plays in class, and increased tool performance in the home.

With regard to tool use in the home, the Stay close tool occurred at higher rates than all other tools during initial observation sessions, but exhibited a downward trend across phases. This initial elevation may have been the result of reactivity caused by the introduction of novel stimuli (observers) in the home environment. However, as reactivity decreased, so did frequency of performance of the Stay close tool. At this point, caregivers may have been able to discriminate more appropriate times for tool use, or the performance levels now accurately reflected frequency of tool use in the home. Since frequency of Stay close decreased across sessions, it might be concluded that fluency training had no effect in increasing frequency of performance of this tool. One might also conclude that there was no opportunity for a treatment effect for the Stay close tool, because Esther was the only participant fluency trained in this tool and her rates of response on tool drills were low.

The other tool used by all participants in the home was Give positive consequences, although frequency and accuracy varied greatly among caregivers. Esther demonstrated use of this tool only twice during baseline, Danna exhibited use of this tool across all phases with an increasing trend in frequency, and Kenneth demonstrated use of the tool during baseline and class with decreasing frequency. This may indicate that there was some generalization and maintenance of the Give positive consequences tool in Danna's home despite fluency training. However, the decreased use of this tool demonstrated by Kenneth and Esther indicates that there was no generalization and maintenance of this tool in the homes of these participants. Again, one could conclude that there was no opportunity for a treatment effect, since Esther was the only participant fluency trained in the Give positive consequences tool and she exhibited poor rates of responses on drills.

With the exception of Stay close and Give positive consequences, tools were rarely observed in the home. It is also possible that the Stay close and Give positive consequences tools were predominantly performed in all homes because these tools were already in each participant's repertoire, there were multiple opportunities for use of these tools even at baseline, or caregivers were able to recognize opportunities for use of these tools. Other tools may not have been used as frequently because there may not have been multiple opportunities to use them, or caregivers failed to recognize opportunities to use them. If the latter is the case, then the PBC curriculum may need to be revised so that it incorporates strategies that facilitate identification of opportunities for tool use in the home. Other tools, such as the Set expectations and Contract tools, may not have been observed in the home because there were no opportunities to use them. In this case, a more sensitive measure of effects could assess opportunity for tool use in relation to actual tool use.

At the end of class, caregivers who completed the study reported that fluency drills had positive effects, such as facilitating concept and tool recollection at home and in post-tests, facilitating the identification of clear discriminative stimuli for tool use, and supporting accurate tool use in the home. Actual performance data did not correspond with caregiver reports at class completion. However, at the end of the study, caregivers' reports on the effectiveness of fluency drills were similar to the data obtained in the study. Both the data and caregiver opinion indicated that fluency training was not helpful in retention or generalization of tool skills, or in increasing performance during class. This change in opinion may have occurred at the end of the study because, with the passage of time, caregivers may have been better able to determine the effect that fluency training in class had on their tool use at home, therefore allowing them to make an accurate assessment of treatment effects.

The results of this study indicated that fluency training had little or no effect on increasing tool performance during class or in the home. An extension of this study should include strategies that would increase generalization and maintenance. Such strategies may include programming mediators, training loosely, or training multiple exemplars (Stokes & Baer, 1977). For example, future researchers could train multiple exemplars by presenting multiple role-play opportunities for varying examples of tool use. Role-plays should be similar to interactions present in the home.

Future researchers could also explore the use of functional mediators to facilitate generalization, by training the children in the home to solicit reinforcement following appropriate behavior. Regardless of the setting, the child could then solicit reinforcement for appropriate behavior and the caregiver would perform the Give positive consequences tool. Caregivers could also be given a magnet or key chain containing tools and tool steps that would be present in both the training setting and all other settings in which tools could be used. The caregiver would then carry a stimulus that could evoke tool use across multiple settings, thus facilitating generalization.

Another area of research could explore the effect that increased fluency drills on tools, beyond the session in which they were taught, would have on tool performance in class and at home. Another study could explore whether providing increased fluency training on only complex tools across the eight-session class would result in increased performance on these tools.

Researchers also might investigate the reasons for low levels of tool use within the home. There are multiple reasons why this might occur, including poor stimulus discrimination (caregivers' inability to recognize opportunities for tool use) or lack of opportunities to use certain tools. To investigate the latter, opportunities for tool use would need to be clearly defined and recorded in relation to actual tool use. This line of inquiry would be important in determining the utility of certain tools taught in the parenting course. If trainers can determine the tools with the most utility in the home, caregiver training curricula can be streamlined to focus on these tools, which could subsequently increase caregiver tool performance.

Additional research might also investigate changes in the quality of parent/child interactions, or the types of tool steps completed, as a function of training. The data obtained indicated that there were no significant changes in the accuracy of tool use from baseline to training. However, there may have been changes in the quality of interactions, meaning that caregivers during and after training may have been using tool components that are important in effecting behavior change or improving interaction. A mere calculation of tool accuracy may not have captured such changes. Therefore, future researchers could investigate any changes in the completion of certain crucial tool steps that could result in changes in types of interactions or effect changes in child behavior.

This study should also be replicated with an increased number of participants. The variability in the data, combined with a small number of participants, posed challenges in making definitive conclusions about the effect of fluency training on performance. Additional study participants, combined with the improvements and extensions noted above, may provide additional data that could facilitate more conclusive results.


References

Baer, D.M., Wolf, M.M., & Risley, T.R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91-97.

Bell, K.E., Young, K.R., Salzberg, C.L., & West, R.P. (1991). High school driver education using peer tutors, direct instruction and precision teaching. Journal of Applied Behavior Analysis, 24, 45-51.

Binder, C. (1988). Precision teaching: Measuring and attaining exemplary academic achievement. Youth Policy, 10(7), 12-15.

Binder, C. (1990). Precision teaching and curriculum based measurement. Journal of Precision Teaching, 7(2), 33-35.

Binder, C. (1993). Behavioral fluency: A new paradigm. Educational Technology, 33, 8-14.

Binder, C. (1996). Behavioral fluency: Evolution of a new paradigm. The Behavior Analyst, 19(2), 163-197.

Binder, C., & Bloom, C. (1989, February). Fluent product knowledge: Application in the financial industry. Performance and Instruction, 17-21.

Binder, C., Haughton, E., & Van Eyk, D. (1990). Increasing endurance by building fluency: Precision teaching attention span. Teaching Exceptional Children, 22(3), 24-27.

Binder Riha Associates (2001, January 15). Effective fluency building design and coaching for classroom programs. Ideas from the Frontlines, 1-4.

PAGE 50

Binder, C., & Watkins, C.L. (1990). Precision teaching and direct instruction: Measurably superior instructional technology in schools. Performance Improvement Quarterly, 3(4), 74-96.

Bucklin, B.R., Dickinson, A.M., & Brethower, D.M. (2000). A comparison of the effects of fluency training and accuracy training on application and retention. Performance Improvement Quarterly, 13(3), 140-163.

Fredrick, L.D., Deitz, S.M., Bryceland, J.A., & Hummel, J.H. (2000). Instructional strategies. In Behavior analysis, education, and effective schooling (pp. 79-104). Reno, NV: Context Press.

Gardner, R., III, Heward, W.L., & Grossi, T.A. (1994). Effects of response cards on student participation and academic achievement: A systematic replication with inner-city students during whole-class science instruction. Journal of Applied Behavior Analysis, 27, 63-71.

Heward, W.L. (1987). Promoting the generality of behavior change. In Applied behavior analysis (pp. 552-582). Upper Saddle River, NJ: Prentice Hall.

Huang, H., Chao, H., Tu, C., & Yang, P. (2003). Behavioral parent training for Taiwanese parents of children with attention-deficit/hyperactivity disorder. Psychiatry and Clinical Neurosciences, 57, 275-281.

Ivarie, J.J. (1986). Effects of proficiency rates on later performance of a recall and writing behavior. Remedial and Special Education, 7(5), 25-30.

Kinder, D., & Carnine, D. (1991). Direct instruction: What it is and what it is becoming. Journal of Behavioral Education, 1(2), 193-213.

PAGE 51

Kubina, R.M., & Morrison, R.S. (2000). Fluency in education. Behavior and Social Issues, 10, 83-99.

Kubina, R.M., Morrison, R., & Lee, D.L. (2002). Benefits of adding precision teaching to behavioral interventions for students with autism. Behavioral Interventions, 17, 233-246.

Kuhn, S.A.C., Lerman, D.C., & Vorndran, C.M. (2003). Pyramidal training for families of children with problem behavior. Journal of Applied Behavior Analysis, 36(1), 77-88.

Kunzelmann, H.P. (1970). Precision teaching: An initial training sequence. Seattle, WA: Special Child Publications.

Lindsley, O.R. (1990). Precision teaching: By teachers for students. Teaching Exceptional Children, 22, 10-15.

Lindsley, O.R. (1991). Precision teaching's unique legacy from B.F. Skinner. Journal of Behavioral Education, 1(2), 253-266.

Lindsley, O.R. (1992). Precision teaching: Discoveries and effects. Journal of Applied Behavior Analysis, 25, 51-57.

Martin, G.L. (1979). Improving the math performance of a six year old. In T.C. Lovitt & N.G. Haring (Eds.), Classroom application of precision teaching (pp. 15-24). Seattle, WA: Special Child Publications.

Muir, K.A., & Milan, M.A. (1982). Parent reinforcement for child achievement: The use of a lottery to maximize parent training effects. Journal of Applied Behavior Analysis, 15(3), 455-460.

PAGE 52

Smith, M.D., & Barrett, M.S. (2002). The effect of parent training on hyperactivity and inattention in three school-aged girls with attention deficit hyperactivity disorder. Child and Family Behavior Therapy, 24(3), 21-35.

Spangler, R.S., & Hankins, N.E. (1975). Comparison of two evaluative procedures on retention by college students. Psychological Reports, 36, 613-614.

Stokes, T.F., & Baer, D.M. (1977). An implicit technology of generalization. Journal of Applied Behavior Analysis, 10(2), 349-367.

Vargas, E.A., & Vargas, J.S. (1991). Programmed instruction: What it is and how to do it. Journal of Behavioral Education, 1(2), 235-251.

Venning, H.B., Blampied, N.M., & France, K.G. (2003). Effectiveness of a standard parenting-skills program in reducing stealing and lying in two boys. Child and Family Behavior Therapy, 25(2), 31-44.

West, R.P., Young, K.R., & Spooner, F. (1990). Precision teaching: An introduction. Teaching Exceptional Children, 22, 4-9.

White, O.R., & Haring, N.G. (1980). Exceptional teaching. Columbus, OH: Charles E. Merrill Publishing Co.


Appendices


Appendix A: Sample Pre-test and Post-test Role-play Scripts

Tool: Stay Close

Keep the time of this role-play to around 1 minute.

Trainer tells the Participant:
You are in the kitchen getting a drink out of the refrigerator. Your 12-year-old child comes home from school and sits at the kitchen table. S/He looks sad. Show me what you would do.

Co-Trainer's Role:
- You are 12 years old.
- You come home from school and walk through the door looking sad. (Your best friend at school is moving to ___________ (pick a city that is over 100 miles away) at the end of the semester or month. You are very upset at the thought of losing your best friend. You want to talk to your parent about it.)
- Sit at the kitchen table away from your parent. (When you sit at the table, be far enough away that the parent must move in order to be within arm's length and/or touch you.)
- When you begin to discuss your friend moving, respond morosely and make emotional comments such as: "I had a crappy (shitty) day," "this sucks," "it's stupid," and "I hate this." Make these types of comments intermittently. Stop immediately if/when an empathy statement is made.
- If the parent asks questions, answer them, without talking too much.
- Avoid eye contact until the parent makes an empathy statement.
- Since problem solving is not part of Stay Close, especially prior to making an empathy statement, respond with more verbal junk (ask a "why" question or argue with the parent).
- If the parent doesn't ask why you are so sad, complain about your friend moving so that the role-play continues. Remember that you want to talk to your parent.

Trainer:
- Watch the parent's body language. Arms folded, hands on hips, standing over the top of the child, and looking at things other than the child are not appropriate. Wait to see if they change.
- Close Proximity and Appropriate Body Language must occur by the half-way point for it to be scored as yes.
- If an Appropriate Touch occurs, even at the very end, it is scored as a yes.
- Stop the role-play when you have the needed information.


Appendix A (Continued)

Trainer: STOP the role-play when you have the information needed.

Tool: Give Positive Consequences

Suggested Props: Something that looks like a video game controller.

Trainer tells the Participant:
You are about to enter the living room. You know your two children are in the living room playing video games. You know your children often argue when they play video games. Show me what you would do when you enter the living room. (If there is one trainer, pretend that the other child is there.)

Trainer's Role:
- You are playing a video game with your sibling.
- As your parent enters, your sibling says, "I want a try!" You say, "Okay, here you go," as you hand him the controller.
- If the parent does nothing, you say, "Let me try again."

Co-Trainer's Role:
- You are playing a video game with your sibling.
- As your parent enters the room, say, "I want a try!" Take the controller and play.
- If your sibling asks for another try, hand the controller back.

Trainer:
- STOP the role-play after the video game controller has been passed back and forth twice (whether the parent comments or not).
- If the parent makes a negative response, for example, "I can't believe you aren't arguing!", then stop the role-play.
- If the parent acknowledges, in some positive way, that the children are sharing, this will end the role-play.


Appendix A (Continued)

Tool: Ignore Junk Behavior

Trainer tells the Participant:
You are in the kitchen after dinner. Your 11-year-old child is reading a book he/she likes. His/Her homework is finished. Ask him/her to take out the garbage.

Co-Trainer's Role:
- You are the 11-year-old child. You are reading a book you really like. You do not want to take the garbage out.
- You have just been told to take out the garbage. Whine, "But I'm reading my book." Roll your eyes, slam the book shut, and slowly get up.
- Walk very slowly, shuffling your feet, and pick up the garbage. Say: "How come I always have to take the damn garbage out?"
- Emit some more junk, but pause occasionally, allowing the parent time to speak.
- Once the garbage is out, slam the door, pick up your book, and say, "There, are you happy now?"

Trainer: STOP the role-play when you have the information needed.

Tool: Stop-Redirect-Give Positive Consequences

Trainer tells the Participant:
You are in the living room with your three-year-old child and your two-month-old baby, who is sleeping in the bassinet. The three-year-old throws a small plastic toy in the bassinet. You are too far away to prevent this from happening. Show me what you would do.

Co-Trainer's Role:
- You are the three-year-old child sitting on the floor. You are just playing in the living room with your plastic toy.
- You throw a small plastic toy into the bassinet where the baby is sleeping. As you throw, you say, "Baby wants toy."


Appendix A (Continued)

- If you are redirected, give a brief bit of whining and crying and briefly resist by pulling against the parent or falling to the floor and stomping your feet, but not for more than three to five seconds.
- If there is no intervention, get the plastic toy and throw it in again.

Trainer: STOP the role-play when you have the information needed.


Appendix B: Interaction Evaluation Questionnaire

Caregiver: _______________________   Date: __________________
Interviewer: ______________________   Class start date: _________

Instructions: Ask the caregiver the following questions and write responses in the spaces provided. The italicized bold heading above each question indicates the reason for presenting the question. Read only the questions below the headings.

Identification of observation times
1. What time(s) during the day do you interact with your child the most, and what activities/tasks/chores etc. do you do at that time?
2. List demands/instructions that you place on the child during this time. If no demands are presented at this time, when do you present demands, and what are they?

Identification of appropriate behavior / opportunities to provide reinforcement
3. What demands will/does your child comply with during this time?

Identification of inconsequential behavior
4. What demands will your child not comply with during this time?


Appendix B (Continued)

Identification of inconsequential behavior / opportunities for redirection
5. When your child does not comply, what do you typically do?

Identification of opportunities for redirection
6. What are some tasks assigned during this time that you help your child with?

Identification of appropriate behavior
7. What are some tasks assigned during this time that your child does independently?

Identification of interactions for use of Stay Close tool
8. Is there any activity that your child enjoys doing at this time, or that you and your child enjoy doing together?


Appendix C: Sample Pre/Post Test, In-class Task Analyses and In-Home Task Analysis

Tool Checklist: Stay Close

Participant: ____________________   Date: _________   Primary Data Taker: ____________
Circle one: Pre-Test   Post-Test   In-Class          Secondary (IOA) Data Taker: _____________

Step                                                                 Comments
1. Get close to the child within 15 seconds of the stay close behavior (move toward child and be within arm's reach, etc.).
2. Touch appropriately (pat, hug, rub, etc.).
3. Match facial expressions. (Appropriately reflect the emotion of the situation.)
4. Use appropriate tone of voice (voice matches situation; a neutral monotone is not good enough).
5. Relax your body language within 15 seconds of the stay close behavior (relaxed, arms open, attentive, looking at child, etc.).
6. Ask open-ended positive questions (what? how? could you?).
7. Listen while the child is speaking. Talk less than the child. (Do not problem-solve unless the child asks for help. Do not interrupt or abruptly change the topic.)
8. Use empathy statements (act like a mirror and reflect the child's feelings, express understanding, caring, etc.).
9. Ignore junk behavior.
10. Stay cool throughout the process (no coercives).

Instructions: Each time a tool is performed, mark Y if the step is completed, N if the step was not completed, and N/A if there was no opportunity to complete the step.


Appendix C (Continued)

Bolded and italicized steps must be completed for the task to be considered a performance of the tool.

Trainer's Notes:
- These steps do not have to be completed in any particular order after step 5.
- A single instance of a punitive, disgusted, or inappropriate facial expression (step 3), tone of voice (step 4), or body language (step 5) during any part of the role play should be scored no for step 3, 4, or 5.
- Only one open-ended question is needed to score a yes for step 6.
- If problem-solving is used without the child asking for it, score no for step 7. If the parent begins to problem-solve, note whether it occurs before or after the empathy statement.
- Only one instance of an empathy statement is needed to score a yes for step 8.
- A single instance of attending to junk behavior throughout the role play will be scored no for step 9.

Overall Comments: (Circle any coercives used: sarcasm/teasing; criticism; threats; arguing; questioning; logic; despair, pleading, hopelessness; force; taking away privileges/items/allowance; one-up-man-ship; silent treatment; telling on them to others. Be specific.)
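To make the checklist scoring rules concrete, the following is a minimal sketch of how one completed checklist might be scored. Consistent with the instructions above, it assumes steps are marked Y, N, or N/A, that N/A steps are excluded from the accuracy denominator, and that all bolded/italicized (critical) steps must be marked Y for the observation to count as a performance of the tool; the particular critical step numbers and the function name are hypothetical, not taken from the study materials.

    # Minimal sketch of scoring one completed Stay Close checklist.
    # CRITICAL_STEPS is hypothetical; the real critical steps are the bolded ones.
    CRITICAL_STEPS = {1, 5, 7, 8}

    def score_checklist(marks):
        """marks: dict mapping step number -> "Y", "N", or "NA"."""
        attempted = {k: v for k, v in marks.items() if v in ("Y", "N")}
        if not attempted:
            return None, False
        accuracy = 100 * sum(v == "Y" for v in attempted.values()) / len(attempted)
        counts_as_performance = all(marks.get(step) == "Y" for step in CRITICAL_STEPS)
        return accuracy, counts_as_performance

    example = {1: "Y", 2: "NA", 3: "Y", 4: "Y", 5: "Y", 6: "N", 7: "Y", 8: "Y", 9: "Y", 10: "N"}
    acc, performed = score_checklist(example)
    print(f"Step accuracy: {acc:.0f}%   Counts as a tool performance: {performed}")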


Appendix C (Continued)

Tool Checklist: Give Positive Consequences

Participant: ____________________   Date: _________   Primary Data Taker: ______
Circle one: Pre-Test   Post-Test   In-Class          Secondary (IOA) Data Taker: _______

Step                                                                 Comments
1. Tell the child which appropriate behavior he/she demonstrated.
2. Provide a positive consequence that fits the appropriate behavior. (Circle those provided): verbal praise; appropriate touch (hug, pat, kiss, high five, etc.); tangible item (thing); appropriate privilege.
3. Provide the positive consequence during the appropriate behavior or no longer than 3 seconds after the appropriate behavior has occurred.

Staying Close Components
4. Get close to the child as appropriate to the situation (move toward child and be within arm's reach, etc.).
5. Touch appropriately (pat, hug, rub, etc.).
6. Match facial expressions (reflect the emotion of the situation).
7. Use appropriate tone of voice (voice matches situation; a neutral monotone is not good enough).
8. Use appropriate body language when providing the consequence (relaxed, arms open, attentive, looking at child, etc.).
9. Stay cool throughout the process (no coercives).


Appendix C (Continued)

Instructions: Each time a tool is performed, mark Y if the step is completed, N if the step was not completed, and N/A if there was no opportunity to complete the step. Bolded and italicized steps must be completed for the task to be considered a performance of the tool.

Trainer's Notes:
- The staying close components must be used within 3 seconds of the parent responding to the appropriate behavior. If used after 3 seconds or not at all, score these items no.
- Score step 4 yes if the parent moves within arm's reach, even briefly.
- Score no if there is any instance of inappropriate expression, tone of voice, or body language after the first 3 seconds.

Overall Comments: (Circle any coercives used: sarcasm/teasing; criticism; threats; arguing; questioning; logic; despair, pleading, hopelessness; force; taking away privileges/items/allowance; one-up-man-ship; silent treatment; telling on them to others. Be specific.)


Appendix C (Continued)

Tool Checklist: Ignore Junk Behavior

Participant: _____________   Date: _________   Primary Data Taker: ______________
Circle one: Pre-Test   Post-Test   In-Class          Secondary (IOA) Data Taker: ________

Step                                                                 Comments
1. Don't say anything about the junk behavior. (For example, "Stop that now!" and "Quit that!")
2. Don't do anything differently when the junk behavior happens (don't react, roll your eyes, stomp out of the room, cross your arms, stare, etc.).
3. Do another activity independent of the child (e.g., talk to another child).
4. When appropriate behavior occurs, give a positive consequence. Circle those demonstrated: verbal praise; appropriate touch (hug, pat, kiss, high five, etc.); tangible item (thing); appropriate privilege.
5. Give a positive consequence no longer than 3 seconds after the junk behavior has stopped.
6. Stay cool throughout the process (no coercives).

Instructions: Each time a tool is performed, mark Y if the step is completed, N if the step was not completed, and N/A if there was no opportunity to complete the step. Bolded and italicized steps must be completed for the task to be considered a performance of the tool.

Trainer's Notes:
- For steps 1 and 2, score no if there is any response to the junk behavior, including laughing or any change of expression. However, if the parent realizes they have responded to the junk behavior and stops the response, note this in the Comments and reinforce the parent for their acknowledgment and correction.

Overall Comments: (Circle any coercives used: sarcasm/teasing; criticism; threats; arguing; questioning; logic; despair, pleading, hopelessness; force; taking away privileges/items/allowance; one-up-man-ship; silent treatment; telling on them to others. Be specific.)


Appendix C (Continued)

Tool Checklist: Stop-Redirect-Give Positive Consequences

Participant: ______________   Date: _________   Primary Data Taker: ____________
Circle one: Pre-Test   Post-Test   In-Class          Secondary (IOA) Data Taker: ________

Step                                                                 Comments
1. Get within arm's reach of the child (before saying anything).
2. Say only, "Stop (behavior)," or something like, "Don't hit." (Score no if longer comments or repeated comments are made.)
3. Make sure the child stops the behavior within 3 seconds of demand delivery. (Use gentle physical guidance if necessary.)
4. Tell the child to do something else (i.e., a positive alternative activity).
5. If the child does not do an appropriate activity within 3 seconds of task delivery, model, or gently guide them to do the activity.
6. Give a positive consequence for doing the appropriate behavior (praise, touch).
7. Give the positive consequence within 3 seconds after the appropriate behavior begins or the serious behavior stops.
8. Do not say or do anything about junk behavior throughout the process.
9. Stay cool throughout the process (no coercives).

Instructions: Each time a tool is performed, mark Y if the step is completed, N if the step was not completed, and N/A if there was no opportunity to complete the step. Bolded and italicized steps must be completed for the task to be considered a performance of the tool.


Appendix C (Continued)

Overall Comments: (Circle any coercives used: sarcasm/teasing; criticism; threats; arguing; questioning; logic; despair, pleading, hopelessness; force; taking away privileges/items/allowance; one-up-man-ship; silent treatment; telling on them to others. Be specific.)


Appendix C (Continued)

Stay Close Checklist — In Home

Participant: ______________   Date: _________   Primary Data Taker: ____________
Secondary (IOA) Data Taker: _________________   Session #: _________________

Step (one scoring column per observation date)
1. Get close to the child within 15 seconds of the stay close behavior (move toward child and be within arm's reach, etc.).
2. Touch appropriately (pat, hug, rub, etc.).
3. Match facial expressions. (Appropriately reflect the emotion of the situation.)
4. Use appropriate tone of voice (voice matches situation; a neutral monotone is not good enough).
5. Relax your body language within 15 seconds of the stay close behavior (relaxed, arms open, attentive, looking at child, etc.).
6. Ask open-ended positive questions (what? how? could you?).
7. Listen while the child is speaking. Talk less than the child. (Do not problem-solve unless the child asks for help. Do not interrupt or abruptly change the topic.)
8. Use empathy statements (act like a mirror and reflect the child's feelings, express understanding, caring, etc.).
9. Ignore junk behavior.
10. Stay cool throughout the process (no coercives).

Instructions: Each time a tool is performed, mark Y if the step is completed, N if the step was not completed, and N/A if there was no opportunity to complete the step. Bolded and italicized steps must be completed for the task to be considered a performance of the tool.

Trainer's Notes:
- These steps do not have to be completed in any particular order after step 5.
- A single instance of a punitive, disgusted, or inappropriate facial expression (step 3), tone of voice (step 4), or body language (step 5) during any part of the role play should be scored no for step 3, 4, or 5.
- Only one open-ended question is needed to score a yes for step 6.
- If problem-solving is used without the child asking for it, score no for step 7. If the parent begins to problem-solve, note whether it occurs before or after the empathy statement.
- Only one instance of an empathy statement is needed to score a yes for step 8.
- A single instance of attending to junk behavior throughout the role play will be scored no for step 9.

Overall Comments: (Circle any coercives used: sarcasm/teasing; criticism; threats; arguing;)


Appendix D: Participants' Data Sheet and Graph

Fluency Data Sheet

Name: _______________________        Circle one: Practice   Experiment   Home

Write the number of flash cards that were correct, incorrect, and passed for each trial.

Date: _______   Session #: ______

Trial    Correct    Incorrect    Pass
  X
  1
  2
  3
  4
  5

(This date/session heading and table are repeated six times on the sheet, one block per session.)


Appendix D (Continued)

[Blank charting form appears here, titled "Fluency Drill Performance." Vertical axis: Number of cards (0 to 80); horizontal axis: Trials per Session (trials 1-3 for each session). Legend distinguishes Correct, Passed, and Incorrect (X) data points. Name: _______________]
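As an illustration of how the counts recorded on the data sheet relate to the rates plotted on the chart, here is a minimal sketch. The one-minute trial length comes from the trainer script in Appendix F; the numbers and the performance-aim value are hypothetical, and the code is purely illustrative rather than software used in the study.

    # Minimal sketch: summarizing fluency drill data like the Appendix D sheet.
    # Each trial is a 1-minute timing, so card counts read directly as counts per minute.
    trials = [
        # (session, trial, correct, incorrect, passed)  -- made-up numbers
        (1, 1, 18, 4, 2),
        (1, 2, 22, 3, 1),
        (1, 3, 25, 2, 0),
        (2, 1, 27, 2, 1),
    ]

    PERFORMANCE_AIM = 30  # hypothetical aim, in correct cards per minute

    for session, trial, correct, incorrect, passed in trials:
        met_aim = "yes" if correct >= PERFORMANCE_AIM else "no"
        print(f"Session {session}, trial {trial}: "
              f"{correct} correct/min, {incorrect} incorrect/min, {passed} passed "
              f"(aim met: {met_aim})")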


Appendix E: Sample Flash Cards

Card 1, front (Q 1): Ignore Junk: Step 1 — Don't _______ anything about the junk behavior when it happens.
Card 1, back (A 1): SAY

Card 2, front (Q 2): What tool would you use? Jimmy is whining for ice cream.
Card 2, back (A 2): Ignore Junk

Card 3, front (Q 3): What is this? "You sound upset."
Card 3, back (A 3): Empathy statement


Appendix F: Trainer Script for Fluency Drill

The following is the trainer's script that should be used to prompt participants through fluency drills. Read the following instructions:

Trainer: "It's study time. Everyone take out their flash cards."
Wait for participants to take out flash cards.

Trainer: "You will have 2 minutes of study time. When study time is completed, we will do a drill. You may start studying when I say 'You may begin,' and stop when I say 'end.' You may begin."
Start timing two minutes. At the end of two minutes:

Trainer: "End."

Trainer: "It's time for a drill. Get your cards and data sheets ready."
Wait for participants to place cards and data sheets in front of them, on their desks.

Trainer: "Decide who goes first. If you went first the last time, it is now time for you to go second."
Wait for participants to decide the order.

Trainer: "Shuffle your cards."
Wait until all participants have shuffled cards.

Trainer: "You will have 1 minute to complete as many cards as you can. Your time will start when I say 'go.' Time is up when I say 'stop.' Ready, set, go!"
Start timing. When time has elapsed:

Trainer: "Stop. Record your data."
Wait for all participants to record their data.

Trainer: "Now, switch roles."
Wait for all participants to exchange cards and data sheets.

Trainer: "Shuffle your cards."


Appendix F (Continued)

Wait until all participants have shuffled cards.

Trainer: "You will have 1 minute to complete as many cards as you can. Your time will start when I say 'go.' Your time is up when I say 'stop.' Ready, set, go!"
Start timing. When time has elapsed:

Trainer: "Stop, and record your data."
Wait for all participants to record their data.

Trainer: "You will now have 2 minutes of review time. During this time, you may review the cards that you got correct, incorrect, or missed. Review time will start now."
Start timing for 2 minutes. When time has elapsed:

Trainer: Provide a praise statement (e.g., "Nice work"). "You may put your cards away."
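For readers who prefer to see the drill sequence at a glance, the timing structure of the script above can be summarized as a simple schedule. The durations come from the script; representing them as code is purely illustrative and is not part of the study materials.

    # Minimal sketch of the drill-session structure described in the trainer script.
    import time

    SESSION_STEPS = [
        ("Study time (both partners)", 120),
        ("Timed drill, first partner responds", 60),
        ("Record data, then switch roles", None),   # untimed transition
        ("Timed drill, second partner responds", 60),
        ("Record data", None),
        ("Review correct, incorrect, and missed cards", 120),
    ]

    def run_session(live=False):
        """Print the schedule; set live=True to actually time each step."""
        for label, seconds in SESSION_STEPS:
            if seconds is None:
                print(f"{label} (no timer)")
                continue
            print(f"{label}: {seconds} s")
            if live:
                time.sleep(seconds)

    run_session()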


Appendix G: Social Validity Questionnaire

Please circle the number below each statement that most accurately describes your response to the following questions.

1 = Strongly Disagree   2 = Disagree   3 = Neither agree nor disagree   4 = Agree   5 = Strongly Agree

1. Flash card drills helped me remember tools and concepts learned in class.
   1   2   3   4   5
2. Flash card drills were enjoyable.
   1   2   3   4   5
3. Flash card drills helped me use tools correctly with the children in my home.
   1   2   3   4   5
4. In class I learned to play gin with my flash cards.
   1   2   3   4   5
5. Flash card drills helped me to know when I should use a specific tool in my home.
   1   2   3   4   5
6. I did not enjoy using the flash cards.
   1   2   3   4   5
7. Flash card drills helped me to perform better on my post-test role-play.
   1   2   3   4   5
8. Overall, the use of flash cards was beneficial to my learning.
   1   2   3   4   5

