USF Libraries
USF Digital Collections

Development and validation of a systematically designed unit for online information literacy and its effect on student p...


Material Information

Title:
Development and validation of a systematically designed unit for online information literacy and its effect on student performance for internet search training
Physical Description:
Book
Language:
English
Creator:
Dunsker, Emily K
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla.
Publication Date:

Subjects

Subjects / Keywords:
Content-centered instruction
Learner-centered instruction
Constructivist
Behaviorist
Gagné
Dissertations, Academic -- Interdisciplinary Education -- Doctoral -- USF   ( lcsh )
Genre:
government publication (state, provincial, territorial, dependent)   ( marcgt )
bibliography   ( marcgt )
theses   ( marcgt )
non-fiction   ( marcgt )

Notes

Abstract:
ABSTRACT: As online learning increases and classroom use of print textbooks is gradually replaced by web-based instruction, what features of online instruction prove beneficial to student learning? The present study has three purposes: (1) To examine the effects of converting textbook content to web-based instruction for an extant Internet search course. The researcher examined performance differences between an online textbook-to-web tutorial and a second version that included interactive features found in classroom instruction. (2) To investigate students' perceptions of material that afforded high levels of learner control and to compare responses to a more structured instructional module. (3) To document the design process used to convert textbook material to web-based instruction. Gagné's Events of Instruction (1985) differentiated features for the comparison and treatment online modules; one featured content-centered instructional strategies, the other learner-centered strategies.
Thesis:
Thesis (Ph.D.)--University of South Florida, 2005.
Bibliography:
Includes bibliographical references.
System Details:
System requirements: World Wide Web browser and PDF reader.
System Details:
Mode of access: World Wide Web.
Statement of Responsibility:
by Emily K. Dunsker.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 211 pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001681071
oclc - 62744968
usfldc doi - E14-SFE0001047
usfldc handle - e14.1047
System ID:
SFS0025368:00001




Full Text

PAGE 1

Development and Validation of a Systematically Designed Unit for Online Information Literacy and its Effect on Student Performance for Internet Search Training by Emily K. Dunsker A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy Department of Interdisciplinary Education College of Education University of South Florida Major Professor: Arthur Shapiro, Ph.D. Ann Barron, Ed.D. James Carey, Ph.D. Jeffrey Kromrey, Ph.D. Date of Approval: March 14, 2005 Keywords: content-centered instruction, learner-centered instruction, constructivist, behaviorist, Gagné Copyright 2005, Emily K. Dunsker

PAGE 2

Dedication I dedicate this work to my husband, Rick; my son, Alex; and my best friend, editor, and mentor, Susan Zucker. To James Carey, whose dedication went beyond the call of duty and who worked with me to ensure my success. To my colleagues and students at Chamblee Middle School in Atlanta, GA, for their patience and enthusiasm for this work, I dedicate this study.

PAGE 3

Acknowledgments I want to acknowledge the assistance of Dr. Arthur Shapiro, who stepped in as Committee Chair at the end and helped me maintain a sense of sanity throughout the dissertation process. I thank my assistant at work, Rita Garrard, who encouraged me, provided the space for unnecessary administrivia in the media center, and voluntarily drove my son home when I was too busy with work. Thanks to Dr. Carey, who provided a structure for my rather divergent thinking and homed in on the central issue for this study. Thanks to my husband, Rick, who through a long process stuck with me, forced me to get up and work even in the midst of exhaustion, and reminded me that I could succeed. I especially thank Alex, who in the midst of a Bar Mitzvah, a change of school, and emergence into adolescence serves as my inspiration and pride. Love always.

PAGE 4

i Table of Contents List of Tables iv Abstract vi Chapter One Introduction 1 Statement of the Problem 4 Purpose of the Study 6 Dependent Variables 7 Independent Variables 8 Common Features of Two Instructional Modules 9 Differences Between the Two Instructional Strategies 10 Research Questions 10 Definitions 12 Limitations 14 Conclusion 15 Chapter Two Review of Literature 17 Introduction 17 Instructional Design Models for Online Tutorial Development 18 Learner Control Versus Program Control And Achievement 21 E-Learning Interactions 30 Behavioral and Constructivist Theories and their Relationship to Instructional Design 35 Research Question Two: Perceptual Theories 38 Theoretical Foundation for Development of Instructional Modules 43 Instructional Systems Design 46 Derivation of Course Content 52 University of Texas’ Information Literacy Tutorial 54 Conclusion 56 Chapter Three Method 60 Purpose Restated 60 Research Questions Restated 60 Research Design Model 61 Context of the Study 61 Population and Sample 63 Research Question One 68 Instruments 69 Pre/Posttest Development 69 Internet Scavenger Hunt Performance Test Development 70

PAGE 5

ii Research Question Two 73 Academic Motivation Profile 73 Research Question Three 76 Development of Instructional Materials 76 Treatment Module 77 Comparison Module 78 Common Features Between Treatment and Comparison Modules 79 Menu Structure 79 Use of Graphical Organizers 79 Overview of the Research Process 80 Overview Screens Preceding Exercises 80 Clear Definitions of Terms 81 Reference to Previous Material for Retention and Transfer 81 Scavenger Hunt as a Performance Task 81 Different Features Between the Comparison and Treatment Module 81 Research Design for Final Administration of the Study 87 Procedure 88 Data Analysis 90 Chapter Four Results 92 Purpose of the Study 92 Research Question One 93 Self-Reported Time on Instruction 97 Research Question Two 99 Research Question Three 103 Summary of Results 106 Chapter Five Discussion 108 Introduction 108 Purpose of the Study 108 Implications of the Results 110 Research Question One 111 Research Question Two 114 Research Question Three 116 Limitations of Instructional Delivery and Online Learning Literature 119 Conclusion 122 Summary of Limitations 122 Summary of Implications 125 Study Informs Practice 125 Recommendations for Future Research 126 References 130 Appendices 137 Appendix A: Development and Validation for Comprehension Test 138 Appendix B: Flowchart and Conceptual Framework for Comparison Group 164 Appendix C: Treatment Group Flowchart 176 Appendix D: Development and Validation of Internet Scavenger Hunt 180

PAGE 6

iii Appendix E: Formative Evaluation Comments and Results 191 Appendix F: Modified AMP for Search Modules 195

PAGE 7

List of Tables Table 1 Grounded Interactive Strategies 32 Table 2 ARCS Motivational Design 41 Table 3 Constructivist Strategies and Internet Training 48 Table 4 Comparison and Treatment Group Differences 83 Table 5 Repeated Measures ANOVA for Pretest and Posttest 95 Table 6 Descriptive Statistics within Groups on Pretest, Posttest, Time on Task, Scavenger Hunt, and Time to Complete Scavenger Hunt 96 Table 7 Descriptive Statistics for Group 1 and 2 on Scavenger Hunt Data 97 Table 8 Paired T-Test with Periods Three and Four Together 97 Table 9 Descriptive Statistics Groups One and Two Assigned in Matched Pairs 100 Table 10 Paired T-Test AMP 101 Table 11 Correlations Between AMP Factors and Achievement 102 Table 12 Time and Resource Costs for Comparison and Treatment Module 104 Table A-13 Derivation of Test Questions for Pretest and Posttest Comprehension 138 Table A-14 Cronbach Alpha on Summer 2003 USF Students Posttest Scores 149 Table A-15 Cronbach Alpha on Spring 2004 Chamblee Middle School 151 Table A-16 Frequency Distribution Test Items 2004 Chamblee Middle School 8th Grade Posttest Scores 153 Table B-17 Conceptual Framework for Two Modules Comparison and Treatment 164

PAGE 8

v Table D-18 Derivation of Test Questions for Internet Hunt Posttest 180 Table D-19 Cronbach Alpha for Scavenger Hunt: Summer 2003 Students 188 Table D-20 Cronbach Alpha for Scavenger Hunt: Spring 2004 Middle School Students 189 Table D-21 Frequency Distributions for Scavenger Hunt: Spring 2004 Middle School Students 190 Table E-22 Formative Evaluation Feedback from Pilot: Undergraduate Students 191

PAGE 9

Development and Validation of a Systematically Designed Unit for Online Information Literacy and its Effect on Student Performance for Internet Search Training
Emily K. Dunsker
ABSTRACT
As online learning increases and classroom use of print textbooks is gradually replaced by web-based instruction, what features of online instruction prove beneficial to student learning? The present study has three purposes: (1) To examine the effects of converting textbook content to web-based instruction for an extant Internet search course. The researcher examined performance differences between an online textbook-to-web tutorial and a second version that included interactive features found in classroom instruction. (2) To investigate students' perceptions of material that afforded high levels of learner control and to compare responses to a more structured instructional module. (3) To document the design process used to convert textbook material to web-based instruction. Gagné's Events of Instruction (1985) differentiated features for the comparison and treatment online modules; one featured content-centered instructional strategies, the other learner-centered strategies. The treatment module incorporated interactive features from the Texas Information Literacy Tutorial (TILT) with content modifications appropriate to Internet training modules. A pretest-treatment-posttest experimental design was used to assess student achievement within and between two groups of 41 high-achievement eighth graders. Scores for comprehension and performance tests (a scavenger hunt) assessed students' retention and performance. Carey's (1994) Academic Motivation Profile (AMP) instrument was used to study

PAGE 10

student perceptions of the material on attention, relevance, confidence, and satisfaction. No differences between comparison and treatment groups occurred on comprehension. Mean scores across both groups increased from M = 58.97 to M = 72.63 (N = 41). A repeated measures ANOVA revealed a main effect, F(1, 39) = 40.233, p < .001. Both groups excelled on the scavenger hunt, with a mean of 92% (N = 41). The AMP revealed no significant differences between groups on attention, relevance, confidence, or satisfaction. The research confirmed previous findings by Schnackenberg (1998) that provision of high learner control to high-ability students proved sufficient for mastery of course content. When practitioners convert print materials for online delivery, considerations such as learner characteristics, validity of testing instruments, navigation, and elaboration, as well as practical considerations, are important to the success of the product. Replication with a heterogeneous audience would assist practitioners in making decisions regarding strategies for students of different ability levels.

PAGE 11

Chapter One
Introduction
Information Literacy is defined as the ability to access, use, evaluate, and generate information effectively (AASL, 1998). Converting print materials to digital formats is increasingly popular and requires that students know how to search automated catalogs, proprietary database sources, and Internet information. Users must be able to convert natural language to language that is compatible with electronic databases to conduct effective searches. Effective searches require: (1) refining research questions, (2) identifying keywords and their synonyms, (3) becoming familiar with terms that pertain to a research topic, and (4) differentiating useful electronic resources from those that are not. Internet access is now mandated within public and academic libraries and is used to conduct research. Lubans (1998), Deputy University Librarian from Duke University, reported that over 85% of the 235 college freshmen surveyed preferred searching the Internet over traditional library research. The state of Georgia holds that students need instruction on how to use the Internet effectively and requires all students to use Internet resources to support research projects throughout middle and high school (Georgia's Quality Core Curriculum, 2004). It is not clear that these skills are being taught, nor is it known whether students possess them. Broch's (2000) summary of works by Neuman (1995, 1997), Bilal (1998), and others reported research on novices' Internet search behaviors. Neuman (1995, 1997) studied high school students as they used databases to find information. She noted that students failed to apply language rules for database searches and did not understand the concept of controlled

PAGE 12

vocabulary. Neuman also found that when students lacked sufficient background information about research topics, they had difficulty choosing appropriate subjects and keywords for their searches. Bilal (1998) examined seventh graders who attempted search tasks with Yahooligans, a simplified subject directory for elementary and middle school students. The middle school students demonstrated poorly developed search skills when using keyword queries, as evidenced by misspellings and absence of Boolean logic. Boolean logic is the inclusion of the conjunctions "and", "or", and "not" alongside search terms in order to narrow or broaden search results. Students ultimately resorted to use of natural language for search fields.
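To make the narrowing and broadening effect of these operators concrete, the short sketch below evaluates Boolean keyword combinations against a tiny, made-up document collection. The sample documents and the boolean_search helper are illustrative inventions and are not part of the original course materials.

# Illustration of how Boolean operators narrow or broaden a keyword search.
# The "documents" below are hypothetical; a real search engine indexes the web.
documents = {
    1: "dolphin communication in the wild",
    2: "dolphin training at an aquarium",
    3: "whale and dolphin communication research",
    4: "bird communication and song learning",
}

def contains(text, term):
    # True if the single keyword appears in the document text.
    return term.lower() in text.lower().split()

def boolean_search(docs, required=(), optional=(), excluded=()):
    # AND terms narrow results, OR terms broaden them, NOT terms exclude documents.
    hits = []
    for doc_id, text in docs.items():
        has_required = all(contains(text, t) for t in required)
        has_optional = not optional or any(contains(text, t) for t in optional)
        has_excluded = any(contains(text, t) for t in excluded)
        if has_required and has_optional and not has_excluded:
            hits.append(doc_id)
    return hits

print(boolean_search(documents, required=["dolphin", "communication"]))        # AND narrows: [1, 3]
print(boolean_search(documents, optional=["dolphin", "bird"]))                 # OR broadens: [1, 2, 3, 4]
print(boolean_search(documents, required=["dolphin"], excluded=["aquarium"]))  # NOT excludes: [1, 3]

A student typing "dolphin AND communication" into a search engine is, in effect, requesting the first, narrowest set of results above.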

PAGE 13

Fidel and colleagues (1999) studied eight eleventh and twelfth grade students' use of resources in a high school media center. Students had access to Internet search engines and subject directories but received only cursory training on their use, no training on how to use an Internet browser, and little assistance from the media specialist or their instructor while conducting research. Summarizing qualitative data, Fidel concluded that students demonstrated impulsive search behavior, selected the default search tool associated with the Internet browser, input single words for search terms, and chose the first of many results without critically assessing the information. Students expressed general satisfaction with their search results even though they neglected to analyze research assignments, use Boolean logic, or identify key phrases. Students handled their frustration with a search engine by returning to familiar sites. The researchers concluded that students possessed only primitive search skills and failed to use web search tools effectively. Pitts (1994) identified determinants of decision-making for 26 high school researchers. She observed and interviewed students over the course of nine weeks to ascertain students' mental models of how the resource-rich environment of the media center could be used to accomplish their research task. Students were asked to research information from online databases to create a script for a videography project on marine science. Pitts learned that students had poorly developed information-seeking skills and demonstrated ignorance of using electronic tools for finding resources. Pitts argued that, "These students were not overwhelmed by too much information. Instead they were floating in a sea of information but did not know how to access more than a few useful drops" (p. 11). It is evident from this brief review of literature that adolescent searchers require instruction on how best to use Internet information resources. Media specialists have an imperative to provide students with this instruction but are confronted with a critical shortage of technical and human resources. Limited resources often mean that media specialists scan textbooks to convert content to websites in an effort to promote information literacy. Challenges to providing effective training include: (1) lack of technical skill, (2) limited access to students, (3) limited access to computer teaching labs, (4) scarcity of validated instructional materials to teach Internet search skills, (5) paucity of faculty members proficient in Internet searching, and (6) unreasonable expectations of media specialists; in schools with populations of over 1,000, one or two media specialists may service the entire student body. Questions arise as to how web tutorials can be designed and used most effectively. In an article entitled Beyond the Digital Fun-Factor, Glendinning (2002) acknowledges the "sexy appeal" (p. 90) of computers and multimedia resources for teaching. He continues, "While computers are now a mainstay of everyday life, teaching with them remains largely the domain of a few self-educated mavericks." Glendinning transfers text-based material for Internet access but argues that it is alarmingly tedious to convert print material to web

PAGE 14

delivery. He writes, "… a month of steady work last summer didn't even cover three weeks of my curriculum!" (p. 94). Rarely does an instructor rely solely on textbooks to meet curricular objectives. The trend is to use the textbook as a reference for student investigation; the teacher's role is to facilitate inquiry-based learning. A Best Practices high school in suburban Chicago (Daniels and Zemelman, 2003) illustrated how students and teachers use contemporary magazines and web-based articles to examine in-depth questions while using textbooks as reference material. Inquiry learning was promoted through interchanges between (1) students and peers, (2) students and material, and (3) students and teacher. The success or failure of this learning experience depended on resources, on the level of student engagement, and on interaction with the instructor. Teachers used textbooks to guide and manage student learning experiences. Typically, textbook material is enhanced with hyperlinks and some graphics, but little program-controlled interaction is created. An online textbook includes content presentation and may include end-of-chapter exercises to encourage learner exploration and illustrate constructs. The text is essentially linear, but students can choose a non-linear path to determine the pacing of the instruction. Feedback results as a natural consequence of the student's exploration. Like problem-based learning, a high degree of learner self-regulation is required to derive full benefit. The success or failure of web-based textbooks may largely depend on the learner's prior familiarity with the subject matter as well as their metacognitive abilities and motivation (Meyer, 2003).
Statement of the Problem
This study investigated the problems that often arise for learners as a result of instructors' lack of skills and the instructional design when instruction is delivered via the

PAGE 15

web. When instructors place content on the web and convert face-to-face instruction to distance learning, interactions within the learning environment radically change. Problems can occur for the learner depending on the instructor and characteristics of the learner. Should the instructor lack skills to provide program control of guided practice and feedback, the degree of practice is often shaped through the experience of the individual learner. If the learner is self-directed, one may likely anticipate that he/she will experiment independently and take advantage of hyperlinks within the instruction. The degree of the learner's maturity and self-regulation is an important factor for consideration when designing elements for guided practice and feedback. Should the learner lack prior experience, knowledge of concepts, or the ability to self-regulate, a constructivist approach, in which guided practice and feedback result from the learner's own ability to experiment and develop his/her own strategies to master the material, may prove insufficient (Meyer, 2003). Two fundamental design questions emerge when media specialists attempt to implement web-based designs for information literacy training: (1) Will media specialists focus on delivery of content and follow cognitive principles of learning, or (2) will the instructor emphasize a learner-centered approach whereby the materials set the stage for the learner to invent his/her instructional strategy via exploration or experimentation? Specifically, the study compares content-centered instruction with learner-centered instruction delivered on the web and its effects on student performance and perceptions about the instruction received. A content-centered approach implies that learning takes place via transmission of information and/or presentation of constructs with examples and non-examples. The learner is guided through exercises, interacts with the material, receives immediate feedback, and is given a summation of the material as it relates to previously learned content. The designer prescribes learner strategies through delivery of the content and

PAGE 16

practice-feedback sessions throughout the learning module. Often, navigation and elicitation of performance and feedback are program controlled and not left to the discretion of the learner. The content model resembles a school of thought best illustrated by Gagné (1985). The learner-centered model embraces a constructivist model. Constructivists believe that learning takes place as a series of interactions or experiences, presented by the designer, intended to facilitate the process of constructing meaning from experience. In contrast to a content-centered focus, the learner maintains control of his/her learning objectives and strategies for acquisition of content. Feedback is not dependent on the instructional program or teacher; rather, it results as a natural consequence of the learner's experimentation with the constructs presented through authentic experience. Research is needed to determine what strategies influence performance outcomes for learners engaged in e-learning for information literacy (Hirumi, 2002). Therefore, the researcher examined the effects of web-based instruction on student performance and perception.
Purpose of the Study
The purpose of the present study is threefold. The first purpose is to examine students' performance on two forms of Internet search skills instruction converted from a textbook for web-based delivery. The second purpose is to examine effects on students' academic motivation of two forms of web-based instruction that afford higher or lower levels of learner control. The third purpose is to document the design process used to convert textbook material to web-based instruction. The two instructional strategies were designed to examine performance differences between two forms of converted online text for an extant Internet search training course.

PAGE 17

The strategies are: (1) A content-centered form of instruction, which features a high degree of program control. The instructor chooses the sequence, pace, and amount of practice to ensure the student masters the skills intended. This form includes features found in classroom instruction such as gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes that inform the learner of his/her progress, and summary screens that relate new content to previously learned material. (2) A learner-centered form, which features a high degree of learner control. The student chooses the sequence, pace, and amount of practice. This form is typical of web-based instruction, includes a menu structure and suggested practice exercises, and is less prescriptive. The students choose the instructional strategy to master the course objectives. The second purpose was addressed by examining the perceptions of high-ability students toward the presented material. This was done to discern whether students' perceptions differed between the two forms of online instruction. The online instruction is based on Keller's (1987) ARCS theory: attention, relevance, confidence, and satisfaction. A self-report instrument developed by Carey (1994), the Academic Motivation Profile (AMP), was modified for the present research to measure attitudinal scores on each of the four factors. Documenting the design process used to convert textbook material to web-based instruction addressed the third purpose of the study. The researcher considers whether the additional time and effort to incorporate the features found in classroom instruction mentioned above are warranted based on performance and perception outcomes.
Dependent Variables
This study examined the effect two instructional design strategies had on two performance assessments and student perception. The performance assessments consist of a comprehension test and an Internet scavenger hunt. A pretest-treatment-posttest

PAGE 18

experimental design was used to assess within- and between-group differences on the comprehension test of knowledge of Internet search strategies. Following instruction, students took a second performance test in the form of an Internet scavenger hunt. Scores between groups on the scavenger hunt were compared to determine whether the instructional strategy proved to be a benefit or a detriment to the students. Further, students' perceptions of the instruction were compared using a modified form of Carey's (1994) Academic Motivation Profile (AMP).
Independent Variables
Two instructional design strategies are compared: (1) A content-centered form of instruction, which features a high degree of program control. (2) A learner-centered form of instruction, which features a high degree of learner control. The researcher used content from an extant course on Introductory Library Research and Internet Skills by Frederick and Smith (2000) and converted the material to an online format. Reigeluth's (1996) Elaboration Theory model and formative research informed the researcher's decisions concerning the two instructional strategies. Reigeluth concluded:
Elements should be sequenced from simple to complex.
A precise overview of theoretical and/or procedural information should be provided in the form of an epitome, defined by Reigeluth (1995) as overview screens that provide context for the exercises that follow.
Sequential steps for procedural knowledge should be compared and contrasted during exercises with reference to previous material.
Problem-based instruction should include differentiation between extraneous information and required information for task performance.

PAGE 19

The two instructional designs differ in terms of navigation and extent of learner control versus program control for guided practice and feedback. Differentiation between treatment and comparison groups was based on Schnackenberg and Sullivan's (1998) research findings. Comparing the effects of two forms of computer-based instruction on competency-based education, Schnackenberg found that students assigned to fully program-controlled software (treatment group) performed better than students assigned to the lean version (comparison group). The full version controlled the sequence and practice the students received during training, while the lean program condition provided learners a choice of the extent of practice following examples. The researcher also divided participants into ability levels and found that, overall, high-ability students performed better than those of less ability in both treatment conditions. Schnackenberg's (1998) research indicated that greater reliance on Internet resources for classroom instruction afforded students greater learner control over the pace and extent of practice. She recommended that the issue of learner control versus program control for navigation, guided practice, and feedback be revisited.
Common Features of Two Instructional Modules
Vertical menu structure with visual prompts to alert learners to their progress throughout the module.
Graphical organizers and illustrations throughout the narration provide visual models of the content.
Overview screens provide context for the exercises that follow.
Clear definitions with pop-up hyperlinks within the narratives provide assistance.

PAGE 20

Reference to previously learned material for retention and transfer of knowledge is contained in each of the narratives.
Differences Between the Two Instructional Strategies
Flash screens intended to gain the learner's attention were used in the treatment condition but were absent in the comparison module.
Guided practice and feedback in the comparison condition were accomplished through hyperlinks and suggested exercises. The treatment module controlled student navigation through several illustrative exercises and provided program-controlled feedback throughout the practice.
The treatment included review screens and interval quizzes to provide feedback on the learner's progress. These screens were absent in the comparison condition.
A final game adapted from the University of Texas' Texas Information Literacy Tutorial (TILT), called Library Squares, was added to the treatment condition. Its intention was to reinforce transfer and retention of information prior to final testing.
Research Questions
The main purposes of this study were to examine the effects two online instructional strategies have on student performance, to examine students' perceptions of the two forms of online instruction, and to document the design process used to convert textbook material to web-based instruction. The three research questions driving this study are:
1. What effect do two online instructional design strategies for Internet training, characterized by their content-centeredness or learner-centeredness, have on student performance measures?

PAGE 21

2. How do students' perceptions, based on self-reports, differ on attention, relevance, confidence, and satisfaction between two instructional strategies characterized by their content-centeredness or learner-centeredness?
3. Is the additional time and effort needed to include the treatment module features found in classroom instruction (gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes, and summary screens) efficacious, given the performance and perception results of this study?

PAGE 22

Definitions:
Ability levels – refer to the random assignment of students in this study using matched pairs to physically control the covariate, which was the students' pretest score.
Boolean logic – the inclusion of the conjunctions "and", "or", and "not" used to narrow or broaden search results.
Computer-Assisted Instruction (CAI) – refers to drill-and-practice, tutorial, or simulation activities offered by themselves or as supplements to traditional, teacher-directed instruction.
Constructivism – refers to the idea that learners construct knowledge or meaning for themselves as learning takes place.
Digital formats – delivery of information using various standards expressed in numerical form, especially for use by a computer.
Electronic resources – digital web resources used to conduct research.
Epitomes – named by Reigeluth (1996), are overview screens which provide context for the instructional exercises that follow.
Hyperlink – element in an electronic document that links to another place in the same document or to an entirely different document. Hyperlinks are the most essential ingredient of hypertext systems, including the World Wide Web.
Hypermedia – an extension to hypertext that supports linking graphics, sound, and video elements in addition to text elements. The World Wide Web is a partial hypermedia system since it supports graphical hyperlinks and links to sound and video files.

PAGE 23

Internet-based instruction (IBI) – instruction delivered via the web. The IBI research that has been done thus far has focused on three general categories: a method of instructional delivery; human behavior based on educational theory, such as motivation theory, instructional design theory, and more; and technology in teacher education.
Information literacy – the ability to effectively access, use, evaluate, and generate information.
Instructional strategy – how one applies the methods the student will encounter to acquire the course objective.
Internet browser – software application used for displaying HTML documents and other WWW documents. The two most popular are Netscape and Internet Explorer.
Inquiry-based learning – implies involvement that leads to understanding. Involvement in learning implies possessing skills and attitudes that permit you to seek resolutions to questions and issues while you construct new knowledge.
Metacognition – broadly defined, the study of how humans think about and control their own thought processes.
Multimedia – the use of several different media to convey information (text, audio, graphics, animation, video, and interactivity); often refers to computer media.
Natural language – human language; for example, English and Chinese are natural languages. Computer languages, such as FORTRAN and C, are not.
Online databases – collections of information organized so that a computer can quickly access requested data. Like a traditional file cabinet, databases are organized by fields, records, and files.

PAGE 24

Print materials – materials distributed on paper.
Problem-based learning (PBL) – widely thought of as both a curriculum and a process. The curriculum consists of designed problems that demand that learners acquire knowledge, problem solve proficiently, self-direct their learning, and participate in teams to develop skills. The process replicates those encountered in life.
Search engines – programs that search documents for specified keywords and return a list of the documents where the keywords were found. Search engines enable users to search for documents on the World Wide Web and USENET newsgroups.
Variables identified in this study:
Dependent – consist of a measure of student perceptions and two student performance measures: a knowledge assessment and an Internet scavenger hunt.
Independent – consist of two instructional designs: a content-centered form of instruction and a learner-centered form of instruction.
World Wide Web (WWW) – system of Internet servers that support documents formatted in a markup language called HTML.
Yahooligans – a digital subject directory simplified for elementary and middle school students.
Limitations
Limitations of this study include threats to internal and external validity. Internal validity threats include: (1) history, or replication of the pretest following the posttest without ample time allowed between administrations of the instruments; (2) testing, where the pretest

PAGE 25

alters posttest responses and potentially negates the treatment; and (3) instrumentation error due to low reliability or content validity of the tests, and a potential order effect resulting when the pretest and posttest follow the same order of questions. Threats to external validity included interaction between selection of the sample and the treatment. Characteristics of the high-ability students include self-regulation and a desire to achieve knowledge and skills. The students were attracted to the instruction in both conditions. One cannot generalize beyond the local site because the researcher's relationship with the selected sample may have influenced students' receptiveness to the treatment. Statistical limitations include the low reliability scores for both the comprehension and performance instruments, resulting in high standard error. Sample size was relatively small (N = 41) given that alpha was set at .05. Replication of the study with larger numbers would increase power, and the researcher could have computed sample size based on an effect size of .80 and alpha at .05 (an illustrative version of this calculation is sketched below). For this study, the two groups should have comprised 30 students per group to detect the targeted effect size. The fact that both comparison and treatment groups excelled, and that the sample comprised high-achieving middle school students, may be attributed to learner characteristics being correlated with the sought outcomes of the dependent variable.
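As a concrete illustration of the sample-size reasoning above, the sketch below performs an a priori power analysis for a two-group comparison. Only the effect size (.80) and alpha (.05) come from the limitations discussion; the two-sample t-test framework, the two-sided test, and the target power of .80 are assumptions made for this illustration, and statsmodels is used simply as a convenient calculator.

# Hedged sketch of the sample-size calculation described in the limitations above.
# Assumed here (not stated in the chapter): a two-sample t-test framework,
# a two-sided test, and a conventional target power of .80.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.80,        # effect size named in the limitations discussion
    alpha=0.05,              # significance level named in the limitations discussion
    power=0.80,              # assumed conventional power target
    ratio=1.0,               # equal group sizes
    alternative="two-sided",
)
print(f"Required sample size per group: {n_per_group:.1f}")
# Under these assumptions the result is roughly 26 students per group;
# a higher power target raises the requirement toward the 30 per group noted above.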

PAGE 26

Conclusion
The appeal of web-based instruction is that it offers a convenient vehicle with which to teach large numbers of students with limited instructor resources. Problematic, however, is that web-based interactive modules are not easily created. Time resources and technical skills are not readily available at school sites. Consequently, an expedient means of delivering textbook/training material to large numbers of classes without intensive effort on the part of media specialists is to convert text material to the web via straightforward text-based websites. Merely placing web instruction in a format conveniently available to large numbers of students does not ensure that the instruction is effective. Success with web-based learning may depend on the characteristics of the learner. Meyer (2003) contends that a student's prior knowledge, as well as his or her learning style on a continuum of initiative and passivity, may predict the success or failure of web-based learning. A self-directed learner may require different instructional strategies than a student with low self-regulatory learning skills.

PAGE 27

Chapter Two
Review of Literature
Introduction
The topics chosen for this review of literature are intended to give the reader background and insight into educational issues concerning web-based instruction (WBI). The discussion is presented in three main sections that outline broad-based issues and then address issues from a narrower focus. Specific topics include: features associated with sound instructional design for online training environments, issues faced by educators when textbook material is converted to an Internet format, a theoretical analysis of instructional components necessary for successful online learning, and perceptual issues that influence a learner's receptivity to instructional material. Educational technology as information technology is clearly developmental in nature. The proliferation of micro-processing technology and the convergence of telecommunications and computing have led to the digitization of information in all arenas, including education. The Information Age features a shift from linear to interactive media, a broadening concept of literacy, a merging of information processing, the regulation of new technologies, and the relationship between available information and its effective use (Saettler, 1990). Empirical studies on the impact WBI has on learners, teachers, and curricula are few because WBI is still in its infancy. Indeed, there is a paucity of empirical research that identifies features of online instruction that influence performance outcomes. In this study, content-centered versus learner-centered designs and the relationship of learner control versus program control are examined for their effect on student performance.

PAGE 28

The WBI research that has been done thus far has focused on three general categories. First is a method of instructional delivery, i.e., online distance learning. Second is human behavior based on educational theory, such as motivation theory, instructional design theory, and others. Third is technology in teacher education. This study explores instructional delivery, design methods, and the behavioral aspect of educational theory denoted by performance and perception measures (Zucker, 1998). The Internet is a new frontier in which computers are considered tools to learn about and to learn and teach with. Technologists and educators work to improve instructional outcomes, design, and aesthetics to produce optimal learning.
Instructional Design Models for Online Tutorial Development
Many researchers have contributed to the body of literature dealing with the first research question, "What effect do the two online instructional design strategies for Internet training have on performance measures?" The researchers featured in this study include: Alessi and Trollip (2001), Schnackenberg and Sullivan (1998), Chung and Reigeluth (1992), Hirumi (2002), Northrup and Rasmussen (2001), Carlson and Repman (1999), Biggs (1996), Bowden and Marton (1998), Laurillard (1993), and Williams (2002). Well-designed tutorials feature well-written performance objectives, reference to prior learning, frequent interaction following or prior to presentation of information, a variety of question types to assist with maintaining the learner's attention, feedback, response prompts, clear organization of information presentation, and well-designed navigation (Alessi and Trollip, 2001). Alessi and Trollip write that good tutorials require succinct, clearly written performance objectives to guide and motivate the learner through the sequence of activity described above. Tutorials that are well designed stimulate the learner's recall of prior

PAGE 29

knowledge through narratives and use of metaphors, analogies, and direct reference to prior learning. Tutorials that incorporate numerous well-placed questions and require the learner to provide input/action prove to be more effective than a more passive learning environment. They maintain that questions sustain the learner's attention, facilitate internal processing and reflection, and provide the learner self-assessed feedback on her/his progress (Alessi and Trollip, 2001, p. 94). Response prompts and well-placed cues assist the learner in navigating and processing the material. The tutorial in this study includes graphical prompts for navigation and directions for interactions, and provides ample corrective feedback and positive reinforcement. Tutorial navigation can be either linear or branching. Linear designs are the simplest and most direct. They follow one of two sequences of presentation. One is hierarchical in the presentation of skill sets and builds on prior learning from one module to the next. The other is presented from simple to complex. Branching tutorials permit the learner decision points in the navigation, can provide remedial information if a student commits errors within a practice session, or permit the learner to exit and return later to the program. Branching permits the learner to skip sections of material and return to various sections via menus or forward and back navigation buttons. Instructional tutorials for Internet search training require the learner to apply rules and principles for searching. Alessi and Trollip (2001) cite two methods of conveying this information: (1) rule-example, or (2) example-rule. Rule-example provides the learner information about a search strategy followed by an example. The program elicits a response and guides the learner through an exercise, providing corrective or reinforcement feedback. The second method, example-rule, shows the learner an example and prompts the student to infer the rule from a series of practice-feedback exercises. The latter method is more

PAGE 30

representative of a learner-centered approach, while the rule-example method describes a content-centered design. The example-rule method relies on the learner's intuitive and analytical abilities to infer the rule from experience. Arguments can be made to favor either rule-example or example-rule methods, but most educators prefer rule-example in that it requires less from the instructional designer and economizes the learning process (Alessi and Trollip, 2001, p. 123). They further state that successful multimedia designs include four phases: (1) presenting or modeling information, (2) guiding the learner through initiation of the material, (3) encouraging the learner to practice the concepts presented to increase learning retention, and (4) conducting learning assessment. Information is presented and the learner is informed of the objectives and purpose of the instruction. Following elaboration of information (presentation of examples and non-examples), the learner performs some kind of interaction: the click of a button, answering a question, choosing a path for more information, etc. The program provides the learner feedback aimed to correct, inform, or praise the student and to reinforce or correct comprehension and/or performance. More information may be presented, followed by learner interaction, feedback, and additional information, until the program ends with a summary of what has been learned. The present study documents conversion from print instruction to web presentation. A lean-plus design is featured, affording learners nonlinear movement to various topics in the program. The learner's degree of choice has an impact on motivation and the ability to sustain attention (Alessi and Trollip, 2001, p. 126). This study examines the effects on performance when students are allowed to choose the sequence and practice from modules on Internet training topics from a table of contents. The tutorials follow the rule-example protocol in both treatment and comparison conditions. Each topic within the table of

PAGE 31

contents presents an overview or rule, followed by practice under learner control in the comparison condition and under program control in the treatment condition. The treatment condition controls the guided practice following a rule statement and does not permit the learner to return to the main menu until completion of an exercise and a summary screen. The navigational design and instructional strategy, whether program controlled or learner centered, may have an impact on sustaining attention and motivation throughout the learning process. The latter issue is addressed in the second research question.
Learner control versus program control and achievement. Learner control is defined as the learner's ability to control learning events (Schnackenberg, 1998). Allowing learners to control the pace, sequence, and navigation within an instructional program is founded on the idea that learners are best able to evaluate their instructional needs and devise their own strategies to fulfill them. Learners respond favorably to instruction that affords learner control of the pace, sequence, and depth of instruction. There is an assumption that, when given free rein, the learner will demonstrate stronger motivation toward the material and thereby produce higher outcomes. However, Chung and Reigeluth (1992) contend that granting learner control produces "inconclusive and … more frequently negative" outcomes; with the advent of the World Wide Web, however, there exist gaps in the literature concerning the issue of control in a hypertext environment. They suggest that research on learner control and achievement is equivocal and is more often negative when learners are given greater control of the instruction. The researchers assert that the reason greater learner control may lead to lower achievement is learner characteristics: lower-achieving students lack the ability to make decisions on the pacing, sequence, and amount of practice afforded them.

PAGE 32

Reigeluth (1996) defined a set of learner control strategies based on the Conditions-Methods-Outcome model. Three variables influence the designer regarding learner control: (1) instructional outcomes, (2) instructional conditions, and (3) instructional methods. Four factors influence outcomes: (1) accuracy, sometimes referred to as error rate; (2) speed, as it relates to efficiency of learning; (3) the ability to transfer information; and (4) the ability to retain information over time. Instructional conditions refer to learner characteristics, objectives or domains of learning, and learning systems such as computer-assisted instruction (CAI) and multimedia environments. Instructional methods encompass decisions based on sequence and selection of examples and non-examples, content summarization, and the learner's synthesis of the material. Chung and Reigeluth (1992) provide a prescriptive model for learner control based on content, sequence, and pace. Learner control may be granted under the following circumstances:
1. Students have previous content knowledge of the material.
2. Students have high-ability learner characteristics.
3. The probability of success is high regardless of whether one affords control to the learner or restricts control.
4. Higher-order skills are being taught, compared to the verbal information level (rote memorization, drill and practice), and students are familiar with the content (Hannafin and Peck, 1988).
5. One should NOT afford learner control when mastery of the material is dependent on a sequence of hierarchical skills.
Control of sequence may be afforded under the following conditions:
1. When presentation of instruction does not require any particular order.

PAGE 33

2. Students are familiar with the content of the instruction and able to make choices over the sequence of presentation.
3. Students of high ability and familiarity with the subject matter may be granted greater learner control.
4. If the learning is problem-based, permitting students to select a sequence facilitates synthesis of the material (Gagné, 1985).
5. Prior knowledge of the content permits learners to control instructional sequences (Mager and Clark, 1963).
Learner control may be given over the pace of instruction when:
1. Materials need to be relevant to students' needs. Granting depth of exploration and additional time spent on an area of interest to the learner increases attention and motivation.
2. Students believe that spending additional time will increase their achievement.
3. Individualized or self-paced instructional platforms require learner control.
4. Students benefit from additional time to integrate new information with already acquired material.
5. Coached practice may increase achievement and sustain attention, thereby reducing instruction time (Campbell and Terry, 1963).
Special considerations for hypermedia and learner control:
1. Provide guidance and objectives for low-ability learners as well as a default sequence of information presentation for the content.
2. Provide graphical cues for navigation and a form of map to let the learner know where they are within the instructional program.

PAGE 34

3. Use audit trails, a graphical cue to show where the learner has been previously within the program.
4. Set standards for screen design.
5. Permit learners to make conceptual links within the framework of personal information management systems.
6. Present information in an overview rather than in depth for presentation systems.
7. Inform the end user of his/her location within the program.
8. For navigation settings, provide a standard means of conveying topics of information.
9. Permit the user to close windows without exiting the program.
10. Build in an "undo" function.
11. Provide a continual help system (Kinzie and Berdel, 1990).
Schnackenberg (1998) made a distinction between learner control and aptitude versus learner pre-instructional knowledge. The author limited her review to those studies where normative aptitude measures were used to assess the relationship between learner control and ability. A lower-achieving student may not have domain knowledge and therefore has less ability to self-regulate. Chung and Reigeluth (1992) found that low-achieving students were unable to self-regulate and diagnose their learning needs. They failed to generate effective learning strategies when they encountered material with a large range of learner control options. These low-achieving students benefited from a more structured, less lenient, program-controlled form of instruction for computer-based learning applications.

PAGE 35

Steinberg (1989) found the opposite to be the case for high-achieving students in a literature review on learner versus program control and learner characteristics. Steinberg asserted that high-ability students perceived rigid learner control as a hindrance. Perhaps a more apt question was whether the amount of support afforded learners depended on topic knowledge, familiarity, and aptitude. Steinberg supported the notion that when learners were informed of their progression, the program acted similar to a coach and provided learner advisement. The high-ability learner made cogent decisions about whether to engage in practice, repeat a particular section, or consider the unit complete. Schnackenberg and Sullivan's (1998) research was based on a sample of 202 undergraduate teacher education students placed randomly into four groups based on aptitude scores on the SAT and ACT. Subject matter focused on competency-based educational practices. Four versions of a computer-based program were developed based on two levels of learner control, high learner control and full program control, and two levels of program presentation, lean and full. The instructional program converted text from Teaching for Competence (1983) by Sullivan and Higgins, adapted for an interactive computer-assisted program done in HyperCard for Macintosh computers. Information, examples, reviews, and summaries were identical for all four versions of the instructional program. Practice items were written as multiple-choice, single-correct-answer responses. Program-controlled versions forced the participant to move sequentially through all screens in their view, while in the learner control versions the first of four multiple-choice practice items with feedback was mandatory and under program control, after which the learner could opt to continue or move to another topic. Following training, students completed a 36-item multiple-choice paper-and-pencil test. Reliability statistics using a Kuder-Richardson formula resulted in a reliability statistic of .78.
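For reference, the Kuder-Richardson formula most commonly reported for dichotomously scored multiple-choice tests is KR-20; the chapter does not state which variant Schnackenberg and Sullivan applied, so the formula below is shown only as an illustration:

r_{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^{2}}\right)

where k is the number of test items, p_i is the proportion of examinees answering item i correctly, q_i = 1 - p_i, and \sigma_X^{2} is the variance of the total test scores.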

PAGE 36

Results indicated that the full program scored higher than the lean version and that high-ability students scored higher than those of lower ability. More time was spent viewing material in the full version of the program, resulting in a greater number of practice exercises. Posttest achievement did not differ by type of control, either learner-controlled or program-controlled instruction. A thirteen-item attitudinal test that used a Likert scale was used to assess learner preferences. No reliability data were reported in the study. Students indicated that they preferred the full program to the lean because it "afforded them time to complete more practice exercises". The fact that the subject matter, competency-based instruction, lent itself more to face-to-face instruction than to computer delivery came as little surprise. When students were asked if they preferred computer-based versus classroom training on the subject, 51% preferred classroom-based instruction. Seventy-seven percent of the participants responded that they would prefer competency-based instruction or another subject without computer instruction, in contrast to 23% who preferred to use computer-based instruction. Schnackenberg and Sullivan (1998) found that when they blocked on ability, measured by a standardized aptitude instrument, they discovered no aptitude interaction effect. Lower-achievement students performed no better or worse when provided high learner control versus low learner control. Achievement outcome measures indicated that students' performance was essentially equivalent regardless of whether they experienced higher or lower learner control conditions. Schnackenberg concluded that there were strong effects of ability on achievement under both high and low learner control. High-ability students who were provided high learner control and those experiencing high program control outperformed low-achieving students given both full and lean versions of the software.

Greater reliance on web-based delivery systems that afford students high learner control regardless of ability level requires educators to revisit the question of ability and learner participation for WBI. Quantitative and qualitative research is needed to assess how students of high ability perceive the instruction and perform when provided with differing levels of learner versus program control for web instruction. Schnackenberg (1998) suggested research is needed to gain insight into the thought processes present as students encounter learning decisions during WBI.

Elaboration Theory of Instruction (ETI), conceived by Reigeluth (1996), sought to show increases in comprehension and motivation as a result of providing learner control and sequencing instruction based on the type of learning; e.g., information skills, cognitive, or procedural knowledge. The theory draws a distinction between task and content knowledge. Content knowledge, based on conceptual learning, prescribes a sequence of principles called the conceptual elaboration sequence. The theoretical elaboration sequence is a type of sequence aimed at task knowledge such as solving algebraic equations, creative writing, outlining, or note-taking, and focuses on process-based skills. To accomplish objectives for task knowledge, Reigeluth offers the Simplifying Conditions Method (SCM) based on two conditions: (1) procedural knowledge, where the learner follows a prescribed strategy, or (2) a heuristic task, where the learner uses causal knowledge, conditional statements such as if this condition exists, then I (the learner) should do … Two Simplifying Conditions Methods include the procedural SCM sequence and the causal SCM sequence. The theoretical SCM sequence moves from the most basic observable principles to the most complex and detailed principles. The initial lesson, termed the "epitome," describes the most fundamental and generalizable principles, taught at the concrete operations level but in the context of real world situations.

The procedural SCM sequence, based on task or process-based knowledge or a set of skills, assumes that complex cognitive tasks such as those that involve problem-solving are completed under different circumstances. Some of these procedures represent simpler tasks that are taught first, and instruction builds from simple to complex. The learner builds confidence as the desired competence is achieved. Both sequences use concrete, real-world versions of tasks.

Reigeluth (1996) conducted a qualitative formative study, divided into two phases, to identify strategies to apply his sequence theory. The study involved a group of thirteen sophomores from Indiana State University enrolled in the Electronics and Computer Technology program. The instructional program converted text from the Introductory Circuit Analysis textbook by Boylestad (1990) to a computer-assisted program. For the first phase, Reigeluth conducted individual "talk aloud" interviews while the ten voluntary students completed a computer-based program on electronic circuit analysis. The students were designated by ability level based on GPA: three high ability, three average ability, and four lower ability. Students were asked to comment on each screen of the HyperCard program, and comments were tape-recorded by the investigator. Stratified sampling was used to ascertain whether qualitative feedback differed according to ability level. The programs used material from the textbook but differed on theoretical or procedural sequences. Additional comparisons were made for both procedural and causal sequences. The computer program used varied sequences to ascertain students' reactions to different levels of elaboration on the units. Two levels of Simplifying Conditions Methods were included for each of the three instructional units: a theoretical SCM and a procedural SCM. Six assessment instruments were included: (1) two pre-tests, (2) a posttest, (3) a set of impromptu questions during the

interactive phase, (4) a set of questions for debriefing, and (5) an attitude survey. The pretests measured prior content and prerequisite knowledge. Only students who scored 30% or below on content knowledge participated, which ensured that only novice students took part in the study. The post-test was administered with an affective survey following instruction so that students could provide feedback on their attitudes toward the instruction; the posttest also allowed objective evaluation of the modules' effectiveness. The attitude survey asked participants to circle responses on an ordinal scale of 1-5 regarding the appeal of the instruction and the student's individual attitude toward the instruction.

The second phase used three additional students, for a total of N=13. For Phase II, three students completed instructional material revised from comments from Phase I without the presence of an investigator. Following instruction, during a debriefing, students were asked to describe the method they used to navigate the module. Typed responses taken from audio transcripts were presented to students the following day so they could verify their input or modify their responses.

Reigeluth (1996) reported data from both phases of the research and used a prewritten set of debriefing questions to ensure consistency of the feedback. Data were coded according to lesson number and phase number, and responses from low, average, and high sub-groups were divided within a matrix of responses. Students were asked to comment on (1) how distinctions in the modules were illustrated from one procedure to another, (2) how procedural steps were explained, (3) whether the numbers were easy to calculate, and (4) whether the material induced uncertainty. The investigator reported mean scores of high, average, and low ability participants on the posttest measure. Score means were 95.0 for high ability, 93.7 for average, and 84.1 for low ability participants. Though not an empirical

study, Reigeluth (1996) had previously examined scores for 189 students whose average was 77.3% on the criterion-referenced posttest. Qualitative data supported use of the epitome, sequencing from simple to complex, for all ability groups and both phases of the formative research. Reigeluth enumerated the following suggestions concerning sequencing and elaboration theory.
1. For procedural information, relate previous sequences to new ones and distinguish common elements.
2. Provide labels beside conditions so that students gain optimal learner control. Labeling allows students to identify and categorize problems.
3. If the knowledge is based on a problem-based situation, assist the learner to distinguish between extraneous information and information relevant to the solution.
4. Reiterate at the macro instructional level by presenting problems under the identical condition and sequence the amount of complexity.
5. Epitomes teach small numbers of ideas at the application level, help the learner gain an overview of the procedural and theoretical, and focus them on essential material.

E-learning interactions. Differences between instructional delivery in the classroom and on the web are apparent. Classroom teachers provide non-verbal, spontaneous reactions that elicit student-teacher and peer interaction (Hirumi, 2002). E-learning interactions require careful planning and a theoretical basis for sequence and presentation of learning opportunities. Hirumi (2002) describes four major categories of computer-mediated interactions: (1) communication, (2) purpose, (3) activities, and (4) tool-based taxonomies. The focus of this study is on learner-instruction interactions, those that engage learners in activity, with emphasis on student-to-management interactions (feedback), the last classification listed by Northrup and Rasmussen below. Carlson and Repman (1999) classify

e-learning interactions into the following categories: (1) questioning, (2) feedback and clarification, and (3) control of pace and sequence of presentation (p. 142). Northrup and Rasmussen (2001) define four classes of interaction: (1) student to student, (2) student to instructor, (3) student to instructional materials, and (4) student to management (feedback) interactions (p. 142).

The theoretical basis for design of management interactions depends on the perspective of the designer, whether one follows a content-centered (behavioral) or learner-centered approach. If an instructor believes that activity-based interactions inspire engagement with the material to foster critical thinking and reflective information sharing, he/she may design navigation with high learner control in mind. The designer may offer learner control over sequence and presentation of material. The opposite is true if one believes that learners require embedded practice and corrective and reinforcement feedback in order to develop higher level thinking skills.

Hirumi (2002) proposes three levels of e-learning interactivity. Level I interactions are those experienced personally within the learner, coined learner-self interactions. Level I interactivity supports a learner-centered model of interactivity in which the student is presented opportunities for exploration and experimentation; it includes cognitive operations and self-regulation or metacognitive processes. Learners who take advantage of lenient navigation and sequencing of material are those capable of self-initiated strategies and diagnosis of learning gaps. Typically these learners are self-initiators and highly self-regulated, a trait highly desirable for distance learners. Level II interactions include learner-instructor, learner-learner, and learner-content interactions. Level III interactions are defined as learner-instruction interactions and involve "…a deliberate arrangement of events to promote learning and facilitate goal achievement.

Learner-instruction interactions are differentiated from Level II and Level I interactions to illustrate how theoretically grounded instructional strategies may help distance educators design and sequence planned e-Learning interactions" (Hirumi, 2002, p. 148). Hirumi asserts that when instructors post classroom-based materials or text in web form, they often overlook planning for e-learning interactions and fail to ground interactivity in pedagogical theory. Table 1 below provides a framework for grounded instructional strategies. Two of the frameworks are directly relevant to the materials developed for this study.

Table 1
Grounded Interactive Strategies (Hirumi, 2002, p. 149)

Nine Events of Instruction (Gagné, 1985)
1. Gain Attention
2. Inform Learner of Objective(s)
3. Stimulate Recall of Prior Knowledge
4. Present Stimulus Materials
5. Provide Learning Guidance
6. Elicit Performance
7. Provide Feedback
8. Assess Performance
9. Enhance Retention and Transfer

Direct Instruction
1. Orientation
1.1. Establish Lesson Content
1.2. Review Previous Learning
1.3. Establish Lesson Objectives
1.4. Establish Lesson Procedures
2. Presentation
2.1. Explain new concept or skill
2.2. Provide visual representation
2.3. Check for understanding
3. Structured Practice
3.1. Lead group through practice
3.2. Students respond
3.3. Provide corrective feedback

4. Guided practice
4.1. Practice semi-independently
4.2. Circulate, monitor practice
4.3. Provide feedback
5. Independent practice
5.1. Practice independently
5.2. Provide delayed feedback

As part of England's higher education initiative, the Courseware for History Implementation Consortium (CHIC, 1998-2001) examined the effect of including websites as supplements to traditional textbook and lecture formats on student performance in college courses. Two other instructional strategies were compared: (1) online seminars with cooperative learning groups and (2) integrated cooperative online learning with face-to-face instruction. An important insight emerged from Hall's (2002) research: mode of delivery was not at issue; rather, the differences in curricular design and student participation with peers, instructors, and materials were key to successful web-based designs.

Biggs (Swartz, 1999, p. 11) described key features of effective interactivity: "Learning is the result of the constructive activity of the student. Teaching is effective when it supports those activities appropriate to understanding the curriculum objectives." The activities enable learners to demonstrate their understanding, and instructional strategies support the student's ability to fulfill course objectives. Thus, the student perceives the process as relevant to their learning needs. One strategy is to promote harmony (Hall, 2002, p. 151): provide an overview of the entire process so the student may relate the information as

important for personal growth and knowledge. A designer promotes deep-level understanding of the material when students are encouraged to be proactive. Instructional designs require pre-assessment of entry skills so that new information relates to what is already acquired. Additionally, opportunities for peer and tutor support are essential for students to acquire a full understanding of the learning task. Learners may not be invested in the instruction when content and activity are highly controlled through program constraints (program control).

Educators at all levels are making efforts to distribute instruction to mass audiences through web-based courses. These sites typically include features that promote open, learner-controlled forms of navigation, de-emphasizing the importance of the teaching method (Bowden and Marton, 1999). Expecting learners to make sense of disparate, unrelated sources of information that feature no guidance predictably leads to learner confusion. If a logical course structure with high learner control and access to the instructor as facilitator is provided, students are more likely to demonstrate mastery of the material.

Current strategies that convert text to the web run the risk of creating encyclopedic volumes of unrelated reference material. Information intended to supplement content-centered instruction without opportunity for exploration and feedback fails to promote communication and, ultimately, mastery of learning objectives. Text-based websites without interactivity omit important facets of the teaching process: discussion, interaction, adaptation, and reflection, all primary ingredients for successful technology integration (Laurillard, 1993).

According to Hall (2002), website development must meet learner needs and support learning objectives, teaching processes, and learner outcomes. Learner preferences and abilities influence whether a learner-centered versus program-controlled design is used. The

role of feedback is integral in designs that contrast constructivist and behaviorist approaches: one that emphasizes high learner control versus one that advocates high program control for WBI.

Four common components of Instructional Systems Design are (1) analysis, (2) design, (3) delivery, and (4) evaluation (Williams, 2002). The success or failure of any instructional program depends on the design. Several steps are necessary to create an instructional program regardless of whether a content- or learner-centered approach is chosen. These steps include (1) preparation of objectives, (2) determination of content, (3) instructional methods and strategies employed, (4) assessment and access to resources, (5) application of content, and (6) assessment, both formative and summative. This study examines the underlying theories that form the basis for qualitatively different instructional strategies; i.e., learner-centered (constructivist) or content-centered (structured cognitive) approaches to acquisition of knowledge.

Behavioral and Constructivist Theories and their Relationship to Instructional Design

Behaviorism is associated with scientists such as Pavlov and Skinner. Programmed instruction, invented by B.F. Skinner (1976), is linear and features a mechanical model of the learning environment. Behaviorists held that learning takes place as a series of rewards and punishments and that the environment shapes the complexity of behavioral responses. Few educators use programmed instruction to facilitate higher order thinking despite the fact that Skinner is recognized for his contributions to shaping behaviors through reinforcement and immediate feedback. Interestingly, however, basic tenets of behaviorism impact current methods of instructional systems design (Ertmer and Newby, 1993). These include the use of:
1. Pre-assessment instruments to determine entry level knowledge or behavior

2. Sequencing of instruction from simple to increasingly complex levels
3. Reinforcement and feedback to shape behavioral responses, retain attention to the material, motivate, and correct performance
4. Practice and application of principles conveyed through narrative material or examples and non-examples of constructs
5. Observable, measurable, performance-based outcomes (Williams, 2002, p. 135)

Most web-based learning is based on constructivist theory, which stands in contrast to behaviorist theory. Constructivists believe that learning is subjective. Learners arrive at a learning experience with a personal history, belief system, prior interaction with concepts, and prior knowledge. Constructivists encourage participation and relate the material to the individual's prior experience. Ertmer and Newby (1993) posit that learner participation is enhanced when tasks include real world problem situations and collaborative learner strategies, and when the teacher acts as facilitator. Participation is also increased when teachers encourage discussion and debate within the framework of a common experience. Features of constructivist design include:
1. Applying learning to meaningful contexts
2. Affording high learner control
3. Providing opportunities for learners to apply what they have learned
4. Presenting information in multiple modes
5. Revisiting concepts previously encountered in the instruction
6. Emphasizing problem-based learning
7. Developing alternate ways of presenting problems
8. Focusing on transfer and retention of knowledge and skills

9. Presenting problems on assessments that are worded differently from those presented in examples and practice sessions

Both constructivists and behaviorists recognize the importance of feedback and assessment (Williams, 2002). While control of the environment is a central value for behaviorists, constructivists concentrate on the learning process and how individuals acquire skills and knowledge.

Williams (2002) conducted a study to determine optimal design features for adult learners and WBI. Her study focused on adult learning, based on the assumption that adults are self-directed and bring prior knowledge and experience to the learning environment. Though aimed at adult learners, many of the findings relate to web-based design for any age group. Web-based trainers/designers from across the United States (N=25) were selected to provide input on design principles deemed relevant for adult education. Thirty-six features of WBI resulted from the designers' electronic interviews. This author selected the following principles as applicable to WBI regardless of age.
1. Instructional objectives and goals must be relevant to the goals of the learner
2. Content should reflect the tastes and interests of the learners
3. Pre-assessment is necessary to assure relevance of the material to learner needs
4. Learning activities should be based on the learners' prior experience and familiarity with language and context
5. The learner should receive help to relate new material to what was previously learned
6. Relevant examples and activities should be included to assist the learner to grasp the material
7. Instructional activities should correspond directly to both content and course objectives

8. Graphics, examples, cases, and analogies should be included to facilitate the learner's comprehension
9. Feedback, both positive and negative, should be integrated into learning activities
10. Feedback on objectives should be provided as part of the training
11. One idea should be presented at a time, followed by frequent summaries to assist retention and recall of information
12. Instructional interactions should allow for learner-to-learner and instructor-to-learner exchanges during instruction, focusing on integrating new knowledge into existing schemata (Williams, 2002, pp. 139-140).

Research Question Two: Perceptual Theories

The second research question, "How do students' perceptions based on self-reports differ between the two instructional strategies on attention, relevance, confidence, and satisfaction?", focuses on learner receptivity to instructional strategies and designs. Keller (1987) provides the theoretical basis for analyses of student perception. Keller believes the role of motivation and instructional design cannot be separated: the best designs in terms of features of instruction will not override a learner's motivation or attitude toward instructional material or the instructional environment. Song and Keller (2001) discuss a systematic process to ensure that designs address motivation in distance learning courses.

Keller, known for ARCS theory, created a rubric with Suzuki (1996) to assess middle school students' receptivity to instruction based on attention, relevance, confidence, and satisfaction. Keller's (1987) ARCS Theory (an acronym for attention, relevance, confidence, and satisfaction), along with other systematic design principles, guides the development of the affective components of instruction and is used in tandem with Gagné's

(1985) Events of Instruction. The website Integrating Instructional Design in Distance Education (IIDDE) (Carr, 2000; available: http://ide.ed.psu.edu/idde/default.htm) provides practical examples of how one coordinates Keller's ARCS Model with Gagné's (1985) Events of Instruction. The IIDDE site poses each of Gagné's events and shows how Keller's ARCS Theory attends to gaining the user's attention and sustaining interest throughout the instructional process.

Designers can gain the attention of a learner by posing a question, having the learner generate a question, or introducing some surprise or novel event. Relevance corresponds to Gagné's (1985) goals and objective statements combined with Keller's mandate to express those objectives so that they relate to the learner's internal motivation. Expressing learning objectives and including the purpose of the material in language that appeals to the learners' needs increases the likelihood that the learner will engage in the instruction. Carr (2000) suggests using familiar examples or those previously experienced by the learner when presenting instructional content. To sustain learners' engagement with the material, designers should elicit participation from the learner, clearly state teacher expectancies, provide opportunities for guided exercises, give feedback, and allow the learner to select resources. Satisfaction, Keller's last component, is enhanced when learners apply newly acquired skills to authentic problem-based activities. Satisfaction is also increased when learners can assess their progress and are given feedback that reinforces desired behaviors.

Keller collaborated with Suzuki in 1996 on a Japanese middle school project to account for motivational characteristics of learners, the content area to be taught, and the hardware or software to be used. Teachers evaluated data derived from students and teachers from eight subject areas. They devised strategies to address areas of weakness regarding student interest or motivation toward the material. The culmination of Keller and

Suzuki's work was a matrix coded with plus and minus signs that represented a positive or negative response to a motivational feature. The model, based on Keller's (1987) ARCS theory, includes motivational elements that address attention, relevance, confidence, and satisfaction. Table 2 illustrates Keller's model applied to an international e-mail training unit.

Table 2
ARCS Motivational Design (Keller, 2001)

Keller (2001) asserted that student perceptual responses to instruction changed over time and condition. When students were motivated positively toward the material, they remained on task and did not respond well to additional motivating tactics. The converse was also true: when students were not highly motivated to learn the subject matter, the absence of

motivational strategies decreased engagement in the material. Keller recommended an adaptive approach built into computer-based software: based on a survey of attitudinal responses upon initiation of the software, the program would branch to accommodate a learner's level of attitude toward the material. Keller concluded that motivational tactics, if applied systematically, sustained and facilitated motivation.

The challenge to overcome motivational problems in distance learning courses is apparent from the level of attrition seen in many universities. Visser (1990), a French resident, used Keller's (1987) motivational matrix for a distance learning study sponsored by a university in the United Kingdom. Her research focused on an instructional design course for training personnel delivered via distance learning in Mozambique. Subjects were 22 adults from the Department of Ministry and seven special students; a case study method was used. Research questions included the following: (1) How valid are motivational messages in distance learning courseware? (2) How does messaging feedback work, and with whom? (3) How does a messaging system work over time? A second study was conducted three years later. When she analyzed student responses to the motivation matrix, she found that more attention was required for support systems than for instructional strategies. Once help systems were implemented, student motivation increased.

In a follow-up study, Visser (1990) examined the effect of motivational messages on student performance (retention of material). One set of motivational messages applied uniform feedback throughout the program, placed at predictable points within the instruction; the other set incorporated personalized messages of encouragement in the form of electronic greeting cards. Results from her experiment were dramatic and demonstrated that the personalized messages increased retention from 70-80%.
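Keller's recommendation of an adaptive, attitude-driven approach described above lends itself to a simple branching rule. The sketch below is illustrative only and is not drawn from the studies reviewed here; the survey scoring, threshold value, and module names are hypothetical assumptions.

    # Hypothetical sketch of attitude-based branching (not from Keller's or Visser's materials).
    # Assumes a short ARCS-style entry survey scored 1-5 per item.

    def entry_attitude_score(responses):
        """Average the 1-5 ratings from the entry attitude survey."""
        return sum(responses) / len(responses)

    def choose_path(responses, threshold=3.5):
        """Route less-motivated learners to a motivationally enhanced path."""
        if entry_attitude_score(responses) >= threshold:
            return "standard_module"          # already motivated: avoid extra tactics
        return "motivation_enhanced_module"   # add attention/relevance supports

    # Example: a learner rating the topic 2, 3, 3, 2 would be routed
    # to the motivation-enhanced version of the module.
    print(choose_path([2, 3, 3, 2]))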

Theoretical Foundation for Development of Instructional Modules

The third purpose of the current research is to document the design process used to convert textbook material to WBI. A description of the development process is included in the Methods chapter. The theoretical bases for development of the online modules are discussed in this final section of the literature review.

"Database searching should be part of a formal research offering that covers the nature and processes of research, various tools, etc" (Neuman, 1997, pp. 2-3). Neuman proposed that Internet curricula be developed that address training in the context of a holistic research process and foster refinement of critical thinking skills. Neuman (1997) contended that we know little about how to encourage critical thinking skills with traditional library instruction; the challenge is even greater with digital library resources. Online searching offers students opportunities to gain technical skills and, more importantly, experience making information-use decisions based on higher order thinking skills such as assessment of relevance, validity, and currency of information. Further, instruction must inform students about the structure of "how information is organized". Components of electronic resource training programs must include the general meaning and nature of the search process and address issues such as evaluation of the relevancy of sources, information organization, and the relationship of information use and student learning (Neuman, 1997, p. 11). The educational theories proposed by Robert Gagné (1985) and the instructional design methods of Dick, Carey, and Carey's (2001) Systematic Design of Instruction explain how to accomplish these goals.

Gagné's (1985) Conditions of Learning is considered a primary text used by instructional designers everywhere. His events of instruction are applicable whether the learning outcome is cognitive, psychomotor, attitudinal, or verbal information. These events include: (1)

informing the learner of the objectives of the lesson, (2) presenting the stimulus material, (3) providing learner guidance, (4) eliciting learner performance, (5) providing learner feedback, (6) assessing performance, and (7) enhancing transfer and retention (Gagné, 1985; Richey, 1997). Richey (1997) views Gagné's contribution to micro-instructional design principles as relevant to both constructivist and behavioral schools of thought. Gagné's influence is apparent in the work of Merrill and Jones' (1992) Instructional Transaction Theory and Keller's (1987) Motivation Model. Learning principles outlined in both the transaction theory and Keller's motivation model include:
New learning is dependent on past knowledge and concepts
New learning is stimulated by external events
Learning is facilitated by instruction that is adapted to the nature of the learning outcome
Instructional strategies of whatever form provide motivation, direction, guidance and guided practice, feedback, and reinforcement (Richey, 1997, p. 595).

Gagné (1985) has been criticized for overemphasis on lesson content and not enough on process orientation. The focus on learning content over transference of skills for problem-solving tasks draws into question whether instructional systems theory enables students to organize knowledge and develop individual metacognitive strategies. Hannafin and Peck (1988) suggest that systems theory works best when there are prescribed objective outcomes and organization of lesson content. Questions arise regarding the current emphasis on constructivist principles of learning in relation to the efficacy of Gagné's events and instructional systems design. Researchers have criticized Gagné for focusing on external conditions for learning and placing too little emphasis on the internal processes of the

learner's experience. While Gagné emphasized the sequence of the events of instruction, he did not discount the impact of the learner's internal processes as he/she interacted within a learning environment. Gagné's contribution is to make the designer aware of conditions that ultimately lead to optimal acquisition of intellectual skills. Gagné provides a framework for engagement in a variety of instructional strategies that focus on learner participation as central to the learning process. The designer can use Gagné's events as a guide to integrate constructivist strategies that require the learner to engage in practice and feedback for retention of learning concepts.

The overarching goals of constructivism are to foster metacognitive skills and promote independence of learning so that students become lifelong learners. Problem-based learning (PBL), a popular constructivist strategy, requires a high degree of metacognition on the part of learners. Metacognition is defined as the ability to reflect on one's own thinking patterns and employ strategies with which one acquires new information or knowledge. Barrow (1988) suggested that metacognition involves the following: deliberating or pondering on a situation or problem; analyzing what is known and what information is missing and comparing it to similar problems or situations; creating hypotheses; deriving appropriate questions and observations; reviewing and questioning new information sources and what has been learned; and making decisions about future inquiries or actions. Savery and Duffy (1996) asserted that problem-based learning facilitates students' problem-solving and critical thinking skills. PBL begins with defining a problem and proceeds to locating and accessing resources, employing strategies, analyzing information appropriate to the problem, and evaluating the solution, functions that directly apply to information literacy instruction.

Instructional Systems Design

What is instructional systems design? The term refers to the systematic design of instruction as a procedural set of steps for course development, based on the text by Dick, Carey, and Carey (2001). Systematic design of instruction denotes a sequence of steps predicated on the notion that performing them results in improved learning: (1) needs assessment, (2) task analysis, (3) audience analysis, (4) instructional analysis, (5) clearly written performance objectives, (6) assessment instruments, (7) selection of instructional strategies, (8) sequence and presentation of instructional materials, (9) formative analysis, (10) revision, and (11) summative analysis.

The term systematic design is often associated with a behavioral learning model in that knowledge and skill sets can be analyzed, categorized, and sequenced according to a set of cognitive principles. The question arises whether a designer abandons constructivism in favor of behaviorism when the instructional designer categorizes and builds a hierarchical model of skill sets for the learner to demonstrate a learning outcome. Creating a flowchart of interrelated knowledge and skills and identifying subordinate and superordinate relationships among skill sets is central to instructional systems design. An instructional strategy specifies how one applies the methods the student will encounter to acquire the course objective. Thus, a designer can specify a constructivist learning strategy and incorporate materials that enable the student to "construct meaning" from experience within a systematic design model. Constructivist principles are applied when Internet skills are built so that objectives such as text structuring, truncation rules, and application of Boolean operators transfer effectively to a problem-finding task such as a scavenger hunt activity. The learner must construct a problem-solving process or personal search strategy in order to obtain the necessary information to meet the cognitive objectives of the design.
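To make the search-skill objectives concrete, the brief sketch below shows how brainstormed keywords might be combined with Boolean operators and truncation into a single query string. It is illustrative only; the topic, the synonym lists, and the query syntax are hypothetical assumptions rather than material drawn from the course itself.

    # Illustrative sketch: building a Boolean search string with truncation.
    # The topic and synonym lists are hypothetical, not taken from the course materials.
    topic_terms = ['adolescen*', 'teen*']                            # truncation matches adolescent, adolescents, teens...
    concept_terms = ['"information literacy"', '"internet search"']  # quoted phrases searched as units

    # Synonyms are OR-ed within each concept; the concepts are AND-ed together.
    query = '({}) AND ({})'.format(' OR '.join(topic_terms), ' OR '.join(concept_terms))
    print(query)
    # (adolescen* OR teen*) AND ("information literacy" OR "internet search")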

Designing effective problem-solving tasks in the context of a systematic design model that follows constructivist principles can be challenging. Problem-solving tasks by nature require a multi-stranded set of tools and include procedural, cognitive, subject matter, motor, and attitudinal skills (Carey, 1998). Successful search engine use requires a repertoire of skills including subject matter knowledge, language skills, text structure (syntax), and the ability to generate a problem statement. Pre-planning strategies require that students identify key words within the research question and translate natural language into a text structure that can be searched within a database. Evaluation of search strategies requires students to monitor their thinking and evaluate both search outcomes and the search processes used to generate those outcomes (Haycock, 2000).

Instructional designers successfully incorporate constructivist strategies within the context of a systematic design method for Internet training in various ways. Constructivist strategies include modules that enable students to demonstrate application of objective rule statements to a given situation, and exercises that facilitate concept formation of how to apply multiple strategies to a given problem set. Search skills would include question analysis, brainstorming for keywords or phrases, pre-planned use of Boolean operators, and evaluation and comparison of multiple search techniques and tools.

Table 3 below, based on an article by Carey (1998), includes Gagné's (1985) events and proposes a set of constructivist strategies that correspond to a systems approach to instructional design. A third column has been added to illustrate how this training program includes a set of strategies for both online Internet training modules (comparison and treatment).

Table 3
Constructivist Strategies and Internet Training

Gagné's Events of Instruction, Objectivist Strategies: Gain attention: provide motivational introduction, focus on content to be learned.
Constructivist Strategies: Provide motivation via "ownership" of material; provide choice of content and methods for exploration; provide authentic context for learning.
Present Internet Module: Provide authentic experience as students explore in a natural setting with access to computer labs; provide choice of content presentation based on student interest; if simulation, provide screen shots of real websites; animated graphics, highly graphically based examples, humor, and conversational tone.

Gagné's Events of Instruction, Objectivist Strategies: Inform students of learning objectives, what they will be learning, with reference to previous learning and relevancy to what will be learned (present stimulus).
Constructivist Strategies: Problem scenarios focus on process vs. product; scenarios require reflection on the part of the learner (reflective observation and abstract conceptualization); incorporate functional knowledge in constructing strategies to conceptualize knowledge.
Present Internet Module: Create hyperlinks to glossaries and objective statements early on in the instruction; use graphical organizers to tie smaller units of instruction into a "big picture" of what the learner will be able to accomplish upon completion of the unit; build in examples in guidance and feedback modules that include multiple opportunities to reflect on rule application; present information in the context of a problem-based scenario followed by examples of how to apply the search rule to the situation; provide opportunities for guided practice.

Gagné's Events of Instruction, Objectivist Strategies: Presentation of content in a way that will facilitate students to learn and recall successfully (provision of learner guidance).
Constructivist Strategies: Cooperative learning strategies; students negotiate meaning; high complexity problem scenarios require multiple knowledge and tool strategies and skills; encourage multiple perspectives; situate the problem in an authentic context.
Present Internet Module: Present authentic scenarios exemplary of student context-based problems; provide problems that employ a combination of rules for searching; comparison search strategy examples and guidance; encourage active experimentation with problem scenarios generated from students.

Gagné's Events of Instruction, Objectivist Strategies: Provide opportunities for practice of new skills (guided practice and feedback).
Constructivist Strategies: Problem scenarios are student generated rather than designer prescriptive; active investigation and acquisition; use group participation to try out and experiment, similar to Kolb's active experimentation phase of the cycle.
Present Internet Module: Use of student generated examples and "think aloud" processes to determine strategies for searching; set up a game sequence whereby students test and receive feedback on multiple scenarios.

Gagné's Events of Instruction, Objectivist Strategies: Provide students information assessing how well they are doing during the feedback session during instruction.
Constructivist Strategies: Use of coaching techniques so that students begin authentic self-assessment; examples of strategies include modeling, scaffolding, coaching, and collaboration; ensure peer review and group interaction for feedback to practice.
Present Internet Module: Provision of feedback during practice exercises comes naturally as active experimentation within a real environment takes place; provide problem scenarios for the student to solve, and participation will provide authentic feedback from application of skill sets.

Gagné's Events of Instruction, Objectivist Strategies: Provide review and relate new skills to previously learned skills and real-world applications.
Constructivist Strategies: Provide multiple parallel problem scenarios and find new application of the scenario previously constructed.
Present Internet Module: Provide practice tests whereby the student applies knowledge to similarly constructed problem scenarios; provide performance-based testing.

Gagné's Events of Instruction, Objectivist Strategies: Provide tests, performance checklists, rating scales, attitude scales, or other means of measurement and mastery of skills in an authentic setting (Gagné called this step enhancing transfer and retention).
Constructivist Strategies: Suggest tools that are self-reporting to facilitate students' monitoring of their own progress and retention; standards of evaluation are not absolute but referenced to the student's goal, construction of knowledge, and past achievement.
Present Internet Module: The ultimate measure is successful performance in a new authentic environment that requires students to apply and synthesize material, such as an Internet hunt activity.

Derivation of Course Content

Frederick and Smith (2000) developed a three-credit course for undergraduate students at the University of South Florida (USF) entitled Library and Internet Research Skills: A Guide for College Students. The faculty members from the School of Library and Information Science intended that the course be used to orient users of USF's library and to teach how to access Internet resources for academic research. The course is predicated on the concept of information literacy as a holistic process beginning with an overview of the research process.

The course treats Internet training as a subset of the research process and follows sequentially from problem definition, determination of appropriate resources, location of and access to source material, evaluation of resources for relevance and efficacy, use of information, and citation of information sources. Given that library-based and Internet-based research are viewed as two specialized forms of the fundamental research process, the textbook developed for the course can be used to cover the rules of library-based research skills, Internet-based research skills, or both (Frederick & Smith, 2000, p. v). Objectives for the course are clearly stated in the preface and include:
"Choose appropriate and feasible research topics for a given assignment
Determine which types of information sources are most suitable and available depending on the assignment and their chosen topic
Locate needed information regardless of location and format" (p. v).

The Internet modules are organized into four categories of research tools: virtual libraries, specialized databases, general directories, and search engines. Following the modules on each of these tools, the authors devote several pages to Boolean operators, with examples and Venn diagrams to illustrate the effect of conjunctions on search results. Practice exercises following expository information provide the students with examples and the opportunity for "hands-on" exploration of the concept. After completion of computer lab practice sessions, students are given an Internet Hunt test, and answers are e-mailed directly to the instructors.
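The effect of the Boolean conjunctions illustrated by those Venn diagrams can also be stated in set notation; the search terms below are hypothetical examples, but the operator semantics are standard:

    \text{dolphins AND pollution} = D \cap P \quad \text{(narrower: only pages containing both terms)}
    \text{dolphins OR whales} = D \cup W \quad \text{(broader: pages containing either term)}
    \text{dolphins NOT football} = D \setminus F \quad \text{(pages about dolphins with the unwanted term removed)}

where D, P, W, and F denote the sets of pages containing each respective term.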

Library and Internet Research Skills has been offered for several years to date. Unfortunately, this researcher is unaware of efforts to monitor participants' research skills post-training other than the performance criteria established for grade issuance. Course content is reviewed on an ongoing basis by experts from the School of Library and Information Science and is modified to maintain currency of information. Critical thinking skills are emphasized in that students are asked to evaluate information sources guided by research questions and then match the best resources to their research problem. Students are tested on application of rules to problem-solving scenarios for Internet practice and complete final quizzes, so there is some evidence that training is effective, since most students at least pass the course and do well on the final Internet scavenger hunt.

Frederick and Smith (2000) based their course development on systematic instructional design methods and matched course objectives to content and practice exercises. However, course presentation is left to the discretion and personal teaching philosophy of the instructor. Computer labs provide opportunities for active experimentation and application of course principles. The authors adopted student-based examples in order to teach conceptual information in the context of real life problems, a practice encouraged by researchers (e.g., King and Fonseca, 2000), media specialists, and the American Library Association.

University of Texas' Texas Information Literacy Tutorial (TILT)

The Texas Information Literacy Tutorial (TILT) was developed as an online, interactive, self-instructional tutorial for the University of Texas (UT) at Austin's Digital Information Literacy Office (DILO). It is based on cognitive principles established in Bloom's (1956) Taxonomy of Educational Objectives, and its philosophy is a belief in active learning. TILT's purpose was to ensure that incoming undergraduate students received instruction on the basic research skills necessary for effective navigation through UT's library system. DILO's sixteen public service librarians, representing various subject specialties, collaborated to create a set of fundamental skills in the form of first-year proficiencies (Dupuis, 1999).

TILT research included surveys, reviews, and usability tests, and three project managers who specialized in instructional design, web design, and curriculum and content writing were involved in the development process (Dupuis, 1999). TILT presents expository material, examples, practice exercises, games, and online quizzes. A survey was administered to 400 incoming freshmen to ascertain how students used the Internet for research and to assess students' knowledge levels, usage, and interest in the Internet.

TILT impacted students' ability to discuss and use more complex research, and instructors could integrate research-based instruction specific to subject-level applications in face-to-face sessions with classes. Active learning principles guided classroom instruction and reflected the interactive nature of the online tutorial (Fowler & Dupuis, 2000).

The TILT tutorial is widely used for online information literacy instruction despite the fact that effectiveness of instruction and concomitant performance has not been assessed with larger numbers of students at UT or elsewhere. The researcher of this study remedied this flaw and built performance assessment into the end of each unit.

TILT's designers did not directly use Gagné's (1985) Events of Instruction to create an instructional strategy, but their assumption was that interactive designs would result in higher retention than designs without these features. The researcher used some of TILT's introductory material and some interactive exercises in converting the textbook material from the Frederick and Smith (2000) text to an online interactive tutorial on Internet search tools. Introductory material was presented using TILT's Flash presentation on common misconceptions about the Internet and was designed to gain the learner's attention. An interactive brainstorming exercise exemplifies how keywords are selected and research questions are refined for electronic searching. A Library Squares game at the conclusion of

TILT was adapted for the material in the Frederick and Smith (2000) textbook. Graphics illustrating the brainstorming process, narrowing of search terms, and pop-up definitions from TILT were used in both conditions.

Conclusion

This study involved the development of an instructional website based on textbook content about information literacy training. Two web versions of the material were created: one featured learner control, the other program control. Instructional design for the two online modules was based on an analysis of features using Gagné's (1985) Events of Instruction as a framework to convert a textbook for web delivery. A baseline comparison version of the material takes TILT tutorial narratives, along with the presentation of concepts and exercises from the Frederick and Smith text, and converts them into a form appropriate for web instruction. TILT exemplifies many of the principles of sound instructional design enumerated earlier by Alessi and Trollip (2001).

Literature from the 1980s and 1990s addresses the issue of learner characteristics and program control. Schnackenberg (1998) reviewed evidence about learners' abilities and their influence on program-control versus learner-controlled strategies and found that the research is inconclusive. The question remains whether designers should allow high learner control versus a more content-centered approach with program control when converting textbook material to WBI. Arguments are made in favor of both pedagogical approaches. Chung and Reigeluth (1992) suggested that low-ability students, measured by standardized general aptitude tests, require high program-control designs because of lower motivation and self-regulation. Steinberg (1989) asserted that learner control should be

reserved for high-ability students. Later studies appeared to controvert findings from the aforementioned studies: when investigators blocked on student ability as a possible confounding factor in similar studies, they found no interaction between ability and performance (Schnackenberg & Sullivan, 1998).

Instructional strategies in any form provide motivation, direction, guidance and guided practice, feedback, and reinforcement (Richey, 1996, p. 595). Depending on the designer's pedagogical perspective, whether constructivist, behavioral, or a combination of both, the instructor must choose how to engage learners in interactive practice and feedback. A constructivist approach may employ a more naturalistic approach to practice and feedback, reliant on the abilities of students to generate conclusions from their experience. In contrast to an open-ended strategy for practice and feedback, a more content-centered instructional model relies on program-controlled guided practice and immediate feedback.

Research is needed to determine what strategies influence performance outcomes for learners engaged in e-learning for information literacy (Hirumi, 2002). Therefore, this study examined the effects of conversion of textbook content to WBI and compared performance differences with a second version of the tutorial that includes some features found in classroom instruction. While there are questions regarding learner control and performance, one does not know whether attitudinal preferences exist between the two schools of thought, high program control versus high learner control, for a population of high achieving learners. The current study assesses whether learner attitude toward online information literacy instruction is influenced by high program control or high learner control.

The most carefully devised features of instruction will not override a learner's motivation and attitude toward instructional material or the instructional environment. Keller (2004) discusses a systematic process to ensure that designs address motivation in distance learning courses. Keller's (1987) ARCS theory includes motivational elements that address attention, relevance, confidence, and satisfaction. He asserted that affect is as important as the presentation of content for the acquisition of learning. Further, he stated that in order for instructional designs to prove successful, a systematic approach to both is necessary to sustain young learners' attention and help them acquire new skills and knowledge.

The development of WBI is a response to changes pervasive in education and the economy, brought about by an information explosion, technological advances, and advances in education. The new prototype in web-based education involves every facet of sound instruction. Retraining teachers in pedagogical methods and informational technologies is necessary for successful delivery of online education.

Studies cited in this work indicate that as digital information resources grow, distance learning and conversion of previously classroom-based paradigms will be increasingly supported with WBI. Research is needed to determine how features of learner-centered compared to content-driven WBI fare in terms of achievement and learner perception. The researcher has discussed sound instructional design strategies appropriate to WBI.

The traditional method of research, known as the research-to-support-theory model (Willis, 1993), depends on proving that an innovative instructional technology is effective if it is found to be better than or as good as a traditional teaching method. The researcher proposes that the impact of two strategies, one learner centered and the other content focused, be compared. The efficacy of WBI is not in question; rather, the current research examines the impact of instructional strategies on achievement and learner perception as specified by

Reigeluth and Chung (1992), Gagné (1995), Schnackenberg & Sullivan (1998), and Keller (2001). The instructional design (ID) research model was used in this study, and the researcher believes that WBI stands on its own merit. It is expected that future WBI research will concentrate on the interaction between learner characteristics and instructional strategies. Information derived from well-conceived research studies promises to support and further enhance the development of WBI, thereby meeting the needs of teachers and learners.

Chapter Three

Method

Purpose

The purpose of the present study is threefold. The first purpose is to examine students' performance on two forms of Internet search skills instruction converted from a textbook for web-based delivery. The second purpose is to examine the effects on students' academic motivation of two forms of web-based instruction that afford higher or lower levels of learner control. The third purpose is to document the design process used to convert textbook material to web-based instruction.

Research Questions Restated

1. What effect do two online instructional design strategies for Internet training, characterized by their content-centeredness or learner-centeredness, have on student performance measures?
2. How do students' perceptions based on self-reports differ on attention, relevance, confidence, and satisfaction between two instructional strategies characterized by their content-centeredness or learner-centeredness?
3. Is the additional time and effort needed to include in the treatment module features found in classroom instruction (gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes, and summary screens) efficacious, given the performance and perception results of this study?

Research Design Model

It is important that information technology research measure the impact that features of instruction have on learners' performance levels. The researcher followed the instructional design and research model rather than the more traditional research-to-support-theory (RTST) model so that the web-based instruction developed for this study would be evaluated on its own merit and not in comparison with traditional instruction. The RTST model is too limiting and cannot adequately assess the impact innovative technologies have on learning; typically, it is used to compare innovative instructional interventions with more traditional instructional methods (Willis, 1993). If the results of a study favor the innovative instructional intervention, it is said to be more effective than the traditional method of intervention. The innovative intervention is then viewed as being representative of all such innovations. This conclusion makes no sense: one well-designed program is not representative of all programs, and each one needs to be evaluated to determine the impact it has on learning. Moreover, it is not necessarily the intention of an instructional designer to create a learning instrument that is more effective than traditional methods. This study focused on product development according to the specifications of the instructional design model, not on proving which instructional delivery vehicle is more effective.

Context of the Study

This study took place at a suburban middle school in the state of Georgia. The change of location and population, from undergraduates taking distance learning courses at USF to the middle school, was based on convenience, and the eighth graders demonstrated academic abilities comparable to those of the undergraduates. The change also permitted the researcher physical control and supervised conditions for the research. The school is considered one of the highest ranked

PAGE 72

The school is considered one of the highest-ranked academic middle schools in the area, as demonstrated by students' national rankings in SAT scores and by its admission requirements; students score at or above the 85th percentile (composite score) on the Iowa Test of Basic Skills (ITBS). Class sizes in all magnet core classes are limited to a maximum of 21 students.
The county where this middle school is located places great emphasis on information literacy and research skills appropriate for student achievement. The WBI material created for this study fit well with the curriculum requirements of both the county and the state of Georgia because information literacy is emphasized. Permission forms signed and dated by parents and students were received prior to implementation of the research. Parents proved cooperative and enthusiastic about the activity because they were eager for their children to gain the skills the instruction taught.
The language arts teacher who participated in the study worked collaboratively with the researcher, the library media specialist for the school. To minimize intrusions on routine class work, students took the pretest for knowledge of Internet search tools in the classroom, under the supervision of the language arts teacher, when it was convenient for the teacher. A lab time was scheduled in coordination with the teacher after pretest scores were compiled and ranked for each class.
There was a two-week interval between completion of the pretest and assignment of the students to groups. Students were matched within pairs and randomly assigned to either the treatment or comparison group to ensure equivalence between the groups. Students had almost two full class periods to complete the tutorials. To minimize interruptions during the students' classroom work, the posttest was administered in the classroom in paper-and-pencil form at the convenience, and under the supervision, of the language arts teacher.
To mitigate history effects on the pretest and posttest, the investigator waited one week after instruction before students completed the posttest for comprehension. The following week, students took the Internet Scavenger Hunt in the computer lab and were granted access to their respective tutorials as a reference tool. Comprehension tests were administered in paper-and-pencil form, while the performance test (Internet Scavenger Hunt) was conducted in the computer lab. Cards listing each participant's name and group number were prepared to ensure that all students assigned to either the comparison or the treatment group were properly placed. The investigator and teacher assigned seats in the lab to ensure a more valid performance measure. Students had one hour to complete the scavenger hunt; a digital timer measured the time taken to complete this instrument.

Population and Sample
Two large-group pilots were conducted prior to the final implementation of the research. The first took place during the summer semester of 2003 with undergraduate students from the University of South Florida; this summer university pilot used an experimental design with volunteers from three sections of online library and education classes. A second large-group formative assessment took place in December 2003, when the sample was changed from undergraduate college students to a middle school sample of high-ability eighth-grade students. The investigator moved the location and population sample to the researcher's worksite, partially for the convenience of the investigator and because the high-ability middle school students had academic abilities similar to those of the undergraduates originally used for the large-group distance learning pilot.
Forty high-ability eighth-grade language arts students participated in the December administration. Prior to final implementation of the research in May 2004, the researcher modified the instructional modules to ensure the integrity of the two instructional conditions for final data collection. Forty-one different high-ability eighth-grade language arts students participated in the final implementation in May.
The first large-group formative administration of the materials, during the summer semester of 2003 at the University of South Florida (USF), was conducted via Blackboard (a distance learning portal) to test the material without instructor presence. The summer USF pilot used an experimental pretest-treatment-posttest research design. The study began with 56 volunteers enrolled in several sections of an undergraduate Library and Internet Research Skills course and an introductory computer education course for those wishing to enroll in the graduate department of education at USF. Only 41 of the 56 participants completed the entire study, including pretest, treatment, posttest, and scavenger hunt; 45 participants completed the pretest, treatment, and posttest for comprehension. Students were informed that, should they wish to participate, they would earn two points toward their final grade for completing the study and that non-participation would not negatively affect final grades. The extra points proved a weak incentive, even to those who completed the research.
The researcher used distributed e-mail lists to inform participants of their respective groups. Pretest scores were calculated, ranked, and matched within pairs to either the treatment or comparison group. Once participants were assigned to groups, the researcher e-mailed the volunteers with instructions on which tutorial to take. Participants signed on to their individual web portal (Blackboard) sessions and chose the assigned website link within the course materials to complete their assigned tutorial at their own pace.
The volunteers were asked to approximate the time taken to complete the instructional materials. Self-reported time estimates ranged from half an hour to forty-five minutes, with a few students taking an hour and fifteen minutes to complete the instruction. A week following training, participants took the posttest measure.
The mean pretest score for the 56 voluntary participants enrolled in the pilot was 55.92. Twenty-eight participants were assigned to group one (treatment), but only 25 completed the pretest, treatment, and posttest. Twenty-eight participants were assigned to group two (comparison), but only 20 completed the pretest, treatment, and posttest portions of the program. Across both groups there were appreciable gains, from a mean of 55.92 (N = 56) on the pretest to a mean of 74.52 (N = 45) on the posttest. The treatment group (n = 25) demonstrated a gain from 57.01 to 73.47; the comparison group (n = 20) increased from a pretest average of 56.81 to 74.69 on the posttest. Because there was an uneven distribution of participants between the two groups and the integrity of the data was in question, statistical comparison of these results is inconclusive. A repeated measures ANOVA was performed with 45 of the participants to determine whether significant gains resulted across both groups from pretest to posttest. The ANOVA produced a main effect across both groups, F(1, 44) = 61.560, p < .01. No interaction effect resulted from the large-group pilot, F(1, 44) = 0.106, p = .746.
Results from the distance-learning participants proved to be inconclusive. Attrition and lack of follow-through on completion of the module and posttests resulted in a drop from 56 students to 45 participants who completed the pretest and posttest for comprehension. Given the voluntary nature of the pilot, only 43 students completed the scavenger hunt.
Cronbach's alpha, computed with the 43 participants who completed the scavenger hunt, was α = 0.72; the mean score for the group was 7.48 (N = 43).
Results from the pilot led the investigator to examine features associated with each of the instructional programs as well as changes to the testing instruments. The comparison and treatment modules used for the large-group formative analysis were not sufficiently dissimilar to isolate the independent variable, i.e., instructional strategy. The design of the materials was revised so that the comparison and treatment modules appeared identical, except that the treatment module included additional program-controlled exercises, interval quizzes, and the Library Squares game. The comparison condition afforded greater learner control, and the qualitative differences became most apparent in the practice exercises, which relied on learner initiative for completion. Changes to the comparison program afforded more liberal learner control for those assigned to that condition. Navigation for the treatment module ensured that learners would complete the guided practice and feedback exercises: active links on the menu were disabled until the exercises were completed, and interval quizzes and the Library Squares game remained in the treatment condition.
Due to problems with physical control of the earlier pilot, which took place via distance learning, the researcher chose a face-to-face administration of the study. A decision was made to change the target population to students of ability comparable to the undergraduates and to administer the research locally, at the researcher's worksite, under supervised conditions; thus the sample changed from college students to high-ability eighth-grade middle school students. The second large-group administration of the materials took place in December 2003 with a total of 40 high-ability language arts students representing a culturally diverse population enrolled in a high-achievement magnet program in a middle school in metropolitan Atlanta, Georgia.
The researcher modified the materials and the pre/posttest. Results from this group administration showed an increase in scores across both groups, from 49.28 on the pretest to 62.91 on the posttest (N = 40). Scavenger hunt scores averaged 90% across both groups when students were provided access to the tutorial to apply their knowledge on the hunt.
Problems with middle school students who failed to follow verbal directions and sat at the wrong stations forced the researcher to replicate the study and refine the procedure. Students had been verbally assigned from matched pairs, in rank order, to either group one (treatment) or group two (comparison); despite the verbal directions, students failed to go to their assigned groups. All students completed the pretest, treatment, posttest, and scavenger hunt, but the groups were not equivalent on the pretest measure.
The study was replicated in May 2004 to better control assignment of the 41 high-ability eighth-grade students from two sections of language arts classes to their respective groups. The same procedure used in the pilot was followed for random assignment of matched pairs of students to the treatment and comparison conditions; however, instead of relying on verbal instruction to direct students to the treatment or comparison workstations, the researcher made up index cards with each student's name and group assignment. The researcher assigned students one by one to a particular workstation before they began their respective tutorials. The room was divided in half, with the comparison group seated on one side of the lab and the treatment group on the other, to physically separate the comparison and treatment students. The researcher added a third instrument, a modified version of the Academic Motivation Profile, for the final study. The students' language arts teacher also participated in the study.
Research Question One
What effect do two online instructional design strategies for Internet training, characterized by their content-centeredness or learner-centeredness, have on student performance measures?
To answer this question, two distinct instructional modules and two performance measures were created. One instructional module featured content-centered elements associated with classroom instruction. The learner-centered module provided high learner control, whereas the content-centered module featured high program control. The two performance measures created were a comprehension pretest-posttest and an Internet Scavenger Hunt.
To mitigate the possibility of prior knowledge acting as a confounding variable, a pretest was administered to assess preinstructional knowledge of the material. The pretest also served as the measure used to rank and assign students, in matched pairs, to either the comparison or treatment condition. The same questions were used for the posttest for comprehension following instruction; the order of the questions was altered, and an interval of three weeks following the pretest was set to reduce the possibility that history would threaten internal validity. Large-group formative data were collected on the posttest measure to determine the statistical reliability of the instrument. A pretest-treatment-posttest design was used to assess change within and between groups from pretest to posttest following instruction. The Internet Scavenger Hunt was used to measure differences between groups to determine whether either instructional strategy proved more efficacious.
Instruments
Pre-Test/Post-Test Development
The researcher used a systems approach to construct the test. A systematic approach required that test items correspond to various cognitive levels of knowledge and reference the objectives of the instructional program. Appendix A documents the corresponding performance objective of the instruction, the cognitive level of the objective according to Bloom's (1956) Taxonomy of Educational Objectives, and the individual test item.
The researcher addressed face and content validity when constructing the test items used in this study. Before conducting any of the large-group pilots, the researcher held one-to-one sessions with faculty members self-identified as naive Internet researchers. The one-to-one sessions were held to obtain feedback on the clarity of the comprehension test questions and the instructions for the test. "Talk aloud" sessions revealed how participants interpreted each multiple-choice, multiple-answer, and true/false item on the pretest for comprehension. Distracters among the multiple-choice options were discussed, and modifications were made according to feedback from the formative sessions. For example, one question was originally worded, "Name two methods of searching general subject directories," with the options: a) hunt and peck, b) surf and turf, c) browse and search, d) subject and title. Overwhelmingly, individuals chose the correct answer based not on knowledge of the material but on logic. A colleague suggested that the first two options were obviously incorrect, which left one of two remaining choices. Since subject and title were more closely associated with a card catalog, five of five participants chose the remaining option "c" as the correct response, even though they admitted they were making a logical guess.
The question now reads, "What method would you use to search for information in a general subject directory?" Options include: 1) domain and URL, 2) web address and date, 3) browse and search, 4) subject and keyword, and 5) title and author. (The correct response is number three.)
Based on formative feedback from students, the researcher modified test items for the comprehension pretest and posttest. During the summer of 2003, statistical analysis with the 45 participants from the online administration resulted in a Cronbach's α of 0.78. The summer pilot relied heavily on the clarity and internal consistency of the test items. A Cronbach's alpha analysis of the posttest data from the December 2003 administration with the middle school students was not performed; the researcher assumed that replication of the study was needed because problems associated with lack of integrity of the data were evident. Analysis of the final study was conducted to determine the test's internal reliability.
Analysis of the final administration of the posttest for knowledge of Internet search skills resulted in a Cronbach's alpha of .6856 (N = 41). Two of the items from the first question, on identification of strategies appropriate to refinement of research questions, required a multiple-answer response with a total of five options (a, b, c, d, e). One of these items (item 1c) produced no variance, with all students responding correctly. A second multiple-answer item (question 20) asked students to identify two of five strategies for searching a subject directory. All students responded correctly on item 20c, thus lowering the alpha coefficient due to lack of variance. Tables A-15 and A-16 in Appendix A afford the reader an item-by-item analysis of the posttest questions.

Internet Scavenger Hunt Performance Test Development
The researcher developed an Internet scavenger hunt to measure learners' abilities to find relevant information using Internet tools.
To create as authentic a setting as possible, students used their tutorial as a reference tool while completing the Internet hunt. The hunt presented research scenarios, each corresponding to one of the four tools presented in the instruction; the student had to engage a research strategy to solve an information problem. Answers were supplied in multiple-choice format. Appendix D presents the test items and the corresponding objective covered in the tutorial.
In summer 2003, students enrolled in three sections of an online library science and education course completed the Internet Scavenger Hunt. Cronbach's alpha was computed with 43 participants; the result was α = 0.72, and the mean score was 7.48 (N = 43). In December 2003, forty high-achieving eighth-grade students from a high-achievement magnet program middle school in Atlanta, Georgia piloted the study. When given an opportunity to take the Internet Hunt following training, both the comparison and treatment groups produced a mean score of 90% (N = 40). No reliability statistics were compiled for the December 2003 administration; due to the procedural mistakes described earlier, the researcher knew that a subsequent administration of the material was necessary, at which time statistical analysis of the test would be computed.
During the development phase of the research, a committee member suggested that the scavenger hunt might produce a ceiling effect with middle school students from the high-ability group. To bolster the validity of the scavenger test, a small group of similar magnet students (N = 4) participated in the hunt without benefit of instruction. Students were asked to talk aloud through their search tactics, and the researcher took observational notes on how these gifted students approached the material. All four students were enthusiastic about the activity and readily engaged in the task. They were permitted to talk with each other as they began their separate tasks at adjacent workstations in the lab.
A digital stopwatch measured time on task while the students proceeded through the material. Several observations were noted concerning this pilot with students who had not benefited from the tutorial as a reference or training aid:
1. Without exception, students went to familiar search sites such as Google, Yahoo, or Ask Jeeves and input natural language into the search fields.
2. Students used a trial-and-error approach as they selected search results from their natural language queries.
3. None of the students went to particular databases, such as the Internet Movie Database or the United States Post Office site, to find pertinent information.
4. Students reported that the activity itself was fun but also frustrating.
5. When asked whether training prior to administration of the test would have been helpful, all concurred that instruction would have made the task easier.
6. None of the students completed the activity within a 35-minute time window. In fact, the maximum number of items found in more than a half hour was five, with at least one error on the multiple-choice test.
7. As a result, time was added as a secondary dependent variable to assess the relationship between time and performance accuracy on the scavenger hunt.
A Cronbach's alpha was computed on the final administration of the scavenger hunt posttest in May 2004 (N = 41). Three of the ten items on the instrument produced no variance; thus the alpha was effectively computed on a test with only seven items. The final computation resulted in an alpha of .5647. A ceiling effect may have occurred, with an average score across both groups of 9.24 out of a possible ten points (N = 41).
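The internal-consistency coefficient reported throughout this chapter can be illustrated with a minimal sketch of the Cronbach's alpha calculation. The 0/1 item matrix below is hypothetical, not the study's actual item-level data; the constant column mimics items such as 1c and 20c, which every student answered correctly and which therefore contribute no variance.

```python
# Minimal sketch of a Cronbach's alpha calculation; the response matrix is
# hypothetical and stands in for the study's scored item data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = examinees, columns = scored test items (e.g., 0/1)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(41, 10))  # 41 students, 10 items (hypothetical)
responses[:, 2] = 1   # an item answered correctly by everyone: it adds nothing to
                      # the item or total variance, and with k counted in the formula
                      # it tends to pull the coefficient down, as noted above
print(round(cronbach_alpha(responses), 4))
```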


Research Question Two
How do students' perceptions, based on self-reports, differ on attention, relevance, confidence, and satisfaction between two instructional strategies characterized by their content-centeredness or learner-centeredness?
To ascertain whether learner perceptions of the two versions of the instructional material differed within the sample, the researcher used a modified version of the Academic Motivation Profile (AMP). A paired t-test was used to compute between-group differences on each of four factors: attention, relevance, confidence, and satisfaction.
Academic Motivation Profile
The researcher used the Academic Motivation Profile (AMP; Carey, 1994) to analyze student perception of the material across and between both groups. The instrument is built on the four factors of Keller's (1983) ARCS theory: (1) attention, (2) relevance, (3) confidence, and (4) satisfaction. Carey's AMP consists of four subscales, each related to one of Keller's factors.
In the present study, the researcher used Keller's ARCS model to determine whether motivational differences emerged between the two instructional strategies. The treatment group received an introduction to the module using Macromedia Flash screens designed to pose controversial questions about Internet misconceptions; additionally, treatment students received immediate feedback following guided practice sessions. Objectives in both conditions stated the purpose of the instruction in terms that described what a learner would gain upon completion of the unit. While the present research does not intend to measure adaptive feedback or motivational messages, it poses an important question of whether instructional strategies have an effect on students' perceptions of their learning experiences.
Participation in the study was not mandatory, nor did scores on the tests count for extra credit for the students enrolled in the language arts classes. The middle school sample chosen for this study demonstrated the ability to perform well in the absence of an external reward. Learners expressed interest in the instruction in both conditions. Students demonstrated a willingness to participate because the focus of the instruction was relevant and because these students generally have the internal drive to perform well. A novelty effect, due to the change from classroom-based instruction to the computer lab, may have enhanced learner motivation and willingness to participate in the research. While students did not have access to help systems, they were able to receive real-time support from their instructor and the primary researcher. Students did not use the material within a cooperative learning environment; rather, each worked alone to ensure that the research was limited to evaluation of the material itself. If, as Keller (2001) asserts, motivation changes according to learners' requirements and learning conditions, a follow-up study may be appropriate under different conditions and with more varied groups of learners.
To test the internal reliability of the Academic Motivation Profile and the degree to which its items could be used with confidence in other courses, Carey conducted a series of three formative studies of her instrument with over 760 undergraduate pre-service teachers. The first factor, attention, measured how well the delivery vehicles (the textbook, lectures, practice exercises, and assignments) caught the students' attention. Students rated their attention on a four-point scale ranging from "not the least bit curious" to "very curious." Relevance was associated with short- and long-term goals for the sample of undergraduate pre-service teachers. The confidence subscale measured how confident undergraduate pre-service teachers felt about short-term goals, such as successfully passing the teacher certification exam, and long-term goals, such as obtaining a teaching position.
The satisfaction measure was linked to students' overall satisfaction as expressed on the course evaluation questions. Carey constructed four-point response scales to eliminate neutral responses on each of the factors. In total, the pilot studies of the instrument's factor loadings and internal consistency were conducted with over 760 undergraduate students at the conclusion of the spring, summer, and fall semesters of 1990. Internal consistency measured with Cronbach's alpha yielded a high overall coefficient (.94). Subscale reliability also proved consistent for each of the factors in the first two trials: attention yielded .82 and .83, relevance .92, confidence .91 to .94, and satisfaction .85 to .87. To assess the relationship between totals on each of the four factors and student achievement, Carey computed a Pearson product-moment correlation and found that the relationship between achievement and overall AMP score was significant but low (r = .22, p = .001). The purpose of using the AMP in the present study is to determine whether between-group differences emerge on one or more factors that would show whether students reacted more favorably to the comparison or the treatment instructional strategy.
The researcher received a copy of a modified version of the AMP from Dr. James Carey, who used the instrument for a graduate course on preparation of instructional materials delivered to graduate students pursuing media specialist certification. The investigator retained the basic structure of the original AMP but modified items to reflect the goals and context of the graduate course. A copy of the instrument is included in Appendix F.
Reliability statistics were computed using the version of Carey's AMP modified to measure the students' perceptions in this study. Cronbach's alpha for the attention subscale was α = .92 (N = 40). The reliability of the relevance subscale proved almost as high, with α = .898 (N = 40).
The confidence factor yielded an alpha of .8546 on eight items. Finally, the satisfaction subscale resulted in a Cronbach's alpha of .8724 (N = 40).
Research Question Three
Is the additional time and effort needed to include the treatment module features found in classroom instruction (gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes, and summary screens) efficacious given the performance and perception results of this study?
The development process for both tutorials follows. The materials were modified several times prior to final administration. Creating both modules was time consuming in terms of navigational design, conversion of content to web format, inclusion of graphical organizers, and design of the testing instruments. The treatment module required extensive revision of the navigation, the employment of a JavaScript programmer to provide feedback, and a Flash designer to modify the Library Squares game. Development of the treatment module took more than a year to pilot and to ensure that the navigation, guided practice, feedback, interval quizzes, summary screens, and Flash modules all worked properly. The comparison module, in contrast, took far less time due to the nature of the exercises and the absence of interactive features dependent on the individual learner characteristics of the students.
Development of Instructional Materials
The researcher began the development process by scanning text from the Frederick and Smith (2000) course workbook. It was assumed that the authors of the course had already performed the initial (a) needs assessment, (b) instructional analysis, (c) target audience analysis, and (d) performance objectives for the course.
Consequently, the researcher made no modifications to the material for the high-ability middle school audience, given their high reading comprehension scores and demonstrated ability to work with adult reference materials in their classes. The researcher, in collaboration with the course instructors, (a) developed and validated assessment instruments, (b) chose an instructional strategy (in part determined by the extant information literacy tutorial), (c) developed and selected instructional materials (the instructional material was already developed in the Frederick and Smith textbook), (d) conducted a formative evaluation, (e) revised materials, and (f) conducted a summative evaluation. The researcher analyzed steps in the design process to include or exclude features for the two forms of the text-to-web conversion using Gagné's (1985) Events of Instruction.
In the comparison version, students had a high degree of control, and text was organized into subtopics in a vertical menu table format. Exercises were scanned from the workbook and, as is typical of most text-to-web conversions, students received few instructor-guided strategies for completing the exercises. The amount of feedback students received depended on how many active links they explored and how engaged they were in the exercises. Therefore, students' conclusions about their experience with the material depended on the learner's processing abilities and individual experiences.
Treatment module. The researcher used Gagné's (1985) Events of Instruction to develop the WBI module, examining interactive exercises that mirrored those found in classroom-delivered instruction of the textbook material. Instructors typically include motivational material to engage the learner's attention; this is often omitted when classroom textbooks are scanned for web delivery. The University of Texas' TILT program provided excellent motivational and informational material.
Sample exercises provided program control to the students in the form of input fields, hotspots, and corrective and reinforcement feedback. Additionally, response review quizzes with immediate feedback and summary screens followed the virtual library, specialized databases, and introduction to Boolean operators units. The review quizzes, along with a Library Squares game patterned after a game show in the TILT tutorial, assisted students assigned to the treatment condition in transferring and retaining the acquired information.
Inclusion of guided practice and feedback, motivational material, embedded quizzes with feedback, and simulated search exercises proved significantly more time intensive during development of the treatment module. The development of the treatment material required study of the original text and workbook exercises. The developer spent many hours performing screen captures, practicing exercises originally offered as examples in the textbook, developing instructional text, and finally engaging the services of a JavaScript programmer. Navigational program control ensured that a learner proceeded sequentially through a sub-module before returning to the menu screen. The researcher performed months of planning and testing of navigational sequences, construction of the practice exercises, and formative testing of the instructions and learner exercises. Selecting material from the TILT modules that fit the Library and Internet Research Skills course also required the researcher to become familiar with the modules within the University of Texas program and integrate them into the Internet instruction treatment module.
Comparison module. The comparison version of the Internet search tools module typifies what happens when an instructor takes a textbook and converts the material to web-delivered instruction. Compared to the treatment module, straight conversion of text to web instruction proved considerably easier and more efficient.
The topics included in the online version of the workbook, based on Frederick and Smith (2000), are: (1) location and access, (2) objectives, (3) search strategies, (4) brainstorming, (5) virtual libraries, followed by an optional exercise, (6) specialized databases, followed by optional practice, (7) general directories, followed by optional practice, (8) search engines, followed by optional practice, (9) metasearch engines, followed by optional practice, (10) Boolean searches, broken down into sub-units for and, or, and not, and nested search techniques, and (11) advanced search techniques, including searching for images, wildcards, searching by domain, and comparing results across multiple search engines.
The comparison program presents narrative screens on a sub-topic related to a unit objective. Students have the opportunity to accept or decline an invitation to practice concepts using authentic examples of research exercises. The number of practice items in the comparison condition gives students the opportunity to gain more practice and experience a greater range of examples than in the treatment condition. Students are free to take notes and practice exercises, or not, and are given no instructor assistance, except as it pertains to technical difficulties, as they proceed through the comparison tutorial.
Common Features Between Treatment and Comparison Modules
The following describe features common to both the comparison and treatment tutorials:
Menu structure. Both tutorials used a vertical menu structure set up as tables, where topics lined up vertically on one side of the screen and content appeared on the right side of the table. Both groups had visual prompts (arrows) to inform the learner of the nature of the information or exercise he/she was reading. If the student moved vertically from top to bottom in the table of contents, he/she moved logically through the topics. In most cases the researcher observed the students moving sequentially through the material using the vertical menu bar.
Use of graphical organizers. Graphics in both conditions were used to illustrate principles presented in the text. For example, when students studied general subject directories in either condition, a link to a graphic of an inverted triangle illustrated how the directory was organized from broad topics to specific ones within a subject area. Venn diagrams were used in both conditions to illustrate Boolean operators, along with a pop-up screen defining the term and its origin. Flash applets were used sparingly on overview screens in both conditions to sustain attention and illustrate learning principles within the text.
Overview of the research process. Both the comparison and treatment modules included a unit on research strategies and refinement of research questions, taken with permission from the University of Texas' TILT Tutorial. Both modules presented an outline of how one goes about refining a research question, brainstorming subject categories, selecting keywords, using quotation marks around phrases, and using wildcards for word variations. These strategies were presented as techniques to apply before choosing a specific Internet search tool.
Overview screens preceding exercises. The introductory screens for both the comparison and treatment groups contained tables with descriptions, comparisons, and live links to various categories of Internet search tools. Exercises followed the overview in both conditions, and the exercises included in the treatment condition were derived from the exercises in the textbook. Some of the treatment exercises that guided students and provided corrective and reinforcement feedback were replicated in the comparison condition, except that comparison students worked with open-ended practice rather than screen captures and received feedback only as a natural consequence of their exploration within the exercises. Because questions were closed-ended in most cases, it was possible for a learner to conclude whether his/her strategy had proved successful. If a student was motivated enough, he/she might gain more practice and experience a greater range of examples in the comparison condition than in the treatment condition.
Clear definitions of terms. Narrative screens provided clear definitions readily understandable to these high-ability students. Embedded glossary hypertext links were included on overview screens in both conditions.
Reference to previous material for retention and transfer. The narratives in both conditions used a conversational tone. Concepts in the virtual libraries and general subject directories units, for example, made reference to each other so that the student (if reading carefully) could glean the similarities and differences between various Internet search tools. Inclusion of examples of various tools within a specific category allowed students to compare a range of sites within that category; they were able to recognize similarities and differences as they progressed through the material.
Scavenger hunt as a performance task. The performance hunt measure proved effective with students assigned to both instructional conditions. When the researcher performed a small-group pilot with students who did not have the benefit of the online modules or any formal prior Internet search training, they reported that the activity was enjoyable but frustrating without training. The researcher noted that a performance task that requires students to apply principles from instructional material, with access to those resources, may prove beneficial and engaging as a learning experience in itself.
Different Features Between the Comparison and Treatment Modules
The comparison module represents a typical conversion of a textbook to web-delivered instruction.
Little attention is paid to those features associated with classroom practice, including creating a context to gain the learner's attention, making the material appear relevant to the learner, instructor-learner and material-learner interactions with corrective and reinforcement feedback, summary and review material to enhance retention and transfer of information, and attention to the sequence and presentation of the material to ensure that students engage with it. In contrast to the comparison module, which offers optional practice exercises, the treatment program requires that students practice concepts and reflect on information, corresponding to Gagné's (1985) opportunities for engagement and feedback. Following a narrative on a search tool, students engage in a series of practice-feedback exercises that simulate Internet search commands; thus, the student profits from a concrete experience of searching the Internet. Two self-assessed review quizzes are included in the treatment condition. To enhance comprehension of concepts, the designer built in corrective and reinforcement feedback for each student interaction. Appendix C contains a table that compares the conditions described in the preceding sections. Each of Gagné's (1985) Events of Instruction describes features of the comparison and treatment conditions that were tested during the evaluative phase of the study. The following table illustrates the commonalities and differences between the two versions of the instructional program.
Table 4
Comparison and Treatment Group Differences
Note: In the original table, each of Gagné's Events of Instruction was paired with annotated screenshots of the comparison and treatment modules; the screenshots are omitted here and their captions summarized.

Gain attention
Comparison module: introduces the concepts covered and the four types of Internet search tools the learner will encounter.
Treatment module: offers the same information as the comparison module but leads the learner to an introductory interactive Flash presentation on myths about the Internet. The opening screen provides a context for the learning module with a definition of information literacy, and a right-arrow button takes the learner to a series of Flash screens on misconceptions about the Internet (material derived from the TILT Tutorial). The learner clicks on a bubble, and the program provides animated information about that myth. At the end of the introduction, the program returns the learner to the main menu.

Inform students of the learning objectives, what they will be learning, and its relevance
Comparison module: course objective screens are identical in the comparison and treatment versions of the program.
Treatment module: the treatment menu removes live links from the table of contents; a left-pointing arrow on the objectives screen takes the learner back to the original menu, which contains live links to wherever the learner chooses to go.

Present content so that students learn and recall successfully (provide learner guidance)
Comparison module: both comparison and treatment offer identical introductory information; both contain a table listing live links and descriptions of each tool.
Treatment module: a right arrow takes the learner through three different examples of specialized databases.

Provide opportunities for practice of new skills (guided practice and feedback)
Comparison module: the practice screen provides a series of closed-ended questions with a table of tools the learner uses to find answers through active exploration.
Treatment module: the learner begins with an exercise built from screen captures, input fields, hotspots, and pull-down menus that steps through the exercise sequentially. Visual cues on the screen captures help the learner supply the appropriate information in the input field; if the student fails to provide input, the program prompts for it. The learner receives reinforcement feedback on a correct practice item, and the program continues only when the student presses OK.

Provide students with information on how well they are doing (feedback)
Comparison module: following the overview screens, the comparison program provides practice exercises and, where possible, illustrations of the concept; the learner gets feedback from exploring the live links and answering the practice questions.
Treatment module: corrective and reinforcement feedback is provided throughout the program for each exercise. The first interval quiz provides immediate feedback on a multiple-selection quiz covering virtual libraries and specialized databases.

Provide review and relate new skills to previously learned skills and real-world applications
Treatment module: summary screens follow the exercises and precede the learner's return to the main menu; a nested-search summary comes before the learner is guided through an exercise using nested search statements. The Library Squares Flash game presents real search-problem scenarios, and the learner must agree or disagree with one of the celebrity searchers. Even when a learner chooses the correct response, a visual cue is provided along with reinforcement feedback.

Provide tests, performance checklists, rating scales, attitude scales, or other means of measuring mastery of skills in an authentic setting (what Gagné called enhancing transfer and retention)
Both modules: both instructional groups take a posttest for comprehension and a performance test, and an Academic Motivation Profile adapted to the instruction provides the learner an opportunity to reflect on what was learned.
Treatment module: interval quizzes, along with the Library Squares game, give the learner additional feedback on progress.

Research Design for Final Administration of the Study
Two modules were compared: one with highly structured program control that included features associated with classroom instruction, and one that afforded full learner control and presented text and exercises scanned from the textbook without the benefit of instructor or program guidance and feedback. Dependent variables included posttest scores for comprehension, a performance test in the form of an Internet scavenger hunt, and data from the modified AMP.
The final study used a mixed experimental design: it included a repeated measure, change from pretest to posttest, so students were tested twice, and it was factorial because between-group differences (treatment vs. comparison) were measured on posttest scores for comprehension. Additionally, the researcher analyzed between-group differences on the performance measure (the Internet scavenger hunt) to identify features of instruction that proved more effective for learners when presented with a problem-based task. Finally, an attitudinal survey, the Academic Motivation Profile (Carey, 1994) modified to fit the context of the instruction, was administered after training and after completion of the comprehension and Internet scavenger hunt tests.

Procedure
The instructor informed students of the voluntary nature of the research. Consent forms were distributed and collected prior to introduction of the study and the pretest; each form required both a parent and a student signature, given that the sample comprised students over the age of twelve. Once all the signed consent forms were collected, the researcher distributed the pretest on knowledge of Internet search tools. The pretest was hand scored, and scores were ranked in descending order. Based on the ranked pretest scores, the investigator randomly assigned students within matched pairs to either the treatment or comparison condition. Matched-pair random assignment was used to ensure the equivalence of the groups on prior knowledge as measured by the pretest. The researcher used a six-sided die to randomly assign students within pairs to group one (even number on the die) or group two (odd number). To avoid the confusion that occurred in the December 2003 administration of the course materials, student names and group assignments were printed on individual index cards.
The investigator divided the lab into two sections: students assigned to the treatment module sat on one side of the room, while those assigned to the comparison group sat on the other. Students were called one at a time to take a seat at a prepared workstation within the computer lab. The researcher loaded the modules locally to a public shared drive to avoid problems with Internet access through the county's proxy server. A freeware Internet stopwatch program was downloaded and launched on each station so that students could record their individual time on task. The physical separation of the groups on two sides of the room allowed the researcher and classroom instructor to monitor that each student began the correctly assigned tutorial and accurately recorded the time indicated on the digital stopwatch.
To avoid any confounding influence from the students' language arts teacher or the researcher, and to provide an authentic learning environment, students worked individually at stations in the lab. Although permitted to converse and assist one another while proceeding through the tutorial, students in both groups kept the room almost totally silent as they worked through the materials. The only assistance provided came in response to requests for guidance about where to proceed after finishing an instructional unit. Both groups were told where to click on the vertical menu bar prior to beginning their tutorial. They were also told they were free to take notes or not, with no repercussions either way. The researcher and language arts teacher remained in the room throughout the instructional time, speaking to students only to provide navigational assistance and to supervise proper behavior in the lab.
Students were given the posttest for knowledge of Internet search tools a number of days after their computer lab experience. The knowledge posttest actually assessed the students' retention of the training material rather than comprehension.
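The assignment logic described above (ranking pretest scores, pairing adjacent students, and using a die roll to place one member of each pair in each condition) can be illustrated with a minimal sketch. The student labels and scores below are hypothetical stand-ins, and random.shuffle stands in for the physical die roll.

```python
# Sketch of matched-pair random assignment as described in the Procedure:
# rank students by pretest score, pair adjacent students, then randomly place
# one member of each pair in the treatment group and the other in comparison.
# Names and scores are hypothetical.
import random

pretest_scores = {"student_%02d" % i: random.randint(40, 90) for i in range(1, 42)}

ranked = sorted(pretest_scores, key=pretest_scores.get, reverse=True)
treatment, comparison = [], []

for i in range(0, len(ranked) - 1, 2):
    pair = [ranked[i], ranked[i + 1]]
    random.shuffle(pair)                  # stands in for the six-sided die roll
    treatment.append(pair[0])
    comparison.append(pair[1])

if len(ranked) % 2:                       # an odd count (41 in the final study)
    random.choice((treatment, comparison)).append(ranked[-1])

print(len(treatment), len(comparison))    # e.g., 21 and 20
```

With 41 students, this yields group sizes of 21 and 20, matching the uneven group sizes reported for the final administration.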


A week after training in the computer lab, students returned to the lab, in their assigned groups, for the Internet scavenger hunt. Again, the researcher called students individually, distributed each student's index card, prepared the stopwatch on the desktop, and readied the assigned tutorial before the student sat down to complete the scavenger hunt. Once students were seated at their assigned workstations, the researcher instructed them to open a second window outside the tutorial to perform the searches, click the start button on their timer, and begin the scavenger hunt.
A week later, the AMP was distributed in paper-and-pencil form to students during their language arts period. The researcher asked students to sign their names at the top of the evaluation and emphasized that no answer would be considered incorrect. The media specialist told students to respond to the questions as honestly as possible, that no feelings would be hurt by negative responses, and that their responses would be kept in the strictest confidence.

Data Analysis
The researcher examined gains in learning from the pretest on comprehension to the posttest on the same measure. To mitigate history and order effects, a three-week delay between the pretest and posttest was scheduled, and the order of the questions was altered on the posttest comprehension instrument. A repeated measures ANOVA was used to determine whether significant gains in learning occurred across both groups, and a factorial ANOVA was used to identify any interaction effect between the comparison and treatment conditions on the comprehension score. To determine whether the groups differed in performance scores on the Internet scavenger hunt posttest, a paired t-test was conducted.
To assess differences between groups in student perception of the material, each of the four factors was averaged individually within groups. Means were then compared between groups using a paired t-test to determine whether significant differences emerged between the treatment and comparison groups on any or all factors of the AMP. The researcher also computed a Pearson correlation coefficient to determine whether any statistically significant relationship emerged between perception and achievement on the comprehension or scavenger hunt tests.
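The analysis plan above can be sketched in a few lines of SciPy code. This is a hedged illustration, not the study's actual analysis script: the file name and column names are hypothetical, and with only two testing occasions the within-subjects effect of the repeated measures ANOVA is tested here as a paired t-test on pretest versus posttest, while the group-by-time interaction is tested as a between-group comparison of gain scores, which is equivalent for a two-occasion design.

```python
# Hedged sketch of the analyses described in this section; "study_scores.csv"
# and its columns (group, pretest, posttest, scavenger, amp_total) are
# hypothetical placeholders for the study's data.
import pandas as pd
from scipy import stats

df = pd.read_csv("study_scores.csv")   # one row per student

# Overall pretest-to-posttest change (within-subjects effect)
t_within, p_within = stats.ttest_rel(df["posttest"], df["pretest"])

# Group x time interaction, tested on gain scores
df["gain"] = df["posttest"] - df["pretest"]
treat = df[df["group"] == "treatment"]
comp = df[df["group"] == "comparison"]
t_inter, p_inter = stats.ttest_ind(treat["gain"], comp["gain"])

# Between-group difference on the scavenger hunt; the study paired matched
# students and used a paired t-test, which would require pair-aligned arrays
t_hunt, p_hunt = stats.ttest_ind(treat["scavenger"], comp["scavenger"])

# Relationship between perception (AMP) and achievement
r, p_r = stats.pearsonr(df["amp_total"], df["posttest"])

print(p_within, p_inter, p_hunt, p_r)
```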


Chapter Four
Results

Purpose of the Study
The purpose of the present study is threefold. The first purpose is to examine students' performance on two forms of Internet search skills instruction converted from a textbook for web-based delivery. The second purpose is to examine the effects on students' academic motivation of two forms of web-based instruction that afford higher or lower levels of learner control. The third purpose is to document the design process used to convert textbook material to web-based instruction.
Typical learner-centered approaches to converting a textbook for web delivery begin with scanned text converted to code readable as web pages. Depending on the textbook content, the instructor may insert live links and practice exercises following the content presentation or simply present the content via web pages. In the current study, practice exercises were included in the text derived from Frederick and Smith's (2000) Introduction to Library and Internet Research Skills course. Learner-centered designs rely heavily on the self-regulatory skills, motivation, and work habits of the learner. In this case, high-ability students were used as the sample population so that design differences due to instructional strategies could be compared; the researcher did not wish to study the interaction between learner characteristics and design features.
A second textbook conversion, based on cognitive principles associated with Gagné's (1985) Events of Instruction, included features associated with classroom presentation.
These features included (1) motivational material to gain the learner's attention, (2) navigational controls within sub-topics so that students were required to complete practice exercises following content overviews, (3) summary material, (4) immediate program-controlled corrective and reinforcement feedback, (5) interval review quizzes to enhance transfer of information, and (6) a Library Squares game with immediate corrective and reinforcement feedback to enhance retention and transfer of information.
Dependent measures for the experiment included scores on a posttest for knowledge of Internet search tools and scores on an Internet scavenger hunt. The independent variable was the form of tutorial used for Internet search training: the first an amended version that focused on features associated with the cognitive model described above, the second a learner-centered approach that simply converted textbook materials and exercises from the Frederick and Smith (2000) library and Internet research skills course.

Research Question One
What effect do two online instructional design strategies for Internet training, characterized by their content-centeredness or learner-centeredness, have on student performance measures? Content-centered features associated with classroom instruction include gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes that inform learners of their progress, and summary screens that relate new content to previously learned material.
Results from the study revealed that neither tutorial yielded significantly higher performance on the posttest measure for comprehension. A repeated measures ANOVA indicated that both groups improved significantly from pretest to posttest, F(1, 40) = 40.233, p < .001. The researcher computed an effect size as the difference between pretest and posttest means across both groups divided by the standard deviation of the pretest scores.
The resulting effect size was .96, which is considered substantial. The data revealed no advantage in performance gains for either group, and no significant interaction effect emerged between the groups, as can be seen in the tables below. Across both groups, scores increased from 58.9732 points on the pretest to 72.6337 on the posttest, an increase of about 13.7 points. Group one (treatment) increased from a pretest mean of 58.2595 to a posttest mean of 74.0314 (n = 21), a slightly larger gain than group two (comparison), which scored 59.7225 on the pretest and 71.1660 on the posttest (n = 20). The repeated measures ANOVA found that the interaction effect between the groups was not statistically significant, F(1, 40) = 1.018, p = .319.
Mean score differences between groups on the Internet scavenger hunt also proved negligible. The mean score for group one (treatment) was 9.2381, compared with 9.2500 for group two (comparison); the average Internet hunt score across both groups was 9.2439 out of a possible ten points. Given that both groups had access to their respective tutorials, results on time to complete the scavenger hunt revealed similarly small differences: mean time to complete the activity was 0:27:27 for group one (n = 21) and 0:26:02 for group two (n = 20).
There were no corroborating observational data indicating that students found answers to the scavenger hunt as a result of using their respective tutorials. The investigator noted some students using Boolean operators, moving between the tutorial and their Internet tool, and adding quotation marks around phrases; however, no conclusions may be drawn from these observations without triangulation.
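For readers who wish to see the calculation written out, the effect size described above is simply the mean gain divided by the pretest standard deviation (a Glass's-delta-style index). The short sketch below uses hypothetical arrays rather than the study's data, so it will not reproduce the reported value.

```python
# Minimal sketch of the effect size calculation described above: mean
# pretest-to-posttest gain divided by the standard deviation of the pretest
# scores. The arrays are hypothetical stand-ins for the actual data.
import numpy as np

pre = np.array([55, 60, 48, 72, 65], dtype=float)    # hypothetical pretest scores
post = np.array([70, 74, 63, 80, 77], dtype=float)   # hypothetical posttest scores

effect_size = (post.mean() - pre.mean()) / pre.std(ddof=1)
print(round(effect_size, 2))
```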


Table 5
Repeated Measures ANOVA for Pretest and Posttest, Groups One and Two

Tests Within Subjects                 Type III Sum of Squares   DF   Mean Square   F-Value   Significance
Factor I: Pre and Posttest Gains      3,793.718                 1    3,793.718     40.233    0.000
Pretest-Posttest Gains x Group        95.960                    1    95.960        1.018     0.319

Tests Between Subjects
Group                                 10.074                    1    10.074        0.041     0.842

Note: N = 41 and p-value set at < .05. Group One (Treatment) n = 21 and Group Two (Comparison) n = 20.
Table 6
Descriptive Statistics on Pretest, Posttest, Time on Task, Scavenger Hunt, and Time to Complete Scavenger Hunt

Group                     Measure     Mean      N    St. Dev.
Group 1 (Treatment)       Pretest     58.2595   21   12.12557
                          Posttest    74.0314   21   14.18675
                          Timetask    0:49:30   21   0:18:06
                          Scavenger   9.2381    21   1.26114
                          Timehunt    0:27:27   21   0:07:35
Group 2 (Comparison)      Pretest     59.7225   20   11.30883
                          Posttest    71.1660   20   14.46689
                          Timetask    0:58:12   20   0:20:02
                          Scavenger   9.2500    20   1.01955
                          Timehunt    0:26:02   20   0:07:35
Total (both groups, N=41) Pretest     58.9732   41   11.61079
                          Posttest    72.6337   41   14.21787
                          Timetask    0:53:45   41   0:19:20
                          Scavenger   9.2439    41   1.13535
                          Timehunt    0:26:46   41   0:07:31
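As a quick consistency check on the pooled row of Table 6 (simple arithmetic on the reported group values, not a new analysis), the total pretest and posttest means equal the sample-size-weighted averages of the two group means:

\[
\bar{X}_{\text{pretest}} = \frac{21(58.2595) + 20(59.7225)}{41} \approx 58.97,
\qquad
\bar{X}_{\text{posttest}} = \frac{21(74.0314) + 20(71.1660)}{41} \approx 72.63.
\]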


Table 7
Descriptive Statistics for Groups 1 and 2 on the Scavenger Hunt

Data (paired)             Mean     N    Standard Deviation   Standard Error of Mean
Scavenger Hunt Group 1    9.2000   20   1.28145              0.28654
Scavenger Hunt Group 2    9.2500   20   1.01955              0.22798

Table 8
Paired T-Test Comparing Group 1 and Group 2 on the Scavenger Hunt

Paired Groups                                     Mean      Standard Deviation   Standard Error of Mean   T-statistic   DF   Significance (2-tailed)
Scavenger Hunt Group 1 - Scavenger Hunt Group 2   -0.0500   1.76143              0.39387                  -0.127        19   0.900

Note: p-value set at .05 (95% confidence interval). N = 21 in group one, but because the statistic is paired, one student's score without a matching pair was excluded listwise.

Self-Reported Time on Instruction

A potential threat to the validity of the study was time on task. Because the treatment module guided students through exercises and provided interval quizzes and
summary screens, time on task could potentially have been longer than in the comparison condition. Time estimates were recorded to confirm that both groups spent approximately equivalent learning time on task. The reader should note that time calculations were estimated using digital stopwatches placed on students' desktops. Students started their stopwatches after receiving directions at the beginning of the instruction phase and clicked stop on the digital timer at the end of the training session. Students recorded their times, measured in minutes and seconds, on the index cards used for assignment to a group and workstation. The statistics for time on task and time to complete the scavenger hunt are therefore rough estimates. Using a paired T-test, the difference in time was minimal, with an average for group one of 49.5 minutes compared to group two, whose mean time was approximately 58 minutes (t(19) = 1.254, p = .225). The range of time for group one was between 45 minutes and an hour and 21 minutes; group two's range to complete the instruction was between 35 minutes and an hour and 41 minutes.

Students appeared highly self-directed in both conditions. One or two students assigned to the comparison group asked if they were required to perform the exercises from the scanned textbook, and the instructor suggested that they follow the instructions on the exercise page of their tutorial. Both the students' language arts instructor and the researcher noted that all students in both module conditions complied with the instructions and completed the exercises. To avoid undue researcher influence, the researcher did not interact with students other than to explain how to start and stop the digital stopwatches, how to open multiple windows while performing the scavenger hunt, and to prepare each station to ensure that the methodology was followed throughout the research project.
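A brief sketch of the time-on-task comparison follows, assuming the stopwatch readings were transcribed as h:mm:ss or mm:ss strings as described above. The pair values below are illustrative placeholders (the study had 20 usable matched pairs), so only the procedure, not the reported t(19) = 1.254, is reproduced.

```python
# Sketch: convert stopwatch readings to minutes and compare matched pairs.
from scipy import stats

def to_minutes(stamp: str) -> float:
    """Convert a stopwatch reading such as '0:49:30' (h:mm:ss) or '49:30' (mm:ss) to minutes."""
    parts = [int(p) for p in stamp.split(":")]
    hours, minutes, seconds = [0] * (3 - len(parts)) + parts
    return hours * 60 + minutes + seconds / 60

# Hypothetical matched pairs (treatment time, comparison time).
treatment = [to_minutes(t) for t in ["0:45:00", "0:52:30", "1:01:15", "0:47:45", "0:50:10"]]
comparison = [to_minutes(t) for t in ["0:55:10", "0:58:40", "1:10:05", "0:50:20", "0:57:30"]]

t_stat, p_value = stats.ttest_rel(treatment, comparison)
mean_diff = sum(a - b for a, b in zip(treatment, comparison)) / len(treatment)
print(f"mean difference = {mean_diff:.1f} min, t({len(treatment) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```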


Research Question Two

How do students' perceptions based on self-reports differ between two instructional strategies, characterized by their content-centeredness or learner-centeredness, on attention, relevance, confidence, and satisfaction? To answer this research question, students completed a modified Academic Motivation Profile (Carey, 1994) instrument that encompassed four interrelated factors: (1) attention, (2) relevance, (3) confidence, and (4) satisfaction. Each question within the factor groupings was presented as a statement to which the student responded on a four-point scale.

To analyze these data, each student's responses were categorized according to factor and an average score was computed for each group within each factor. The researcher analyzed the data with a Windows version of SPSS (version 11.0). Data were entered into SPSS within assigned pairs for each of the two class periods engaged in the study. The researcher chose a paired T-test because students were assigned to groups in matched pairs to ensure equality of group means. Due to uneven cells (originally N = 41), SPSS automatically eliminated the data point without a paired counterpart from the paired T-test analysis. The elimination of this data point reduced the sample to N = 40, with 20 students in each of groups one and two. To compare responses between groups for each of the factors, the researcher performed a paired T-test (paired means) with α = .05. None of the data revealed significant differences between groups.
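The pipeline just described, averaging items within each factor, aligning matched pairs, dropping the unmatched student listwise, and running a paired test per factor, can be sketched as follows. The item counts, column names, and simulated responses are the editor's assumptions for illustration; the study performed the equivalent steps in SPSS.

```python
# Sketch of the AMP factor analysis with made-up item responses on a 1-4 scale.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
factors = ["attention", "relevance", "confidence", "satisfaction"]

def simulate_group(n_students: int) -> pd.DataFrame:
    """One row per student; each factor score is the mean of five hypothetical items."""
    return pd.DataFrame({f: rng.integers(1, 5, size=(n_students, 5)).mean(axis=1)
                         for f in factors})

group1 = simulate_group(21)   # treatment
group2 = simulate_group(20)   # comparison

# Align on pair index and drop any pair with a missing member; this mirrors the
# listwise deletion that left N = 40 (20 complete pairs) in the study.
pairs = group1.join(group2, lsuffix="_g1", rsuffix="_g2").dropna()

for f in factors:
    t, p = stats.ttest_rel(pairs[f + "_g1"], pairs[f + "_g2"])
    print(f"{f:<12} t({len(pairs) - 1}) = {t:5.2f}, p = {p:.3f}")
```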


Table 9
Descriptive Statistics for Groups One and Two Assigned in Matched Pairs

Pair     Factor                  Mean     N    Std. Deviation   Std. Error Mean
Pair 1   Attention Group 1       2.5480   20   .64140           .14342
         Attention Group 2       2.1820   20   .63331           .14161
Pair 2   Relevance Group 1       2.5570   20   .65785           .14710
         Relevance Group 2       2.7400   20   .70068           .15668
Pair 3   Confidence Group 1      2.8135   20   .64541           .14432
         Confidence Group 2      2.6650   20   .60478           .13523
Pair 4   Satisfaction Group 1    2.6325   20   .73142           .16355
         Satisfaction Group 2    2.6160   20   .76651           .17140
Table 10
Paired T-Test on the AMP

                                      Paired Differences                  95% Confidence Interval
Pair and Factor                       Mean      Std. Dev.   Std. Error    Lower     Upper      t        df   Sig. (2-tailed)
Pair 1  Attention Groups 1 and 2       .3660     .95254      .21300       -.0798     .8118      1.718    19   .102
Pair 2  Relevance Groups 1 and 2      -.1830     .82097      .18357       -.5672     .2012      -.997    19   .331
Pair 3  Confidence Groups 1 and 2      .1485     .76757      .17163       -.2107     .5077       .865    19   .398
Pair 4  Satisfaction Groups 1 and 2    .0165     .94177      .21059       -.4243     .4573       .078    19   .938

Note: p-value set at p < .05

To further examine the strength of the relationship between motivation and achievement, the researcher computed Pearson correlation coefficients between each of the factors (across both groups) and scavenger hunt scores and posttest
scores. Results appear in the table below.

Table 11
Correlations Between AMP Factors and Achievement (Pearson r, with 2-tailed significance in parentheses)

              Posttest        Scavenger       Attention        Relevance        Confidence       Satisfaction
Posttest      1               .016 (.920)     .124 (.446)      .196 (.226)      .233 (.148)      -.090 (.579)
Scavenger     .016 (.920)     1               -.029 (.858)     .021 (.898)      .057 (.726)      -.070 (.670)
Attention     .124 (.446)     -.029 (.858)    1                .736** (.000)    .605** (.000)    .673** (.000)
Relevance     .196 (.226)     .021 (.898)     .736** (.000)    1                .535** (.000)    .657** (.000)
Confidence    .233 (.148)     .057 (.726)     .605** (.000)    .535** (.000)    1                .526** (.000)
Satisfaction  -.090 (.579)    -.070 (.670)    .673** (.000)    .657** (.000)    .526** (.000)    1

** Correlation is significant at the 0.01 level (2-tailed). N = 41 for the correlation between the two achievement measures; N = 40 for all correlations involving AMP factors.

The table above shows no significant correlation between the posttest scores and performance on the scavenger hunt, indicating that these two performance measures targeted
different skill sets. As expected, significant relationships emerged between the factors of attention, relevance, confidence, and satisfaction. Significant inter-correlations between all four sub-scores on the AMP could be an indicator of response generalization across all factors of the AMP instrument. The lack of significant relationships between these affective measures and the performance measures raises the question of how affect and performance relate for high ability students compared to those less able to self-regulate and follow instructions.

Research Question Three

Is the additional time and effort needed to include the treatment module features found in classroom instruction (gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes, and summary screens) efficacious given the performance and perception results of this study? The process of creating an online module that includes these features is assessed to determine its efficacy. The researcher incurred time and monetary expenses for the features included in the treatment module that far exceeded those of the comparison condition. Design considerations include the following:

1. Availability of software, including an HTML editor, screen capture software, and Flash software (MacroMedia)
2. Permissions granted to reuse animated graphics or Flash software from another source (in the present case, from the TILT developers)
3. Technical abilities of the designer, or access to a programmer for the JavaScript in the treatment module; no scripting was necessary in the comparison condition
4. Space considerations for upload of the software; 27.3 MB were used for the treatment module and 13.1 MB for the comparison module
An examination of the performance and AMP results suggests that the cost outweighed the benefits with the sample chosen for this research.

Table 12
Time and Resource Costs for the Comparison and Treatment Modules

Treatment (Content-Centered)
  Software/hardware used to create the condition, with the personnel or technical requirements needed:
  HTML editing software
    1. JavaScript programmer to provide feedback on learner input fields
    2. Researcher familiar with basic HTML and how to construct tables for web delivery and the navigational interface
    3. OCR software along with a flat-bed scanner to get text into a format easily imported into an HTML editor
  Flash software
    1. Practice Brainstorm exercise derived from the TILT tutorial; modification completed with a programmer
    2. Graphic artist and programmer used to modify the Library Squares game to fit the content of the module
  Screen capture software for practice-exercise simulations
    Researcher used Snag It software from TechSmith

Comparison (Learner-Centered)
  Software/hardware used to create the condition: an HTML editor only; no Flash programming or modifications to existing material from TILT, no need for JavaScript programming, only use of HTML tags.
  Personnel or technical requirements: no programming assistance needed, as the researcher used extant animated graphics from TILT; no modifications of Flash; no programming needed to control sequence or navigation.

Summary of Results

The data indicate there were no apparent differences between posttest scores in the treatment and comparison modules; however, features common to both modules proved effective. A main effect resulted, with an increase in scores from pretest to posttest, F(1, 39) = 40.233, p < .05. Perhaps more importantly, students demonstrated both proficiency and efficiency in search skills overall, as demonstrated by the Internet scavenger hunt: mean scores on the hunt (N = 41) were X = 9.2439 out of ten points.

Examination of the second question, which focused on student perceptions based on Keller's (1992) ARCS model, revealed no significant differences between groups on any of the AMP's four factors. Using a paired T-test with α = .05 to determine whether significant differences emerged on attention, relevance, confidence, and satisfaction, none proved statistically significant within the two paired groups. The paired t-tests showed no statistical difference on attention (T = 1.718, p = .102), relevance (T = -.997, p = .331), confidence (T = .865, p = .398), or satisfaction (T = .078, p = .938). Correlations between the AMP factors and achievement proved nonsignificant, but the correlations among the factors of attention, relevance, confidence, and satisfaction were significant at p < .001.

The third research focus was on the efficacy of expending the increased effort and time to create the treatment WBI module. Based on the performance and perceptual scores, the results suggest that the value of expending the resources needed to include the additional features of the treatment condition may be called into question. Given that the sample chosen for this research comprised high ability students who demonstrate characteristics such as self-initiation, internal self-regulation, focus, and the ability to monitor choice of instructional strategy, the ability to generalize findings from the data to a less able group remains to be seen. Further, there was variance in student responses based on interest
in the subject matter, regardless of assignment to any one instructional condition. From the performance scores and perceptual feedback alone, it appears that one should focus on the features common to both conditions, given that there were no significant differences between the two conditions on any of the performance or perceptual measures. Thus, upon first inspection, one could conclude that addition of the aforementioned features may prove unnecessary to obtain similar results with a population of high ability students.

Chapter Five

Discussion

Introduction

The researcher assessed the effectiveness of two online Internet training modules, one that used instructional strategies associated with learner-centeredness, the other based on content-centered instruction. Previous studies found that learners of high ability are able to self-initiate and self-regulate when provided with learner-centered instruction (Schnackenberg and Sullivan, 1998; Chung and Reigeluth, 1992; Steinberg, 1989). The investigator revisited the question of learner-centered versus content-centered instructional features and their effect on the performance and perceptual measures of middle school students identified as high ability.

Purpose of the Study

The purpose of the present study is threefold. The first purpose is to examine students' performance on two forms of Internet search skills instruction converted from a textbook for web-based delivery. The second purpose is to examine the effects on students' academic motivation of two forms of web-based instruction that afford higher or lower levels of learner control. The third purpose is to document the design process used to convert textbook material to web-based instruction.

Regarding the first purpose, when textbook content is converted to HTML format for web delivery, two schools of educational philosophy must be addressed. One
school of thought focuses on content-centered delivery, the other on learner-centered delivery. Specifically, this study focused on the effects that content-centered versus learner-centered design has on performance scores for high ability learners.

The quality of web-delivered material depends on instructors' abilities to manage these resources, the receptivity of the target audience to the instructional content, and the quality of the textbook content. The researcher posed the question of whether high ability learners would perform differently when assigned to a content-centered or a learner-centered module. Features associated with a content-centered approach included greater program control, immediate feedback and guidance, review, and intermittent quizzes.

In reference to the second purpose, the researcher wished to ascertain whether student preference differed between the two online modules (Keller, 1987). Student responses were analyzed using a modified Academic Motivation Profile (Carey, 1994). The researcher analyzed whether high ability students favored an online module that included classroom instructional features over a skeletal version of the textbook material followed by practice exercises. Results proved inconclusive. Verbal feedback from the students indicated that the amended version of the material sustained student attention and increased confidence; statistical analysis, however, did not support this finding.

The third purpose addressed the cost-benefit analysis for development of online instruction with features associated with classroom instruction. These features included gaining attention, guided practice, corrective and reinforcement feedback, embedded quizzes, and summary screens. Design strategies used to develop the treatment module were discussed in the Methods chapter. The researcher paid careful attention to test development to ensure content validity and to ensure that course objectives corresponded with test items. One-on-one trials and small and large group pilots necessitated revisions throughout the design process.
Statistical analysis was performed to determine the reliability of the test items on the comprehension pre- and post-test. A final discussion concerning the efficacy of the time and professional resources necessary for development of the treatment module follows in this chapter.

Implications of the Results

No apparent statistical advantage was seen in favor of either of the two strategies, learner-centered or content-centered instruction. Both groups, treatment and comparison, did equally well on the comprehension and performance measures. Results from the current study support Schnackenberg and Sullivan's (1998) research, in which those of high ability performed better than those of lower ability when provided either a lean or a full version of the software. In contrast to Schnackenberg's research, there was no difference in performance or attitude between the learner-centered (lean design) and the content-centered (full version) software for high ability students. Perhaps content familiarity, students' sense of the relevance of the material, and the general novelty effect of online instruction influenced the performance outcomes exhibited in the present study. While the research does shed light on the efficacy of learner-centered online materials for those of high ability, it cannot be assumed these results will be obtained with average or low ability learners.

The researcher found that when textbook content is converted for WBI, issues arise associated with learner characteristics, perceptions, and receptivity to instructional material. Apart from the absence of a statistically significant difference on the factors of the modified AMP in favor of either learner-centered or content-centered instruction, anecdotal evidence suggested that the AMP may not be sensitive enough to detect middle school students' responses to the instruction. The relationship between attention, relevance, confidence,
satisfaction, and achievement may be of greater importance for students with less developed metacognitive and self-regulatory abilities than those selected for the present study. Performance and affective feedback indicate that the additional time and cost to produce the content-centered module may not produce sufficient return.

Research Question One

What effect do two online instructional design strategies for Internet training, characterized by their content-centeredness or learner-centeredness, have on student performance measures?

This study found no difference between students in the treatment condition and those assigned to the comparison group on either the retention posttest or the scavenger hunt performance test. On first inspection, the results appeared to confirm the notion that a "lean-plus" design, as defined by Alessi and Trollip (2001), proved sufficient. A "lean-plus" design is exemplified by the learner-centered module: students had opportunities to explore features of a non-linear computer-based tutorial without being compelled by the WBI program to complete interactive exercises. The addition of review quizzes and a simulation game in which the student received feedback on her/his mastery of the material did not substantially benefit performance outcomes on the posttest.

The literature indicated that students of high ability may not require program control or directly prescribed strategies in practice exercises in order to acquire the instructional content. Learner characteristics may play a role in negating the treatment interaction effect of content-centered instruction. The degree of self-regulation and initiative of these students may enable them to engage in less structured instructional approaches. Schnackenberg and Sullivan's (1998) research found that when high ability students control the
pace, sequence, and navigation within an instructional program, they are able to evaluate their instructional needs and devise effective learning strategies.

Meyer (2003) contended that success within non-linear web-based instruction depends on an individual's self-regulatory abilities. Individual differences in level of self-regulation across both groups may account for the "no significant difference" outcome between the two groups. Meyer asserted that a student's prior knowledge and learning style, on a continuum from initiative to passivity, may determine the success or failure of web-based learning. Chances are good that students will perform successfully if they have: (1) high motivation toward the subject matter, (2) greater self-regulating learning behaviors, and (3) the belief that they will learn in an online environment. Students who depend on external learning conditions and who do not possess the ability to self-regulate may not perform as well in online environments. The sample of high ability middle school students confirmed Meyer's conclusions, as evidenced by the test outcomes across both groups.

The effect of features associated with classroom interactions, or content-centered instruction, failed to produce statistical differences between the treatment and comparison groups. Results from the present research required the investigator to identify the successful features common to both tutorials. Both groups had a vertical menu structure set up in tables, with topics lined up on one side of the screen and content appearing on the other. Both groups had visual prompts (arrows) to inform students where they were within the module. If students moved vertically from the top to the bottom of the table of contents, they moved logically through the topics. In most cases the researcher observed the students moving sequentially through the material using the vertical menu bar.

Graphics in both conditions were used to illustrate principles presented in the text. For example, when the students studied general subject directories in both conditions, a
link to a graphic of an inverted triangle illustrated how the directory was organized from broad topics to specific ones within a subject area. Venn diagrams were used in both conditions to illustrate Boolean operators, along with a pop-up screen explaining the origin and definition of the term "Boolean." Flash applets were used sparingly on overview screens in both conditions in order to sustain attention and illustrate learning principles within the text.

Both comparison and treatment modules included instruction on research strategies and the refinement of research questions, taken with permission from the University of Texas' TILT Tutorial. Both modules presented an outline of how one refines a research question, brainstorms subject categories, selects keywords, and uses quotation marks around phrases and wildcards for word variations. These strategies were presented as a technique before the introduction of any particular Internet search tool.

Overview screens for the comparison and treatment groups contained tables with descriptions, comparisons, and live links to various categories of Internet search tools. Students in both conditions took advantage of the opportunity to explore links. Exercises followed the overview in both conditions, and the exercises included in the treatment condition were derived from those in the textbook. With the exception of using screen captures rather than open-ended practice with the material, treatment condition exercises guided students and provided corrective and reinforcement feedback with examples found in the textbook. Students assigned to the comparison group experimented with the exercises and derived feedback as a natural consequence of their exploration; because the questions were closed-ended, students could conclude whether their strategy had proved successful.

Narrative screens provided clear definitions readily understandable to high ability students. Embedded glossary hypertext links were included in overview screens for both conditions. The tone of the narratives in both conditions was conversational. Concepts in
virtual libraries and general subject directories, for example, made reference to each other so the student could see the similarities and differences between various Internet search tools.

The scavenger hunt performance measure proved effective with students assigned to both instructional conditions. When the researcher conducted a small group pilot with students who did not have the benefit of the online modules or any formal prior Internet search training, participants reported that the activity was enjoyable but frustrating without training. When students were given access to their respective tutorial as a reference source, they successfully applied principles from the instructional materials to a performance task.

Research Question Two

How do students' perceptions based on self-reports differ on attention, relevance, confidence, and satisfaction between two instructional strategies characterized by their content-centeredness or learner-centeredness?

Results revealed no significant differences between groups on the factors of attention, relevance, confidence, or satisfaction. Though no statistical differences were found on paired T-tests for each of the four factors measured in the modified AMP, there is anecdotal evidence that the treatment program may have been preferable to the comparison module. Carey's (1994) AMP was originally designed for use with college students: "Typical means for undergraduate students on the four scales of the AMP are in the range from slightly above to slightly below 4.0 on a five-point scale after a full semester course experience" (personal interview with L. Carey, 3/22/05).

The investigator modified the AMP to fit the context and content of the current study. Given discrepancies between students' verbal responses to the instruction and their written responses on the AMP, the researcher questioned the sensitivity of the AMP for the middle school audience. Though
cognitively the students understood the verbiage of the instrument, conversational feedback indicated the students enjoyed their training experiences. Observational data and discussion with the students may not have corresponded to their written responses on the AMP instrument. A second factor, a delay of more than a week between the training and performance tests and the administration of the attitudinal instrument, may have posed a threat to the validity of the students' responses.

Upon replication of the study, the researcher suggests that administration of a perceptual instrument take place immediately following instruction. Only after the researcher has presented the questionnaire and dialoged with students would the instrument be given absent the presence of the researcher.

The design of the study was experimental, using matched-pair assignment of students to either the comparison or the treatment group. Mean scores on the four factors indicated that, on a four-point scale, students' mean attention score was X = 2.39, relevance X = 2.67, confidence X = 2.67, and satisfaction X = 2.65. These mean responses suggest students rated themselves somewhere between slightly and moderately interested in the material. Both the researcher and the students' language arts teacher noted that students appeared fully engaged in their work, very little talking took place, and students appeared to carefully "read" the software screens. Observational indicators suggest the AMP, or the timing of its administration, may not have yielded accurate data.

A statistical issue related to power and effect size remains, as only two classes of students of very high ability were studied in total. Calculation of statistical power and sample size is somewhat difficult for paired t-tests. Post hoc calculations were made via an online calculator for effect and sample size (Uitenbroek, 2005). For an alpha of .05, a minimum of
30 students per group were needed to detect an effect size of .6 with a two-sided test. There is a potential error in the calculation of effect size and power for correlated designs: because the "paired t-test value takes into account the correlation between the two scores the paired t-test will be larger than a between groups t-test" (Becker, 2000). In the present study, the effect size was calculated from the mean difference, Cohen's d = (M1 - M2) / spooled, where spooled = √[(s1² + s2²) / 2], resulting in an inflated power estimate. Using the online calculator, Cohen's d for the attention factor between groups one and two was 0.574. On the factor of relevance, the second group rated their experience higher than the first, resulting in a negative d = -0.269. For confidence, d = 0.237, in favor of group one (treatment). For satisfaction, the analysis resulted in d = 0.0223. The researcher suggests that larger numbers are needed to assess between-group differences on the four factors of attention, relevance, confidence, and satisfaction in a replication of the study.

The timeframe of the research posed a possible threat to external validity. The research was conducted over a two-week interval. Students had recently completed standardized testing, schedules had been modified to accommodate five consecutive days of testing, and students may have been anticipating end-of-year finals while they participated in the study. The time of year, the last three weeks of school prior to finals and summer break, may have affected the students' full participation with the online instructional modules.

Research Question Three

Is the additional time and effort needed to include the treatment module features found in classroom instruction (gaining attention, guided practice, corrective and
reinforcement feedback, embedded quizzes, and summary screens) efficacious given the performance and perception results of this study?

Central to the question of how one converts textbook information literacy units to web-based instruction is a cost-benefit analysis of the process of creating hypertext learning environments. The researcher incurred expense in both time and money, lacking the JavaScript skills to create the feedback and reinforcement for the treatment condition. A graphic designer and a programmer were hired by the researcher to adapt content for the Library Squares game from the original TILT Tutorial; JavaScript programming was needed to design the guided practice and feedback simulations. Screen capture software was used to replicate the simulated exercises in the textbook in order to guide the students through the learning activities. The cost, measured in time and money, would be higher for media specialists or teachers without HTML or graphics skills.

A systematic design process was used for development of both the treatment and comparison conditions. Preparation of the web-based format in the comparison condition included an analysis of skills, decisions regarding the entry-level skills required for success within the course, clearly written objective statements in terms of post-instruction performance expectations, construction of criterion-based tests, choice of instructional strategy, sequencing and presentation of the material (development and selection of instructional materials), formative trials with the software, revision of the material, and summative evaluation of the instruction (Dick, Carey, & Carey, 2001). The major difference between the two conditions resided in the choice of instructional strategy. The treatment module contained program-controlled sequences for guided practice and feedback, program-controlled navigation within each of the sub-topics, motivational material at the start of the module, embedded quizzes for review and feedback, and a simulated Library Squares game to provide assistance with
retention and transfer of learning. The treatment module required additional graphics (screen captures), programming to provide feedback during guided practice exercises, MacroMedia Flash programming for the simulated Library Squares game and the introduction to the unit, and JavaScript for feedback loops.

In order to assess whether participants gained skills post-instruction, much effort was placed in formative testing of the criterion-based testing instrument. This phase of the development process required careful analysis of objectives, examination of the instruction to ensure that learners were exposed to material that corresponded to the stated objectives, and construction of test items that corresponded to the performance objectives. Fortunately, much of this work was derived from previous iterations of the course from the authors and instructors at the University of South Florida.

Additional effort and time were required to conduct one-on-one trials and interviews to determine the clarity of the questions for both the comprehension and scavenger hunt tests. Modifications were made to test items, and subsequent delivery of the online modules and tests allowed the researcher to conduct statistical analyses of the internal reliability of the instruments. The following steps were followed and proved successful for both conditions.

1. Gagné's (1985) Events of Instruction were used to determine differences in instructional strategy and to distinguish between the treatment and comparison conditions.
2. Clear statements of learning objectives and correspondence of objectives to narrative material, exercise examples, and test items were developed.
3. Narrative screens and presentation of content (from scanned textbook material) were featured, and identical graphical organizers taken from the TILT Tutorial were used in both conditions.
4. Text content was edited to convey an informal, conversational tone.
5. The menu structure was revised from a frames-based design originally adapted from the University of Texas' TILT (1997-2004) website to a table format to avoid confusion and information overload for end-users. The vertical menu format within a table proved easier for students to navigate, and revisions were based on observations and direct feedback during formative trials.
6. One-on-one trials were conducted with the scavenger hunt, and qualitative "talk aloud" feedback was received. This assisted the researcher in determining whether students were able to perform searches without the benefit of instruction. Large trials of the scavenger hunt were conducted with undergraduate groups to determine the clarity of the multiple-choice questions, and statistical analyses were conducted to determine the internal reliability of this instrument.

The researcher concluded that the cost necessary to include the features of the treatment condition may not be warranted for this population sample. Replication of the research is recommended to determine whether comparable results with larger numbers support these findings.

Limitations of Instructional Delivery and the Online Learning Literature

In most circumstances the Internet modules would not be used as stand-alone material. Teachers would most likely present the material to a group and guide students throughout the instruction. Students would have opportunities for discussion, questions,
peer-to-peer interaction, and the like. For the present research, modifications of the online units were made so that the converted textbook material could be used as a stand-alone online resource. A look at the distance learning literature may explain the performance outcomes between groups.

A consensus has been established regarding the benefits and difficulties encountered within web-based learning environments compared to traditional face-to-face classroom settings. Russell (1999) coined the phrase "no significant difference" between distance learning and classroom instruction; however, Russell's studies focused on high achievers enrolled in classroom and distance learning venues.

Three recurrent themes emerge in the literature that influence the success or failure of student performance within web-based learning environments. They are: (1) inclusion of peer interaction and cooperative learning opportunities within the design of online courseware, (2) awareness of the benefits and difficulties inherent in online environments, and (3) awareness of affective factors that will enhance or diminish student performance (Perez-Prado & Thirunarayanan, 2002).

The current study did not account for the first of these factors, peer interaction and cooperative learning opportunities; the study focused on the design of the instructional materials and did not control for peer interaction. The in-person delivery of the material in a computer lab neither encouraged nor discouraged peer interaction. Each student was assigned a station, and while students sat at adjacent workstations (within assigned groups), there was little conversation or cooperative interaction between them. When students had a question, they raised their hands for assistance, and the instructor or researcher responded individually to questions concerning navigation or use of the digital stopwatch.

Regarding awareness of the benefits and difficulties inherent to online learning, an
obvious benefit is that online information literacy instruction can be distributed to large numbers of students in a school setting. A disadvantage of delivering the material absent an instructor is the risk of losing student engagement due to the absence of "instructor presence." Studies reveal that student satisfaction and participation are weakened when learners do not perceive the instructor or fellow students as "real" in distance learning classrooms (Sherry et al., 1998; Gunawardena & Zittle, 1997).

Factors that can enhance or diminish student affect toward instruction include: (1) difficulties due to lack of technical support, (2) increased demand on faculty members' time for implementation, and (3) higher demands for responsibility and self-initiative on the part of the learner. A qualitative study by Perez-Prado and Thirunarayanan (2002) found that students worried about their abilities to complete an online educational methods course independent of an instructor. Kling and Hara (2000) found that students relied on self-judgment to assess the meaning of educational interactions, contributing to some learners' anxiety in distance learning courses. The presence and quality of learner-to-learner and learner-to-instructor interaction can either alleviate or exacerbate learner confusion and anxiety (Gunawardena & Zittle, 1997).

The aforementioned difficulties were not apparent during administration of the instruction for the present study. Students appeared thoroughly engaged in both conditions and were afforded the opportunity to ask questions of the instructor and researcher. The design of the modules led to few navigational difficulties for either group, comparison or treatment. The students demonstrated internal self-regulation and derived benefit from the instruction without the need for instructor presence. When debriefed and asked by the researcher whether anyone had experienced confusion about how to approach the material or use the tools in the tutorial to perform the scavenger hunt, all responded that the graphical cues and menu
format was easy to follow. Due to the characteristics of high ability learners, the same results may not be obtained upon replication of the study with students of lower ability or self-regulation.

Conclusion

The reader should note that even under less than ideal conditions, the research demonstrated that a systematic approach to converting a textbook information literacy unit to the web proved effective. Student performance on a retention test and an applied search performance task demonstrated acquisition of knowledge and skills following instruction equally for both content-centered and learner-centered approaches to WBI. The study documents how existing material is converted for online instruction, how tests are constructed to measure course objectives, and how the educator has a responsibility to demonstrate the effectiveness of instruction. Mean statistics on the AMP between groups revealed that students responded equally to both content-centered and learner-centered instructional designs. Results demonstrated that learners grew in understanding of the material and were able to apply skills attained from the instruction.

Summary of Limitations

The researcher chose a pretest-treatment-posttest experimental design to measure gains in learning post-instruction. To avoid sampling bias and ensure equivalency of groups for the comparison and treatment conditions, students were matched in pairs and, within pairs, assigned randomly to one of the two conditions, treatment or comparison.

Threats to internal validity for the pretest-treatment-posttest design included the following: (1) history, or repetition of the pretest items in the posttest without ample time allowed between administrations of the instruments; (2) testing, where the pretest alters
posttest responses and potentially negates the treatment; and (3) instrumentation error due to low reliability or content validity of the tests, along with a potential order effect arising when pre- and posttests follow the same order of questions. This may occur especially when both treatment and comparison groups are located within the same or an approximate physical space.

The researcher altered the order of questions between pretest and posttest administration to mitigate an order effect on performance outcomes. A period of three weeks was allotted between the pretest and the treatment. The comprehension posttest was administered within a week of training, and the scavenger hunt a week after the training.

Cronbach alpha coefficients were computed for the summer 2003 administration of the pretest-posttest and for the May 2004 final study. Cronbach's alpha dropped from α = 0.78 (N = 44) to α = .6856 (N = 41). This decrease in alpha threatened the internal validity of the pre-post test instrument. Replication of this research would include an increased number of test items, including fill-in-the-blank questions. Items should be clustered to correspond to specific objectives so that one could discern objectives consistently missed by many of the students. Uniform responses to three of the questions on the scavenger hunt resulted in a low alpha of .5647; the researcher concluded that the three items may have been too easy and need modification for future research. To increase the reliability of the Internet hunt, one needs to add more test items, including those that ask for fill-in-the-blank responses.

Influence from instructor or researcher presence was eliminated from the procedure as much as possible. The researcher prepared the environment (i.e., launched the websites on the students' desktops, started the stopwatches, and told students one by one where to station themselves). The language arts instructor was present, as well as the primary researcher,
throughout the training module. The researcher refrained from interacting with students while present in the lab, remaining "as invisible as possible." Technical assistance was provided only when a student raised his or her hand. Students remained relatively silent during the procedure.

Threats to external validity included an interaction between selection of the sample and the treatment. As indicated in the discussion, the results cannot be generalized beyond the sample population. The research site is a suburban magnet school for high-achieving students; the school is an exemplary program and does not allow the researcher to generalize beyond the local middle school population.

Post hoc sample size estimates and power analysis indicated that larger numbers of students were needed to determine whether the features of the treatment condition warranted the extensive development. Further, the sample of highly self-motivated students may not be the best population for testing the effectiveness of the added features in the treatment condition; these students may benefit regardless of instructional strategy.

Statistical limitations included low reliability scores for the comprehension and performance instruments, resulting in high standard error. The sample size was relatively small (N = 41) given that alpha was set at .05. Replication of the study with larger numbers would increase power, and the researcher could have computed the required sample size based on an effect size of .80 and alpha of .05 prior to implementation of the final research. The fact that both the comparison and treatment groups excelled may be attributed to the sample comprising high achieving middle school students whose learner characteristics are correlated with the outcomes measured by the dependent variables.
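The a priori sample-size computation mentioned above can be sketched as follows. The effect size (.80) and alpha (.05) are the planning values named in the text; the target power of .80 is an assumed convention, and the study itself used an online calculator rather than this code.

```python
# Sample-size sketch for a replication, under d = .80 and alpha = .05 with an
# assumed target power of .80 (not values computed in the original study).
from statsmodels.stats.power import TTestPower, TTestIndPower

pairs_needed = TTestPower().solve_power(effect_size=0.8, alpha=0.05, power=0.80)
per_group = TTestIndPower().solve_power(effect_size=0.8, alpha=0.05, power=0.80)

print(f"matched pairs needed for a paired t-test: {pairs_needed:.1f}")
print(f"students per group for an independent-groups t-test: {per_group:.1f}")
```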


Summary of Implications

The previous discussion focused on characteristics of learners who successfully engage in WBI as well as the common features of the instruction that proved fruitful for the sample of high ability learners in the present study. The objectives of the modules, whether learner-centered or content-centered, appealed to students, and the characteristics of the learners were ideal for a learner-centered approach to the instruction. What remains unanswered is whether the findings can be generalized across other ability levels, how varied sample populations would perceive the learning modules, and whether the additional expense of adopting a content-centered approach to online instruction can be justified.

Study Informs Practice

The present study confirms the literature regarding design and high ability learners. If educators develop tutorial units for high achieving or advanced classes of students, it is likely the students will gain skills using a lean design, one that affords high learner control. Text materials need to be reorganized using concept-overview screens, links for students to explore and experiment with concepts, practice screens, graphic organizers, and visual cues that inform learners of their progression through the material. Designers should note that the prescriptions outlined by Chung and Reigeluth (1992), which describe a conditions-method-outcome model, are supported by the current study.

The software for both the comparison and treatment modules converts previous print material for WBI delivery; however, both modules may also be used as multimedia software viewed through a web browser. These modules may be used by future practitioners to teach Internet research skills. As practitioners redesign former print formats for WBI or computer-assisted instruction, they would benefit from examination of Alessi and Trollip's (2001)
Multimedia for Learning: Methods and Development for features proven effective for tutorial development in the current study.

Recommendations for Future Research

A logical replication of the study would be to examine the effect of the two instructional conditions with varied levels of academic ability to determine whether a treatment interaction effect would emerge between groups given a lower achieving student sample. The high ability group showed no significant correlation between achievement and motivation scores on the Academic Motivation Profile (Carey, 1994). This may not be the case for less academically gifted learners who represent a more generalized population of students.

The population sample consisted of high ability eighth grade students supervised in a computer laboratory setting. An earlier pilot in the summer of 2003 with undergraduate students took place via distance learning. The higher scores on posttests from students who received instruction under supervision indicate that further research is recommended. Research is needed to determine whether performance results differ when high ability students are provided a distance learning venue with the same materials.

Even absent an external incentive, such as additional points added to their grades, the high ability middle school students appeared eager to participate in the research and demonstrated high levels of concentration. Replication of the study with a different group of high ability students from a comparable school would strengthen the results. Receptivity to the material, as measured by the modified AMP, appeared consistent across both groups. Neither group reported a statistically significant preference for a content-centered or learner-centered design; however, anecdotal feedback indicated that the treatment module sustained students' attention and increased confidence. The implications of the AMP results call for replication of the study
with larger sample sizes.

The study addresses the question of the cost-benefit of using a systematic online design process for the delivery of textbook content. Even in the comparison condition, it appears insufficient for instructional purposes to simply scan in text, insert hyperlinks, and provide performance tests to determine the effectiveness of the instruction. Attention must be paid to menu structure, navigation, presentation of examples and non-examples, tone and structure of the narrative content, instructional strategies for the delivery of practice exercises, and the technical skill of the instructor converting material formerly in paper format to digital delivery. Further, formative trials of the materials are optimal so that the instructor can gauge the usability of the converted web-based content with a wide variety of student groups. However, the addition of features associated with classroom instruction, including program-controlled sequence, guided practice and feedback, embedded review quizzes, and simulated searches, may not be warranted for high ability students. These features may be necessary for lower achieving students, a possibility that calls for further investigation.

The relationship between student achievement and any of the factors of the AMP was not statistically significant. Performance measures indicated that students assigned to the content-centered module did not surpass those assigned to the learner-centered module. The cost, in terms of time and money, to produce the content-centered approach was not justified given the results of this research. Results are inconclusive with respect to a cost-benefit analysis of a content-centered versus learner-centered approach to WBI; replication of the study may determine whether the cost produces greater benefit for a wider audience of learners.

These results suggest that replication in whole or in part is appropriate; the new results can be used to support or extend the findings of this study. There are several issues that should be considered when replicating this study. First, modifications to the Internet
Scavenger Hunt to increase the internal reliability of the instrument include adding items, clustering the questions based on unit objectives, and inserting fill-in-the-blank answer formats. Parallel forms of the pretest and comprehension posttest are needed to ensure that history does not threaten the internal or external validity of the research design. The timing of the posttest for knowledge of Internet research tools suggests the test measured retention of verbal information, not necessarily a learner's ability to comprehend or apply the instruction in different contexts. Upon replication of the study, the researcher suggests that students take the test immediately following training. Further, students could use the tutorial as a reference aid to assess comprehension and application of rules regarding Internet research. Discrepancies between verbal feedback and scores on the modified AMP suggest that modifications in the procedure and timing of administration of the instrument may be necessary. The delay in administration of the AMP posed a problem; the researcher suggests that collecting narrative feedback immediately following training and distributing the AMP at that time may yield more accurate responses.

The investigator recommends a two-by-two mixed factorial quasi-experimental design to determine whether a content-centered versus learner-centered online module will benefit students of low achievement compared to high ability classes. Teacher training to familiarize the class instructor with both versions of the online modules would afford the students an opportunity to work with familiar classroom teachers and minimize the researcher's influence. To create a more authentic teaching and learning environment, instructors would serve as guides to assist students through the online modules. To mitigate the influence of having students divided into two groups in the same computer lab, one low achievement and one high achievement class would each be assigned as a whole group to either the treatment or comparison condition.
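One way the recommended two-by-two design could be analyzed, offered here as an illustrative sketch rather than the study's planned procedure, is to compute each student's pretest-to-posttest gain and test the condition-by-ability interaction with a factorial ANOVA on the gains. The factor names, group means, and simulated data below are the editor's assumptions.

```python
# Sketch: condition (treatment/comparison) x ability (high/low) ANOVA on gain scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(2)
rows = []
for condition in ["treatment", "comparison"]:
    for ability in ["high", "low"]:
        # Simulated gains; the means are placeholders, not predictions from the study.
        gains = rng.normal(12 if ability == "high" else 8, 10, size=30)
        rows += [{"condition": condition, "ability": ability, "gain": g} for g in gains]
df = pd.DataFrame(rows)

model = smf.ols("gain ~ C(condition) * C(ability)", data=df).fit()
print(anova_lm(model, typ=2))  # the condition:ability row tests the interaction of interest
```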


Development and Validation 129 129 During the research, a digital stopwatch es timated time to complete the modules. A method to determine time on task less prone to human error is preferable for future research. Mean time on task estimates show an insignificant difference in time between the two groups assigned to either the treatment or comparison module, however these estimates are not reliable. Further time on task may reve al larger differences between low and high achieving students. A possible study entails replication of the materials with low achieving classes and use time as a dependent variab le along with performance test scores. Other questions worthy of investigation in clude the following: (1) How do students perform when presented a con tent-centered versus learner-cen tered online module when the instruction is facilitated by the classroom teacher? (2) Do low achieving students perform better when provided a content-centered ver sus learner-centered instructional design? (3) Will low achieving students prefer the conten t-centered treatment design over the learnercentered comparison design? (4) What is the re lationship between students’ standardized reading scores and achievement on the Internet training module? (5) What is the relationship between students’ standardized scores on inform ation skills based on the Iowa Test of Basic Skills (ITBS) and performance outcomes on the Internet training modules? The questions form the basis for further research and are only a sample of how the data from the present study may be investigated in the future. The present research supports the use of a systematic design process used to convert traditional textbook content for WBI. Regardless of whether the designer supports a learner-centered approach to course development or content-centered approach, the process of converting text to Web-BasedInstruction is lengthy and requires forethought based on learner characteristics, educational philosophy, and practical considerations.


References

American Association of School Librarians (AASL) & Association for Educational Communications and Technology (AECT). (1998). Information power: Building partnerships for learning. Chicago: American Library Association.
Agarwal, R., & Day, A. E. (1998). The impact of the Internet on economic education. Journal of Economic Education, 29(2), 99-111.
Alessi, S. M., & Trollip, S. R. (2001). Multimedia for learning: Methods and development (3rd ed.). Boston: Allyn and Bacon.
Barrow, R., & Woods, R. G. (1988). An introduction to philosophy of education (3rd ed.). New York: Routledge.
Becker, L. (2000, March 21). Effect size [Website]. Available: http://web.uccs.edu/lbecker/Psy590/es.htm#III.%20Effect%20size%20measures%20for%20two%20dependent (retrieved March 15, 2005).
Bichelmeyer, B. A., & Hsu, Y.-c. (1999). Individually-guided education and problem-based learning: A comparison of pedagogical approaches from different epistemological views. In Proceedings of the 21st selected research and development papers presented at the national convention of the Association for Educational Communications and Technology (AECT), Houston, TX.
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347-364.
Bilal, D. (1998). Children's search processes in using World Wide Web search engines: An exploratory study. Proceedings of the ASIS Annual Meeting, 35, 45-53.
Bloom, B. S., Englehart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook 1: The cognitive domain. New York: W. H. Freeman.
Bowden, J., & Marton, F. (1999). The university of learning: Beyond quality and competence in higher education. Sterling, VA: Stylus Publishing.
Boylestad, R. L. (1990). Introductory circuit analysis (6th ed.). Columbus: Merrill Pub. Co.
Broch, E. (2000). Children's search engines from an information search process perspective. School Library Media Research, 3, 14.
Campbell, V. N. (1963). Effects of mathematical ability, pretraining, and interest on self-direction in programmed instruction (Title VII 7-48-0000-183, Office of Education, U.S. Dept. of Health, Education, and Welfare; Leslie J. Briggs, principal investigator). Palo Alto, CA: American Institute for Research.
Capobianco, S. Z. (1998). Instructional intranets in graduate medical education. Unpublished doctoral dissertation, University of South Florida, Tampa.
Carey, J. O. (1998). Library skills, information skills, and information literacy: Implications for teaching and learning. SLMQ Online. Available: http://www.ala.org/aasl/SLMQ/skills.html (retrieved October 1, 2001).
Carey, L. (2005). Personal communication. Tampa, FL.
Carey, L., Dedrick, R. F., & Carey, J. O. (1994). Procedures for designing course evaluation instruments: Masked personality format versus transparent achievement format. Educational and Psychological Measurement, 54, 134-145.
Carr, A. M., & Carr, C. S. (2000). Integrating instructional design in distance education [Website]. Available: http://ide.ed.psu.edu/idde/default.htm (retrieved January 2005).
Chung, J., & Reigeluth, C. M. (1992). Instructional prescriptions for learner control. Educational Technology, 32(10), 14-20.
Daniels, H., & Zemelman, S. (2003). Out with textbooks, in with learning. Educational Leadership, 61(4), 36-40.
Depuis, E., Fowler, K., & The University of Texas System Digital Library. (1998-2004). Texas Information Literacy Tutorial. Available: http://tilt.lib.utsystem.edu/ (retrieved January 2005).
Dick, W., Carey, L., & Carey, J. (2001). The systematic design of instruction (5th ed.). New York: Longman.
English, R. E., & Reigeluth, C. M. (1996). Formative research on sequencing instruction with the elaboration theory. Educational Technology Research and Development, 44(1), 23-42.
Ertmer, P. A., & Newby, T. J. (1993). Behaviorism, cognitivism, constructivism: Comparing critical features from a design perspective. Performance Improvement Quarterly, 6(4), 50-72.
Fidel, R., Davies, R. K., Douglass, M. H., Holder, J. K., Hopkins, C. J., Kushner, E. J., Miyagishima, B. K., & Toney, C. D. (1999). A visit to the information mall: Web searching behavior of high school students. Journal of the American Society for Information Science, 50(1), 24-37.
Fonseca, T., & King, M. (2000). Incorporating the Internet into traditional library instruction. Computers in Libraries, 20(2), 38-43.
Frederick, J., & Smith, D. (2000). Library and Internet research skills: A guide for college students. Dubuque, IA: Kendall/Hunt Pub. Co.
Gagné, R., & Briggs, L. (1974). Principles of instructional design (2nd ed.). New York: Holt, Rinehart and Winston.
Gagné, R. M. (1985). The conditions of learning and theory of instruction (4th ed.). Hillsdale, NJ: Holt, Rinehart and Winston.
Glendinning, M. (2002). Beyond the digital fun-factor. Independent School, 62(1), 90-96.
Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. The American Journal of Distance Education, 11(3), 8 (19 pages).
Hall, R. (2002). Aligning learning, teaching and assessment using the Web: An evaluation of pedagogical approaches. British Journal of Educational Technology, 33(2), 149-158.
Hannafin, M. J., & Peck, K. L. (1988). The design, development, and evaluation of instructional software. New York: Macmillan.
Haycock, K. (2000). Applying research in information literacy. Teacher Librarian, 27(3), 34.
Hirumi, A. (2002). A framework for analyzing, designing, and sequencing planned e-learning interactions. Quarterly Review of Distance Education, 3(2), 141-160.
Keller, J. B., & Bichelmeyer, B. A. (2004). What happens when accountability meets technology integration. TechTrends, 48(3), 17-24.
Keller, J. M. (1987). Development and use of the ARCS model of motivational design. Journal of Instructional Development, 10(3), 2-10.
Kinzie, M. B., & Berdel, R. L. (1990). Design and use of hypermedia systems. Educational Technology Research and Development, 38(3), 61-68.
Laurillard, D. (1993). Rethinking university teaching: A framework for the effective use of educational technology. London: Routledge.
Lubans, J. (1998). How first-year university students use and regard Internet resources. Duke University. Available: www.lubans.org/docs/1styear/firstyear.html (retrieved October 2002).
Mager, R. F., & Clark, D. C. (1963). The effects of normative feedback in automated instruction. Palo Alto, CA: Varian Associates.
Merrill, D. M., et al. (1992, June). Instructional transaction theory: Classes of transactions. Educational Technology, 32(6), 12-26.
Meyer, K. A. (2003). The Web's impact on student learning. T.H.E. Journal, 30(10), 14, 16, 20, 22, 24.
Neuman, D. (1995b). High school students' use of databases: Competing conceptual structures: Results of a national Delphi study. Journal of the American Society for Information Science, 46(4), 284-298.
Neuman, D. (1997). Learning and the digital library. Library Trends, 45(4), 687.
Northrup, P. T., & Rasmussen, K. (2001). Considerations for designing Web-based programs. Computers in the Schools, 17(3/4), 33-46.
Perez-Prad, A., & Thirunarayanan, M. O. (2002). A qualitative comparison of online and classroom-based sections of a course: Exploring student perspectives. Educational Media International, 39(2), 195-202.
Pitts, J. M. (1994). Mental models of information: The 1993-94 AASL/Highsmith research award study. School Library Media Resources, 23(3), 17 pp.
Repman, J., & Carlson, R. D. (1999). Surviving the storm: Using metasearch engines effectively. Computers in Libraries, 19(5), 50-57.
Richey, R. C. (1996). Robert M. Gagné's impact on instructional design theory and practice of the future. Paper presented at the national convention of the Association for Educational Communications and Technology, Indianapolis, IN.
Russell, T. (1999). The no significant difference phenomenon. Raleigh, NC: North Carolina State University.
Saettler, L. P. (1990). The evolution of American educational technology. Englewood, CO: Libraries Unlimited.
Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(September/October), 31-38.
Schnackenberg, H. L., Sullivan, H. J., Leader, L. F., & Jones, E. E. K. (1998). Learner preferences and achievement under differing amounts of learner practice. Educational Technology Research and Development, 46(2), 5-15.
Sherry, A. C., Fulford, C. P., & Zhang, S. (1998). Assessing distance learners' satisfaction with instruction: A quantitative and a qualitative measure. The American Journal of Distance Education, 12(3), 4 (25 pages).
Skinner, B. F. (1976). About behaviorism (2nd ed.). New York: Vintage Books.
Song, S. H., & Keller, J. M. (2001). Effectiveness of motivationally adaptive computer-assisted instruction on the dynamic aspects of motivation. Educational Technology Research and Development, 49(2), 5-22.
Steinberg, E. R. (1989). Cognition and learner control: A literature review, 1977-1988. Journal of Computer-Based Instruction, 16(4), 117-121.
Sullivan, H. J., & Higgins, N. (1983). Teaching for competence. New York: Teachers College Press, Columbia University.
Swartz, J. D., & Biggs, B. (1999). Technology, time, and space or what does it mean to be present? A study of the culture of a distance education class. Journal of Educational Computing Research, 20(1), 71-85.
Uitenbroek, D. (2005). Simple Interactive Statistical Analysis (SISA) [Website]. Available: http://home.clara.net/sisa/index.htm (retrieved March 15, 2005).
Visser, J., & Keller, J. M. (1990). The clinical use of motivational messages: An inquiry into the validity of the ARCS model of motivational design. Instructional Science, 19(6), 467-500.
Williams, S. W. (2002). Instructional design factors and the effectiveness of Web-based training/instruction. Unpublished manuscript, North Carolina.
Willis, J. (1993). Sources of knowledge: One way versus many ways. Computers in the Schools, 9(2-3), 91-104.
Zalud, G. G., & Reyes, E. (1990). Effective teaching: The lesson sequence. Journal of the Wisconsin State Reading Association, 34(4), 23-25.


Appendices


Appendix A

Table A-13
Derivation of Test Questions for Pretest and Posttest Comprehension

1. Performance objective: Upon completion of the TILT Search Unit, the student will identify methods of selecting search terms and construct synonyms for key words and phrases through brainstorming activities.
Bloom's level: Knowledge (knowledge of terminology and of ways or means of dealing with specifics and methodology; knowledge defined as remembering appropriate, previously learned information).
Parallel test items:
(a) Question (choose ALL answers that apply): Of the following, choose those items that represent effective strategies for choosing search terms. Answers: write out your topic in a few sentences; highlight the main terms and phrases; brainstorm synonyms, broader terms, and narrower terms; list abbreviations and alternate spellings of words; check a subject encyclopedia for ideas and concepts. All answers are correct (multiple answer).
(b) Question (choose TWO of the following choices): Which of the following are effective Web search strategies? (1) Analyze which search engine is better for your topic than the others; (2) make sure to search with broad terms and use OR with alternate spellings or meanings; (3) check in virtual libraries, subject directories, and metasearch engines for all possible combinations of source material; (4) pick only the first few results because they will be most relevant; (5) construct your search using phrases and quotation marks and compare two or more search engines for results. Answer: numbers 1 and 5.

2. Performance objective: The student will identify the four major categories of Internet search tools as outlined in the Library and Internet Research Skills course.
Bloom's level: Knowledge (knowledge of terminology, expressed at the verbal information level).
Parallel test item: Of the following examples, which is NOT a category of Internet research tool? (1) Usenet, Listserv, or Newsgroup; (2) generalized subject directory; (3) search engine; (4) specialized database; (5) virtual library. Answer is number 1.

3. Performance objective: The student distinguishes the characteristics of a specialized database.
Bloom's level: Application (use of previously learned information in new and concrete situations).
Parallel test item: Which of the following Internet sites is NOT an example of a specialized database? (a) Mapquest.com; (b) Yahoo.com; (c) Homedepot.com; (d) Imdb.com. Correct answer is b.

4. Performance objective: The student will be able to identify the characteristics of each of the four categories of Internet tools: virtual libraries, specialized databases, general subject directories, and search engines.
Bloom's level: Comprehension (grasping the meaning of the instructional materials).
Parallel test item: Which statement about virtual libraries is NOT true? (1) Virtual libraries allow you to search within subject categories; (2) virtual libraries all use sites reviewed by professionals in their field; (3) virtual libraries select sources according to relevance and accuracy of information; (4) unlike general directories, virtual libraries link to thousands of websites, not millions. Correct answer is number 2.

5. Performance objective: The student will be able to distinguish the characteristics among virtual libraries and demonstrate when to use a particular site when presented an information problem.
Bloom's level: Application (use of previously learned information given a novel situation) and Analysis (differentiates among search tools given an information problem).
Parallel test item: You have to find examples of lessons on the Civil War for your high school sophomore American History class. Of the following virtual libraries, which is the best choice? (1) Internet Public Library; (2) WWW Virtual Library; (3) Infomine.

6. Performance objective: The student will be able to identify methods of searching general subject directories, virtual libraries, and specialized databases.
Bloom's level: Comprehension (grasping the meaning of informational materials).
Parallel test item: What method would you use to search for information in a general subject directory? (1) Domain and URL; (2) web address and date; (3) browse and search; (4) subject and keyword; (5) title and author. Correct answer is 3.

7. Performance objective: The student will be able to differentiate use of a subject directory and use of a search engine.
Bloom's level: Analysis (student differentiates the purpose of a subject directory and a search engine).
Parallel test item: Complete this analogy. A generalized subject directory is to a search engine as: (1) dictionary is to thesaurus; (2) movie schedule is to movie reviews; (3) table of contents is to an index; (4) atlas is to street map. Correct answer is table of contents to index.

8. Performance objective: The student, when presented an information problem, will recognize when use of a search engine and when use of a subject directory is in order.
Bloom's level: Application (student determines the appropriateness of one tool over another regarding use of a subject directory and search engine).
Parallel test items:
(a) Question: Imagine you are searching for Claude Monet's painting "The Sunflowers". Choose the best search strategy from the examples below. (1) In Yahoo, search under the heading Arts and Humanities, subcategory Visual Art, subcategory Painting, Artists, Masters, Claude Monet; (2) look in Google and search under images, typing the words "The Sunflowers" AND Monet in the search field; (3) type "The Sunflowers" in Yahoo. Correct answer is number 2.
(b) [Analysis: student examines the organizational structure of information] A subject directory organizes information into categories. Information within categories is organized: (1) fat to thin; (2) specific to broad; (3) tall to short; (4) broad to specific. The answer is broad to specific.

9. Performance objective: Students will be able to combine search terms effectively. Sub-objective: the student will use quotation marks around a phrase when searching for adjacent words in a specified order of appearance.
Bloom's level: Application (given a rule statement, students use prior knowledge of the punctuation used around phrase searches to obtain information).
Parallel test item: You need to find information on Lincoln's Gettysburg Address. What punctuation would you use to search for the phrase Gettysburg Address? (1) Parentheses; (2) apostrophe; (3) commas; (4) quotation marks. The answer is number 4.

10. Performance objective: Students will be able to combine search terms effectively. Sub-objective: the student will use an asterisk as a wildcard character for words of alternate spelling.
Bloom's level: Application (student will identify the correct punctuation for a wildcard character).
Parallel test item: You have an assignment that requires you to look up information on diabetes. You know that there are various methods to search for diabetes that include variant forms of the word: diabetes, diabetic, diabetics, etc. Select the command that would retrieve all the variants of this term. (1) diab?; (2) diabet*; (3) diabetic"; (4) diabet! The answer is number 2.

11. Performance objective: The student can correctly recognize use of a nested statement.
Bloom's level: Analysis (breakdown of informational materials into component parts; keywords: recognize or distinguish among statements).
Parallel test item: Of the following search commands, which is a correctly written nested statement? (1) "rain OR snow" OR sleet; (2) (townhouse AND condominium); (3) "hotel OR motel" OR "Holiday Inn"; (4) (townhouse OR timeshare) AND "Orlando, FL".

12. Performance objective: The student predicts the effect of the operator OR on search results when posed with a research question.
Bloom's level: Analysis (student must break down or analyze component parts and be able to distinguish cause and effect).
Parallel test item: You are assigned a research paper on World War II and the Holocaust. From what you know about Boolean operators, use a search command that will retrieve the largest number of results. (1) WWII AND Holocaust; (2) ("World War II" OR WWII) AND Holocaust; (3) ("World War II" OR WWII) AND (Holocaust OR "concentration camp"); (4) "World War II" AND Holocaust. Correct answer is number 3.

13. Performance objective: The student will be able to identify search statements with Boolean operators and match statements to expected results.
Bloom's level: Analysis (keywords are identify and match; requires the student to break down statements and match them to rule outcomes).
Parallel test item: Match the following search commands and the expected results. (1) "Thomas Jefferson" OR "Benjamin Franklin"; (2) "Bed and Breakfast" + "Savannah, Georgia"; (3) China - dishes. Match to: (a) expands search results; (b) eliminates particular results; (c) specifies (narrows) search results.

14. Performance objective: Provided a search problem, the student is able to compose a search statement that uses a NOT operator to exclude information from search results.
Bloom's level: Synthesis (keyword is compose; requires the student's verbal recall and ability to apply a rule to a given situation).
Parallel test item: You are assigned a research paper on the Taj Mahal in India. How would you write a search statement that excludes information about a casino or Donald Trump in Las Vegas? (1) casino + "Donald Trump" - "Taj Mahal"; (2) "Donald Trump" - casino; (3) "Taj Mahal" - "Donald Trump" + casino; (4) "Taj Mahal" + India - "Donald Trump" - casino. Correct answer is the 4th.

15. Performance objective: Provided a search problem, the student is able to compose a search statement that uses an OR operator to increase search results.
Bloom's level: Comprehension (student grasps meaning and identifies the appropriate operator according to previous learning of the rule regarding an OR statement).
Parallel test item: You have a research assignment on former President Jimmy Carter. What operator would you use between "Jimmy Carter" and "President James E. Carter" to increase the number of search results? (1) ELSE; (2) OR; (3) AND; (4) NOT. The correct answer is OR.

16. Performance objective: Provided examples and presented a search question, students will distinguish what search statement best fits a given research problem.
Bloom's level: Application (use of previously learned information when presented a new, concrete problem situation).
Parallel test item: You're getting ready to buy a new dog. You can't decide if you want a miniature or a toy poodle, so you seek breeders for both. Specifically, you want lists of breeders within the state of Florida who specialize in EITHER miniature OR toy poodles. How would you write a search statement that will find this information? (a) breeders AND Florida AND poodles; (b) poodles AND miniature AND toy AND breeders; (c) (miniature OR toy) AND poodles AND Florida AND breeders. Correct answer is c.

17. Performance objective: Students can define the purpose and characteristics of a metasearch engine.
Bloom's level: Knowledge (student can define the term metasearch engine).
Parallel test item: An Internet tool that allows you to create one command for multiple databases is a... (a) virtual library; (b) metasearch engine; (c) specialized database; (d) general directory. Correct answer is b.

18. Performance objective: Students can describe the search methods used for general directories.
Bloom's level: Knowledge (knowledge of specific facts, pertinent to the verbal information level, for general directories).
Parallel test item: What techniques (choose one of the following pairs) would you use to search for information in a general subject directory? (a) Domain and URL; (b) web address and date; (c) browse and search; (d) subject and keyword; (e) title and author.

Note: These question stems were modified according to item statistics and feedback from the pilot administration. Questions with a p value of less than 0.33 were modified for greater clarity.
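Several of the items above test how Boolean operators change the size of a result set. A minimal sketch modeling the operators as Python set operations follows; the page identifiers and result sets are invented solely to show that OR widens, AND narrows, and the exclusion (NOT/minus) operator removes results.

    # Minimal sketch (invented result sets): Boolean search operators modeled as
    # set operations. OR is union, AND is intersection, exclusion is difference.
    ww2       = {"p1", "p2", "p3", "p4"}   # hypothetical hits for "World War II"
    wwii      = {"p3", "p4", "p5"}         # hypothetical hits for WWII
    holocaust = {"p2", "p4", "p5", "p6"}   # hypothetical hits for Holocaust
    camps     = {"p6", "p7"}               # hypothetical hits for "concentration camp"

    # ("World War II" OR WWII) AND (Holocaust OR "concentration camp"):
    # each OR widens its side before AND narrows the combination.
    nested = (ww2 | wwii) & (holocaust | camps)
    print(sorted(nested))                  # -> ['p2', 'p4', 'p5']

    # "Taj Mahal" + India - "Donald Trump": required terms intersect, excluded term subtracts.
    taj, india, trump = {"a", "b", "c"}, {"b", "c", "d"}, {"c"}
    print(sorted((taj & india) - trump))   # -> ['b']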


Development and Validation 149 149 Appendix A Continued Table A-14 Cronbach Alpha on Summer 2003 USF Students Posttest Scores Reliability Analysis Scale (Alpha) N of Cases = 44 Item Scale Mean if Item is Deleted Scale Variance if Item is Deleted Corrected Item Total Alpha if Item is Deleted Q1a 19.7955 16.6781 .4518 .7751 Q1b 19.6136 17.8705 .2555 .7854 Q1c 19.5682 18.2511 .1650 .7883 Q1d 19.6818 17.8499 .1759 .7883 Q1e 19.8409 17.4857 .2040 .7886 Q2c 19.6591 17.9508 .1589 .7888 Q2d 19.5682 18.5301 -.0513 .7921 Q3 19.8864 17.7775 .1184 .7937 Q4 19.6364 17.4461 .3934 .7804. Q5 19.7727 16.9239 .3964 .7782 Q6 20.0455 16.4165 .4426 .7749 Q7 19.8409 17.2997 .2536 .7859 Q8 19.7273 17.1332 .3718 .7797 Q9 20.1136 16.7077 .3729 .7793 Q10 19.8864 15.9635 .5984 .7657


Development and Validation 150 150 Item Scale Mean if Item is Deleted Scale Variance if Item is Deleted Corrected Item Total Alpha if Item is Deleted Q11 19.6136 17.9635 .2118 .7867 Q12 20.0909 18.3636 -.0304 .8029 Q13 19.8409 16.5090 .4704 .7737 Q14 19.6591 17.0206 .5144 .7750 Q15 19.7955 17.2828 .2778 .7844 Q16 19.8636 16.2600 .5275 .7702 Q17 19.7273 17.5983 .2248 .7866 Q18 19.7045 17.5618 .2540 .7852 Q19a 19.8864 15.9635 .5984 .7657 Q19b 19.6818 17.5708 .2731 .7843 Q19c 19.9091 15.9915 .5802 .7667 Q20 19.7727 18.2262 .0222 .7969 Reliability Coefficients 27 items Alpha = .7891 Standardized item alpha = .7798
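The alpha values reported in Tables A-14 and A-15 can be reproduced from an item-response matrix with the standard coefficient alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below uses a small invented 0/1 response matrix, not the study's data, and the helper name is mine; it is offered only to make the computation behind the tables concrete.

    # Minimal sketch (invented responses): Cronbach's alpha from a matrix whose
    # rows are examinees and whose columns are dichotomously scored items.
    import numpy as np

    def cronbach_alpha(items):
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)        # sample variance of each item
        total_var = items.sum(axis=1).var(ddof=1)    # variance of examinees' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    responses = np.array([
        [1, 1, 1, 0, 1],
        [1, 0, 1, 1, 1],
        [0, 1, 0, 0, 1],
        [1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0],
    ])
    print(f"alpha = {cronbach_alpha(responses):.3f}")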


Development and Validation 151 151 Appendix A Continued Table A-15 Cronbach Alpha on Spring 2004 Chamblee Middle School 8th Grade Posttest Scores Item Scale Mean if Item is Deleted Scale Variance if Item is Deleted Corrected Item Total Alpha if Item is Deleted Q1a 17.6829 10.6720 .2786 .6400 Q1b 17.1707 11.4451 .1083 .6556 Q1d 17.2439 11.1390 .1858 .6497 Q1e 17.5122 11.0561 .1449 .6556 Q2 17.4146 10.6988 .2699 .6410 Q3 17.1463 10.8280 .4447 .6316 Q4 17.6829 11.8220 -.0782 .6792 Q5 17.0976 11.5902 .1076 .6548 Q6 17.1220 11.2098 .2936 .6437 Q7a 17.2927 11.0122 .2067 .6478 Q7b 17.0976 11.3902 .2444 .6480 Q7c 17.2927 11.0122 .2067 .6478 Q8 17.3902 10.3439 .3961 .6264 Q9 17.5854 11.5488 -.0018 .6721 Q10 17.3659 10.5378 .3393 .6333 Q11 17.2439 10.6390 .3811 .6312 Q12 17.2683 10.8012 .2980 .6388


Development and Validation 152 152 Item Scale Mean if Item is Deleted Scale Variance if Item is Deleted Corrected Item Total Alpha if Item is Deleted Q13 17.3415 11.3805 .0659 .6627 Q14 17.3415 10.1805 .4779 .6179 Q15 17.1707 10.9451 .3388 .6378 Q16 17.4634 11.5549 -.0018 .6718 Q17 17.3902 10.7939 .2451 .6438 Q18 17.5610 11.0024 .1606 .6538 Q19 17.1220 11.3098 .2359 .6472 Q20d 17.1707 10.7951 .4101 .6322 Note: Q1C has zero variance Q20C has zero variance Reliability Coefficients 25 items Alpha = .6563 Standardized item alpha = .6856


Development and Validation 153 153 Appendix A Continued Table A-16 Frequency Distribution Test Items 2004 Chamblee Middle School 8th Grade Posttest Scores Item Frequency Percent Valid Percent Cumulative Percent Q1a 26 incorrect 63.4 63.4 63.4 15 correct 36.6 36.6 100.0 Q1b 5 incorrect 12.2 12.2 12.2 36 correct 87.8 87.8 100.0 Q1c 41 correct 100.0 100.0 100.0 Q1d 8 incorrect 19.5 19.5 19.5 33 correct 80.5 80.5 100.0 Q1e 19 incorrect 46.3 46.3 46.3 22 correct 53.7 53.7 100.0 Q2 15 incorrect 36.6 36.6 36.6 26 correct 63.4 63.4 100.0 Q3 4 incorrect 9.8 9.8 9.8 37 correct 90.2 90.2 100.0 Q4 26 incorrect 63.4 63.4 63.4 15 correct 36.6 36.6 100.0 Q5 2 incorrect 4.9 4.9 4.9 39 correct 95.1 95.1 100.0 Q6 3 incorrect 7.3 7.3 7.3 38 correct 92.7 92.7 100.0 Q7a 10 incorrect 24.4 24.4 24.4 31 correct 75.6 75.6 100.0 Q7b 2 incorrect 4.9 4.9 4.9 39 correct 95.1 95.1 100.0 Q7c 10 incorrect 24.4 24.4 24.4 31 correct 75.6 75.6 100.0 Q8 14 incorrect 34.1 34.1 34.1 27 correct 65.9 65.9 100.0


Development and Validation 154 154 Item Frequency Percent Valid Percent Cumulative Percent Q9 22 incorrect 53.7 53.7 53.7 19 correct 46.3 46.3 100.0 Q10 13 incorrect 31.7 31.7 31.7 28 correct 68.3 68.3 100.0 Q11 8 incorrect 19.5 19.5 19.5 33 correct 80.5 80.5 100.0 Q12 9 incorrect 22.0 22.0 22.0 32 correct 78.0 78.0 100.0 Q13 12 incorrect 29.3 29.3 29.3 29 correct 70.7 70.7 100.0 Q14 12 incorrect 29.3 29.3 29.3 29 correct 70.7 70.7 100.0 Q15 5 incorrect 12.2 12.2 12.2 36 correct 87.8 87.8 100.0 Q16 17 incorrect 41.5 41.5 41.5 24 correct 58.5 58.5 100.0 Q17 14 incorrect 34.1 34.1 34.1 27 correct 65.9 65.9 100.0 Q18 21 incorrect 51.2 51.2 51.2 20 correct 48.8 48.8 100.0 Q19 3 incorrect 7.3 7.3 7.3 38 correct 92.7 92.7 100.0 Q20c 41 correct 100.0 100.0 100.0 Q20d 5 incorrect 12.2 12.2 12.2 36 correct 87.8 87.8 100.0
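The percent-correct values in Table A-16 are classical item-difficulty indices, the same "p values" used in the note following Table A-13 to flag items below .33 for revision. A minimal sketch of that computation on an invented response matrix follows; none of the numbers come from the study.

    # Minimal sketch (invented responses): classical item difficulty, i.e., the
    # proportion of examinees answering each item correctly, flagging items
    # below the 0.33 cutoff mentioned in the note after Table A-13.
    import numpy as np

    responses = np.array([     # rows = students, columns = items, 1 = correct
        [1, 0, 1, 1],
        [1, 1, 1, 0],
        [0, 0, 1, 1],
        [1, 0, 1, 1],
    ])
    p_values = responses.mean(axis=0)
    for i, p in enumerate(p_values, start=1):
        flag = "  <- candidate for revision (p < 0.33)" if p < 0.33 else ""
        print(f"Item {i}: p = {p:.2f}{flag}")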


Development and Validation 155 155 Appendix A Continued Pretest on your Knowledge of Internet Search Tools Instructions : The quiz you are about to take is for research purposes only. Before you will begin your training in the computer lab, please take the following pretest. Once we have obtained a group of scores, we will assign you randomly to one of two tutorial groups. None of your scores will count against you. We are looking for gains in your score from pretest and after taking the tutorial, posttest on these same concepts. There is no risk to your grade for your participation. Here's a chance to gain some additional skills. At the conclusion of the research, you will be treated to a pizza party during your language arts class! You have 30 minutes for the quiz. Take your time, and have fun and thanks for your participation!! Question 1 Multiple Answer 5 points Of the following, CHOOSE ALL ANSWERS THAT APPLY that represent good practices for choosing search terms? a) Write out your topic in a few sentences b) Highlight the main terms and phrases c) Brainstorm synonyms, broader terms, and narrower terms d) List abbreviations and alternate spellings of words e) Check a subject encyclopedia for ideas and concepts Question 2 Multiple Answer 5 points CHOOSE TWO FROM THE FOLLOWING CHOICES Which of the following are effective Web search strategies? a) Search using all capital letters for more emphasis b) Choose just one search engine and never leave it c) Use phrases surrounded by quotation marks for more specific results d) Scan a subject list and then search by keyword or phrase within the subject category for specific information e) Look through every site you retrieve to choose the best ones Question 3 Multiple Choice 5 points The difference between a search engine and subject directory is…. a) One uses commands, the other does not b) One contains driving directions, the other recipes c) One gives phone numbers and addresses, the other gives zipcodes d) One is organized into categories by subject, the other searches by word


Development and Validation 156 156 Appendix A Continued Question 4 Multiple Choice 5 points Of the following examples, which is NOT a category of Internet research tool? a) Usenet, Listserv, or Newsgroup b) Generalized subject directory c) Search Engine d) Specialized database e) Virtual library Question 5 Multiple Choice 5 points An Internet tool that allows you to create one command for multiple databases is a... a) virtual library b) metasearch engine c) specialized database d) general directory Question 6 Multiple Choice 5 points Which statement about virtual libraries is true ? a) Virtual libraries do not contain subject categories b) Most virtual libraries use reviewers to select sites within categories c) Virtual libraries link to millions of websites d) You cannot search by keyword within categories on virtual libraries Question 7 Multiple Choice 5 points Imagine you are searching for Claude Monet's painti ng "The Sunflowers". Ch oose the best search strategy from each of the examples below. a) In Yahoo, search under the heading Arts and Humanities, subcategory Visual Art, subcategory Painting, Artists, Mast ers, Claude Monet b) Look in Google and search under images. Type the words "The Sunflowers" AND Monet in the search field. c) Type "The Sunflowers" in Yahoo. Question 8 Multiple Choice 5 points Which of the following Internet tools is NOT an example of a specialized database? a) mapquest.com b) yahoo.com c) homedepot.com d) imdb.com Question 9 Multiple Choice 5 points What is an easy technique to search for in formation in a general subject directory? a) Search for a URL b) Enter a WEB address c) browse the subject heading or search by keyword d) search by the author


Development and Validation 157 157 Appendix A Continued Question 10 Multiple Choice 5 points How much information can be access by Internet sear ch engines? a) All of it b) More than half c) Less than half d) A small fraction Question 11 Multiple Choice 5 points You need to find information on Lincoln's Gettysburg Address. What punctuation would you use to search for the phrase Gettysburg Address? a) parentheses b) apostrophe c) commas d) quotation marks Question 12 Multiple Choice 5 points What kind of information could yo u find in a virtual library? a) A dictionary and thesaurus b) Driving directions c) Current weather forecast d) Recipes for your favorite dishes Question 13 Multiple Choice 5 points You are assigned a research paper on World War II and the Holocaust. From what you know about this subject, select the search command that will bring you the MOST NUMBER of results. a) WWII AND Holocaust b) ("World War II" OR WWII) AND Holocaust c) ("World War II" OR WWII) AND (Holocaust OR "concentration camp") d) "World War II" AND Holocaust Question 14 Multiple Choice 5 points You are assigned a research paper on the Taj Maha l in Asia. How would you write a search statement that excludes information about a casino or Donald Trump in Las Vegas? a) casino + "Donald Trump" "Taj Mahal" b) "Donald Trump" casino c) "Taj Mahal" "Donald Trump" + casino d) "Taj Mahal" + India "Donald Trump" casino Question 15 Multiple Choice 5 points Of the following search commands, which is a correctly written nested statement? a) rain OR snow OR sleet b) townhouse AND condominium c) hotel OR motel OR "Holiday Inn" d) (townhouse OR timeshare) AND "Orlando, FL"


Development and Validation 158 158 Appendix A Continued Question 16 Multiple Choice 5 points You have a research assignment on former President Jimmy Carter. Which of the following phrases will you want to include to increase the number of search resu lts you wish to obtain? a) “Jimmy Carter” ELSE “President James E. Carter” b) “Jimmy Carter” OR “President James E. Carter” c) “Jimmy Carter” AND “President James E. Carter” d) “Jimmy Carter” NOT “President James E. Carter” Question 17 Multiple Choice 5 points You have an assignment that requi res you to look up information on diabetes. You know that there are various methods to search for diabetes that include variant forms of the word: diabetes, diabetic, diabetics, etc. Select the command that would retrieve all the variants of the term. a) diab? b) diabet* c) diabetic" d) diabet! Question 18 Multiple Choice 5 points You're getting ready to buy a new do g. You can't decide if you want a miniature or a toy poodle so you seek breeders for both. Specifically, you want lists of breeders within the state of Florida who specialize in EITHER miniature OR toy poodles. How would you writ e a search statement that will find this information? a) breeders AND Florida AND poodles b) poodles AND miniature AND toy AND breeders c) (miniature OR toy) AND poodles AND Florida AND breeders Question 19 Matching 5 points Match the following search commands and the expected results. "Thomas Jefferson" OR "Benjamin Franklin" (Choose a, b, or c) a) expands search results: increases the number of results "Bed and Breakfast" AND "Savannah, Georgia" (Choose a, b, or c) b) excludes possible results that are misleading China NOT dishes (Choose a, b, or c) c) narrows a search: decreases number of results Question 20 Multiple Choice 5 points A subject directory organizes information into categories. Information within categories is organized from a) fat to thin b) specific to broad c) tall to short d) broad to specific


Development and Validation 159 159 Appendix A Continued Posttest on your Knowledge of Internet Search Tools Instructions: The quiz you are about to take is for research purposes only. There is no risk to your grade for your participation. Once you have completed the posttest, you will have an opportunity to take th e Internet Scavenger Hunt. Neither of your scores will count against your grade in this class. When we conclude the research, look forward to a pizza party as a thank you for your participation. Your Name: _____________________________________________ Group: One or Two (circle the appropriate choice) Time to complete the Tutorial: ________________________________ (recorded from the digital stopwatch) Question 1 Multiple Answer 5 points Of the following, CHOOSE ALL ANSWERS THAT APPLY that represent good practices for choosing search terms. a) Write out your answer in a few sentences b) Highlight the main terms and phrases c) Brainstorm synonyms, broader terms, and narrower terms d) List abbreviations and alternate spellings of words e) Check a subject encyclopedia for ideas and concepts Question 2 Multiple Choice 5 points You have a research assignment on former President Jimmy Carter. Which of the following phrases will increase the number of search results you obtain? a) “Jimmy Carter ELSE “President James E. Carter” b) “Jimmy Carter OR “President James E. Carter” c) “Jimmy Carter AND “President James E. Carter” d) “Jimmy Carter NOT “President James E. Carter” Question 3 Multiple Choice 5 points You have an assignment that requires you to look up information on diabetes. You know that there are various methods to search for diabetes including the variant forms of the word: diabetes, diabetic, diabetics, etc. Select the command that would retrieve all the variants of the term. a) diab? b) diabet* c) diabetic” d) diabet!


Development and Validation 160 160 Appendix A Continued Question 4 Multiple Choice 5 points What kind of information could you find in a virtual library? a) A dictionary and thesaurus b) Driving directions c) Current weather forecast d) Recipes for your favorite dishes Question 5 Multiple Choice 5 points You need to find information on Lincoln’s Gettysburg Address. What punctuation would you use to search for the phrase Gettysburg Address? a) parentheses b) apostrophe c) commas d) quotation marks Question 6 Multiple Choice 5 points You’re getting ready to buy a new do g. You can’t decide if you want a miniature or a toy poodle so you seek breeders for both. Specifically you want lists of breeders within the state of Florida who specialize in EITHER miniature OR toy poodles. How would you write a search statement that will find this information? a) Breeders AND Florida AND poodles b) Poodles AND miniature AND toy AND breeders c) (miniature OR toy) AND poodles AND Florida AND breeders Question 7 Matching 5 points Match the following search commands and the expected results “Thomas Jefferson” OR “Benjamin Franklin” (choose a, b, or c) a) expands search results: increases the number of results “Bed and Breakfast” AND “Savannah, Georgia” (choose a, b, or c) b) excludes possible results that are misleading China NOT dishes (choose a, b, or c) c) narrows a search: decreases number of results Question 8 Multiple Choice 5 points You are assigned a research paper on World War II and the Holocaust. From what you know about this subject, select the search command that will bring you the MOST NUMBER of results. a) WWII AND Holocaust b) (“World War II” OR WWII) AND Holocaust c) (“World War II” OR WWII) AND (Holocaust OR “concentration camp”) d) “World War II” AND Holocaust


Development and Validation 161 161 Appendix A Continued Question 9 Multiple Choice 5 points A subject directory organizes information into categories. Information within categories is organized from…. a) fat to thin b) specific to broad c) tall to short d) broad to specific Question 10 Multiple Choice 5 points An Internet tool that allows you to create one command for multiple databases is a… a) virtual library b) metasearch engine c) specialized database d) general directory Question 11 Multiple Choice 5 points The difference between a search engine and subject directory is…. a) One uses commands, the other does not b) One contains driving directions, the other recipes c) One gives phone numbers and addresses, the other gives zipcodes d) One is organized into categories by subject, the other searches by word Question 12 Multiple Choice 5 points Imagine you are searching for a picture of Claude Monet’s painting “The Sunflowers”. Choose the search strategy that will give you the correct answer most quickly. a) In Yahoo, search under the heading Arts and Humanities sub-category Visual Art sub-category painting Artists sub-category Masters subject Claude Monet b) Look in Google and search under images Type the words “The Sunflowers” AND Monet in the search field c) Type “The Sunflowers” in Yahoo Question 13 Multiple Choice 5 points Of the following examples, which is NOT a category of Internet research tool? a) Usenet, Listserv, or Newsgroups b) Generalized subject directory c) Search Engine d) Specialized Database e) Virtual Library Question 14 Multiple Choice 5 points Of the following search commands, which is a correctly written nested statement? a) rain OR snow OR sleet b) townhouse AND condominium c) hotel OR motel OR “Holiday Inn” d) (townhouse OR timeshare) AND “Orlando, Florida”


Development and Validation 162 162 Appendix A Continued Question 15 Multiple Choice 5 points What is an easy technique to search for information in a general subject directory? a) search for a URL b) enter a web address c) browse the subject heading or search by keyword d) search by the author Question 16 Multiple Choice 5 points How much information can be accessed through the Internet search engines? a) all of it b) more than half c) less than half d) a small fraction Question 17 Multiple Choice 5 points Which of the following tools is NOT an example of a specialized database? a) mapquest.com b) yahoo.com c) homedepot.com d) imdb.com Question 18 Multiple Choice 5 points Which statement about virtual libraries is true ? a) Virtual libraries do not contain subject categories b) Most virtual libraries use reviewers to select sites within categories c) Virtual libraries link to millions of websites d) You cannot search by keyword within categories on virtual libraries Question 19 Multiple Choice 5 points You are assigned a research paper on the Taj Mahal in Asia. How would you write a search statement that excludes information about a casino or Donald Trump in Las Vegas? a) Casino + “Donald Trump” – “Taj Mahal” b) “Donald Trump” casino c) “Taj Mahal” – “Donald Trump” + casino d) “Taj Mahal”+ India – “Donald Trump” casino Question 20 Multiple Answer 5 points CHOOSE TWO FROM THE FOLLOWING CHOICES. Which of the following are effective web search strategies? a) Search using all capital letters for greater emphasis b) Choose just one search engine and never leave it c) Use phrases surrounded by quotation marks for more specific results d) Scan a subject list and then search by keywo rd or phrase within the subject category for specific information e) Look through every site you retrieve to choose the best ones


Appendix A Continued

Note: The final revisions to the pretest and posttest derived from p-values of the Summer 2003 administration of the instrument. No Cronbach alpha computations were performed on the Fall 2003 administration of the test with the alternate group of 8th-grade students from Chamblee Middle School.


Development and Validation 164 164 Appendix B Table B-17 Conceptual Framework for Two Mo dules Comparison and Treatment Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered 1. Gain attention: Contextualize instruction to allow the learner to take ownership of the lesson by providing a customized, meaningful learning experience. Very little is included to gain the attention of the learner. The instructor may assume that by directly stating the objectives on a screen in terms of what the student will be able to do or grasp, the learner will be motivated to engage in the material. Opening screen with question posed…what is information literacy? Takes user to flash screen with bubbles about miscon ceptions of Internet. Use of metaphors and gra phics to stimulate recall of previously learned cons tructs in light of new information. 2. Inform students of learning objectives: Informing learners of the outcomes, or objectives, will help them understand what they are to learn during the course Statement of objectives follows the information provided in text book stated in active terms. At the conclusion of this information you will be able to….no graphics or motivational material is included to suggest the relevance of the material to the student’s desire to learn. Statement of objectives follows the information provided in textbook stat ed in active terms. Statement of importance of being able to discern what and what not is found on the Internet and promise of becoming a mo re savvy searcher. As in web to text version, a clear statement of objectives is found when the user selects statement of objectives. 3. Stimulate recall of Information may relate to Relate new information to what they already


Development and Validation 165 165 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered prior learning what is previously covered in text, but generally no attention is paid to stimulate recall of prior learning know. Use of conversational tone and introduction of metaphors an d analogies to assist student to conceptualize and relate to new informationuse of scaffolding in advanced organizer – good example is the brainstorming exercise in which learner relates prior knowledge to new situation in nonthreatening environment 4. Presentation of content to facilitate recall and successful performance Table of contents presented as left frame (or table cell) but as learner navigates through sections, graphical indicator provides direction of where learner is throughout the program Use of graphics limited to illustrate concepts such as venn diagrams to illustrate Boolean operators, otherwise not graphic intensive, most text only Following overview of material, practice exercises offered to Table of contents presen ted as left frame (or table cell) but as learner navigates through sections, graphical indicator provides direction of where lear ner is throughout the program Use of authentic ex amples and metaphors that emphasize familiar constructs to map to new information. Use of graphical cues for concepts and vocabulary. Vocabulary screens imbedded in text as pop-up windows Following overview of material, practice exercises offered to students but program control requires exercises. Sequence the instruction in a logical order. Learner has an option to move nonsequentially within the material from unit to unit but once engaged in a practice exercise,


Development and Validation 166 166 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered students and learner chooses practice exercises. Sequence of instruction in a logical order. Learner has an option to move non-sequentially within the material from unit to unit As in textbook, information is presented as small chunks to aid the learner on retention. Text is presented in separate modules but no attempt to summarize or tie in modules and their relationships to each other is made. Narrative screens serve as advanced organizers so that the learner can place the information into a structure that compares similarities and differences between student must complete the practice before moving to another point within the instruction Modular structure of material lends itself for greater retention of material. Narrative screens serve as advanced organizers so that the learner can place the information into a structure that compares similarities and differences between Internet tools. High program control ensures learner is guided through practice material and receives immediate corrective and reinforcement feedback Call-outs on screen captures assist with navigation as well as point out additional information in graphical form Within exercises, no back buttons are provided, learner moves forward as a function of interaction such as hot spot or text input screen Narrative screens with definitions of specific Internet tool include live links to websites


Development and Validation 167 167 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered Internet tools. High learner control afforded for student to choose exercises. Feedback results from exploration of links. Student must formulate her/his own conclusions from exploration as no assistance is provided through the material. Narrative screens with definitions of specific Internet tool include live links to websites Following narrative, hands-on practice of material guides students through illustrative examples but left to the discretion of the student Fewer graphics than with the controlled practice condition…graphical information provided by real-time exploration


Development and Validation 168 168 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered within live websites. Narrative screens with definitions of specific Internet tool include live links to websites 5. Provide learner guidance Learner guidance in high learner control condition affords learner optional practice exercises. Questions imbedded with live links pose an open-ended question OR a series of typical research questions for the learner to solve with the tools provided. Feedback results from learner’s authentic exploration of the sites presented in realtime. Learner guidance is more structured. Examples for learner to interact provide high program control to assure that the learner engages with examples designed by inst ructor. Learner control is NOT afforded through instruction to ensure that learner is exposed to examples and nonexamples and interacts with material and controlled feedback. 6. Elicit Performance: activate learner processing, engage in learner activities to promote recall and conceptualization of the material No planned e-Learning interactions, only opportunity to click live links embedded in instruction. Practice exercises are open-ended and do not afford reinforcement or corrective feedback. Feedback is the result of student who Program controls presen tation of exemplary situations to illustrate c onstructs presented in the overview. Program contro l of guided practice and immediate corrective and reinforcement feedback ensures that student practices with content.


Development and Validation 169 169 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered takes advantage of live links and is natural consequence of exploration. No enforced performance provided students. Students may choose to skip practice entirely or only practice those questions of their own choice. 7. Feedback: Provide students information assessing how well they are doing via feedback No formal feedback for students to assess their mastery of the information. Feedback in high learner control condition dependent on the student’s own exploration of the material. Students receive no review tests or feedback on review questions. Immediate feedback is embedded in practice exercises and includes corrective, confirmatory, informative, and analytical. Practice-feedback with exercises is required following narrative screens. Review questions on material (unscored) provide feedback for correct and incorrect responses – Summary screens and review quizzes provide feedback for correct and incorrect responses 8. Review and relate new skills to those previously learned with authentic learning applications Typically conversion of textbook material to web offers little or no review material or opportunities to practice what is learned and integrate new information into previously learned constructs or skills Review screen following practice sessions summarize information and tie in what is previously learned with new information. Intermittent quizzes at mi dpoints of instruction provide immediate learne r feedback. Embedded quizzes allow learner to become aware of her/his progress


Development and Validation 170 170 Features for each Gagne’s Events of Instruction Conversion FrederickSmith to web: high learner control Structured guided practice-feedback: higher program control and content-centered 9. Assess Performance: Pretest your objectives, embed questions, provide objective tests, and opportunities for performance tasks. Both groups receive an objective retention test on the material in the form of multiple-choice comprehension pre and posttests. Additional assessment is an open-book Internet scavenger hunt designed to parallel the objectives of the tutorial Both groups receive an objective retention test on the material in the form of multiple-choice comprehension pre and posttests. Additional assessment is an open-book Internet scavenger hunt designed to parallel the objectives of the tutorial 10. Enhance transfer and retention through performance assessmentschecklists, rating scales, attitude, assess mastery in authentic setting Open-book access to the tutorial and presentation of Internet Scavenger hunt provides accurate feedback to instructor as to student’s ability to apply principles presented within instruction to authentic research scenarios Open-book access to the tutorial and presentation of Internet Scavenger hunt provides accurate feedback to instructor as to student’s ability to apply principles presented within instruction to authentic research scenarios


Appendix B Continued

Appendix C


Development and Validation 180 180 Appendix D Table D-18 Derivation of Test Questions for Internet Hunt Posttest Performance Objective Bloom's Level of Objective Parallel Test Item Upon completion of the TILT unit, the student will demonstrate use of a specialized database to find information appropriate to those databases Application level: keyword is demonstrate, takes previously learned material and applies in new concrete setting Name the 1964 movie that starred Richard Burton and Peter O'Toole about the demise of the Archbishop of Canterbury and King Henry II? Hint: look in specialized databases for an appropriate tool. 1. Virginia Wolfe 2. What's New Pussycat? 3. Becket 4. Camelot The answer is Becket Upon completion of the TILT unit, the student will demonstrate use of a specialized database to find information appropriate to those databases Application level: keyword is demonstrate, takes previously learned material and applies in new concrete setting What year did Toyota Corporation begin its first sales operations in the United States? Here's a hint: Many companies contain information about their research and development, sales figures, history, and other important facts on


Development and Validation 181 181 Performance Objective Bloom's Level of Objective Parallel Test Item their business websites. 1. 1957 2. 1964 3. 1954 4. 1983 The correct answer is number 1 The student will use Boolean operators effectively within a search engine or subject directory to find information about aspartame using the OR operator Application level: use of previously learned information in new and concrete situations At the 2003 Westminster Kennel Club show, what breed of dog was declared "Best in Show" 1. Argent Big Bang, a collie 2. Malka Happy, a Pomeranian 3. Torums Scarf Michael, a kerry blue terrier 4. Winterwinds Glenn Plaid, a labrador retriever The correct answer is number 3 The student will demonstrate proficiency with a virtual library source to find an American Heritage Dictionary site Application level: use of previously learned information in new and concrete situations Use a virtual library source: Infomine (http://infomine.ucr.edu/), Argus Clearinghouse (http://www.clearinghouse.net/), Internet Public Library (http://www.ipl.org/ ), Librarians' Guide to the Internet(http://lii.org/) or the World Wide Web Virtual Library(http:// www.vlib.org/).


Development and Validation 182 182 Performance Objective Bloom's Level of Objective Parallel Test Item What is the address (URL) for "The American Heritage Dictionary of the English Language." Possible answers: www.bartleby.com www.dictionary.com www.mirriamwebster.com www.americanheritage.com The correct answer is number 1 The student will demonstrate proficiency accessing a reference source from a virtual library and defining the origin of a word using an Internet dictionary Application level: use of previously learned information in new and concrete situations Using a WWW dictionary (hint: look under references from a virtual library), wh at is the meaning or origin of the word "scherzo"? 1. From the Italian, meaning joke 2. From the Spanish, meaning afraid 3. From the French for running fast 4. From English, meaning slightly crazy Correct answer is number 1 The student will demonstrate the ability to effectively search for images from a Application leveluse of previously learned information given novel You need information about the date of Monet's "The Sunflowers". What year was this impressionist painting completed?

PAGE 193

Development and Validation 183 183 Performance Objective Bloom's Level of Objective Parallel Test Item search engine situation 1. 1908 2. 1890 3. 1881 4. 1900 Correct answer is number 3 The student effectively combines search terms by using quotation marks surrounding phrases and can demonstrate use of AND operators Application level: Uses prior learning to apply search rules in new context Use HotBot's advanced search (hotbot.com) technique to find the name of the book by James Loewen, an American historian, about how schools are teaching history incorrectly. 1. Falsehoods in American History 2. Lies my Teacher Told Me 3. How Schools Lie About History 4. The Truth About America Correct answer is number 2 Student will demonstrate the ability to apply Boolean operators with the preposition NOT to a search problem Application level: student uses previous knowledge to take on a problem within a new context Explore the Taj Mahal in Altavista's (altavista.com)advanced search mode. Who built the Taj Mahal in memory of his wife? Hint: make sure to exclude "Las Vegas", and "Donald Trump" from your search. 1. Emperor Shah Jahan 2. Emporer Bahadur Shah Zafar

PAGE 194

Development and Validation 184 184 Performance Objective Bloom's Level of Objective Parallel Test Item 3. Emporer Napolean Bonaparte 4. Emporer Aurangzeb The correct answer is number 1 The student will demonstrate the ability to effectively search for zipcodes from a specialized database Application leveluse of previously learned information given novel situation What is the zipcode for Silver Springs, Florida? 1. 30338 2. 34488 3. 30336 4. 34650 Correct answer is number 2 The student will demonstrate the ability to apply advanced search techniques such as domain searching to find information. Application level: taking previously learning information and applying it to a new problem-solving context Look in Dogpile metasearch engine (www.dogpile.com). Find out when Jennifer Lopez and Ben Affleck announced their engagement publicly? Limit your information from the domain:eonline.com Hint: the command for domain limitation is domain:eonline.com 1. November 2002 2. December 2003 3. October 2001 4. January 2003 Correct answer is number 1

PAGE 195

Appendix D Continued

These questions were changed from short-answer format to multiple-choice on the advice of the course instructor and on the basis of information gleaned from a formative small-group evaluation. The multiple-choice format removes ambiguity from the responses and provides an objective method for scoring each item. Each question corresponds to one of the course objectives, but the answer may be found through any number of search strategies. Students were granted open access to the tutorial after instruction and during the timed administration of the Internet Scavenger Hunt.
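To illustrate the kinds of search strategies the items draw on, a few representative query strings are shown below. These examples are illustrative only; they are not drawn verbatim from the tutorial, and the exact advanced-search syntax varied from tool to tool at the time of the study.

    Phrase plus AND (HotBot-style):           "James Loewen" AND history
    OR to broaden a search:                   aspartame OR "artificial sweetener"
    NOT to exclude terms (AltaVista-style):   "Taj Mahal" AND NOT ("Las Vegas" OR "Donald Trump")
    Domain limiting (as hinted in the item):  "Jennifer Lopez" engagement domain:eonline.com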


Appendix D Continued

Internet Scavenger Hunt

Instructions: The quiz you are about to take is for research purposes only. There is no risk to your grade for your participation. Once you have completed the Internet Scavenger Hunt, you can look forward to a pizza party as a thank you for your participation. You will have 30 minutes for the quiz. Take your time, have fun, and thank you for your participation!

Your name: _________________________________
Group: One or Two (circle the appropriate choice)
Approximate time taken to complete the tutorial: _______________

Question 1 (Multiple Answer, 1 point)
At the 2003 Westminster Kennel Club show, what breed of dog was declared "Best in Show"?
a) Argent Big Bang, a collie
b) Malka Happy, a Pomeranian
c) Torums Scarf Michael, a Kerry blue terrier
d) Winterwinds Glenn Plaid, a Labrador retriever

Question 2 (Multiple Choice, 1 point)
Explore the Taj Mahal in AltaVista's (altavista.com) advanced search mode. Who built the Taj Mahal in memory of his wife? Hint: make sure to exclude "Las Vegas" and "Donald Trump" from your search.
a) Emperor Shah Jahan
b) Emperor Bahadur Shah Zafar
c) Emperor Napoleon Bonaparte
d) Emperor Aurangzeb

Question 3 (Multiple Choice, 1 point)
Look in the Dogpile metasearch engine (www.dogpile.com). When did Jennifer Lopez and Ben Affleck publicly announce their engagement? Limit your information to the domain eonline.com. Hint: the command for domain limitation is domain:eonline.com
a) November 2002
b) December 2003
c) October 2001
d) January 2003

Question 4 (Multiple Choice, 1 point)
Name the 1964 movie that starred Richard Burton and Peter O'Toole about the demise of the Archbishop of Canterbury and King Henry II.
a) Virginia Wolfe
b) What's New Pussycat?
c) Becket
d) Camelot

Question 5 (Multiple Choice, 1 point)
Use a virtual library source: Infomine (http://infomine.ucr.edu/), Argus Clearinghouse (http://www.clearinghouse.net/), Internet Public Library (http://www.ipl.org/), Librarians' Guide to the Internet (http://lii.org/), or the World Wide Web Virtual Library (http://www.vlib.org/). What is the address (URL) for "The American Heritage Dictionary of the English Language"?
a) www.bartleby.com
b) www.dictionary.com
c) www.mirriamwebster.com
d) www.americanheritage.com

Question 6 (Multiple Choice, 1 point)
Use HotBot's advanced search (hotbot.com) technique to find the name of the book by James Loewen, an American historian, about how schools are teaching history incorrectly.
a) Falsehoods in American History
b) Lies My Teacher Told Me
c) How Schools Lie About History
d) The Truth About America

Question 7 (Multiple Choice, 1 point)
Using a WWW dictionary (hint: look under references from a virtual library), what is the meaning or origin of the word "scherzo"?
a) From the Italian, meaning joke
b) From the Spanish, meaning afraid
c) From the French for running fast
d) From English, meaning slightly crazy

Question 8 (Multiple Choice, 1 point)
What is the zipcode for Silver Springs, Florida?
a) 30338
b) 34488
c) 30336
d) 34650

Question 9 (Multiple Choice, 1 point)
What year did Toyota Corporation begin its first sales operations in the United States? Here's a hint: many companies post information about their research and development, sales figures, history, and other important facts on their business websites.
a) 1957
b) 1964
c) 1954
d) 1983

Question 10 (Multiple Choice, 1 point)
You need information about the date of Monet's "The Sunflowers". What year was this impressionist painting completed?
a) 1908
b) 1890
c) 1881
d) 1900


Appendix D Continued

Table D-19

Cronbach Alpha for Scavenger Hunt: Summer 2003 Students

Question   Scale Mean if   Scale Variance if   Corrected Item-     Squared Multiple   Alpha if
           Item Deleted    Item Deleted        Total Correlation   Correlation        Item Deleted
Q1         6.6279          4.7154              .2429               .4577              .7216
Q2         6.7209          4.1107              .5277               .3320              .6786
Q3         6.7907          4.4551              .2738               .2918              .7215
Q4         6.7209          4.6346              .2124               .7290              .5440
Q5         6.8372          4.1872              .3995               .2595              .7001
Q6         6.6744          4.0819              .6104               .4631              .6678
Q7         6.6047          4.7209              .2711               .3051              .7178
Q8         6.6279          4.3821              .4791               .5855              .6909
Q9         6.8837          4.5814              .1804               .2157              .7397
Q10        6.9070          3.6102              .7111               .5566              .6374

Note: p-value < .05; N of cases = 43. Reliability coefficient on 10 items: alpha = .7241; standardized item alpha = .7262.
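For reference, the alpha values reported in Tables D-19 and D-20 follow the conventional definition of Cronbach's alpha (stated here in standard notation rather than reproduced from the statistical output), where k is the number of items, s_i^2 is the variance of scores on item i, and s_T^2 is the variance of the total scores:

    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} s_i^2}{s_T^2}\right)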


Appendix D Continued

Table D-20

Cronbach Alpha for Scavenger Hunt: Spring 2004 Middle School Students

Question   Scale Mean if   Scale Variance if   Corrected Item-     Squared Multiple   Alpha if
           Item Deleted    Item Deleted        Total Correlation   Correlation        Item Deleted
Q3         5.4146          .6988               .6258               .4735              .3518
Q4         5.3171          1.0220              .2987               .2202              .5255
Q5         5.5122          .8061               .2640               .3451              .5555
Q7         5.2927          1.0622              .3561               .5639              .5249
Q8         5.2927          1.2122              -.1028              .0143              .6085
Q9         5.4634          .8549               .2498               .2048              .5529
Q10        5.3171          .9720               .4226               .6108              .4908

Note: p-value < .05; N of cases = 41. Questions 1, 2, and 6 produced no variability and were therefore omitted; the alpha statistic is based on the remaining 7 items. Reliability coefficient on 7 items: alpha = .5626; standardized item alpha = .5647.
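A minimal computational sketch of the procedure described in the note above (dropping zero-variance items before computing alpha) is given below. It is an illustration of the calculation only, not the analysis code used in the study; the function names are hypothetical, and the sketch assumes a respondents-by-items matrix of 0/1 item scores.

# Minimal sketch (hypothetical names; not the study's analysis code): Cronbach's
# alpha for a respondents-by-items matrix of 0/1 item scores, after removing
# items with zero variance (as was done for Questions 1, 2, and 6 above).

def drop_zero_variance_items(scores):
    """Keep only items that at least two respondents answered differently."""
    keep = [j for j in range(len(scores[0]))
            if len({row[j] for row in scores}) > 1]
    return [[row[j] for j in keep] for row in scores]

def cronbach_alpha(scores):
    """Cronbach's alpha; scores is a list of per-respondent lists of item scores."""
    k = len(scores[0])                      # number of items
    n = len(scores)                         # number of respondents
    def variance(values):                   # sample variance (n - 1 denominator)
        m = sum(values) / n
        return sum((v - m) ** 2 for v in values) / (n - 1)
    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Example usage:
# alpha = cronbach_alpha(drop_zero_variance_items(scores))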


Appendix D Continued

Table D-21

Item Frequencies for Scavenger Hunt: Spring 2004 Middle School Students

Question   Response    Frequency   Percent   Valid Percent   Cumulative Percent
Q1         correct     41          100.0     100.0           100.0
Q2         correct     41          100.0     100.0           100.0
Q3         incorrect    6           14.6      14.6            14.6
           correct     35           85.4      85.4           100.0
Q4         incorrect    2            4.9       4.9             4.9
           correct     39           95.1      95.1           100.0
Q5         incorrect   10           24.4      24.4            24.4
           correct     31           75.6      75.6           100.0
Q6         correct     41          100.0     100.0           100.0
Q7         incorrect    1            2.4       2.4             2.4
           correct     40           97.6      97.6           100.0
Q8         incorrect    1            2.4       2.4             2.4
           correct     40           97.6      97.6           100.0
Q9         incorrect    8           19.5      19.5            19.5
           correct     33           80.5      80.5           100.0
Q10        incorrect    2            4.9       4.9             4.9
           correct     39           95.1      95.1           100.0

Note: p-value < .05; N of cases = 41.


Appendix E

Table E-22

Formative Evaluation Feedback from Pilot: Undergraduate Students (N = 39)

Component: Pre-instructional (initial motivation, objectives, orientation to materials)
Instructor Observations and Participant Comments: The lack of an orientation on how to navigate the material was problematic. When the researcher asked whether students had seen or used the navigation bar, students commented that they had not noticed or accessed the glossary, map, or objectives screen. Because there were no repercussions for completing the material, some reported that they merely skimmed the tutorial and failed to check the links within pages.
Suggestions for Improvement: Students requested instruction on how to navigate the materials. Students did not know they could use the tool to review sections of the material. It was suggested that students be informed that the tutorial covers required material for the course and that they take the instruction seriously.

Component: Presentation (sequence, size, content, examples)
Instructor Observations and Participant Comments: Some students reported that the materials were helpful and that the feedback enabled them to acquire the concepts; others noted that the material was tiresome given that the program would control for correct input from the student. One student reported low motivation due to lack of challenge in the practice exercises.
Suggestions for Improvement: Students suggested that more contemporary examples be included in future versions of the tutorial. Instead of using John Lennon's Strawberry Fields, they suggested that a more contemporary pop artist be chosen, along with current film examples for imdb.com.

Component: Participation (practice, feedback)
Instructor Observations and Participant Comments: Students who reportedly were assigned to the experimental condition gave positive feedback on the Hollywood Squares game. There was one complaint that the program did not respond to three correct answers placed diagonally but did work for correct responses placed horizontally or vertically. One student commented that the feedback on the game was "campy" and "hoaky."
Suggestions for Improvement: Change some of the feedback screens to a more sophisticated level of feedback for freshman undergraduates. Different responses were reported from small-group formative run-throughs with adult students and middle school students. Another suggestion was to imitate the navigation of the extant program, which calls for optional engagement with the exercises. When students were informed that they could have used the tutorial as a reference tool and moved non-sequentially through the material via the "map" icon in the navigation bar, they reported that they would have perceived and used the material differently. It appears that an overview or introduction on how to navigate the material is essential.

Component: Assessment (pretests, posttests, performance context)
Instructor Observations and Participant Comments: When informed on the first two comprehension pretest and posttest items that more than one answer was required, students registered confusion and misread the instructions on these items. Students responded to a true/false question regarding the organization of general subject directories; when shown the link on the introductory page, none of the students indicated that they had seen the link in the instructional material. Students were distressed about their low scores. The researcher explained that their feedback was essential and that the tests only measured gains in their performance scores.
Suggestions for Improvement: It was suggested that the researcher reword the items to emphasize that more than one response was required to answer the question. Students suggested that scores not be posted to their final grade averages to eliminate confusion. Items deemed confusing were modified to reflect both low statistical reliability and student feedback.


Appendix F

ACADEMIC MOTIVATION PROFILE
Modified for Use in an Introductory Management Course in Library Science (Master's Program)

ATTENTION: Various aspects of this course may or may not have gained and held your attention. For each of the course aspects listed below, rate your attention levels using the following responses:

During this course, I was: (responses)
1. Not the least bit interested, and my attention always wandered during...
2. Slightly interested, and my attention frequently wandered during...
3. Moderately interested, and my attention occasionally wandered during...
4. Very interested, and my attention rarely wandered during...

TEXTBOOK AND READING ASSIGNMENTS
4. Information and explanations in the textbook.
5. Examples (e.g., charts, graphs, illustrations, case studies) in the textbook.
6. Information, explanations, and examples in the readings.

LECTURES AND DISCUSSIONS
7. Lectures and explanations given by the professor.
8. Group discussion and the professor's feedback and commentary.
9. Informal interaction with classmates and the professor.

PROJECT ASSIGNMENTS
10. Background work and research preparing for assignments.
11. Completion of assignments in final form to turn in.
12. Review of the professor's feedback and commentary.

RELEVANCE: You may perceive the information and skills covered in this course to be relevant (useful to you) or irrelevant for a variety of reasons. Rate the relevance of this course to you personally using the following responses:

This course was: (responses)
1. Not the least bit relevant (not useful to me) for helping me...
2. Slightly relevant for helping me...
3. Moderately relevant for helping me...
4. Very relevant for helping me...

DURING MY STUDIES AS A GRADUATE STUDENT
13. Prepare for SLIS Comprehensive Exams.
14. Learn necessary professional skills in Library Science.
15. Perform professionally during other classes, site visits to libraries, and/or fieldwork assignments.

DURING TRANSITION TO FIRST PROFESSIONAL POSITION IN A LIBRARY OR TRANSITION TO A NEW JOB ASSIGNMENT IN A LIBRARY
16. Make career decisions about jobs in librarianship.
17. Interview successfully for a first job as a librarian or for a new job assignment.
18. Demonstrate professionalism and skill during a first job or new job assignment.

AS A LIBRARIAN
19. Analyze, plan, and evaluate library policies, programs, and procedures.
20. Manage the day-to-day programs and activities in a library.
21. Work effectively as a professional librarian with patrons, other librarians, and administrators.

CONFIDENCE: Related to your "internal feelings" about your own skill levels (as opposed to the grades you anticipate in this course), rate your level of confidence in performing each of the following course goals using the following responses:

1. I do not feel at all confident in my ability to...
2. I feel slightly confident in my ability to...
3. I feel moderately confident in my ability to...
4. I feel very confident in my ability to...

PLANNING FOR LIBRARY OPERATIONS
22. Analyze needs and plan mission, goals, and objectives.
23. Translate mission, goals, and objectives into library programs and activities.
24. Evaluate the effectiveness of programs and activities and prescribe improvements.

MANAGING LIBRARY OPERATIONS
25. Establish and manage day-to-day operating procedures and activities.
26. Manage and evaluate library staff.
27. Manage facilities, equipment, and collections.

MANAGING PROFESSIONAL ISSUES AND LIBRARY OUTREACH
28. Plan and implement library policies in accord with accepted professional and ethical standards.
29. Identify the library's community and plan for inclusion of stakeholders.
30. Publicize and promote the role of the library and the library's programs in the community.

SATISFACTION: You may or may not have found this course personally rewarding or satisfying for a variety of reasons. Please rate your level of personal satisfaction with the course using the following responses:

During this course I was:
1. Not at all satisfied with...
2. Slightly satisfied with...
3. Moderately satisfied with...
4. Very satisfied with...

MY PARTICIPATION
31. The level of personal effort I expended.
32. My opportunities to discuss library management practices with other students.
33. My opportunities to discuss library management practices with my professor.

PERSONAL DEVELOPMENT
34. My feelings of personal accomplishment.
35. My personal gains in skills required for library management.
36. My personal attitudes and opinions about library management.

PROFESSIONAL AFFILIATION
37. My current perspectives on my role and responsibilities as a librarian.
38. What I now have to offer as a librarian to patrons and colleagues.
39. My potential contributions toward solving management problems in a library.


Appendix F Continued

ACADEMIC MOTIVATION PROFILE
Modified for Use in an Introductory Internet Search Module

ATTENTION: Various aspects of this course may or may not have gained and held your attention. For each of the course aspects listed below, rate your attention levels using the following responses:

During this course, I was: (responses)
1. Not the least bit interested, and my attention always wandered during...
2. Slightly interested, and my attention frequently wandered during...
3. Moderately interested, and my attention occasionally wandered during...
4. Very interested, and my attention rarely wandered during...

ONLINE NARRATIVE TEXT
1. Information and explanations in the online module.
2. Examples (e.g., charts, graphs, illustrations, case studies) in the online module.
3. Information, explanations, and examples in the online narrative overviews.

INFORMATION SEQUENCE AND PRESENTATION
4. Lectures and explanations presented in the tutorial.
5. Feedback and commentary provided within the online module.
6. Interactivity with the online material (how much hands-on interaction with the software).
7. Summarized concepts and how they related to Internet search skills.

PRACTICE EXERCISES
8. Background work and research preparing for practice following the overviews.
9. Completion of practice exercises.
10. Adequate feedback following or during practice exercises.
11. Related new information to what I already knew.

RELEVANCE: You may perceive the information and skills covered in this course to be relevant (useful to you) or irrelevant for a variety of reasons. Rate the relevance of this course to you personally using the following responses:

This course was: (responses)
1. Not the least bit relevant (not useful to me) for helping me...
2. Slightly relevant for helping me...
3. Moderately relevant for helping me...
4. Very relevant for helping me...

DURING MY PARTICIPATION IN THE ONLINE MODULE STUDY
12. Preparation for the comprehension posttest.
13. Usefulness of the module as preparation for the Internet Scavenger Hunt posttest.
14. Learn useful search skills for finding Internet-based information.
15. Perform better on research assignments in school.

USEFULNESS OF INFORMATION FOR RESEARCH IN THE FUTURE
16. Use skills with other Internet databases such as Galileo.
17. Ability to communicate to other students or adults the skills learned from the online module.

AS A RESEARCHER
18. Analyze, plan, and revise research questions.
19. Construct more precise keyword and subject searches with directories and search engines.
20. Demonstrate greater accuracy and efficiency with Internet searches.
21. Know when to use particular Internet search tools such as virtual libraries, specialized databases, subject directories, and search engines.

CONFIDENCE: Related to your "internal feelings" about your own skill levels (as opposed to the grades you anticipate in this course), rate your level of confidence in performing each of the following course goals using the following responses:

1. I do not feel at all confident in my ability to...
2. I feel slightly confident in my ability to...
3. I feel moderately confident in my ability to...
4. I feel very confident in my ability to...

PLANNING INTERNET SEARCHES
22. Analyze research questions and select effective strategies for searching the Internet.
23. Use the most effective tool given a research question.
24. Revise a strategy so that it results in more relevant information sources for your research question.

USING INTERNET STRATEGIES
25. Use AND, OR, and NOT to construct search statements.
26. Use wildcard characters to find alternate word usage.
27. Use parentheses to write nested search statements.
28. Use domain limiters for searching.
29. Distinguish among Internet tools such as virtual libraries, specialized databases, general subject directories, and search engines, and know when to use them.

SATISFACTION: You may or may not have found this course personally rewarding or satisfying for a variety of reasons. Please rate your level of personal satisfaction with the course using the following responses:

During this course I was:
1. Not at all satisfied with...
2. Slightly satisfied with...
3. Moderately satisfied with...
4. Very satisfied with...

MY PARTICIPATION
30. The level of personal effort I expended.
31. My opportunities for discussion with fellow students.
32. My opportunities to discuss Internet search tools/strategies with my teachers.

PERSONAL DEVELOPMENT
33. My feelings of personal accomplishment.
34. My personal gains in skills required for Internet searches.
35. My personal attitudes and opinions about using Internet search tools.
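The profile above is organized into four rating categories (Attention, Relevance, Confidence, and Satisfaction), each answered on a 4-point scale. As an illustration only — this is not the scoring procedure reported for the study, and the function and variable names are hypothetical — responses to an instrument of this kind are often summarized as a mean rating per category, as sketched below; the item groupings are taken directly from the section headings of the Internet search module version above.

# Illustrative sketch (hypothetical names; not the study's scoring procedure):
# summarize 4-point responses as a mean rating for each category of the profile.

def subscale_means(responses, groups):
    # responses: dict mapping item number -> rating on the 1-4 scale
    # groups: dict mapping category name -> list of item numbers in that category
    means = {}
    for category, items in groups.items():
        ratings = [responses[i] for i in items if i in responses]
        means[category] = sum(ratings) / len(ratings) if ratings else None
    return means

# Item groupings from the Internet search module version of the profile:
groups = {
    "Attention":    list(range(1, 12)),    # items 1-11
    "Relevance":    list(range(12, 22)),   # items 12-21
    "Confidence":   list(range(22, 30)),   # items 22-29
    "Satisfaction": list(range(30, 36)),   # items 30-35
}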


ABOUT THE AUTHOR

Emily Dunsker, at the time of publication, serves as the Teacher-Librarian for Chamblee Middle School, a magnet program in DeKalb County, Georgia. Her students won an Apple Innovation Award for an iMovie entitled All Quiet on the Western Front. The Chamblee Middle School Media Center was chosen by the Director of Educational Media as a visitation site prior to receiving the AASL National School Library Media Program of the Year award in 2001. She received her Master's Degree in Library and Information Sciences from the University of South Florida in 1998 and has been a member of the international library honor society Beta Phi Mu since 1999.

