USF Libraries
USF Digital Collections

Development and validation of a web-based module to teach metacognitive learning strategies to students in higher education


Material Information

Title:
Development and validation of a web-based module to teach metacognitive learning strategies to students in higher education
Physical Description:
Book
Language:
English
Creator:
Singh, Oma B
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:
2009
Subjects

Subjects / Keywords:
Design-based research
Design research
Developmental research
Web-based development
Web-based learning
Dissertations, Academic -- Secondary Education -- Doctoral -- USF   ( lcsh )
Genre:
non-fiction   ( marcgt )

Notes

Abstract:
This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was twofold: (a) to examine the use of a systematic ISD process, ADDIE, to develop a web-based module that would be considered valid and effective, and (b) to use the DBR methodology to create relevant outcomes for practitioners in the field of instructional technology (IT) while adding to the body of IT research. As in other DBR studies, a large amount of qualitative data was collected. DBR studies usually call for a variety of data collection instruments; in this study, a total of two interviews and twelve questionnaires were used to gather data. The outcomes of the study suggested that using a systematic approach such as ADDIE to develop a valid and effective interactive web-based module was still viable. Additionally, although the outcomes of this study did not form a basis for proposing a new ISD model, they highlighted five key activities that could be added to the ADDIE process to accommodate development of a quality interactive web-based product. The five activities are as follows: (1) to conduct a detailed front-end analysis, (2) to develop a prototype early in the process, (3) to integrate formative and summative evaluations, (4) to assimilate iterations of "design-evaluate-refine" cycles throughout the process, and (5) to accommodate flexibility within the process. Furthermore, using the DBR methodology yielded results that added to the body of IT research and provided support for the use of this methodology within the instructional technology discipline.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2009.
Bibliography:
Includes bibliographical references.
System Details:
Mode of access: World Wide Web.
System Details:
System requirements: World Wide Web browser and PDF reader.
Statement of Responsibility:
by Oma B. Singh.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 263 pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 002029593
oclc - 436988579
usfldc doi - E14-SFE0002940
usfldc handle - e14.2940
System ID:
SFS0027257:00001


Full Text

PAGE 1

Development and Validation of a Web-Based Module to Teach Metacognitive Learning Strategies to Students in Higher Education

by

Oma B. Singh

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy
Department of Secondary Education
College of Education
University of South Florida

Major Professor: James A. White, Ph.D.
John M. Ferron, Ph.D.
Teri L. Rydl, Ph.D.
William H. Young, III, Ed.D.

Date of Approval: March 3, 2009

Keywords: Design-Based Research, Design Research, Developmental Research, Web-Based Development, Web-Based Learning, Online Learning, Systematic Development, ADDIE, Metacognitive Learning

Copyright 2009, Oma B. Singh

PAGE 2

Dedication

To my mother, Ms. Mahadai Pustam: Words cannot convey my appreciation for all the sacrifices you made for me, the ones known and unknown to me. Thank you for the continuous love and support you provide to me. As a child, I know it was you who instilled in me the importance of education. Maybe I did not appreciate it then, but I do now. All I can hope is that I also will get the privilege to instill in others the importance of education. So, I thank God and you for all my accomplishments, yesterday, today and tomorrow. I love you, Mum.

PAGE 3

Acknowledgements

I wish to acknowledge my courageous husband, Mr. Michael L. Ashley, who supported me one hundred percent throughout this process. I would like to thank Mike for all the kindness, respect, wonderful meals and just about everything else. As noted, I did not accomplish this task by myself, and there are many wonderful people who encouraged me along the way. I would like to thank my major professor, Dr. James A. White, for his guidance and consideration as I navigated through my study. Dr. White's dedication to his students is unsurpassed, and I believe that, like me, all of his students think highly of him. The same can be said of my committee members, Dr. John M. Ferron, Dr. Teri L. Rydl and Dr. Bill H. Young III. Among the many good things I have learned from being mentored by Dr. White and my committee members, I think the most important are to include kindness, respect, patience and humor in your work. I would also like to acknowledge and thank all of my family and friends who understood how important this was to me and helped me with prayers and kind words. I would especially like to thank my sister Sharmatie for always helping me; she is the most generous person I know. I also recognize my friend Ms. Nadia Heinz for going to the library with me, just because I needed the motivation. I also thank Nadia for editing my document, more than once. My deepest gratitude goes to Ms. Susan Cook for her support. Thank you also to Dr. Melissa Venable, who is disciplined about writing and gave me the motivation I needed to complete what I started.

PAGE 4

Table of Contents

List of Tables  v
List of Figures  vii
Abstract  viii

Chapter One: Introduction  1
    Background  2
    Statement of the Problem  6
    Purpose of Study  7
    Research Question  8
    Research Objectives  8
    Significance of the Study  9
    Assumptions  11
    Limitations and Threats of the Study  11
        Instrumentation  11
        Researcher Bias  12
    Delimitations of the Study  12
    Definition of Terms  12
    Organization of the Study  14

Chapter Two: Literature Review  16
    Design-Based Research (DBR)  16
        Overview of Design-Based Research  16
        Paradigm Shift: Design-Based Research and Instructional Technology  24
        A Practical Approach to Using DBR, Evaluating WBI and Advancing Research  26
        Challenges of Design-Based Research in Instructional Technology  28
    Web-Based Instruction (WBI)  34
        Web-Based Instruction for Institutes of Higher Education (IHEs) and Quality Concerns  37
    The Instructional Systems Design (ISD) Process  44
        A Systematic Approach for Web-Based Instruction  48
        The ISD Process: ADDIE  51
    Learning  56
        Metacognitive Learning  67
        Teaching Test-Taking Strategies  68
        Learner Satisfaction and Quality of Web-Based Instruction  73
    Summary  81

Chapter Three: Methods  84
    Overview of the Study  85
        Setting  86
        Sampling  87
        Participants  88
        Ethical Considerations  91
        The Principal Investigator (PI)  91
        Description of the Course for Conversion: Learning Strategies within Academic Disciplines  93
    Data Collection Instruments  94
        Interview  94
        Questionnaires  95
        Observation  100
        Logbook  100
    Research Design  100
        Construct Validity: Overall Study  101
        Construct Validity for Instrument Development: Expert Review of Instruments  104
    Data Reduction and Analysis  106
    Summary of Pilot Study Results: Analysis Phase and Prototype Outcomes  108
        Analysis Phase: Instruments  109
        Analysis Phase: Outcomes  111
        Prototype Development  123
        Analysis Phase: Provisional Lessons Learned  124
    Design Phase  126
        Method  126
        DBR Overview: Design Phase  127
    Development Phase  128
        Methods  128
        DBR Overview: Development Phase  132
    Implementation Phase  132
        Method  132
        DBR Overview: Implementation Phase  133
    Evaluation Phase  133
        Methods  133
        DBR Overview: Evaluation Phase  135
        Evaluation Phase: Evaluation Goal  136
    Summary  137

Chapter Four: Results  138
    Design Phase Results  138
        Design Phase: Expert Review of Analysis Phase Instruments  140
        Design Phase: Analysis of Data  140
    Development Phase Results  142
        Development Phase: Expert Review of Development Phase Instruments  146
        Development Phase: Analysis of Data  146
    Implementation Phase Results  153
    Evaluation Phase Results  154
        Evaluation Phase: Expert Review of Evaluation Phase Instruments  154
        Evaluation Phase: Analysis of Data  154
    Design-Based Research (DBR) Results  160
        DBR Overview: Analysis Phase  161
        DBR Overview: Design Phase  162
        DBR Overview: Development Phase  166
        DBR Overview: Implementation Phase  168
        DBR Overview: Evaluation Phase  168
    Summary  170

Chapter Five: Summary  171
    Discussion of the Research Question and the Theoretical Implications  171
    Discussion of Research Objectives  176
        Research Objective 1  176
        Research Objective 2  177
        Research Objective 3: Deliverable A  178
        Research Objective 3: Deliverable B  182
        Research Objective 3: Deliverable C  186
        Research Objective 3: Deliverable D  187
    Implications Concerning Quality of the Web-Based Module  188
    Overview of DBR Methods for Instructional Design Research and Theoretical Implications  189
    Limitations and Threats  194
    Directions for Further Research  195
    Summary  195

References  197
Bibliography  213

APPENDICES  216
    Appendix A: Results: Pilot Study (Analysis Phase & Prototype)  217
        Needs Analysis  218
        Audience Analysis (For SME)  220
        Task Analysis  222
        Content Analysis  223
        Context Analysis  226
        Specimen A-1. Analysis Phase: Instrument and summary of results  227
        Figure A-1. Screen Shot #1 of prototype of web-based module  230
        Figure A-2. Screen Shot #2 of prototype of web-based module  231
    Appendix B: Results: Design through Evaluation Phases of ADDIE  232
        Specimen B-1. Summary of information derived from ...  233
        Specimen B-2. Development Phase: Instrument and summary of the ... questionnaire  234
        Specimen B-3. Development Phase: Instrument and ... questionnaire  237
        Specimen B-4. Development Phase: Instrument and summary of the results of the ... questionnaire  239
        Specimen B-5. Development Phase: List of refinements from formative evaluations  242
        Specimen B-6. Iteration 1, Evaluation Phase: Instrument and ... questionnaire  244
        Specimen B-7. Iteration 1, Evaluation Phase: List of refinements derived from summative review after iteration 1 of the design-evaluate-refine cycle  247
        Specimen B-8. Iteration 2, Evaluation Phase: Instrument and ... questionnaire  248
        Specimen B-9. Iteration 2, Evaluation Phase: List of refinements derived from summative review after iteration 2 of the design-evaluate-refine cycle  252
        Specimen B-10. DBR Perspective: Instrument and ...  253
        Specimen B-11. DBR Perspective: Instrument and ...  256
    Appendix C: Instructional Development Plan (IDP)  258
    Appendix D: Excerpt from Logbook  260

About the Author  End Page

PAGE 8

List of Tables

Table 1. Number and percentage distribution of 2-year and 4-year Title IV degree-granting institutions, by distance education program status and institutional type and size: 2000-2001  38
Table 2. Percentage distribution of 2-year and 4-year Title IV degree-granting institutions that offered distance education courses in 2000-2001 or planned to offer distance education in the next 3 years, by the planned level of distance education course offerings over the next 3 years, and by the planned primary technology for instructional delivery: 2002  41
Table 3. Percentage distribution of 2-year and 4-year Title IV degree-granting institutions by the extent to which various factors are preventing the institution from starting or expanding distance education course offerings: 2002  42
Table 4. ... based distance learning  43
Table 5. Summary of differences between instruction and instructional design  47
Table 6. ADDIE phases: Questions answered  55
Table 7. Overview of the five conditions of learning  60
Table 8. Instructional events and their relation to processes of learning in design of a computer-based lesson  65
Table 9. Interaction with course interfaces and content: Research findings and practical implications  76
Table 10. Interaction with instructors: Research findings and practical implications  77
Table 11. Interaction with classmates and vicarious interactions: Research findings and practical implications  78
Table 12. Usability design principles for WBI  79
Table 13. Overview of instruments showing relationship between ADDIE and DBR evaluation functions  96
Table 14. List and type of instruments, participants and learner course descriptions  98
Table 15. ...  156

PAGE 10

List of Figures

Figure 1. Design experiment  18
Figure 2. Comparison of development approach and empirical research  23
Figure 3. Extending the role of the learning designer through design-based research  30
Figure 4. Relationship and differences between instruction and instructional design  46
Figure 5. Instructional systems design models  53
Figure 6. Timeline in weeks for web-based development  87
Figure 7. Pictorial representation of construct validity elements included in the research design  101
Figure 8. Overview of research design  103
Figure 9. Execution of research plan  105
Figure 10. Screen Shot 1 of web-based module  143
Figure 11. Screen Shot 2 of web-based module  143
Figure 12. Screen Shot 3 of web-based module  144
Figure 13. Screen Shot 4 of web-based module  144
Figure 14. Screen Shot 5 of web-based module  145
Figure 15. Five activities and how they related to ADDIE  173

PAGE 11

Development and Validation of a Web-Based Module to Teach Metacognitive Learning Strategies to Students in Higher Education

Oma B. Singh

ABSTRACT

This study used a design-based research (DBR) methodology to examine how an Instructional Systems Design (ISD) process such as ADDIE (Analysis, Design, Development, Implementation, Evaluation) can be employed to develop a web-based module to teach metacognitive learning strategies to students in higher education. The goal of the study was twofold: (a) to examine the use of a systematic ISD process, ADDIE, to develop a web-based module that would be considered valid and effective, and (b) to use the design-based research (DBR) methodology to create relevant outcomes for practitioners in the field of IT while adding to the body of IT research. As in other DBR studies, a large amount of qualitative data was collected. DBR studies usually call for a variety of data collection instruments; in this study, a total of two interviews and twelve questionnaires were used to gather data. The outcomes of the study suggested that using a systematic approach such as ADDIE to develop a valid and effective interactive web-based module was still viable. Additionally, although the outcomes from this study did not form a basis to propose a new ISD model, they highlighted five key activities that could be added to the ADDIE process to accommodate development of a quality interactive web-based product. The five activities are as follows: (1) to conduct a detailed front-end analysis, (2) to develop a prototype early in the process, (3) to integrate formative and summative evaluations, (4) to assimilate iterations of "design-evaluate-refine" cycles throughout the process, and (5) to accommodate flexibility within the process. Furthermore, using the DBR methodology yielded results that added to the body of IT research and provided support for the use of this methodology within the instructional technology discipline.
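Read procedurally, the findings summarized above amount to a checklist layered onto the ADDIE phases. The following minimal Python sketch is purely illustrative and is not part of the study: the activity wording paraphrases the abstract, and the assignment of each activity to a particular phase is an assumption made here for illustration only.

# Illustrative only: the five activities reported by the study, expressed as a
# checklist over the ADDIE phases. The phase assignments are assumptions made
# for illustration; the study does not prescribe this exact mapping.
ADDIE_PHASES = ("Analysis", "Design", "Development", "Implementation", "Evaluation")

ADDED_ACTIVITIES = {
    "Analysis": ["Conduct a detailed front-end analysis"],
    "Design": ["Develop a prototype early in the process"],
    "Evaluation": ["Integrate formative and summative evaluations"],
    "All phases": [
        "Assimilate iterations of design-evaluate-refine cycles",
        "Accommodate flexibility within the process",
    ],
}

if __name__ == "__main__":
    for phase in ADDIE_PHASES + ("All phases",):
        for activity in ADDED_ACTIVITIES.get(phase, []):
            print(f"{phase}: {activity}")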

PAGE 13

Chapter One

Introduction

This study applied a design-based research (DBR) approach to develop and validate a web-based module that teaches metacognitive learning strategies to learners within academic disciplines in higher education, utilizing a systematic Instructional Systems Design (ISD) process: ADDIE (Analysis, Design, Development, Implementation, Evaluation). The process of analyzing, designing, developing, implementing and evaluating online content matters, especially if it affects the quality of learning modules, which may in turn affect the learning outcomes of online learners. In other words, quality matters. The quality of online courses should be equal to or better than that of traditional courses (Chao, Saj & Tessier, 2006). Distance education (note: in this study, distance education does not differ in meaning from online, web-based or computer-based education) may no longer be considered a new phenomenon (Phipps & Merisotis, 1999). Between 1995 and 1997, statisticians at the National Center for Education Statistics (NCES) reported that the percentage of 2-year and 4-year institutions offering distance education courses increased from 33% to 44% (Waits & Lewis, 2003). As of 2002, nearly 78% of adult students had completed a web-based course (Parker, 2003). From these percentages, it is apparent that web-based learning is becoming integrated into our education choices.

PAGE 14

Background

Many educational researchers choose to focus on instructional approaches to enhance learning rather than on media comparisons. Education technology researchers observed a need to shift their attention from media comparisons to how learning occurs online (Duin, 1998). The Clark (1994, 1991, 1985, 1983) and Kozma (1994, 1991) research studies and debate on this issue, also cited in Roblyer and Wiencke (2003), influenced that shift in instructional technology research. Clark (1994, 1991, 1985, 1983) maintained that technology in and of itself cannot improve learning outcomes. Clark's view of the research at that time led him to ascertain that there was no significant difference between traditional instruction and web-based instruction (WBI). Rather, Clark thought that the research focus should shift from comparative media studies to studies that would help discover new or improved instructional approaches. Kozma (1994, 1991), in contrast, did not disregard education technology research or media comparisons. However, he did suggest that if online instruction methods were carefully constructed to engage the learner, then technology could provide the basis for successful learning opportunities. Some researchers, including Clark (1994, 1991, 1985, 1983) and Kozma (1994, 1991), saw the futility of focusing on comparative issues between the traditional and online forms of delivery (Brown & Wack, 1999). In essence, this debate accentuated the need for researchers to study other aspects of web-based instruction that could result in more effective methods and improved learner outcomes.

PAGE 15

In fact, at a point when comparative studies between traditional and online learning were at their peak, Moore (1989) encouraged future researchers to give attention to a different aspect of online education: the learners themselves. More recent studies, like those highlighted by Ramage (2002) and what Kozma (1994, 1991) alluded to, infer that there is a difference in learner outcomes among online courses that may be affected by media, method, design, use and evaluation, and researchers should be encouraged to study these aspects. Web-based courses are now woven into the fabric of academia and the corporate world. Many instructional technology researchers are focusing on methods to improve the effectiveness of web-based courses. In this study, the researcher will explore the effect of employing the theoretically based ISD approach to analyze, design, develop, implement and evaluate a web-based module to teach metacognitive objective test-taking learning strategies. The ISD process incorporated into this study is ADDIE. Indeed, there are many ISD models in existence today that include the systematic ADDIE process to some degree (Scafati, 1998; Allen, 2006). Key researchers (Dick, Carey & Carey, 2005; Dick & Carey, 1996, 1990; Seels & Glasgow, 1998; Scafati, 1998) within the instructional design discipline favor a systematic approach to developing WBI. They believe that quality WBI is developed by following a process that will analyze, design, develop, implement and evaluate training (Dick et al., 2005; Dick & Carey, 1996, 1990; Seels & Glasgow, 1998; Scafati, 1998; Clark, 1989). Moreover, according to researchers (Dick et al., 2005; Dick & Carey, 1996, 1990; Scafati, 1998), the models of a systems approach are the result of more than 25 years of research. Scafati (1998, 2004) believes that one of the primary strengths of the systems approach to developing curriculum is defining clear and measurable objectives (2004, p. 389).

PAGE 16

ADDIE provides the foundation of many systematic models that exist today. Allen (2006, 2003) provides a different perspective on the ADDIE process, arguing that it is no longer adequate to sustain the development of WBI that is both high in quality and effective. He identifies the exponential growth in technology (e.g., 3-D and simulation software, advances in network technology, etc.) as one of the main reasons why ADDIE is inadequate (Allen, 2006). Also, ADDIE is generic, and many designers, Allen included, modify the process to suit their individual purposes (Allen, 2006). Allen (2006) is the Chief Executive Officer (CEO) of a successful e-learning development company (p. 33), and he is not alone in his views. Gordon and Zemke (2000) have also notably criticized ADDIE as being inefficient and resulting in ineffective training. However, there may be other factors, beyond not following Dick et al.'s systematic approach, that may also lead to poor-quality online courses. Could it be that quality is being affected because instructional designers may not have enough time to design a quality online course using a systematic approach? This is a valid question when reviewing the growth rate of online (i.e., web-based or distance) courses and the political aspects that are affecting this growth. Currently in the United States, each state oversees its own quality of education. Furthermore, institutes of higher education (IHEs) may be further regulated by different accreditation groups (e.g., SACS, the Southern Association of Colleges and Schools). This situation poses a threat to the future of distance education because distance education programs have to operate under many different regulatory systems (Levine & Sun, 2003).

PAGE 17

What may be considered acceptable quality in one state may not be so in another. Unfortunately, in the process of meeting the growing demand for online access to course material, there also appears to be an increase in poorly designed online courses. In the rush to move courses online, some instructional designers are ignoring, or are unaware of, the systematic approach to developing a web-based course. The results of this research study will help to clarify the need for instructional designers to implement a systematic approach to develop a web-based module. Designing a course, and designing it well enough to meet the needs of the learners, can be a reality. In addition, the outcome of this study will provide instructional designers with refined, practical information on how a systematic approach can be implemented and adapted for their web-based initiatives. Advances in information and computer technology (ICT), working in combination with a decrease in prohibitive costs, are driving IHEs to increase the use of technology in their curricula. In order to meet the growing demands externally (state) and internally (within the IHE), IHEs are introducing web-based courses into their curricula within a short period of time and are rapidly increasing the overall number of web-based courses. The results of a 2006 survey of approximately six hundred 2- and 4-year colleges showed that the share of college classrooms with wireless networks, which could facilitate an increase in accessibility to online courses, had increased from 42.7% (a little over two-fifths of the population surveyed) in 2005 to 51.7% in 2006 (The Campus Computing Project, 2006). To demonstrate how rapidly IHEs are moving their courses online, the National Center for Education Statistics conducted a series of studies on distance education in 2-year and 4-year institutions (Waits & Lewis, 2003).

PAGE 18

For example, in 2000-01, 90% of public 2-year and 89% of public 4-year institutions, as well as 16% of private 2-year and 40% of private 4-year institutions, offered distance education courses (Waits & Lewis, 2003). Furthermore, researchers (Allen & Seaman, 2007) funded by The Sloan Consortium (Sloan-C), who conducted a more recent study of the growth of online courses in over 2,500 colleges in the U.S., stated that in Fall 2006 the number of students taking at least one online course had increased by 10% over the previous year (p. 1). In regard to IHEs, in Fall 2006 at least 20% of all students surveyed had taken one online course (Allen & Seaman, 2007). The researchers of the Sloan-C study (Allen & Seaman, 2007) also found that the enrollment growth rate for online courses during the Fall semester of 2006 was 9.7%. This growth rate surpassed the overall growth rate for students enrolled in higher education, which was only 1.5%. As a result, the quality of web-based courses is a challenge and a concern among educators, considering the rate at which education is expanding online.

Statement of the Problem

There has been a revelation in the last 10 years concerning web-based courses and curricula, and it is that the problem is not the rise in the number of web-based courses but rather that the design and educational content are poor in quality (Janicki & Liegle, 2001; Mariasingam & Hanna, 2006). A successful learning outcome for web-based learners is dependent on instruction that is well designed and developed (Simonson, Smaldino, Albright, & Zvacek, 2003). A poorly designed web-based course can add to other problems that may lead to poor learning outcomes. Poor learning outcomes stemming from less than adequately designed web-based courses need to be addressed by researchers in this field.

PAGE 19

If instructional designers are currently designing online courses based on a systematic approach, then they need to understand clearly what works and what does not work in practical terms. There is a need in this field to clearly identify how the theoretically based systematic approach of instructional systems design (ISD) is being applied to web-based initiatives for curricula. With the accessibility of web-based courses on the rise, it is critical that instructional designers of web-based courses understand the importance of using a systematic approach in design, as well as be aware of what translates best from the theoretical to the practical.

Purpose of Study

There were two purposes in conducting this study. The first purpose was to provide instructional designers with a practical guideline for designing web-based courses that maintains a systematic approach and adheres to the foundational strength of ISD theories. To meet this purpose, the study focused on developing an intervention using the ADDIE process from which a positive learning outcome was derived. The outcome of this study provided instructional designers and web-based instructional designers with guidelines to make better design decisions. Furthermore, it was the hope that this study would help instructional designers and researchers gain a deeper understanding of the ISD processes involved in WBI that is pedagogically, theoretically and practically sound. Answering the call of many noted researchers in this field to use a DBR approach to study the problems and issues found within the instructional technology discipline was the second purpose of this study. Within the instructional technology discipline, DBR has been gaining momentum (van den Akker, Gravemeijer, McKenney & Nieveen, 2006, p. 3).

PAGE 20

As van den Akker et al. (2006) pointed out, many definitions of DBR exist and will be explored further in Chapter Two; one definition of DBR, by Barab, Arici, and Jackson (2005), is as follows:

Design-based research is a collection of innovative methodological approaches that involve the building of theoretically inspired designs to systematically generate and test theories in naturalistic settings. Design-based research is especially powerful with respect to supporting and systematically examining innovation. (p. 15)

The DBR approach holds the possibility of providing deeper insights and practical outcomes that can truly aid the practitioners in this field. Also, DBR provides the opportunity to study foundational theories from a new perspective, thereby shedding light on factors that may have become obsolete or could be re-energized and utilized in a new fashion. From a DBR perspective, the guideline that emerged from the data gathered in the present study helped to refine the ADDIE process and provided an opportunity to explore the emergence of a new model altogether, although this was not a primary objective of the study.

Research Question

What is the effect of applying a systematic approach to the development of a web-based module for teaching metacognitive learning strategies to students in a higher education environment?

Research Objectives

Research Objective 1: To create a systematically and rigorously designed product intended to meet research design goals.

PAGE 21

Research Objective 2: To produce data that indicated the validity and effectiveness of the product.

Research Objective 3: Deliverables were:

Deliverable A: A list of generalized ...
Deliverable B: A report on the effectiveness of the specific instructional strategies utilized.
Deliverable C: An analysis of quantitative, qualitative and descriptive outcome measures of learning among field test participants.
Deliverable D: A module that was considered valid and effective at the juncture where the study completed a second iteration of the design-evaluate-refine cycle. Consideration of the module's validity and effectiveness was derived using data collected via formative and summative evaluations guided by the ADDIE process.

Significance of the Study

The significance of this study was twofold. First, it was important to let readers understand that the research approach, design-based research (DBR), was relatively new within the field of instructional technology and within education itself. Widespread adoption of the DBR approach within education was encouraged by many key researchers (Edelson, 2002; Collins, 1992; Reeves, Herrington, & Oliver, 2005; Reeves, 2000; van den Akker, 1999; Brown, 1992). In fact, a growing number of researchers (Reeves, 1995, 2000; Resnick, 1999; van den Akker, 1999) are strong advocates of the DBR approach. They support DBR because they believe that it benefits the instructional technology (IT) discipline by providing more socially relevant information to designers and developers.

PAGE 22

Reeves (1995) and Reeves et al. (2005) have opined that a lack of socially relevant studies in the field of instructional technology is a major dilemma that needs to be addressed. Second, it is critical within any field to study anew long-held theoretical approaches. In this case, the study of practical usage of the systematic approach of instructional design using the generic ADDIE process was long overdue. A possibility exists that technological growth and improvements in ICTs within recent years have made an impact on the manner in which instructional design is conducted. Additionally, ADDIE may or may not have evolved alongside these technological changes; therefore, it was vital that research be conducted on this process. Some practitioners in the field of instructional technology eschewed the systematic approach, saying that this approach was a poor fit in the practical instructional development world (Allen, 2006). In contrast, some education researchers were uncomfortable with the idea that quality instructional design can be accomplished without a systematic approach. It is important, especially in instructional technology, to create a bridge between the theoretical and the practical approach to create WBI that will add value in terms of quality and effectiveness. The nature of instructional design and development should accentuate the need for researchers and practitioners to work closely together. These were some of the expectations for this study. In addition, the researcher believed that this study emphasized the need for more avenues where the academic and practical worlds of instructional design collide and coalesce.

PAGE 23

Assumptions

There were two assumptions made by the researcher in this study. First, noted researchers (Reeves et al., 2005; Reeves, 2000; van den Akker, 1999; Brown, 1992; Collins, 1992; Design-Based Research Collective [DBRC], 2003) in the field of instructional technology believed that DBR is the appropriate methodology to advance the body of research. This shift in paradigm has come as a result of criticisms levied at the body of research in the field of instructional technology. Many researchers claim that instructional technology research has a diluted impact on practitioners. Second, a systematic approach to design and development can be used to inculcate high-quality web-based instruction (WBI). Unfortunately, it appears that this approach is being neglected, since there are many courses being delivered via the Web in which content is taken from any source and put on the Web as fast as possible with little regard for appearance and usability (Whatis.com, 2007, para. 1). In this study, the researcher assumed that the systematic approach to instructional design was a productive way to create effective WBI. The researcher proposed that the ADDIE process would serve as a guideline to create an innovative web-based intervention that is high in quality and effectiveness.

Limitations and Threats of the Study

Instrumentation

A variety of instruments were utilized to gather information in each phase of the ISD process and to create the proposed web-based module. The possibility existed that internal validity may have been compromised. Pre-testing and post-testing instruments were not employed in this study. However, some measures were taken to counteract any internal validity threat.

PAGE 24

Notably, for each questionnaire, questions were derived from credible research sources, and each questionnaire was expertly reviewed. For the present study, information was collected via questionnaires, interviews and an observation.

Researcher Bias

In regard to the qualitative data that was collected, the researcher guarded against bias when reporting results of exploratory or open-ended information. The researcher engaged in critical self-reflection to enhance awareness of any biases or predispositions, thus reducing the threat of researcher bias (Johnson & Christensen, 2004). Another strategy employed in the study was to have the data and analysis reviewed by an editor.

Delimitations of the Study

The present study was delimited in that the design-evaluate-refine iterations were limited to one in the Development phase and two in the Evaluation phase of ADDIE. Information gathering occurred in a naturalistic setting. One of the strengths of conducting the study within this setting was that the results could be generalized across populations to a certain extent. The results may be generalized to adults eighteen and over but are limited in regard to a younger population.

Definition of Terms

In order to provide clarification to the reader, the following is a list of terms and what they connote in the present study. Please note that definitions with no citations are terms defined by the researcher.

PAGE 25

ADDIE (Analysis, Design, Development, Implementation, Evaluation): A conceptual framework of the ISD process (Bichelmeyer, 2005).

Cognitive Strategies: Numerous ways by which learners guide their own learning, thinking, acting, and feeling (Driscoll, 1994, p. 341).

Effectiveness: The term "effectiveness" refers here to perceived effectiveness; that is, the effectiveness of the product is an interpretation made by the participants of the study, such as the instructional design experts and the learners.

Internet: The Internet is a massive network of networks, a networking infrastructure. It connects millions of computers together globally, forming a network in which any computer can communicate with any other computer as long as they are both connected to the Internet. Information that travels over the Internet does so via a variety of languages known as protocols (Webopedia.com, 2007, para. 2).

Instruction: The deliberate arrangement of learning conditions to promote the attainment of some intended goal (Driscoll, 1994, p. 332).

Instructional Design: The systematic and reflective process of translating principles of learning and instruction into plans for instructional material, activities, information resources, and evaluation (Smith & Ragan, 2005, p. 4).

Instructional Systems Design (ISD): The incorporation of processes to develop instructional materials that can facilitate learning that has measurable outcomes (Seels & Glasgow, 1998, p. 7).

Learning: A change in human disposition or capability, which persists over a period of time, and which is not simply ascribable to processes of growth (Gagné, 1977, p. 3).

PAGE 26

Metacognition: One's ... self-regulatory behavior (Driscoll, 1994, p. 103).

Web-based Instruction (WBI): Instruction designed to be delivered on the computer using the Internet, the World Wide Web (www) and its resources. The intervention can include interaction, feedback, and knowledge and skills transfer to facilitate learning.

World Wide Web (www): The World Wide Web, or simply the Web, is a way of accessing information over the medium of the Internet. It is an information-sharing model that is built on top of the Internet. The Web uses the HTTP protocol, only one of the languages spoken over the Internet, to transmit data. Web services, which use HTTP to allow applications to communicate in order to exchange business logic, use the Web to share information. The Web also utilizes browsers, such as Internet Explorer or Netscape, to access Web documents called Web pages that are linked to each other via hyperlinks. Web documents also contain graphics, sounds, text and video. The Web is just one of the ways that information can be disseminated over the Internet. The Internet, not the Web, is also used for e-mail, which relies on SMTP, Usenet news groups, instant messaging and FTP. So the Web is just a portion of the Internet, albeit a large portion, and the two terms are not synonymous and should not be confused (Webopedia.com, 2007, para. 3).

Organization of the Study

In summary, Chapter One relates the background, problem, purpose, research question and objectives of the present study. It also describes the significance of conducting the research, the delimitations and the purported threats of the study. Chapter Two is the literature review, and the discourse covers several foundational research studies regarding design-based research (DBR), web-based instruction (WBI), instructional systems design (ISD), the ISD process ADDIE (Analysis, Design, Development, Implementation, Evaluation), and learning.

PAGE 27

Several topics pertinent to this study that relate to learning include metacognitive learning and teaching test-taking strategies. The dialogue in Chapter Three describes the methods, procedures, research design, participants and the various instruments utilized. An account of the pilot study, the first phase of the ISD process (Analysis), is also included in Chapter Three. Finally, Chapter Four and Chapter Five cover the results of the present study and the summary of the results, respectively.

PAGE 28

Chapter Two

Literature Review

This chapter reviews the literature regarding four topics applicable to this research: design-based research (DBR), web-based instruction (WBI), instructional systems design (ISD), and learning. These broad topics have been narrowed down to focus on particular areas such as DBR within instructional technology, WBI in higher education, ADDIE within the ISD process, and metacognitive learning, teaching test-taking strategies, learner satisfaction and quality of WBI. Also, because there is a wide range of terminology referring to design and developmental research, for the purpose of clarity the term design-based research (DBR) is used here to encompass all the variations.

Design-Based Research (DBR)

Overview of Design-Based Research

Education researchers are generally pursuing two main objectives: to better understand how people learn within their learning environment, and to design effective interventions to achieve positive learning outcomes (DBRC, n.d., para. 1). DBR may be relatively new to education, but that is not the case in other disciplines (Bannan-Ritland, 2003; Bereiter, 2002; Collins, 1999; van den Akker, 1999). For instance, the engineering, medical and psychology fields were early adopters of DBR and have utilized it to sustain innovation and development (Barab, Arici, & Jackson, 2005; Bannan-Ritland, 2003; Bereiter, 2002).

PAGE 29

Furthermore, according to the Association for Information Systems (AIS) (2006), adoption of DBR is currently growing in other disciplines besides education, such as the information systems (IS) arena at many colleges of business. Researchers (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Bereiter, 2002) pointed out that the nature of DBR sustains innovation and development within the disciplines that adopt it, which may provide an insight as to why researchers in the disciplines mentioned above are using DBR in their studies. In education, specifically in the instructional technology pedagogy where the emphasis is on designing effective interactions, the research methods that inspire innovation and sustained development are either missing or poorly executed (Sandoval, 2004; Bereiter, 2002; Reeves, 2000). Supporting the use of DBR in education research, Sandoval (2004) stated that a fundamental aspect of DBR is its ability to embody conjecture about the curriculum, interventions, design tools and interaction structures. Before going further, a clear picture of the origins of DBR and some definitions is warranted. In the early 1990s, two influential studies were conducted by scholars Ann Brown (1992) and Allan Collins (1992) (as cited in Collins, Joseph, & Bielaczyc, 2004; Bannan-Ritland, 2003); these studies introduced design experiments to the world of research (as cited in Collins et al., 2004, p. 15; Sandoval & Bell, 2004, p. 199). Brown's design experiment was a naturalistic, interactive system, consisting of outputs and inputs, which can contribute to learning theories and promote the feasibility of an intervention.


contribute to learning theories and promote feasibility of an intervention. Figure 1 shows a representation of a design-based experiment. Part of Brown's contribution was her belief that the learning system is complex in nature; therefore, studying the various elements of the system in isolation or in a laboratory environment constricted the delivery as well as the outcome. In other words, studying an intervention in a synthetic environment did not account for the dynamic nature of the classroom, where the actual context of learning occurred.
Figure 1. Design experiment. Note. From "Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings," by A. L. Brown, 1992, Journal of the Learning Sciences, 2(2), pp. 141-178.
Brown compared her experience of conducting experiments in the laboratory versus her experience in the classroom. She found that the classroom was a dynamic environment, and when she placed the intervention in the classroom the results were different from the laboratory results. Once in the classroom, Brown discovered that she


could change the intervention to produce positive learning outcomes. She followed an iterative process of introducing the intervention, making changes, introducing the updated intervention, making further changes and repeating the process. Brown perceived the design experiment as iterative, encouraging innovation and providing a means of sustainability, while at the same time advancing ideas on learning theories and practical application. Brown acknowledged that design research is complex to execute, but the information it could yield was pertinent within the pedagogy. There is a large amount of literature on DBR, and the outcome of the research thus far proposes a variety of definitions. It is apparent that a clear definition of DBR is still being debated in the academic community (DBRC, 2003; Bell, 2004; van den Akker, 1999). Here are several attempts at defining DBR that may help the reader gain a better understanding of DBR. One of the simplest definitions that embodies the overall goal of DBR was put forward by Joseph (2004): design-based research approaches research in education by using intervention to provide insights into learning in real-world context (p. 235). A more comprehensive definition of DBR comes from Cobb et al. (2003): design experiments entail both engineering particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations that result play a role similar to that of systematic variation in experiment. (p. 9)


Another position on DBR comes from van den Akker (1999): Development research is often initiated for complex, innovative tasks for which only very few validated principles are available to structure and support the design and development activities. Since in those situations the image and impact of the intervention to be developed is often still unclear, the research focuses on realizing limited but promising examples of those interventions. The aim is not to elaborate and implement complete interventions, but to come to (successive) prototypes that increasingly meet the innovative aspirations and requirements. The process is often cyclic or spiral: analysis, design, evaluation and revision activities are iterated until a satisfying balance between ideals and realization has been achieved. (p. 7)
These definitions of DBR, and there are others that are similar, are indicative of a scholarly process of an emerging trend within the pedagogical society. Researcher van den Akker (1999) points out that the myriad of terminology already in existence is another indication that DBR is an emerging trend within instructional technology. The following is a list of terminology that was/is used by various researchers, drawn from the research of van den Akker (1999), Reeves et al. (2005), Hoadley (2002), Brown (1992) and Collins (1992): (a) Design studies, Design experiments, Design research, Design-based research, Design-based research methods; (b) Development/Developmental research; (c) Formative research, Formative inquiry, Formative experiments, Formative evaluation; (d) Action research; and (e) Engineering research. For the present study, this researcher has chosen the term design-based research, or DBR, to encompass all the terms listed previously (DBRC, 2003).


Some researchers (Collins et al., 2004; DBRC, 2003; Bell, 2002) offered further insight on the relevance of DBR and how it differs from other types of research. Collins et al. (2004) as well as the DBRC (2003) stated that DBR addresses several research needs that make it unique among other research methodologies. Some overlying needs that DBR addresses are (Collins et al., 2004):
1. The need to address theoretical questions about the nature of learning in context.
2. The need to approach the study of learning phenomena in the real world rather than the laboratory.
3. The need to go beyond narrow measures of learning.
4. The need to derive research findings from formative evaluation.
Adding to the research supporting the use of DBR, Cobb et al. (2003) as well as Wang and Hannafin (2005) pointed out five characteristics of DBR that distinguish it from other research methodologies: (a) DBR is grounded; in effect, the purpose of DBR is not simply to develop theories (Cobb et al., 2003, p. 10) but to actively aid in the process and creation of design interventions in learning. (b) DBR is pragmatic, since it helps to initiate, promote and support innovation to improve the learning process. (c) DBR is integrative, prospective and reflective; Cobb et al. (2003, p. 10) explained that DBR is prospective when its designs and hypothesized processes of learning are rigorous and can withstand scrutiny, and it is reflective because the design interventions and their effects on the learning process are based on conjecture. This unique feature of DBR permits other conjectures to be introduced if the original one is refuted. (d) This leads to the fourth feature of DBR: DBR designs are iterative or


cyclical, interactive and flexible. In other words, iterations of design change and revision pay attention to evidence of learning (Cobb et al., 2003, p. 10). (e) Finally, DBR is contextual; that is, the theories developed by DBR must have the ability to work in the intended learning environment. Further distinctions between DBR and other methods of research can be derived from investigating the outcome of the methodological processes. As noted by Bereiter (2002), DBR cannot be defined by its methodology, since it can utilize many different methods, but by its purpose. Some researchers (Robyler, 2005; Reeves et al., 2005; Collins et al., 2004; Reeves, 2003; van den Akker, 1999) ventured to point out the differences between DBR and several other research methodologies. They do this by examining the difference in perspectives utilized in the framework and the goals achieved. In Figure 2, Reeves (2000) explained the different framework and outcome between experimental and DBR methodologies. Figure 2 highlights the difference in outcomes between an empirical and a DBR approach. The iterative nature of DBR means that the research for the solution to a problem will go through a process of testing and refinement, and it will have an impact on theory and practice. Furthermore, Reeves's (2000) Figure 2 shows the outcome of experimental research as refining theories and creating new hypotheses to be tested. In addition, various comparative studies of DBR, qualitative and experimental methodologies have been conducted (Robyler, 2005).


Robyler (2005) sets the outcome of experimental research as having the ability to generalize the intervention to many sites, whereas qualitative research is focused on one site. Robyler (2005) pointed out that design decisions are based on two things: (a) objectivity, implying quantitative-type research such as experimental and quasi-experimental studies; and (b) subjectivity, implying qualitative methods such as narratives, phenomenologies, ethnographies, grounded theory studies, or case studies.
Figure 2. Comparison of development approach and empirical research. From "Enhancing the worth of instructional technology research through design experiments and other developmental research strategies," by T. C. Reeves, 2000, paper presented in the session Perspectives of Instructional Technology Research for the 21st Century, sponsored by SIG/Instructional Technology at the annual meeting of the American Educational Research Association, New Orleans, LA.
The goal of DBR is the advancement of theory and practice by designing interventions. Moreover, the intervention designs are based on theories, and measuring


their effects on the learners within the classroom environment involves both qualitative and quantitative methods (Dede, 2005). The DBR approach may help researchers better understand the complexity that is involved in designing and developing learning interventions and the role teachers play in making learning material effective in the classroom (DBRC, 2003). Furthermore, DBR offers innovative insights to the design process that may add meaning to existing learning theories or create new ones (Sandoval & Bell, 2004; Dede, 2005). DBR also offers the opportunity for researchers and practitioners to work closely together to develop and design better learning environments.
Paradigm Shift: Design-Based Research and Instructional Technology
Dr. Thomas C. Reeves, a noted scholar in the field of IT, pointed out that after decades of experimental instructional technology research with limited theoretical or empirical payoff, we are now left with few validated principles to guide practice, especially in K-12 schools, higher education, and business (p. 11). Reeves (2000) continued to support his belief by citing another top researcher in the IT field, Lauren Resnick (1999), who concluded that the research conducted so far within the instructional technology pedagogy has contributed very little to the solution of education problems. This view is corroborated by another noted IT researcher, van den Akker, who is of the opinion that instructional designers are unable to find relevancy from IT studies because the studies "are too narrow to be meaningful, too superficial to be instrumental, too artificial to be relevant, and, on top of that, they usually come too late to be of any use" (van den Akker, 1999, p. 2).


According to van den Akker (1999), DBR can be interpreted differently among various education disciplines. For example, within a curriculum discipline the major goal is to inform the development of a product/program in order to improve the product/program being developed (van den Akker, 1999, p. 3). Within the media and technology education discipline, development research focuses on formative evaluation and program improvement (van den Akker, 1999). Researchers (Reeves et al., 2005; Reeves, 2000; Robyler, 2005), including van den Akker (1999), also hold a broader view of DBR. These researchers support the idea that DBR should emphasize using technology and theory for the creation of new designs and should improve aspects of learning such as communication, instructional interventions and performance. Instructional designers are often challenged when faced with the dynamic nature of their task and sometimes seek help from past research studies. However, van den Akker (1999) pointed out that research usually does not meet their needs due to lack of relevance in terms of superficiality and timeliness. Further issues with the existing body of instructional technology research relate to the lack of consistency concerning methodology (Bell, 2004). Brown (1992) examined the disconnect that occurs when testing learning designs in the classroom versus in laboratories. As mentioned previously, Brown (1992) believed that the laboratory environment minimized the dynamic nature of the classroom. Research outcomes are affected by this disconnect. It has also been argued that research methodology within education is not clearly defined (1999, p. 177). Another issue is one of social relevance; Reeves (2000)


and Reeves et al. (2005) called for more research within instructional technology that addresses complex learning problems, focuses on pedagogical methods as opposed to technology per se, increases collaboration between practitioners and researchers, refines the learning environment or reveals new designs, and is highly collaborative. Early calls for changes in research methods for instructional technology came from Clark (1994, 1991, 1985, 1983). Several distinguished researchers (Kozma, 1994, 1991; Brown, 1992; Collins, 1992) conducted relevant, rigorous and specific research on technology in education that could advance pedagogy and address concerns. More recently, several researchers (Dawson & Ferdig, 2006; Reeves et al., 2005, 2000; Robyler, 2005; Schrum, Thompson, Sprague, Maddux, McAnear, Bell & Bull, 2005; Barab & Squire, 2004; Bell, 2004; Collins et al., 2004; Cobb et al., 2003) have made an effort to inform and encourage researchers in instructional technology to engage in DBR. They support the belief that DBR can add coherence in respect to methodology, relevance, and rigor, thereby advancing the body of instructional technology research.
A Practical Approach to Using DBR, Evaluating WBI and Advancing Research
In a survey of instructional designers, Cox and Ogsuthorpe (2003) reported that designers spent most of their time, 23%, on original design work, followed closely by 22% on administrative and project management tasks. In addition, they spent 14% of their time in meetings. Surprisingly, 12% of their time was spent conducting research, including by designers outside of academia for whom research was not a job requirement. These


findings by Cox and Ogsuthorpe (2003), especially regarding the research aspects of the instructional designer's work, are encouraging to researchers like Reeves and Hedberg (2003), who see practitioner involvement in research as a means to advance design principles of interactive learning systems. Their book on evaluating interactive learning systems provided guidelines for practitioners to follow. In an exclusive online interview about the book, Reeves, when asked about the importance of evaluating interactive learning systems, emphasized that evaluation is integral to their development (DistanceEducator.com, 2003, para. 7). Moreover, the terms assessment and evaluation were clearly distinguished, as they were in the book. Although both assessment and evaluation inform decision making, assessment entails measuring what learners know or can do, whereas evaluation concerns the worth or effectiveness (DistanceEducator.com, 2003, para. 23) of a product or program and often involves making a judgment. In their book, Reeves and Hedberg (2003) also presented an evaluation model that listed six forms of evaluation associated with the different phases of the design and development of an interactive learning system, web-based instruction, or multimedia product (Reeves & Hedberg, 2003, p. 58). The six functions of evaluation are as follows: (a) review, which affords the developer clarification as to why the product is necessary; (b) needs assessment, which is crucial because it guides the instructional development process and supplies project objectives in addition to design components; (c) formative evaluation, which occurs as the product is being developed and pays attention to the details of


the interface and learning objectives; (d) effectiveness evaluation, which reviews the product in context; (e) impact evaluation, which, as the term implies, examines how well the product integrates into the organization in regards to strategy and training goals; and (f) maintenance evaluation, which can aid tremendously in continuous growth and improvement of the product but is often neglected (Reeves & Hedberg, 2003). Informed by these evaluation functions and guided by the four phases of the DBR approach (Reeves, 2000) mentioned in a previous section (see Figure 2), Seeto and Herrington (2006) developed a guide that offers insight on design principles to practitioners and researchers. The guide that Seeto and Herrington (2006) developed maps the four phases of DBR to the five phases of ADDIE and includes the six functions of evaluation (see Figure 3 in Appendix). The information is pertinent to the present study, especially from a methodological perspective. In this guide, Seeto and Herrington (2006) examined all the phases of ADDIE and provided a guide to researchers and/or practitioners on how the phases can be evaluated. Along with this information, they shared possible outcomes of the evaluation. Undoubtedly, there are benefits of using the DBR approach to further advance the body of research within instructional technology. However, there are also challenges to be faced, as described in the next section.
Challenges of Design-Based Research in Instructional Technology
DBR is a relatively new research approach and holds its own set of challenges. The DBRC (2003) and another group of researchers, @Peer Group (2006), outlined several challenges facing DBR researchers: credibility of data, generalizability, and collaborative partnership. In addition, three more challenges can be added to this list


(@Peer Group, 2006): sustainability, funding and publication, and achieving Institutional Review Board (IRB) approval. To understand clearly how these challenges can affect research, an explanation of each challenge follows. As mentioned in a DBRC (2003) journal article, a credibility gap exists in education research (Levin & O'Donnell, 1999). The gap exists partly because of a lack of consensus within the discipline as to what constitutes credible evidence. Another reason for the existence of the credibility gap is that theories are not well articulated in practice to display how well they work or do not work. Credibility also depends on whether the data can withstand validity, objectivity and reliability tests (@Peer Group, 2006). With DBR research, this issue of credibility is problematic. In DBR there is interaction, rather than separation, between context and intervention. Moreover, there is social interaction, which may result in the Hawthorne effect (i.e., when attention is paid to participants they react by trying to perform tasks at a higher than normal level) (Levin & O'Donnell, 1999). Nevertheless, it has been conceded that DBR does offer credibility, though on a limited basis, via its iterative nature (2004, p. 256). Credible evidence can be established with DBR if the outcome of the intervention can be replicated, relationships established, and appropriate group comparisons made. Likewise, the group of researchers that comprise the DBRC hold a similar view of how credibility can be established (Cobb, 2001; Collins, 1992; as cited in DBRC, 2003, p. 5).


Figure 3. Extending the role of the learning designer through design-based research. Note. LD = Learning Designer. From "Design-based research and the learning designer," by Seeto and Herrington, 2006, in L. Markauskaite, P. Goodyear, & P. Reimann (Eds.), Proceedings of the 23rd Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education: Who's Learning? Whose Technology? (pp. 741-745). Sydney: Sydney University Press.


Generalizability of results is another challenge for DBR researchers. In general, if any success with a particular intervention within a particular context can be claimed, then the research outcome should be replicated in various contexts to claim generalizability. Critics often note that if learning can be claimed in one context, it could be due to other factors not measured, such as interaction with the environment, instructor, learners or numerous other elements (DBRC, 2003). To address this argument, the DBRC (2003, p. 5) researchers noted that, from the DBR viewpoint, the educational intervention is in itself an outcome of the context. Collaborative partnership is another issue that can be a challenge for DBR researchers. Instructional technology research in secondary education, according to Reeves et al. (2005), may possibly involve one or a few researchers from a single department. However, DBR encourages collaboration among many disciplines (Reeves et al., 2005). As an example, DBR was conducted for the Quest Atlantis project (Barab, Arici & Jackson, 2005), a 3-D interactive narrative environment developed for 9- to 12-year-old children. The research team drew from Early Childhood Education as well as Curriculum and Instruction research areas.


Reeves et al. (2005) highlighted the collaborative nature of DBR further by providing a hypothetical problem: geoscience instructors in large universities are frustrated with having to teach the fundamental concepts repeatedly in their classes. According to Reeves et al. (2005), using a DBR approach, the collaborative aspects would include a diverse team of geoscience faculty members (creating the research core), instructional designers, programmers, educational researchers, and multimedia specialists. Sandoval and Bell (2004) support Reeves et al.'s (2005) views on the collaborative aspects of DBR and add a distinction between the research and design aspects. From the research aspect, researchers engaged in DBR can come from a range of disciplines (Sandoval & Bell, 2004, p. 200). The challenge in collaboration lies with the length of time it takes for a DBR study to be completed. It may take up to two years or more before a DBR study is completed (@Peer Group, 2006; Reeves et al., 2005). A study conducted over a long period of time may be subject to participant burnout, loss of motivation, and other unexpected factors (@Peer Group, 2006). Three more challenges face DBR researchers: sustainability, funding and publication, and IRB approval (@Peer Group, 2006). Lack of sustainability can be addressed when an intervention is motivating and self-sustaining for the instructors who use it (@Peer Group, 2006, para. 11). In addition, including the instructors in the design of the intervention would build commitment to the project and help them to perceive the value


of the intervention from a research viewpoint (@Peer Group, 2006, para. 12). Funding and publishing a DBR study is difficult because such a study is not easily categorized and is not standardized (@Peer Group, 2006; Collins et al., 2004). Collins et al. (2004) noted that DBR's diverse methodologies are considered new and lack standardization within the research community, making publishing and funding studies difficult. To overcome the problem, the @Peer Group (2006, para. 15) suggested that DBR researchers produce and share studies that typify the core elements of DBR (e.g., iterative/cyclical, integrative, interactive, and flexible). Another challenge for design-based researchers is obtaining IRB approval (@Peer Group, 2006). IRB reviewers have a clear set of rules and guidelines that they follow. The design-based researchers are already challenged with the scope of their DBR study and the length of their research study. IRB reviewers are seeking a clear start and end time, but the iterative and flexible nature of DBR studies makes it difficult to supply this information (@Peer Group, 2006). To overcome this problem, @Peer Group (2006) recommended splitting the study into various phases and submitting approval requests to the IRB for each phase. Another recommendation made by @Peer Group (2006) is to clearly state the main idea and course of the study in the first submission for IRB approval. Some researchers in the field of instructional technology strongly campaign for the inclusion of DBR as a viable methodology. DBR answers the call among distinguished instructional technology scholars to use a methodology that is focused on either creating new theories or enhancing existing ones. In addition, DBR concentrates on


refining the design process through an iterative process that includes a real-world environment. DBR bridges the gap between the work of researcher and practitioner, the theoretical and the practical.
Web-Based Instruction (WBI)
Learning occurring on the World Wide Web (www) using the Internet has given rise to various terminology (Bitpipe.com, 2007) such as: Web-Based Training (WBT), Interactive Training, Online Tutorials, Technology-Based Learning, Computer-Based Training (CBT), Electronic Learning, Interactive Learning, Internet-Based Learning, Web Learning, Computer-Based Learning (CBL), Computer-Based Instruction (CBI), Media-Based Training (MBT), Web Training, Online Learning, Online Courses, Web-Based Education, Online Training, Technology-Based Training (TBT), and e-Learning. This researcher recognizes that there may be differences among the types of e-learning mentioned in the previous list but has chosen, for the purpose of this study, the term web-based instruction (WBI) to refer to learning using the Internet and its resources, or any online intervention designed to include interaction, feedback or use of the World Wide Web for delivery of the intervention to facilitate learning. Khan (1997) as well as Relan and Gillami (1997) explained that using the resources of the World Wide Web to facilitate learning defines web-based instruction. Relan and Gillami (1997, p. 43) defined WBI as the application of cognitively oriented instructional strategies within a constructivist and collaborative learning environment, utilizing the attributes and resources of the World Wide Web. In November 1999, a 16-member Web-based Education Commission was created by President Clinton, the Democratic and Republican Congress leaders and then


Education Secretary, Richard Riley. The commission, which dissolved in March 2001, studied the impact and the promise of the Internet on education and made recommendations for policy reforms for pre-K, K-12, post-secondary and corporate training institutions (The Web-based Education Commission, 2000). Among the many recommendations, the ones of particular interest were: (a) build a new research framework of how people learn in the Internet age, and (b) develop high quality online educational content that meets the highest standards of educational excellence (The Web-based Education Commission, 2000, p. 12). When the 2001 National Association of State Boards of Education (NASBE) released a study group report on e-learning, it was apparent that there was top-level leadership support for web-based instruction (NASBE, 2001). NASBE officials recognized the effect of the Internet on the field of education. In fact, one of the report's conclusions was that e-learning will improve American education in valuable ways (NASBE, 2001, p. 4). Also, the group recognized that technology per se was not a panacea to cure all learning problems but could be the answer to some of the educational challenges (NASBE, 2001). Fast forward to April 2007, when a $10 million grant was awarded to the Department of Education (DOE) to study the use of various educational software programs in schools (eSchool News & wire service reports, 2007). Review of the study was pertinent here in the sense that it examined a different approach to teaching, not the traditional instructor-led classroom but rather the use of technology in the classroom, and it set out to examine the effectiveness of 15 classroom software programs in four categories: early reading (first grade); reading comprehension (fourth grade); pre-algebra;


and algebra (eSchool News & wire service reports, 2007, p. 1). For the study, researchers used 132 schools and surveyed approximately 10,000 students in 439 classrooms (eSchool News & wire service reports, 2007). When the researchers compared achievement scores between groups that used the educational software and those that did not, the results yielded no statistical differences. Although the results were disappointing to education technology experts, they were not surprising, since the experts believed that implementation was problematic in the study (eSchool News & wire service reports, 2007). The experts listed three major reasons why the study failed in some sense: (a) participating teachers did not receive the necessary coaching or support, (b) strong leadership for the project was absent, and (c) student usage of the software accounted for an average of only 10% or 11% of the total instructional time among all four experimental groups (eSchool News & wire service reports, 2007). Also in this article, Mary Ann Wolf, executive director of the State Educational Technology Directors Association (SETDA), stated the study lacked several key ingredients, such as strong leadership, which other researchers clearly agree were needed to successfully implement such a program. According to Wolf, in a successful federal evaluation grant study on the use of technology and its effect on learning (the IMPACT model), the teachers and students were provided required support including hardware, software, connectivity, personnel, and professional development (eSchool News & wire service reports, 2007). The results revealed that students in the IMPACT model schools, who originally had poorer test scores than their peers in reading and math, not only caught up


but surpassed their peers in the first year of the study and maintained this lead in the second. Wolf referred to another successful technology program, eMINTS, that was integrated into several schools in Utah, Missouri and Maine. A study on the eMINTS program revealed that when students were involved in technology-based curricula, their test scores were 10% to 20% higher when compared to those of students in the control groups (eSchool News & wire service reports, 2007). Despite the criticisms made about the 2007 DOE study, Phoebe Cottingham, the commissioner of education evaluation and regional assistance for the Institute of Education Sciences, and Mark Dynarski, the lead researcher, defended their methods by stating that the study was flawless. They were not discouraged by the results (eSchool News & wire service reports, 2007, p. 26) and stated that no one should make premature conclusions based on the results. They believe that more research is required and plan to do a second round. As noted from the previous discourse, how various technology-based interventions are incorporated into the curriculum, whether web-based or not, and their effect on learning is still being researched by education researchers.
Web-Based Instruction for Institutes of Higher Education (IHEs) and Quality Concerns
Researchers funded by the National Center for Education Statistics (NCES), using the Postsecondary Education Quick Information System (PEQIS), conducted a 1-year study of 4,130 IHEs (see Table 1) to establish national estimates on distance education at 2-year and 4-year Title IV eligible, degree-granting institutions (Waits & Lewis, 2003). Waits and Lewis (2003) stated in their report that between the academic year of 2000 and 2001 public universities were more likely to offer distance learning courses. According to


the researchers, 90% of public 2-year and 89% of public 4-year institutions offered distance education courses (Waits & Lewis, 2003). In comparison, 16% of 2-year and 40% of 4-year private institutions offered distance education courses (Waits & Lewis, 2003).
Table 1
Number and percentage distribution of 2-year and 4-year Title IV degree-granting institutions, by distance education program status and institutional type and size: 2000-2001
All institutions: 4,130 total; offered distance education in 2000-2001: 2,320 (56%); planned to offer in the next 3 years: 510 (12%); did not offer and did not plan to offer: 1,290 (31%)
Institutional type
Public 2-year: 1,070 total; offered: 960 (90%); planned to offer: 50 (5%); did not offer or plan to offer: 50 (5%)
Private 2-year: 640 total; offered: 100 (16%); planned to offer: 150 (23%); did not offer or plan to offer: 400 (62%)
Public 4-year: 620 total; offered: 550 (89%); planned to offer: 20 (3%); did not offer or plan to offer: 50 (8%)
Private 4-year: 1,800 total; offered: 710 (40%); planned to offer: 290 (16%); did not offer or plan to offer: 790 (44%)
Size of institution
…: 2,840 total; offered: 1,160 (41%); planned to offer: 460 (16%); did not offer or plan to offer: 1,220 (43%)
…: 870 total; offered: 770 (88%); planned to offer: 50 (5%); did not offer or plan to offer: 60 (7%)
…: 420 total; offered: 400 (95%); planned to offer: 10 (2%); did not offer or plan to offer: 10 (2%)
Note. The percentages are based on the estimated 4,130 2-year and 4-year Title IV eligible, degree-granting institutions in the nation. Detail may not sum to totals because of rounding. From "Distance Education at Higher Education Institutions: 2000-2001," U.S. Department of Education, 2002, National Center for Education Statistics, Postsecondary Education Quick Information System.
Popularity of distance learning was evidenced by having 48% of 4-year public institutions and 33% of private institutions design programs that could be completed totally online (Waits & Lewis, 2003). In regards to certificate programs completed totally through distance education, 2-year public and 4-year private institutions offered


15% and 14%, respectively, in comparison to 25% offered by 4-year public institutions. These statistics reveal a growing movement among IHEs to rapidly develop online accessibility to their programs. IHEs mainly used the Internet and two-way video technologies to deliver their courses online (Waits & Lewis, 2003). Among the IHEs researched, 90% delivered asynchronous computer-based instruction via the Internet (Waits & Lewis, 2003). Asynchronous means that the courses are available to learners anytime and anyplace. Other technologies employed by IHEs to deliver online courses were two-way video with two-way audio (43%) and CD-ROMs (29%) (Waits & Lewis, 2003). Interestingly enough, 88% of IHEs planned to create more online courses within the next 3 years to be delivered asynchronously through the Internet (see Table 2) (Waits & Lewis, 2003). Even considering the rate of expansion and the types of technologies employed by IHEs to create a distance learning curriculum, there were factors that prevented some IHEs from developing one. Some of these factors stated by Waits and Lewis (2003) included: inability to obtain state authorization (86 percent), lack of support from institution administrators (65 percent), restrictive federal, state, or local policies (65 percent), …, lack of access to library or other resources for instructional support (58 percent), inter-institutional issues (57 percent), legal concerns (57 percent), and lack of perceived need (55 percent) (p. 16). The list quoted above is a partial list; Table 3 displays a complete list. The prohibiting factor with the utmost relevance to the present study is the concern IHEs expressed about


course quality. Of all the IHE respondents in the Waits and Lewis (2003) report, 35% of IHEs who did not intend to create a distance learning curriculum listed course quality as a prohibitive factor. In contrast, 14% of IHEs who wanted to conduct a major expansion of their existing online curriculum were concerned about quality but did not find it prohibitive. Similarly, IHEs who wanted a minor or moderate expansion, 29% and 23% respectively, were concerned about course quality but continued with their plans. Overall, it was program development costs that were considered more prohibiting than concerns about quality. However, researchers Mariasingam and Hanna (2006) believe that quality assessment of online courses is in its infancy, and certainly more research of this nature is needed. Mariasingam and Hanna (2006) emphasize that "the most important of all is the need to establish a systematic process for developing and …" (Online Journal of Distance Learning Administration (OJDLA), para. 25). Furthermore, Mariasingam and Hanna (2006) observe that quality, as is well known, lies in the eye of the beholder (OJDLA, para. 8). Although it may be difficult to conceptualize the meaning of quality in regards to online learning, some researchers have tried to do just that. In a news release in the year 2000, the National Education Association (NEA) listed twenty-four measures divided into seven categories of quality in Internet-based distance learning (see Table 4) (NEA, 2000). Although the list can be used as a basic guideline for educators to develop an online program, it lacks specificity and details in regards to actual elements used to assess quality of a web-based module.


Table 2
Percentage distribution of 2-year and 4-year Title IV degree-granting institutions that offered distance education courses in 2000-2001 or planned to offer distance education in the next 3 years, by the planned level of distance education course offerings over the next 3 years, and by the planned primary technology for instructional delivery: 2002
For each primary technology, the four values give the percentage of institutions planning to reduce the number of such courses, keep the same number, start or increase the number, or with no plans to use the technology.
Two-way video with two-way audio (two-way interactive video): 4, 13, 40, 43
One-way video with two-way …: 2, 4, 12, 82
One-way …: 1, 4, 11, 84
One-way …: 6, 15, 23, 56
Two-way …: 1, 4, 9, 86
One-way …: 1, 5, 13, 81
Internet courses using synchronous computer-based instruction: 1, 4, 62, 33
Internet courses using asynchronous computer-based instruction: 1, 6, 88, 6
CD-ROM: 1, 8, 39, 53
Multi-…: 2, 31, 67, #
…: #, 5, 94, #
# Rounds to zero. … Reporting standards not met.
Note. This question was asked in the present tense rather than referring to 2000-2001, and thus the estimates reflect the responses of the institutions at the time the data were collected in spring 2002. Percentages are based on the estimated 2,500 institutions that either offered distance education courses in 2000-2001 (2,320 institutions) or planned to offer distance education courses in the next 3 years and could report their technology plans (490). Details may not sum to totals because of rounding. From "Distance Education at Higher Education Institutions: 2000-2001," U.S. Department of Education, 2002, National Center for Education Statistics, Postsecondary Education Quick Information System.


Table 3
Percentage distribution of 2-year and 4-year Title IV degree-granting institutions by the extent to which various factors are preventing the institution from starting or expanding distance education course offerings: 2002
Note. This question was asked in the present tense rather than referring to 2000-2001, and thus the estimates reflect the responses of the institutions at the time the data were collected in spring 2002. Percents are based on the estimated 4,130 2-year and 4-year Title IV eligible, degree-granting institutions in the nation. Detail may not sum to totals because of rounding. From "Distance Education at Higher Education Institutions: 2000-2001," by the U.S. Department of Education, 2002, National Center for Education Statistics, Postsecondary Education Quick Information System.


Table 4
Measures of quality in Internet-based distance learning
Institutional Support Benchmarks
A documented technology plan that includes electronic security measures to ensure both quality standards and the integrity and validity of information.
The reliability of the technology delivery system is as failsafe as possible.
A centralized system provides support for building and maintaining the distance education infrastructure.
Course Development Benchmarks
Guidelines regarding minimum standards are used for course development, design, and delivery, while learning outcomes, not the availability of existing technology, determine the technology being used to deliver course content.
Instructional materials are reviewed periodically to ensure they meet program standards.
Courses are designed to require students to engage themselves in analysis, synthesis, and evaluation as part of their course and program requirements.
Teaching/Learning Benchmarks
Student interaction with faculty and other students is an essential characteristic and is facilitated through a variety of ways, including voice mail and/or e-mail.
Feedback to student assignments and questions is constructive and provided in a timely manner.
Students are instructed in the proper methods of effective research, including assessment of the validity of resources.
Course Structure Benchmarks
Before starting an online program, students are advised about the program to determine if they possess the self-motivation and commitment to learn at a distance and if they have access to the minimal technology required by the course design.
Students are provided with supplemental course information that outlines course objectives, concepts, and ideas, and learning outcomes for each course are summarized in a clearly written, straightforward statement.
Students have access to sufficient library resources that may include a "virtual library" accessible through the World Wide Web.
Faculty and students agree upon expectations regarding times for student assignment completion and faculty response.
Student Support Benchmarks
Students receive information about programs, including admission requirements, tuition and fees, books and supplies, technical and proctoring requirements, and student support services.
Students are provided with hands-on training and information to aid them in securing material through electronic databases, inter-library loans, government archives, news services, and other sources.


Table 4 (continued)
Throughout the duration of the course/program, students have access to technical assistance, including detailed instructions regarding the electronic media used, practice sessions prior to the beginning of the course, and convenient access to technical support staff.
Questions directed to student service personnel are answered accurately and quickly, with a structured system in place to address student complaints.
Faculty Support Benchmarks
Technical assistance in course development is available to faculty, who are encouraged to use it.
Faculty members are assisted in the transition from classroom teaching to online instruction and are assessed during the process.
Instructor training and assistance, including peer mentoring, continues through the progression of the online course.
Faculty members are provided with written resources to deal with issues arising from student use of electronically accessed data.
Evaluation and Assessment Benchmarks
The program's educational effectiveness and teaching/learning process is assessed through an evaluation process that uses several methods and applies specific standards.
Data on enrollment, costs, and successful/innovative uses of technology are used to evaluate program effectiveness.
Intended learning outcomes are reviewed regularly to ensure clarity, utility, and appropriateness.
Note. From "Study Finds 24 Measures of Quality in Internet-Based Distance Learning," a study released at the Blackboard Summit by the National Education Association (NEA), 2000.
The Instructional Systems Design (ISD) Process
Review of the literature on systems design led to an understanding that considerable confusion exists among the definitions and usage of the terms Instructional Design (ID), Instructional Systems Design (ISD) and Instructional Design Theories (IDTs) within the field of instructional technology. Although the present research is focused on instructional systems design (ISD) and its systematic process, it is important to acknowledge that differing interpretations of these terms are present within the field. Perhaps interpretation of the terms ID, ISD and IDTs is dependent on perspective, usage and the researcher. This observation is highlighted by the discussion between Bichelmeyer and Reigeluth. In her article, Bichelmeyer attempted to show


what she claims is the misuse of the terms used for instructional theory and IDT. It should be noted first that Bichelmeyer clearly stated that she is basing her definitions on the perspective of practice rather than an academic perspective. She began her argument by pointing out the difference between instruction and instructional design: the two are distinct but inter-related, because they impact and inform each other (see Figure 4). To clarify, Bichelmeyer acknowledged that both instruction and instructional design were derived from learning theories. However, they differed in relation to context, objectives, activities and concerns. For example, an instructional designer is focused on conducting analysis, designing and developing instruction, addressing issues with implementation, and conducting formative and summative evaluations (see Table 5 for a summary of differences). Bichelmeyer characterized instructional design theory as being concerned with the value of instructional design models, exploring issues such as the efficiency and effectiveness of ADDIE and rapid prototyping models (Gordon & Zemke, 2000, as cited in Bichelmeyer, 2003, IDT Record, para. 12). She pointed out that this confusion with term usage is represented in print by the books Instructional Design Theories and Models: An Overview of their Current Status, edited by Charles Reigeluth (1983), and Instructional Design Theories and Models, Volume II, edited by Reigeluth (1999). Bichelmeyer referred to these books as examples; Reigeluth is a well-respected scholar in instructional technology and a colleague of Bichelmeyer.


Figure 4. Relationship and differences between instruction and instructional design. From "Instructional theory and instructional design theory: What's the difference and why should we care?" by B. Bichelmeyer, 2003, IDT Record.
The scholars contributing to these volumes referred to instruction theory rather than instructional design theory. Examples of the chapters cited (as cited in Bichelmeyer, 2003) include a chapter on the Algo-Heuristic Theory and contributions by David N. Perkins and Chris Unger and by Hannafin, Land and Oliver. Reigeluth responded, explaining his interpretation of the terms. Reigeluth agreed that there is confusion surrounding the usage of these terms among professionals in our field (IDT Record, para. 3).


Table 5
Summary of differences between instruction and instructional design
Objectives
Instruction: Ensure learning to the best of each student's abilities, given variation between and individual differences of learners.
Instructional Design: Facilitate standardization of instruction by accounting for variation between instructors, locations and schedules.
Activities
Instruction: Set expectations; present examples; provide resources; facilitate practice; administer assessments; give feedback.
Instructional Design: Task analysis; context analysis; learner analysis; instructor analysis; identify design constraints; materials development; evaluation.
Prototypical Theories
Instruction: Gagne's Nine Events of Instruction (Gagne, Briggs & Wager, 1992); Merrill's 5 Star Instruction (Merrill, 2003).
Instructional Design: Instructional Systems Design model (Briggs, 1977); Rapid Prototyping model (Tripp & Bichelmeyer, 1990).
Concerns
Instruction: Sufficiency of instructional approaches; variation between learners.
Instructional Design: Efficiency of design process; efficiency of instructional products; standardization of instructional delivery.
Note. From "Instructional theory and instructional design theory: What's the difference and why should we care?" by B. Bichelmeyer, 2003, IDT Record.
First, he tackled the term that most in the field agree upon, the ISD process, which involves analysis, design, development, implementation, and evaluation (ADDIE). Reigeluth suggested that confusion arises when certain terms are used to address the whole ISD process; these terms have only part of a named ISD process in their titles. Reigeluth offered suggestions for avoiding this confusion. He also predicted further confusion, like Bichelmeyer, among


researchers and practitioners in the field if they only study our field through a single lens (IDT Record, para. 7). Reigeluth (2003) agreed with Bichelmeyer (2003) that there is a need for: (1) a knowledge base (aka design theory) about what instruction should be like, and (2) one about what the process for creating instruction should be like, but we also need (3) a knowledge base about how to evaluate existing instruction (independent of the ISD process) and perhaps (4) one about how to manage instruction (unless you view that as part of #1). These are all different but highly interrelated knowledge bases (IDT Record, para. 7). The discussion above highlights, again, the disparate views of various researchers and practitioners in the field of instructional technology. A consensus has not been reached about the meanings of the terms; their interpretation is somewhat dependent on the researcher or practitioner (or both) and the context in which they are being used. ISD is defined in the next section as it relates most to the present study.
A Systematic Approach for Web-Based Instruction
A system can be understood as a set of interrelated parts working together toward a common goal (Banathy, 1987, as cited in Gustafson & Branch, 2002). A system may occur naturally or it may also be constructed (Dick et al., 2005). Prominent ISD researchers (Dick et al., 2005; Dick & Carey, 1996) strongly believed that successful technology-based instruction begins with a systematic approach. ISD, as defined by Seels and Glasgow (1998), is the incorporation of processes to develop instructional materials that facilitate learning with measurable outcomes. Dick and Carey (1996) listed a number of compelling reasons why a systematic approach to instructional design has been effective. According to Dick and


Carey (1996), a systematic approach to design provided focus and helped to determine instructional strategy for a desired outcome. However, the most salient reason for using a systematic approach is that it is an empirical and replicable process (p. 8). Likewise, Seels and Glasgow (1998, p. 7) also believed that a systematic approach provides a means to determine whether learning objectives have been met, and it may also provide a means to improve the instruction through evaluation and revision until the learning objectives have been achieved. In order to reach a deeper understanding of this topic, a historical perspective on the ISD methodology is needed. By the early 1970s, the ISD approach had grown from a standard training approach within the military to becoming the standard among corporations (Gustafson & Branch, 2002). Furthermore, the systematic approach helped the field of instructional design mature (Gustafson & Branch, 2002, p. 19). In the 1980s, the growing accessibility of computers increased their usefulness in instructional development. Simultaneously, the performance technology movement was emerging (p. 43). Some of the characteristics of this movement included front-end analysis, on-the-job performance, business results, and non-instructional solutions (p. 43). These characteristics altogether had a major impact on the practices of instructional design by the 1990s. Some researchers (Dick et al., 2005; Dick & Carey, 1996; Dempsey & Van Eck, 2002; Seels & Glasgow, 1998; Gagné, Briggs & Wager, 1988; Reiser, 2002; Rothwell &


Kazanas, 2004) believed that a systematic approach to instructional design produced several advantages for instructional development. First, a systematic approach helped instructional designers plan and develop their instruction through an analytical approach (Seels & Glasgow, 1998; Gustafson & Branch, 2002). In other words, the systematic approach was goal oriented (Gustafson & Branch, 2002). All inputs, interactions and outputs of the process were analyzed and synthesized. Second, the planning that occurred during the analysis of instruction integrated evaluation and revision (Seels & Glasgow, 1998, p. 18) into the process and assured quality. Third, Seels and Glasgow (1998) have stated that ISD has been a problem-solving approach during which cause-and-effect relationships can be identified, rather than relying on intuition or trial-and-error planning (p. 18). Fourth, the systematic approach created documentation and established an audit trail so that reliable examination and evaluation can occur (Seels & Glasgow, 1998). The fifth advantage of a systematic approach has been the careful analysis and storage of the information in a database. A database that shows the characteristics of the instructional problem, the demographics, the learning habits of the targeted audience and the learning objectives is a knowledge base that is very useful to instructional designers (Seels & Glasgow, 1998). Learning outcomes are important in ISD (Seels & Glasgow, 1998). When instruction has been developed through the use of the knowledge base, the performance standards set in a systematic approach have been assured, because the learning goals have cycled through several iterations of testing and revisions before any implementation has occurred (Seels & Glasgow, 1998). Therefore learners may rest


assured that the quality of learning is high. Creating learner-centered instruction is a major goal of the systematic process (Gustafson & Branch, 2002). There are several factors that introduce variability in instruction, and this variability affects the quality of instruction. For example, some factors affecting quality in instructor-led classes could be the environment, the instructor, the learning material, and the students themselves (Seels & Glasgow, 1998). However, a systematic approach may reduce the incidence of variability because it delivers the instruction the same way every time (Seels & Glasgow, 1998, p. 18). Furthermore, it may also help the instructional designer create instruction that addresses individual learning needs (Seels & Glasgow, 1998). Another advantage of the systematic approach has been its ability to augment replicability (Seels & Glasgow, 1998). To clarify, a web-based course, in comparison to an instructor-led course, is not constricted by a physical classroom; therefore it can be designed to be more accessible to more learners. Replicability can also affect cost (Seels & Glasgow, 1998). A larger number of learners can be served when a course has been replicated; therefore the cost per student is reduced considerably (Seels & Glasgow, 1998). Clearly, there are many justifiable reasons for using a systematic approach to create web-based instruction. The next section will describe a systematic approach for instructional development: ADDIE.
The ISD Process: ADDIE
ISD is articulated in theory and practice through the use of models. A generic process known as ADDIE is synonymous with the ISD process. Presently there are discussions as to whether ADDIE is a model or a conceptual framework (Molenda, 2003;


Bichelmeyer, 2005). In the present study, ADDIE has been defined as a conceptual framework of the ISD process. ADDIE is comprised of the analysis, design, development, implementation, and evaluation phases, and it provides a conceptual framework for the ISD process (Reiser, 2002; Bichelmeyer, 2005; Magliaro & Shambaugh, 2006). ADDIE has been described as generic because other ISD models include the phases of ADDIE to some extent (Scafati, 1998; Reiser, 2002). For example, all five phases of ADDIE have been included in the Dick and Carey ISD model (Dick & Carey, 1990, 1996; Dick et al., 2005). Figure 5 shows a comparison between the ADDIE process and the Dick and Carey model (McGriff, 2001). In this figure the phases of ADDIE are clearly represented in the Dick and Carey model (1990, 1996; Dick et al., 2005). A further comparison can be seen in Table 6 between the Seels and Glasgow (S&G) model and ADDIE. The five phases of ADDIE, as defined by Seels and Glasgow (1998), are:
1. Analysis: Collecting and analyzing data to determine needs, tasks and content, and instructional requirements; the process of defining what is to be learned (p. 327).
2. Design: The process of specifying how learning will occur (p. 329).
3. Development: The process of authoring and producing the materials (p. 329).
4. Implementation: The process of installing the process in the real world (p. 330).
5. Evaluation: The process of determining the adequacy of instruction and learning (p. 330).
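To make the phase-by-phase logic of ADDIE more concrete, a brief illustrative sketch follows. It is a hypothetical example added for clarity, not part of the study or of Seels and Glasgow's (1998) text: the phase names follow the definitions above, the guiding questions paraphrase those summarized later in Table 6, and the Python names (Phase, ADDIE_PHASES, walk_phases) are invented purely for illustration.

# Hypothetical sketch only: the ADDIE phases paired with the guiding questions
# each phase answers (questions paraphrased from Seels & Glasgow, 1998, and
# Table 6). All identifiers are illustrative, not part of the study.
from dataclasses import dataclass
from typing import List


@dataclass
class Phase:
    name: str
    guiding_questions: List[str]


ADDIE_PHASES = [
    Phase("Analysis", ["What is the problem/need?",
                       "What should the content be?"]),
    Phase("Design", ["What should be assessed and how?",
                     "How should instruction be organized?",
                     "What will the instruction look and sound like?"]),
    Phase("Development", ["What should be produced?",
                          "What revisions are needed?"]),
    Phase("Implementation", ["What preparation is needed?"]),
    Phase("Evaluation", ["Are the objectives achieved?",
                         "Has the innovation been disseminated and adopted?"]),
]


def walk_phases(phases: List[Phase]) -> None:
    """Print each phase with its guiding questions.

    In practice the walk is not strictly linear: formative evaluation during
    Development can send the designer back to Design for another
    design-evaluate-refine cycle before Implementation.
    """
    for phase in phases:
        print(phase.name)
        for question in phase.guiding_questions:
            print(f"  - {question}")


if __name__ == "__main__":
    walk_phases(ADDIE_PHASES)

The strictly linear walk shown here is only a convenience for illustration; as discussed below, supporters of ADDIE caution that, although the process is drawn in a linear fashion, it is not applied that way when developing instruction.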


Figure 5. Instructional systems design models. From "ISD knowledge base / Instructional design & development: Instructional systems design models," by S. J. McGriff, 2001, Portfolio of Steven J. McGriff (modified).
There are several benefits of using a process to guide development. In general, ADDIE, or any ISD process, gives developers a structured means of creating instruction (Gustafson & Branch, 2002, p. 19). In addition, ADDIE has been utilized to establish guidelines and to manage the development process (Gustafson & Branch, 2002). Moreover, ADDIE does facilitate an important aspect of successful development: communication between the client and the developers (Gustafson & Branch, 2002).


Instructional designers use ADDIE not only for the benefits mentioned previously but also because it answers specific questions (see Table 6). Note in Table 6 the list of questions that are addressed in each phase of ADDIE. ADDIE is methodical, which is a characteristic of a systematic approach. The questions that are addressed in ADDIE help to provide clarity to instructional designers (Gustafson & Branch, 2002). The approach is a deductive one, which means the instructional designer moves from having a general idea of the need to having specific details of the need. The point of view of the learner in instructional development has also been included in this approach (Kemp, Morrison & Ross, 1994). Even though there are several benefits to utilizing ADDIE, there are still researchers and practitioners in the field of instructional design that find ADDIE restrictive in its approach. Some challengers of ADDIE (Allen, 2006; Gordon & Zemke, 2000; Zemke & Rossett, 2002) believed that ADDIE produces ineffective instruction. Moreover, they think that the instruction produced from this process is awkward (Gordon & Zemke, 2000). Also, it tends to direct the attention of the instructional designer to the process rather than to the outcome (Gordon & Zemke, 2000). Additionally, Allen (2006) believed that ADDIE is inadequate because it does not take into consideration the technological advances in the tools used for instructional development. Allen (2006) stated that technologically advanced tools, which create powerful visual effects such as 3-D graphics, simulations and interactions for a more engaging learning experience, cannot utilize ADDIE simultaneously to guide development. Another researcher, Notess (2004), made an interesting commentary as he reviewed Zemke and Rossett's (2002) discussion of the responses they had received to Gordon and Zemke's (2000) first article


criticizing ADDIE. Notess (2004) found that Zemke and Rossett (2002) divided the responses into two categories: responses that concurred with Gordon and Zemke (2000), that is, those that believed ADDIE was predisposed to producing faulty instruction, and responses that defended the process.
Table 6
ADDIE phases: Questions answered
Analysis
1. Needs Analysis: What is the problem/need? What are the parameters of the problem/need?
2. Task and Instructional Analysis: What should the content be?
Design
3. Objectives and Assessment: What should be assessed and how?
4. Instructional Strategy: How should instruction be organized?
5. Delivery System Selection and Prototyping: What will the instruction look and sound like?
Development
6. Materials Development: What should be produced?
7. Formative Evaluation: What revisions are needed?
Implementation
8. Implementation and Maintenance: What preparation is needed?
Evaluation
9. Summative Evaluation: Are the objectives achieved?
10. Diffusion and Dissemination: Has the innovation been disseminated and adopted?
Note. From Making instructional design decisions (2nd ed.), (p. 180), by B. Seels and Z. Glasgow, 1998, Upper Saddle River, New Jersey: Prentice Hall, Inc.
Other researchers (e.g., Gustafson & Branch, 2002; Dick et al., 2005), who believed that ADDIE does create quality instruction, advised that although ADDIE is drawn in a linear fashion (see Figure 5), it should not be applied this way when developing instruction. As data are collected during the lifespan of a project, developers gain insight that can prompt them to revisit and refine earlier phases; this responsiveness has been the apparent strength of the ISD process (Gustafson & Branch, 2002, p. 19).


Learning

The discussion in this section is written in an attempt to help the reader understand the term learning within the context of the present study. Understanding how learners learn can help instructional designers design an effective multimedia learning module (Mayer, 2003). In this section, contributions to foundational theories of learning by John B. Carroll and Robert M. Gagné will be discussed. In addition, how these foundational theories add value to instructional technology will be addressed. Learning is a complex process, and how we learn or how we acquire knowledge is a question philosophers, educators, psychologists and learners themselves have pondered at one time or another. Learning as described by Driscoll (1994, p. 8) rests on two criteria: first, there must be a persisting change in the learner's performance or performance potential, and second, the change in the learner's performance must be dependent on their interaction with the environment. Gagné (1977) described learning as a change in human disposition or capability, which persists over a period of time, and which is not simply ascribable to processes of growth (p. 3). Although there has been an abundance of inquiry on the topic of learning throughout the decades, it still continues to be studied and defined by researchers presently. Over the years, researchers have derived many learning theories based on various epistemological traditions, such as objectivism, pragmatism and interpretivism (Driscoll, 1994, p. 15). These epistemologies have formed the foundation of many learning theories; for example,


according to Driscoll (1994), objectivism is associated with behaviorism, cognitive information processing and Gagné's theory, while other epistemological traditions underlie educational semiotics, Jerome Bruner's work, developmental theory and constructivism. In regards to learning, the research work of Gagné (explained later) as well as John B. Carroll's Model of School Learning is applicable in the present study. Carroll (1963, 1973, 1981, 1989) concluded that learners needed time to understand concepts and that instructors also needed to recognize this as a factor affecting learning. He predicted the amount of learning as a ratio of the time actually spent to the amount of time needed (Gentile, 1997):

Amount of Learning = Time Actually Spent / Time Needed

However, Carroll (1989) attributed the above interpretation to Benjamin Bloom (1968, as cited in Carroll, 1989). Carroll believed that his model represents a broader view of interpreting learning in schools.
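To make this ratio concrete, consider a brief hypothetical illustration; the numbers are invented for exposition and are not drawn from Carroll's or Bloom's data. A learner who needs 40 minutes to master a task but actually spends only 30 minutes on it would, under this model, attain roughly three-quarters of the possible learning:

\[ \text{Amount of Learning} = \frac{\text{Time Actually Spent}}{\text{Time Needed}} = \frac{30\ \text{minutes}}{40\ \text{minutes}} = 0.75 \]

Read this way, the ratio simply expresses the idea that learning falls short of its potential whenever the time a learner is willing or able to spend is less than the time that learner needs.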


In a 1989 article, Carroll discussed a somewhat modified view of his theory, the Model of School Learning, which he first presented two decades earlier. Carroll (1989) listed five variables that affect learning. Three were associated with the factor of time (Carroll, 1989; Gentile, 1997): (a) aptitude, which is the amount of time the learner needs to learn a particular task to reach a pre-defined level of mastery. When a learner has a high aptitude, it means the learner needs less time to learn, while a learner with a low aptitude needs more time to learn (more than average); (b) opportunity to learn, which is the time the learner is allowed to learn, for example, a school schedule. Learners usually view this time as less than adequate; and (c) perseverance, which is how much time the learner is willing to spend learning a task. This factor, Carroll (1989, p. 26) believed, reflects the learner's motivation. The next two factors discussed relate to achievement. They were (Carroll, 1989): (d) quality of instruction, which relates to how well directions and explanations are given to the learners. How clearly the learners understand what they need to do is an indication of quality of instruction. If a learner requires more time to learn, it could indicate that quality of instruction is less than ideal; and (e) ability to learn, which refers to the learner's comprehension of the instructions. Sometimes language barriers or the inability to understand the instructions can affect a learner's ability to learn. Together, these variables describe the conditions under which learning in schools occurs. Instructors as well as instructional designers can use these insights to create instruction that is effective and relevant to their learners. Although there are a large number of learning theories in existence, it is the work of Robert Gagné on the conditions of learning that adds value to this discourse as well. Driscoll (1994) states that Gagné, like Benjamin Bloom, another influential educational psychologist, understood the concept that humans have various capabilities which require different conditions of learning.


In the field of instructional technology, which some say is still grappling with its identity within academia, there is a struggle to find foundational research that defines the discipline. Instructional designers and researchers of early online courses used existing theory such as Gagné's conditions of learning, and its use adds theoretical substance and value to their body of research in instructional design. It also offers a practical guideline to instructional designers. These are some of the reasons Gagné's work is included here. Gagné's (Gagné, 1965; Gagné, 1977; Gagné, 1984; Gagné et al., 1988) theory, the conditions of learning, provides several categories of learning that give insight into the various capabilities of humans and the conditions under which those capabilities are acquired (Gredler, 2001, p. 133). Gagné described the capabilities of a human as their skills, knowledge, attitudes and values, and it is by learning that they acquire these capabilities (Gredler, 2001). Gagné described five conditions of learning that lead to the attainment of these capabilities; they are: intellectual skills, cognitive strategy, verbal information, motor skill and attitude (Gagné et al., 1988). An overview of the five conditions of learning is listed in Table 7. In Table 7, Gredler (2001) describes intellectual skills as how the learner can make decisions using symbols and interacting with their environment. She explains how symbols are used to represent objects and events in the environment (p. 136). Gagné et al. (1988), Gredler (2001) and Driscoll (1994) also identified a capability by which learners manage their own attending, learning and thinking (Gredler, 2001, p. 138).


Gagné (1988) termed this capability cognitive strategies. Sometimes the term metacognition, which will be discussed in the next section, is used to refer to these types of strategies. However, Gredler (2001) disagrees with equating the two because she believes metacognition refers to the learner's knowledge about his or her own thinking rather than a description of skills as Gagné intended. Next, verbal information is a capability where facts, labels and large bodies of knowledge can be learned and meaningful connections made (Gredler, 2001). An example of verbal information is stating the provisions of the First Amendment to the United States Constitution (Gagné, 1977).

Table 7
Overview of the five conditions of learning

Verbal Information. Capability: retrieval of stored information (facts, labels, discourse). Performance: stating or communicating the information in some way. Example: paraphrasing a definition of patriotism.
Intellectual Skills. Capability: mental operations that permit individuals to respond to conceptualizations of the environment. Performance: interacting with the environment using symbols. Example: discriminating between red and blue; calculating the area of a triangle.
Cognitive Strategy. Capability: executive control processes that govern the learner's thinking and learning. Performance: efficiently managing one's thinking and learning. Example: developing a set of note cards for writing a term paper.
Motor Skill. Capability: performing a sequence of physical movements. Performance: demonstrating a physical sequence or action. Example: tying a shoelace; demonstrating the butterfly stroke.
Attitude. Capability: predisposition for positive or negative actions toward persons, objects, and events. Performance: choosing personal actions toward or away from objects, events or people. Example: electing to visit art museums; avoiding rock concerts.

Note. From Learning and Instruction: Theory into Practice (4th ed., p. 135), by M. E. Gredler, 2001, Upper Saddle River, New Jersey: Prentice Hall.

Motor skills are demonstrated when an individual can perform movements or physical actions that are organized, precise and performed smoothly (Gagné, 1977; Gredler, 2001).


Even well-practiced motor skills can improve over time with practice (Gagné, 1977, p. 43). Finally, attitude pertains to the personal choices an individual makes (Gagné, 1977). An individual's behavior can be affected by their attitude (Gagné, 1977; Gredler, 2001). Attitude is largely an internal state and is comprised of three characteristics: first is the cognitive characteristic, the ideas an individual may have; second is the affective characteristic, that is, decisions are made based on emotions and feelings; and third, a behavioral characteristic that refers to the individual's readiness to act on those ideas and feelings (Gagné et al., 1988, p. 43). According to Gagné (1984):

any set of categories that purports to describe human learning should meet at least four major criteria:
1. Each category should represent a formal and unique class of human performance that occurs through learning.
2. Each category should apply to a widely diverse set of human activities and be independent of intelligence, age, race, socioeconomic status, classroom, grade level, and so on.
3. Each category should require different instructional treatments, prerequisites, and processing requirements by the learner.
4. Factors identified as affecting the learning of each category should generalize to tasks within the category but not across categories (with the exception of reinforcement). (Gagné, 1984, p. 2, as cited in Gredler, 2001, p. 133)

The five capabilities or outcomes of learning meet all four of the aforementioned criteria (Gredler, 2001).


Gagné's (Gagné et al., 1988) five conditions of learning can lead to simplification of instructional planning if learning objectives are assigned to each of the five human capabilities. Gagné (1977) went further to distinguish the conditions of learning by identifying their internal or external qualities. Gagné (1977) explains that for learning to occur, or to achieve any of the five learning outcomes, there are some conditions that are internal to (within) the learner and some conditions that are external to (outside) the learner. To clarify, consider increasing intellectual skill capability: for example, if children need to learn to find the difference between quantities such as 22 3/16 and 24 1/8, assuming that they do not know how to do this already, Gagné (1977) described a situation where a child may have component or subordinate skills. A child having component or subordinate skills (internal conditions), such as the ability to subtract fractions, can find the difference more readily (p. 30). Gagné believed that if these internal conditions were previously learned, then learning the new skill will not be difficult. However, if there are no component or subordinate skills to be recalled, then those skills will have to be learned. A verbal communication is an example of an external condition that is often used to help learners remember a subordinate skill. First, to continue with the example, a verbal communication might remind the learner how to subtract fractions like 3/16 from 4/16. The communication may be followed by other hints to guide the learner, as well as an opportunity for the learner to use his/her new skill. Therefore, according to Gagné (1977), when learning an intellectual skill the internal conditions consist of:


1. The previously learned skills which are components of the new skill;
2. The processes which will be used to recall them and put them together in a new form. (p. 31)

The external conditions of learning an intellectual skill include:
1. Stimulating recall of the subordinate skills;
2. Informing the learner of the performance objective;
3. Providing an occasion for the performance of the just-learned skill in connection with a new example. (Gagné, 1977, p. 31)

This leads us to one of Gagné's notable contributions, the events of instruction (Gagné, 1977, p. 51). Gagné was always concerned about the practical applications of his research, and the events of instruction offered a practical guide to an instructional designer, whether the instruction is traditional or web-based (Gagné et al., 1988, p. 180; Gagné, 1977). In Gagné et al. (1988), eight internal processes of learning (p. 81) are listed: (a) attention, which determines the extent and nature of the learner's reception of incoming stimulation; (b) selective perception, also known as pattern recognition, which is the conversion of arriving stimulation to a form that can be stored in short-term memory; (c) rehearsal, which is how the information received is maintained in short-term memory;


(d) semantic encoding, the preparation for storage in long-term memory; (e) retrieval/search, when information from stored memory moves to the working memory to help provide a response; (f) response organization, the organization of the learner's response; (g) feedback, when the learner receives information about their performance; and (h) executive control processes, also known as cognitive strategies, by which learners guide their own learning, thinking, acting, and feeling, influencing changes to all of the other internal processes (Driscoll, 1994, p. 341). Gagné et al. (1988) stated that it is possible for external events to influence these internal processes of learning. As an aside, instruction is defined by Driscoll (1994) as the deliberate arrangement of learning conditions to promote the attainment of an intended goal (p. 332). The events of instruction are presented to the learner to aid them in advancing from their present situation to where they want to be in terms of learning capability (Gagné et al., 1988). Sometimes the events are followed in sequence as a natural chain of events, but usually it takes an instructional designer or a teacher to arrange the events in a particular fashion to enhance learning (Gagné et al., 1988). Table 8 shows the connection between processes of learning, instructional events and procedures, using an English grammar concept as an example. This also offered direction to instructional designers when creating a computer-based lesson (Gagné, Wager, & Rojas, 1981, as cited in Gagné et al., 1988).


Table 8
Instructional events and their relation to processes of learning in the design of a computer-based lesson

1. Gaining attention. Relation to learning process: reception of patterns of neural impulses. Procedure: present initial operating instructions on screen, including some displays that change second by second; call attention to the screen.
2. Informing learner of the objective. Relation to learning process: activating a process of executive control. Procedure: state in simple terms what the student will have accomplished once she or he has learned. Example: present two sentences, one of which contains a word that is an object while the other does not; ask the learner to pick out the object; explain that in the first sentence ball is the object of the verb chased; then state that the learner is about to learn how to identify the object in a sentence.
3. Stimulating recall of prerequisites. Relation to learning process: retrieval of prior learning to working memory. Procedure: recall concepts previously learned. Example: any sentence has a subject and a predicate; the subject is usually a noun or a noun phrase; the predicate begins with a verb; ask for the subject of a sample sentence ending in "upset the cart."
4. Presenting the stimulus material. Relation to learning process: emphasizing features for selective perception. Procedure: present a definition of the concept. Example: an object is a noun in the predicate to which the action (of the verb) is directed, illustrated with a sentence in which roof is the object.
5. Providing learning guidance. Relation to learning process: semantic encoding; cues for retrieval. Procedure: work through sample sentences, noting, for example, that in one sentence cow is the object of the verb, while in another the verb fell is not stated to be directed at something, so that sentence has no object.
6. Eliciting the performance. Relation to learning process: activating response organization. Procedure: present three to five example sentences, one by one, and ask the learner to decide whether each sentence has an object and, if so, to type the word that is the object.
7. Providing feedback about performance correctness. Relation to learning process: establishing reinforcement. Procedure: give information about correct and incorrect responses. Example: Book is the object of the verb closed in the first sentence; the second sentence does not have an object.
8. Assessing the performance. Relation to learning process: activating retrieval; making reinforcement possible. Procedure: present a new set of concept instances and noninstances in three to five additional pairs of sentences; ask questions requiring answers; tell the learner if mastery is achieved and what to do next if it is not.
9. Enhancing retention and transfer. Relation to learning process: providing cues and strategies for retrieval. Procedure: present three to five additional concept instances, varied in form, with questions at spaced intervals.

Note. From Principles of Instructional Design (3rd ed.), by R. M. Gagné, L. J. Briggs and W. W. Wager, 1988, New York, NY: Holt, Rinehart and Winston.
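As an illustration only, the nine events in Table 8 can be read as a checklist that a designer might encode when storyboarding a web-based lesson. The following minimal sketch, written in Python, is hypothetical: the event names follow Gagné et al. (1988), but the screen identifiers and the helper function are invented for exposition and are not part of the module developed in the present study.

    # Hypothetical storyboard skeleton: each of Gagne's nine instructional
    # events is paired with the lesson screens intended to address it.
    GAGNE_EVENTS = [
        "Gaining attention",
        "Informing learner of the objective",
        "Stimulating recall of prerequisites",
        "Presenting the stimulus material",
        "Providing learning guidance",
        "Eliciting the performance",
        "Providing feedback about performance correctness",
        "Assessing the performance",
        "Enhancing retention and transfer",
    ]

    def build_storyboard(screens_per_event):
        """Pair each instructional event with its planned screens."""
        return [
            {"event": event, "screens": screens}
            for event, screens in zip(GAGNE_EVENTS, screens_per_event)
        ]

    # Illustrative screen identifiers for a lesson like the grammar example in Table 8.
    storyboard = build_storyboard([
        ["intro_animation"], ["objective_text"], ["recall_quiz"],
        ["concept_definition"], ["worked_examples"], ["practice_items"],
        ["item_feedback"], ["mastery_check"], ["spaced_review"],
    ])
    for entry in storyboard:
        print(entry["event"], "->", ", ".join(entry["screens"]))

A designer could, of course, omit or reorder events; as noted below, not all nine events need to appear, or appear in sequence, in every lesson.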


The following, from Gentile's (1997) review of Gagné's work, lists each event and the corresponding process of learning:
1. Gain the learner's attention: attention;
2. State the instructional objective: expectancy;
3. Stimulate memory of relevant information: accessing long-term memories, bringing to working memory;
4. Present the stimulus, information, or distinctive features to be learned: pattern recognition, perception;
5. Guide the learning: encoding (the process of categorizing, labeling, or finding meaning in incoming information or other stimuli, which allows the information to pass from working memory into long-term storage (Gentile, 1997, p. 601)), chunking (the process of combining separate pieces of information into meaningful units (Gentile, 1997, p. 598)), practice;
6. Elicit performance: retrieval, active participation, practice;
7. Provide feedback: correction of errors, reinforcement;
8. Assess performance: metacognition, retention;
9. Provide for retention and transfer: overlearning, distributed practice, generalization. (p. 413)

All nine events do not have to be included by the instructional designer simultaneously or in sequence. The inclusion of an event and its sequence depends on the objective, the audience and the instructional content (Gagné et al., 1988). Designing instruction can be simplified if each skill to be learned is defined by a performance objective (Gredler, 2001).


This aids in the selection of suitable instructional events, which can then aid in finding appropriate media and other support for effective instruction.

Metacognitive Learning

Driscoll (1994) defines metacognition in terms of a learner's awareness of his or her own thinking and the accompanying regulatory behavior. Adding to this is another definition that refers to metacognition as the knowledge people have about their own thought processes (Bruning, Schraw, Norby & Ronning, 2004, p. 81). Review of the literature about metacognition showed that college students, in particular freshmen, were not aware of their learning styles and most were neither self-regulated nor independent learners (Cukras, 2006). According to Zimmerman (1986), self-regulated learners are those who are metacognitively, motivationally and behaviorally active participants in their own learning process (as cited in Zimmerman, 1990, p. 4). In an effort to help students become self-regulated and independent learners, many colleges have programs where instructors teach study skills courses (Cukras, 2006). Studies have indicated that metacognitive awareness develops with age and older learners are much more capable of describing their cognitive characteristics (Bruning et al., 2004). Furthermore, these studies have also indicated that younger learners can be easily trained in metacognitive knowledge (Bruning et al., 2004). Additionally, instructors have been encouraged because research has indicated that learners with low ability and poor knowledge can be helped if they become metacognitively aware of their situation (Bruning et al., 2004). Studies also provided evidence to indicate that metacognitive awareness aids learners who have been considered high or low level achievers (Bruning et al., 2004).


Instructors should note that teaching metacognitive strategies has been helpful to students, especially those who are trying to learn new concepts.

Teaching Test-Taking Strategies

This topic is important to the present study because the module that will be converted from an instructor-led format to a web-based format pertains to objective test-taking strategies. Researchers (Scruggs & Mastropieri, 1992) believed that teaching test-taking skills increased test-wiseness and helped students to be better prepared in test-taking situations. The term test-wiseness was defined by Millman, Bishop and Ebel (1965) as a subject's capacity to utilize the characteristics and formats of the test and/or the test-taking situation to receive a high score (as cited in Scruggs & Mastropieri, 1992, p. 707). Another researcher, Durham (2007), explains further that test-wiseness is acquired with experience; a learner's past experiences help to acquire test-wiseness, which is why, for example, elementary school children usually have little or no test-wiseness. Sarnacki (1979) added that test-wiseness was not about guessing at answers, although teaching guessing strategies has been part of test-wiseness. Furthermore, this knowledge alone does not guarantee that the learner will pass every test (Sarnacki, 1979). Sarnacki (1979) reviewed a number of research studies pertaining to test-wiseness in his article and concluded that a variety of methods can be used to teach this subject. He found that research studies have shown that teaching low test-wise individuals test-wiseness strategies has helped to increase their test performance. Instructors were helped by the research work of Millman et al. (1965, as cited in Scruggs & Mastropieri, 1992),


who outlined principles of test-wiseness (p. 2) that were comprised of six elements. Four of the elements are independent of the test constructor's knowledge and two are dependent on that knowledge (Scruggs & Mastropieri, 1992). According to Scruggs and Mastropieri (1992), the four independent elements are:
1. Time-using strategies. Working quickly and efficiently, solving problems and answering items you know, and saving more difficult items for last.
2. Error-avoidance strategies. Paying careful attention to directions, careful marking of answers, and checking answers.
3. Guessing strategies. Making effective use of guessing when it is likely to benefit the test taker.
4. Deductive reasoning strategies. Applying a variety of strategies, including eliminating options known to be incorrect, or using content information from the stem (question) or other test information. (p. 2)

Also, as stated by Scruggs and Mastropieri (1992), the two elements dependent on the knowledge of the instructor as well as his/her test objectives are:
1. Intent consideration strategies. Include consideration of the purpose of the test or intent of the test constructor when selecting answers.
2. Cue-using strategies. Include use of known idiosyncrasies of the test maker (specific determiners), when it is known that such options are rarely correct. (p. 3)


These elements could be used to help guide the instructor when teaching test-taking strategies. There are several factors that affect how test-taking skills are taught. Teaching test-taking strategies is not the same as teaching specific test items or subjects (Scruggs & Mastropieri, 1992, p. 4). Learners' characteristics, such as their ability, as well as the specificity of the skills to be learned, were factors that affected teaching test-taking strategies. Scruggs and Mastropieri (1992) strongly advised instructors that the point of view of the learner should always be taken into consideration. The goal of the instructor should be to help the learners respond and answer the test questions to the best of their ability (Scruggs & Mastropieri, 1992). Bruning et al. (2004) discussed seven guidelines for instructors teaching strategies:
1. Match encoding strategies with the material to be learned. Instructors need to consider how their learners encode information. For example, learners need to match the strategies they use with their learning goals, materials and the type of evaluation they will encounter. Instructors should provide materials to encourage their learners.
2. Encourage students to engage in deeper processing. Deeper processing of information results in a stronger formation in memory. To encourage deeper processing of information, first, instructors can concurrently encourage their learners to make some connection with their prior knowledge as well as with the context in which the learning occurred. Secondly, instructors can promote affective-type responses to the information. Finally, instructors can


answer questions about the information to be learned while encouraging learners to also ask questions. These suggestions can help promote deeper processing of the information to be learned.
3. Use instructional strategies that promote elaboration. Instructors should use instructional strategies that will help learners gain meaning in what they are learning. When learners have been active participants in their learning, they were more likely to take responsibility for their learning. A technique an instructor can use is schema activation. Schema activation is about finding ways to help learners recall information. An instructor has several ways to help their students process this new information, such as brainstorming sessions, pre-teaching, explaining key concepts, or even asking the learners to categorize the information.
4. Help students become more metacognitively aware. Effective learners are learners who have declarative and procedural knowledge as well as metacognitive awareness. Learners who are highly aware of how, why and when they learn can regulate their learning. Therefore, instructors should teach metacognitive strategies since it is vital to good learning.
5. Make strategy instruction a priority. Research indicates that a learner's possession of knowledge does not make an independent or self-regulated learner. Therefore the learner should also know how to use the knowledge strategically. Instructors should actively discuss strategies, introduce one strategy at a time, allowing the learners to practice and discuss the strategies in detail while providing feedback to the learners. Research shows that learners


who had been taught strategies were empowered by their learning ability, which their high achievements reflected.
6. Look for opportunities to help students transfer strategies. Frequently, learners have been unable to transfer strategic knowledge from one learning context to another. Bruning et al. recommended that instructors inform learners of the various contexts in which they can use the strategies. In addition, they also recommended that instructors try to limit the number of strategies presented to the learner.
7. Encourage reflection on strategy use. Time to reflect has been an important aspect of developing metacognitively aware learners. Writing journal entries, group discussions and short essays have been strategies that an instructor can utilize to help learners reflect. (pp. 86-87)

These seven guidelines can assist instructors to teach strategies, including test-taking strategies, to learners. Scruggs and Mastropieri (1992) also described their experience in teaching test-taking strategies in the classroom. They stated that they would first teach the concepts, followed by a practice session where the students had been given a practice test. Afterwards, the instructors would follow up with review, evaluation and feedback. The learners have an important part in this process as well. The learners have the responsibility to practice the skills, which can be tedious for some learners (Scruggs & Mastropieri, 1992, p. 3). This has been especially problematic for learners who have difficulty


learning, and usually they are the ones that can benefit the most from learning test-taking strategies (Scruggs & Mastropieri, 1992).

Learner Satisfaction and Quality of Web-based Instruction

Researchers Shaik, Lowe and Pinegar (2006) stated:

Satisfaction is generally associated with a single transaction whereas service quality is based on the cumulative assessment of the quality of services rendered over time (functional quality) and the outcome resulting from those services (technical quality). (OJDLA, p. 1, para. 4)

Shaik et al.'s (2006) study employed a validated instrument called DL-sQUAL (Distance Learning Service Quality) to analyze the quality of distance learning services. The researchers' review of the literature revealed that there was a strong need to measure distance learning service quality due to the rise in demand for distance [web-based] courses. Education services are made up of core services, such as teaching and learning, while supporting services include timely information about institutional policies, procedures and courses, student advising, registration, orientation, student accounts, help desk, complaint handling, feedback, and student placement (Shaik, Lowe & Pinegar, 2006, p. 1, para. 4). Shaik et al. (2006) believed that emphasis should be placed on measuring what comprises quality in distance [web-based] education. Although Shaik et al.'s study focused on the DL-sQUAL instrument, other researchers were able to shed light on what comprises learner satisfaction and what is considered quality in web-based instruction. Although learner satisfaction has been defined, the problem exists in the measurement of this variable. Astin (1993) defined student


satisfaction as the student's perception of the college experience and the perceived value of the education received while attending (as cited in Bolliger and Martindale, 2004, p. 62). This is consistent with a common factor among many of the studies that attempt to measure learner satisfaction, where researchers focus on the learner's perception of successful learning. In particular, learner satisfaction is dependent on elements the learner perceives as constituting successful web-based learning (Hong, 2002; Stokes, 2001; Northrup, Lee & Burgess, 2002; Neuhauser, 2002; Moore, 2002; Frederickson, Pickett, Shea, Pelz & Swan, 2000; Bolliger & Martindale, 2004). Bolliger and Martindale (2004) proposed three constructs underlying learner satisfaction in web-based courses, among them instructor issues and technical issues (p. 61), whereas learner satisfaction in traditional courses is based on different factors, such as the availability of career advisors, student social life on campus, and overall relationships with faculty and administrators (Astin, 1993, as cited in Bolliger & Martindale, 2004, p. 63). Furthermore, it was inconclusive whether factors such as gender, age, learning styles, time spent on the course, perceptions of student-student interactions, and course activities affected learner satisfaction (Hong, 2004; Kim & Moore, 2005). Along with the three constructs that Bolliger and Martindale (2004) proposed, another factor to consider is the quality of web-based courses. Swan (2003) reviewed several studies on what constituted effective learning using computers and learning in higher


education. From her review, Swan (2003) provided a list of common elements that web-based course developers and instructors should consider:
1. Clear goals and expectations for learners;
2. Multiple representations of course content;
3. Frequent opportunities for active learning;
4. Frequent and constructive feedback;
5. Flexibility and choice in satisfying course objectives;
6. Instructor guidance and support. (p. 19)

Swan (2003) notes that although the course design elements listed above are an acceptable framework, it is uncertain whether they apply specifically to web-based courses. Indeed, she proposes that there is a need for researchers to study how particular elements function in web-based environments (Swan, 2005, p. 19). Swan (2003) set about to analyze several research studies conducted on this premise. The results of her analysis offer practical insights for web-based instructional designers and instructors (see Tables 9, 10, and 11). Based on these research findings, Table 9 shows the importance of interaction, consistency in terms of navigation, design elements and organization, and the importance of immediate feedback. Moving on to Table 10, there appeared to be a direct correlation between student-instructor interaction and student satisfaction. Instructor interaction, and even perceived instructor support, is seen as having a positive effect on the web-based learner (Swan, 2005, p. 36). Table 11 reports on the importance of designing to encourage online discussion and a web-based social presence of the learner.


Table 9
Interaction with course interfaces and content: Research findings and practical implications

Research finding: Interactions with course interfaces are a real factor in learning; difficult or negative interactions with interfaces can depress learning.
Implications for practice: Work with major platforms to improve interfaces to support learning. Develop consistent interfaces for all courses in a program. Provide orientations to program interfaces that help students develop useful mental models of them. Provide 24/7 support for students and faculty. Make human tutors available.

Research finding: Greater clarity and consistency in course design, organization, goals, and instructor expectations lead to increased learning.
Implications for practice: Review courses being taught and/or being developed to insure clarity and consistency. Establish quality control guidelines that address issues of clarity and consistency. Address issues of course design and organization and instructional goals and expectations in faculty development.

Research finding: Ongoing assessment of student performance linked to immediate feedback and individualized instruction supports learning.
Implications for practice: Automate testing and feedback when possible. Provide frequent opportunities for testing and feedback. Develop general learning modules with opportunities for active learning, assessment and feedback that can be shared among courses and/or accessed by students for remediation or enrichment.

Note. From Learning effectiveness: What research tells us, by K. Swan, 2003, in J. Bourne & J. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13-45). Needham, MA: Sloan-C.

Further design guidelines for instructional designers are offered by Mehlenbacher (2002), who is particularly concerned with the usability of WBI. In Table 12, Mehlenbacher (2002) pointed out that the environment is an important issue to consider when learning takes place on the Web. The design guidelines that Mehlenbacher (2002) emphasized are summarized in Table 12. The web-based learning environment is considered well designed when it is easy to navigate, convenient, reliable, accurate, and comprehensive.
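One of the practice implications in Table 9, to automate testing and feedback when possible, lends itself to a brief illustration. The Python sketch below is hypothetical and is not drawn from the module built in the present study; the practice item, option labels and feedback text are invented for exposition, borrowing a concept from the earlier discussion of test-taking strategies.

    # Minimal sketch of automated testing with immediate feedback:
    # a single multiple-choice practice item is scored as soon as it is answered.
    PRACTICE_ITEM = {
        "question": "Which element of test-wiseness involves saving difficult items for last?",
        "options": ["Error-avoidance strategies", "Time-using strategies",
                    "Guessing strategies", "Cue-using strategies"],
        "answer": "Time-using strategies",
        "explanation": "Time-using strategies stress answering known items first "
                       "and returning to harder items later.",
    }

    def grade(item, response):
        """Return immediate, explanatory feedback for one response."""
        if response == item["answer"]:
            return "Correct. " + item["explanation"]
        return "Not quite. " + item["explanation"]

    print(grade(PRACTICE_ITEM, "Guessing strategies"))

Immediate, explanatory feedback of this kind is what the research finding in Table 9 links to improved learning; in a full web-based module it would be attached to every practice item rather than to a single hard-coded example.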


Table 10
Interaction with instructors: Research findings and practical implications

Research finding: The quantity and quality of instructor interactions with students is linked to student learning.
Implications for practice: Provide frequent opportunities for both public and private interactions with students. Establish clear expectations for instructor-student interactions. Provide timely and supportive feedback. Include the topic of instructor interaction in faculty development.

Research finding: Instructor roles change in online environments.
Implications for practice: Include the topic of changing roles in faculty development and provide examples of how other instructors have coped. Provide ongoing educational technology support for faculty. Develop forums for faculty discussion of changing roles online and F2F.

Note. From Learning effectiveness: What research tells us, by K. Swan, 2003, in J. Bourne & J. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13-45). Needham, MA: Sloan-C.


Table 11
Interaction with classmates and vicarious interactions: Research findings and practical implications

Research finding: Learning occurs socially within communities of practice; there is greater variability in sense of community ratings among online courses than in F2F courses.
Implications for practice: Design community-building activities. Model the use of cohesive immediacy behaviors in all interactions with students. Develop initial course activities to encourage the development of swift trust. Address issues of community in faculty development.

Research finding: Verbal immediacy behaviors can lessen the psychological distance between communicators online; overall sense of social presence is linked to learning.
Implications for practice: Develop initial course activities to encourage the development of swift trust. Model and encourage the use of verbal immediacy behaviors in interactions with students. Encourage students to share experiences and beliefs in online discussion. Introduce social presence and verbal immediacy in faculty development.

Research finding: Student learning is related to the quantity and quality of postings in online discussions and to the value instructors place on them.
Implications for practice: Make participation in discussion a significant part of course grades. Develop grading rubrics for participation. Require discussion participants to respond to the responses to their own postings. Stress the unique nature and potential of online learning.

Research finding: Vicarious interaction in online course discussion may be an important source of learning from them.
Implications for practice: Encourage and support vicarious interaction. Require discussion summaries that identify steps in the knowledge creation process. Use tracking mechanisms to reward reading as well as responding to messages.

Research finding: Online discussion may be more supportive of experimentation, divergent thinking, exploration of multiple perspectives, complex understanding and reflection than F2F discussion.
Implications for practice: Encourage experimentation, divergent thinking, multiple perspectives, complex understanding and reflection in online discussion through provocative, open-ended questions, modeling, and support and encouragement for diverse points of view. Develop grading rubrics for discussion participation that reward desired cognitive behaviors. Develop initial course activities to encourage the development of swift trust.

Research finding: Online discussion may be less supportive of convergent thinking, instructor-directed inquiry and scientific thinking than F2F discussion.
Implications for practice: Use other course activities to support these, such as written assignments, one-on-one tutorials, small group collaboration and self-testing. Develop grading rubrics for discussion participation that reward desired cognitive behaviors.

Note. From Learning effectiveness: What research tells us, by K. Swan, 2003, in J. Bourne & J. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13-45). Needham, MA: Sloan-C.


Table 12
Usability design principles for WBI


Table 12 (continued)

Note. From Usability Design Principles for Web-Based Instruction (WBI), 1 of 2 (cf. Najjar, 1998; Nielsen, 1994; Selber, Johnson-Eilola, and Mehlenbacher, 1997), as cited in Mehlenbacher, 2002.


Summary

Developing and validating a web-based module to teach metacognitive learning strategies to students in higher education is the focal point of the present study. Researchers of instructional technology acknowledge that WBI development has a unique set of characteristics that differentiates it from developing traditional instruction. Development of WBI can be viewed from several perspectives: a research perspective, a design and development perspective, and a learning perspective. Chapter Two is a discourse on the literature that is considered pertinent to the present study. Four major topics were covered: design-based research (DBR) (i.e., a research perspective), web-based instruction (WBI) and instructional systems design (ISD) (i.e., a design and development perspective), and learning (i.e., a learning perspective). From a research perspective, it was critical that the research method utilized in the present study be defined and discussed carefully, since DBR is still considered by some in the instructional technology field as a new and untraditional approach. The definitions of DBR presented here underscored the ongoing discussions among researchers about its scope and value to the field of instructional technology. Moreover, the advantages of a DBR research approach and how it can meet the unique characteristics of studying the design process were clearly presented in the literature review. Critical guidance for the methodology adopted in the present study was presented in the discussion of Seeto and Herrington's (2006) guide for DBR research. Additionally, some top researchers in the field of instructional design are advocates of DBR and are calling for more studies of design and development to utilize this research approach; therefore the present study will add to the body of research.


In consideration of a design and development perspective, other topics of importance that were highlighted in the literature review were WBI in higher education and the ADDIE process within ISD. The Internet, the World Wide Web, and their resources create a unique environment for learning. As discussed in the literature, WBI provides increased accessibility to learners, yet for some IHEs there are prohibitive factors to developing WBI, with concerns about the quality of WBI being one such factor. It is important not only to understand what WBI is but equally important to understand the design and development perspective that could affect the quality of WBI. The premise of the present study was based on studying the effect of utilizing a systematic approach to design and develop WBI. Hence the reason for the inclusion of a review of the systematic approach, ISD, and the ISD process, ADDIE. The literature reviewed contends that using a systematic approach should result in quality WBI. ADDIE, which is a generic process, is used by many instructional designers to guide them in the creation of WBI. Conversely, there are some practitioners and researchers that do not support the use of ADDIE for WBI creation. The conflict among practitioners and researchers in using ADDIE is another reason why the present researcher is interested in discovering whether the systematic approach using the ADDIE process will result in a WBI that is high in quality. In other words, it is important and socially relevant to understand what process creates WBI that is considered educationally valuable to learners. Another perspective considered when developing WBI is learning. In particular, how learners can learn in a web-based environment is vital in helping to design the WBI for the study. Included in this portion of the literature review were foundational research


studies on learning, metacognitive learning, teaching test-taking skills, and elements of learner satisfaction and quality of WBI. How one learns, and in particular how one learns metacognitive learning strategies, provides a theoretical framework germane to the present research. Furthermore, research on teaching test-taking skills provided relevant approaches for designing content for the WBI created for the present study. It was also important to understand what elements comprise learner satisfaction and quality as they pertain to web-based courses. The information here provided guidelines to determine the attributes of a product that is effective and of high quality. All four topics reviewed, DBR, WBI, ISD and learning, are the foundation of the present study. Moreover, the analyses of some of the relevant studies conducted within these disciplines provide a strong theoretical framework for designing and developing WBI. The impact of the theoretical framework discussed here in Chapter Two influences the methods used in this study. Chapter Three follows with a review of the methodology of the present study.


Chapter Three

Methods

This chapter provides an overview of the research design and the research methods of the study. Furthermore, a pilot study was conducted, and it was comprised of the completion of the Analysis phase of the ADDIE process and the development of the prototype of the web-based module that is part of the Design phase of ADDIE. Findings derived from the pilot study are part of the discussion included in this chapter. The data collection method in the Analysis phase set the precedent for the rest of the phases of ADDIE. Therefore, following the pilot study discussion, the methods of the rest of the phases of ADDIE, that is, Design, Development, Implementation and Evaluation, are included in this chapter. Since a number of instruments were used in this study, a detailed description of the instruments, instrument development and validation (i.e., the expert review of the instruments) for each phase of ADDIE is included in this chapter as well. As in other DBR studies, a large amount of qualitative data was collected, and the method utilized for data analysis and data reduction is described fully later in this chapter. The reader will also notice references to data displays throughout the chapter. According to Miles and Huberman (1994, 1984), data displays are a crucial part of the data analysis and reduction process.


Overview of the Study

The present study was designed to examine the effect of applying a systematic approach to the development of a web-based module for teaching metacognitive learning strategies to students in higher education. It was also designed to meet a number of objectives that comprised the outcome of the study. The following objectives were met:

Research Objective 1: To create a systematically and rigorously designed product intended to meet research design goals.

Research Objective 2: To produce data that indicated the validity and effectiveness of the product.

Research Objective 3: Deliverables:
Deliverable A: A list of generalized design principles derived from the development process.
Deliverable B: A report on the effectiveness of the specific instructional strategies utilized.
Deliverable C: An analysis of quantitative, qualitative and descriptive outcome measures of learning among field test participants.
Deliverable D: A module that is considered valid and effective at the juncture where the study completes a second iteration of design-evaluate-refine cycles, using data collected via formative and summative evaluations guided by the ADDIE process.


Information was gathered by researching the systematic process ADDIE as it was used to guide the conversion of one learning module of a course that is currently an instructor-led course into a web-based module. The targeted module for conversion was part of an undergraduate course, Learning Strategies within Academic Disciplines, taught at a major research university. Further discussion about the course, the targeted module and why it was used in this study is provided in a later section of this chapter. The systematic process ADDIE was used as the conceptual framework for designing and developing the web-based module. Since this research study was designed to conduct formative and summative evaluations, the data collected provided suggestions for refinements of the ADDIE process and insights that could form the basis of a new ISD model altogether. Systematically going through the phases of ADDIE provided information on the feasibility of this web-based development process. For example, the design of the study provided data about the overall time and cost factors involved in creating a web-based module. Most importantly, it provided insight into the design decision-making process. Figure 6 displays a timeline of the ADDIE process and the length of time taken to complete each phase of ADDIE.

Setting

All the phases of ADDIE were examined in a naturalistic setting. There were a number of reasons for conducting this study in a naturalistic setting, foremost among them the opportunity to study the design process in its real-world context (Cohen, Manion & Morrison, 2000, p. 138). According to Cohen et al. (2000), there are further reasons for utilizing a naturalistic setting.


The setting for the study was a public research university. The university was recognized by the Carnegie Foundation as one of 39 community-engaged public universities and one of the top 63 research universities in the nation.

Figure 6. Timeline in weeks for web-based development.

Sampling

In this study an exploratory, inductive qualitative approach was utilized. This approach did not have pre-determined directions or delimitations set for the course of this study (Trochim, 2001). The sampling method used in this study was non-probabilistic and the sample was convenient. Non-probability sampling, or purposive sampling, means the chances of members of the wider population being selected for the sample are


unknown; that is, not everyone has a chance of being included in the sample (Cohen et al., 2000, p. 99).

Participants

The participants in the study varied depending on the phase and the type of evaluation required. For example, in order to test the usability of the web-based module throughout the phases of the ISD process (i.e., ADDIE), measures from learners, a subject matter expert (SME), two instructional design experts, an instructional designer and a programmer were sought. A further discussion of each type of participant in the study follows. It should be noted that the Principal Investigator (PI) also functioned in the roles of Instructional Designer and a programmer in this study.

Learners

Learner participants were students enrolled in courses conducted by the Student Learning Services (SLS) program. Learners were enrolled in Learning Strategies, Critical Reading and Writing, and The University Experience courses, which were all within the Academic Disciplines coursework. The Critical Reading and Writing course helps students develop the fundamentals of reflective and critical reading and of effective analytical writing utilizing multiple sources from various disciplines. The course meets the criteria of Gordon Rule Writing requirements (University of South Florida (USF), http://www.ugs.usf.edu/sab/sabs.cfm, para. 1). The University Experience course is a seminar-style course for new students. In seminars, small groups discuss the academic qualities necessary to succeed at USF: test-taking and study skills, time management, writing, critical thinking, computer and library resources, career planning and USF policies (USF, http://www.ugs.usf.edu/sab/sabs.cfm, para. 1). These courses were all


part of the LEARN Program, now called Student Learning Services, at the university where the research was conducted. The commonalities among these courses were that they all contained a metacognitive component to help guide their students to recognize their learning habits and learning style, and they represented the target audience demographics. All participants were 18 years old or older and participation was voluntary.

Subject Matter Expert (SME)

From a design perspective, a critical participant in the study was the SME. The SME was also one of the instructors of the Learning Strategies course and was a participant in seven of the twelve questionnaires and two interviews conducted in the present study. The SME was also a doctoral candidate in Instructional Technology. She taught the Learning Strategies course for two years. Her experience resulted in reliable content knowledge of the Learning Strategies course that was used to guide content development of the web-based module.

Instructional Design Experts

Two experts in the field of instructional technology reviewed the development of the instruments and the product. They were also participants in two questionnaires. Both experts hold doctoral degrees in Instructional Technology and have over three years of expertise in this field. One ID expert, referred to as ID Expert A in the study, is currently the program manager and instructional designer for the Distance Course Design and Consulting Group at a major research university. She has worked as an Instructional Designer in both higher education and with a military contractor. In higher education this expert was an Instructor for First Year Student programs and an Academic Advisor. The


other ID expert, referred to in this study as ID Expert B, is the Assistant Dean of Curriculum, School of Nursing, at a private for-profit university. Her area of expertise includes managing and directing all aspects of course development for Associate's, Bachelor's, and Master's degree nursing programs at the private for-profit university.

Instructional Designer

The PI was also the instructional designer in this study. Later in this discussion, when the role of the instructional designer takes priority over the role of the PI, the reader will see a reference to the PI (as Instructional Designer). The PI is a doctoral candidate in the field of instructional technology. She is currently employed as an instructional designer at a private liberal arts university. As an instructional designer, she has over three years of expertise in developing web-based courses. She has participated in several ID projects where she has converted existing traditional lessons into a web-based format. She has analyzed, designed, developed, implemented and evaluated several web-based training programs. Also, the PI has expertise in using several development tools such as Authorware 6.0, Adobe Captivate 3.0 and Lectora.

Programmers

Two programmers were assigned to this study. One programmer developed the prototype and, for the rest of the study, will be referred to as Programmer 1. Programmer 1 has professional programming experience. Although he developed the prototype, Programmer 1 did not participate in any other phases of the study. The second assigned programmer, Programmer 2, was also the PI of the present study. The PI has over six years of programming experience and developed the web-based module. Later in this discussion, when the role of


Programmer 2 takes priority over the role of PI, the reader will see reference to PI (as Programmer 2).

Ethical Considerations

Participation in the study was voluntary and all participants were assured that the data collected would be anonymous and confidential. No participant in the study was harmed in any way and no incentives were used to entice participants to take part in this study. The data was not used for any purpose other than to meet the objectives of this study. IRB (Institutional Review Board) permission had been sought and adhered to for all the phases of ADDIE as well as for the DBR evaluations. Initially, IRB approval was sought and granted for the first phase of ADDIE: Analysis. Once this phase was completed, IRB approval was sought and granted to conduct the rest of the ADDIE phases. The study was granted the appropriate status by the IRB.

The Principal Investigator (PI)

The role of the PI in the study must be examined to allay any suspicions of potential researcher bias. Along with her role as PI, she acted as an instructional designer and as one of the two computer programmers for the study. The PI has more than six years of computer programming experience that was utilized to facilitate the development of the web-based module. She has been employed as an instructor in the College of Education at a major research university and has taught courses on integrating technology into the classroom. She is currently employed as an instructional designer at a private liberal arts university. Since much of this inquiry was qualitative in nature, which implied some interpretation of the data, it is important to delve into the scholarly qualities of the PI. A


major scholarly interest of the PI is to understand the use of technology to enhance and supplement learning. Although the PI has a technical educational background and has worked in the information technology field for a number of years, the PI believes that technology in and of itself cannot fill all the gaps that occur in the learning environment. She believes that technology should be teamed with other successful learning interventions to influence learning outcomes in a positive manner. For example, web-based learning should have well thought out interactions that create active and not passive learners. The PI is also interested in understanding how educators can create online interventions that can produce positive learning outcomes by using innovative tools and teaching methods. It should be noted that prior to the start of the study, the PI had been acquainted with both the SME and Programmer 1. Programmer 1 developed the prototype for the study. Also, the PI attended various instructional design courses together with the SME and Programmer 1. Additionally, the SME, also being an instructional designer, did make suggestions in reference to design elements of the web-based module. However, the PI's background did not introduce a bias. The content information gathered was strictly from an SME perspective; suggestions on how best to present the information and what kind of design elements may be feasible within the web-based environment were taken into consideration. The information derived did not involve any personal or subjective information. The questionnaires and interview questions used in the study were derived from noted researchers in the field of instructional technology. Finally, the PI documented the entire


research process of the ADDIE phases by logging entries on a weekly basis (see Appendix D for an excerpt of the Logbook). The entries captured the progress and provided a source of information for performing DBR evaluative functions in the study.

Description of the Course for Conversion: Learning Strategies within Academic Disciplines

The Learning Strategies course is based on a model of developing self-regulated learners through understanding concepts related to motivation, attitude, goal planning, and the process of learning. It is a two-credit seminar-style course with three main objectives: to encourage critical thinking, to help students self-regulate their academic actions, and to create reflective learners. The goal of the course is to help learners develop an understanding of their learning style through the practice of reflection. The hope is that this understanding will serve them well in their academic career and beyond. Currently the course is experiencing low enrollment rates. The target audience for the course is any student who requires help acquiring learning strategies skills. If this course were made available on the Web, it would more than likely increase the enrollment rate and also attract higher-level students such as juniors and seniors. Furthermore, in its present state the course does not allow any flexibility in terms of content development. It is not flexible, modular or scalable. Instructors in this course may have to teach a variety of students who range from freshmen to seniors in one combined learning session. At the moment, the rigidity of the course design at times makes this concept too difficult for freshmen and at the same time too easy for juniors and seniors. As a result of the audience not being typical, instructors are asking for a web-based module that is
comprised of several subsets to suit the learner. Web-based courses, if designed properly, can fit the needs of many individual learners. In this study the PI created a web-based module using the content of the metacognition module as its foundation. In particular, the concept of self-regulatory learning strategies in combination with test preparation skills was converted to a web-based format. Comments from students and instructors of the Learning Strategies course suggested that the learning module on metacognition was one of the most difficult of all the concepts in this course. This module was ideal to transition to the Web because it combined theory with practice. Focusing on the outcome of transitioning this module from instructor-led to web-based added value and relevance to this study.

Data Collection Instruments

This section describes the instruments utilized in the data collection for the ISD process, ADDIE, as well as for the DBR approach. Table 13 is an overview of the instruments and their relation to the phases of the ISD process and the DBR evaluation functions. The method of analysis for each instrument is also displayed. Examples of all the instruments employed and the results of the Analysis phase of ADDIE can be found in Appendix A; the instruments and results for the remaining phases of ADDIE can be found in Appendix B.

Interviews

The first interview, Analyze the Problem, was conducted with the SME and the Director of the LEARN program at the beginning of the Analysis phase. Recently, the LEARN program was renamed Student
Learning Services (SLS). The interview was informal and conversational. Although informal, this type of interview yielded relevant information (Cohen et al., 2000, p. 271). Moreover, a rapport between the PI and the SME was established, and a general idea of why a web-based module was desired was explicitly addressed. Leadership support for the module development was also established with the Director of the LEARN program, now the Director of Student Learning Services.

The second interview, the Design Module Discussion, followed a guided format (Cohen et al., 2000, p. 271) and involved the SME, Programmer 1, and the PI, who relied on her instructional design and programming expertise (see Specimen B-1 in Appendix B). The PI outlined the interview questions prior to the meeting. Like the previous interview, this one was also conversational in nature, but it systematically followed the questions outlined prior to the interview and provided a comprehensive collection of the data (Cohen et al., 2000).

Questionnaires

There were twelve questionnaires included in this study. More details of the validation process for each of these questionnaires are included in a later section of this chapter. All questionnaires were reviewed by two ID experts. The questionnaires generated the numerical data that was pertinent to the study and supplemented the descriptive narratives of the study (Cohen et al., 2000). See Table 13 for a detailed view of the ADDIE phases and the assigned instruments, as well as the targeted participants for each instrument.
Table 13. Overview of instruments showing the relationship between ADDIE phases and DBR evaluation functions. Each instrument row lists the instrument name, type of instrument, participants, and method of analysis for the ADDIE phase.

Analysis phase (DBR evaluation functions: review, needs analysis; DBR instruments/tools: logbook, results from the Analysis phase, literature review sources):
Analyze the Problem | Interview | SME, Director of LEARN Program | Observational, descriptive
Needs Analysis | Questionnaire | SME | Descriptive
Audience Analysis | Questionnaire | SME | Descriptive
Task Analysis | Questionnaire | SME | Descriptive
Content Analysis | Questionnaire | SME | Descriptive
Context Analysis | Questionnaire | SME | Descriptive
Learner Analysis | Questionnaire | Learner | Descriptive, frequencies

Design phase (DBR evaluation function: formative; DBR instruments/tools: logbook, results from the Design phase, literature review sources, Evaluate Design Decisions questionnaire):
Design Module Discussion | Interview | SME, ID, Programmer | Observational, descriptive
Evaluate Design Decisions | Questionnaire | SME, ID, Programmer | Descriptive, frequencies

Development phase (DBR evaluation functions: formative, effectiveness; DBR instruments/tools: logbook, results from the Development phase, literature review sources, Module Development Questionnaire):
Evaluate Usability of Module | Questionnaire | SME | Descriptive, frequencies
Expert Review of Module | Questionnaire | ID Expert | Descriptive, frequencies
Learners: Evaluate Usability of Module Survey | Questionnaire | Learner | Descriptive, frequencies
Module Development Questionnaire | Questionnaire | ID, Programmer | Descriptive, frequencies

Implementation phase (DBR evaluation function: formative; DBR instruments/tools: logbook, results from the Implementation phase, literature review sources):
Implementation of Module | Observation | ID, SME | Observational

Evaluation phase (DBR evaluation functions: summative, effectiveness; DBR instruments/tools: logbook, results from the summative survey):
Summative Usability Evaluation | Questionnaire | Learner, ID Expert | Descriptive, frequencies
The following lists the questionnaires and their assigned ADDIE phases:
1. Analysis: Needs Analysis; Audience Analysis; Task Analysis; Content Analysis; Context Analysis; Learner Analysis;
2. Design: Evaluate Design Decisions Questionnaire (DBR perspective);
3. Development: Evaluate Usability of Module (DBR perspective); Module Development Questionnaire; Expert Review of Module; Learners: Evaluate Usability of Module;
4. Evaluation: Summative Usability Evaluation.

Table 14 summarizes the participants for each instrument and includes the type of instrument and participant for each instrument, as well as the courses in which learner participants were enrolled. In regard to learner participants, three questionnaires were developed to gather their observations and opinions. For the pilot study, in the Analysis phase, the participants of the Learner Analysis questionnaire were enrolled in a Learning Strategies course. Similarly, in the Development and Evaluation phases, the participants of the Learners: Evaluate Usability of Module questionnaire (see Specimen A-1 in Appendix A) and the Summative Usability Evaluation questionnaire (see Specimen B-6 in Appendix B) were enrolled in the Critical Reading and Writing course and The University Experience course, respectively. All three courses had common metacognitive learning strategies components and were all under the umbrella of the Student Learning Services department. Again, as seen in Table 14, participants of the remaining questionnaires also included the SME, Programmer 1, the PI (as Programmer 2), ID experts, and the PI (as
Instructional Designer). For five of the questionnaires in the Analysis phase, the participant was the SME.

Table 14. List and type of instruments, participants, and learner course descriptions. Each row lists the instrument name, type of instrument, participants, and, where applicable, the learner course.

Analysis phase:
Analyze the Problem | Interview | SME, Director of LEARN Program
Needs Analysis | Questionnaire | SME
Audience Analysis | Questionnaire | SME
Task Analysis | Questionnaire | SME
Content Analysis | Questionnaire | SME
Context Analysis | Questionnaire | SME
Learner Analysis | Questionnaire | Learner | Learner course: Learning Strategies

Design phase:
Design Module Discussion | Interview | SME, ID, Programmers
Evaluate Design Decisions Questionnaire | Questionnaire (DBR) | SME, ID, Programmers

Development phase:
Evaluate Usability of Module | Questionnaire | SME
Module Development Questionnaire | Questionnaire (DBR) | Programmer 2 / ID
Expert Review of Module | Questionnaire | ID Expert
Learners: Evaluate Usability of Module | Questionnaire | Learner | Learner course: Critical Reading and Writing

Implementation phase:
Implementation of Module | Observation | PI

Evaluation phase:
Summative Usability Evaluation | Questionnaire | Learner, ID / ID Expert | Learner course: The University Experience (2 sections)
The questionnaires in the Analysis phase were the Needs Analysis, Task Analysis, Audience Analysis, Content Analysis, and Context Analysis (see Appendix A), and they were distributed to the SME via email. At the Design phase of ADDIE, to gain insight from a DBR perspective, the Evaluate Design Decisions questionnaire (see Specimen B-10 in Appendix B) was administered; the target participants were the SME, the PI (as Instructional Designer), and the programmers. The responses to this questionnaire provided rich details of the decision-making process from a DBR perspective. There were four questionnaires included in the Development phase of ADDIE. The participant of the Evaluate Usability of Module questionnaire was the SME (see Specimen B-2 in Appendix B). Programmer 1, who developed the prototype in the Design phase, did not participate in any of the other phases of the study and therefore was not a participant in any other questionnaire from the Development and Evaluation phases. For the Expert Review of Module questionnaire of the Development phase, the participants were ID Expert A and ID Expert B (see Specimen B-3 in Appendix B). The Learners: Evaluate Usability of Module was a third questionnaire, administered to learner participants in the Development phase of ADDIE (see Specimen B-4 in Appendix B). The final questionnaire of the Development phase was the Module Development Questionnaire, and the intended participants were Programmer 2 and the Instructional Designer (see Specimen B-11 in Appendix B). As noted, the PI was both the Instructional Designer and Programmer 2 in the study. Data collected from this questionnaire aided the DBR analysis of the present study. According to Cohen et al. (2000), questionnaires like the ones deployed on the Web for this study provide an economical and efficient way to reach a larger audience. In this case, most of the participants of the study were
geographically dispersed; therefore, these questionnaires were deployed on the Web using the survey tool http://survey.acomp.usf.edu.

Observation

The implementation of the module was not extensive in nature. The module was placed on the Web by simply copying the files from the PI's computer to a server owned by the university where the research was conducted. To implement the module, a hyperlink was created to a site that the PI maintained. All participants were provided a password to gain access to the hyperlink. A direct observation of the implementation process by the PI was sufficient to provide data for review in this phase.

Logbook

When the study commenced, the PI kept a log of all events pertaining to the research (see Appendix D for an excerpt of the logbook). She wrote one entry per week regarding the progress of the ISD process. This logbook provided substantial data for the DBR analyses. The logbook was also used as an organizational tool during the lifespan of the study.

Research Design

As mentioned previously, the DBR approach calls for an assortment of evaluation instruments to be utilized across the phases of ADDIE. The result of the DBR approach provided a comprehensive view of the ISD process. A conceptual model of the research design is shown in Figures 7 and 8. This model built upon the model presented earlier (see Figure 3). Using the ADDIE process as a conceptual framework to guide the study aided in providing construct validity at the core of the study. Johnson and
Christensen (2004) explain that construct validity concerns how well a higher-order construct is represented in a particular study (p. 247).

Figure 7. Pictorial representation of construct validity elements included in the research design.

Construct Validity: Overall Study

There were two tiers to the research that provided construct validity to the study. First, the five phases of the ADDIE process provided the overarching goal of the entire study. Second, the four phases of DBR research (see Figure 2 in Chapter Two) were at the core of the study and integrated a research perspective. The functions of evaluation listed in Chapter Two were used to assess the entire study from a DBR
perspective. Additionally, Figure 8 displays an overview of the study, showing the relationship between the ADDIE phases, the DBR phases, and the evaluation functions. Furthermore, in the systematic development of the web-based module, the high-order constructs were represented by the phases of ADDIE (Analysis, Design, Development, Implementation, Evaluation). Some of the phases (e.g., Analysis and Implementation) were easier to operationalize; that is, the construct was represented by specific steps to follow when using ADDIE to guide the study. In contrast, information was gathered in an iterative fashion between the Design and Development phases, so operationalizing that sequence was somewhat complex, but once accomplished it provided clarity. Although the complexity of the study increased because a variety of measures were employed per phase, calling for multiple operationalism (i.e., more than one measure per construct), there were clear constructs that guided the process. As mentioned earlier, another element that provided further construct validity in this study was Reeves and Hedberg's functions of evaluation: review, needs analysis, formative, and effectiveness were employed throughout the phases of the ISD process. Reeves and Hedberg (2003) recommended that the last two functions of evaluation, impact and maintenance, be conducted after a module has been in use for more than a year. For this study the last two functions were not feasible due to time constraints. As recommended by Reeves and Hedberg (2003), formative evaluations were conducted throughout the phases of the study; these evaluations prompted an iteration of the design-evaluate-refine cycle in the Development phase before the module was implemented. A summative evaluation was administered at the Evaluation phase of ADDIE and prompted two iterations of the design-evaluate-refine cycle.
Figure 8. Overview of research design. The figure maps the ADDIE phases onto the design-based research phases (analysis of practical problems by researchers and practitioners; development of solutions with a theoretical framework; evaluation and testing of solutions in practice; documentation and reflection to produce design principles; refinement of problems, solutions, and methods) and onto the DBR evaluation functions (review, needs analysis, formative, effectiveness). Its content is summarized below.

Analysis: Analyze the Problem (SME, Director of LEARN Program; interview); Needs Analysis, Audience Analysis, Task Analysis, Content Analysis, and Context Analysis (SME; questionnaires); Learner Analysis (learners; online survey). The PI drafts principles using the logbook, the results from the analysis, and literature review sources.

Design: Design Module Discussion (SME, Programmer, and ID; interview); Evaluate Design Decisions (SME, ID, Programmer; questionnaire). The PI drafts principles and documents rationales for design decisions and the models, strategies, and innovations used, drawing on the logbook, the results from the design analysis, and literature review sources.

Development: Module Development Questionnaire (Programmer; questionnaire); Evaluate Usability of Module (SME; questionnaire); Expert Review of Module (ID expert; questionnaire); Evaluate Usability of Module Survey (learners; survey). The PI drafts principles and information on the design and development of the product/learning environment using the logbook, the results from the development analysis, and literature review sources.

Implementation: Implement Module (SME and Programmer; observation). The PI drafts principles and information on the implementation of the product/learning environment using the logbook, the results from the implementation observations, and literature review sources.

Evaluation: Evaluate all artifacts from the Analysis phase (PI; refine analysis principles); evaluate all artifacts from the Design and Development phases (PI; refine design and development principles); Summative Usability Evaluation (learner and ID expert; survey); evaluate artifacts from the implementation process (PI; refine implementation principles); evaluate all artifacts from the ADDIE phases (PI; refine design principles). The PI refines principles for the overall study using the logbook, the results from all phases, and literature review sources.
These iterations served to evaluate the research and how the research was conducted.

Construct Validity for Instrument Development: Expert Review of Instruments

The basis of each question in the instruments was derived or modified from credible sources such as those cited in the literature review in Chapter Two. To reiterate, research by Seels and Glasgow (1998) and guidelines by Bruning et al. (2004), Swan (2003), and Mehlenbacher (2002) were influential in this study. Furthermore, to add rigor and to reduce researcher bias, all of the instruments were expertly reviewed prior to being distributed to the participants. Two Instructional Design (ID) experts with doctoral degrees in the field and with over three years of expertise reviewed the questionnaires. For the Analysis phase, a description of the expert review process for the instruments is included in the summary of the pilot study in this chapter. Seels and Glasgow (1998) provided a list of questions that should be answered at each phase of ADDIE; their guideline was also used to conduct the data reduction and analysis for the study. Bruning et al. (2004) put forward seven recommended guidelines to be used by instructors of metacognitive learning methods. Also, both Swan (2003) and Mehlenbacher (2002) offered practical design guidelines for instructional designers and instructors to create a product that is effective and high in quality. An expert review process was developed by the PI for ID Expert A and ID Expert B. The procedure for expertly evaluating the instruments was as follows: (1) the four guidelines, by Seels and Glasgow (1998), Bruning et al. (2004), Mehlenbacher (2002), and Swan (2003), were emailed to the ID experts; (2) the questionnaires pertaining to each phase were emailed to the ID experts as email attachments;
Figure 9. Execution of research plan. The figure traces the execution of the plan across the ADDIE phases, with the DBR elements (logbook, evaluation functions, instruments) running alongside every phase. The pilot study covers the Analysis phase, in which instruments were developed, expertly reviewed, and analyzed through data reduction; its outcomes provided the method and guidelines for the remaining phases. In the Design phase, formative evaluation instruments and a prototype produced outcomes that fed refinement. In the Development phase, formative evaluation instruments, with one iteration of the evaluate cycle, yielded the product: a web-based module on objective test-taking strategies. In the Implementation phase, the product was placed on the Web in a test environment and observed. In the Evaluation phase, summative evaluation instruments, with two iterations of the evaluate cycle, were applied to the refined product on the Web. Each phase concluded with a review of its data and data reduction (review of needs analysis, design, development, implementation, and evaluation data).
(3) ID Expert A and ID Expert B were asked to read the guidelines first and to use them as the common criteria to assess each questionnaire; (4) the experts were asked to respond with their suggested changes via email within one week, schedule permitting; (5) the PI reviewed each suggested change; (6) after making the suggested changes, the PI re-sent the links to the questionnaires to ID Expert A and ID Expert B; (7) the ID experts reviewed the questionnaires a second time; (8) again, the suggested changes for the questionnaires were emailed to the PI; (9) the PI reviewed and made the changes; and (10) the PI put the final version of each questionnaire on the Internet. Beyond this, no specialized training was necessary for the ID experts. By reading the guideline research, the ID experts made informed recommendations for changes to the instruments. Such independent assessment of the instruments helped the PI collect relevant and unbiased data.

Data Reduction and Analysis

Data reduction and analysis began at the point of data collection and continued until the end of the study (Cohen et al., 2000, p. 147). The phases of ADDIE were the core organizing element of the study. Data reduction is an iterative process (Miles & Huberman, 1994; Bogdan & Biklen, 1992). Bogdan and Biklen (1992) describe data analysis as working with data, organizing them, breaking them into manageable units, synthesizing them, searching for patterns, and discovering what is important and what is to be learned. As seen in Figure 9, after the collection of data from each phase, the data was carefully analyzed and summarized from two
perspectives: the instructional design perspective and the DBR perspective. Figure 9 shows the execution of the research plan for the study. LeCompte and Preissle (1993) stated that the goal of qualitative data analysis is to progress toward explanation and theory (as cited in Cohen et al., 2000, p. 148). To progress to theory generation, LeCompte and Preissle (1993, as cited in Cohen et al., 2000) made the following recommendations:
1. Set out the main outlines of the phenomena that are under investigation;
2. Assemble chunks or groups of data, putting them together to make a coherent whole;
3. Compare, contrast, and aggregate the data (p. 148).
These guidelines were adhered to by the PI. The objectives of the study listed in Chapter One and at the beginning of this chapter constitute the main outlines of the study. Data was collected in chunks in the sense that at each phase there was data collection via various methods. At each phase the data had to be analyzed and summarized so that information could be extracted to continue to the next phase and proceed with the development of the web-based module. Finally, when all the data had been collected, it was used, together with the PI's logbook, to compare, contrast, synthesize, and aggregate information to develop a comprehensive view of the study.
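To make the chunking and aggregation steps concrete, the following is a minimal sketch, not part of the original study materials, of how coded qualitative entries (for example, logbook notes or open-ended questionnaire comments) might be grouped by ADDIE phase and tallied during data reduction. The entry data and theme labels are hypothetical.

from collections import Counter, defaultdict

# Hypothetical coded entries: (ADDIE phase, theme assigned during data reduction).
coded_entries = [
    ("Analysis", "need for flexible content"),
    ("Analysis", "learner access to computers"),
    ("Design", "level of interactivity"),
    ("Design", "need for flexible content"),
    ("Development", "level of interactivity"),
    ("Development", "quality of feedback"),
]

# Assemble chunks of data by phase, then aggregate theme frequencies within each chunk.
themes_by_phase = defaultdict(Counter)
for phase, theme in coded_entries:
    themes_by_phase[phase][theme] += 1

# Compare and contrast phases by printing the aggregated counts.
for phase in ("Analysis", "Design", "Development", "Implementation", "Evaluation"):
    print(phase, dict(themes_by_phase.get(phase, {})))

A spreadsheet or a qualitative analysis package would serve the same purpose; the point is only that each phase's data formed a chunk that could be summarized before moving on to the next phase.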
Summary of Pilot Study Results: Analysis Phase and Prototype Outcomes

The outcome of the Analysis phase created a foundation and a robust guideline for conducting the Design, Development, Implementation, and Evaluation phases of ADDIE. In particular, the method used in the Analysis phase set the premise for conducting the next four phases. Included in the pilot study was the development of the prototype, which was part of the Design phase (see Appendix A for instruments and a summary of results from the pilot study). The DBR approach, together with the conceptual model of the research design (see Figures 3 and 8), was employed and expanded to lend direction to the study. Guidelines developed by Seels and Glasgow (1998), Bruning et al. (2004), Swan (2003), and Mehlenbacher (2002) provided constructs that aided in the development of the qualitative and quantitative measures utilized in the study. IRB permission was granted to complete the Analysis phase. In this phase, seven instruments were employed: six questionnaires and one interview. The goal of the Analysis phase was to identify the need for the web-based instruction and to understand why a learning gap existed (Dick et al., 2005). Moreover, the information gathered at this phase assisted the instructional designer in comprehending the reasons for developing the learning module. For the pilot study, data collection at the Analysis phase was conducted over the duration of one week during the summer semester, after the instruments were expertly reviewed. The phase began with the Analyze the Problem interview among the SME, the Director, and the PI (see Appendix A). From this interview, the PI learned the reasons why a
web-based module was desired. Most importantly, the SME pointed out that the need for a web-based product was not in response to a problem per se; rather, it was a reaction to the needs of the learners and the availability of the technology. During this initial meeting, the target class and module for web-based conversion were introduced by the SME and the Director.

Analysis Phase: Instruments

Following the interview, the PI turned her attention to developing the six questionnaires. For the pilot study, the instruments were expertly reviewed by the SME and an ID expert to validate the instruments. This method of expert review set the standard for the rest of the phases in regard to expert review of instruments. The instruments provided insights into the characteristics of the target audience and the content that should be included in the module. They also provided a clear reason as to why a web-based module was needed. The seven instruments, participants, and types of instrumentation were as follows:
1. Analyze the Problem | SME, Director of LEARN program (LEARN has been renamed Student Learning Services) | Interview
2. Needs Analysis | SME | Questionnaire
3. Task Analysis | SME | Questionnaire
4. Audience Analysis | SME | Questionnaire
5. Content Analysis | SME | Questionnaire
6. Context Analysis | SME | Questionnaire
7. Learner Analysis | Learners | Questionnaire
For the DBR perspective, the PI reviewed the Analysis phase using:
1. The logbook
2. Results from the Analysis phase
3. Literature review sources
The objective was to reflect on the process, to review the instruments and the literature review, and to develop principles of design that enhanced the process. Six questionnaires (see Appendix A for pilot study instruments and results) were developed using several guidelines from noted researchers (Dick et al., 2005; Seels & Glasgow, 1998; Mager, 1984; Mager, 1997). All six instruments were reviewed by an ID expert and the SME. The six questionnaires were emailed to the SME and the ID expert for a first review. After a week, the first round of suggested changes was sent to the PI. The recommended changes were made, and afterwards a second ID expert reviewed the instruments. That expert's suggested changes were obtained, again via email, within two weeks. In general, for all instruments, the changes were minor and concerned grammar, such as keeping verb tense consistent and correcting some misspellings. The PI made the recommended changes to all the instruments. After the second round of recommended changes was completed, five of the instruments (Needs Analysis, Audience Analysis, Task Analysis, Content Analysis, and Context Analysis) were delivered via email to the SME for data collection, since she was now a participant in the study. The delivery method was email, preferred by the SME given a lack of availability to meet face to face.
The only instrument of the Analysis phase intended for learner participants was the Learner Analysis (see Specimen A-1 in Appendix A for the instrument and a summary of results). This instrument comprised twenty-two questions in four sections: Student Information, Computer Usage, Online/Distance Learning Course Information, and a final section consisting of an open-ended question asking for the learners' opinions about online courses. The instrument was developed for the Web using a survey tool (http://survey.acomp.usf.edu) adopted by the university at which the research was conducted, and it was delivered online. A URL (uniform resource locator) link to the instrument was delivered via email to the SME. As noted previously, the SME was also an instructor of one of the LEARN courses. As an instructor, the SME asked students in her class to participate in the analysis on a voluntary basis. The students were given the option to complete the questions online in class or at another convenient time outside the classroom. The instrument was kept online for two weeks. Ten responses were received at the end of the two weeks, and this marked the completion of the Analysis phase.

Analysis Phase: Outcomes

The data was analyzed from two perspectives: an instructional designer perspective and a DBR perspective. Analysis of the pilot study data from an instructional design perspective included identifying design elements as well as development requirements of the web-based module. Careful scrutiny of the Analysis phase data collected from the seven instruments revealed a wealth of information. The review of the data explained to the instructional designer the need for the web-based module. The data also provided an understanding of the current learning environment. The characteristics of the target audience for the web-based module were also derived
from the data collected. Initially, when the SME reviewed the Needs Analysis instrument, she objected to the initiative being framed as a problem; the instrument was reworded and resent via email. As the SME pointed out, the reason for seeking development of a web-based module was to create new opportunities to diversify current academic support services, and this should not be defined as a problem. Also, the request for a web-based module was the result of the leadership stakeholder (the Director of the LEARN program) wanting to take advantage of available technological advances. Other reasons cited for needing a web-based module were as follows: (a) a need to make the course more accessible in order to serve a larger number of students; (b) a need for more flexibility in changing the content to easily meet different curricula criteria and target audiences; and (c) the ability to adapt quickly to future changes. Clearly the needs analysis explained the purpose of the web-based module. Additionally, the current learning environment was described. Altogether, this gave the instructional designer information to help make design decisions. The information derived from the Needs Analysis provided an explanation to the instructional designer as to why a web-based module was needed (see Appendix A). The SME described the target audience, and together with the information gathered from the Learner Analysis instrument, a much richer profile was developed. An instructional designer should be able to use the Needs Analysis not only to discover what the learning gap is but also why the learning gap occurred (Rothwell & Kazanas, 2004). In this study, the instructional designer found that the Needs Analysis information provided an
understanding of the need for the web-based module. The instructional designer discovered that the web-based module was desired because the Director of the program wanted to reach out to more students and to diversify current program services. From the Audience Analysis we learned some of the demographic details of the audience. The learners ranged from 19 to 25 years old, and they were comfortable using computers. More importantly, the respondents were not opposed to learning the material online. Some were enrolled in the course because it was a requirement course; to explain, a requirement course is a course a student must take in order to meet some specified university rule. The SME pointed out that the students were learning a skill (i.e., test-taking strategies) and that the requirements outside the classroom called for only a few hours or less of homework per week. The responses from the Audience Analysis gave the instructional designer valuable information regarding who would be using the web-based module. Furthermore, this type of information could help instructional designers design a module to meet the specific needs of a target audience (Rothwell & Kazanas, 2004). There was no prerequisite knowledge needed for taking the course beyond the baseline knowledge of a college freshman. In the Task Analysis (see Appendix A), the data revealed that the current instructors tried to engage the learners by having them participate in various activities. Some of the activities included administering various types of tests, such as objective or subjective tests, in an effort to help the students understand the different approaches. A task analysis, according to Rothwell and
Kazanas (2004), helps identify what needed to be developed as a web-based module and what prerequisite information, if any, was required of the learners. For the instructional designer, the data collected from the Task Analysis specified the task and the expectations of the module. The Content Analysis, however, provided detailed information to the instructional designer about the learning objective of the proposed web-based module, Test-Taking Strategies (see Appendix A). How the content should be arranged and presented, as well as how the information should be processed by the learner, was defined in this instrument. The approach to teaching test-taking strategies for objective tests was derived from the Content Analysis. The lesson was taught in the following manner: students were administered objective tests, after which the instructor pointed out several strategies for choosing the correct answer and encouraged discussion. Various kinds of declarative knowledge, such as levels of knowledge, levels of intellectual ability, characteristics of objective and subjective tests, commonly used test-taking strategies, and general rules of test-taking strategies, were also presented to the learner. Furthermore, affective knowledge, such as being responsible for studying, being self-regulated, being self-motivated, and being committed to their studies, was also discussed with the students. More importantly, the SME described some of her preferences for an online environment; she noted that high levels of interactivity, as well as video and audio if they add value, should also be considered. Content information was important to the instructional designer because it helped to guide design
decisions. The Content Analysis informed the instructional designer of how the current instructor-led class was being taught. Specific information pertaining to the modules was also divulged in the questionnaire. The environments for planning, learning, and performing (Seels & Glasgow, 1998) were considered in the Context Analysis (see Appendix A). In regard to planning, financial constraints for implementing a web-based module were examined. This was an area where the study differed from a project occurring outside the context of research: the development of the module was funded by the PI. Cost analysis is an important aspect of web-based development, or any development for that matter, and the study was limited in this area. Although a cost structure was developed after the initial meeting among the PI, the SME, and the Director, it was apparent to the PI that the research and development would not be funded by the department. However, there was strong support in the sense of accommodation of the PI and accessibility to information from key personnel (e.g., the SME and the Director). In respect to the learning context, the SME made it clear that students were held ultimately responsible for understanding their own learning style. She pointed out that a possible social or physical constraint that might prevent web-based learning for the target audience could be lack of access to a personal computer (PC); however, the SME proposed that this problem could be overcome with open-use labs on the campus. The Context Analysis instrument did not yield much more information than what was previously garnered from the other instruments. This points to a poor fit for the study, or to the instrument needing further modification. Further information about the intended audience was divulged by the Learner Analysis instrument (see Specimen A-1 in Appendix A). This instrument provided more
details about computer usage and the learners' views of online learning. Specimen A-1 in Appendix A shows the full list of questions for the Learner Analysis instrument. From this survey, 90% of the students surveyed were full time while 10% were part time. As far as computer usage was concerned, most of the students (70%) were comfortable using computers as a study aid. Interestingly, although 70% of the students either strongly agreed or agreed that they were comfortable using the computer to do real-time chats or online discussions, 20% strongly disagreed and 10% disagreed that they were comfortable with online chats or discussions. Pertaining to distance learning, the majority of participants (80%) were currently enrolled in a distance learning course. A little more than half of the participants (60%) agreed that distance learning courses were easy compared to traditional instructor-led courses; however, the remaining participants (40%) disagreed. In regard to opinions of the Learning Strategies course itself, all of the participants either agreed (50%) or strongly agreed (50%) that the material covered in the Learning Strategies course would help them improve their grades in other courses. If the course were online, the majority of students (80%) would choose the web-based version rather than the instructor-led version, while a minority (20%) would prefer the instructor-led version. In Section D of the Learner Analysis instrument there was a comment area for participants to write their thoughts about features they would like to see in a web-based version of the Learning Strategies course. Overall, students expressed a desire for more interaction and more examples of test questions.
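As an illustration of the descriptive, frequencies method of analysis applied to Learner Analysis items, the short sketch below computes a percentage distribution for one Likert-type question. The response values are hypothetical stand-ins for the ten actual responses and are not the study's data.

from collections import Counter

# Hypothetical responses to one Likert-type Learner Analysis item (n = 10).
responses = [
    "Strongly Agree", "Strongly Agree", "Agree", "Agree", "Agree",
    "Agree", "Agree", "Disagree", "Disagree", "Strongly Disagree",
]

counts = Counter(responses)
total = len(responses)
for option in ("Strongly Agree", "Agree", "Disagree", "Strongly Disagree"):
    count = counts.get(option, 0)
    print(f"{option}: {count} ({count / total:.0%})")

With only ten respondents, each percentage corresponds directly to a count (for example, 70% is seven students), which is why the results are reported descriptively rather than inferentially.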
From an instructional design perspective, according to Seels and Glasgow (1998), data analysis at the Analysis phase should answer three important questions. These questions, and the responses to them after data analysis, were as follows:
1. What is the problem or need? (Seels & Glasgow, 1998, p. 180): The conversion of the course to a web-based format was initiated by the Director to (a) take advantage of the availability of advanced technology and (b) diversify current program services. The SME stated that a web-based course would (a) provide flexibility in accessing information and content, (b) decrease barriers to distance learning (e.g., help commuter students), and (c) increase the possibility of reaching students on regional campuses. The responses from both the Director and the SME were similar and convergent. From an instructional design perspective this was positive, because it appeared that important stakeholders agreed on the instructional and development approaches.
2. What are the parameters of the problem/need? (Seels & Glasgow, 1998, p. 180): The responses to the questionnaires showed several parameters for the instructional design and approach. First, the SME expressed a preference for an engaging tone: "I expect that the online version should have a FUN approach. If there is audio, maybe we can incorporate something like peer talk and animation. Stay away from scripted type audio." The Audience Analysis and Learner Analysis questionnaires also highlighted a preference for computer-supported learning among the target audience, whose ages ranged between 19 and 25. Of the 10 respondents to the Learner Analysis questionnaire, all either agreed (30%) or strongly agreed (70%) that they were comfortable using the
computer as a study aid. Here, the PI (as Instructional Designer) inferred the desired tone of the module and the level of interactivity (i.e., a high level of interactivity) required. Second, there were financial constraints. Although the Director supported the conversion to a web-based module verbally, she could not support it financially. Here, the PI (as Instructional Designer) had to determine what would be cost effective from the available group of development tools. The PI sought the advice of Programmer 1, who explained that using the development tools Authorware 6.0 and Dreamweaver would meet the requirements of this web-based initiative. Third, in respect to delivery of the web-based module, the SME stated that it had to be easily accessible to the students; the PI (as Instructional Designer) therefore determined that no proprietary applications should be used. Using Authorware 6.0 required a player to be downloaded, and computers in the campus labs did not have this player installed. Another problem with Authorware 6.0 was its lack of popularity among instructional designers and instructional developers. Seeing this as a drawback to the design, the PI (as Instructional Designer) decided that the prototype would be developed using Authorware 6.0, but the actual module would be developed using a different tool, such as Adobe Captivate 3.0. Adobe Captivate 3.0 creates Flash files (i.e., .flv file extension), and most computers sold today come with the Flash player pre-loaded. Most importantly, the open-use labs on the university campus either already had the free player or allowed the download. The PI (as Instructional Designer) decided that the module would be developed so that it could be executed on both Windows and Macintosh systems.
3. What should the content be? (Seels & Glasgow, 1998, p. 180): The Task Analysis and Content Analysis questionnaires delivered the pertinent data to address this question. Here the SME clearly identified the content for the web-based module, noting that some learners want to enhance their learning strategies and learning techniques. The Learning Strategies course is a skill-based class, and in the Content Analysis the SME listed a number of things she did in her class for this particular module:
1. Learners get a sample of an objective test (e.g., a mock test with multiple-choice, true/false, and similar question types).
2. Students take the test.
3. We then engage in a discussion on the level of difficulty of the test.
4. We talk about strategies they used to overcome the difficulty they experienced.
5. Most of the time, the students tend to choose the correct strategy.
6. Sometimes they do not choose the correct strategies. I cover them either way, just to help them understand the strategies better.
7. I teach and point out keywords that they can use.
The PI (as Instructional Designer) used this list as a starting point to create the flowchart of how the content would be presented to the learners. Beyond the three questions that guided the analysis and data reduction in this phase, an instructional design plan (IDP) was developed (see Appendix C). This
document crystallized the design information gathered from all of the Analysis phase instruments. Advocates of IDPs in web-based development believe that they help reduce the time taken to develop the training and provide a guideline for development of the final product. Although the IDP served as such a guideline, it had not been completed while Programmer 1 was developing the prototype. Instead, Programmer 1 was given guidelines from the PI (as Instructional Designer) and the SME pertaining to the flow of the training content and the need for interactions. The PI (as Instructional Designer) and Programmer 1 had several conversations verifying the content as well as several design elements, including feedback and interactivity. Programmer 1 was given basic information in terms of color, font, and type of interaction. However, Programmer 1 was not limited to these guidelines and was free to use, and did use, different design elements provided by the development application. To develop the prototype using Authorware 6.0, Programmer 1 took 40 hours. The goal of this prototype was to develop a sample of the web-based module to demonstrate to the stakeholders. The prototype encapsulated some of the SME's ideas for the flow of data and the inclusion of interactivity (see Figures A-1 and A-2 in Appendix A). To show the SME the prototype, a hyperlink to the prototype was created on the PI's web space. After receiving the link, the SME responded with her review via email to the PI. The following is a summary of her response:
1. The level of interactivity is poor;
2. Feedback to incorrect and correct responses was not what was desired;
3. The look of the prototype was too bland;
4. The feel of the module was unsatisfactory;
5. Overall, the module was ineffective;
6. The content was correct.
Reacting to the SME's obvious disappointment with the prototype, the PI met with the SME informally to gather further information and to assure her that the final product would be redesigned to closely meet her suggestions. The Analysis phase of the ISD process resulted in detailed information gathered regarding the purpose, audience, content, and context of creating the online module. In all, seven instruments were utilized in this phase: one interview and six questionnaires. The first interview was developed as a way to establish a relationship with the SME as well as to gain support for the initiative. The study was further enriched by the information gathered from the Learner Analysis. Learners interpreted the questions based on their own frames of reference (1977). The Learner Analysis tool provided the instructional designer with information about the characteristics of the learners. Knowing this, an instructional designer would be able to design a module for the specific skills, knowledge, and attitudes of a targeted audience (Rothwell & Kazanas, 2004). Overall, the information derived from six of the seven instruments in the Analysis phase resulted in valuable data for the instructional designer. However, before the instructional designer sets out to modify the existing instruments used in this phase, he or she should examine carefully why one would analyze the situation and what one should be analyzing. The feasibility of conducting analysis was something that had to be considered carefully by the instructional designer. In many instances time, cost, and
resources have been limiting factors in carrying out detailed analysis. However, the analysis of the information gathered in this study showed that a needs analysis was necessary in order to proceed with the design and development of a module. An instructional designer could in fact design a module without knowing why a learning gap existed or who the module was intended for, but the usability and effectiveness of that module would be questionable. At the end of the Analysis phase, the research question regarding applying a systematic approach to the development of a web-based module for teaching metacognitive learning strategies was partially addressed. The DBR perspective led the PI to infer that the purpose of conducting this particular phase had been met. The information gained provided an important foundation for further development. As mentioned earlier, some researchers (Rothwell & Kazanas, 2004; Dick et al., 2005) believed that the Analysis phase should provide the instructional designer with an overview of the problem, the reasons why the module is required, the nature of the content, the context surrounding the proposed development of the online module, and a profile of the learner or intended audience. Furthermore, the review of the Analysis phase revealed the importance of establishing a rapport with the SME, who had been a source of valuable information throughout the Analysis phase. Establishing a relationship with an individual like the SME provided support throughout the ISD process. Also, acquiring leadership support at the start of the ISD process had been crucial. Identifying the stakeholders, such as the person or persons in leadership positions (i.e., directors and instructors), and the learners
themselves also helped the PI to understand the learning environment. Regarding the four objectives or deliverables of the study, only one was delivered at the end of the Analysis phase: a list of generalizable and provisional Lessons Learned for the Analysis phase. It was considered provisional because the rest of the ADDIE phases were not yet completed. However, all of the deliverables were met once all phases of ADDIE were completed. The effectiveness of using the ADDIE process to systematically develop a web-based module to teach metacognitive learning strategies to students in higher education was determined at the end of this study. Thus, the deliverable, a list of provisional Lessons Learned, pertains only to the pilot study (i.e., the Analysis phase) of the ISD process.

Prototype Development

One outcome from the Design phase that was included as part of the pilot study was the prototype development of the web-based module. As information was being gathered in the Analysis phase, the PI (as Instructional Designer) used the information to create an instructional design plan (IDP) for the module. Given the performance of development tools, design ideas on paper can be quickly translated into a prototype. The Analysis phase yielded information pertaining to preferred design elements relating to the flow of information within the module, the level of interactivity, and the content of the module. Using this critical information and the design plan created by the PI (as Instructional Designer), Programmer 1 was able to create a prototype of the web-based module. The prototype was developed using the computer program
Authorware version 6.0. Figures A-1 and A-2 in Appendix A display two screen shots of the prototype. The prototype was created to be delivered via a hyperlink (i.e., one that allowed the stakeholders to navigate to the web-based module located on the web server) to some of the key stakeholders in the web-based development initiative for this study, such as the Director of the LEARN program and the SME. An informal interview occurred while the stakeholders were shown the web-based module prototype. The results of this interview, combined with the data extracted from the Module Design Interview, provided feedback for the PI (as Instructional Designer) to refine the design principles and style guide of the IDP. The results of these interviews can be seen in Specimen B-1 in Appendix C. The use of a prototype in this instance was to generate decisions about which particular design elements needed to be retained or discarded. The prototype gave the stakeholders and the instructional designer the opportunity to formalize their thoughts on what design elements were to be considered beneficial to the learning process.

Analysis Phase: Provisional Lessons Learned

Again, the Lessons Learned were considered provisional because the rest of the phases of ADDIE had not yet been completed. Only after the research on all phases had been conducted would a complete Lessons Learned list be justified. The following is the list of Provisional Lessons Learned that was the outcome of the pilot study:
1. The instructional designer had to establish whether or not the development of the web-based module required a detailed analysis. The Analysis phase of this study required a commitment of time, money, and human resources. These elements are
not always available in practice due to deadline dates and marketing commitments.
2. The proposed development did not require the use of many analytical tools. Sometimes a guided interview and a needs assessment provide the necessary information to design and develop the module. The PI learned that the Context Analysis had not really been necessary, because the questions asked in this instrument had already been addressed by similar questions in the Needs Analysis, Content Analysis, and Task Analysis instruments.
3. At the start of the ISD process it had been important to establish relationships with the decision makers and leaders of the initiative.
4. Informal interviews helped to establish relationships between key personnel.
5. Be flexible. The ADDIE process is a systematic process, but that did not imply rigidity.
6. Within the Analysis phase, it was important to limit the number of instruments to only what was necessary, because filling out questionnaires and conducting surveys and interviews disrupted the participants' schedules.
7. There were many valid instruments available for conducting various types of analyses. It was more prudent to use an instrument that had already established validity. In other words, utilizing an instrument previously created by a reputable researcher or resource group provided reliability to the data collected. In addition, it saved time and money because the instrument did not have to be developed. Modifying existing instruments, rather than trying to create and validate new ones, is recommended.
Design Phase Method

In the Design phase of module development, the PI, in the roles of Instructional Designer and Programmer 2, together with the SME, worked to determine what design elements should be included in the module. The Design phase included one interview and one questionnaire. The name of each instrument, its participants, and an explanation of the instruments are as follows:

Design Module Interview. Participants: SME, Programmer 1, PI (as Instructional Designer and Programmer 2). Through detailed research, the PI had a good grasp of what types of questions should be asked in this interview. The development of these questions was aided by the guidelines developed by Seels and Glasgow (1998), Bruning et al. (2004), Swan (2003), and Mehlenbacher (2002). The interview followed a guided format: the PI decided the sequence of the questions as well as which questions to use throughout the course of the interview. Also, the SME and Programmer 1 addressed design issues the PI had about content, hardware, software, intervention style, and timeline (see Specimen B-1 in Appendix B).

Evaluate Design Decisions Questionnaire. Participants: SME, Programmer 1, PI (as Instructional Designer). To develop the questions for this questionnaire, the PI again relied upon the four guidelines listed previously (i.e., Seels & Glasgow, 1998; Bruning et al., 2004; Swan, 2005; Mehlenbacher, 2002). The expert review process explained in a previous section was followed. Before the expert review there were a total of 26 questions.
However, after two rounds of expert review, there were 28 questions in total. As seen in Specimen B-10 in Appendix B, there are three sections in this questionnaire: (a) objectives and assessments, (b) instructional strategy, and (c) delivery system selection and prototyping. Once the data had been gathered, including feedback from the stakeholders' observation of the prototype (e.g., the SME and Director), the PI (as Instructional Designer) revisited the IDP and made changes to the style guide. The procedure for administering and collecting the data was as follows: the questionnaire was placed online using the tool http://survey.acomp.usf.edu, and it was not password protected. The SME and Programmer 1 were each sent an email that contained a hyperlink to the questionnaire. The SME accessed and completed the questionnaire two days after receiving the email; Programmer 1 completed the questionnaire nine days after receiving the email notification. The PI (as Instructional Designer and Programmer 2) completed the questionnaire one day after the other two participants had responded.

DBR Overview: Design Phase

The design principles for web-based development at this phase were developed using the outcome from the Analysis phase, the feedback from the prototype, and data from the Evaluate Design Decisions questionnaire. The design principles were further refined after extracting information from the data received from administering the instruments of the Design phase. Data from the two evaluations, along with the following sources, provided an overview of the Design phase from a researcher's perspective:
1. The logbook
2. Data from the Design phase instruments
3. Literature review sources
The interview and the questionnaire provided details to the PI from the perspectives of the SME, the ID, and the programmers. These perspectives contained concrete guidelines that were used to develop the web-based module. At this stage the PI used the various sources of data to document rationales for design decisions and the models, strategies, and innovations used in developing the module.

Development Phase Methods

At this phase the PI (as Programmer 2) used the IDP and the style guide created and refined in the Design phase. The following instruments were developed and expertly reviewed:

Evaluate Usability of Module. Participant: SME. This questionnaire changed the most during expert review. The expert review process detailed in a previous section was followed. The questionnaire was originally developed with 35 questions in two sections, using the four guidelines (i.e., Seels & Glasgow, 1998; Bruning et al., 2004; Swan, 2005; Mehlenbacher, 2002). However, after the second round of expert review the questionnaire comprised 42 questions in two sections, with four subsections in the second section. As seen in Specimen B-2 in Appendix B, the two sections are (a) Materials Development and (b) Evaluation of Web-based Module. The ID experts thought that more clarification, or more questions, were needed to gather pertinent information to guide development. Following an example provided by
Seels and Glasgow (1998), the PI decided that the second section should be subdivided into (a) Accessibility, (b) Design Elements, (c) Graphics/Animations/Multimedia, and (d) Navigation. The ID experts approved the revisions and did not recommend any further changes. As for the procedure for administering and gathering the data, the questionnaire was placed on the Internet using the tool http://survey.acomp.usf.edu. An email containing a hyperlink to the web-based module and the questionnaire was sent to the SME. The SME was asked to first view the web-based module and then respond to the questionnaire immediately after viewing it. The SME responded one day after being notified that the questionnaire was available online. This questionnaire was not password protected.

Expert Review of Module. Participants: ID Experts. Again, this questionnaire was developed by the PI using the four guidelines (i.e., Seels & Glasgow, 1998; Bruning et al., 2004; Swan, 2005; Mehlenbacher, 2002), and the expert review process was strictly adhered to. For this questionnaire, the completion of the second iteration of expert review resulted in 29 questions and six sections; previously, the questionnaire had 28 questions and five sections. The six sections, as listed in Specimen B-3 in Appendix B, are (a) Accessibility, (b) Design Elements, (c) Graphics/Animations/Multimedia, (d) Navigation, (e) Training Module Content, and (f) Your Opinion Matters. The procedure for administering and gathering the data was as follows: the instrument was placed on the Internet by the PI using the survey tool http://survey.acomp.usf.edu. Passwords were emailed to the ID Experts along with a notification that the questionnaire was available online. Included in this email were two


130 hyperlinks, one to view the web based module and another to the questionnaire. ID Expert A and ID Expert B accessed and completed the questionnaire one week after being notified. Both ID experts were asked first to view the web based module then respond to the questionnaire immediately after viewing the module. Learners: E valuate Usability of Module Participants : Learn ers enrolled in REA 2105, Critical Reading and Writing during Summer 2008 At this j uncture of the Development phase some sections and questions developed for earlier q uestionnaires were re used for this instrument development Additionally the four guid eline s (i.e. Seels & Glasgow, 1998; Bruning et al., 2004; Swan, 2005 ; Mehlenbacher, 2002) were also used as a reference. The expert review process as described earlier was followed. The expert reviewers asked that this instrument be refined to contain seve n sections instead of five sections (see Specimen B 4 in Appendix B) The two iterations of expert review further refined the instrument so that the number of questions increased from 28 to 35. The increase in questions was due to the addition of questions pertaining to the le The ID experts agreed that this could add another dimension to the research. Th e sections as seen Specimen B 4 are as follows: (a) Learner Background, (b ) Accessibility (c) Design Elements, ( d ) Gr aphics/Animations/Multimedia, (e) Navigation, (f ) T raining Module Content, and (g ) Your Opinion Matters As far as administering and gathering the data, t he instrument was administered via the Internet. This instrument was placed on the Internet by t he PI using the survey tool, http://survey.acomp.usf.edu an d it was password protected. The participants were


131 asked to first view the web based module then respond to the questionnaire. The questionnaire was passwo rd protected. After viewing the web based module, the participant s were given the password to access the questionnaire. This was accomplished during one class meeting. Two participants who viewed the web based module could not access the questionnaire. The password for the questionnaire was based on the However, two participants had not yet been issued Blackboard access at the time of the study therefore could not participate i n the questionnaire. M odule Development Questionnaire Participants : PI (as Instructional Designer and P rogrammer 2 ) This questionnaire was developed using the four guidelines (i.e. Seels & Glasgow, 1998; Bruning et al., 2004; Swan, 2005 ; Mehlenbacher, 200 2). Specimen B 11 in Appendix B displays this instrument and a summary of the results The questionnaire was also designed to gather information that would add to the DBR perspective. The expert review process was followed. The Development phase evaluation s helped the programmer to develop and refine the web based module at this jun cture before it was implemented. The reflections of any decisions made by the PI (as Instructional Designer) and PI (as Programmer 2) at this phase were recorded in this question naire. This questionnaire was placed online using the survey tool, http://survey.acomp.usf.edu To the PI this instrument provided somewhat of a dilemma. Since the PI functioned both in the roles of Instructional Designer and Programmer 2 for the study, there was no data to collect from any other participants However, after careful del iberation with other IDs and research experts, the PI decided to do two separate responses, one as ID and the other as the


132 programm er. This allowed the PI to give structure to the thought process and design decisions made by her at this juncture of the study. DBR Overview : Development Phase Again, the outcome of the development phase evaluation inform ed the study of the effectiveness of the module and include d information gathered from the: 1. Logbook 2. Results from the D evelopment phase 3. Literature review sources The Development phase evaluations helped the programmer to develop and refine the web based module at this juncture before it wa s implemented. The development of the module can be an intensive time for programmers and instructional designers Formative evaluations were developed to clarify whether or not these evaluations could aid the programmer and the instructional designer. The Expert Review Questionnaire and the Learners: Evaluate Usability of Module questionnaire (see Tables B 3 and B 4 respectively) instigated an iteration of the evaluate cycle in the Development phase before being implemented. Im plementation Phase Method The module was implemented on the Web by PI (as Programmer 2) T he executable files of the module were placed on a server owned by the university in which the research was conducted. PI (as Programmer 2) used a feature in Adobe Captivate 3.0 to generate an executable pr ogram that was Flash compatible. This decision was based on the fact that flash files are relatively smaller in size than other formats and can run on


133 most platforms today. Once the files were copied, a hyperlink to the program s was placed on a simple webpage designed for this study. DBR Overview: Implementation Phase At this phase of the ISD process, the PI simply used the application to generate an executable program. The process was recorded in the logbook. The instruments used to gain a DBR perspective were: 1. Logbook 2. Results from the implementation phase 3. Literature review sources Evaluation Phase Methods A summative evaluation of the module was performed at this phase util izing the Summative Usability Evaluation quest ionnai re (see Tables B 6 and B 8 in Appendix B ) Summative Usability Evaluation Participants : Learner s enrolled in two sec tions of The University Experience course in Summer 2008 I D Experts This questionnaire was closely based on the Learners: Evaluate Usabil ity of Module questionnaire from the Development phase The expert review process was followed. The Summative Usability Evaluation questionnaire before expert review consisted of five sections and 28 questions. After the second and final round of expert r eview, the questionnaire consisted of seven sections and 35 questions. (a ) Learner Background, ( b ) Accessibility (c) Design Elements, (d ) Gr aphics/Animations/Multimedia, (e) Navigation,


134 (f ) T raining Module Content, and (g ) Your Opinion Matters. This quest ionnaire, like the others were placed on the Internet using the http://survey.acomp.usf.edu by the PI. To administer and gather data, the following procedure was followed: For the first evalua te in the Evaluation Phase participants were enrolled in The University Experience course At the beginning of the class, they were asked to view the web based module then immediately after respond to the questionnaire No passwords were required to access the questionnaire. A count of participants in the classroom and a count of responses to the questionnaire verified that no one took the questionnaire more than once. T he time taken for them to view the web based module was also recorded This que stionnaire was administered to the ID experts via email as well The email contained two hyperlinks, one to the refined web based module and the other to the online questionnaire. No passwords were required. The experts responded to the questionnaire one w eek after being notified. After all responses were collected t he PI analyzed several ins truments to determine refinement changes (see Specimen B 7 in Appendix B) Using this list of refinements as a guideline, PI (as Programmer 2) determined which changes were feasible based on software application capability, content availability scope of the project and time. Within one week, the PI (as Programmer 2) made the refinements changes to the web based module For the second iteration evaluate the participants were recruited from another section of The University Experience course Again, the procedure to administer and gather data was the same as previously mentioned. A t the start of a class session, the participants were asked to view the now refined web based module online then to immediately respond to the Summative Usability Evaluation


135 questionnaire (see Specimen B 8 in Appendix B) The PI had created a copy of the original questionnaire and placed it on the Internet for thi s second group to access so data collected would be in a separate database. The ID experts also re sponded to questionnaire in this iteration. After all data was collected, the PI reviewed only the data received from this questionnaire to extrapolate any su ggestions for refinements of the web based module. A second list of refinement changes was created. At this juncture the study was closed. DBR Overview: Evaluation Phase Figure 8 show s how four functions of evaluation : review needs analysis formative and effectiveness were employed throughout the phases of the ISD process. Reeve s and Hedberg (2006) recommended that the last two functions of evaluation, impact and maintenance be conducted after a module has been in use for more than a year. This timeframe was not feasible for the present study therefore these two functions of evaluation were not included. Despite this exclusion, the function evaluations included gave a full representation of a typical systematic approach to a web based module development pr ocess A summary review of all data and design principles of each phase was analyzed to determine the listed objectives of the study. A major objective was to provide a list of generalizable Lessons Learned. Additionally, a report on the effectiveness of t he specific instructional strategies used and an analysis of quantitative, qualitative and descriptive outcome measures of learning among field test participants was two more objectives that were clearly represented. An important objective of the study was also to create a web based module with a known valid ity and effective ness status using a systematic approach.


136 The next section explains how validity and effectiveness status of the module was arrived at in the study. Ev aluation Phase: Evaluation Goal On e of the deliverables of the study was to produce a web based module that was considered valid and effective Again, the guidelines developed by Seels and Glasgow (1998) B runing et al. (2004), Swan (2003) and Mehlenbacher (2002) for instructional designer s, developers and educators act ed as a framework for assessing the web based module These guidelines also provided construct validity for instrument development The information collected in the formative stage guide d the refinement process for web based development. Each of t he formative and summative instruments of the Analysis, Design, Development, Implementation and Evaluation phases of the ADDIE process (see Table 13 ) provide d enough information so that the validity and effectiveness of the module we re determined At the Evaluation phase the results of the questionnaire, Summa tive Usability Evaluation were influential in deriving the validity and effectiveness status of the web based module. As stated previously, t he participants for the Summative Usa bility Evaluation questionnaire (see Table 1 4 displayed earlier in this chapter ) were l earners enrolled in undergraduate courses that had metacognitive learning strategies components and two ID expert s. An analysis of the outcomes provide d a clear picture for interpretation of whether or not the module was considered valid and effective.
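For illustration, the sketch below shows one hypothetical way the Likert-type responses gathered by these instruments could be tallied to judge whether the module received a generally positive overview. The response data, scale labels, function names, and the two-thirds decision threshold are assumptions introduced here for clarity; they are not the study's actual data or decision rule.

```python
from collections import Counter

# Hypothetical Likert responses (one list per questionnaire statement).
# The values are invented for illustration; the actual instruments and
# response summaries appear in Appendix B.
SCALE = ["strongly agree", "agree", "disagree", "strongly disagree"]
responses = {
    "The design is simple and uncluttered": ["strongly agree"] * 6 + ["agree"],
    "Navigation buttons are clearly marked": ["strongly agree"] * 4 + ["agree"] * 2 + ["disagree"],
}

def summarize(item_responses):
    """Return the percentage of respondents in each response category."""
    counts = Counter(item_responses)
    n = len(item_responses)
    return {label: round(100 * counts[label] / n) for label in SCALE}

def generally_positive(item_responses, threshold=0.67):
    """Assumed decision rule: a statement reads positively when at least
    `threshold` of respondents strongly agree or agree with it."""
    counts = Counter(item_responses)
    positive = counts["strongly agree"] + counts["agree"]
    return positive / len(item_responses) >= threshold

if __name__ == "__main__":
    for statement, answers in responses.items():
        verdict = "positive" if generally_positive(answers) else "flag for refinement"
        print(statement, summarize(answers), verdict)
```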


137 Summary How the research question was addressed and how the research objectives were met has been discussed in this chapter. Also included here was a description of the r esearch design and an explanation of the resear ch methods utilized in the present study A framework of the study which revolved around the ADDIE process, Seeto and Herington DBR research model as well as evaluat ion functions were discussed. direction to the present study and was modified to a small extent Furthermore, this chapter included a detailed description of the pilot study which comprised of the A nalysis p hase of the ISD process and its outcomes. Also in this discourse was the description of the prototype, one outcome of the Design phase and how it helped to define and refine design elements for the web based module. Consequently, conducting the Analysis phase pro vided guidance in regards to the method utilized in the next four phases of ADDIE. Moreover, the types of m easures specifically developed for both the descriptive and quantitative measures for each phase of the ISD process have also been described The res ults and discussion of the study follows in Chapters four and five.


Chapter Four

Results

The web-based module was interpreted as valid and effective overall when the respondents to the formative and summative evaluations provided a generally positive overview of the module. The end of the study was marked by the completion of the second design-evaluate-refine iteration in the Evaluation phase of ADDIE. Since the pilot study and its outcomes, and the methods for the rest of the ADDIE phases, were discussed in Chapter Three, this chapter presents the results of the remaining phases of ADDIE. The summary of data that was analyzed and refined is presented in the following manner: (1) Design phase results, (2) Development phase results, (3) Implementation phase results, (4) Evaluation phase results, and (5) DBR results and perspective for each phase of ADDIE. The ADDIE process provided an overall guideline for data collection.

Design Phase Results

At this phase, the IDP, which was completed by the end of the prototype development, was revisited and design changes were made to incorporate the SME's suggestions. The PI (as Instructional Designer) decided to make several changes to the design. She decided to use Adobe Captivate 3.0 as the application to develop the module for the final product. That decision was based upon two things: (a) the level of interactivity that was required for the final product and (b) the availability of Adobe


Captivate 3.0. Within one week, design revisions were made to the IDP. A summary of the revisions made to the IDP follows (an illustrative sketch of these decisions appears after the list):

1. Style Guide:
   a. Create a template to provide consistency;
   b. Font style: Arial; font size: ranges between 14pt and 16pt; font color: black;
   c. Place feedback in the same location for each question;
   d. Place navigation buttons in the same location on each screen/slide;
2. Content Flow:
   a. Introduction;
   b. First section: 10 questions, each question followed by quick feedback (e.g., correct/incorrect); present one question at a time to the learner;
   c. Section break: the learner can see the score, then move on to the final section;
3. Final Section: each question and its correct answer should be fully explained;
4. Instructional Strategy:
   a. Introduction: grab the learner's attention; use a story or set a scene; short animation (use audio);
   b. Explain the sections and what to expect (use audio);
   c. Present one question at a time;
   d. Display the score to the learner at the end of the first section;
   e. Second section: use audio to explain the correct answer for each question; use Adobe Captivate 3.0 built-in interactive techniques.
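For illustration, the sketch below captures these style-guide and content-flow decisions as a small, machine-readable specification. It is a hypothetical rendering only; the module itself was authored in Adobe Captivate 3.0, and every name and value shown here is an assumption drawn from the list above rather than an export from the IDP.

```python
# Hypothetical, machine-readable summary of the IDP style guide and content
# flow described above. Names and values are illustrative assumptions, not
# an export from the actual Instructional Design Plan or the Captivate file.

STYLE_GUIDE = {
    "template": "single consistent slide template",
    "font_family": "Arial",
    "font_size_pt": (14, 16),      # allowed range in points
    "font_color": "black",
    "feedback_position": "same location on every question slide",
    "navigation_position": "same location on every slide",
}

CONTENT_FLOW = [
    {"section": "Introduction", "audio": True, "animation": True},
    {"section": "First Section", "questions": 10,
     "feedback": "quick correct/incorrect after each question",
     "presentation": "one question at a time"},
    {"section": "Section Break", "show_score": True},
    {"section": "Final Section", "audio": True,
     "purpose": "fully explain each question and its correct answer"},
]

def check_slide(slide: dict) -> list[str]:
    """Return style-guide violations for a hypothetical slide specification."""
    problems = []
    lo, hi = STYLE_GUIDE["font_size_pt"]
    if not lo <= slide.get("font_size_pt", lo) <= hi:
        problems.append("font size outside the 14-16 pt range")
    if slide.get("font_family", STYLE_GUIDE["font_family"]) != "Arial":
        problems.append("font family is not Arial")
    return problems

if __name__ == "__main__":
    # Example: a slide specification that breaks the font-size rule.
    print(check_slide({"font_size_pt": 20, "font_family": "Arial"}))
```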


Design Phase: Expert Review of Analysis Phase Instruments

Beginning with the Design phase, two ID expert reviewers were recruited to review the instruments developed for the rest of the phases. At this phase an interview and a questionnaire were utilized to collect data. Of the two instruments, only the questionnaire was expert reviewed. The expert review process included grammar, spelling, and tense changes to some questions. The reviewers also proposed clearer definitions of terms; for example, for the statement "Interaction interfaces and interaction design were established in meetings at this phase," one of the ID experts asked for further clarification. The statement was later changed to "Interaction interfaces and interaction design elements were established in meetings at this phase (i.e., the Design phase of ADDIE)," as it was used in some of the questions. After the final iteration, this terminology was applied consistently throughout the instrument.

Design Phase: Analysis of Data

At this phase, an open-ended interview was planned for the SME, the PI (as Instructional Designer), and Programmer 1. The interview was conducted before the IDP was completely developed and before the prototype was developed. Specimen B-1 provides a summary of the information derived from this interview. The purpose of this interview was to: (a) introduce the SME to Programmer 1, (b) confirm the design approach, (c) confirm the instructional strategy approach, (d) determine technical strategies, and (e) learn of any present limitations and foreseeable problems. A combination of this information and design information from the Analysis phase led to the development of the prototype and to the refinement of the IDP. The


intention at this phase was to confirm learning objectives, to identify assessments and instructional strategies, and to design the delivery system. Prototype development was also included in this phase. The prototype and its outcomes have already been discussed in the pilot study. Following the example set in the Analysis phase of ADDIE, data reduction was again facilitated by answering three questions put forward by Seels and Glasgow (1998). The questions and their responses were as follows:

1. What should be assessed and how? (Seels & Glasgow, 1998, p. 180). This particular module was unique in the sense that it was an assessment of the learner's metacognitive ability to recognize the best strategies for answering questions on objective tests using a multiple-choice format. Since the module itself was comprised of questions and is an assessment, the point of the web-based module was to help a learner understand how to make the right choices on an objective (i.e., multiple-choice type) test by identifying learning strategies. This information was made clear in the interview, where the SME stated that the Objective Test Taking Strategies module would be the best one to start developing first.

2. How should instruction be organized? (Seels & Glasgow, 1998, p. 180) The interview provided clear details on how the SME visualized the web-based module. She wanted a certain number of questions (e.g., 10 to 15), and she wanted the questions presented first and then followed by feedback for each choice. The feedback needed to address each choice. The SME, Programmer 1, and the PI (as Instructional Designer) all agreed that the module should not be very long; in fact, a length of twenty minutes was considered ideal.


3. What will the instruction look and sound like? (Seels & Glasgow, 1998, p. 180) The prototype was a major factor in deciding what the instruction should look and sound like. In fact, the negative responses to the prototype led to developing a web-based module that was more closely aligned with what the stakeholders, for example the Director and the SME, desired. The SME provided specific feedback that was presented in the description of the pilot study earlier in this chapter.

The majority of the Design phase was completed in two non-consecutive weeks. However, the Design phase overlapped with the Development and Evaluation phases of ADDIE because of the inclusion of the iterative design-evaluate-refine cycles.

Development Phase Results

The Development phase was completed in 10 non-consecutive weeks. Adobe Captivate 3.0 was used to develop the module. At this phase the PI (as Programmer 2) used the IDP to guide the development. However, during development some changes were made because the full capability of the application provided more interactions that were not fully explored in the IDP. These opportunities provided a higher level of interaction and were not ignored, since they would help to align the web-based module more closely with the requirements of the SME and the Director. Figures 10 through 14 display several screen shots of the web-based module. Some instructional strategies used in this module development are: (a) grabbing the learner's attention, as seen in Figure 10; (b) immediate feedback given to the learners, as seen in Figure 11; (c) sharing the overall results of the quiz, as seen in Figure 12; (d) audio explanation of correct choices, as seen in Figure 13; and (e) inclusion of learner interaction to encourage learners to become active rather than passive in their learning. A minimal illustrative sketch of this question-feedback-score flow follows.
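The following is a hypothetical rendering of the flow just described, written in plain Python purely for clarity. The actual module was built with Adobe Captivate 3.0's built-in interactions; the placeholder questions, scoring rule, and function names are illustrative assumptions only.

```python
# Hypothetical sketch of the module's flow: a first section of multiple-choice
# questions with immediate correct/incorrect feedback, a score shown at the
# section break, and a final section that explains each correct answer.
# This is not the Captivate implementation; the questions are invented.

QUESTIONS = [
    {"prompt": "Sample question 1 (placeholder)",
     "choices": ["A", "B", "C", "D"], "answer": "B",
     "explanation": "Placeholder explanation of why B is correct."},
    {"prompt": "Sample question 2 (placeholder)",
     "choices": ["A", "B", "C", "D"], "answer": "D",
     "explanation": "Placeholder explanation of why D is correct."},
]

def run_module(get_choice):
    """get_choice(question) returns the learner's selected choice."""
    score = 0
    responses = []
    # First section: one question at a time, quick correct/incorrect feedback.
    for q in QUESTIONS:
        choice = get_choice(q)
        correct = choice == q["answer"]
        score += correct
        responses.append((q, choice))
        print("Correct!" if correct else "Incorrect.")
    # Section break: the learner sees the score before the final section.
    print(f"Score: {score}/{len(QUESTIONS)}")
    # Final section: each question and its correct answer fully explained.
    for q, choice in responses:
        print(f"{q['prompt']}\n  Your answer: {choice}; correct answer: {q['answer']}")
        print(f"  {q['explanation']}")
    return score

if __name__ == "__main__":
    # Simulated learner who always picks choice "B".
    run_module(lambda q: "B")
```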


Figure 10. Screen shot 1 of the web-based module.

Figure 11. Screen shot 2 of the web-based module.


Figure 12. Screen shot 3 of the web-based module.

Figure 13. Screen shot 4 of the web-based module.


Figure 14. Screen shot 5 of the web-based module.

To view the module in its entirety on the Web, please refer to http://www.coedu.usf.edu/it/DissApps/Singh/. The following are the changes made to the design during development:
1. Animation: Animation was used in the introductory slides as well as in the feedback and conclusion sections of the module.
2. Text animations: Text animations were used in the feedback section to highlight key words.
3. Audio: The introduction and feedback all used short audio recordings.
4. Graphics: Pictures, arrows, and highlight boxes were used in the feedback portion of the module.
5. Input boxes: Input boxes were used on the basis that they provided an opportunity to ask the learner to participate and therefore increased the level of interactivity.


6. Template: A simple template was used. The IDP (see Appendix C) was still useful, as it provided information on other style elements such as content flow, font (i.e., size and color), slide background colors, and hardware environment information.

Development Phase: Expert Review of Development Phase Instruments

The ID experts reviewed four questionnaires for the Development phase: Evaluate Usability of Module, Expert Review of Module, Learners: Evaluate Usability of Module, and the Module Development Questionnaire. Similar to the previous phase, the ID experts received the original instruments via email and responded after one week. The feedback included spelling and grammar changes. However, the most important changes were made to the Evaluate Usability of Module questionnaire. Changes in this instrument affected the other instruments in this phase and in the Evaluation phase of ADDIE. After the first review of all instruments, the PI divided the section Evaluation of a Web-based Module into four sub-sections: Accessibility, Design Elements, Graphics/Animations/Multimedia, and Navigation. The experts agreed to this further delineation and believed that it provided clarity to the instrument.

Development Phase: Analysis of Data

At this phase, formative evaluations provided data on how the web-based module should be developed and what refinements were required before it was implemented. As part of the design-evaluate-refine cycle, once the web-based module was developed, the formative evaluations were conducted. The purpose was to gather data to develop and


147 refine the module before implementation. Again, data reduction was guided by See ls and Glasgow (1998 ). The questions asked at this phase were: 1. What should be produced? (Seels & Glasgow, 1998, p. 180) This information was already derived from the Analysis and Design phases. All content and design information was already collected and it was clearly outlined to the PI (as Programmer 2) what needed to be developed. It should be noted here that feedback received from the SME and Director after they viewed the prototype aided in defining and clarifying what design elements were and were no t acceptable. 2. What revisions were needed? (Seels & Glasgow, 1998, p. 180) Specimens B 2 B 3 and B 4 provide d some direction as to what revisions were needed (see Appendix B) Specimen B 2 shows the summary of results from the Evaluate Usability of Module questionnaire In addition, t he module was evaluated in the Development phase by the two ID experts and a group of learners using the Expert Review of the Module (see Specimen B 3) and the Learners: Evaluate Usabil i ty of the Modu le (see Specimen B 4) questionnaires respectively. To analyze the from Specimen B 3 the researcher e xamined data from the tw o sections of the questionnaire the Materials Development section and the Evaluation of the Web Based Module secti on. From the results, t he SME indicated that the content of the web based module was correct, reading level was appropriate and content flow was what she had recommended. The SME had a clear response to question 12 which ask ed what changes were required wh en the content from a traditional format to a web based The change required the use of a theme and animations to


keep students engaged and interested. It also required a narrator to provide explanations where they were needed. In regard to data collected from the second section of the questionnaire, the SME either strongly agreed or agreed that the module was accessible using the browser on her computer and that all links in the module worked. The SME chose to disagree with the statement that the module executed without technical delay. Moving on to design elements such as good use of color, simple design, good directions for learners to follow, consistency in the appearance of layout, feedback and error messages, and the engaging tone of the module, the SME consistently either strongly agreed or agreed that these elements were acceptable. Similar positive responses were also gathered from the SME when she considered the graphics, animation, multimedia, and navigation aspects of the module. When the SME was asked to state what changes she would recommend, she highlighted a number of things, such as: (a) the text explanation in question 6, which took too long to clear; (b) in question 7, when she moved her cursor the screen disappeared; (c) in question 8, the narration stopped if she moved off a particular area on the screen; and (d) a comment on question 9 concerning its value. These responses identified the items of the web-based module that needed changes, and most of her suggestions were taken into account. Regarding her comment on question 9, the researcher disagreed, based on the premise of learner interaction, which is to involve the learner and help them become active participants in the learning process. Moving on to Specimen B-3, a summary of results from the Expert Review of Module questionnaire is displayed there. The two ID experts reviewed the module first and


149 then responded to the questio nnaire The ID experts were asked to share their opinions of the web based module in regards to accessibility, design elements, graphics, animation, multimedia, navigation and content. From th eir perspective they generally strongly agreed or agreed that t he module was acceptable in the areas of accessibility, design elements, graphics, animations, multimedia, navigation and training module content. However, one expert did disagree and found that the ends of the sections within the module were not clearly d elineated. Navigation appeared to be an issue for one expert as well, for example although there was 100% agreement that navigation was consistent, one expert found that she could neither navigate to the beginning of the module easily nor did she think tha t the navigation buttons were clearly marked. Also, one expert did experience problems downloading and viewing the module and that was reflected in her I did not experience any technical delays while going through this trai ning module As noted previously, t he SME had a similar response because she also experienced a technical delay when trying to view the module on her computer (see Specimen B 2) In regards to content, there was 100% agreement by the two experts that the examples of questions in the module made learning the concepts easier. Similarly, the SME strongly agreed to a similar question that was posed to her. There was also 100% agreement by the experts that the feedback given to the learners will help the learn ers understand the concepts of the lesson. When if the sequencing of the information made it easy for the learners to learn, 50% strongly agreed and 50 % agreed. Comparable results were found when the experts were asked whether the informati on was relevant to the learner.
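The formative responses described in this section fed the refinement lists reported in Appendix B. As an illustration of how such a list could be assembled, the sketch below flags any statement that drew a disagree or strongly disagree rating, together with the reviewer's open-ended comment. The item texts, comments, and field names are invented for this example and do not reproduce the study's data.

```python
from dataclasses import dataclass

# Hypothetical structures for collecting candidate refinements from the SME,
# ID expert, and learner questionnaires; all entries below are invented.
@dataclass
class Response:
    reviewer: str          # e.g., "SME", "ID Expert A", "Learner 3"
    item: str              # questionnaire statement
    rating: str            # Likert category
    comment: str = ""      # optional open-ended remark

@dataclass
class Refinement:
    source: str
    description: str
    addressed: bool = False

def extract_refinements(responses):
    """Flag every disagree/strongly disagree rating as a candidate refinement."""
    negative = {"disagree", "strongly disagree"}
    return [Refinement(r.reviewer, r.comment or r.item)
            for r in responses if r.rating in negative]

if __name__ == "__main__":
    sample = [
        Response("SME", "The module executed without technical delay",
                 "disagree", "Module was slow to load on my office computer"),
        Response("ID Expert B", "The ends of sections are clearly marked", "disagree"),
        Response("Learner 5", "The design is simple and uncluttered", "strongly agree"),
    ]
    for ref in extract_refinements(sample):
        print(f"[{ref.source}] {ref.description} (addressed: {ref.addressed})")
```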


150 Like the SME, the two experts like d the simplicity of the design and thought that the module was engaging. As one expert wrote was very engaging. I enjoyed the motif of going on a jungle mission. The audio narration and graphics helpe d to carry this On the other hand, t o li st one thing they did not like, one expert mentioned that for future changes perhaps hyperlinks should be created to give learners access to resources abou t test taking strategies and allow the learner to download these resources. Next, Specimen B 4 shows the summary of results from the Learners: Evaluate Us ability of Module questionnaire, the responses of a group of learners ( n =7) All learners were enrolle d in the Reading Course It was comprised of 71% female and 29 % male Juniors dominated the class with 71%, while 14% were of senior standing and 14% chose The a ges ranged between 19 and 29. It was interesting to note that most (71%) learners pr eferred to attend traditional ( i.e. face to fa ce) courses. Again, here, the PI found that most learners either strongly agreed or agreed in areas of accessibility, design elements, graphics, animations, multimedia, navigation and training module content. As far as accessibility was concerned, according to the learners, all of them strongly agreed that the module executed on their computers without problems. Additionally, 100% of the participants also strongly agreed that all links worked within their brows er. However, when it came to technical delays, t he PI discovered that one learner did experience a technical delay when he/she tried to view the module. When considering some of the design elements, 86% strongly agreed and 14% agreed that the design was simple and uncluttered. A similar percentage breakdown was found when


151 participants stated that they either strongly agreed or agreed that the directions given to the learner was easy to understand. In t he statement s concerning the start and end of the sec tions within the module, two learners (14% each) disagreed and strongly disagreed respectively about the sections being marked with in the module. Like the ID experts and SME, the participants also indicated that the tone of the module was engaging, that is 57% strongly agreed and 43 % agreed with the statement. Most of the learners participants, 29% strongly agreed and 43% agreed that the graphics used in the module helped to enhance their learning. One participant disagreed and one chose not to respond to that statement. Participants appeared to react positively to the audio, text animations and interactions. For instance, 86% strongly agreed and 14% agreed with the statement that audio provided useful information that enhanced their learning. Participant s, that is 57% strongly agreed and 29% agreed that the text animations helped them to focus on what they should be learning but one participant disagreed. interactions make the trainin When considering navigation within the module, in terms of navigation buttons being clearly marked, 57% of participants strongly agreed and 29% agreed with that statement. Again, one participant disagreed. A parallel distribution of respon ses was received when participants were asked about the availability of tracking information to track their progress within the module. As far as the ease of navigating to various parts of the module, 43% of participants strongly agreed and 57% agreed that it was easy. Regarding the content of the module, similar to the ID experts and the SME responses, d ata from learner participants s ignified a positive outlook. Particularly, 43% and 57% of


learner participants strongly agreed and agreed, respectively, that the information in the module was useful. Also, the feedback given in the module was placed to help the students learn, and it appeared to be a positive aspect of the module, as 71% of participants strongly agreed and 29% agreed with that statement. There were a total of 33 statements in this questionnaire and two additional open-ended questions. In analyzing the overall learner responses, one participant's open-ended answers stood out: when asked what he or she liked about the module, the participant instead stated something he or she would change about the web-based module. In reflecting on this, the PI concluded that either the module was not to this participant's liking or it did not meet his or her particular design and content preferences. The PI did investigate all of the negative responses received. Although all responses were carefully considered for refinement purposes, especially the negative responses, the PI considered that the majority of responses generated a positive view of the module. Overall, when asked to list one thing they liked about the module, the learners generally thought it was informative and that it gave them relevant information. As expressed by one learner: "... questions to better help the students learn. I also like how they showed key words to look ..." The learners also noted aspects of the module that they did not like. A full list of the refinements derived from the formative review at the Development phase, before moving on to the Implementation phase, can be seen in Specimen B-5 in Appendix B. Please note that the first four refinement suggestions were derived from an informal meeting with the SME and the Director of the LEARN program. The rest of the refinement suggestions in Specimen B-5 were derived from the responses to the Development phase instruments. Also, the average time it took the learners to complete the module was 9.1 minutes. Of the twenty-six refinements listed in Specimen B-5, seventeen were addressed and nine were not addressed, for two reasons: they were either a personal stance or opinion of the participant with no relevance to the design (e.g., Refinement nos. 9, 10, 15, 23, 24, 26), or they had already been addressed (e.g., Refinement nos. 6, 12, 25).

Implementation Phase Results

This phase was small in scope. Referring to Seels and Glasgow (1998), the question to be addressed was:

1. What preparation is needed? (Seels & Glasgow, 1998, p. 180) This phase was completed in one week, on non-consecutive days. The length of time included updating the module and copying the files to the web server after each of the refinement iterations. It was simply a matter of the PI (as Programmer 2) copying the files to the server. A simple Web page was developed, and a hyperlink was added to give access to participants of this study.
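Because implementation amounted to copying the published Flash output to a university web server and linking it from a simple page, the sketch below shows one hypothetical way that step could be scripted. The directory names, file names, and page text are assumptions; in the study the files were produced by Adobe Captivate 3.0's publish feature and copied to the server by the PI.

```python
import shutil
from pathlib import Path

# Hypothetical paths: Captivate's published output on the local machine and
# the web server directory it is copied to. Neither path reflects the actual
# server or folder names used in the study.
PUBLISHED_OUTPUT = Path("captivate_publish/objective_test_strategies")
SERVER_ROOT = Path("/var/www/study_site/module")

LAUNCH_PAGE = """<html>
  <head><title>Objective Test Taking Strategies Module</title></head>
  <body>
    <p><a href="module/objective_test_strategies.htm">
       Click here to start the web-based module</a></p>
  </body>
</html>
"""

def deploy():
    """Copy the published module to the server and write a simple launch page."""
    SERVER_ROOT.parent.mkdir(parents=True, exist_ok=True)
    if SERVER_ROOT.exists():
        shutil.rmtree(SERVER_ROOT)          # replace the previous iteration
    shutil.copytree(PUBLISHED_OUTPUT, SERVER_ROOT)
    (SERVER_ROOT.parent / "index.html").write_text(LAUNCH_PAGE)

if __name__ == "__main__":
    deploy()
```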


154 Evaluation Phase Results The Evaluation phase was completed in two non consecutive weeks. Here the evaluations w ere considered summative evaluation s for the study In this phase, t wo iterations of the evaluate cycle were conducted. At the end of the second iteration, the study was closed. Evaluation Phase: Expert Review of Evaluation Pha se Instruments The Learn ers: Evaluate Usability of Module provided the premise on which the Summative Usability Evaluation instrument was based upon. The ID experts received the instrument via email and gave their feedback after one week. After the re comm ended changes were completed, the updated instrument was sent via email for the second and final review of the instrument. The Summative Usability Evaluation comprised of seven sections: (a) Learner Background, (b ) Access ibility, (c) Design Elements, (d ) G raphics/Animation/Multimedia, (e) Navigation, (f ) T raining Module Content, and (g ) Your Opinion Matters. The ID experts pointed out spelling and grammar errors. Since they were already familiar with this organization, they did not request further changes. Evaluation Phase: Analysis of Data evaluate phase. The summative evaluation instrument, called the Summative Usability Evaluation, was first administered to a group of learners enro lled in a University Experience course as well as to the ID experts. Similarly, a t the second cycle evaluate administered to a different group of learners enrolled in a different section of the University Experience course as well as to the two ID experts Overall, t he average time the learners took to view the module in the first and second


iterations was 9.61 minutes and 9.85 minutes, respectively. In contrast, according to information the SME provided, it takes approximately 60 minutes to cover the same material in a traditional class. This indicates a considerable time saving for students using the web-based module: roughly 50 minutes per learner, or more than an 80% reduction. A full summary of responses is displayed in Specimen B-6 in Appendix B. Additionally, a list of refinement changes was derived from their responses and is shown in Specimen B-7. Again, data reduction was guided by Seels and Glasgow's (1998) questions:

1. Are the objectives achieved? (Seels & Glasgow, 1998, p. 180) The majority of the objectives pertaining to the requirements first listed by the SME, the PI (as Instructional Designer), and Programmer 1 were met, as seen in Table 15. Table 15 shows the list of objectives derived from the Design Module Discussion and whether the objectives were met. As can be seen, two objectives could not be met because they were no longer applicable after the module was developed: (a) a hyperlink from the course website, and (b) using Authorware 6.0 to create the web-based module. The majority of the remaining objectives were met through the design-evaluate-refine cycles. The Summative Usability Evaluation questionnaire yielded a number of refinements that needed to be addressed in order to meet the objectives of the development of the module. A summary of the data is displayed in Specimens B-6 and B-8 in Appendix B, respectively. As seen in Specimen B-6, data gathered from the questionnaire (n = 15) yielded information from which five refinements to the web-based module were identified. These participants comprised 33% male and 67% female learners, and 87% of them were freshmen at college. Of all the participants, if given a choice between traditional and


Table 15
Objectives Derived from the Design Module Discussion and Whether Each Was Met

Design information (source: SME)
1. No test bank is required for the module; preferably, a generic test of about 10 to 15 questions should be developed. Met? Yes: 10 questions were used.
2. The module should last no more than 15 to 20 minutes (no more than half an hour online). Met? Yes: the learners' overall average time to view the module was 9.85 minutes.
3. The module should have information on how to prepare for a test, should present the questions, ask the students to answer each question, and highlight different parts of the questions. Met? Yes: during the refinement cycles, slides were added to inform learners on how to prepare for the test; students were first asked all questions, they were scored, and then feedback on each question was shared.
4. There are about 6 to 8 strategies per module. Met? Yes: there were 8 strategies.
5. The first module to be developed should be Objective Test Taking Strategies. Met? Yes.
6. It may be necessary to store answers and score the person so that they can get immediate feedback. Met? Yes: the application had a built-in mechanism to track answers, score, and give feedback.
7. The module should contain animation; it should not be boring. Met? Yes: there is animation, and according to comments most participants found the module engaging and interesting.
8. Audience: all high school graduates. Met? N/A.

Delivery information (source: Programmer 1 and the PI as Instructional Designer)
9. Should it be web based (as opposed to Internet)? Web based was decided. Met? Yes: web based.
10. Multimedia containing audio as well as text. Met? Yes: there is audio and text animation.
11. Broadband. Met? Yes.
12. Link from the SVC site. Met? No: the site was not yet available.
13. Authorware 6.0 was possibly the best solution; however, everyone was concerned about scalability, compatibility, and flexibility. Met? No: Captivate 3.0 (student version) was used to develop the module.
14. The module should be delivered via the Web and support Dreamweaver/Flash (source: SME). Met? Yes: it is web based and uses Flash Player 9.0.


157 web based courses, 73% would prefer traditional and 27% would prefer web based course. When asked to respond to various statements concerning accessibility, 80% strongly agreed and 13% agre ed that the modules was able to run on their computer without any problems. However one participant disagreed. The PI discovered that this sign elements of the web based module, all of the participants, that is 80% strongly agreed and 20% agreed that the module was simple and uncluttered, the directions were easy to follow, the start and end of each section were clearly understood and the fon ts and colors used promoted legibility within the module. However, when attention was drawn to layout consistency of feedback messages, help messages and error messages, one participant consistently chose to disagree with these statements. Although, it sho uld be noted that the majority of participants (93%) responded positively to these same design elements by strongly agreeing or agreeing with the statements. Within the group of statements regarding graphics, animation and multimedia, analysis of the data showed that 60% strongly agreed and 40 % agreed respectively that the layout of graphics was consistent and the various text animations used in the module allowed them to focus on what needed to be learned. Furthermore, although 93% of the participants ei ther strongly agreed or agreed that the graphics used helped to enhance their learning, there was one participant who strongly disagreed with this statement. The indica tion that he/she experienced any technical difficulties with the module. According to data results, 67% of participants strongly agreed and 33% agreed that navigation


buttons were clearly marked. Other navigation statements also elicited positive responses of strongly agree and agree among participants. The content of the module was found to be useful and relevant, as 47% of participants strongly agreed and 54% agreed. Similarly, all participants (i.e., 53% strongly agreed and 47% agreed) were of the view that the feedback and the way the information was presented in the module facilitated learning. In the open-ended questions of the Summative Usability Evaluation, the participants were asked to state what they liked most about the module. One participant wrote that the module "came back and told me where I messed up and what ways I could have looked at each" question, and six of them mentioned that they found the feedback helpful. Other features were also mentioned as aspects of the module that participants liked. Responses pertaining to what participants would like to change about the module appeared to be an issue for two participants. For example, one of them commented that the quiz should give more time in between the questions to take in all of the useful information. Unfortunately, the two participants did not realize that they could pause or navigate forward or back through the program. One of the ID experts did experience technical delays when trying to download the module. Upon investigation, the PI


discovered that the expert had been using a computer at a public library that was over five years old and had limited memory. Certainly, the number of refinements was reduced from twenty-six to five at the first design-evaluate-refine cycle of the Evaluation phase. At the end of the second iteration, seven refinements were identified. The refinement lists are shown in Specimens B-5 and B-9, respectively. At the end of the second iteration, the data from the Summative Usability Evaluation (n = 22) showed that the technical issues were resolved, since 100% of participants either strongly agreed or agreed that they did not experience technical delays while downloading or viewing the module (see Specimen B-8). An overview of the data indicates a generally positive opinion of the module. Results were positive and similar to those found in the first iterative cycle. Furthermore, when participants were asked to comment on what they liked most about the module, many of them referred to the simplicity of the design and the relevance of the content. As for the one thing about the module they would change, some participants suggested that more questions and explanations should be included and that learners could be given a practice test as well. One participant thought that relevancy was a problem and suggested changing the introduction to make it more relevant to college-level students. Referring to Specimen B-9, the list of refinements garnered the second time the Summative Usability Evaluation questionnaire (see Specimen B-8) was administered showed that, generally, at most one or two learners had problems discerning the end


of each section within the module or disagreed about the consistency of the position of the error message. One participant also strongly disagreed, and another disagreed, that they found the module engaging. In addition, as seen in the refinement lists at the end of the first and second iterations, some of the changes could not be accomplished, either due to time constraints or due to a lack of available content. To explain, Specimen B-9 shows that one respondent each requested more explanations or another test. This did not prompt a third design-evaluate-refine round because these requests did not align with the objectives set for the module.

2. Has the innovation been disseminated and adopted? (Seels & Glasgow, 1998, p. 180) Time constraints prevented this question from being answered. As Reeves and Hedberg (2003) pointed out, to measure this and to understand the impact of this web-based module on the learners would require an evaluation after one to two years. Unfortunately, this could not be realized in this study; however, it is certainly something to consider for future recommendations.

Design Based Research (DBR) Results

The analysis of the DBR perspective was directed by Seeto and Herrington's (2006) guideline, which is comprised of the ADDIE phases, the four phases of DBR, and Reeves and Hedberg's (2003) six functions of evaluation. The four phases of DBR, as well as the evaluation functions, enhanced construct validity within this study. Following is a description of the DBR perspective of each phase of the ISD process, ADDIE.


161 DBR Overview: Analysis Phase T he first phase of the goals of DBR as defined by Reeves (2000) is to analyze practical pro blems by researchers and practitioners (see Figure 2 ) The problem summarized here and as described fully in Chapter One was that considering the increase in the number of web based training modules, quality among them has been inconsistent and generally p oor. In this study, the research analyzed the use of a systematic process ADDIE, to develop a web based module to determine whether quality was incorporated due to a systematic development approach This research effort had several objectives also listed in Chapter One Seeto and Herrington (2006) pointed out that both the SME and instructional designer should be involved in the needs assessment The instructional designer should perform the needs assessment and seek the aid of the SME to help clarify and analyze the data in the Analysis phase of ADDIE. Concurring with this viewpoint, t o evaluate the Analysis phase of ADDIE with respect to DBR the PI incorporated the evaluation functions presented by Reeves and Hedberg (2003). In the first phase of DBR, t w o evaluative functions, review and needs analysis (Reeves & Hedberg, 2003) were used to guide data reduction According to Reeves and Hedberg (2003) the review function should develop a web based module and shoul d clearly list the objectives of the web based development initiative as well as provide instructional design guidelines. The PI discovered that both evaluation functions were encompassed in the needs analysis ( note: needs analysis is referred to by Reeve conducted in the Analysis phase of ADDIE For instance, t he initial


162 interview Analyze the Problem with the SME, Director of the LEARN Program and the PI yielded the for developing the web base d module. Typically this is one of the objectives of the review function Essentially, the stakeholders wanted to reach out to more students or o seek new opportunities to diversify current academic support services avail able to the students The needs analysis also provided the objectives for the web based module as well as detailed design guidelines both of which were presented earlier in this chapter. Another item of note would be the element of time involved to comp lete the Analysis phase of ADDIE versus the time involved to complete the Design and Development phases of ADDIE Referring to Figure 6 in Chap ter Three, it shows that the Analysis phase was completed in 16 non consecutive weeks the Design phase was compl eted in two non consecutive weeks and the Development phase was completed in 10 non consecutive weeks to complete. Lee and Owens ( 2004) pointed out that the t ime taken to complete a thorough analysis at the beginning invariably more than made up for time Upon reflection, the PI discovered that this opinion held true in this study. The PI had originally scheduled three weeks for the Design phase and 12 weeks for the Development phase. DBR Overview: Design Phase Next, i nsight of the D BR perspective of the D esign phase was gathered from the Evaluate Design Decisions Questionnaire as well as the Design Module Discussion of the ADDIE process Reeves (2000) stated that the second phase of DBR is to develop solutions to the problem with a t heoretical framework. The theoretical framework which provided a possible solution to the problem was t he ISD process ADDIE. ADDIE was


163 used as it is a popular generic process that most instructional designers claim to use to guide the development of their training modules. The point of using ADDIE was to investigate the systematic process and to determine whether it was still relevant in designing a web based module that incorporated computer inte ractions What was also being investigated was whether a sys tematic approach such as ADDIE would also produce a web based module that was considered hig h in quality Having computer interactions and being web based are two elements that were not fully conceptualized when ADDIE came into popularity in the late 19 8 s (Molenda, 2003 ) Feedback in response to the prototype occurred early in the Design phase of ADDIE and was used to refine the IDP. From a DBR perspective Specimen B 10 in Appendix B shows a summary of results derived from the Evaluate Design Decision Q uestionnaire which provide d details about the decision making process that occurred in the Design Phase of ADDIE The participants ( n =3) were the SME, Programmer 1, and the PI (as Instructional Designer) Regarding the results, data was gathered in three s Instructional Strategy Delivery Selection System and Prototyping The results clarified how the design decisions were throughout the rest of the ISD proces s. Specimen B 10 shows that all three participants believed that they knew the purpose of the web based initiative. More importantly they all agreed that the most important stakeholder were the learners. The SME disagreed that the learners needed a knowle dge assessment prior to using the web based module. Additionally, the SME together with the PI (as Instructional Designer) both made the choice to


164 regards to giving learners feedback after the assessment; they did not believe further feedback was necessary in this module. When considering the responses to the questions asked in there was a consensus among the participants that the SME was the primary source of content for module development. Furthermore, a ll agreed that the design decisions had been made very early in the D esign phase but the SME could not distinguish at what phase or at what point within a phase of ADDIE that the meetings were held Mor e over, i t was not surprising to see that when asked wh o was most influential in choosing an instructio nal strategy for the initiative both Programmer 1 and PI (as Instructional Designer) believed it was the SME, however, the SME believed it was the PI (as Instructional Designer) Also, there was general agre ement, that is, 67% strongly agree ( PI (as Instructional Designer) and Programmer 1) and 33% agree ( SME ) that an IDP was essential in guiding the development of the web based module Moreover, the SME did not know if an IDP had been developed for this web based initiative section of the questionnaire related how the participants viewed the decision s made about the design element s utilized in the web based module. On an interesting note, b oth the P rogrammer 1 an d the PI (as Instructional Designer) believed that the person who was most influential in setting design elements for the initiative was the i nstructional designer. On the other hand, the SME believed that the programmer was most influential in setting design elements for the initiative Here it is important to point out that Instructional Designers need to understand clearly who makes the design decisions.


165 Hardware and software decisions according to the PI (as Instructional Designer) were a collaborative decision made by the Programmer 1 and the PI (as Instructional Designer) Interestingly t he SME was of the opinion that the programmer and instructional designer were equally influential regarding hardware and software decisions. As far as d ecisions made about interaction interfaces and interaction design elements navigation and how the information would be presented to the learners there was agreement among the participants that they had been made in the Design phase of ADDIE. However, all participants either strongly agreed or agreed that the u se of media elements such as audio, video, animation and gra p hics was guided by the information derived from the Analysis phase of ADDIE. Data from stem Selection and P conveyed that the SME, PI (as Instructional Designer) and Programmer1 all either strongly agreed (67%) or agreed (33%) that a prototype is always recommended when developing a web based module Similarly, t he same agreement was arrived at when participants considered the statement that t he feedback from the prototype is expected to refine the design and devel o pment of the web based module Considering that a prototype would help to reduce costly design changes, the SME dis agreed with that statement but Programmer 1 and the PI (as Instructional Designer) both agreed and strongly agreed respectively with the statement. However, all participants strong ly agreed that a prototype helped to show what the final web based module c ould potentially "look, sound and feel" like. When participants were asked who they thought influenced the delivery system (e.g. whether via Internet or Face to Face or Blended) choice, both Programmer 1 and the PI (as


166 Instructional Designer) stated it was the SME but the SME believed that the Instructional Designer was most influential in making the delivery method choice. From additional comment s Programmer 1 stated that Proper testing of desired design deliverable content was initially over shadowed by incapacitates of delivery method. This was quickly resolved with the Instructional Designe I did not know if a formal IDP was created or when it was created. Additionally, I did not know if the meetings that were conducted fell before or after the design or analysis phases. Finally, the PI (as Instructional Designer) The prototype helped trem endously in refining design elements. In this case, especiall y what design elements that was desired and not desired. A simple IDP was developed but not formally presented to the SME. The programmer was given information on how the content should be presen ted (questio ns first, followed by feedback) details about colors/fonts was mentioned. D BR Overview: Development Phase Although the analysis presented here occurred at the Development phase of ADDIE, when viewed from a DBR perspective and using Seeto and guide (see Figure 3 ) the analysis was still at the second phase of the Reeves (2000) DBR model (see Figure 2 ) that was development of solutions with a theoretical framework. Working within the Seeto and Herrington (2006) guide, the y express ed the importance of using formative evaluations while developing the learning environment. In the earlier research by Reeves and Hedberg (2003), they also agree with the use of formative evaluations when developing any learning module. Reeves and He d berg (2003) believe d


that formative evaluations would help to improve the product as it is being developed, as well as check the usability and relevance of the product. In the present study, the formative evaluations conducted at the Development phase of ADDIE and discussed previously in this chapter were: (a) the Evaluate Usability of Module questionnaire, completed by the SME; (b) the Expert Review of Module questionnaire, completed by the two ID experts; and (c) the Learners: Evaluate Usability of Module questionnaire, completed by the learners enrolled in a Reading Experience course. To satisfy the requirement suggested by Seeto and Herrington (2006) that, from a DBR perspective, some record of reflection should occur, the PI recorded reflections throughout the process in a logbook.

A note to the readers: it should be understood that at this juncture the PI was functioning in the roles of the Instructional Designer and Programmer 2, since Programmer 1 could no longer be a part of the study. Therefore the PI faced a dilemma: whether to respond once to the questionnaire in the combined roles of instructional designer and programmer, or to respond twice in the separate roles of instructional designer and programmer. The PI decided to keep the roles separate and respond twice to the questionnaire. In making this decision the PI believed that keeping the two roles separate would support the integrity of the study.

The participants were the PI (as Instructional Designer) and the PI (as Programmer 2). A full summary of responses to questions in the Module Development Questionnaire is shown in Specimen B-11 in Appendix B. In this questionnaire there were two areas of focus: Content Development
and Design Elements. Responses were consistent in regards to the accuracy of the content, the appropriateness of the content for the target audience, the branching of the content, and the modularization of the content. It was apparent that all of the content criteria set out in the Analysis and Design phases of ADDIE, respectively, were met. It took approximately ten days to integrate the content into the module design. It was agreed that the sequencing of the content followed the design criteria set in the IDP, and there was strong agreement that the SME helped to maintain the accuracy of the content throughout the development.

DBR Overview: Implementation Phase

The implementation of the web-based module did not occur to the degree where it warranted detailed analysis from a DBR perspective. The PI simply copied the files for the web-based module to a web server for deployment. The SME was not involved in this process.

DBR Overview: Evaluation Phase

It was at this phase that the final iterations of the design-evaluate-refine cycle occurred. Also, from the DBR perspective, it was at this juncture where the third and fourth phases of the Reeves (2000) DBR model culminated. This can be seen in Seeto and Herrington's (2006) guide (see Figure 3). The third phase of DBR states that the solution to the problem should be evaluated and tested in practice. The output of the fourth and final phase of DBR should come from the documentation and reflection of the study. In the third and fourth phases of DBR the analysis was guided by the evaluation function. Reeves and Hedberg (2003) explained that this function was a means to appraise strategies of the web-based module
in the environment in which it is meant to be used. In essence, they recommend a usability study at the Evaluation phase of ADDIE. The final two evaluation functions, impact and maintenance, according to Reeves and Hedberg (2003), are best studied after the web-based module has been in the intended environment for a year (Seeto & Herrington, 2006). Therefore, as stated previously, these two functions were not included in the present study but could be considered in a future study.

First, in regards to the third phase of Reeves' DBR model, the Summative Usability Evaluation questionnaire was administered in two cycles to the ID experts as well as to the learners, and this helped to evaluate the web-based module in practice. The results from the iterations of the design-evaluate-refine cycle have been discussed previously in this chapter. As noted, the number of refinement changes decreased after each iteration of the design-evaluate-refine cycle (see Specimens B-5 and B-9). The results from the questionnaire administered after each iteration highlighted what was working as well as what needed to be redesigned. Although the group of learners was different at each cycle, the number of refinement changes continued to decrease dramatically. Based on the refinement changes identified via the summative evaluation questionnaire, the PI noted that it was an important tool that helped to create a web-based module that was generally acceptable to the majority of the learners and met the requirements of the other stakeholders, such as the SME and the Director of the program.

Second, the outcome of the fourth phase of Reeves' DBR model was based upon the data the PI collected from all twelve questionnaires and two interviews to produce a set of design principles. This outcome was part of the deliverables stated in Chapter One, which were to produce a list of generalized Lessons Learned
and to report on the effectiveness of specific instructional strategies used in the present study.

Summary

In this chapter, the discussion presented the outcomes from the Design, Development, Implementation and Evaluation phases of ADDIE. The iterative design-evaluate-refine cycle occurred once in the Development phase and twice in the Evaluation phase of ADDIE. Results of the study highlighted the importance of formative evaluations and iterative cycles in developing a valid and effective web-based module. Additionally, participants agreed that development of a prototype early in the ISD process is an important guide for instructional designers and developers. Furthermore, the DBR overview lent further insight into the ISD process and provided relevant information to Instructional Designers. Next, Chapter Five concludes the study by discussing the results and what the results indicate. Chapter Five also presents the list of deliverables, implications of the study and directions for future research.
Chapter Five

Summary

There were two purposes of this study: (a) to examine the use of a systematic ISD process, ADDIE, to develop a web-based module that would be considered valid and effective, and (b) to employ the DBR methodology to create relevant outcomes for practitioners in the field of IT while adding to the body of IT research. In this chapter, the outcomes of the integration of the ADDIE process and DBR methodology will be used to discuss the research objectives, the research question, limitations and threats to the study, directions for future research, and implications of the study.

Discussion of the Research Question and the Theoretical Implications

What is the effect of applying a systematic approach to the development of a web-based module for teaching metacognitive learning strategies to students in a higher education environment?

As stated earlier in Chapter One, one of the purposes of the present study was to study the development of a web-based module using a systematic ISD approach: ADDIE. Some critics of ADDIE believe that it is an obsolete process. ADDIE, they think, is too rigid and cannot be used to accommodate the development of web-based modules that involve interactivity (Allen, 2006). In contrast, in the present study it was found that ADDIE provided construct validity for the research as well as a flexible
guideline for developing an interactive web-based module. Although caution is necessary in defining levels of interactivity, the web-based module did contain interactions that encouraged the learner to change from passive to active. As seen in the results in Chapter Four, when asked if they found the module engaging, 55% of the 22 respondents to the final summative evaluation (see Specimen B-8) strongly agreed and 32% agreed that they found the module engaging. Moreover, the results of the present study indicated that using a systematic approach such as ADDIE to develop a web-based module that included interactivity was still a valid approach. Regardless of technological advancements and levels of interactivity, in the present study ADDIE was found to still provide a serviceable approach.

Additionally, there were a number of activities included within the systematic process that also contributed to creating a valid and effective web-based module. To summarize, the activities were:

1. Conducting a detailed front-end analysis.
2. Developing a prototype early in the process.
3. Integrating formative and summative evaluations.
4. Assimilating iterations of design-evaluate-refine cycles throughout the process.
5. Accommodating flexibility within the process.

The PI believes that these five elements were critical in using the systematic approach successfully. Figure 15 displays the five activities and how they related to ADDIE in the study. Each activity shown in Figure 15 contributed to developing a valid and effective web-based module using a systematic ISD approach.
Figure 15. Five activities and how they related to ADDIE. [The figure maps the five activities (Conduct Analysis, Develop Prototype, Integrate Formative & Summative Evaluations, Assimilate Design-Evaluate-Refine Cycles, Allow Flexibility) onto the Analysis, Design, Development, Implementation and Evaluation phases.]
Dick et al. (2005) as well as Gustafson and Branch (2002) pointed out that the systematic approach naturally lends itself to iterative cycles. Specifically, Dick et al. (2005) mentioned that instructional designers are continuously refining their designs throughout the ISD process. As explained in Chapter Four, the present study had three iterations of the design-evaluate-refine cycle, which is consistent with the observations of Dick et al. (2005) and Gustafson and Branch (2002).

Another important aspect the study confirmed was the importance of formative and summative evaluations throughout the ADDIE process. Dick et al. (2005) describe formative evaluation as the process designers use to obtain data that can be used to revise their instruction to make it more efficient and effective. Also, Dick et al. (2005) believe instructional designers should be able to conduct formative evaluations with confidence. Moreover, they emphasize the value of formative evaluations within the design process (p. 340). Regarding summative evaluations, Dick et al. are also strong advocates. They state that summative evaluations provide two things: (a) expert judgment to determine whether the instruction met the needs of the organization, and (b) a field trial to determine the effectiveness of the instruction with the target audience. In the present study, expert judgment showed that the instruction did meet the original goals (see Table 15 in Chapter 4). Additionally, the feedback from the target audience revealed that the instruction was effective.

Additionally, the results of the study highlighted the importance of conducting a front-end analysis. Without some analysis, clarification of the purpose of the
development and why it is needed will be unknown. This can obviously lead to a poorly designed product. Furthermore, the results showed that development of a prototype was an efficient method for truly grasping the actual design elements desired by the stakeholders. Finally, flexibility, as shown in Figure 15, must be incorporated throughout the process. Flexibility appeared to be an inherent characteristic of ADDIE, since it is generic in its approach and open to the interpretation of the instructional designer. From this perspective, the flexibility of ADDIE can be interpreted as a positive aspect of the process. However, upon close investigation, one of the problems encountered very early in the process was that ADDIE was found to be too generic and did not provide enough detail. For example, at the start of the Analysis phase, a front-end analysis was determined to be beneficial to developing a quality web-based module. Guidance on how to accomplish the analysis was not readily available, and this added a level of complexity to the process. The PI conducted research and found guidelines by noted researchers in the IT field (e.g., Dick et al., 2005; Seels & Glasgow, 1998) that provided details for creating several instruments for the front-end analysis (see Appendix A). Other instructional designers are encouraged to use similar references to overcome the lack of specificity in the ADDIE process. Therefore, in reflecting on the ADDIE process, although its generic nature gave rise to the adaptability needed to develop an effective and valid web-based module, it did not provide sufficiently detailed guidelines for instructional designers.
Discussion of Research Objectives

Research Objective 1: To create a systematically and rigorously designed product intended to meet research design goals.

As the results in Chapter Four indicated, the ADDIE phases not only provided a systematic approach to developing the web-based module but a rigorous approach as well. ADDIE provided a guideline that ensured that certain elements, such as a front-end analysis, prototype development and evaluations, were included in the process (see Figure 15). Some critics of the ADDIE process claim that it is a static model and therefore inadequate for creating interactive web-based modules. However, in this study that was not the case. The PI discovered, as mentioned previously by researchers Gustafson and Branch (2002), that if ADDIE is used as a flexible guideline and not as a static step-by-step process, it allows production of an interactive web-based module. This may be one of the reasons why ADDIE is still taught to Instructional Designers and why it still persists in the field of Instructional Technology.

As seen in Table 15, the design goals set out in the Analysis phase of ADDIE were met. Some changes occurred during the development process, and this was also reflected in Table 15. In the Analysis phase, the instruments utilized provided the information about why the product was desired (Dick et al., 2005). However, design goals were first discussed between the Director of the LEARN program and the PI (as Instructional Designer). Furthermore, design goals were crystallized in the meetings among the SME, Programmer 1 and the PI (as Instructional Designer) early in the Design phase.
Research Objective 2: To produce data that indicates the validity and effectiveness of the product.

There are several reasons that support the validity and effectiveness of the final product. As noted, there were a total of twelve questionnaires in this study. Although all twelve instruments helped to determine the validity and the effectiveness of the web-based module, four instruments in particular provided more detailed data regarding validity and effectiveness. The instruments in the Development phase were: Evaluate Usability of Module, Expert Review of Module, and Learners: Evaluate Usability of Module. The instrument at the Evaluation phase was: Summative Usability Evaluation. These four instruments altogether showed, through a majority of positive responses from the learners, ID experts, programmers, instructional designer and SME, that the web-based module could be viewed as valid and effective.

In addition, another indicator that offered confirmation that the web-based module should be considered valid and effective was the results of the iterations of the design-evaluate-refine cycle that occurred once in the Development phase and twice in the Evaluation phase. As mentioned in Chapter Four, Specimens B-5, B-7 and B-9 show that the number of refinements decreased from twenty-six to seven. Notably, the seven refinements listed in Specimen B-9 were not part of the original design goals, nor were they part of a majority opinion, and therefore it was not feasible to prompt further development. The reduction in the number of refinements indicated that the product had evolved and had been refined to a level acceptable to the majority of the learners.
Research Objective 3: Deliverable A: A list of generalized "Lessons Learned."

An integration of the data from the Provisional Lessons Learned listed as an outcome of the pilot study in Chapter Three, the DBR perspective, and the ADDIE process was used to determine a list of generalized Lessons Learned. Upon reflection, it was found that the Provisional Lessons Learned still held true by the close of the study. The Provisional Lessons Learned lent themselves to some organizational categories that are included in the final report on the study. The following is the list of Lessons Learned:

General Lessons Learned

1. Establishing relationships: At the start of the process it was important to establish relationships with the decision makers and leaders of the initiative.

2. Interviewing: Informal interviews helped to establish relationships between key personnel.

3. Identifying stakeholders: In this study the stakeholders were the learners, the SME and the Director of the program. Identifying stakeholders early will help the instructional designer when making design and development decisions.

4. Making decisions early in the process: The instructional designer had to establish whether or not the development of the web-based module required a detailed analysis. The Analysis phase of this study required a commitment of time, money and human resources. These elements are not
always available in practice due to deadline dates and marketing commitments. However, the importance of making decisions early cannot be overstated.

5. Documenting the process: It is important, not only from a research perspective but from a design perspective as well, that the entire process is documented. This type of documentation provided a detailed and useful audit trail. Documentation will help instructional designers reflect on the methods used and aid in refinement of the process.

6. Determining project goals and timelines: A critical part of a successful project is developing the product within the expected timeframe and budget. Although this project did not use expansive project management tools, simple timelines and goals were set.

ADDIE Lessons Learned

1. Conducting front-end analysis: Conducting in-depth analysis at the start of the process led to defining many of the design elements necessary to make a product that met the requirements set out by the stakeholders, and it was found to be a critical part of creating such a product. Another important aspect of conducting detailed front-end analysis was that it was found to save on design and development time.

2. Relying on expert knowledge and research: Since ADDIE provided a generic and flexible process, it lacked detailed guidelines for each phase. To overcome this, the instructional designer can rely on
research conducted by noted researchers in the field, as was done in this study, or, if possible, employ an expert for guidance.

3. Ensuring content validity: Content validity was not an issue in this study; however, it was still something that had to be considered. Recall that in this study the SME was also an instructor of the course targeted for conversion. The SME provided content for conversion that was based on a strong theoretical foundation. The PI did not have to conduct further research for content material, and there was not any concern about the validity of the content. However, in a different situation, where content is being newly developed rather than converted, content validity measures should be integrated into the process. Some steps to ensure validity of content are to get expert advice (e.g., employ an SME) and to conduct research.

4. Being flexible: The ADDIE process is a systematic process, but this does not imply rigidity. ADDIE was used as a flexible guideline and, as the outcome of the study displayed, it can be utilized to develop a valid and effective web-based module that includes interactivity.

5. Developing a prototype: This was a critical part of the ISD process. It helped to determine what design elements were desirable and what were not. It provided information to narrow or expand the design scope.

6. Integrating formative evaluations: Integrating formative evaluations throughout the process provided critical information that improved the product within the development life cycle. The formative evaluations
provided useful and timely feedback from the ID experts as well as the learners.

7. Establishing design-evaluate-refine cycles: Including iterations of design-evaluate-refine cycles was a very powerful element in the ISD process. This element helped to establish the effectiveness of the module throughout development. It also helped to establish the effectiveness and validity of the web-based module.

Design and Development Process Lessons Learned

1. Determining what is critical and what is not: The development did not require the use of many analytical tools. Sometimes a guided interview and a needs assessment provided the necessary information to design and to develop the module. As learned in this study, for example, the Context Analysis was not really necessary because the questions asked in this instrument had already been addressed by similar questions in the Needs Analysis, Content Analysis and Task Analysis instruments.

2. Being cognizant of participants' schedules: Within the Analysis phase, it was important to limit the number of instruments to only what was necessary, because filling out questionnaires and taking part in surveys and interviews place demands on participants' time.

3. Utilizing existing instruments and expertise: There were many valid instruments available for conducting various types of analyses. It was more prudent to use an instrument that had already established validity. In other words, utilizing an instrument previously created by a reputable
researcher or resource group provided reliability to the data collected. In addition, it saved time and money because the instrument did not have to be developed. Modifying existing instruments rather than trying to create and validate new ones is recommended. Also, using guides and questions created by noted researchers in the ID field provided cost-effective expertise.

4. Employing objective evaluators: Using two independent ID experts who were not stakeholders in the product ensured that a valid quality control measure was included in the ISD process. Their assessment of the product and the process provided an unbiased and objective perspective.

5. Developing and using the IDP: Developing an IDP was helpful but, again, it was considered a guideline. Like the ADDIE process, the IDP should be considered flexible but also be specific. It should allow for innovative ideas that may arise during the development process.

Research Objective 3: Deliverable B: Report on the effectiveness of the specific instructional strategies utilized.

A variety of instructional strategies were employed in the present study. Some of the strategies were derived from Gagné's (1977) nine events of instruction, which are known to be effective learning strategies. All of Gagné's (1977) nine events were utilized to some extent. Explanations of how they were utilized in the web-based module are as follows:

1. Gain attention: A concerted effort was made to develop the web-based module as an engaging way to learn. This was one of the requests made by the
SME. The results indicated that this request had been met. During formative evaluation both ID experts strongly agreed that the web-based module had been engaging. Similarly, in Specimen B-4, 57% and 43% (n = 7) of the learners strongly agreed or agreed respectively that they found the module engaging. Furthermore, in the summative evaluation the overall majority of respondents found the web-based module to be engaging.

2. State the instructional objective: The objective stated clearly at the beginning of the web-based module was to teach the learner metacognitive learning strategies for objective test taking.

3. Stimulate memory of relevant information: The content of the web-based module was developed to help the learner recall relevant terms with which they were already familiar, for example, to help the learner distinguish between absolute and relative qualifiers. These are words that all learners were already familiar with but probably could not categorize in the context of a metacognitive learning strategy.

4. Present the stimulus, information, or distinctive features to be learned: This was another strategy employed in the web-based module. The design of the web-based module presented test questions to the learners. The learners had the opportunity to answer each question, and this was followed by explanations of the correct answers. The explanation for each correct choice taught learners how to recognize a particular objective test-taking strategy.

5. Guide the learning: In the web-based module the learner was guided through the process. First the questions were presented. Next, the learners were presented with
their results. Following this, the learners were prompted to go to the final section of the module, where they could gather more in-depth knowledge about each correct answer.

6. Elicit performance (retrieval, active participation, practice): The learners had to be active participants to complete the web-based module. Questions were posed to the learners, and by choosing an answer the learners received feedback (i.e., correct, incorrect). Moreover, after the learners received their scores, the next section of the module gave them more detailed information about the correct choice. It also prompted them in certain cases to click on different areas of the screen or to type a particular word. These actions helped to encourage the learner to change from passive to active. (A brief, purely illustrative sketch of this question-then-feedback flow is provided at the end of this subsection.)

7. Provide feedback (correction of errors, reinforcement): Feedback and reinforcement strategies were used in the web-based module. For example, in the section where the questions were asked, immediate feedback (i.e., correct, incorrect) was used. Following the breakdown of the score, the new information was reinforced by having the learners go through the final section of the web-based module, where the explanation of each correct choice was presented.

8. Assess performance (metacognition, retention): Assessment of performance was conducted in a limited sense. To clarify, the learners were assessed on their initial knowledge of test-taking strategies. However, after the learners gained new knowledge, no further assessment occurred within the web-based module.
9. Provide for retention and transfer (overlearning, distributed practice, generalization): It was expected that the learners would use the knowledge gained from the web-based module to improve their general test-taking skills.

Another learning strategy employed in the web-based module was the element of time, as mentioned by Carroll (1963, 1973, 1981, 1989). To reiterate, Carroll believed that giving learners time to learn any new concept was a factor that affected learning. In this study, when learners accessed the module they were not given any time limits. Learners were free to go through the module as quickly or as slowly as they chose. Data gathered from the design-evaluate-refine cycles revealed that the average time the learners took to complete the module was 9.85 minutes. The analysis of the data did reveal that one respondent (n = 22) had a problem concerning time. The learner felt that more pause time should have been placed between the question and the explanation sections of the web-based module. This respondent was unaware that they had the capability to pause the module as they wished. The average times recorded for learners to complete the module at each iteration of the design-evaluate-refine cycle were 9.11 minutes, 9.61 minutes and 9.85 minutes respectively. As the analysis of the data shows, in general learners took a little less than 10 minutes to complete the web-based module. In contrast, according to the SME, in a traditional classroom it takes approximately six times as long (i.e., 60 minutes) to cover the same concept. In this case, this indicates that the time it takes the learner to learn the same concept has been reduced considerably.
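To make the flow described in events 4 through 7 concrete, the following is a minimal, purely illustrative sketch in Python of a question-first, feedback-second interaction. It is not the study's actual web-based module; the question text, the explanation, and all function and variable names are hypothetical examples introduced only for illustration.

```python
# Purely illustrative sketch of a question-first, feedback-second flow.
# The question, answer, and explanation below are hypothetical examples;
# they are not taken from the study's web-based module.

QUESTIONS = [
    {
        "prompt": "Statements containing the word 'always' are usually true. (True/False)",
        "answer": "False",
        "explanation": (
            "'Always' is an absolute qualifier; test statements built on absolute "
            "qualifiers are frequently false, so read them with extra care."
        ),
    },
]


def run_module(questions):
    """Present each question, give immediate correct/incorrect feedback,
    then present the explanations as a separate final section."""
    score = 0
    for q in questions:
        response = input(q["prompt"] + " ").strip()
        is_correct = response.lower() == q["answer"].lower()
        print("Correct!" if is_correct else "Incorrect.")
        score += int(is_correct)

    # Breakdown of the score before the explanation section.
    print(f"\nYou answered {score} of {len(questions)} question(s) correctly.\n")

    # Final section: reinforce the strategy behind each correct answer.
    for q in questions:
        print(q["prompt"])
        print("  Strategy:", q["explanation"], "\n")


if __name__ == "__main__":
    run_module(QUESTIONS)
```

The point of the sketch is simply the ordering of events: the learner must respond before any feedback appears, and the strategy explanations are withheld until after the score, mirroring the questions-first, feedback-second sequence requested of the programmer.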
Research Objective 3: Deliverable C: An analysis of quantitative, qualitative and descriptive outcome measures of learning among field test participants.

The data gathered in this study were generally qualitative. Data reduction was accomplished by using questions developed by researchers Seels and Glasgow (1998) for each ADDIE phase. The intent of the data gathering was primarily to evaluate the systematic design process of using ADDIE to develop a web-based module. In addition, the data also helped to determine the validity and effectiveness of the web-based module. Among the various field test participants in this study, the learners were considered the most important stakeholders. From the perspective of the learners, the data gathered from both the formative and summative evaluations indicated a positive outcome regarding the validity and effectiveness of the web-based module. As explained earlier in Chapter Four, data revealed that the majority of the learners either strongly agreed or agreed that the key aspects of the web-based module, such as accessibility, design, graphics/animations/multimedia, navigation and content, were effective.

In the third iteration of the design-evaluate-refine cycle, when asked to state what they would change about the module, one learner thought that there should be more explanations and another thought that it should be more relevant to college-level learning. In contrast, when asked to state what they liked most about the module, one learner's response encapsulated the point of the web-based module. The learner stated, "I like the main goal which will help me to focus more on the wording the next time I take a quiz or test." Questions regarding the content of the module indicated that it was relevant to
the learners. When the learners, including the ID experts, were asked in the final iteration of the design-evaluate-refine cycle, the majority of the respondents (n = 22) either strongly agreed or agreed that the examples used in the module made learning the concepts easy, that the feedback helped them to learn, and that having the questions presented first followed by the feedback also accommodated learning (see Specimen B-8). More revealing was the response to the question asking whether the information in the web-based module was considered useful and relevant: 68% strongly agreed and 32% agreed that it was useful and relevant. Generally, data collected from the learners consistently showed that the majority of them, that is, over 80%, either agreed or strongly agreed with various statements concerning the validity and effectiveness of the web-based module.

Research Objective 3: Deliverable D: A module that is considered valid and effective at the juncture where the study completes a second iteration of the design-evaluate-refine cycle, derived using data collected via formative and summative evaluations guided by the ADDIE process.

As previously stated, results indicated that the web-based module should be considered valid and effective at the juncture where the study completed the final design-evaluate-refine cycle. Validity and effectiveness of the web-based module were derived from two perspectives. First, the information derived from the participants of the study was an obvious source of information to indicate that the module was valid and effective. As mentioned earlier in this chapter, overall, the majority of learners, 80% and over, along with the ID experts, had a positive view of the module's relevance (see Specimen B-8).
A second indicator that the web-based module was valid and effective was that the number of refinements was reduced from twenty-six (see Specimen B-5) at the Development phase of ADDIE to seven (see Specimen B-9) by the end of the second iteration of the Evaluation phase of ADDIE. At the Development phase the first iteration of the design-evaluate-refine cycle occurred. From the formative evaluations, 26 refinement suggestions were gathered from the learners, SME, ID experts and the Director of the LEARN program. Seventeen of the 26 refinements were completed. The ones that were not completed were either incompatible with the scope of the project or represented a minority opinion, that is, the opinion of one or two respondents. Two more iterations of the design-evaluate-refine cycle occurred at the Evaluation phase of ADDIE. At the end of the second iteration in this phase, the number of refinements was reduced to seven. Again, no further changes were made, either because they were incompatible with the scope of the project or because they represented a minority opinion, that is, the opinion of one or two respondents. Overall, the reduction in the number of refinement requests was significant to the study. The objective here was to reduce elements within the web-based module that could have inhibited learning. Additionally, the reduction of refinements was viewed as a positive outcome that indicated a better quality, valid and effective product.

Implications Concerning Quality of the Web-Based Module

The quality of web-based modules, or lack thereof, as stated in Chapter One, is an issue that educators should address presently. Due to the rise in demand for web-based courses, many IHEs have been sharply increasing the number of web-based courses in their curriculum. In regards to web-based courses, Kilby (2008)
raises the question of what constitutes quality in web-based training (para. 1). Although measurement of quality was not within the scope of the present study, producing a quality web-based module was an expectation. In the present study, the PI sought to develop a web-based module that was valid and effective using a systematic process. For the PI, validity and effectiveness implied a product that was also high in quality. To support this notion, the twenty-four measures of quality identified by the NEA (2000) imply that effectiveness is an aspect of quality measurement. Admittedly, to some researchers this is debatable, but within the confines of the study the results did provide evidence that the key stakeholders, the learners as well as the ID experts, found the web-based module to be effective.

Overview of DBR Methods for Instructional Design Research and Theoretical Implications

Another purpose of the study was to utilize the DBR approach. Some advocates of DBR (e.g., Dawson & Ferdig, 2006; Reeves et al., 2005, 2000; Roblyer, 2005; Schrum et al., 2005; Barab & Squire, 2004; Bell, 2004; Collins et al., 2004; Cobb et al., 2003) regard DBR itself as a means for instructional technology researchers to provide practical, timely and relevant research. Seeto and Herrington's (2006) guide as well as Reeves and Hedberg's (2003) evaluation functions provided substantial support as well as construct validity for the study. Without Seeto and Herrington's (2006) guide, designing the research study would have been more challenging than it turned out to be. Moreover, some of the challenges mentioned by the @Peer Group (2006) were not experienced to any great extent in the present study. For example, obtaining IRB approval did not pose a problem. IRB approval was sought in two parts, first for the pilot study that included the Analysis phase of ADDIE and second for the rest
of the study that included the four other phases of ADDIE, that is, Design, Development, Implementation and Evaluation. IRB approval was granted with an exemption status. Collaboration among peers from different disciplines and the length of time for the study also did not present any undue challenges. Publishing was not attempted, therefore no comment can be made on whether it is a challenge.

Another challenge concerns the credibility gap, as explained earlier in Chapter Two. Since credibility in research is dependent on certain factors such as validity, objectivity and reliability tests (@Peer Group, 2006), this is a challenge to overcome in all studies, including DBR studies. As an example, in the present study there was interaction, rather than separation, between context and intervention. This is typical of DBR. However, because of the iterative nature of a DBR study, some level of credibility can be provided. Iterations of the design-evaluate-refine cycle provided a level of consistency that gave rise to evidence establishing the validity and effectiveness of the web-based module and supported the use of a systematic approach to develop a web-based module.

Generalizability of the study was another challenge to address when using the DBR approach. Typically, to claim that the outcomes of a study are generalizable, the study should provide the same outcomes when replicated in various contexts. However, as critics of the DBR approach point out, there are various factors that affect learning that are not measured, such as interaction with other factors, for example, the environment, instructors, learners or numerous other elements. In this study, to help overcome this challenge, the intervention was viewed in the way advocates of the DBR approach propose, that is, as itself an outcome (p. 5). To clarify, the DBRC (2003) group believes
that the educational intervention is in itself an outcome of the context. Therefore, in reference to the formative and summative results, the study suggested that utilizing a systematic approach such as ADDIE and incorporating the five activities mentioned earlier will produce a valid and effective interactive web-based module.

It is not yet known whether sustainability will be a challenge in this study. A characteristic of DBR is its iterative nature; to maintain sustainability, the PI would have to continue to refine the web-based module through further iterations. Beyond the challenges, the DBR approach did produce practical design principles for practitioners in the instructional technology field, as seen in the Lessons Learned section listed earlier in this chapter. Another outcome of the DBR approach was that it provided a current and in-depth examination of a systematic approach using ADDIE to develop a web-based module. For example, one of the key things that was highlighted was that the stakeholders, such as the SME, programmers and instructional designers, had to establish relationships early in the process to make effective decisions. Also, not surprisingly, the primary source of content information was the SME. What was interesting was that the programmer and instructional designer both thought that the SME was most influential as far as decisions made about the instructional strategies used in the module; in contrast, the SME felt it was the instructional designer who was most influential. A similar difference of opinion arose concerning the development application used. The dynamics of the decision-making process were highlighted in this reporting. It is important for an instructional designer to know what
their role is in the design decision-making process. It is important to delineate who should be making the decisions and how to go about making informed decisions. These types of details, mentioned above and reported in Chapter Four, are representative of the type of data a PI can receive in a DBR study. As Reeves (2000) pointed out, there is a need for relevant research in the field of instructional technology. In the present study, the DBR approach provided insights into the decision-making process for developing a web-based module. This was a critical aspect of the study captured with a DBR approach.

On a different note, the PI did experience some problems using the DBR methodology. The PI believes that some of these problems resulted from the combination of a lack of financial support and available resources. For instance, in this study the PI acted as the Instructional Designer and also as one of the programmers. Given the opportunity and the financial support, the PI would have preferred to employ another programmer to complete the study rather than act in this role. The PI discovered that holding three roles in a study was somewhat cumbersome and time consuming. It also became awkward when responding to the particular questionnaires.

Another issue with using DBR for this study was the difficulty experienced by the PI in trying to begin the study. To elaborate, the PI learned that a very difficult aspect of a DBR study is knowing how and where to begin. However, the PI overcame this problem by seeking the advice of a mentor who was familiar with DBR and who offered strategic information when required. Also, by clearly defining the research question, the PI was able to develop the objectives of the study, and this helped to guide the study in the early phase. Furthermore, by focusing on one objective of the study at a time, the PI was also able to
slowly design the framework of the study. In addition, a device that worked well for this PI was the use of flowcharts to map out the design of the study. The flowcharts helped the PI to visualize how the study could best be designed to gather the necessary data. They also pointed out any missing features.

Moreover, for researchers or practitioners contemplating using DBR, the length of time involved may also be a daunting aspect. Analyzing the logbook, the PI notes that entries began in June 2006 and ended in March 2009, approximately 138 weeks (i.e., a little more than two and a half years). Of those weeks, approximately 32 weeks could be deducted since the PI did not complete any tasks in those weeks. Next, the PI had to estimate the hours worked per week, since this was not recorded; approximately an average of 25 hours per week was dedicated to the study. Therefore the study, comprising research hours plus development hours, is estimated at 2,650 hours (i.e., (138 weeks - 32 weeks) multiplied by 25 hours). As Champion (1999) points out, estimating time may help instructional designers develop better training. In Chapter Three, as shown in Figure 6, the ADDIE process lasted 31 non-consecutive weeks. Again, using the estimate of 25 hours per week, the total hours estimated to develop the 10-minute web-based module is 775 hours (i.e., 31 weeks multiplied by 25 hours). In retrospect, the element of time is important to developers, and this should have been recorded. The PI would recommend that other DBR researchers plan to record the time dedicated to their study or product development in terms of hours, days and weeks.
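As a minimal sketch, the arithmetic behind these estimates can be written out as follows. The 25 hours per week figure is the PI's estimate (hours were not recorded directly), and the week counts are those reported above.

```python
# Minimal sketch of the time-estimate arithmetic reported above.
# HOURS_PER_WEEK is the PI's estimate; the week counts come from the
# logbook (June 2006 to March 2009) and from Figure 6 in Chapter Three.

HOURS_PER_WEEK = 25   # estimated average hours dedicated per week
TOTAL_WEEKS = 138     # logbook span, a little more than two and a half years
IDLE_WEEKS = 32       # weeks in which no tasks were completed
ADDIE_WEEKS = 31      # non-consecutive weeks of the ADDIE process

study_hours = (TOTAL_WEEKS - IDLE_WEEKS) * HOURS_PER_WEEK   # 2,650 hours
development_hours = ADDIE_WEEKS * HOURS_PER_WEEK            # 775 hours

print(f"Estimated total study hours: {study_hours}")
print(f"Estimated hours to develop the 10-minute module: {development_hours}")
```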
Despite some of these challenges, the PI recommends using DBR for instructional design research studies. Since instructional designers have been seeking timely and relevant guidance, the outcomes of this study did provide some evidence that the DBR approach is a valid method for the field of instructional technology. The outcomes of the present study showed that practitioners and researchers alike in the field of instructional technology can find more in-depth information that can inform their design decision-making and development processes using a DBR approach.

Limitations and Threats

A large amount of the data collected was descriptive, and the PI had to continuously guard against researcher bias. The PI was cognizant of any bias and removed it from the narration. In addition, an independent ID researcher and an editor were asked to read the narration frequently in order to point out any bias. Furthermore, there were some instances where the ADDIE process and the DBR approach were abstract, and this added a level of complexity to the measurement methodology. For example, the ADDIE process was not rigid, meaning that sometimes phases overlapped (e.g., some parts of the Design and Development phases occurred simultaneously). In order to measure each phase as it occurred, the phases had to be distinct, and operationalizing the study in its entirety was difficult.

Another threat that was encountered was that of instrumentation. There were twelve questionnaires and two interviews used in this study. To guard against this threat, each instrument was derived from established researchers in the ID field and the sample questions they provided. Additionally, each instrument went through two cycles of expert review. To guide the expert reviewers, the PI gave both of them four specific guidelines that were also derived from credible sources. Modifications to the questionnaires were based on these guidelines and expert knowledge.
Directions for Further Research

There are several directions to recommend for further research. First, two of the six evaluation functions (Reeves & Hedberg, 2003), impact and maintenance, could not be accomplished within the time frame set out to complete the study. Reeves and Hedberg (2003) advised that these functions should be conducted a year or two after the product has been in the environment. On an interesting note, these functions may also work to overcome the challenge of sustainability, thus continuing to utilize the DBR methodology.

A second direction for future research could be the consideration of the instruments used for the study. As noted, all instruments were based on existing and valid research work. The process described in the study was intended to show instructional designers how to use existing research and instruments to gather data, as well as how to analyze the data to help guide design decisions. IT researchers may want to conduct further statistical analysis on the instruments themselves.

Lastly, a third direction for future research could be to further analyze the element of quality in a web-based module. Quality measurement is a detailed process, and this could be the basis of another study entirely, from a different perspective. Currently, there are few rigorous studies concerning the quality of web-based modules. Moreover, there are even fewer studies that utilize DBR and quality measurement of web-based modules.

Summary

A holistic view of the completed research yielded valuable and practical insights for instructional designers and added to the body of existing design-based research. To reiterate, there were two purposes in conducting this study: (a) to examine the utilization
of a systematic ISD process, that is, ADDIE, to develop a web-based module that would be considered valid and effective, and (b) to use the DBR methodology to create relevant outcomes based on ISD theories for practitioners in the field of IT and to add to the body of IT research. The outcomes of the study provided evidence that using a systematic approach such as ADDIE to develop a valid and effective interactive web-based module was still viable. Additionally, although the outcomes from this study did not form a basis to propose a new ISD model, they highlighted five key activities that could be added to the ADDIE process to accommodate development of a quality interactive web-based product. The five activities are as follows:

1. To conduct a detailed front-end analysis.
2. To develop a prototype early in the process.
3. To integrate formative and summative evaluations.
4. To assimilate iterations of design-evaluate-refine cycles throughout the process.
5. To accommodate flexibility within the process.

Furthermore, using the DBR methodology yielded results that added to the body of IT research. Moreover, it provided support for the use of this methodology within the instructional technology discipline. The list of Lessons Learned presented in this study was one example of the usefulness of the DBR methodology for practitioners within the IT discipline. Many instructional designers seek guidance and relevant information with strong theoretical support. Results from DBR studies appear to meet that need.
References

@Peer Group (2006). The design-based research EPSS: A peer tutorial for design-based research. What are the challenges of doing DBR? Retrieved April 27, 2007, from http://projects.coe.uga.edu/dbr/enact03.htm

Allen, I. E., & Seaman, J. (2007). Online nation: Five years of growth in online learning. The Sloan Consortium. Retrieved November 7, 2007, from http://www.sloan-c.org/publications/survey/pdf/online_nation.pdf

Allen, M. W. (2006). Creating successful e-learning: A rapid system for getting it right first time, every time. San Francisco, CA: Pfeiffer: An Imprint of Wiley.

Allen, M. W. (2003). Michael Allen's guide to e-learning: Building interactive, fun and effective learning programs for any company. Hoboken, NJ: John Wiley & Sons.

Association for Information Systems (AIS). (2006). Design research in information systems. Retrieved March 14, 2006, from http://www.isworld.org/Researchdesign/drisISworld.htm

Banathy, B. H. (1968). Instructional systems. Belmont, CA: Fearon.

Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational Researcher, 32(1), 21-24.
Barab, S., Arici, A., & Jackson, C. (2005). Eat your vegetables and do your homework: A design-based investigation of enjoyment and meaning in learning. Educational Technology, 45(1), 15-21.

Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. The Journal of the Learning Sciences, 13(1), 1-14.

Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist, 39(4), 243-253.

Bereiter, C. (2002). Design research for sustained innovation. Cognitive Studies, Bulletin of the Japanese Cognitive Society, 9(3), 321-327.

Bichelmeyer, B. Instructional theory and instructional design theory: What's the difference and why should we care? IDT Record. Retrieved September 5, 2007, from http://www.indiana.edu/~idt/articles/documents/ID_theory.Bichelmeyer.html

Bichelmeyer, B. (2005). A metaphor for the lack of clarity in the field of IDT. AECT Futures Group Presentations, IDT Record, 2005.

Bitpipe.com (2007). E-learning. Retrieved September 16, 2007, from http://www.bitpipe.com/tlist/eLearning.html

Bogdan, R. C., & Biklen, S. K. (1992). Qualitative research for education: An introduction to theory and methods. Needham Heights, MA: Allyn and Bacon.

Bolliger, D. U., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61-67.

Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions. Journal of the Learning Sciences, 2(2), 141-178.
Brown, G., & Wack, M. (1999). The difference frenzy and matching buckshot with buckshot. The Technology Source, May/June. Retrieved August 24, 2006, from http://ts.mivu.org/default.asp?show=article&id=1034

Bruning, R. H., Schraw, G. J., Norby, M. M., & Ronning, R. R. (2004). Cognitive psychology and instruction. Upper Saddle River, NJ: Pearson Education, Inc.

Carroll, J. B. (1989). The Carroll model: A 25-year retrospective and prospective view. Educational Researcher, 18(1), 26-31.

Carroll, J. B. (1973). Basic and applied research in education: Definitions, distinctions, and implications. In H. S. Broudy, R. H. Ennis, & L. I. Krimerman (Eds.), Philosophy of educational research (pp. 108-121). New York: John Wiley & Sons.

Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64, 723-733.

Champion, R. (1999). Absolutely alarming: Even I am shocked by the amount of preparation time. Journal of Staff Development, 20. Retrieved March 24, 2009, from http://www.nsdc.org/news/jsd/champion204.cfm

Chao, T., Saj, T., & Tessier, F. (2006). Establishing a quality review for online courses: A formal review of online courses measures their quality in key areas and reveals changes needed for improvement, if any. Educause Quarterly, 3, 32-39.

Clark, R. E. (1994). Media will never influence learning. Educational Technology Research and Development, 42(2), 21-29.

Clark, R. E. (1991). When researchers swim upstream: Reflections on an unpopular argument about learning from media. Educational Technology, 31(2), 34-40.
Clark, R. E. (1985). Evidence for confounding in computer-based instruction studies: Analyzing the meta-analyses. Educational Communications and Technology Journal, 33(4), 249-262.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.

Cohen, L., Manion, L., & Morrison, K. (2000). Research methods in education (5th ed.). New York, NY: RoutledgeFalmer.

Collins, A. (1999). The changing infrastructure of education research. In E. C. Lagemann & L. S. Shulmann (Eds.), Issues in education research: Problems and possibilities (pp. 289-298). San Francisco, CA: Jossey-Bass.

Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O'Shea (Eds.), New directions in educational technology (pp. 15-22). Berlin: Springer-Verlag.

Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. The Journal of the Learning Sciences, 13(1), 15-42.

Cox, S., & Osguthorpe, R. T. (2003). How do instructional design professionals spend their time? TechTrends, 47(3), 45-47.

Cukras, G. G. (2006). The investigation of study strategies that maximize learning for underprepared students. College Teaching, 54(1), 194-197.
Dawson, K., & Ferdig, R. E. (2006). Commentary: Expanding notions of acceptable research evidence in educational technology: A response to Schrum et al. Contemporary Issues in Technology and Teacher Education, 6(1), 133-142.

Dede, C. (2005). Commentary: The growing utilization of design-based research. Contemporary Issues in Technology and Teacher Education, 5(3/4), 345-348.

Dempsey, J. V., & Van Eck, R. N. (2002). Instructional design on-line: Evolving expectations. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 21-25). Upper Saddle River, NJ: Pearson Education, Inc.

Design-Based Research Collective (DBRC) (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5-8.

Design-Based Research Collective (n.d.). What is design research? Retrieved September 16, 2006, from http://www.designbasedresearch.org/dbr.html

Dick, W., Carey, L., & Carey, J. O. (2005). The systematic design of instruction (6th ed.). Glenview, IL: Harper Collins.

Dick, W., & Carey, L. (1996). The systematic design of instruction (4th ed.). Glenview, IL: Harper Collins.

Dick, W., & Carey, L. (1990). The systematic design of instruction (3rd ed.). Glenview, IL: Harper Collins.

Distance-Educator.com. (2003). Exclusive interview with Dr. Thomas C. Reeves. Interviewed by Farhad Saba, Ph.D., CEO, Distance-Educator.com. Retrieved September 16, 2007, from http://www.coe.uga.edu/coenews/2003/ReevesQ&A.html
Driscoll, M. P. (1994). Psychology of learning for instruction. Needham Heights, MA: Allyn & Bacon.

Durham, G. (2007). Teaching test-taking skills: Proven techniques to boost your student's scores. Plymouth, UK: Rowman & Littlefield Education.

Duin, A. H. (1998). The culture of distance education: Implementing an online graduate level course in audience analysis. Technical Communication Quarterly, 7(4), 365-388.

Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105-121.

eSchool News staff & wire service reports. (2007, May 1). Ed study slams software efficacy. eSchool News, 20.

Frederickson, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with online courses: Principles and examples from the SUNY Learning Network. Journal of Asynchronous Learning Networks, 4(2).

Gagné, R. M. (1965). The conditions of learning (2nd ed.). New York: Holt, Rinehart and Winston.

Gagné, R. M. (1977). The conditions of learning (3rd ed.). New York: Holt, Rinehart and Winston.

Gagné, R. M. (1984). Learning outcomes and their effects: Useful categories of human performance. American Psychologist, 27(4), 377-385.

Gagné, R. M., Wager, W., & Rojas, A. (1981). Planning and authoring computer-assisted instructional lessons. Educational Technology, 17-26.
Gagné, R. M., Briggs, L. J., & Wager, W. W. (1988). Principles of instructional design (3rd ed.). New York, NY: Holt, Rinehart and Winston, Inc.

Gentile, J. R. (1997). Educational psychology (2nd ed.). Dubuque, IA: Kendall/Hunt Publishing Company.

Gordon, J., & Zemke, R. (2000). ISD under attack. Training, 37(4), 42-53.

Greer, M. (1992). ID project management: Tools and techniques for instructional designers and developers. Englewood Cliffs, NJ: Educational Technology Publications, Inc.

Gredler, M. E. (2001). Learning and instruction: Theory into practice (4th ed.). Upper Saddle River, NJ: Prentice Hall.

Gustafson, K. L., & Branch, R. M. (2002). What is instructional design? In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 16-25). Upper Saddle River, NJ: Pearson Education, Inc.

Hoadley, C. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Computer support for collaborative learning (pp. 453-462). Mahwah, NJ: Lawrence Erlbaum Associates.

Hong, K. (2002). Relationships between students' and instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education, 5, 267-281.

Janicki, T., & Liegle, J. O. (2001). Development and evaluation of a framework for creating web-based learning modules: A pedagogical and systems perspective. Journal of Asynchronous Learning Networks, 5(1), 58-84. A Sloan-C publication.
204 Johnson, B., & Christensen, L. (2004). Educational research: Quantitative, qualitative, and mixed approaches (2 nd ed.). Boston, MA: Pearson Education, Inc. Joseph, D. (2004). The practice of design based research:Uncovering the interplay between de sign, research, and the real world context. Educational Psychologist 39 (4): 235 242. Lawrence Erlbaum Associates, Inc. Kim, K., & Moore, J. L. (2005, November 7). Web based learning: Factors affecting ction and learning experience. First Monday, 10 (11). Retrieved December 3, 2007, from http://firstmonday.org/issues/issue10_11/kim/index.html Khan, B. (1997). Web Based i nstruction. New Jersey: Educational Technologie s Publishing. Kemp, J.E., Morrison, G.R., & Ross, S.M. (1994). Designing effective instruction New York, NY: Macmillan College Publishing Company. Kilby, T. (2008). What constitutes quality in web based training? WBTIC Retrieved November 25 th 2008, fr om http://www.webbasedtraining.com/primer_quality.aspx Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development 42 (2): 7 19. Kozma, R. 1991. Learning with media. Review of Educational Research 61 (2): 179 211. Lee, W. W., & Owens, D. L. (2004). Multimedia based instructional design: Computer based training, web based training, distance broadcast training, performance based s olutions (2 nd ed.). San Francisco, CA : Pfeiffer. credibility gaps? Issues in Education 5 177 229.


Levine, A., & Sun, J. C. (2003). Barriers to distance education. EDUCAUSE, 2, 1-23. In Series summary, Distributed education: Summary of a six-part series. American Council on Education Center for Policy Analysis (ACE).
Mager, R. F. (1997). Goal analysis (3rd ed.). Atlanta, GA: CEP Press.
Mager, R. F. (1984). Preparing instructional objectives (2nd ed.). Belmont, CA: Pitman.
Mayer, R. E. (2003). Theories of learning and their application to technology. In H. F. O'Neil & R. S. Perez (Eds.), Technology applications in education: A learning view (pp. 127-157). Mahwah, NJ: Lawrence Erlbaum Associates.
Magliaro, S. G., & Shambaugh, N. (2006). Student models of instructional design. ETR&D, 54(1), 83-106.
Mariasingam, M. A., & Hanna, D. E. (2006). Benchmarking quality in online degree programs: Status and prospects. Online Journal of Distance Learning Administration, 9(3). Retrieved November 24, 2008, from http://74.125.45.104/search?q=cache:CepIK0mG0EUJ:www.westga.edu/~distance/ojdla/fall93/mariasingam93.htm+concern+about+quality+of+web+based+courses&hl=en&ct=clnk&cd=10&gl=us
McGriff, S. J. (2001). ISD knowledge base / Instructional design & development / Instructional systems design models. Portfolio of Steven J. McGriff. Retrieved August 5, 2007, from http://www.personal.psu.edu/faculty/s/j/sjm256/portfolio/kbase/IDD/ISDModels.html#addie
Mehlenbacher, B. (2002). Assessing the usability of online instructional materials. In R. S. Anderson, J. F. Bauer, & B. W. Speck (Eds.), Assessment strategies for the online class: From theory to practice (New Directions for Teaching and Learning, No. 91, pp. 91-98). San Francisco, CA: Jossey-Bass.


Miles, M. B., & Huberman, A. M. (1984). Qualitative data analysis. Beverly Hills, CA: Sage Publications.
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage Publications.
Molenda, M. (2003). In search of the elusive ADDIE model. Performance Improvement, May/June.
Moore, M. G. (2002). Editorial: What does research say about the learners using computer-mediated communication in distance learning? The American Journal of Distance Education, 16(2), 61-64.
Moore, M. G. (1989). Recruiting and retaining adult students in distance education. New Directions for Continuing Education, 47, 69-98.
National Association of State Boards of Education (NASBE). (2001). Any time, any place, any path, any pace: Taking the lead on e-learning policy. Washington, VA: NASBE.
National Education Association (NEA). (2000). NEA and Blackboard Inc. study finds 24 measures of quality in Internet-based distance learning: "Quality On The Line" study released at Blackboard Summit. Retrieved from http://www.nea.org/nr/nr000321.html
Neuheuser, C. (2002). Learning style and effectiveness of online and face-to-face instruction. The American Journal of Distance Education, 16(2), 99-113.


Notess, M. (2004). Applying contextual design to educational software development. In A. M. Armstrong (Ed.), Instructional design in the real world: A view from the trenches. Information Science Publishing.
Northrup, P., Lee, R., & Burgess, V. (2002). Learner perceptions of online interactions. In ED-MEDIA 2002 World Conference on Educational Multimedia, Hypermedia & Telecommunications: Proceedings. Denver, CO: AACE. Retrieved March 12, 2008, from http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/1b/19/8d.pdf
O'Donnell, A. M. (2004). A commentary on design research. Educational Psychologist, 39(4), 255-260.
Parker, A. (2003). Identifying predictors of academic persistence in distance education. Journal of the United States Distance Learning Association, January, 55-61.
Phipps, R., & Merisotis, J. (1999). What's the difference? A review of contemporary research on the effectiveness of distance learning in higher education. Washington, DC: The Institute for Higher Education Policy. Retrieved March 13, 2007, from http://www.ihep.com/Pubs/PDF/Difference.pdf
Ramage, T. R. (2002). The "no significant difference" phenomenon: A literature review. e-Journal of Instructional Science and Technology, 5(1). Retrieved May 25, 2006, from http://www.usq.edu.au/electpub/e-jist/docs/html2002/ramage.html
Reeves, T. C. (1995). Questioning the questions of instructional technology research. Paper presented at the Proceedings of the 1995 Annual National Convention of the Association for Educational Communications and Technology (AECT), Anaheim, CA.


Reeves, T. C. (2000). Enhancing the worth of instructional technology research through "design experiments" and other development research strategies. Paper presented at the 21st annual meeting of the American Educational Research Association, New Orleans, LA.
Reeves, T. C., Herrington, J., & Oliver, R. (2005). Design research: A socially responsible approach to instructional technology research in higher education. Journal of Computing in Higher Education, 15(2), 97-116.
Reeves, T. C., & Hedberg, J. G. (2003). Interactive learning systems evaluation. Englewood Cliffs, NJ: Educational Technology Publications.
Reigeluth, C. M. (Ed.). (1983). Instructional-design theories and models: An overview of their current status. Hillsdale, NJ: Lawrence Erlbaum Associates.
Reigeluth, C. M. (Ed.). (1999). Instructional-design theories and models: A new paradigm of instructional theory (Vol. 2). Hillsdale, NJ: Lawrence Erlbaum Associates.
Reigeluth, C. M. (2003). Clearing the muddy waters: A response to Barbara Bichelmeyer. IDT Record. Retrieved September 5, 2007, from http://www.indiana.edu/%7Eidt/articles/documents/Reigeluth_response_to_Bichelmeyer.htm
Reiser, R. A. (2002). A history of instructional design and technology. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 16-25). Upper Saddle River, NJ: Pearson Education.


Relan, A., & Gillani, B. B. (1997). Web-based information and the traditional classroom: Similarities and differences. In B. H. Khan (Ed.), Web-based instruction. Englewood Cliffs, NJ: Educational Technology Publications.
Resnick, L. B. (1999). Making America smarter. Education Week, pp. 38-40.
Roblyer, M. D. (2005). Educational technology research that makes a difference: Series introduction. Contemporary Issues in Technology and Teacher Education [Online serial], 5(2). Retrieved November 29, 2005, from http://www.citejournal.org/vol5/iss2/seminal/article1.cfm
Roblyer, M. D., & Wiencke, W. R. (2003). Design and use of a rubric to assess and encourage interactive qualities in distance courses. The American Journal of Distance Education, 17(2), 77-98.
Rothwell, W. J., & Kazanas, H. C. (2004). Mastering the instructional design process: A systematic approach (3rd ed.). Hoboken, NJ: John Wiley and Sons.
Sandoval, W. A. (2004). Developing learning theory by refining conjectures embodied in educational designs. Educational Psychologist, 39(4), 213-223.
Sandoval, W. A., & Bell, P. (2004). Design-based research methods for studying learning in context: Introduction. Educational Psychologist, 39(4), 199-201.
Sarnacki, R. E. (1979). An examination of test-wiseness in the cognitive test domain. Review of Educational Research, 49(2), 252-279.


Scafati, A. A. (1998). A case study for the systems approach for developing curricula: "Don't throw out the baby with the bath water," instructional systems design. Acquisition Review Quarterly, Fall. Defense Acquisition University Press.
Schrum, L., Thompson, A., Sprague, D., Maddux, C., McAnear, A., Bell, L., & Bull, G. (2005). Advancing the field: Considering acceptable evidence in educational technology research. Contemporary Issues in Technology and Teacher Education [Online serial], 5(3/4). Retrieved September 5, 2007, from http://www.citejournal.org/vol5/iss3/editorial/article1.cfm
Scruggs, T. E., & Mastropieri, M. A. (1992). Teaching test taking skills: Helping students show what they know. In M. Pressley (Series Ed.), A volume in the series on Cognitive Strategy Instruction. Purdue University: Brookline Books.
Seels, B., & Glasgow, Z. (1998). Making instructional design decisions (2nd ed.). Upper Saddle River, NJ: Prentice Hall.
Seeto, D., & Herrington, J. (2006). Design-based research and the learning designer. In L. Markauskaite, P. Goodyear, & P. Reimann (Eds.), Proceedings of the 23rd Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education: Who's learning? Whose technology? (pp. 741-745). Sydney: Sydney University Press.
Shaik, N., Lowe, S., & Pinegar, K. (2006). DL-sQUAL: A multiple-item scale for measuring service quality of online distance learning programs. Online Journal of Distance Learning Administration, 9(2) [Online serial]. University of West Georgia, Distance Education Center. Retrieved March 10, 2008, from http://www.westga.edu/~distance/ojdla/summer92/shaik92.htm


Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2003). Teaching and learning at a distance: Foundations of distance education (2nd ed.). Upper Saddle River, NJ: Prentice Hall.
Smith, P. L., & Ragan, T. J. (2005). Instructional design. Hoboken, NJ: John Wiley & Sons.
Stokes, S. P. (2001). Satisfaction of college students with the digital learning environment: Do learners' temperaments make a difference? Internet and Higher Education, 4, 31-44.
Swan, K. (2003). Learning effectiveness: What research tells us. In J. Bourne & J. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 13-45). Needham, MA: Sloan-C.
The Campus Computing Project. (2006). The 2006 national survey of information technology in U.S. higher education: Wireless networks reach half of college classrooms; IT security incidents decline this past year. Retrieved April 18, 2007, from www.campuscomputing.net
The Web-Based Education Commission. (2000). The power of the Internet for learning: Moving from promise to practice. Report of the Web-Based Education Commission to the Congress of the United States. Retrieved November 7, 2007, from http://www.ed.gov/offices/AC/WBEC/FinalReport/WBECReport.pdf
Trochim, W. M. K. (2001). Research methods knowledge base (2nd ed.). Atomic Dog Publishing.
University of South Florida (USF). (n.d.). Search-A-Bull. Retrieved November 7, 2007, from http://www.ugs.usf.edu/sab/sabs.cfm


Van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introducing educational design research. In J. van den Akker, K. Gravemeijer, S. McKenney, & N. Nieveen (Eds.), Educational design research (pp. 3-7). New York, NY: Routledge.
van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker, N. Nieveen, R. M. Branch, K. L. Gustafson, & T. Plomp (Eds.), Design methodology and developmental research in education and training (pp. 1-14). The Netherlands: Kluwer Academic Publishers.
Wang, F., & Hannafin, M. J. (2005). Design-based research and technology-enhanced learning environments. Educational Technology Research and Development, 53(4), 5-23.
Waits, T., & Lewis, L. (2003). Distance education at degree-granting postsecondary institutions: 2000-2001 (NCES 2003-017). U.S. Department of Education. Washington, DC: National Center for Education Statistics.
Webopedia.com. (2007). Did you know the difference between the Internet and the World Wide Web? Retrieved September 14, 2007, from http://www.webopedia.com/DidYouKnow/Internet/2002/Web_vs_Internet.asp
WhatIs.com. (2007). Shovelware. Retrieved November 25, 2007, from http://whatis.techtarget.com/definition/0,,sid9_gci212982,00.html
Zemke, R., & Rossett, A. (2002). A hard look at ISD. Training, 39(2), 26-35.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25(1), 3-17.


Bibliography

Cole, R. A. (Ed.). (2000). Issues in web-based pedagogy: A critical primer. Westport, CT: Greenwood Press.
Cooley, L., & Lewkowicz, J. (2003). Dissertation writing in practice: Turning ideas into text. Aberdeen, HK: Hong Kong University Press.
Ely, D. P., & Plomp, T. (Eds.). (2001). Classic writings on instructional technology (Vol. 2). Englewood, CO: Libraries Unlimited.
Ely, D. P., & Plomp, T. (Eds.). (1996). Classic writings on instructional technology (Vol. 1). Englewood, CO: Libraries Unlimited.
Flippo, R. F., & Caverly, D. C. (Eds.). Teaching reading & study strategies at the college level. Newark, DE: International Reading Association.
Gall, J. P., Gall, M. D., & Borg, W. R. (Eds.). (1999). Applying educational research: A practical guide (4th ed.). New York: Addison Wesley Longman.
Iannuzzi, P., Strichart, S. S., & Mangrum II, C. T. (1998). Teaching study skills and strategies in college. Needham Heights, MA: Allyn and Bacon.
Joseph, N. L. (1999). Research writing: Using traditional and electronic sources. Upper Saddle River, NJ: Prentice Hall.
Juwah, C. (Ed.). (2006). Interactions in online education: Implications for theory & practice. New York, NY: Routledge.


Kwan, R., & Fong, J. (Eds.). (2005). Web-b@sed learning: Technology and pedagogy. Proceedings of the 4th International Conference, 1-3 August, Hong Kong. Singapore: World Scientific Publishing.
Lechuga, V. M. (2006). The changing landscape of the academic profession: A culture of faculty at for-profit colleges and universities. New York, NY: Routledge.
Levy, Y. (2006). Assessing the value of e-learning systems. Hershey, PA: Information Science Publishing.
Magoulis, G. D., & Chen, S. Y. (Eds.). (2006). Advances in web-based education: Personalized learning environments. Hershey, PA: Information Science Publishing.
McNiff, J., & Whitehead, J. (2006). All you need to know about action research. Thousand Oaks, CA: SAGE Publications.
O'Neil, H. F., & Perez, R. S. (Eds.). (2003). Web-based learning: Theory, research, and practice. Mahwah, NJ: Lawrence Erlbaum Associates.
O'Neil, H. F., & Perez, R. S. (Eds.). (2003). Technology applications in education: A learning view. Mahwah, NJ: Lawrence Erlbaum Associates.
Porter, L. R. (2004). Developing an online curriculum: Technologies and techniques. Hershey, PA: Information Science Publishing.
Piskurich, G. W., Beckschi, P., & Hall, B. (Eds.). (2000). The ASTD handbook of training design and delivery: A comprehensive guide to creating and delivering training programs (instructor-led, computer-based, or self-directed). New York: McGraw-Hill.
Rabinowitz, M., Blumberg, F. C., & Everson, H. T. (Eds.). (2004). The design of instruction and evaluation: Affordances of using media and technology. Mahwah, NJ: Lawrence Erlbaum Associates.


Reeves, T. C., Herrington, J., & Oliver, R. (2004). A development research agenda for online collaborative learning. ETR&D, 52(4), 53-65.
Robins, K., & Webster, F. (Eds.). (2002). The virtual university? Knowledge, markets, and management. Oxford, UK: Oxford University Press.
Rogers, P. L. (2002). Designing instruction for technology-enhanced learning. Hershey, PA: Idea Group Publishing.
Sikora, A. C. (2002). A profile of participation in distance education: 1999-2000. Postsecondary education descriptive analysis report. Education Statistics Quarterly, 4(4).
Sorenson, E. S., & Murch, D. O. (Eds.). (2006). Enhancing learning through technology. Hershey, PA: Information Science Publishing.
State of Florida, Department of State. (2000, 2004). Retrieved November 14, 2008, from http://www.cpt.fsu.edu/ese/in/strmain.html
Tessmer, M. (1993). Planning and conducting formative evaluations: Improving the quality of education and training. London, UK: Kogan Page.
van den Akker, J., Gravemeijer, K., McKenney, S., & Nieveen, N. (Eds.). (2006). Educational design research. New York, NY: Routledge.


APPENDICES


Appendix A: Results: Pilot Study (Analysis Phase & Prototype)


Needs Analysis

1. Is the LEARN program seeking to enhance its current instruction method?
Answer: Yes.

2. Can you identify the need/s of the LEARN program?
Answer: We would like to implement an online/web-based course that has the possibility of reaching out to more students or to new student populations. The LEARN Program needs to seek new opportunities to diversify the current academic support services available to the students. Two specific areas that maintain the program's presence on campus are study skills workshops and credited courses in critical reading skills and learning strategies. Courses and workshops are currently delivered through face-to-face instructional methods.

3. What do you believe will address the need/s that currently exists?
Answer: Cognizant of the advancement of technology in instructional settings, and interested in broadening services to remote students, the LEARN Program director has decided to implement changes that will allow web-based delivery of at least one of the courses taught in the program. A consecutive step will be the design and development of stand-alone instructional modules that can be implemented as part of the course or taught separately as one of the workshops offered in the program. Online/web-based modules will help us to reach out to students not currently served.

4. Describe what changes (you) the administrators of the LEARN program would like to implement.
Answer: Currently some course material is online, and certain aspects of the course (e.g., discussions) occur online. However, none of the modules that are taught in class is accessible online. We would like to make these modules available online.

5. What need/needs are you trying to meet by implementing the proposed changes to the LEARN program now?
Answer:
1. Provides flexibility to access information and content
2. Decreases barriers to distance learning (e.g., helps commuter students)
3. Increases the possibility of reaching students in regional campuses

6. Why do you think the changes are important to implement now?


Answer: Maybe we are losing potential students that could be served if we had online accessibility. This is just an observation, and you should know that no formal needs analysis has been conducted and no statistics have been collected. However, we believe that this is an opportune time to move the modules online.

7. If any instructional changes are implemented, what benefits do you expect?
Answer:
1. Greater number of students will be served
2. Course will be more accessible
3. Flexibility of online modules will help develop workshops
4. Will help us to diversify instructional capabilities
5. Will help us to adapt to future changes (easier to integrate changes in online modules)

8. Have there been any other instructional interventions in the past? If so, describe what the intervention was and what the actual outcome versus the expected outcome was.
Answer: No.


Audience Analysis (For SME)

1. What is the age range of the learners?
Answer: Between 18 and 25.

2. What are the general computer capabilities of the learners?
Answer: In general, most of our students can be considered computer literate. They possibly use computers on a daily basis and are comfortable using them.

3. In particular, are they comfortable using an email application?
Answer: Yes.

4. Can the learners attach and send files via email?
Answer: Yes.

5. Do all the learners have access to computers? (Either on campus, at home, or in class)
Answer: Yes.

6. What does the learner want to learn?
Answer: Our classes are not content based but skill based. Therefore, we use their own interest in any subject matter to make our teaching and their learning relevant. For instance, if the topic to discuss is lecture note taking in a procedural class, students need to come up with one of the classes they are taking that meets the definition of procedural. Based on that class, we will work with that subject matter to facilitate the strategy to be learned. Sometimes the learner is in class because it is a requirement, for example because they may be on academic probation. However, some learners want to enhance their learning strategies and learning techniques. Some may want to understand how they learn so they can raise their GPAs.

7. What does the learner already know about the subject matter?
Answer: They know a little about time management and have general knowledge of note taking and how to read a textbook. This general knowledge is also discussed in class.


8. How motivated is the learner?
Answer: Generally speaking, they are all motivated. Yet, some students take the class only to get a couple of extra credits they need to fulfill administrative requirements. So it sometimes depends on why they are taking the class (the highly motivated tend to be better students).

9. How much time is the learner willing to spend studying?
Answer: No more than the time they need to complete assignments; the class does not require more than 2 hours' worth of work per week. Requirements are considered "light." Since the class is not content based, there are no tests. Hence, student progress is evaluated based on the successful completion of assignments that reinforce strategies discussed in class.

10. What does the learner think of distance learning classes?
Answer: Although I could not say for certain, I think they are comfortable with the idea. Currently, all our classes have an asynchronous discussion board where students independently react to readings. Overall, students enjoy this activity. However, I could not answer for sure how they would like a complete online interaction.

11. What is the reading level (in English) of the learners?
Answer: Comprehension is 8th grade and above. Average rate of reading is 250 words per minute. These are national averages for this level.

12. How do learners apply the knowledge from this class?
Answer: By incorporating the learning/study strategies and metacognitive awareness into the work they do for their other classes. They go back to their classes and change some of their study practices. At least, I hope so. We address their study concerns in class, and students write a reflection paper at the end of my class. In this reflection paper, the students explain where and how they implemented learning concepts and systems.


Task Analysis

Topic Analysis (for SME)

1. What is the information that needs to be taught?

2. How is this information presented to the learners (lecture, homework, reading assignments, class discussions, etc.)?
Answer: test items and the different types of tests.

3. Is there any pre-requisite knowledge required for this course?
Answer: Nothing beyond the baseline knowledge of a college freshman.

4. Is there any pre-requisite knowledge required before learning this information?
Answer: Nothing beyond the baseline knowledge of a college freshman.

5. Is there any pre-requisite skill(s) required?
Answer: Nothing beyond the baseline skills of a college freshman.


Content Analysis

1. What procedures/skills are presented to the learners?
Answer: In general, they are introduced to a number of concepts:
1. Intentional learning
2. Knowledge of self-regulation
3. Memory and concentration
4. Autonomy and motivation
5. Objective and subjective test taking strategies
6. Time management
Subjective tests; objective tests; levels of intellectual performance; levels for test questions; essay tests vs. multiple choice/true-false/mix-match, etc.

2. Describe in detail the sequence of steps in which the procedure/skill (procedural knowledge) is presented to the learners.
For the Objective tests:
a. Learners get a sample of an objective test (a mock test with multiple choice, true/false, etc. type questions)
b. Students take the test
c. We then engage in a discussion on the level of difficulty of the test
d. We talk about strategies they used to overcome the difficulty they experienced
e. Most of the time, the students tend to choose the correct strategy
f. Sometimes they do not choose the correct strategies. I cover them either way, just to help them understand the strategies better.
g. I teach and point out keywords that they can use.

For Subjective (essay type) tests:
a. I use PowerPoint slides and SmartBoard to provide prompts for students
b. Students read the prompt and attempt to answer the question (they have 10 minutes to write an answer)


c. We follow this with a discussion as to why they found the question difficult
d.
e. I discuss Levels of Knowledge, which is similar to Bloom's Taxonomy
f.
g. How to structure paragraphs
h. I also cover test anxiety

3. Describe in detail how you would like the procedure/skill to be taught in an online environment.
Answer: I would:
a. Like it to be highly interactive
b. For example: have a test bank for objective tests, randomize questions, have easy to hard questions, etc.
c. Audio, video if it is pertinent and will add value to the learning
d. Scenarios presented to learners
e. Fun approach, fresh voice (no Mom and Dad voiceovers)

4. What conceptual facts/rules/principles (declarative knowledge) are presented to the learners?
a. Levels of knowledge
b. Levels of intellectual ability
c. Characteristics of objective and subjective tests
d. Commonly used test taking strategies
e. General rules of test taking strategies

5. What attitudes and values (affective knowledge) are presented to the learners?
Answer: I cover a lot, for example:
a. Being responsible for studying
b. Being self-regulated
c. Being self-motivated
d. Being committed to their studies


6. Describe in detail how you would like the attitudes and values to be taught in an online environment.
Answer: I expect that the online version should have a FUN approach and animation.


Context Analysis
Environment for: Learner Characteristics, Instructional Setting, Organizational Support

Planning
What behaviors, prior knowledge, ability, and attitudes (e.g., towards content, delivery, and the organization) will the learner bring to the situation?
Answer: Previously answered.
What constraints exist that will affect this online intervention?
Answer: Limits human interaction (face to face), but I do not think it will affect the learning outcomes. There may be financial limitations.
What resources will affect the selection and preparation of this online intervention? What resources will be available for planning and development?
What purpose will the online intervention serve for the organization?
Answer: Previously answered.

Learning
What are the characteristics of the learners, and how will they affect individual learning?
Answer: Students are responsible to know and understand their own learning style.
Are individual learning preferences met?
Answer: Students are responsible to know and understand their own learning style. We can equip them with knowledge to self-regulate.
What characteristics of the social and physical setting affect learning?
Answer: Not applicable.
Are the instructors well versed on the subject matter?
Answer: Yes.
How will instruction be monitored?
Answer: Assignments, reflection papers, discussions.
How will its relevance be established?
Answer: Already established in the traditional class. Not an issue.

Performance
What support is needed?
Answer: Website, on-site IT support.
What social and physical constraints can hamper use of the new learning or skills?
Answer: Lack of access to PCs.
How can they be eliminated?
Answer: Open-use labs on campus.
How will diffusion (adoption and maintenance) of the learning be encouraged?
Answer: Through in-class instruction. Instructors will encourage learners to use the online module.

Source: Seels & Glasgow, 1998


227 Specimen A 1 Analysis Phase: Instrument and s umma ry of results from t questionnaire n = 10 A. STUDENT INFORMATION Item Category % in Category 1. I am a Full time 90 Part time 10 B. COMPUTER USAGE 2. I feel comfortable using computers to aid in my studies. Strongly Agree 7 0 Agree 30 3. I regularly use the computer to read emails and send information. Strongly Agree 60 Agree 40 4. I am comfortable attaching files to emails to send to my instructors family Strongly Agree 60 Agree 30 Disagree 10 5. I am comfortable using the computer to do real time chats or online discussions. Strongly Agree 50 Agree 20 Disagree 10 Strongly Disagree 20 6. I have access to the Internet all the time. Strongly Agree 60 Agree 30 Disagree 10 C. ONLINE/DISTANCE LEARNING COURSE INFORMATION 7. I have taken a distance learning course in the past. Yes 60 No 40 8. I am currently enrolled in a distance learning course. Yes 20 No 80 9. I think distance learning courses are easy in comparison to traditional (or instructor led) courses. Agree 60 Disagree 40 10. I think distance learning courses are difficult in comparison to traditional (or instructor led) courses. Agree 20 Disagree 80 11. If this course (Learning Strategies) was online, I would take the online course inste ad of the traditional (instructor led classroom) course. Strongly Agree 30 Agree 50 Disagree 20 12. I am taking this course because it was recommended to me by my advisor. Strongly Agree 30 Agree 10 Disagree 30 Strongly Disagree 30 13. I am tak ing this course because I want to improve my learning strategies and skills. Strongly Agree 50 Agree 30 Disagree 10 Strongly Disagree 10 14. I am a motivated learner. Strongly Agree 10


228 Agree 70 Disagree 20 15. I already know a lot about the sub ject matter covered in this Learning Strategies course. Strongly Agree 20 Agree 50 Disagree 30 16. I know very little about the subject matter covered in this Learning Strategies course. Strongly Agree 10 Agree 20 Disagree 50 Strongly Disagree 2 0 17. I am willing to spend 1 or more hours per week studying the information from my Learning Strategies course. Strongly Agree 20 Agree 60 Disagree 20 18. I will apply the knowledge gained from this class to help improve my grades in my other class es. Strongly Agree 50 Agree 50 19. I heard about this course from my friend. Strongly Agree 10 Agree 10 Disagree 30 Strongly Disagree 50 20. I heard about this course from my advisor. Strongly Agree 20 Agree 30 Disagree 30 Strongly Disagree 20 21. I saw this course advertised in the course schedule/flyer/department. Strongly Agree 40 Agree 30 Disagree 10 Strongly Disagree 20 D. YOUR OPINION IS IMPORTANT TO US. YOU CAN HELP US CREATE AN EFFECTIVE ONLINE COURSE. WE WOULD APPRECIATE I T IF YOU TAKE A MOMENT TO SHARE YOUR OPINIONS WITH US. 22. If you had the opportunity to create an online version of your Learning Strategies course, what are some of the features would you include(e.g. would you include animation, more online discussion interactions, scenarios etc.)? Remember you want to keep it educationally Open ended Response 1. Q uizzes to build understanding surrounding our learning types. More examples of our l earning types and ways we can suc c eed in numerous courses. Response 2. I think tha t PowerPoints should be availa ble with each chapter. Online discussions should be there, but not interactions. I think we should have other discussions besides just the "Hop e" discussions. Response 3. I would keep the same model of the class as the traditional class but I would just instead of the discussion involve more group problem s o lving. Response 4. I would use the PowerP oint presentations a lot because they give a lot of useful information. Most of the learning material could be taught by PowerP oint Response 5. I think that you should have more online discussion assignments. Response 6. Weekly live meetings to introduce material. Online discuss ion and assignment c ould all be done via the web course. Response 7. I personally don't have a problem coming to class every week, and prefer that way better. If it were online though, I would say add more s c enarios and di s cussions. Discussions help everyone understand every one has problems too and they can help each other out. Animation might be nice to add, for the visual learners.


229 Note: A 4 point Likert s cale is used: Strongly Agree/Agree/Disagree/Strongly Disagree Response 8. No animations that's just silly. The graphs shown in class that used our (the students) in put was a very good use of technology and something along those lines, semi interactive, would be a good addition to an on line version of this class. Realtime discus s ions don't seem to work to well in some of the other online classes that I've take. A select few students are the ones asking the questions and voicing their opinions. The "Hope" discu s sion board I thought was very good. The way Ms. Ruiz set it up allowed for the comfortable exchange of ideas by all s t udents. MORE; DISCUSSION BOARDS, weekly postings Response 9. If I had the choice to creat e this as an online course, I would make more online discussions to find out what the students are thinking. I would also make discussions for the students to interact with each other. I would have role playing activities so the students can ask e ach other questions and answer the questions. I would not have timed tests so the students can feel pressured and fail, but I would have practice time tests so the students can practice, and enhance their timed test taking skills.
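The "% in Category" figures reported in Specimen A-1 (and in the Appendix B specimens that follow) are simple frequency counts expressed as percentages of the respondents for that instrument; with n = 10 here, each respondent contributes 10 percentage points. As a rough illustration only, and not part of the original study's tooling, a short script along the following lines could reproduce such summaries from raw Likert responses. The function name and sample data are hypothetical.

```python
from collections import Counter

# Hypothetical raw responses for one questionnaire item on the 4-point scale
# used in this study: Strongly Agree / Agree / Disagree / Strongly Disagree.
responses = [
    "Strongly Agree", "Strongly Agree", "Agree", "Strongly Agree", "Agree",
    "Strongly Agree", "Strongly Agree", "Agree", "Strongly Agree", "Strongly Agree",
]

def percent_in_category(answers):
    """Return each category's share of respondents, rounded to whole percentages."""
    counts = Counter(answers)
    n = len(answers)
    return {category: round(100 * count / n) for category, count in counts.items()}

print(percent_in_category(responses))
# With the sample data above (n = 10) this prints {'Strongly Agree': 70, 'Agree': 30},
# matching the 70/30 split reported for item 2 of Specimen A-1.
```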


Figure A-1. Screen shot #1 of prototype of web-based module


Figure A-2. Screen shot #2 of prototype of web-based module


Appendix B: Results: Design through Evaluation Phases of ADDIE


Specimen B-1
Design Phase: Summary of design information and its sources (n = 3)

Design Information (Source: SME)
1) No test bank required for the module. Preferably a generic test of about 10 to 15 questions should be developed.
2) The module should last no more than 15 to 20 minutes (no more than half an hour online).
3) The module should have information on: (a) how to prep for a test, (b) should present the questions, (c) ask the students to answer the question, and (d) highlight different parts of the questions.
4) For each module, there are about 6-8 strategies per module.
5) The first module to be developed should be Objective Test Taking Strategies.
6) May need to store answers and score the person. This way they can get immediate feedback.
7) The module should contain animation; it should not be boring. Avoid boring.
8) Audience: all high school graduate students.

Delivery Information
9) Should it be web based (as opposed to Internet)? Web based was decided.
10) Multimedia containing audio as well as text
11) Broadband
12) Link from the SVC site
13) Maybe Authorware 6.0 was the best solution; however, everyone was concerned about scalability/compatibility/flexibility. (Source: Programmer 1 and Instructional Designer/Researcher)
14) The server can support Dreamweaver/Flash. (Source: SME)
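Design item 6 above calls for storing the learner's answers and scoring them so that feedback can be given immediately. The module itself was built with Captivate/Flash (see Specimens B-2 and B-5), so the following is only a generic sketch of that requirement, with hypothetical question data and function names rather than code from the study.

```python
# Hypothetical sketch of design item 6: store the learner's answers, score the
# test, and return immediate feedback for each question.
QUESTIONS = [
    {"id": 1, "answer": "B", "feedback": "Watch for absolute words such as 'always' or 'never'."},
    {"id": 2, "answer": "D", "feedback": "Grammar clues can rule out choices that do not fit the stem."},
]

def score_test(responses):
    """responses maps question id -> chosen option; returns a score plus per-question feedback."""
    details = []
    correct = 0
    for q in QUESTIONS:
        is_correct = responses.get(q["id"]) == q["answer"]
        correct += is_correct
        details.append({"id": q["id"], "correct": is_correct, "feedback": q["feedback"]})
    return {"score": correct, "out_of": len(QUESTIONS), "details": details}

# The learner sees the score and the strategy feedback right after submitting.
print(score_test({1: "B", 2: "A"}))
```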


234 Specimen B 2 Development Phase: Instrument and s ummary of results from the Evaluate Usability of Module questionnaire n=1 Section 1: Materials Development Questions Category % in Categories 1. Most or all of the content for t he web based module was based on information provided by the Subject Matter Expert/s (SME/s). Strongly Agree 100 2. The choice of content to be used for instructional development was made by (choose all that apply): SME 100 3. The organization of the co ntent for the instruction was influenced by (choose all that apply): SME 100 4. The terminology and wording of the content is based on what is currently being used in the traditional classroom. Strongly Agree 100 5. The terminology and wording of the con tent is based on material that has a theoretical foundation. Strongly Agree 100 6. The terminology and wording of the content is familiar to the target audience. Strongly Agree 100 7. Do you believe that the readability (i.e. reading level) of the conten t is appropriate for web based delivery? Yes 100 8. Do you believe that the quality of the content is appropriate for web based delivery? Yes 100 9. The text/audio is written in an active voice. Agree 100 10. Highlighting and other animation techniques are used appropriately to bring attention to key phrases and words. Agree 100 11. Did moving the traditional content matter to a web based format require changes? Yes 100 If you responded "Yes" to the above question please go to Question 12 else if you r esponded "No" please go to Question 13. 12. Please state some of the changes that occurred when moving the content from a traditional format to a web based format. (For example, for the learners to understand the concepts in a web based format, were there instructional strategies such as games, animation etc. used?) Open ended 100 Response 1. The change required the use of a theme and animations to keep students engaged and interested. It also required a narrator to provide explanations that were needed. 13. Moving the traditional content to a web based format required changes in how the information was sequenced (whether linear, branching etc.). Agree 100 14. The content in the web based module will produce the same learning outcomes as the traditional format for the learner. Strongly Agree 100 15. Learners can easily understand the content presented in the module. Strongly Agree 100 Section II: Evaluation of the Web Based Module A. Accessibility 16. The module is accessible from my browser. Strongly Agree 100 17. The module executed without technical delays. Disagree 100


235 18. All the links that I clicked within the module worked on my browser. Agree 100 B. Design Elements 19. The module design is simple and uncluttered. Strongly Agree 100 20. The graphics and colors employed are aesthetically pleasing. Strongly Agree 100 21. The fonts and colors promote legibility within the module. Strongly Agree 100 22. All links, icons and navigation buttons work as expected. Agree 100 23. The information is structured in a meaningful manner in the module and facilitates learning. Strongly Agree 100 24. The directions given for the user to follow are easy to understand. Strongly Agree 100 25. The ends of sections within the module are clearly understood. Agr ee 100 26. There is tracking information available to the users so they can see where they are within the modules at all times. Agree 100 27. The question examples used in the module facilitate learning. Strongly Agree 100 28. The feedback used in the m odule facilitates learning. Strongly Agree 100 29. Feedback messages appear in a consistent layout on each page. Strongly Agree 100 30. Help messages appear in a consistent layout on each page. Strongly Agree 100 31. Error messages appear in a consisten t layout on each page. Strongly Agree 100 32. The explanations of concepts (i.e. strategies) used in the module facilitate learning. Strongly Agree 100 33. The way (i.e. linear or branching etc.) the content information is presented in the module facilit ates learning. Strongly Agree 100 34. The overall tone of the module is engaging. Strongly Agree 100 35. The module provides a suitable learning environment for all users. Strongly Agree 100 C. Graphics/Animations/Multimedia 36. There is consistency in layout of graphics, fonts, color, and positioning of icons. Agree 100 37. The various text animations (e.g. text highlight, text movement etc.) help to emphasize what learners should be learning. Agree 100 38. The audio provide useful information to enh ance learning. Strongly Agree 100 39. The interactions made the training interesting. Strongly Agree 100 D. Navigation 40. The user can navigate to various parts of the module as desired. Agree 100 41. The learner can navigate to the beginning of the m odule easily. Agree 100 42. Navigation is consistent within the module. Strongly Agree 100 43. State what you liked most about the module. Open ended 100 Response 1. Feedback is consistent and clear Tutorial is simple and well organized Students get a summary of their results after completing the test Good job emphasizing important concepts with visual and auditory clues


236 44. State what changes you would recommend to the module. Open ended 100 Response 1. Feedback section: Question 6 When explanati on is over, screen takes too long to clear Question 7 I moved the cursor and the screen disappeared Question 8 If cursor is not left on the nm square, the narration stops Question 9 e. This screen takes too long to fade out.


237 Specimen B 3 Development Phase: Instrument and s ummary of results from the Expert Review of Module questionnaire n=2 Section I. Accessibility Questions Category % in Category 1. The training module ran on my computer without any problems. Strongly Agree 50 Disagree 50 2. All the links within the module worked in my browser. Strongly Agree 50 Agree 50 3. I did not experience any technical delays while going through this training module. Strongly Agree 50 Disagree 50 Section II. Design Elements 4. The module design was simple and uncluttered. Strongly Agree 100 5. The organization of the training module was easy to follow. Strongly Agree 50 Agree 50 6. The directions given for the learner to foll ow were easy to understand. Strongly Agree 50 Agree 50 7. The ends of sections within the module were clearly understood. Agree 50 Disagree 50 8. The fonts and colors promoted legibility within the module. Strongly Agree 50 Agree 50 9. Feedback me ssages appeared in a consistent layout on each page. Strongly Agree 50 Agree 50 10. Help messages appeared in a consistent layout on each page. Agree 100 11. Error messages appeared in a consistent layout on each page. Strongly Agree 100 12. All links icons and navigation buttons worked as expected. Strongly Agree 50 Disagree 50 13. The overall tone of the module was engaging. Strongly Agree 100 Section III. Graphics/Animation/Multimedia 14. The graphics complemented the learning. Agree 100 15. Layout of graphics, fonts, font colors, font size, and positioning of icons were all consistent. Strongly Agree 50 Agree 50 16. The various text animations (e.g. text highlight, text movement etc.) helped the learner to focus on the learning objectives. Strongly Agree 50


238 Agree 50 17. The audio provided useful information and enhanced learning. Strongly Agree 50 Agree 50 18. The interactions made the training interesting. Strongly Agree 100 Section IV. Navigation 19. Within the module, navigation buttons were clearly marked. Agree 50 Disagree 50 20. I navigated to various parts of the module as desired. Agree 100 21. I could navigate to the beginning of the module easily. Strongly Agree 50 Disagree 50 22. Navigation was consistent within the module. Agree 100 23. Tracking information was available to the learners so they could see where they were within the modules at all times. Agree 50 Disagree 50 Section V. Training Module Content 24. I think that the question examples used in the mod ule made learning the concepts/learning strategies easier Agree 100 25. I think that the feedback given in the module will help the learner to understand the concept/learning strategies. Agree 100 26. The way the information is sequenced (all the questi ons first followed by feedback) in the module will help the learner to understand the material. Strongly Agree 50 Agree 50 27. The information provided was relevant information to the learner. Strongly Agree 50 Agree 50 Section VI. Your Opinion Matte rs 28. Please state what you liked most about this training module: Open ended 100 Response 1. It was very engaging. I enjoyed the motif of going on a jungle mission. The audio narration and graphics helped to carry this through and keep me interested in what was coming up next. Response 2. The simplicity and clarity of expectations 29. Please state one thing you would change in this module: Open ended 100 Response 1. Just an addition, maybe for a future version a PDF I could print out with the ti ps as a review sheet later and may be some other resources, websites I could go to for study/test taking skills. Response 2. The nav buttons at the bottom were unclear at first. I had to roll over the "forward" button to be sure I was on the right one. I made the assumption that it was "forward", but it might be helpful for the buttons to be labeled.


239 Specimen B 4 Development Phase: Instrument and s ummary of results from the Learners: Evaluate Usability of Module questionnaire n =7 Section I. Learner Background Questions Category % in Category 1. What is your gender? Male 29 Female 71 2. In college, I am a (choose one) Freshman 14 Sophomore 0 Junior 71 Senior 0 Other 14 3. What is your age? Open ended 100 4. I attend college (choose one) Full time 71 Part time 29 5. If given a choice between a traditional (i.e. classroom/face to face class) version of a course and a web based version, where both are available at convenient times, I would take the: Traditional (classroom version) 71 W eb based version 29 Section II. Accessibility 6. The training module runs on my computer without any problems. Strongly Agree 100 7. All the links within the module work in my browser. Strongly Agree 100 8. I did not experience any technical delays whi le going through this training module. Strongly Agree 71 Agree 14 Disagree 14 Section III. Design Elements 9. The module design is simple and uncluttered. Strongly Agree 86 Agree 14 10. The organization of the training module is easy to follow. Strongly Agree 71 Agree 29 11. The directions given for the learner to follow are easy to understand. Strongly Agree 86 Agree 14 12. The start of each section within the module is clearly understood. Strongly Agree 43 Agree 43 Disagree 14 13. Th e end of each section within the module is clearly understood. Strongly Agree 43 Agree 29 Disagree 14 Strongly Disagree 14 14. The fonts and colors promote legibility within the module. Strongly Agree 43 Agree 57 15. Feedback messages appear in a consistent layout on each page. Strongly Agree 57 Agree 29 No Answer 14 16. Help messages appear in a consistent layout on each page. Strongly Agree 57


240 Agree 29 No Answer 14 17. Error messages appear in a consistent layout on each page. Strongly Agree 57 Agree 29 Disagree 14 18. All links, icons and navigation buttons work as expected. Strongly Agree 57 Agree 43 19. The overall tone of the module is engaging. Strongly Agree 57 Agree 43 Section IV. Graphics/Animations/Multimedia 20. Th e graphics used in the module help to enhance my learning. Strongly Agree 29 Agree 43 Disagree 14 No Answer 14 21. Layout of graphics, fonts, font size, font color, and positioning of icons were all consistent. Strongly Agree 43 Agree 43 No Answ er 14 22. The various text animations (e.g. text highlight, text movement etc.) helped me to focus on what I should be learning. Strongly Agree 57 Agree 29 Disagree 14 23. The audio provides useful information and enhances my learning. Strongly Agree 86 Agree 14 24. The interactions make the training interesting. Strongly Agree 57 Agree 29 Disagree 14 Section V. Navigation 25. Within the module, navigation buttons are clearly marked. Strongly Agree 57 Agree 29 Disagree 14 26. I can navig ate to various parts of the module as desired. Strongly Agree 43 Agree 57 27. I can navigate to the beginning of the module easily. Strongly Agree 43 Agree 43 Disagree 14 28. Navigation is consistent within the module. Strongly Agree 43 Agree 57 29. There is tracking information available so that I can see where I am within the module at all times. Strongly Agree 57 Agree 29 Disagree 14 Section VI. Training Module Content 30. I think that the question examples used in the module made learni ng the concepts easy. Strongly Agree 57 Agree 43 31. The feedback given in the module helped me to learn. Strongly Agree 71 Agree 29 32. The way the information (all the questions first followed by feedback) is presented in the module helps me to lea rn. Strongly Agree 43 Agree 43 Disagree 14 33. I find the information in the module to be useful and relevant to me. Strongly Agree 43 Agree 57 Section VII. Your Opinion Matters 34. Please state what you like most about this training module: Open ended 100 Response 1. It was informative


241 Response 2. Interesting tips Response 3. The interaction kept me focused, which I really liked. Response 4. I really enjoyed the training module very much, and hope for the success of it because I would like to have this module as a course here at [name of school] Response 5. How t o break down the question that I didn't understand. Response 6. I liked the way that the module showed how to break down the questions to better help the students learn. I also like how they showed key words to look at to help decide which answer was best for me to choose. Response 7. What I liked most about this training module is how each unfamiliar word was explained in different parts which made finding the answer very easy. 35. Please state one thing you would change in this module: Open ended 100 Response 1. S ome things seem to be bad examples, like the last question. When an answer doesn t flow with the question it seems to be more of an error than a giveaway Response 2. For certain facts, checking "will" or "likely" is not very helpful. Response 3. T he guy that pops up on the screen is annoying. Response 4. Nothing. Response 5. There was no clear statement to let me know that I was finish Response 6. Some times, during t he ans wer session, the graphics would get distracting, instead of focusing on what was being taught to me, I started watching the falling letters. Response 7. The narrator's voice was a little boring at times.


Specimen B-5
Development Phase: List of refinements from formative evaluations

SME, Director of LEARN Program:
1. Number each question. Changed: YES. Comment: Numbered each question.
2. In the narration at the beginning: questions first; at the end you will find an explanation for each choice. Changed: YES. Comment: Updated narration.
3. In the introduction narration, add replacement wording for good preparation: "We are going to introduce you to several strategies you can use to aid in your test preparation. You cannot pass a test based only on these strategies but you can certainly..." Changed: YES. Comment: Updated narration.
4. In evaluation, find out whether... Changed: YES. Comment: Conduct summative evaluation.

SME, ID, Programmer 1: No refinement information (N/A).

SME:
5. Technical delays: investigate. Changed: YES. Comment: Found out that the Flash Player needed to be updated to the current version.
6. Check navigation links. Changed: NO. Comment: Buttons work as they should. A feature in Captivate using a "skin," which is a pre-made navigation menu and color scheme, was used. "Closed..." utilized but appeared on the module.
7. End of sections not defined. Changed: YES. Comment: Added slides to explain the end of each section.
8. Question 6: When the explanation is over, the screen takes too long to clear. Changed: YES. Comment: Decreased it by approximately 15 seconds.
9. Question 7: I moved the cursor and the screen disappeared. Changed: NO. Comment: Problem did not re-occur during re-testing.
10. Question 8: If the cursor is not left on the nm square, the narration stops. Changed: NO. Comment: This is how it is meant to work.
11. ...instructional or entertaining value. This screen takes too long to fade out. Changed: YES. Comment: Focused on the technical issue (slow fade out). Left "grammar clues"; from an instructional perspective it is useful. It reinforces that the learner needs to look for "grammar clues." Also a change from a passive learner to an active learner.


ID, Programmer #2: No refinement information (N/A).

2 ID Experts:
12. Check navigation links. Changed: NO. Comment: No change (1 respondent).
13. Technical delays: investigate. Changed: YES. Comment: Did not download Flash player.
14. Change "Ace My Test" to "Ace Your Test." Changed: YES. Comment: Changed.
15. For a future version, a PDF I could print out with the tips as a review sheet later, and maybe some other resources, websites I could go to for study/test taking skills. Changed: NO. Comment: No change (1 respondent).
16. The navigation buttons at the bottom were unclear at first. I had to roll over the "forward" button to be sure I was on the right one. I made the assumption that it was "forward," but it might be helpful for the buttons to be labeled. Changed: YES. Comment: The "skin," which dictates the navigation buttons' appearance, is part of the design template. In between sections, a prompt appears to tell the student what button to use next to move forward.

7 Learners, Reading Class:
17. Technical delays: investigate. Changed: YES. Comment: Old PCs (speed and memory).
18. Organization? (2 respondents). Changed: YES. Comment: Added more narration and a new slide.
19. Directions (1 respondent). Changed: YES. Comment: Added more narration and a new slide.
20. Start and end of sections not clearly understood. Changed: YES. Comment: Added more narration and a new slide.
21. Layout of error messages: consistency problem. Changed: YES. Comment: Consistent.
22. Navigation clearly marked? Changed: YES. Comment: Added more narration and a new slide.
23. Question examples bad. Changed: NO. Comment: No change (1 respondent).
24. Guy that pops up onscreen annoying. Changed: NO. Comment: No change (1 respondent).
25. No clear statement that it was completed. Changed: NO. Comment: This is clearly marked and stated in the narration.
26. Graphics distracting. Changed: NO. Comment: No change (1 respondent).


244 Specimen B 6 Iteration 1 Evaluation Phase: Instrument and s ummary of results from Summative Usability Evaluation questionnaire n =15 Section I. Learner Background Questions Category % in Category 1. What is y our gender? Male 33 Female 67 2. In college, I am a (choose one) Freshman 87 Sophomore 0 Junior 0 Senior 0 Other 7 No answer 7 3. What is your age? Open ended 100 4. I attend college (choose one) Full time 80 Part time 7 No answer 7 5. If given a choice between a traditional (i.e. classroom/face to face class) version of a course and a web based version, where both are available at convenient times, I would take the: Traditional (classroom version) 73 Web based version 27 Section II. Accessibility 6. The training module runs on my computer without any problems. Strongly Agree 80 Agree 13 Disagree 7 7. All the links within the module work in my browser. Strongly Agree 87 Agree 13 8. I did not experience any technical delays whi le going through this training module. Strongly Agree 67 Agree 33 Section III. Design Elements 9. The module design is simple and uncluttered. Strongly Agree 80 Agree 20 10. The organization of the training module is easy to follow. Strongly Agree 93 Agree 7 11. The directions given for the learner to follow are easy to understand. Strongly Agree 80 Agree 20 12. The start of each section within the module is clearly understood. Strongly Agree 80 Agree 20 13. The end of each section within the module is clearly understood. Strongly Agree 80 Agree 20 14. The fonts and colors promote legibility within the module. Strongly Agree 80 Agree 20 15. Feedback messages appear in a consistent layout on each page. Strongly Agree 20 Agree 67 Di sagree 7


245 16. Help messages appear in a consistent layout on each page. Strongly Agree 20 Agree 67 Disagree 7 17. Error messages appear in a consistent layout on each page. Strongly Agree 20 Agree 53 Disagree 20 Strongly Disagree 7 18. All link s, icons and navigation buttons work as expected. Strongly Agree 47 Agree 47 Disagree 7 19. The overall tone of the module is engaging. Strongly Agree 60 Agree 40 Section IV. Graphics/Animations/Multimedia 20. The graphics used in the module help to enhance my learning. Strongly Agree 53 Agree 40 Strongly Disagree 7 21. Layout of graphics, fonts, font size, font color, and positioning of icons were all consistent. Strongly Agree 60 Agree 40 22. The various text animations (e.g. text highlig ht, text movement etc.) helped me to focus on what I should be learning. Strongly Agree 60 Agree 40 23. The audio provides useful information and enhances my learning. Strongly Agree 73 Agree 27 24. The interactions make the training interesting. Str ongly Agree 67 Agree 33 Section V. Navigation 25. Within the module, navigation buttons are clearly marked. Strongly Agree 67 Agree 33 26. I can navigate to various parts of the module as desired. Strongly Agree 47 Agree 53 27. I can navigate to the beginning of the module easily. Strongly Agree 47 Agree 53 28. Navigation is consistent within the module. Strongly Agree 47 Agree 47 Disagree 7 29. There is tracking information available so that I can see where I am within the module at all t imes. Strongly Agree 40 Agree 60 Section VI. Training Module Content 30. I think that the question examples used in the module made learning the concepts easy. Strongly Agree 53 Agree 40 Disagree 7 31. The feedback given in the module helped me to learn. Strongly Agree 53 Agree 47 32. The way the information (all the questions first followed by feedback) is presented in the module helps me to learn. Strongly Agree 53 Agree 47 33. I find the information in the module to be useful and relevant to me. Strongly Agree 47 Agree 53 Section VII. Your Opinion Matters 34. Please state what you like most about this training module: Open ended 93 Response 1. It was real easy, very understanding, and very helpful Response 2. It gave me test taking ti ps that I would have never thought of. Response 3. Although I only had two answers wrong the traini n g module gave me tips on all ten questions Response 4. I like the graphic design of the tutorial because it would emphasize certain words that I needed


246 to know. Response 5. IT HAS GREAT EXAMPLES USED TO HELP ON TEST TAKING SKILLS!! Response 6. I liked that after I was finished testing, it didn't just give me a score. I came back and told me where I messed up and what ways I could have looked at each quest ion differently. Response 7. What I like most about this training module is that the questions were clear and enhan c ed my learning. Response 8. I like the audio and how it broke down some simple tips that I tend to look over it was very helpful Response 9. What I liked the most in the module was the qu e stions that were asked. Though the questions were challenging, they were interesting to think about. Response 10. I liked the audio that went along with this module, because it made it alot easier to take in the information. Response 11. What I liked the most about the traini n g was how after I finished the test, the module showed me what to look for in a question, whether I got the question right or wrong. Response 12. What I liked most about the trainin g module was the fact that it was very helpful when the feedback was given. Response 13. What I liked most of the module is it explained how they got the answers to the questions at the end of the test. Response 14. From the ID perspective, it was engagi ng in nature the "mission" theme/motif was carried through the whole module. 35. Please state one thing you would change in this module: Open ended 100 Response 1. Nothin g at all Response 2. There isn would change about it. Response 3. The v oice is a bit monotone Response 4. I honestly can't think of anything to change about this tutorial. It covered every detail and question I would have had. Response 5. I WOULD EXTEND THE TIME AND ADD MORE QUESTIONS AND EXAMPLES! Response 6. Nothing, it was an excellent tutorial for me. Response 7. One thing I would change in this module is adding more questions related to the subject. Response 8. H onestly nothing it was all good Response 9. What I would change about this module is to provide more feed back on the answers given and to explain why a person might have chosen that answer. Response 10. The thing that I would change would be, during the learning part, after the quiz, it should give more time in between the questions to take in all of the use ful information. Response 11. I would change the voice of the speaker, some people would prefer a softer voice. Response 12. I would change the amount of question. It was short but it hit the important ones but adding a few more would be good. Response 13. I would have m ore questions and make them mor e of a challenge. Response 14. Decrease the time to downlo a d It took me approx: 5 minutes (which seemed like longer) using the Internet connection and computer at a public library. Response 15. One of th e last questions asked the user to "roll over" the text in blue. The user actually needs to "click" on the blue text to see the statement.


Specimen B 7
Iteration 1 Evaluation Phase: List of refinements derived from summative review after iteration 1

Participant/s: Learners (UE class)
1. Error message layout not consistent (1 respondent). Changed? YES. Comment: Checked all messages and reset them to appear at the top right-hand corner.
2. The voice is a bit monotone (2 respondents). Changed? NO. Comment: Not a feasible option.
3. Extend time and add more questions. Changed? NO. Comment: No more content at this point in time.
4. Provide more feedback on the answers given and explain why a person might have chosen that answer. Changed? NO. Comment: Unfortunately, the content does not provide the "psychology" as to why a person would choose a particular response.

Participant/s: ID Experts
5. Decrease the time to download; it took one user approx. 5 minutes using the Internet connection and computer at a public library. Changed? NO. Comment: Unfortunately this is an attribute that cannot be easily corrected. There are no jpeg files; there are a number of audio files that may be causing delays. I recompiled the program, but there are no options given in Captivate to reduce the download; there is also an option for audio quality, but adjusting this resulted in degraded quality.


Specimen B 8
Iteration 2 Evaluation Phase: Instrument and summary of results from Summative Usability Evaluation questionnaire, n = 22

Section I. Learner Background
1. What is your gender? Male 36%, Female 64%
2. In college, I am a (choose one): Freshman 91%, Sophomore 0%, Junior 0%, Senior 0%, Other 5%, No answer 5%
3. What is your age? Open-ended 100%
4. I attend college (choose one): Full time 86%, Part time 5%, No answer 9%
5. If given a choice between a traditional (i.e. classroom/face-to-face) version of a course and a web-based version, where both are available at convenient times, I would take the: Traditional (classroom version) 77%, Web-based version 18%, No answer 5%

Section II. Accessibility
6. The training module runs on my computer without any problems. Strongly Agree 82%, Agree 14%, No answer 5%
7. All the links within the module work in my browser. Strongly Agree 91%, Agree 9%
8. I did not experience any technical delays while going through this training module. Strongly Agree 82%, Agree 18%

Section III. Design Elements
9. The module design is simple and uncluttered. Strongly Agree 64%, Agree 36%
10. The organization of the training module is easy to follow. Strongly Agree 77%, Agree 23%
11. The directions given for the learner to follow are easy to understand. Strongly Agree 82%, Agree 18%
12. The start of each section within the module is clearly understood. Strongly Agree 77%, Agree 23%
13. The end of each section within the module is clearly understood. Strongly Agree 73%, Agree 14%, Disagree 9%, Strongly Disagree 5%
14. The fonts and colors promote legibility within the module. Strongly Agree 68%, Agree 32%


15. Feedback messages appear in a consistent layout on each page. Strongly Agree 59%, Agree 41%
16. Help messages appear in a consistent layout on each page. Strongly Agree 50%, Agree 45%, Disagree 5%
17. Error messages appear in a consistent layout on each page. Strongly Agree 41%, Agree 55%, Disagree 5%
18. All links, icons and navigation buttons work as expected. Strongly Agree 59%, Agree 41%
19. The overall tone of the module is engaging. Strongly Agree 55%, Agree 32%, Disagree 5%, Strongly Disagree 5%

Section IV. Graphics/Animations/Multimedia
20. The graphics used in the module help to enhance my learning. Strongly Agree 55%, Agree 45%
21. Layout of graphics, fonts, font size, font color, and positioning of icons were all consistent. Strongly Agree 55%, Agree 45%
22. The various text animations (e.g. text highlight, text movement etc.) helped me to focus on what I should be learning. Strongly Agree 68%, Agree 27%, Disagree 5%
23. The audio provides useful information and enhances my learning. Strongly Agree 68%, Agree 27%, Disagree 5%
24. The interactions make the training interesting. Strongly Agree 59%, Agree 41%

Section V. Navigation
25. Within the module, navigation buttons are clearly marked. Strongly Agree 73%, Agree 23%, Disagree 5%
26. I can navigate to various parts of the module as desired. Strongly Agree 68%, Agree 27%, Disagree 5%
27. I can navigate to the beginning of the module easily. Strongly Agree 59%, Agree 36%, Disagree 5%
28. Navigation is consistent within the module. Strongly Agree 64%, Agree 32%, Disagree 5%
29. There is tracking information available so that I can see where I am within the module at all times. Strongly Agree 68%, Agree 27%, Disagree 5%

Section VI. Training Module Content
30. I think that the question examples used in the module made learning the concepts easy. Strongly Agree 73%, Agree 23%, Disagree 5%
31. The feedback given in the module helped me to learn. Strongly Agree 59%, Agree 41%
32. The way the information (all the questions first followed by feedback) is presented in the module helps me to learn. Strongly Agree 64%, Agree 36%
33. I find the information in the module to be useful and relevant to me. Strongly Agree 68%, Agree 32%


Section VII. Your Opinion Matters
34. Please state what you like most about this training module: Open-ended 100%
Response 1. It helps you to focus on keywords in the question that may be there to trick you.
Response 2. I like the main goal which will help me to focus more on the wording the next time I take a quiz or test.
Response 3. I thought the questions were interesting
Response 4. Fun and easy to learn.
Response 5. The thing that I liked most about this module was the animation. It wasn't too much so as to get me distracted, yet it was enough to keep me interested.
Response 6. The interactive questions.
Response 7. I like the whole concept of the module. I learned something new today so thank you
Response 8. What I liked most is that it was helpful and easy to do.
Response 9. I liked that there was audio with the module because although I like reading the material, it is easier to understand for me when I hear what I'm being taught.
Response 10. It taught me useful tricks to taking a test.
Response 11. I liked the color and the way everything is clearly pointed out in the module.
Response 12. The most interesting part is the way the examples are explained at the end; I liked how specific and helpful certain words are in a sentence.
Response 13. It was interesting.
Response 14. It was interesting and made answering the questions fun and interactive.
Response 15. I liked the way the learning concepts were broken down. They seem to be presented in an intelligent way.
Response 16. I like that the words and tips were highlighted in specifics so that I could better understand how to answer each different kind of question
Response 17. I liked how the module actually went over the test with you and picked out different things in the questions and pointed them out
Response 18. It was short, sweet, and to the point. There wasn't any fluff or unnecessary information.
Response 19. It was interesting and fun. The questions were also good questions based on real-life situations which made it fun.
Response 20. I liked the pictures that were in the test review section. They made listening to the explanation interesting.
Response 21. The module is very engaging visually.
Response 22. The module was interesting and I think would keep the attention of an elementary or middle school student.

35. Please state one thing you would change in this module: Open-ended 82%
Response 1. I would make the questions more objective.
Response 2. Explain some things more.
Response 3. The one thing that I would change about this module is in the section where you have to type the word in the yellow box: it should say now "click the forward button" when you are finished.
Response 4. I would just change the audio on the one part where the narrator says "Micro".
Response 5. I really wouldn't change anything about it; I thought that it was good.
Response 6. I think that the module was sufficiently put together, but if I was to change something about it, I would put more time in between the questions during the teaching part because they go by kind of fast.
Response 7. I would change the intro to make it more relevant to college-level learning.
Response 8. I did not like the voice so much, but it isn't too bad
Response 9. Make the person speaking a little bit more lively; although he is very good at explaining and has very good pronunciation, he is very monotone, and it can make the module a bit boring, and because there is so much info to be learned, you don't want the student to get bored. But besides that, it was great :) Thank you
Response 10. It would've been nice to take another test after learning to put the information to good use.
Response 11. Nothing.


Response 12. The voice within the program needs to be altered. It is a little creepy.
Response 13. I think the learner section went a little fast
Response 14. I don't think there is anything to change
Response 15. Nothing that I can think of. Whoever made this module did a good job!!!
Response 16. I would add a little more color.
Response 17. Printable notes, tips, or something to save to my own computer for quick reference later.
Response 18. If the audience is higher than elementary or middle school, I would suggest changing the narrator.


Specimen B 9
Iteration 2 Evaluation Phase: List of refinements derived from summative review after iteration 2

Participant/s: Learners (UE Experience Class)
1. End of section somewhat unclear. Changed? NO. Comment: Acceptable; a majority (87%) either strongly agreed or agreed that the sections were clearly marked.
2. Error messages are not consistent on the page. Changed? NO. Comment: Acceptable; a majority (95%) either strongly agreed or agreed that error messages were consistent in the layout.
3. Navigation buttons are not clearly marked or consistent, and one cannot navigate easily. Changed? NO. Comment: Acceptable; a majority (95%) either strongly agreed or agreed that the navigation buttons are clearly marked and consistent, and that you can navigate the module easily.
4. More explanations needed (1 respondent). Changed? NO. Comment: Content provides no further details.
5. Narrator could be more lively or different. Changed? NO. Comment: Not feasible due to a time constraint.
6. Printable notes (1 respondent). Changed? NO. Comment: Not feasible due to a time constraint.
7. Add another test to practice what was learned. Changed? NO. Comment: Not feasible due to a time constraint.


Specimen B 10
DBR Perspective: Instrument and summary of results from Evaluate Design Decisions Questionnaire, n = 3

Section I: Objectives and Assessments
1. My role in this web-based initiative is: Programmer 33%, Instructional Designer 33%, Subject Matter Expert 33%
2. The purpose of this initiative, to develop a web-based training module, is clear to me. Strongly Agree 100%
3. Please state what you believe to be the purpose of developing this web-based training module. Open-ended
4. I know the stakeholders who are involved in this initiative to develop a web-based training module. Strongly Agree 67%, Agree 33%
5. Who are the most important stakeholders in this initiative to develop a web-based training module? Administrators 0%, Faculty 0%, Learners 100%, Other 0%
6. To develop the web-based training module, it was important to assess learners' knowledge or each learner's previous knowledge. Strongly Agree 33%, Agree 33%, Disagree 33%
7. For this module development initiative it is important to provide feedback to the learners after assessment. Strongly Agree 33%, Agree 0%, Disagree 67%
8. Learner analysis information gathered from the "Analysis" phase was used to guide instructional strategies proposed for the module. Strongly Agree 67%, Agree 33%

Section II. Instructional Strategy
9. The instructional strategy was decided (i.e. moving from traditional to online and providing interaction and instant feedback for learners) early in the "design phase". Strongly Agree 100%
10. Was an Instructional Design Plan (IDP) developed by the instructional designer? Yes 67%, No 0%, No Answer 33%
If you answered "Yes" to the previous question please go to Question 11, else please go to Question 12.
11. How many meetings did you have before an IDP was created? 0 meetings 0%, 1 meeting 0%, 2 meetings 0%, 3 meetings 67%, No Answer 33%
12. I think an IDP is essential in guiding development. Strongly Agree 67%, Agree 33%
13. Who do you think was most influential in choosing an instructional strategy for the initiative? SME 67%, Programmer 0%, Instructional Designer 33%, Other 0%


14. Who do you think was most influential in setting design elements for the initiative? SME 0%, Programmer 33%, Instructional Designer 67%, Other 0%
15. Design elements such as font (size and color), background color, animations, graphics and audio were discussed in detail in meetings. Agree 67%, Disagree 33%
16. Who do you think was most influential in choosing hardware and software for the initiative? SME 0%, Programmer 0%, Instructional Designer 67%, Other 33%
17. Interaction interfaces and interaction design elements were established in meetings at this phase (i.e. Design Phase of ADDIE). Strongly Agree 33%, Agree 67%
18. Use of media elements such as audio, video, animation and graphics was guided by the information derived from the Analysis phase. Strongly Agree 33%, Agree 33%, No Answer 33%
19. Navigation issues were discussed in meetings in this phase (i.e. Design Phase of ADDIE). Strongly Agree 33%, Agree 33%, No Answer 33%
20. How the information would be presented to the learner (whether one concept/one question at a time followed by feedback, or a group of concepts/questions followed by feedback) was discussed in meetings at this phase (i.e. Design Phase of ADDIE). Strongly Agree 33%, Agree 67%
21. There were quality control guidelines that addressed clarity and consistency issues (e.g. using an IDP to guide development). Strongly Agree 67%, No Answer 33%
22. The source of the content for the initiative was derived mainly from: SME 100%, Programmer 0%, Instructional Designer 0%, Other 0%

Section III: Delivery Selection System and Prototyping
23. The delivery system (e.g. whether via Internet or Face to Face or Blended) choice was influenced most by: SME 67%, Programmer 0%, Instructional Designer 33%, Other 0%
24. A prototype was developed in this phase (i.e. Design Phase of ADDIE). Strongly Agree 67%, Agree 33%
25. If a prototype was developed, it helped to show what the final web-based module would potentially "look, sound and feel" like. Strongly Agree 100%
26. A prototype would help to reduce costly design changes. Strongly Agree 33%, Agree 33%, Disagree 33%
27. The feedback from the prototype is expected to refine the design and development of the web-based module. Strongly Agree 67%, Agree 33%
28. Given the opportunity, I will always recommend developing a prototype when developing web-based instruction. Strongly Agree 67%, Agree 33%


29. If you have additional comments, please add them here: Open-ended 100%
Response 1. Proper testing of desired design deliverable content was initially overshadowed by incapacities of the delivery method. This was quickly resolved with the Instructional Designer, the SME and the programmer.
Response 2. In section II, I could not answer questions 11, 18, 19, and 21 because I did not have [the IDP] that was created. Additionally, I did not know if the meetings that were conducted fell before or after the design or analysis phases.
Response 3. The prototype helped tremendously in refining design elements, in this case especially which design elements were desired and not desired. A simple IDP was developed but not formally presented to the SME. The programmer was given information on how the content should be presented (questions first, followed by feedback). Details about colors/fonts were mentioned.


Specimen B 11
DBR Perspective: Instrument and summary of results of the Module Development Questionnaire, n = 2

Section I. Content Development
1. In this module development initiative, I was a: Programmer 50%, Instructional Designer 50%
2. Content sequencing (i.e. linear or branching) meets the requirement determined in the design phase. Strongly Agree 100%
3. Content is modularized and can be easily changed and customized. Strongly Agree 50%, Agree 50%
4. Content is appropriate for the target audience as determined from the analysis phase. Strongly Agree 100%
5. The content is accurate as determined from the analysis phase. Strongly Agree 100%
6. How much time (in days) did it take to integrate the content into the module design? Open-ended 100%
Response 1. 10 days
Response 2. 10 days
7. The sequencing of the content followed the design criteria set in the Instructional Design Plan (IDP). Yes 100%
If you responded "No", go to Question 8, otherwise go on to Question 9.
8. Explain why the sequencing was different than what was requested in the IDP. Open-ended 0%
9. The Subject Matter Expert (SME) helped to maintain accuracy of content throughout development. Strongly Agree 100%

Section II. Hardware and Software Elements
10. The software application used to develop the training module can be easily deployed in any platform (e.g. Windows or Mac environment). Strongly Agree 100%
11. The software application used to develop the training module is the same as the one requested in the IDP (Instructional Design Plan). No 100%
If you responded "No", go to Question 12, otherwise go on to Question 13.
12. Please explain why a software application other than the one defined in the IDP was used to develop the training module. Open-ended 100%
Response 1. The first programmer suggested and used Authorware 6.0. However, the design called for questions presented first, followed by detailed feedback on each question. The first programmer could not accomplish this using Authorware 6.0. The Captivate application was used for module development.
Response 2. Captivate was used instead of Authorware 6.0 because of a change in programmers. Captivate offered an easier way to accomplish animations, include audio, etc. It was also easy to deploy and use. It required a download of the Flash 9 player.
13. No downloads are required to run the module. Yes 50%, No 50%
14. No special hardware is required to run the module. Yes 50%, No 50%


15. How much time (in days) did it take to decide on hardware and software elements? Open-ended 100%
Response 1. 1
Response 2. 1

Section III. Design Elements
16. The prototype helped to define desired design elements. Strongly Agree 100%
17. Interactions that were developed followed the specifications defined in the IDP. No 100%
If you responded "No", go to Question 18, otherwise go on to Question 19.
18. Please explain why interactions were not developed as defined in the IDP plan. Open-ended 100%
Response 1. During development, some minor changes occurred. Some of the highlight-text and text-animation features were decided upon during development.
Response 2. Captivate afforded different types of animations to be used. Flash and other text animations were used. This is different from what could have been created using Authorware.
19. Interactions complemented the learning. Strongly Agree 100%
20. Colors of fonts and background work well in various browsers. Strongly Agree 100%
21. Text animations are used to complement the learning. Strongly Agree 100%
22. All navigation buttons are consistent throughout the module. Strongly Agree 100%
23. How much time (in days) did it take to develop the design elements? Open-ended 100%
Response 1. 10
Response 2. 10

Section IV. Your Opinion Matters
24. State any problems you encountered in developing the module: Open-ended 100%
Response 1. As stated previously, the initial decision was to use Authorware 6. However, later, a change in programmers prompted the use of Captivate 3.0.
Response 2. Captivate has some limitations, such as no drawing features (cannot draw lines). Also, some timing issues between audio and slide animations took some time to resolve.
25. State what assisted you in the development of the training module: Open-ended 100%
Response 1. The IDP was used as a guideline throughout development. Also, the SME was available when questions about content arose.
Response 2. Although Captivate has limitations, it is relatively very easy to use in comparison to Authorware.


Appendix C: Instructional Development Plan (IDP)


IDP: Test-taking Strategies Objective Test Module
Hardware: PC, Audio/Graphics card
Software: Authorware 6.0 (possibility: Captivate 3.0)
Target Audience: College students enrolled in Learning Strategies course (18-25)
Accessibility: Internet environment; delivered via Internet; runs on PC/Mac
Interactivity Level: High

IDP: Style Guide
Font: Arial (size will change depending on length of question)
Background: White/Graphic
Navigation Template: bottom right; find an existing template (if using Adobe Captivate)
Animation: graphics, text graphics (highlights, rollovers); audio only for Intro, explanation and conclusion

IDP: Flowchart: Objective Test Module
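As a purely illustrative aside (not part of the original study artifacts), the IDP above is essentially a small configuration record, and a minimal sketch can make that concrete. In the Python snippet below, the field names are hypothetical choices introduced here for illustration; the values are copied from the plan text above.

# Hypothetical, minimal encoding of the IDP above. Field names are illustrative;
# values are taken from the plan text. This is a sketch, not the study's actual tooling.
idp = {
    "module": "Test-taking Strategies Objective Test Module",
    "hardware": ["PC", "Audio/Graphics card"],
    "software": {"primary": "Authorware 6.0", "possible_alternative": "Captivate 3.0"},
    "target_audience": "College students enrolled in Learning Strategies course (18-25)",
    "delivery": {"channel": "Internet", "platforms": ["PC", "Mac"]},
    "interactivity_level": "High",
    "style_guide": {
        "font": "Arial",  # size varies with the length of the question
        "background": "White/Graphic",
        "navigation_template": "bottom right; reuse an existing Captivate template",
        "animation": ["text highlights", "rollovers"],
        "audio": ["Intro", "Explanation", "Conclusion"],
    },
}

# Example use: print a quick summary a designer might check against the plan.
print(idp["module"])
print("Software:", idp["software"]["primary"], "/", idp["software"]["possible_alternative"])

Capturing the plan this way is one possible convenience for keeping design decisions consistent across iterations; the dissertation itself records these decisions in the prose IDP and meeting notes.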


Appendix D: Excerpt from Logbook


Week 1 - Wednesday 21st June, 2006
I have gathered several pieces of information for the first phase of this project. The analysis phase involves many types of analyses, such as Audience, Technical, Goals, Content and Context. Within each of these areas, there is a refined breakdown of information. The literature on the Analysis portion is immense, yet trying to pinpoint examples is rather difficult. There are many books and journal articles on "How To" do an analysis, but no strongly structured examples are given, or if there are examples I have not found them. Anyway, it looks like I will have to take bits and pieces of the examples I have found and add my own information. I have also spoken to Claudia and Dr. White, trying to gather as much information as possible. Claudia will be the SME. I don't have a clear idea of what type of measures can be done. I am hoping that grey area will be cleared up soon.

Week 2 - Wednesday 28th June, 2006
The process of actually writing the interview questions for all of the analysis artifacts is very difficult. Trying to get the theories behind the needs analysis, task analysis, context and content analysis is also difficult. The literature on each part is a bit ambiguous. I have created a flowchart of how the phases will occur and what is going to occur in each phase. I have made changes to this after doing some reading on the analysis portion. I have to meet with Dr. White on Friday so he can go over the Analysis artifacts. I have not completed these artifacts. I have also emailed Claudia to let her know where I am at and also to ask her for any enrollment data she has over the past years. I think this will prove useful when trying to justify why the course is moving online.

Week 3 - Wednesday 5th July, 2006
The IRB process has begun. Met with Brenda Kuska (974-6433) and Dr. White. Since this is only the Analysis phase that I have to complete, the IRB approval will only cover this first phase. All the interview questions are attached to the IRB application. Scientific reviewer is Dr. Kealy. At this point I am awaiting approval.

Week 4 - Wednesday 12th July, 2006
I had to renew my IRB certification and submit that to Brenda. The application has to be reviewed again today. Claudia asked if the questions can be emailed to her once IRB approval has been awarded. She thinks that she can put in more details if she spends more time on her own answering the questions as opposed to an interview situation.


Week 5 - Wednesday 19th July, 2006
Received IRB approval for Analysis phase. Emailed questions to Claudia.

Week 6 - Wednesday 26th July, 2006
Claudia has class and work conflicts, so she will not be able to respond to the questions until next week. She also will not have access to any of her students to answer some of the questions written for the users. Claudia also wants me to meet with her boss. Currently her boss is on vacation and will return in late August, but Claudia will go on vacation then. It appears that my Analysis phase will be delayed a couple of weeks.

Week 7 - Wednesday 2nd August, 2006
Research continues for literature review.

Week 8 - Wednesday 9th August, 2006
Reviewing answers provided by Claudia. Some of Claudia's comments suggested the question in the analysis section should be used, since they really are not addressing a problem.

Week 9 - Wednesday 16th August, 2006
Preparing draft for pre-proposal. I did not make much progress. Dr. White mentioned that I should try to find information that deals with web-based initiatives. Also, after [...] with Dr. White, he suggested that it brought up an [...] dissertation. Careful consideration must be placed when wording each question in the [...] those questions; therefore the responses were not what I was looking for. The next step is to reword the questions and have a face-to-face interview with her. There were a number of questions that she did not respond to at all; this also needs to be clarified. Dr. White suggests that it is best to request a face-to-face meeting with Claudia. It will alleviate some of the frustration of trying to deal with this via email. Also, if Claudia has any questions about the questions, I will be able to address the issue immediately, rather than having to wait a week for a response.


Week 30 - Wednesday 10th January, 2007
Research on DBR. Contacted Claudia for a meeting.

Week 31 - Wednesday 17th January, 2007
Meeting is set with Claudia within the next 2 weeks. Have to think about design issues; look over responses to open-ended questions. Also think about hardware/software issues that may be of concern.

Week 32 - Wednesday 24th January, 2007
From analyzing the responses, some students are concerned with animations (one person commented that it was silly) and they like the interactivity (online chat etc.).

Week 33 - Wednesday 31st January, 2007
Meeting with Claudia cancelled; moved to 02/09/07.

Week 34 - Wednesday 7th February, 2007
Meeting with Claudia yielded some design information/issues. First, it will be a standalone website, maintained by Claudia. There is an assigned programmer, but for my module I will be getting my own programmer. The links to the course will be delivered through BB but will link to an outside website. Module will be on test-taking strategies. Some design issues: think of style, use Dreamweaver/Flash, not sure if tracking is required, no login requirement, animation must be meaningful. No more than 20 [...]


ABOUT THE AUTHOR

Oma B. Singh is currently working as an Instructional Designer at a private liberal arts not-for-profit university. Prior to her doctoral pursuit of a degree in Instructional Technology at the College of Education at the University of South Florida, Oma was a computer programmer analyst at a multinational magazine distribution company. After having gained six years of expert computer programming experience, she decided to pursue a Master of Science degree in Management Information Systems (MIS) with an emphasis in database systems analysis, design and programming. Oma holds both a Bachelor and a Master of Science degree in MIS. While pursuing her doctoral degree, she had the opportunity to work as the Director of Training and Instructional Designer at a small private company that specialized in providing systems engineering as well as computer-based and instructor-led training development for clientele that included various branches of the U.S. Department of Defense. Oma is looking forward to teaching and conducting further research to help practitioners in the field of Instructional Technology.