Material Information

Title:
A comparison of traditional physical laboratory and computer simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic
Physical Description:
Book
Language:
English
Creator:
Javidi, Giti
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla.
Publication Date:
2005

Subjects

Subjects / Keywords:
Modulation
Demodulation
Electronics
Quantitative
Qualitative
Dissertations, Academic -- Secondary Education -- General -- Doctoral -- USF (lcsh)
Genre:
government publication (state, provincial, territorial, dependent) (marcgt)
bibliography (marcgt)
theses (marcgt)
non-fiction (marcgt)

Notes

Summary:
ABSTRACT: This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups. The groups were compared on understanding the concepts, remembering the concepts, completion time of the lab experiments and perception toward the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory; the students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory; the students in this group used a simulation program in a controlled PC lab. Scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups in their attitude toward their laboratory experience, in favor of the simulation group.
Thesis:
Thesis (Ph.D.)--University of South Florida, 2005.
Bibliography:
Includes bibliographical references.
System Details:
System requirements: World Wide Web browser and PDF reader.
System Details:
Mode of access: World Wide Web.
Statement of Responsibility:
by Giti Javidi.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 191 pages.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 001681162
usfldc doi - E14-SFE0001080
usfldc handle - e14.1080
System ID:
SFS0025401:00001


A Comparison of Traditional Physical Laboratory and Computer Simulated Laboratory Experiences in Relation to Engineering Undergraduate Students' Conceptual Understandings of a Communication Systems Topic

by

Giti Javidi

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy
Department of Secondary Education
College of Education
University of South Florida

Major Professor: James White, Ph.D.
William Kealy, Ph.D.
John Ferron, Ph.D.
Dewey Rundus, Ph.D.

Date of Approval: December 3, 2004

Keywords: modulation, demodulation, electronics, quantitative, qualitative

Dedication

This dissertation is dedicated to my family, who taught me by example the value of setting goals and working with persistent effort. Their constant love and caring are every reason for where I am and what I am. My gratitude and my love to them are beyond words. This work is dedicated to my husband, Ehsan Sheybani, who never fails to remind me how to live as if every day were my most precious. This work is also dedicated to the loves of my life, my children, Natasha and Nojan, without whom I would have never had the determination to complete this project.

Acknowledgement

Sincere gratitude is expressed to Dr. James White, my advisor, for his expertise, encouragement, support and patience through every stage of my doctoral program. It is not often that one finds an advisor who always finds the time to listen to the little problems and roadblocks that unavoidably crop up in the course of performing research. Special thanks also go to the members of my doctoral committee, Dr. William Kealy, Dr. John Ferron and Dr. Dewey Rundus, for their support and comprehensive review of this dissertation. Special thanks to Dr. Darrel Bostow for making the time to serve as the chairperson for my dissertation defense. Deep appreciation is given to the faculty and staff of USF for their assistance with all aspects of my degree program. Special thanks to Ms. Lesley Butler, Dr. JT Johnson and Dr. Shelton Houston for serving as experts in validating the instruments designed for this study. Last, but not least, warm appreciation goes to my husband, Dr. Ehsan Sheybani, for his understanding and love, his support and encouragement and his expert advice.

Table of Contents

List of Tables iv
List of Figures v
ABSTRACT vi

Chapter 1: Introduction 1
Background of the Study 1
Motivation for the Study 4
Focus of the Study 8
The Multimedia Comparison Debate among Educators 9
Need for the Study 10
The Significance of the Study 11
Research Questions 13
Qualitative Research Questions 13
Statement of Hypothesis 14
Variables in the Study 16
Assumptions 16
Delimitation 16
Limitations 17
Definition of Terms 18

Chapter 2: Review of Literature 19
Status of Current Research in Engineering 19
Alternatives for Traditional Laboratory 21
What is Simulation? 23
Categories of Simulation 26
Characteristics of Simulation 28
Advantages and Disadvantages of Simulation 29
Instructional Simulation 31
Simulation Laboratory in Science Classroom 34
Simulation Laboratory in Engineering Education 36
An Overview of Concept Learning 40
Meaningful Conceptual Learning: Critical for Learning Science 41
Importance of Conceptual Understanding in Engineering Education 42
Assessing Conceptual Understanding 43
Overview of Learning Objectives 44
Importance of Learning Objectives 45
Learning Objectives Pertaining to Current Study 47
Summary 51

Chapter 3: Procedures 53
Research Design 53
Quantitative Research Design 54
Variables for the Quantitative Research 54
Participants 55
Methodology and Procedures 56
An Overview of Laboratory Sessions 61
Research Questions, Materials and Instruments 61
Quantitative Research Questions 63
Statement of Null Hypotheses 64
Instrumentation and Material 65
Reliability of Test Scores 69
Content Validity, Credibility and Reliability of the Instruments 70
Statistical Analysis Procedures 71
Threats to Internal Validity 72
Qualitative Research Design 74
Qualitative Research Question 74
Qualitative Interview Questionnaire 75
Content Validity of Qualitative Instrument 75
Group Interview 76
Sampling for Qualitative Research 77
Transcription and Coding Process 78
Observations 79

Chapter 4: Results 81
Introduction 81
Null Hypotheses 81
Procedures 82
Descriptive Data 83
Statistical Results as Related to Null Hypotheses 91
Results of Qualitative Data 97
Written Questionnaire 97
Group Interview 99
Interview Themes 100
Concept Clarification 100
Visual Learning 102
Memorability 102
Problem-Solving Strategies 103
Time Spent in the Laboratory 103
Simulation in Place of Laboratory 104
Simulation Lab Experience 104
Exit Questions 104
Observation 105
Summary of the Quantitative Results 106
Summary of Qualitative Results 108

Chapter 5: Discussion 110
Review of the Study 110
Discussion of Results 112
Discussion of Qualitative Results 119
Summary 120
Limitations of the Study 121
Implications for Practice 122
Recommendations for Further Study 125

References 128

Appendices 135
Appendix A: Physical Laboratory Experiment Sheets 136
Appendix B: Simulated Laboratory Experiment Sheets 141
Appendix C: Pre-Lab Instruction 143
Appendix D: Student Demographic and Background 144
Appendix E: Attitude Survey Questionnaires 145
Appendix F: Conceptual Achievement Test 147
Appendix G: Instructor Rating Sheet for Open-Ended Questions 153
Appendix H: Qualitative Instruments 154
Appendix I: Consent Form 158
Appendix J: Pilot Study 160
Appendix K: Simulation Program 166
An Overview of the Subject Matter 166
Modulation 167
Description of the Simulation Program 167
Appendix L: Research Instrument Validation Forms 172
Appendix M: Request for Permission for Observations and Interview 175
Appendix N: Student Responses 176

About the Author End Page

List of Tables

Table 1. Design of the Study 54
Table 2. Overall Study Questions, Data Collection Techniques, Instruments and Data Source 62
Table 3. An Overview of the Conceptual Achievement Instrument 68
Table 4. Inter-Rater Reliability of Post-test Scores 84
Table 5. Descriptive Statistics for Both Post-test and Follow-up Test 85
Table 6. Factor Analysis of the 9-Item Attitude Survey Questionnaire 87
Table 7. Descriptive Results for the 9-Item Attitude Survey Questionnaire 88
Table 8. Factor Analysis for the 13-Item Attitude Survey Questionnaire 90
Table 9. Descriptive Results for the 13-Item Attitude Survey Questionnaire 91
Table 10. Mean & Sig. Results for Attitude Survey Questions for Both Groups 92
Table 11. Mean Comparison of Simulation and Physical Group on Each Post-test Question 95
Table J-1. Performance Measure Statistics 161
Table J-2. Group Means for Performance Test Items 163
Table J-3. Descriptive Statistics for Student Attitude 167

List of Figures

Figure 1. Research Variables 55
Figure 2. Overview of Research Procedures 60
Figure 3. PC Lab Structure 79
Figure 4. Comparison of Means on the Conceptual Test Over Time 85
Figure 5. T-test: Comparison of Means for Post-test Measures 93
Figure 6. T-test: Comparison of Means for Follow-up Measures 94
Figure 7. T-test: Comparison of Lab Completion Time 97
Figure G-1. Grading Rubric 153
Figure J-1. Independent Samples t-test for Performance 164
Figure K-1. A Typical Communication System Block Diagram 166
Figure K-2. An Example of a Modulated Signal Using AM Modulation 170
Figure K-3. An Example of a Modulated Signal Using FM Modulation 171
Figure L-1. Instrument Validation Form 172
Figure L-2. Grading Rubric 173

A Comparison of Traditional Physical Laboratory and Computer-Simulated Laboratory Experiences in Relation to Engineering Undergraduate Students' Conceptual Understanding of a Communication Systems Topic

Giti Javidi

ABSTRACT

This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups. The groups were compared on understanding the concepts, remembering the concepts, completion time of the lab experiments and perception toward the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory; the students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory; the students in this group used a simulation program in a controlled PC lab. Scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and a qualitative study were administered at the completion of the treatment.

The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups in their attitude toward their laboratory experience, in favor of the simulation group. In addition, there was a significant difference between the two groups in their lab completion time, in favor of the simulation group. At the same time, the qualitative research uncovered several issues not explored by the quantitative research. It was concluded that incorporating the recommendations acquired from the qualitative research into laboratory pedagogy, especially incorporating hardware experience to avoid a lack of hands-on skills, should help improve students' experience regardless of the environment in which the laboratory is conducted.

Chapter 1: Introduction

The purpose of this study is to examine an alternative to the use of physical laboratory activities in a communication systems laboratory. Specifically, this study examines whether computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Also of interest are the effects that computer simulation has on a) students' knowledge retention after a period of time and b) students' attitudes toward the use of the simulation as a substitute for the physical activities.

Background of the Study

Engineering education is under considerable pressure to include additional and novel material, to accommodate ABET 2000 criteria and to restructure content using new approaches and technologies. All of these are to be achieved within a nominal four-year format. Many engineering educators and administrators anticipate that new learning/teaching technologies can relieve some pressure without loss of learning or added costs. In addition, many colleges and universities are witnessing challenges associated with offering online academic opportunities to those who are unable to attend traditional classrooms (Brent, 2002). Research indicates that at this time, three-quarters of two- and four-year colleges offer distance-learning opportunities. A third of these offer accredited degree programs online (Watts, 2003). Soon most colleges across the country will be offering some of their courses online, and by the end of 2004, a hundred million Americans are expected to take part in continuing education using some form of the new communication technology (Watts, 2003).

Despite the tremendous success in the development and marketing of online learning and its anticipated future, one major challenge remains that leaves several specialized fields of education far from being ready to go online. In engineering technology programs where laboratory sessions are indispensable, students would not be able to complete degree requirements without attending real campuses that provide real lab facilities. The primary solutions to this challenge, specifically in engineering, have been home kits, on-campus laboratory visits and, in some instances, computer-simulated laboratories. Despite the use of these methods, however, there is little evidence in the engineering literature on their effectiveness. The existing studies reported in the engineering literature are small case studies and lack the control groups needed to isolate the effect on learning derived from the simulation. Such evidence can be found in a study conducted by Kadiyala and Crynes (2000), which provided an exhaustive overview of findings and trends in research over the last 15 years. Reviewing 760 reports, evidence was established that information technologies are capable of enhancing learning when pedagogy is sound and when there is a good match of technology, techniques and objectives. However, Kadiyala and Crynes (2000) could not restrict their reviews to only engineering and related subjects, for there were too few studies that met their criteria. One criterion in particular was notable: provide quantitative results on an outcome variable measured in the same way for a technology-taught group and a conventionally instructed group.

Wiesner and Lan (2004) agree with Kadiyala and Crynes's claims and point out that in the area of engineering, despite the need, it is rare to see a controlled study involving the comparison of student performance and satisfaction in different types of learning experiences (Coleman, Kinniment, Burns & Kolemans, 1998; Zywno & Waalen, 2001). Zywno et al. (2001) emphasize that despite the efforts to enhance engineering education, there appear to be few studies derived from a statistically significant data set on which to base an evaluation of the effectiveness of the presently available tools, including simulation. Examples of various studied areas encompass the teaching of subjects such as electricity and magnetism (Chou, 1998); electrical amplifiers (Dobson & Hill, 1995); basic electronics (Moslehpour, 1993); engineering fluid mechanics (Engle, Weinstock, Campbell, & Sathianthan, 1996); basic thermodynamics (Buttles, 1992); chemistry (Grosso, 1994); and engineering physics (Chien, 1997). Most of these studies indicate that computerized simulation could be an effective instructional tool for enhancing theories presented in lectures. Through the literature it has also been established that instructional computerized simulation can assist students in developing mental models of many different types of complex systems (Mayer, 1989; Mayer & Sims, 1994; Munro & Towne, 1992; Perkins & Unger, 1994).

The value of this study lies in the fact that despite considerable research on using simulation software with science laboratory instruction, there is very little quantitative and qualitative research on the effectiveness of simulation for conducting engineering laboratory experiments and on its potential as a substitute for physical laboratory activities in college-level engineering technology/education. Given the potential benefits of engineering programs incorporating simulated laboratories, an investigation of such a program at the college level is desirable. Such an investigation would fill a gap in engineering education research and contribute considerable knowledge in the area of using simulation technology for learning and teaching enhancement in engineering higher education. Moreover, the impact of simulation-based laboratory instruction in relation to student learning and attitude using a mixed method (quantitative and qualitative) in the field of engineering education has not been investigated. As pointed out in the literature, simulation programs are being used widely for engineering laboratory instruction; nevertheless, there is a lack of evidence on their effectiveness (Zywno & Waalen, 2001). It is not sufficient to support and encourage the use of educational tools, including simulation, in any specific subject area on the basis of common sense or educational theory alone; empirical evidence is imperative.

Motivation for the Study

Initially, the motivation for this study came from a few factors that drive the effort to find alternatives to physical labs. The first is access. In the physical laboratory setting, labs can be costly, time-consuming and difficult to schedule. Many students would prefer to work on labs late at night, when faculty may have other commitments. A second factor is consistency. Implementation fidelity of learning programs in labs depends on teaching assistants (TAs), who are often students themselves. The consistency of the learning experience may be low when the student works with different TAs. A third factor is the need to replace obsolete equipment with expensive new equipment. Low-cost simulation can replace a great deal of expensive physical equipment, decrease the amount and cost of equipment and increase access to up-to-date electronic laboratory experiences. The fourth factor is online access. With the increase in online distributed learning, we face an issue of online access: the requirement for students to come to physical labs. As a result, the author investigated the use of a simulated laboratory for beginning communication systems labs.

Powerful as they are, simulations are not utilized as effectively and efficiently as they could be (Thiagarajan, 1998). Even though many simulation advocates have claimed effective outcomes for educational simulation, the most sweeping claims generally are not yet empirically based (Thiagarajan, 1998). The literature on computer-based instructional simulation is filled with contradictions concerning its use and effectiveness (Lee, 1999). What is the cause of conflicting research results on simulation in computer-based instruction? Lee (1999), after conducting a meta-analysis on the value of computer-based instructional simulation, concluded that the conflicting research results were due to studies on different types of simulation and different ways of using simulation. For example, the main types of simulation were not distinguished in these studies. As a result, the research outcomes are inconsistent and sometimes contradictory.

Many educators and researchers feel there is a strong need for research on different types of simulation in computer-based instruction and on effective ways of using each type of simulation for learning purposes. They have called for more research on the effective use of simulation, and this study is a response to that appeal. The idea and the motivation for this study also came from literature that recommended the value and importance of such research in the area of engineering education. According to Gomes, Choy, Barton & Romagnoli (2000), a major shortcoming of conventional engineering education, caused by rising costs and infrastructure requirements, is the exigency of providing equipment and laboratory tools. The authors contend that it is now important to facilitate and assess higher-level learning in laboratory-oriented courses with the availability of affordable computer software. In addition, Perry, Porter & Votta (2001) assert that in engineering research, empirical studies have not had the same success as in other sciences. They stress that the biggest barriers to using pragmatic studies in engineering lie in the details of conducting them. For example, Fenton, Pfleeger & Glass (1994) point out that many empirical studies in engineering have poor statistical design (as cited by Perry et al., 2001). Therefore, we need to create better studies and draw more credible conclusions from them (Perry et al., 2001).

Simulation programs are being used widely for engineering laboratory instruction, but there is a lack of evidence on their effectiveness (Zywno & Waalen, 2001). While the power of integrating simulation technologies into the classroom with respect to asynchronous and distributed learning has been amply demonstrated in the literature, reports on formal assessments of the effectiveness of technology-enabled instruction in engineering education are still rare (Zywno & Waalen, 2001). In engineering research, many reports of improved student learning with computer-aided instruction focus upon the details of the software (Powell, Anderson, Van der & Pope, 2003; Cooper & Dougherty, 1999; Murphy, Gomes & Romagnoli, 2002; Mandai, Wong & Love, 2000; Li, Leboeuf, Basu & Turner, 2003) and do not rigorously assess the impact of such technology upon learning using objective measures of student knowledge (as cited by Wiesner et al., 2004). Assessment, where done, often relies upon students' views of the courseware in terms of usability and not upon measures of knowledge acquired (Wiesner et al., 2004). Wiesner et al. (2004) summarize the state of research in the field of engineering by stating that when appropriately applied, information technologies have the potential to significantly enhance student learning in the engineering program. However, there is a dearth of studies evaluating their effectiveness in engineering curricula and to what extent they can replace physical experiments.

The primary goal of this research was to investigate the effectiveness of educational simulation-based laboratory instruction for teaching conceptual knowledge in the field of engineering. It is the aspiration of the author that the results of this study will provide practical information and can be generalized to other areas within engineering education.

Focus of the Study

Initially, it was envisioned that this research would involve the development of new simulation software for instructional and laboratory purposes. However, the focus of the study evolved as a consequence of finding that a number of simulation programs are already available and in use. Unfortunately, there have been sparse research efforts to contrast these software packages with traditional physical laboratory exercises. Much of the research effort to date has been designed to investigate the use of existing simulation software as a method to enhance, enrich or improve traditional lecture or laboratory courses rather than to use simulation software in place of hardware laboratories. In an effort to determine alternatives for offering online engineering technology laboratory courses, computer simulation was compared with physical laboratories. If the results indicate that the simulation-based laboratory method is as effective as the physical laboratory method, then this could assist in a reduction in laboratory costs and make such training available to those who are unable to attend traditional classrooms.

The Multimedia Comparison Debate among Educators

There are controversial arguments in research on the value of media comparison studies. According to Clark (1983), it is the method of instruction rather than the media that leads more directly and powerfully to learning. On the other side of the debate, Kozma (2000) argues that media and methods influence each other and that media constrain and enable methods. Jonassen (1994) dismisses the importance of the above argument by suggesting that concern with the role of media attributes and methods for providing information is inappropriate. He also claims that the world has moved on and that the recent scientific revolutions in the psychology of learning have refocused theoretical and practical attention on the role of the learner rather than the effects of instruction. In reference to the media debate, Jonassen (1994) goes on to describe the learner as a part, interacting with the learning activity and environment, which is embedded in the learning context, which itself is embedded in the social context. The author of this study acknowledges the above arguments and finds it important to emphasize a few points.

1. This study is not aimed at comparing multimedia tools (i.e., simulation vs. traditional), but at recognizing the potential of simulation as a substitute for physical laboratory experiments, which may lead to lower laboratory cost and less experiment time. According to Clark (2001), a promising area to examine for evidence of media effects on learning is to ask about their capacity to speed learning and make it less effortful or expensive. This study is aimed at doing just that.

2. In spite of the controversial arguments, media and technologies have become important in the field of engineering education, and physical laboratory activities have become an inseparable part of engineering courses. Therefore, multimedia tools have always been and will be an important asset to the field of engineering. But the question of multimedia technologies as replacements for physical laboratories in the area of engineering still remains. Hence, the aim of this study is a) to discover the potential of simulation in a laboratory-based communication systems course on the topic of modulation/demodulation, b) to provide valuable insights on whether simulation software can replace the physical laboratory, and c) to compare students' performance and attitude.

3. Also, in this study, it is the methods of instruction that are being compared, not the media itself. According to Surry & Ensminger (2001), research should move toward intra-medium studies. Intra-medium studies improve on the media comparison design because they use a media attribute, such as instructional strategy, as the independent variable instead of the media itself.

Need for the Study

The pursuit of an understanding of the potential of simulation methods for conducting laboratory activities (both off- and on-campus) in an engineering education context is worthwhile for several reasons. Simulation potentially offers students opportunities to explore situations that may be impossible, too expensive, difficult or time-consuming to accomplish with actual laboratory or real-life experiences. Even when real-life experiences are feasible, simulation, by offering students the opportunity to explore a wide range of variables more rapidly, can supplement such experimentation. In addition to being safe, convenient and controllable, simulation-based laboratories can be made available to anyone, anywhere, anytime.

A report by Carnevale (2000) indicates that, of the schools offering online learning programs, only 12 percent offered courses in engineering. The low percentage of online engineering courses may be due to the fact that undergraduate engineering courses traditionally employ lectures and laboratories as the most common method of delivering education. In many engineering courses, physical laboratory activities are an inseparable part of the curriculum. But delivery of laboratory experiments beyond laboratory walls, where conducting physical experiments is not possible, has always been the greatest challenge of online engineering education. Despite the challenge, researchers argue that there is a great need for delivering online engineering courses and laboratories due to changing demographics and growing competition (Bourne, 1997). In response to the need for resources that provide practical experience to online engineering students, this study was designed to investigate the effects of simulation for conducting laboratory experiments on the topic of communication systems. By demonstrating that simulation-based laboratory methods can provide outcomes comparable to traditional physical laboratory methods, the cost of providing engineering laboratories can be dramatically reduced. By reducing the costs, the specialized materials and equipment needs and the facility requirements, engineering laboratory training would be more accessible to current engineering students as well as to those individuals who are unable to attend traditional classrooms.

The Significance of the Study

The author believes that there is no substitute for real-life experience, but unfortunately, due to many factors, such as safety, budget and time constraints, numerous engineering curricula lack such experiences. This lack of doing the real thing is supported by the work of Dorato and Abdallah (1993), which discovered that the lack of financial support for laboratory facilities is a common problem in engineering programs worldwide and that many countries are now following the American model of very theoretically oriented undergraduate education in engineering (as cited by Wyatt, 2000). Furthermore, this lack of adequate laboratory experience is magnified in the area of distance education. There is no doubt that online education for courses like mathematics or history, where there are no experiments involved, might do justice to the needs of the student at a distance. However, the scenario is entirely different for engineering courses, where experiments form an integral part of the course content. Still, many of these courses, classroom or online, lack enough laboratory experiments due to the problems previously discussed. Therefore, it is imperative to find an alternative to real-life experiences to accommodate students in the best way possible. One of those alternatives may be a simulation package. While simulation packages have a role to play in distance education, the question still remains as to whether they can replace the need for real and practical laboratory knowledge. Hence, the goal of this study is to contribute to traditional and online engineering education by infusing simulation for performing laboratory experiments and investigating its effects. The dynamic and graphical information display capabilities of simulation software may provide laboratory experiments otherwise unlikely to be available to learners. It is the aspiration of the author that such a study would not only contribute to the fields of engineering education and online education but also provide an alternative for teaching laboratory-based technical courses in engineering environments. Overall, the significance of this study can be seen in the following ways: a) this study can provide new insight for understanding the potential of simulation programs in relation to laboratory activities, and b) if advantages of providing laboratory experiences through simulation for learning the complex process of modulation and demodulation in a traditional setting can be shown, then perhaps traditional and online engineering laboratory instruction can be approached similarly. Such a conclusion may shed some light on designing and teaching engineering courses. The results may also suggest the feasibility of a change in the way we teach engineering courses.

Research Questions

The contrast of a simulated laboratory approach and a traditional physical laboratory approach to teaching engineering laboratory concepts provides an opportunity to explore the value of computer simulation to enhance traditional engineering teaching. The purpose of this research is to explore the effects of using a simulation program for conducting modulation and demodulation laboratory experiments and to compare those effects with those of the traditional physical laboratory. This experimental study compared results of two ways of teaching the topic of modulation and demodulation and their operations in a laboratory setting, through two experiments, with undergraduate engineering students at a four-year college. The comparison was made on the basis of the performance of two groups of students, in which each group was exposed to one of two methods of instruction within the topic of modulation and demodulation. In order to explore the operation and theoretical concepts related to the topic of modulation and demodulation, one group performed the laboratory experiments in a traditional physical laboratory while the other group performed the same laboratory experiments using a simulation program.

Qualitative Research Questions

The research question for this research project is: Can a simulation-based laboratory replace physical laboratory methods? Specifically:

Question 1. In terms of student conceptual learning, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 2. How does students' attitude toward the use of the simulation affect their post-test score?

Question 3. How does the simulation group's attitude toward the laboratory experience differ from that of the physical group?

Question 4. In terms of completion time of the assigned laboratory experiments, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 5. In terms of student knowledge retention, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 6. What are the perceptions of both groups on the use of laboratory experiments in general for learning the concepts?

Question 7. What is the students' perception toward the use of simulation in place of the physical laboratory?

A mixed study of quantitative and qualitative research methods was applied to seek answers to the questions. An experimental research design was conducted to examine Questions 1-5, while a qualitative case study design was carried out to explore Questions 6-7. The motivation for Question 4 came from Alkazemi's (2003) recommendation that when simulation is used compared with traditional laboratory instruction, further research is needed to explore the time to complete tasks.

Statement of Hypothesis

The focus of this study was to examine the effects of simulation in terms of its capability to replace physical laboratory methods. It is hypothesized that the treatment group students will perform as well as the control group, will appreciate the subject matter and value the instructional treatment more, and will spend less time completing the lab experiments. The specific null hypotheses are as follows.

H01: There is no significant difference (at the p = 0.05 level) between the physical group's and the simulation group's attitudes toward the laboratory experience as measured by an attitude survey at the completion of the post-test.

H02: There is no significant difference (at the p = 0.05 level) on post-test scores between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H03: There is no significant difference between the simulation and physical laboratory groups' long-term retention of the concepts as measured by mean scores on a follow-up instrument.

H04: There is no significant difference on laboratory completion time between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H05: There is no significant correlation between the simulation group's attitude toward the use of the simulation and their performance post-test scores.
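Hypotheses of this form are conventionally tested with two-sample comparisons at the stated alpha level. As a minimal illustrative sketch (this is not the study's actual analysis code; the library choice, variable names and all scores below are placeholder assumptions), H02 could be checked with an independent-samples t-test and H05 with a Pearson correlation:

```python
# Illustrative sketch only: placeholder data, not the study's results.
from scipy import stats

# Hypothetical post-test scores for each group (n was 40 per group in the study).
physical = [62, 70, 55, 68, 74, 61]
simulated = [71, 78, 66, 80, 75, 69]

# H02: independent-samples t-test on post-test scores at alpha = 0.05.
t_stat, p_val = stats.ttest_ind(simulated, physical)
print(f"t = {t_stat:.2f}, p = {p_val:.4f}, reject H02: {p_val < 0.05}")

# H05: Pearson correlation between the simulation group's (hypothetical)
# attitude ratings and their post-test scores.
attitude = [3.8, 4.2, 3.1, 4.6, 4.0, 3.5]
r, p_corr = stats.pearsonr(attitude, simulated)
print(f"r = {r:.2f}, p = {p_corr:.4f}, reject H05: {p_corr < 0.05}")
```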

Variables in the Study

Independent variable: the method of instruction, a variable with two categories: computer simulation and physical laboratory.

Dependent variables: post-test scores, attitude scores, follow-up scores and laboratory completion time scores.

Assumptions

The results of this study were based on several assumptions, which are listed below.

1. The students participating in this study have satisfied the course prerequisite, which includes Circuit I.
2. The students participating in this study have similar prior experience with assembling and disassembling circuits.

Delimitation

The scope of this research study is delimited in several ways:

1. The subjects are restricted to undergraduate junior/senior-level electronic engineering students.
2. The learning content is restricted to the field of communication systems and digital signal processing.
3. The learning objectives are limited to the analysis, synthesis and evaluation levels of the cognitive learning domain.

Limitations

The following limitations should be taken into account before the results of this study are generalized in any way.

1. Since the simulation utilized only electronic concepts, the results of the study can be generalized only within this domain.
2. Since the experimental treatments were short, the results may be affected by this time limitation.
3. The findings are also limited geographically and by the characteristics of the sample.
4. This study did not examine the use of a computerized simulation laboratory in an online setting.
5. This study examined the use of computerized simulation as a tool for conducting laboratory experiments and not as a tool for engineering analysis of the system.

Definition of Terms

The following terms used in the study are operationally defined as follows:

Laboratory: A place for practice, observation or testing.

Physical laboratory: A workplace devoted to conducting experiments, with the exception that the equipment needed for the experiments is preassembled.

Hands-on laboratory: A workplace devoted to conducting experiments, with the exception that students assemble the equipment needed for each experiment.

Computer simulation: A computer program that allows the user to interact with a computer representation of either (a) a scientific model of the natural or physical world or (b) a theoretical system (Weller, 1996).

Computer-simulated experiments: Computer simulations that provide learner-centered environments and allow students to explore systems, manipulate variables and test hypotheses (Windschitl & Andre, 1998).

Conceptual simulation: A simulation that involves models of invisible phenomena with mathematically interrelated variables that can be manipulated to observe changes (Windschitl & Andre, 1998).

Instructional simulation: A simulation that is intended to result in a predetermined learning outcome (Armstrong, 1991).

Chapter 2: Review of Literature

This chapter incorporates a summary of the information contained within the literature pertaining to computer simulation research (definitions, characteristics and categories, as well as advantages and disadvantages). Also included is how this research relates to the application of computer simulation for classroom instruction as well as laboratory instruction in engineering and science education, covering reports of various alternatives to physical laboratories. The last section of the chapter provides an overview of the importance of learning objectives in general, Bloom's Taxonomy and the instructional objectives relevant to this study.

Status of Current Research in Engineering

According to Wankat (1999), the most commonly used research instruments in studies reported in the Journal of Engineering Education are student surveys and end-of-course ratings. Surveys are easy to use and frequently satisfy reviewers of proposals and papers related to engineering education. In spite of this, results based entirely on surveys lack the credibility needed to persuade engineering faculty to modify their teaching methods (Wankat, 1999). The author also asserts that most published studies in which the research has gone beyond surveys have involved comparisons of experimental and control group test scores and retention rates. Quantitative studies of this type are much more credible than survey-based studies to engineering faculty members, but there are several obstacles to their use. One is that few engineering classes have enough students to form experimental and control groups large enough to yield statistically significant results; another is that few engineering professors are familiar with the complexities and ethical issues involved in human subject research; and still another is that control group studies must be planned in advance, whereas many innovations in engineering education seem to develop more by natural growth and change than from preplanning. Due in part to these difficulties, relatively few of the studies reported in the Journal of Engineering Education have used rigorous quantitative methods, and many of those that have done so suffer from methodological weaknesses.

According to Springer, Stanne and Donovan (1999), one notable area is the body of research focusing on laboratory learning. Many studies have shown that the more students work in the laboratory, the more they learn, the better they understand what they are learning, the easier it is for them to recall what they learn and the better they feel about themselves, the class and their classmates. Springer et al. (1999) meta-analyzed the research for college-level science, mathematics, engineering and technology and found significant effects on student persistence and achievement in these fields and positive attitudes toward their education. Such studies are likely to be more persuasive in the engineering education community than any other type. On the other hand, although there is a lack of qualitative research in the field of engineering, qualitative methods used widely in the social sciences are gradually percolating into the engineering education literature, even though few engineering faculty are familiar with them (Wankat, 1999). This type of research will undoubtedly become more common and more imperative in engineering and technology as more faculty members discover that some of the skills specified by the Accreditation Board for Engineering and Technology (ABET) 2000 can be assessed most effectively using qualitative methods (Wankat, 1999).

Alternatives for Traditional Laboratory

One major challenge in engineering programs where laboratory sessions are indispensable is the fact that some students who cannot attend traditional classrooms would not be able to complete degree requirements without attending real campuses that provide real-life lab facilities. The literature has provided few solutions to this challenge. An extensive survey (Alhalabi, Anandapuram, & Hamza, 1998) was carried out by examining the course content offered by many leading private and public North American universities and some colleges in the United Kingdom that offer full- or part-time programs via the Internet. Most institutions have recognized the challenge of offering lab courses over the Internet and have spent significant effort to overcome this weakness. Following are four alternative methods that have been employed to place laboratories online. Among these four schemes, simulation software has been identified as the best alternative because it is highly portable and cost-effective (Aotani, 1997, as cited by Alhalabi et al., 1998).

1. Videotapes: The Open University in Great Britain, which also uses other distance education techniques, employs videotapes. If the presentation of a simple experiment is sufficient to instruct the student in full measure, then a videotape showing the experiment is mailed to the student. Later, the knowledge of the student is tested by an online examiner who asks probing questions assessing the student's comprehension.

2. Home Kits: If physical experience is considered essential, then a custom-designed home kit, with relevant instructional material, is sent to the student. The Open University has designed several such kits for use by students. However, for courses like Logic Design, Microprocessors, etc., the possibility of providing a home kit becomes nearly cost-prohibitive. Further, the student may not have the accessory facilities needed to use the kit at home. Geographical distances, which add to the delay in receiving the material, may deter the student from accepting these course offerings.

3. Local Arrangements: The third, and perhaps the best, choice is to make real laboratory facilities available near the student's locale. Accredited colleges in the vicinity may offer such lab facilities for a week or two. Alternatively, the university itself can make its laboratory facilities available for a week or two on its campus. Intensive laboratory activities during this period help students to finish the requirements of the course or may assist them in completing the remaining component at home in a satisfactory manner. This alternative is by far the most satisfactory from the student's point of view; yet it suffers several disadvantages in that the distance between the student's locale and the university may be a major drawback. This inconvenience substantially adds to the cost of the course and, for the majority of students, makes it more unaffordable. The university staff may also have difficulty opening laboratory facilities for a short duration, given that it may affect on-campus students.

4. Software Simulation: Simulation packages are designed for the purpose of bringing laboratory facilities to the door of the student (Aotani, 1997). Constant improvements are being made in simulation packages to make the whole experience closer to reality (Aotani, 1997).

What is Simulation?

Simulation has been defined in the literature in different ways. In a broad sense, a simulation is defined as an abstraction or simplification of a real-life situation or process. Typically, a simulation is defined as a model of a real-world environment, usually with the facility for the user to interact with the environment (Thurman, 1993). Alessi and Trollip (2001, p. 227) provided the most comprehensive definition of computer simulation. In an educational context, a simulation is a powerful technique that teaches about some aspect of the world by imitating or replacing it. Students are not only motivated by simulations but learn by interacting with them in a manner similar to the way they would react in real situations. In almost every instance, a simulation also simplifies reality by omitting or changing details. In this simplified world, the student solves problems, learns procedures, comes to understand the characteristics of phenomena and how to control them, or learns what actions to take in different situations. In each case, the purpose is to help the student build a useful mental model of part of the world and to provide an opportunity to test it safely and efficiently.

A review of the literature reveals that the definitions and characteristics of simulations, microworlds, games and desktop virtual realities may heavily overlap or even be synonymous, or may remain distinct, depending on their design and, most importantly, how they are used in a learning interaction. In order to present a rationale for this study, closely related terms need to be clarified. Therefore, the author finds it necessary to distinguish between simulation and those other media to help readers understand the reasoning behind labeling the tool used in this study as simulation software.

PAGE 34

24 Simulation vs. Microworlds There is no accepted definition of simulation and microworlds that allows for a clear distinction between the two. As a result the distinction between the two is indistinct. A microworld, can be defined as a model of a concept space, which may be a very simplified version of a real world environment or it may be a completely abstract environment. Normally, a user can create some sort of construction within the microworld, which will behave in a way consistent with the concepts being modeled (Papert, 1993; Rieber, 1992). The microworld idea is about three decades old. Based on a review of microworld literature, Edwards (1995) makes a useful distinction between structural and func tional views of the microworld idea. According to Edwards (1995); the former view prioritizes the idea of a microworld as a concrete embodiment of a mathematical structure that is extensible (so tools and objects can be combined to build new ones) but also transparent (so its workings are visible and rich in different representations.) The latter view prioritizes features of the microworld that become apparent in use, where learners are expected to explore and build, learn from feedback while involved in the iterative design of long-term projects rather than in trying to master de-contextualized knowledge fragments. Therefore, microworlds are environments where people can explore and learn from what they receive back from the computer in return for their exploration. Miller, Lehman and Koedinger (1999) designed a simulation in which the topic is electricity, more specifically electrically charged particles. In the simulation called electric field hockey, students were expected to gain an intuitive feel for the qualitative interactions of electrically charges particles by playing a game in which they had to place charged particles in such a way on a hockey field that another

PAGE 35

25 particle that was given an initial speed and direction from a certain point hits a hockey goal. Environments like the one just mentioned are often labeled microworlds rather than simulation. Simulation vs. Games Simulation resembles games in that both contain a model of some kind of system, and learners can provide input and observe the consequences of their actions. According to Gredler (1996) the deep structure of games and simulation differs in three important ways: 1) instead of attempting to win, participants in a simulation are executing serious responsibilities with associated consequences and privileges; 2) the event sequence of a game is typically linear, whereas, a simulation sequence is nonlinear; and 3) rules in games can be imaginative and need not relate to real-world events, whereas the basis for a simulation is a dynamic set of relationships among several variables that change over time and reflect authentic casual processes (i.e., the relationships must be verifiable). Simulation vs. Virtual Reality Computer simulation is a computer-generated version of real-world objects or processes. They may be presented in 2-dimensional, text-driven formats or increasingly 3-dimensional, multimedia formats. Computer simulation can take many different forms, ranging from computer renderings of 3-dimensional geometric shapes to highly interactive computerized laboratory experiments. Virtual Reality (VR), on the other hand, is a technology that allows students to explore and manipulate computer-generated, 3-dimensional multimedia environments in real time. One form
of VR is Desktop VR (DVR), which uses an interactive computer-based, multimedia environment in which the user becomes a participant with the computer in a virtually real world (Pantelidis, 1993). DVR has the potential to enhance and improve learning by enabling the user to interact with the environment. DVR environments are presented on an ordinary computer screen and are usually explored by keyboard, mouse, wand, joystick or touch-screen. Web-based "virtual tours" are an example of a commonly available DVR format. One of the major methodologies used in DVR is that of simulation and modeling (Van Weert, 1995). Educational computer simulation is based on dynamic interaction between a learner and a computer program and may be defined as that part of the modeling process involving the learner's execution of a model. The learner experiments with the simulated phenomenon by observing and analyzing the interactions between him/herself and the modeled phenomenon. In simulation systems, the learner enters a powerful learning environment and engages in a cycle of expression, evaluation and reflection. With design changes, simulation-based programs can become VR-based programs. Categories of Simulation Alessi (2000) categorized simulation into the following four types: (a) physical simulation, in which a physical object such as an electric cell is displayed on the computer screen, giving the student an opportunity to manipulate it and learn about it; (b) procedural simulation, in which a simulated machine operates so that the student learns the skills and actions needed to operate it; (c) situational simulation, which normally gives the student the chance to explore the effects of different approaches to a
situation; and (d) process simulation, which is different from the other types in that the student neither acts as a participant (as in situational simulation) nor constantly manipulates the simulation (as in physical or procedural simulation) but instead selects values of various parameters and then watches the process occur without intervention. Similarly, De Jong & Van Joolingen (1998) divided simulation into two types: (a) conceptual simulation, which holds principles, concepts and facts related to the class of systems being simulated, and (b) operational simulation, which includes sequences of cognitive and noncognitive operations that can be applied to the class of simulated systems. A conceptual simulation can be altered into a more operational (game-like) simulation by adding specific goals (De Jong et al., 1998). Gredler (1996) proposed two categories of simulation: (a) experiential simulation, which establishes a particular psychological reality and puts participants in defined roles within that reality, and (b) symbolic simulation, in which the behavior that is simulated is usually the interaction of two or more variables over time, and the learner can manipulate these variables in order to discover scientific relationships, explain or predict events or confront misconceptions (Harper, Squire & McDougall, 2000). Students using a symbolic simulation manipulate the virtual environment from outside of the simulation (Gredler, 1996). The representation of reality is usually mediated through a symbol system, such as graphs of output or diagrams of processes. Students using symbolic simulation maintain a vantage point that is more detached than in experiential simulation. Additionally, the representation of reality is more abstract (Gredler, 1996).
The simulation used in this study falls into both the conceptual and the symbolic category. On one hand, the simulation holds principles, concepts and facts related to the waves that are being simulated and the mathematical operations behind each processed output according to the input variables. On the other hand, the students can manipulate these variables in order to discover the relationships between sampling frequency, amplitude and carrier frequency, which assists them in explaining or predicting events. In addition, it not only represents graphs of modulated and demodulated signals, but also presents the modulated or reconstructed signal in the form of audio.
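To make this concrete, the following sketch shows the kind of model such a simulation executes for amplitude modulation. It is offered only as an illustration of the idea; the program actually used in the study was a MATLAB demo (described in Chapter 3), and every name and value below is a hypothetical choice, not that program's interface. The sketch is written in Python with NumPy:

    import numpy as np

    # Illustrative parameters a student might choose (all values hypothetical)
    fs = 50_000                 # sampling frequency, Hz
    fc = 5_000                  # carrier frequency, Hz
    fm = 200                    # data (message) signal frequency, Hz
    Am, m = 1.0, 0.5            # message amplitude and modulation index

    t = np.arange(0, 0.05, 1 / fs)               # 50 ms of signal
    message = Am * np.cos(2 * np.pi * fm * t)    # input (data) signal
    carrier = np.cos(2 * np.pi * fc * t)
    modulated = (1 + m * message) * carrier      # standard AM waveform

    # Crude envelope demodulation: rectify, then low-pass with a moving average
    rectified = np.abs(modulated)
    window = 4 * int(fs / fc)                    # average over four carrier cycles
    envelope = np.convolve(rectified, np.ones(window) / window, mode="same")
    reconstructed = envelope - envelope.mean()   # remove the DC offset

Plotting message, modulated and reconstructed against t, or playing the sample arrays back as audio, yields exactly the kind of feedback described above, and re-running the sketch with different sampling frequency, carrier frequency and amplitude values is what lets a student probe the relationships among those variables.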
Characteristics of Simulation Simulation has been used in education and training environments for many years (Harper, Taranto, Edwards & Daily, 2000), but it is only in the recent literature that the characteristics of simulation have been clearly defined. There seems to be general agreement that the goal of simulation must be to provide interactive experiences mimicking the real world as closely as possible. It has been noted by Harper et al. (2000) that the key distinguishing feature of simulations designed for educational purposes is that they make use of a model to represent some event or process, which the user can interact with and manipulate during exploration within a learning landscape that presents information in a multi-representational format. The need for interactivity, active engagement and navigational support in simulation has been noted as a significant characteristic that contributes to the educational outcome of such tools. Additionally, an important characteristic of a simulation is its validity, of which different types can be distinguished. Content validity expresses the degree to which a simulation environment captures relevant aspects, activities and parameters of the real-life operational environment it simulates or refers to. Construct validity expresses the degree to which the constructs, knowledge and skills the learner has to use or develop in a simulation environment resemble the ones that have to be used in the real world. Advantages and Disadvantages of Simulation While both traditional laboratory activities and simulations are forms of inquiry that engage the learner in the process of observing, hypothesizing, experimenting and forming conclusions, computer-simulated experiments, as inquiry tools, are considered by some authors to be superior to conventional laboratories (Mintz, 1993). In addition to many practical advantages, computer-simulated experiments have a number of instructional advantages. Mintz (1993) listed the following advantages: 1. Various types of research problems that cannot be addressed by conventional experimentation, such as prediction and forecasting, can be presented to the learner through simulation. 2. Simulation can provide immediate input and output, allowing students to see immediate connections between hypotheses and experimental results. Immediate responses to "what if" questions encourage students to examine various system states and investigate as many hypotheses as they desire without fear of error and without having to repeat their experiments.
3. Isolation and control of variables enable students to assess the effect of each individual variable as well as their combined effects, promoting a clearer understanding of this key aspect of inquiry work. 4. Simulation can display information in a variety of formats, improving student ability to interpret and organize data. Min (1995) presented some other advantages, asserting that simulation allows the student to insert the parameter values that he or she thinks will produce a result of interest, as well as to choose how he or she wants to approach a simulation or experiment. Computer simulation also allows the student to repeat the experiment as often as desired. It is important to mention that there are disadvantages associated with the use of computer-simulated programs in education, although these limitations are in some cases the result of wrong or inappropriate use of such programs. Min (1995) listed several possible limitations: 1. Simulation concerns the manipulation of a number of variables of a model representing a real system. However, manipulation of a single variable often means that the reality of the system as a whole can be lost. 2. A computer simulation program cannot develop the student's emotional and intuitive awareness; the use of simulation is specifically directed at establishing relations between variables in a model. 3. Computer simulation cannot react to unexpected sub-goals that the student may develop during a learning process. 4. Computer simulation programs may function well from a technical point of view, but they are difficult to fit into a curriculum.
5. Often a computer simulation program cannot be adapted to take into account different student levels within a group or class. 6. During interaction with a computer simulation program, the student is frequently presented with problems in which creativity is often the decisive factor for success. Instructional Simulation Research conducted over the past two decades on the effectiveness of instructional simulation has yielded mixed results (Lee, 1999). In an early evaluation effort, Cherryholmes (1966) reviewed the findings of six studies and concluded that, except for heightened interest, no substantial evidence could be found to support claims that simulation produces greater cognitive gains and affective changes than other methods of instruction. A decade later, Pierfy (1977) reviewed the results of 22 comparative studies and concluded that, in terms of fostering student learning, simulation was no more effective than conventional instructional methods. However, he found evidence that simulation supported retention of information and changes in attitude. Using meta-analysis on the data from 93 simulation studies, Dekkers and Donatti (1981) failed to support Pierfy's findings concerning retention. In support of simulation, Orlansky and String (1979) reviewed the results of 48 studies comparing military training simulation with conventional training and concluded that simulation produced equal or better achievement in about 30 percent less time. De Jong & Van Joolingen (1998), after reviewing a large number of studies on learning from simulation, deduced that the general conclusion emerging from these
studies is that there is no clear and univocal outcome in favor of simulation. An explanation of why simulation-based learning does not improve learning results can be found in the intrinsic problems that learners may have with discovery learning. They also concluded that adding instructional support to simulation might help to improve the situation. Frederiksen, White and Gutwill (1999) showed that leading students through a graduated series of electricity simulations led to the development of dynamic mental models that facilitated understanding of electricity concepts. Rieber, Smith & Noah (1998) reported on a study with adult learners that investigated the influence of game-like and graphical organizers during a computer-based simulation in physical science. They found that although the learners enjoyed and were able to use the simulation, they had difficulty transferring the experiential knowledge gained in using the simulation into an explicit understanding of the scientific principles, which was measured using a traditional performance test. Studies conducted by Rieber (1990, 1991a, 1991b; Rieber, Boyce, & Assad, 1990) have shown positive effects of animated visuals over static visuals in computer-based science instruction, whereas Rieber's earlier study (1989) did not demonstrate any powerful influence of computer animation on learning. However, the lack of differential effects in that research (Rieber, 1989) was attributed to poor instructional design and task difficulty. These defects in design were corrected in his subsequent studies, which indicated positive effects of animation in computer-based instruction. In some of his research (Rieber, 1990; 1991b; Rieber, Boyce, & Assad, 1990), Rieber employed interactive animation as a structured, simulation-like practice activity, pointing out the superiority of animated
graphics over static graphics. Rieber (1991a, 1991b) also revealed that students were able to successfully extract incidental information from computer-animated presentations of science concepts without any harm to intentional information. In yet another study, Rieber (1996) contended that learning through animated visual displays remains implicit because the attributes of animation allow the information to be presented like natural phenomena. In this sense, incidental learning can be drawn out from natural and implicit representation of knowledge. Rieber (1996), exploring the role of computer animation as real-time graphic feedback, employed a post-test as an explicit measure to assess students' formal learning of science principles and a game score as an implicit, tacit measure. The learning task for all the students was to understand the relationship between acceleration and velocity by way of an interactive computer simulation. The computer-based instruction embedded the simulation in a game-like context. Game score was measured as the time in seconds to complete the cognitive game successfully; the lower the game score, the higher the student's performance. The results revealed that, with respect to the game score (the tacit measure), the students who received real-time simulation feedback outperformed those who received textual feedback, whereas there was no significant difference with respect to the post-test (the explicit measure). Several explanations concerning the inconsistent results of simulation research have been offered. Poor research designs are partly to blame (Butler et al., 1988; Lee, 1999), but a much more serious problem is the lack of a theoretical framework for the instructional use and evaluation of simulation (Bredemeier & Greenblat, 1981). This inconsistency of results encourages more focus on the role of simulation in the development
of meaningful learning environments. In his analysis of previous reviews, Lee (1999) divided simulation, based on design, into two forms (pure and hybrid) and further divided instruction into two modes (presentation and practice). The pure simulation does not incorporate expository instructional features; the hybrid (impure) simulation mixes pure simulation with some features of expository instruction, providing the students with a large number of examples together with a series of guidance. Lee's review leads to the following conclusions: Within the presentation mode, the hybrid simulation is much more effective than the pure simulation. Simulation is almost equally effective for both the presentation and the practice mode. Specific guidance in simulation seems to help students perform better. When students learn in the presentation mode with the pure simulation, they show a negative attitude toward simulation. Science seems to be a subject well suited to simulation-based learning. Simulation Laboratory in the Science Classroom The use of simulation packages to aid laboratory instruction has also made its way into the science classroom. Computers have successfully been used to simulate plant growth experiments in a college biology class for non-biology majors; statistically significant differences were obtained when compared to instruction without a laboratory segment (Buttles, 1992). In an effort to increase student learning of basic thermodynamics, middle school students used computer simulation to supplement conventional laboratory practices in a physical science course. The Computers as Lab Partners system allowed the students to enter data gained from conventional experimentation, plot their data and see immediate results (Linn & Songer, 1988, p.2). The teacher
observed that, when compared to conventional lecture and laboratory practices, laboratory time management was improved and overall student cognitive knowledge increased (Linn & Songer, 1988). A dissertation completed at Texas A&M University (Van LeJeune, 2002) synthesized the findings from existing research on the effects of computer-simulated experiments in science education. Results from 40 reports were integrated through meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulation on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement for students who used computer-simulated experiments and interactive videodisc simulation as compared to students who used more traditional learning activities. No significant differences were found in retention, or in student attitudes toward the subject or toward the educational method. Based on the findings of the study, computer-simulated experiments and interactive videodisc simulation should be used to enhance student learning in science, especially in cases where the use of traditional laboratory activities is expensive, dangerous or impractical (Van LeJeune, 2002). The following are the study's conclusions in more detail: 1. The use of computer-simulated experiments and interactive videodisc simulation in science classrooms improves students' low-level achievement, such as the ability to learn facts, comprehend scientific processes and apply that knowledge to everyday phenomena, as compared to traditional science laboratory activities. 2. The use of computer-simulated experiments and interactive videodisc simulation in science education classrooms improves student problem-solving ability and other higher-order thinking skills as compared to traditional science laboratory activities. 3. The use of computer-simulated experiments and interactive videodisc simulation in science education classrooms is equally as effective as traditional science laboratory activities in promoting retention of material for a period of two weeks or more.
4. The use of computer-simulated experiments and interactive videodisc simulation in science education classrooms promoted positive student attitudes toward the subject matter. 5. Research on the effects of simulation on student attitudes is much less prevalent than research on student achievement. Van LeJeune (2002) asserts that, with respect to student achievement, the conclusions of his study on the effect of simulation in science education are consistent with the conclusions of previous meta-analyses performed in all fields of education. For instance, support for these conclusions was also found in Armstrong (1991), who performed a meta-analysis on the effect of computer simulation across a broad area of subject matter. Armstrong found an effect size of +0.31 for low-level recall of facts, consistent with the mean effect size of +0.34 found in Van LeJeune's study. In addition, Armstrong found a mean effect size of +0.28 for higher-level achievement, consistent with the mean effect size of +0.38 found in Van LeJeune's study. As part of the results of the meta-analysis, Van LeJeune (2002) reports that, surprisingly, simulations produced between 1983 and 1993 proved to be more effective at promoting student achievement than more recent ones, especially in outcomes relating to low-level thinking skills. The author goes on to explain that perhaps the earlier simulations were less complicated and easier for the students to master. Additionally, earlier simulations were more guided, while recent simulations present more realistic representations of a traditional laboratory; the lack of guidance in more recent simulations might serve to confuse and intimidate students. A simple, guided, unsophisticated approach might be a more effective strategy for teaching low-level concepts. Simulation Laboratory in Engineering Education Several studies have been conducted on the use of process/conceptual
simulation to aid in laboratory instruction. One of the most common purposes of computer implementation has been as a means of reducing costs as well as the time required to complete laboratory assignments. Dobson and Hill (1995) reported on a survey of student response to the implementation of simulation in an operational amplifier (op-amp) course conducted at the Department of Mechanical Engineering at Southampton University (U.K.). A personal computer-based simulation package from Interactive Image Technologies, Ltd., titled Electronic Workbench, was used to replace the traditional physical experiments that had been in place for several years. Sixty-four second-year engineering students conducted their op-amp laboratories either using the traditional physical circuit boards, with which they assembled and tested the actual components, or using the simulation package. An eight-question survey was then administered to the participating students. The results of the survey indicated that: 1. The students felt that there were no significant learning differences. 2. A higher percentage of the students rated the simulation package easier to use than the conventional lab exercise. 3. There was no correlation between pre-test disposition towards computers and preference toward computer replacement of the conventional laboratory experience. 4. The simulation group strongly agreed that lab experiments conducted using the simulation package took much less time to complete. 5. The simulation group appeared to find the lab assignments slightly easier than did those using the conventional equipment. 6. Many of the students, percent of the simulation group and 41 percent of the conventional group, would favor replacing the conventional lab with the simulation (Dobson & Hill, 1995, p.19). Additional findings from the same study indicated that the laboratory instructor's workload was reduced while using the simulation package. However, almost 75 percent of the students surveyed voiced concerns about the loss of skill development if the physical conventional laboratory component were totally
eliminated (Dobson & Hill, 1995, p.20). The efficacy of a software-simulated electronic circuits laboratory to support beginning electrical engineering students was also investigated by another group of researchers. The experiment was conducted with 40 college sophomores. Physical lab subjects received seven physical labs, while combined lab subjects received a combination of seven simulated labs and two physical labs; the latter group repeated two of the simulated labs to provide physical lab practice. Both treatments used the same assignments. Learner outcome measures were: time required to complete a new criterion physical lab, scores on written lab and theory tests over all the labs, and comments on the lab experience. The group that used combined simulated and physical labs performed significantly better on the written tests than the group using entirely physical labs. Both groups were equivalent in time to complete the criterion physical lab. Comments about the simulated labs were generally positive (Campbell, Bourne, Mosterman & Brodersen). For a Ph.D. dissertation completed at Iowa State University, the application of computer simulation in an electronics class laboratory was also studied. Each group received the same lecture; however, the control group received four hours of traditional physical lab per topic, while the experimental group received two hours of physical lab and two hours of simulation lab per topic. The study found that there were no significant differences between the two groups on the mid-term and final exams as well as on homework assignments. However, the control group did score significantly higher on four of the 12 quizzes at the alpha = .05 level. The author recommended that computer simulation be applied to complex topics starting with the
beginning courses (Moslehpour, 1993). In a similar study conducted at Pennsylvania State University, a computerized program was developed and tested for use as a lab activity in an engineering fluid mechanics course. Titled the Fluid Flow Construction Set, the personal computer-based software was used to introduce engineering students to fluid flow in piping systems without requiring expensive laboratory equipment. The authors stated that use of this simulation software allowed students to conduct more advanced experiments than could be done in a conventional lab and also motivated student learning in this topic area (Engel et al., 1996). In a similar study, a Ph.D. dissertation completed at the University of Florida analyzed the effects of the instructional sequencing of a conceptual computer simulation and a traditional laboratory on middle grade students' understanding of a topic in electrochemistry. In this study, science teachers and students in middle school science classes used a computer simulation and traditional teaching and learning methodology to study the physical science topic of electrochemistry. Group A students received the simulation (treatment 1) prior to the traditional laboratory experience (treatment 2), while Group B subjects received the computer simulation after the traditional laboratory experiment. The study incorporated ANCOVA. The results of the study indicated no statistical support for the theory that use of a simulation before the traditional laboratory can improve learning, although the treatment group who completed the simulation activities before the actual physical lab performed slightly better on the achievement post-test than the other group (Alkazemi, 2003). In a study conducted by Choi and Gennaro (1987), it was found that a
computer-simulated activity was not as effective as a hands-on laboratory activity in teaching the volume displacement concept. In the study, 128 eighth-grade students from five different science classes at a middle school in Minnesota were randomly assigned to one of two treatment groups: the computer-simulated experience (experimental group) and the hands-on laboratory experience (control group). The experimental group was taught the concept using a series of five simulated experiments on the computer. The control group was taught the same concepts using five parallel hands-on laboratory experiments. Upon completion of the treatments, a post-test was administered, and results showed that there was a significant difference (in favor of the physical group) between the two groups of 16 students in the learning of the volume displacement concepts. Based on these results the researchers concluded that computer-simulated experiences were not as effective as hands-on experiences. However, the authors also concluded that the results could be due to an insufficient study design. In a similar study, Hall (2000) examined the effectiveness of using conceptual computer simulation software for laboratory instruction in lieu of using actual components and equipment in a hands-on hardware laboratory. The results indicated that there were no significant differences in student achievement between those who simulated a laboratory exercise and those who performed the same laboratory exercise in a traditional hardware laboratory. An Overview of Concept Learning This study emerges from two main topics in the research on, and the practice of, engineering education. One is that engineering educators are paying increasing
attention to conceptual understanding; the other is that educators are progressively showing interest in integrating computers into laboratory instruction. Researchers agree that conceptual learning is extremely important in learning science (Savander-Ranne & Kolari, 2003; Tennyson, 1996), but what is conceptual learning and why is it important? Meaningful Conceptual Learning: Critical for Learning Science Understanding is a common word in our language. When we say that we understand something, we mean that we know when it happens or exists, why it occurs in a certain way, and in which direction it probably will develop. Thus understanding means much more than knowing the facts and imitating the operation. As far as science education is concerned, understanding includes conceptual, mathematical and operational understanding, among which conceptual understanding is critical. According to Tennyson (1996), concepts are defined as classes of objects, symbols and events that are grouped together in some fashion by shared characteristics. There are three kinds of concepts: object concepts, symbolic concepts and event concepts. Object concepts exist in time and space and can easily be represented by drawings, photographs, models or the object itself, such as tables and chairs. Symbolic concepts consist of particular kinds of words, numbers, marks and numerous other items that represent or describe objects, events or their relationships, either real or imagined. Event concepts describe the interaction of objects, either living or organic, at a particular time. Referring to this definition or description of concepts, one may think that learning concepts is to learn certain words or phrases,
including to which objects or events they refer, which attributes these objects have in common, and whether one object belongs to the concept class or not. However, it is imperative to know that concepts find their meanings within a theoretical context. For instance, the concept of signal and data transmission is better understood in the context of establishing the relationship between a transmitted and received signal.
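As a standard textbook illustration of that relationship (offered here by way of example and not drawn from the study's own materials), in amplitude modulation the transmitted signal s(t) carries the message x(t) on a carrier of frequency f_c:

    s(t) = A_c\,\bigl[1 + m\,x(t)\bigr]\cos(2\pi f_c t), \qquad |m\,x(t)| \le 1,

so that an ideal envelope detector at the receiver recovers the message, up to scaling, as \hat{x}(t) \propto \operatorname{env}\{s(t)\} - A_c. Grasping the concept means seeing how the received waveform is determined by, and can be inverted back to, the transmitted one; it is this relationship, rather than a recited definition of the term modulation, that carries the conceptual content.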
Therefore, conceptual learning in this study means much more than memorizing the definitions of concepts. By meaningful conceptual learning, Tennyson (1996) explains that students build the learned concepts into their cognitive structure and build up a consistent conceptual framework. This conceptual framework is required by students to develop the higher-order abilities that enable them to use and apply their understanding in a meaningful way. Importance of Conceptual Understanding in Engineering Education An important target for engineering education is the gaining of problem-solving skills. Intense mastery of relevant concepts and phenomena generates a necessary base for the acquisition of knowledge and understanding in engineering subjects; it also provides the requisite skills for good problem solving (Savander-Ranne & Kolari, 2003). In the field of engineering, problem solving is ultimately applied to the design of new products, to planning or troubleshooting industrial processes and so on. It is argued that good problem-solving skills can be achieved through a mastery of concepts and understanding of phenomena (Pfundt & Duit, 1994). It is also claimed that a common cause of failure in problem solving in the physical sciences and engineering subjects is the lack of conceptual understanding and deeper insight into the consequences of phenomena (Herron, 1996). Savander-Ranne & Kolari (2003) argue that several studies report that students who are able to solve numerical problems are not necessarily able to solve conceptual problems. They point out that students have been found to rely on algorithmic techniques rather than reasoning skills. For example, students may be able to solve numerical problems dealing with gas laws but are unable to solve conceptual problems on the same topic when the problems are presented in the form of a diagram, and students who are able to solve stoichiometric problems may have serious difficulties understanding a diagram-based representation of the combination of atoms and molecules and be unable to solve problems presented in that form. Such results have been replicated in studies with both homogeneous and heterogeneous student populations (Savander-Ranne & Kolari, 2003). Researchers agree that conceptual understanding cannot be assumed to follow when the focus is on narrowly defined problem solving. Conceptual understanding and a more qualitative approach need to be incorporated in setting educational goals, and the instruction should be designed accordingly (Pushkin, 1998; Savander-Ranne & Kolari, 2003). Assessing Conceptual Understanding Teaching communication system concepts can be a challenge, as electrical engineering students often do not see the immediate relationship between cause and effect that can be seen, for example, in mechanical or manufacturing engineering experiments. Nevertheless, the following observations from the literature facilitated designing the Conceptual Achievement Test.
Savander-Ranne & Kolari (2003) claim that it is not easy to know if students are learning, and even more difficult to know whether they have achieved true conceptual understanding. Assessing understanding requires careful observation and thorough analysis. A student's ability to recite definitions of concepts is of limited value as an indicator of conceptual understanding. Definitions should, at the very least, be accompanied by examples. Even then, students are very talented at sorting out the examples they are sure of and avoiding those they find unclear or difficult; hence, no significant information is obtained on the quality of understanding of a concept (Savander-Ranne & Kolari, 2003). Additional questions need to be asked by which the definition can be clarified, and situations need to be designed where justifications must be presented. As noted before, the ability to solve numerical problems and handle algorithms is no proof of conceptual understanding and does not display the conceptual difficulties of an issue or how a student is able to cope with these difficulties. According to Savander-Ranne & Kolari (2003), the following are engagements that may give insight into student understanding. Ask students to: define, describe and visualize a concept or phenomenon; synthesize an answer by providing explanations and justification (Why does something happen? How does something happen? What are the consequences of this?); and analyze an example or information that is new to them. Overview of Learning Objectives As stated by St. Clair (2000), no assessment of learning can be performed if the learning objectives are not clearly defined. Therefore, learning objectives based on cognitive levels were integrated into this research for several reasons. First, they were integrated to provide a systematic approach to clearly state what the
students needed to learn at the completion of the experiments and, as a result, what the instructor needed to prepare to achieve higher-order thinking. This also helped with understanding the different cognitive levels that the individuals could gain during the learning process. Furthermore, this allowed the creation of an assessment tool to measure the knowledge gained by the students at the different cognition levels (analysis, synthesis and evaluation). More specifically, the lectures and laboratory experiments were prepared by focusing on the cognition levels, followed by assessment tools that were compatible with the learning objectives. The following sections of this chapter provide an overview of the importance of learning objectives in general and of the learning objectives relevant to this study. Importance of Learning Objectives Engineering curricula often stress low-level items such as knowledge, comprehension and application, which are most efficiently achieved by the use of pure lecture. However, higher-order experiences such as analysis, synthesis and evaluation can be most effectively developed by the use of learning strategies such as physical experiments and hands-on activities. Learning through laboratory experiments and demonstrations can serve to illustrate concepts as well as help to strengthen a student's intuitive reasoning skills. In general, the higher the degree of activity involved for the student, the greater the retention of material and the development of higher-order skills in Bloom's (1956) taxonomy. As stated by St. Clair (2000), no assessment of learning can be performed if the learning objectives are not clearly stated. Therefore, instructional objectives
based on cognitive levels were integrated into this research in several stages. First, this was done to provide a systematic approach to clearly state the educational objectives. This clear set of objectives is pointed out by Diamond (1998) and Palomba and Banta (1999) as one of the first steps in any assessment (as cited by St. Clair, 2000). In addition, the learning objectives provide an understanding of the different cognitive levels that the individual could gain during the learning process. Therefore, they allow the creation of an assessment tool to measure the knowledge gained by the student at the higher cognition levels (analysis, synthesis and evaluation). The systematic approach taken in this research to implement the instructional objectives was as follows: 1. The cognitive levels of Bloom's Taxonomy were first used to define the laboratory instructional objectives of the modulation/demodulation process. Modulation/demodulation was the engineering content used in this research due to the complexity of the subject matter. 2. Then, based on the laboratory objectives, the simulation program was selected. 3. The concept achievement test was prepared and used to assess the students' conceptual learning. The suggestions provided by Savander-Ranne & Kolari discussed in the previous section were used as a guideline while preparing the test. This study is based on the belief that using instructional objectives in engineering education and engineering education research is advantageous, due to the fact that objectives state exactly what a student must learn and therefore indicate
exactly what must be assessed. Some of the instructional objectives can be met through the lectures and some are met through the laboratory experiments. However, this research emphasizes only those objectives that are met in the course of performing laboratory activities to achieve higher-order learning such as analysis, synthesis and evaluation. Learning Objectives Pertaining to the Current Study A large number of engineering students are visual, sensing and active learners, and it is necessary for them to see before they can fully process engineering concepts (Felder & Silverman, 1988). Therefore, knowing how and why are essential requirements of technical engineering courses. Encouraging the students to participate in higher-order thinking can be challenging; however, utilizing the taxonomy of learning objectives devised by Bloom (1956) can facilitate the process. In the case of learning modulation and demodulation topics in communication systems, the undergraduate engineering students are required to gain a higher-level understanding of a) the various modulation techniques, b) their functional relationship with respect to each other, and c) the skills necessary to choose the appropriate modulation technique in a given situation. One vehicle that can reinforce cognitive knowledge, provide the students with the opportunity to put theory into practice, and encourage higher-order thinking is physical activity. The following is a synthesis of how it was expected that the laboratory activities, through the physical experiments and the particular simulation used in this study, could provide the students with cognitive development at the higher levels of Bloom's taxonomy (analysis, synthesis and evaluation). Each level includes a brief description, keywords, the instructional
objective of the environment at that level, the activity to be performed by the students within the environment, and the assessment techniques. Analysis Description: This level emphasizes learner understanding of the meaning and intent of the concepts. The learner can break down a communication into its constituent elements or parts. Keywords: Outline, analyze, break down, categorize (St. Clair, 2000). Instructional Objective: The students will be able to outline and analyze the key stages of modulation techniques. Simulation-based Activity: The simulation allows the student to navigate through a set of modulation environments, which focus on the key stages of the modulation process. In each environment, the student is prompted to input parameters for the carrier wave, data signal, sampling frequency and modulation type. Based on these inputs, the simulation plots the input, modulated and reconstructed signals. As a result, the student is put in the position of making decisions based on the situation presented.
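To suggest what such an environment computes from those inputs, the following is a minimal Python sketch of the inner loop of a modulation environment. It is an illustration only; the function name, parameters and values are hypothetical and do not describe the interface of the MATLAB program actually used in the study.

    import numpy as np

    def modulate(message, fc, fs, kind="AM", m=0.5, kf=1_000.0):
        """Return the modulated waveform for the student's chosen inputs."""
        t = np.arange(len(message)) / fs
        if kind == "AM":
            # AM: the data signal shapes the carrier's amplitude
            return (1 + m * message) * np.cos(2 * np.pi * fc * t)
        # FM: the instantaneous phase integrates the data signal
        phase = 2 * np.pi * kf * np.cumsum(message) / fs
        return np.cos(2 * np.pi * fc * t + phase)

    fs, fc = 50_000, 5_000                  # sampling and carrier frequencies, Hz
    t = np.arange(0, 0.05, 1 / fs)
    x = np.cos(2 * np.pi * 200 * t)         # data signal chosen by the student
    am_wave = modulate(x, fc, fs, "AM")
    fm_wave = modulate(x, fc, fs, "FM")

Plotting x against the AM and FM outputs exposes exactly the stages the objective asks the student to outline: forming the carrier, shaping its amplitude or phase with the data signal, and sampling everything at the chosen rate.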
Physical Activity: The student is guided to assemble an electronic circuit for the modulator and then to vary the frequency-dependent components of the circuit in order to observe changes in the output signal. The student is then asked to plot the input, modulated and reconstructed signals. As a result, the student is put in the position of making decisions based on the situation presented. Assessment: The student will be asked to provide an outline and a brief analysis of the key parameters and stages of the modulation techniques. Why is this important? This allows the student to understand and analyze the criteria at each stage of the modulation process. Synthesis Description: This level focuses on learner ability to put together elements or parts to form a whole. Generally, this involves a recombination of parts of previous experiences with new material, reconstructed into a new and more or less well-integrated whole. Keywords: Integrate, formulate, create, build, generate (St. Clair, 2000). Instructional Objective: The learner will be able to integrate the modulation techniques with the physical characteristics of signal waves. Simulation-based Activity: The simulation provides the student with the opportunity to explore changes in the characteristics of the signal waves and their effects on the modulation process; thus the student can recombine his or her experiences to build an integrated knowledge of the modulation process. As a result, the student is encouraged to think about generating a procedure that relates input criteria to effects on the output signal. Physical Activity: The physical laboratory experiments provide the student with the opportunity to explore changes in the characteristics of the output signal waves and the modulation process; thus the student can recombine his or her experiences to build an integrated knowledge of the modulation process. As a result, the student is encouraged to think about generating a procedure that relates input criteria to effects on the output signal. Assessment: The student will be asked to integrate the modulation techniques with the physical characteristics of signal waves. Why is this important? It is important for the engineering student to make
informed decisions about the modulation techniques while drawing information from sources such as the physical characteristics of signal waves. Evaluation Description: This level emphasizes the ability of the learner to make judgments about the value of material or methods for a given purpose. This judgment may be either qualitative or quantitative, and the criteria may come from the learner or any other source. Keywords: Criticize, argue, evaluate, judge (St. Clair, 2000). Instructional Objective: The student will be able to evaluate the appropriateness of using a modulation technique for a given situation. Simulation-based Activity: The simulation provides a series of plots and allows the student to change the characteristics of the modulation. The student is given visual feedback based on the characteristics he or she has chosen and is aided by visual prompts on the appropriateness of the choice. This allows the student to evaluate each situation. Physical Activity: By changing the physical characteristics of electronic circuits and observing the variation in the modulation process and the output of the system, the student may experience the correlation between the choice of physical characteristics and the modulation process. This allows the student to evaluate each situation. Assessment: The student will be given a scenario and will be asked to evaluate physical parameter settings such as frequency, amplitude and phase. Why is this important? Because it highlights the need for the engineering student to evaluate the product outcome of the decisions made during modulation technique
selection as part of a communication system design.
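As one concrete criterion on which such an evaluation might rest (a standard textbook result, not taken from the study's laboratory materials), the two techniques demand very different transmission bandwidths. For a message of bandwidth W and an FM peak frequency deviation \Delta f,

    B_{\mathrm{AM}} = 2W, \qquad B_{\mathrm{FM}} \approx 2\,(\Delta f + W) \quad \text{(Carson's rule)}.

For example, with W = 15 kHz of audio and \Delta f = 75 kHz (broadcast-FM values), Carson's rule gives roughly 180 kHz for FM against only 30 kHz for AM; weighing that bandwidth cost against FM's greater noise immunity is precisely the kind of judgment the Evaluation level asks of the student.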
Summary In summary, there is considerable research conducted over the past two decades on the effectiveness of instructional simulation, but the results are inconsistent. Several explanations concerning the inconsistent results of simulation research have been offered. Poor research designs are partly to blame (Butler et al., 1988; Lee, 1999), but a much more serious problem is the lack of a theoretical framework for the instructional use and evaluation of simulation (Bredemeier & Greenblat, 1981). There is also a great amount of work on the use and effectiveness of computer-simulated laboratory experiments in the field of science education. Those findings support the use of simulation for laboratory activities, which is inconsistent with the results obtained from the use of simulation as a general instructional tool. As part of the results of his meta-analysis, Van LeJeune (2002) reported that, surprisingly, simulations produced between 1983 and 1993 proved to be more effective at promoting student achievement than more recent ones, especially in outcomes relating to low-level thinking skills. It was also found that there is a gap in the literature in terms of the application of simulation for laboratory instruction purposes in the field of engineering education, which could contribute not only to laboratory cost reduction but also to the availability of such laboratories to those who cannot attend traditional classrooms. The small amount of related literature pertaining to the use of simulation as a means of conducting laboratory experiments in engineering education is a strong indication of the lack of emphasis that this subject area has received in the past. Some of the engineering areas that have been reviewed include the teaching of subjects such as electricity and magnetism (Chou, 1998); electrical operational amplifiers (Dobson & Hill, 1995); basic electronics (Moslehpour, 1993); engineering fluid mechanics (Engel et al., 1996); basic thermodynamics (Buttles, 1992); and engineering physics (Chien, 1997; Choi and Gennaro, 1987). The results of these studies are also inconclusive.
Chapter 3 Procedures The purpose of this study was to examine the effectiveness of utilizing a computerized simulation program to perform modulation and demodulation laboratory experiments and to compare its effects with those of a traditional physical laboratory. The simulation program was a demonstration program built in MATLAB and revised to fit the purpose of this study. The chapter is divided into the following sections: (1) research design, (2) participants, (3) methodology and procedures, (4) overview of laboratory sessions, (5) research questions, (6) instruments and materials and (7) statistical analysis procedures and qualitative research design. Research Design This study is a mixed-method study, which combines both quantitative and qualitative approaches in the research methodology. The current research effort therefore has three complementary tracks. The first of these is a quantitative study to examine the differences between the two groups in their scores on the post-test as well as on the follow-up measure; in addition, this quantitative track examines the difference in lab completion time. As shown in Table 1, the physical lab group performed communication systems laboratory exercises using a traditional hardware laboratory (Appendix A) and the simulation group used simulation software for performing
similar laboratory exercises (Appendix B).

Group        Treatment
Simulation   Computer Simulation Lab
Physical     Traditional Physical Lab

Table 1. Design of the Study

The second track is also a quantitative study, using an attitude survey questionnaire (Appendix E) to examine the attitudes of the students toward the simulation as well as the attitudes of both groups toward the use of a laboratory in general. The third track was a qualitative study that uncovered issues and differences that were not shown by the quantitative study. Quantitative Research Design The general research design for the quantitative portion of the study is a true experiment in which the students were randomly assigned to either the simulation or the physical laboratory group. Variables for the Quantitative Research A description of the variables in the study is shown in Figure 1.
[Figure 1. Research Variables. Independent variable: laboratory method (simulation or physical). Dependent variables: achievement scores, attitude, lab completion time, knowledge retention.]

Participants Three sections of a digital design course were offered during the Fall semester, with a total of 87 students (28, 25 and 34 students in each section, respectively). Only data from 80 students were used, because three students dropped the course before the midterm and four students did not take the midterm exam and as a result did not produce any scores for the follow-up test; the scores of those seven students were therefore eliminated from the final study. Dropped students were evenly distributed over the two groups. The sample included the 80 students enrolled in the course during the data collection period. Students enrolled in the course were junior- or senior-level undergraduate students pursuing a four-year degree in electronics or computer engineering technology. All three sections were taught by the same instructor, and each section included 2 hours of lecture and 2 hours of lab. The demographics and backgrounds of the students were obtained through a student data sheet (Appendix D). The demographic survey acted as a filter for inclusion in the final study; the criterion for inclusion was previous experience working with circuits (Circuit I or a similar subject as a prerequisite).
Data from the demographic survey indicated that of the 80 students from all three sections, 57 were male and 23 were female. In terms of age, 31 of the students were less than 20 years of age, 46 were 20-30 and only 3 were 31-40. There were 55 seniors and 25 juniors. Only 10 of the 80 students reported that they had used simulation before. In addition, six of the 80 reported that the subject of modulation and demodulation had been covered previously in some of their classes (with no use of simulation), but all six reported that they did not understand the concept. All 80 students had taken Circuit I or a similar course, and the grades were as follows: 11 students earned an A, 24 a B, and 45 a C. The students in each section were randomly assigned either to the simulation or to the physical laboratory group, which signifies that the research design is a true experiment. Random assignment is the best technique available for assuring initial equivalence between different experimental groups; in addition, internal validity increases due to random assignment of the participants. To ensure that the students were motivated to participate in the study, they were reminded that their test score would count in the course grade and that they would also earn 5 extra credit points on their final grade by participating in the study. Those who participated in the qualitative portion earned another 5 extra credit points. Methodology and Procedures The independent variable in this study is the method of instruction, a variable with two categories: computer simulation and physical laboratory. The dependent variables are the post-test scores, follow-up scores, attitude scores and laboratory
completion time scores. The post-test was made up of problem-oriented items and a few multiple-choice questions. A description of the post-test is included later in the chapter. The subject matter for this study is signal modulation and demodulation. As mentioned before, three sections of a digital design course were offered, and only one instructor taught all three sections. All sections met once a week on three different days for a period of five hours. Normally, two hours are dedicated to lecture and two hours are used as laboratory time; however, in this study an hour and a half was used for lecture. The students met in the classroom as scheduled. All participants received an hour-and-a-half lecture on the topic of FM and AM modulation and demodulation. Then, for 30 minutes, the research project was explained to them; they were asked to sign the consent form and were allowed to keep a copy of it. They were reminded again about the 5 points of extra credit for participating in the study. They were also asked to take a few minutes to answer background questions (Appendix D). Based on the last two digits of the subject's student ID, each student was assigned to one of the two groups. Then, for the rest of the hour, the physical lab group met in the hardware laboratory and the simulation group met in the computer lab. Each group was given a pre-lab (see Appendix C) for 20 minutes, followed by two laboratory experiments specifically designed for each group. Overall treatment time was the same for both groups. The pre-lab for both groups was designed with five objectives in mind: 1. Introduce the students to the simulation program or the laboratory equipment. 2. Allow students to become familiar with the new material. 3. Alert the student to the overall nature of the process.
4. Establish the need for deeper understanding. 5. Answer questions. For approximately an hour and a half, the physical lab group performed the experiments (see Appendix A) in a well-equipped electronics lab at the college of engineering technology, proctored by a teaching assistant. The laboratory equipment and instruments were pre-assembled for the experiments. The simulation group performed similar lab experiments (see Appendix B) using computer simulation software (see Appendix K) in a well-equipped PC lab proctored by another teaching assistant. The simulation program and the MATLAB software were pre-installed on each PC in the computer lab. The simulation software was installed only in a lab that did not have open lab hours, so the students could access the software only during the scheduled class time. Likewise, the electronics lab was made available to the students only during the scheduled class hour. In addition, the PC lab was equipped with LinkSys hardware, which allowed the researcher to observe the students' monitors and their activities to ensure that they used the simulation only to complete the two lab experiments and performed no other activities with it. Both groups were given the same guidelines for completing the lab activities to help achieve a cognitively similar treatment for both groups. Teaching assistants were responsible for proctoring each laboratory and the exam session. The teaching assistants were instructed to record a start and end time stamp for each participant in order to keep track of the time it took the students in each group to complete each experiment. Initially, the results of the pilot study revealed that the two hours of lab time allocated for students to complete the assignments could be decreased to one hour and 30 minutes. But then, based on a discussion with the instructor, it was decided that
such a result might have been due to the small number of students in each group, and that as the sample size grew larger, the two-hour lab time would be appropriate. Therefore, no changes were made to the initial lab time allocation. However, in the actual study, lab experimentation time did not exceed an hour and a half for each group. At the completion of the experiments, the students remained where they were and took a one-hour exam. The pilot test indicated that the time allocated for the exam was sufficient, since the pilot students finished their exam within 40 minutes; but due to the larger number of participants, it was decided to allocate one hour to the exam. After completing the lab experiments, both groups were asked to remain in their seats and complete the attitude survey questionnaire. Then the physical lab group was dismissed from the physical lab, while the simulation group remained in the PC lab and completed an additional attitude survey and the qualitative survey questionnaire. A few days after the post-test, three students from each group were randomly selected to participate in a group interview. The details of the interviews are discussed later in the chapter. Three weeks after the first treatment, all 12 post-test questions were incorporated into the students' midterm exam to examine the difference between the two groups in terms of their knowledge retention. An overview of the research procedure is presented in Figure 2.


[Figure 2 (flowchart). (1) Initial Procedures: contact the instructor; explain the research purpose and procedures; obtain IRB approval. (2) Validation Process: validate the test and treatment instruments; test the reliability of the instruments; validate the qualitative instruments. (3) Training Procedure: train the instructor and the teaching assistants. (4) Research Design: quantitative strand with random assignment to the simulation or physical condition, signed consent forms, a lecture given to all students by the instructor, pre-lab instruction, lab experiments proctored by TA1 and TA2, a cognitive achievement test (administered to both groups once more in three weeks as a follow-up), attitude surveys and exit questionnaires; qualitative strand with observation and a group interview of three students randomly pulled from each group, conducted and transcribed by the researcher.]

Figure 2. Overview of Research Procedures


An Overview of Laboratory Sessions

In order to eliminate any type of bias imposed by the instructor or the researcher, it was decided that two teaching assistants (TAs) would be involved in the study. One TA was assigned to each lab: one to the physical lab and one to the PC lab. The researcher was present in each lab only for the purpose of observation. The TAs were given training prior to the study. They were provided with written instructions for the pre-lab (see Appendix C) and were instructed to read the instructions aloud to the students without adding any additional comments. The training also included:

- Performing the entire experiment in advance to help the TAs become familiar with the experiments and with some of the stumbling blocks that the students might confront, which could be fixed before the experiments.
- Enforcing laboratory rules, since safety is an issue.
- Recording the questions that were asked or problems that arose.
- Answering questions about problems with the equipment, but not questions related to the experiment itself.
- Stamping the ending time of the experiment.
- Taking about 10 minutes to perform a sample experiment for the students, to familiarize them with the equipment and the simulation.

Research Questions, Materials and Instruments

Table 2 provides the overall study questions, data collection techniques, instruments and data sources.


1. In terms of student conceptual learning, how do simulation-based laboratory experiences compare to physical laboratory experiences?
   Technique: Experimental study. Task/Material/Instruments: Conceptual Achievement Test; rubric for grading the test; laboratory activity sheets; pre-lab instruction sheets. Data sources: Post-test scores.

2. How does the students' attitude toward the use of the simulation affect their post-test scores?
   Technique: Experimental study. Task/Material/Instruments: Attitude survey questionnaire. Data sources: Attitude scores.

3. How does the simulation group's attitude toward the laboratory experience differ from the physical group's?
   Technique: Experimental study. Task/Material/Instruments: Attitude survey questionnaire. Data sources: Attitude scores.

4. In terms of completion time of the assigned laboratory experiments, how do simulation-based laboratory experiences compare to physical laboratory experiences?
   Technique: Experimental study. Task/Material/Instruments: Time log. Data sources: Time log.

5. In terms of student knowledge retention, how do simulation-based laboratory experiences compare to physical laboratory experiences?
   Technique: Experimental study. Task/Material/Instruments: Conceptual Achievement Test; rubric for grading the test. Data sources: Follow-up test scores.

6. What are the perceptions of both groups on the use of laboratory experiments in general for learning the concepts?
   Technique: Group interview. Task/Material/Instruments: Pre-structured interview questions. Data sources: Audio transcription of interviews; observation notes.

7. What is the students' perception toward the use of simulation in place of the physical laboratory?
   Technique: Questionnaire; group interview. Task/Material/Instruments: Pre-structured interview questions; individual questionnaire. Data sources: Audio transcription of interviews; observation notes; exit questions.

Table 2. Overall Study Questions, Data Collection Techniques, Instruments and Data Sources


Quantitative Research Questions

The main research question for this research project is: Can simulation-based laboratories replace physical laboratory methods? Specifically:

Question 1. In terms of student conceptual learning, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 2. How does the students' attitude toward the use of the simulation affect their post-test scores?

Question 3. How does the simulation group's attitude toward the laboratory experience differ from the physical group's?

Question 4. In terms of completion time of the assigned laboratory experiments, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 5. In terms of student knowledge retention, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 6. What are the perceptions of both groups on the use of laboratory experiments in general for learning the concepts?

Question 7. What is the students' perception toward the use of simulation in place of the physical laboratory?

An experimental research design was used to examine Questions 1-5, while a qualitative case study design was carried out to explore Questions 6-7.


Statement of Null Hypotheses

The focus of this study was to examine the effects of simulation in terms of its capability to replace physical laboratory methods. It was hypothesized that the treatment group students would perform as well as the control group, would appreciate the subject matter and value the instructional treatment more, and would spend less time completing the lab experiments.

H01: There is no significant difference (at the p = 0.05 level) between the physical group's and the simulation group's attitudes toward the laboratory experience as measured by an attitude survey at the completion of the post-test.

H02: There is no significant difference (at the p = 0.05 level) on post-test scores between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H03: There is no significant difference between the simulation and physical laboratory groups' long-term retention of the concepts as measured by mean scores on a follow-up instrument.

H04: There is no significant difference on laboratory completion time between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H05: There is no significant correlation between the simulation group's attitude toward the use of the simulation and their post-test performance scores.


Instrumentation and Materials

Laboratory Experiments: It was recognized early in the study that not all programs challenge students to apply what they practice in freeform interaction with the simulation; as a result, the design of laboratory assignments based on real-life problems was of utmost importance. In this study, the objective of having students complete the laboratory exercises was to anchor their learning by getting them to instinctively react to system changes in a meaningful context. In doing so, students become aware of the relationship between real-life phenomena and how they affect the system input and output variables studied in class. The laboratory experiments were designed by the instructor and the researcher and validated by three professors (details of the validation process are explained later in this chapter). Two matched sets of two specific laboratory experiments were developed for this study (see Appendices A and B). One set of the two lab assignments required the use of the physical laboratory equipment, and the other set required the use of the computerized simulation program. The laboratory experiments covered two different topics in communication systems: AM modulation and FM modulation. The experiments had the same level of difficulty. However, the physical group members were required to work in a circuit laboratory where the circuits were pre-assembled and ready for experimentation. This laboratory was aimed at students at the junior/senior level who had already been exposed to circuit assembly and troubleshooting techniques in lower-level engineering classes. Therefore, the objective of these experiments was for students to learn the underlying concepts, not laboratory techniques and troubleshooting.


Accordingly, one of the assumptions of this study was that the students enrolled in the digital design course had already satisfied the course prerequisites, including Circuit I. For each laboratory assignment, the students received a detailed handout describing the laboratory exercise to be completed and also received basic instruction on the use of the simulation program.

Simulation software: The simulation program used in this study was a demo program designed for the purpose of training students on the concepts of modulation and demodulation. The researcher had to make minor modifications and corrections to the simulation program for the purpose of this study. For more details on the simulation program, refer to Appendix K.

Laboratory completion time: The TAs were responsible for stamping the start and ending times on each student's laboratory sheet.

Attitude Survey Questionnaire: The purpose of the attitude survey questionnaires was to learn about (1) the attitude of the simulation group toward the use of the simulation program and (2) the attitudes of both groups toward the use of laboratory experiments as part of the course curriculum. The composition of the attitude survey questionnaires was based on the guidelines by Crocker and Algina (1996, p. 30), which include (a) putting statements in the present tense, (b) avoiding "if" or "because" clauses and (c) avoiding universal quantifiers. A team of two survey design experts evaluated an earlier draft of both attitude surveys. Originally, the attitude survey for only the simulation group included 10 items. Each item was checked for accuracy, wording, ambiguity and other technical flaws. Changes that resulted from the team evaluation were to (a) increase the number of questions from 10 to 13 items in order to include some reverse-worded questions (questions 2, 6 and 8) and (b) improve the wording and clarity of the questions.


The same team of survey design experts then reviewed the revised document, and several suggestions resulted in a few minor changes in wording.

Conceptual Achievement Test: For the purpose of measuring students' conceptual achievement, the researcher and the instructor of the communication systems course developed the exam, and three professors in electronics education, who had previously taught the communication systems course, validated its content. The conceptual achievement test consisted of 12 items, which were carefully designed based on the objectives (discussed in Chapter 2) for the modulation and demodulation section of the course (Table 3). The test was designed to measure student understanding of the data transmission process. All of the questions were of a conceptual nature. The test was not produced to fully cover the domain of communication systems; the questions were created for one of the topics of communication systems, namely modulation and demodulation, about which students most often have misconceptions. To answer the questions, simply recalling the definition of a concept is not enough; students need to understand the concepts and apply them to a situation. Therefore, these questions can elicit students' intuitive concepts and, at the same time, test students' understanding of the concepts. Before completing the exam, the students were encouraged to think about the strategies that they could use to solve the problems. The first page of the exam provided the students with a list of some strategies, to give them something to think about before answering each question. The exam consisted of eight open-ended questions and four multiple-choice questions. A grading rubric was developed for the open-ended questions (see Appendix G).


Analysis (Items 1, 9, 12). Assessment: The students will be asked to provide an outline or graph of the key parameters and stages of the modulation techniques. Objective: The students will be able to outline and analyze the key stages of modulation techniques.

Synthesis (Items 3, 4, 6, 7, 10, 11). Assessment: The students will be asked to integrate the modulation techniques with physical characteristics of signal waves. Objective: The learners will be able to integrate the modulation techniques with physical characteristics of signal waves.

Evaluation (Items 2, 5, 8). Assessment: The students will be given scenarios and will be asked to evaluate physical parameter settings such as frequency, amplitude and phase. Objective: The students will be able to evaluate the appropriateness of using a modulation technique for a given situation.

Table 3. An Overview of the Conceptual Achievement Instrument

As shown in Table 3, the purpose of the exam was to measure student learning at higher cognitive levels. Therefore, the majority of the questions on the exam are open-ended and require the students to analyze, evaluate and synthesize. The exam consists of four multiple-choice and eight open-ended questions. A reliability estimate of the performance measure revealed a Cronbach's alpha of .70. Eliminating questions 1, 5 and 12 would have increased the Cronbach's alpha by only a few points (to .73), so after a discussion with the expert panel it was decided that those questions should be kept. It also seemed reasonable to compare the group means obtained from the pilot data for some further analysis. The results of the pilot study revealed that (Appendix J):


- Both groups performed comparably at the synthesis level (questions 3, 4, 6, 7, 10, 11).
- The simulation group did better than the physical laboratory group at the evaluation level (questions 5 and 8).
- The physical laboratory group did slightly better on question 1, and the groups performed equally on the remaining questions at the analysis level (questions 9, 12).

In the actual study the results differed from the pilot study; those results are discussed in Chapter 4.

Reliability of the Test Scores

During the pilot study, only one instructor graded the achievement tests. However, one important question that became apparent as a result of the pilot study was: How reliable are those test scores? This question was addressed by having two instructors, using the same grading rubric, evaluate the students' achievement tests. The next question was: What is the extent of inter-rater reliability among the scores assigned for each student? To answer this question, alpha reliability was computed for the scores reported by the two instructors for each student to examine the internal consistency in grading. The calculations revealed alpha reliabilities ranging from a low of .96 to a high of 1, with an overall reliability of .94, indicating an acceptable consistency of grading between the instructors. Since the ratings are positively correlated, we can be reasonably sure that they are measuring the same construct.
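For reference, the internal-consistency coefficient used throughout this study is Cronbach's alpha. For k items (or raters) with item-score variances sigma_i^2 and total-score variance sigma_X^2, it is defined as

$$\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^2}{\sigma_X^2}\right)$$

Values near 1 indicate that the items (or, here, the two raters' scores) vary together, i.e., that they are plausibly measuring the same construct.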


Content Validity, Credibility and Reliability of the Instruments

Several approaches were used to evaluate the instruments designed for this study in terms of validity and credibility. According to Law (2003), validation is the process of determining whether an instrument is an accurate representation of the system for the particular objectives of the study. Credibility is established when the decision maker accepts a simulation program and its results as correct. In simulation evaluation, validity does not imply credibility, and vice versa (Law, 2003).

Content Validity and Credibility of the Simulation Program and Conceptual Achievement Test Instrument: Content validity for these instruments, for both the simulation and the physical laboratory group, was established by having three electronics professors at two universities review them. Each professor was given a booklet containing a copy of the lecture material, lab experiments I and II (for both groups), the post-test and a CD-ROM containing the simulation program. The booklet included a cover letter and an evaluation rubric (see Appendix G). The researcher met with each reviewer to discuss the following details:

- The overall objective of the study
- The specific questions to be answered by the study
- The time frame for the study
- Differences between validity and credibility of the instruments
- The importance of the instruments meeting the learning objectives

The reviewers were also asked to review everything included in the booklet carefully and to provide comments in addition to the scores in the rubric. Based on the feedback, some changes were made to the organization of the lecture material, the lab experiments and the post-test.


Reliability of the Performance Instrument: The reliability of the performance test instrument was estimated, which resulted in a Cronbach's alpha of .70.

Reliability of the Attitude Survey Instrument: The attitude survey was pilot tested during the pilot study (see Appendix J). Internal consistency estimates using coefficient alpha were computed for both attitude surveys (see Appendix E). The survey administered to both groups of students had a coefficient alpha of 0.92. The alpha reliability of the second attitude survey, administered only to the simulation group, revealed a coefficient of 0.89.

Statistical Analysis Procedures

To test H01 (no significant difference, at the p = 0.05 level, between the physical group's and the simulation group's attitudes toward the laboratory experience as measured by an attitude survey at the completion of the post-test), descriptive statistics were computed using the mean scores of the physical group and the simulation group on each question. In addition, a two-tailed t-test was calculated to determine any significant difference between the two groups. This hypothesis was first evaluated in the pilot study; for more details, refer to Appendix J.

To test H02 (no significant difference, at the p = 0.05 level, on post-test scores between students performing physical experiments on a traditional communication systems topic and those performing the same experiments using a computerized simulation program), an independent two-tailed t-test was run on the physical group's post-test scores and the simulation group's post-test scores.

To test H03 (no significant difference between the simulation and physical laboratory groups' long-term retention of the concepts as measured by mean scores on a follow-up instrument), an independent two-tailed t-test was run on the physical group's follow-up test scores and the simulation group's follow-up test scores. This hypothesis was also tested in the pilot study; for more details, refer to Appendix J.

To test H04 (no significant difference on laboratory completion time between the two groups), an independent two-tailed t-test was run on the laboratory time logs of the physical and simulation groups.

To test H05 (no significant correlation between the simulation group's attitude toward the use of the simulation and their post-test performance scores), a Pearson correlation analysis was used.
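For readers who wish to reproduce this kind of analysis, the sketch below illustrates the two procedures named above with SciPy on synthetic data whose parameters merely echo the descriptive statistics reported in Chapter 4. It is not the study's analysis code (the statistical software used is not specified), and all variable names are placeholders.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for the real score vectors (40 students per group);
    # the means and spreads loosely echo Table 5, nothing more.
    physical = rng.normal(13.8, 1.1, 40)     # physical group post-test scores
    simulation = rng.normal(31.7, 2.7, 40)   # simulation group post-test scores

    # H02 (and analogously H03 and H04): independent two-tailed t-test.
    t, p = stats.ttest_ind(physical, simulation)
    print(f"independent t-test: t = {t:.2f}, p = {p:.4f}")

    # H05: Pearson correlation between attitude scores and post-test scores.
    attitude = rng.normal(3.2, 0.4, 40)      # synthetic attitude means
    r, p_r = stats.pearsonr(attitude, simulation)
    print(f"Pearson correlation: r = {r:.2f}, p = {p_r:.4f}")

    # A within-group comparison of post-test vs. follow-up scores (the post hoc
    # analysis reported in Chapter 4) would instead use stats.ttest_rel.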


Threats to Internal and External Validity

Threats to Internal Validity

To maximize internal validity, the participants in the study were randomly assigned to each treatment group. However, concerns about threats to the internal validity of this study included testing, behavior bias and evaluation anxiety.

Testing. This could be a threat to the internal validity of this study, since the students took both a post-test and a follow-up test, which may cause testing sensitization.


Behavior bias. This occurs when a participant has a bias toward an intervention, either positive or negative. Every measure was taken to prevent any type of bias during the study.

Evaluation anxiety. This occurs when the participant is subjected to a timed event or placed into a situation that causes him or her anxiety. This could have been an internal validity threat to this study. However, the students were allocated enough time to complete the laboratory assignments and the post-test.

Threats to External Validity

Concerns about threats to the external validity of this study included population validity, the Hawthorne effect and the novelty effect.

Population validity. This was the first external validity threat that might have affected this investigation. Population validity is the extent to which findings are generalizable from the sample of individuals on which a study was conducted to the larger target population, as well as across different subpopulations within the larger target population. In this investigation this could have been a threat to external validity due to the limited and narrow sampling of the population of undergraduate junior/senior-level electronics engineering students.

Hawthorne effect. This occurs when subjects perform differently because they know they are being studied. This could have been a threat to the external validity of this investigation.

Novelty effect. A treatment may work because it is novel and the subjects respond to its uniqueness rather than to the actual treatment.


Qualitative Research Design

The qualitative component of the study consisted of an interview questionnaire, group interviews and brief observation. The simulation group completed individual interview questionnaires after the completion of the post-test. Group interviews were conducted with both simulation and physical group participants a few days after the post-test. The interviews were transcribed and analyzed for themes, which provided insights regarding the effectiveness of the laboratory experience, the use of problem-solving strategies and the simulation group's attitude toward the use of the simulated laboratory in place of the physical laboratory. The use of interviews as a data collection method is presumed to provide truthful and meaningful perspectives from the participants. The advantage of an interview over a paper-and-pencil survey is the possibility of interpersonal contact and the opportunity to follow up on interesting comments.

Qualitative Research Questions

A commonly stated objective of engineering laboratory work is to allow students to learn to handle the lab equipment that is used in an actual experiment. For the purpose of the modulation laboratory experiments, the assumption of the digital design course was that the students had taken Circuit I prior to taking this course. Thus, in Circuit I or a similar course, they had learned to work with lab equipment. Therefore, the main objective of these specific experiments was to allow the students to gain a better understanding of the modulation process by observing the process and by manipulating the variables. The intention was not to teach them how to assemble the circuits. Accordingly, in the case of these specific experiments, the circuits are normally pre-assembled and the students are required to operate and manipulate the circuits according to the lab experiment handouts and make observations.


Although the physical lab group was not required to assemble the circuits, they were required to handle the circuits accurately and safely. Therefore, a qualitative research method based on an interview questionnaire (Appendix H) was used to reveal not only the experimental group members' reactions toward the use of simulation, but also their thoughts and feelings about their lack of access to physical equipment. The two main qualitative research questions were:

- What are the perceptions of both groups on the use of laboratory experiments in general for learning the concepts?
- What is the students' perception toward the use of simulation in place of the physical laboratory?

Qualitative Interview Questionnaire

An interview questionnaire was designed for the simulation group (Appendix H). The questionnaire helped in gaining a deeper understanding of students' feelings and thoughts on the use of simulation instead of physical laboratory equipment.

Content Validity of the Qualitative Instrument

A panel of two experts reviewed the survey for content-related validity. These experts were the director of research at a southern university and the director of grants and proposals.


The questionnaire, which consists of seven questions, is intended to provide more information on the feelings and thoughts of the students on the use of the simulation program in place of the physical laboratory. Both experts suggested that the questionnaire method should be replaced with a group interview, if possible; they felt that the interview questionnaire method does not capture the true feelings of the students. As a result, both methods, the survey questionnaire and the group interview, were employed.

Group Interview

Standardized achievement tests and questionnaires can supply researchers with relatively objective data and are easily administered to a large number of participants at low cost and in less time, but they cannot probe deeply into respondents' opinions and feelings. An alternative method used to obtain a deeper understanding is the interview, which makes it possible for researchers to gain information that individuals probably would not reveal through any other data-collection method; this is supported by reported studies (Gall et al., 1996). On the other hand, in the science education literature many researchers report cases in which students get right answers on standardized tests by guessing or through a wrong understanding of the phenomena. Therefore, a right answer to one particular question does not necessarily mean students understand the associated phenomena (Berg & Brouwer, 1991). Group interviews were conducted to remedy these shortcomings of quantitative measures. Group interviews were conducted with both simulation and physical group participants a few days after the completion of the treatment. Six students, three from each group, were invited to a group interview.


The interviews were transcribed and analyzed for themes, which provided insights regarding the effectiveness of the laboratory experience, the use of problem-solving strategies and the simulation group's attitude toward the use of the simulated laboratory in place of the physical laboratory. An interview guide, or questionnaire, utilizing open-ended questions was used in the qualitative data-gathering phase (see Appendix H). The questionnaire had several functions: (1) it provided structure and organization and ensured that all the ground was covered in the same order for each respondent; (2) it established channels for the direction and scope of discourse; and (3) it protected the larger structure and objectives of the interview (McCracken, 1988). The interview guide was basically intended to establish a conversation with the participants. Prior to the start of the interview, students were encouraged to express their thoughts and were assured of confidentiality as stipulated by the Institutional Review Board.

Sampling for Qualitative Research

Unlike in quantitative research, the purpose of qualitative research is normally not to test hypotheses or theories but to develop a deeper understanding of the studied phenomena; it is interpretive in nature. Sampling for a qualitative study is therefore much different from sampling in quantitative research. In contrast to the random selection in quantitative research, the process of sampling for qualitative research is called purposeful sampling (Patton, 1990). The sample is selected from those that typically represent the studied phenomenon. It can be more than 100, but it can be fewer than 10, or even only 1 (Patton, 1990).


In this study, six students were selected, three from the simulation group and three from the physical group, for a 30-minute interview. The interviews were conducted by the researcher and an electrical engineering professor who was familiar with the course but did not teach it during the current semester. The interviewees were selected based on the class observations and their responses to the exit questions included on the last page of the Conceptual Achievement Test (see Appendix F). Gender was another criterion for the sample selection. This method of selection was employed for two purposes: (a) to provide a wide range of opinions and (b) to avoid extreme one-sidedness. The interviews were audiotaped with the permission of the interviewees. An agreement form (Appendix M) was provided to the participants prior to the treatment.

Transcription and Coding Process

The interviews were transcribed and analyzed for themes, which provided insights regarding the effectiveness of the laboratory experience and the simulation program. The interview transcripts were read several times before being coded. The themes were then color-coded. The answers to the guiding questions produced important thematic data. The interviews were recoded to ensure that no major themes were overlooked. The themes are supported with quotes from the interviews representing the students' voices. In certain cases, a cumulative account of the most popular student views is presented. Some of the students' opinions are presented in an indirect voice to provide a concise account of their narration. Some participant quotes were edited for conversational flow to improve readability.


Observations

In this study, the researcher conducted brief real-time laboratory observation. Some scholars have reminded us of the bias of observation, namely the effect of the observer on the observed; for example, students are likely to change their normal behavior pattern when an observer enters the room. Fortunately, this was not a problem, since the researcher was able to monitor students' PC activity while they worked. Each engineering PC lab was equipped with 30-35 computers, one for each student, and one computer, set up as a server, for the teacher. In each lab there were several subnets, or separate networks, interconnected by a switch (a LinkSys router). Behind one of the subnets lies the professor's PC; behind another subnet lie the students' PCs. Each router is connected to the switch, and each computer located behind each subnet has a unique IP address. As shown in Figure 3, this type of network allows the teacher to see each student's screen and observe what he or she is doing (one student at a time).

[Figure 3. PC Lab Structure: the instructor's server connected to student client machines (Student 1 / Client A; Student 2 / Client B).]


The availability of the LinkSys setup in the labs was very helpful because it allowed the researcher to monitor the students' activities and jot down a few observations that seemed important for further analysis. The data analysis primarily consisted of the following steps: reading notes, identifying important issues and drawing conclusions.


Chapter 4

Results

Introduction

This chapter begins by restating the hypotheses that were tested in order to answer the research questions this study sought to answer. Second, the procedures that were implemented throughout the study are described. Next, descriptive statistics for the dependent measures of achievement administered at two time periods (post-test and follow-up) are presented and discussed. Then the results from the statistical analyses are presented. Finally, a summary of the results of the tests of the null hypotheses is presented, suggesting answers to the research questions that were posed. Possible threats to validity are then presented and discussed. Also, a summary of the qualitative data gathered from the students via questionnaires, observation and group interview is presented.

Null Hypotheses

The hypotheses, stated in null form, that were tested in the study are as follows:

H01: There is no significant difference (at the p = 0.05 level) between the physical group's and the simulation group's attitudes toward the laboratory experience as measured by the attitude survey at the completion of the post-test.


H02: There is no significant difference (at the p = 0.05 level) on post-test scores between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H03: There is no significant difference between the simulation and physical laboratory groups' long-term retention of the concepts as measured by mean scores on a follow-up instrument.

H04: There is no significant difference on laboratory completion time between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

H05: There is no significant correlation between the simulation group's attitude toward the use of the simulation and their post-test performance scores.

Procedures

For all students, the research project was explained briefly at the beginning of the class. Students were asked to sign the consent form to participate in the study and the agreement form to participate in the group interview. They were allowed to keep a copy of the forms for their own records. The students were not told whether their group would experience the control or experimental condition. Next, students were asked to complete a demographics form (see Appendix D); the results of the demographics are discussed in the previous chapter.


One instructor then presented the lecture to all sessions. The lessons covered an introduction to analog modulation and demodulation techniques, methods for creating and demodulating FM signals, estimating the modulation index for FM modulation, methods for creating and demodulating AM signals, estimating the modulation index for AM modulation, the differences between the AM and FM modulation techniques, frequency features of carrier and signal waves, and a discussion of noise performance. Then the students were randomly assigned either to the physical lab or to the simulation lab.

In both groups, after the laboratory experiments were complete, the Conceptual Achievement Test and the Attitude Survey were administered as post-tests. The Conceptual Achievement Test was used as the second test of the semester. No make-up measures were necessary for either treatment group. Two instructors scored the Conceptual Achievement Test, since the questions on the test were open-ended. Students then received their score and a personal summary of the types of questions they had missed; they were not allowed to see the actual test again at this time, in order to protect the integrity of the follow-up test, which was yet to come. Immediately after the treatment measures, a randomly selected sample of students from both groups was invited for a group interview. These qualitative data are presented in Appendix N and discussed later in this chapter. Then, three weeks after the first post-test, the follow-up Conceptual Achievement Test was given. A comparison of the grades is presented later in the chapter.
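As background to the lecture topics listed above: in amplitude modulation, a message of amplitude A_m rides on a carrier of amplitude A_c, and the AM modulation index is mu = A_m / A_c; for FM, the analogous index is beta = delta_f / f_m (peak frequency deviation over message frequency). The NumPy sketch below is a generic illustration with arbitrarily chosen parameters, not an excerpt from the study's lab handouts or simulation software:

    import numpy as np

    fs = 100_000                    # sample rate (Hz), arbitrary
    t = np.arange(0, 0.01, 1 / fs)  # 10 ms of signal

    A_c, f_c = 1.0, 10_000          # carrier amplitude and frequency
    A_m, f_m = 0.5, 1_000           # message amplitude and frequency

    message = A_m * np.cos(2 * np.pi * f_m * t)

    # Standard AM: the message shifts the carrier's envelope.
    am_signal = (A_c + message) * np.cos(2 * np.pi * f_c * t)

    mu = A_m / A_c                  # modulation index; mu <= 1 avoids overmodulation
    print(f"AM modulation index = {mu:.2f}")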


Descriptive Data

Conceptual Tests. The conceptual test was administered twice to each student in the sample: during the 5th week of the semester, after the experimental treatment, and during the 8th week of the semester, in the mid-term exam week. All 12 post-test questions were embedded into the midterm exam to assess the students' retention level. Each student's test was graded by two independent instructors: first, the instructor of the course and, second, an instructor who was not familiar with the study and its methodology. Additionally, each instructor was unaware that another instructor would be grading the same test. Alpha reliability was computed for the scores reported by the two instructors for each student (Table 4) to examine the internal consistency in grading. The calculations revealed alpha reliabilities ranging from a low of .96 to a high of 1, with an overall reliability of .94, indicating an acceptable consistency of grading between the instructors.

Item:   Q1    Q2    Q3    Q4    Q5    Q6    Q7    Q8    Q9    Q10   Q11   Q12
Alpha:  1     0.97  0.97  0.96  0.99  0.97  0.97  0.99  0.96  1     1     0.97

Table 4. Inter-rater Reliability of Post-test Scores

Although the researcher could have selected one of the instructors' reported grades for analysis, it was decided to use the average of the two post-test scores as each student's final score. Table 5 outlines the mean scores and other descriptive statistics for both conceptual tests.


Post-test:
                 Total    Physical  Simulation
  N              80       40        40
  Minimum        11.00    11.00     26.00
  Maximum        36.50    15.50     36.50
  Mean           22.71    13.78     31.65
  Std. Deviation 9.22     1.14      2.68
  Skewness       0.10     -0.39     -0.03
  Kurtosis       -1.84    -0.52     -0.64

Follow-up Test:
                 Total    Physical  Simulation
  N              80       40        40
  Minimum        11.00    11.00     12.50
  Maximum        35.50    15.50     35.50
  Mean           19.42    13.35     25.50
  Std. Deviation 8.14     1.15      7.57
  Skewness       0.65     -0.19     -0.79
  Kurtosis       -1.39    -0.76     -0.99

Table 5. Descriptive Statistics for Both the Post-test and the Follow-up Test

For the conceptual post-test, skewness for the simulation group was more positive than for the physical group; however, this was reversed on the follow-up test. The following graph presents a visual comparison of the mean scores of the physical group and the simulation group on the conceptual test (Figure 4).

[Figure 4 (line graph). Comparison of Means on the Conceptual Test Over Time: scores (0 to 35) at post-test and follow-up for the physical and simulation groups.]
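The skewness and kurtosis figures in Table 5 are standard third- and fourth-moment statistics and can be reproduced with SciPy, as in the illustrative sketch below. The scores array is hypothetical, and SciPy's default estimators differ slightly from SPSS-style bias-corrected ones, so small discrepancies from the reported values would be expected:

    import numpy as np
    from scipy import stats

    # Hypothetical stand-in for one group's score column in Table 5.
    scores = np.array([29.5, 31.0, 33.5, 26.0, 36.5, 30.0, 32.5, 31.5])

    print("N             ", scores.size)
    print("Minimum       ", scores.min())
    print("Maximum       ", scores.max())
    print("Mean          ", scores.mean())
    print("Std. Deviation", scores.std(ddof=1))      # sample standard deviation
    print("Skewness      ", stats.skew(scores))
    print("Kurtosis      ", stats.kurtosis(scores))  # excess kurtosis (normal = 0)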


This preliminary comparison hints at an interesting trend with respect to the conceptual test. On the post-test, the mean score for the simulation group (M = 31.65) appears to be much higher than the mean score of the physical group (M = 13.78); in other words, the simulation group did much better immediately after the treatment. On the follow-up test, during the midterm exam, the simulation group still performed better (M = 25.50) than the physical group (M = 13.35), whose score did not change significantly. The mean scores on the post-test and the follow-up test for the two groups remain statistically different, in favor of the simulation group. Even though the conceptual test graph (Figure 4) exhibits change over time for the simulation group, it shows very minimal change for the physical group. The results suggest that the physical group initially had a lower score than the simulation group but its retention remained the same, whereas the simulation group still performed better than the physical group on the follow-up test but its scores slightly decreased over time. This is an interesting finding, and it is discussed later in this chapter and in Chapter 5.

Attitude Survey. A 9-item attitude survey was administered at the completion of the treatment to both groups (physical and simulation) to assess their attitudes toward the laboratory experience. First, factor analysis with Varimax rotation was used to identify underlying variables, or factors, that explain the pattern of correlations within this attitude survey instrument. Table 6 below outlines the results of the factor analysis. All items loaded on one factor: positive attitudes toward the laboratory experience. An alpha reliability of .89 was established for this factor.
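As an illustration of the extraction step only (the study's statistical software is not specified), scikit-learn's FactorAnalysis supports a Varimax rotation; the data below are random placeholders standing in for the 80 students' responses to the 9 items, not the actual survey data:

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)

    # Random placeholder responses: 80 students x 9 items on a 1-4 scale.
    responses = rng.integers(1, 5, size=(80, 9)).astype(float)

    # Extract three factors with Varimax rotation, mirroring the three factor
    # columns reported in Table 6.
    fa = FactorAnalysis(n_components=3, rotation="varimax")
    fa.fit(responses)

    loadings = fa.components_.T   # one row of loadings per survey item
    for i, row in enumerate(loadings, start=1):
        print(f"Item {i}: " + "  ".join(f"{x:+.2f}" for x in row))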


Question                                                              Factor 1  Factor 2  Factor 3
1. The laboratory experiments complement the lectures.                 .65      -.02      -.28
2. Conducting lab experiments increases your knowledge; you learn
   about things that you otherwise would not have learned from
   pure lecture.                                                       .51      -.43       .49
3. Conducting lab experiments made concepts easier to understand.      .71      -.23       .27
4. Through doing the lab experiments you get an idea of how
   things work.                                                        .71      -.02      -.21
5. Lab experiments made the subject more interesting.                  .61       .41       .25
6. Lab experiments made the subject less abstract.                     .66       .16       .52
7. The information provided was clear.                                 .65      -.21      -.21
8. Working with the program took up too much time.                     .74      -.03      -.04
9. Pre-lab instruction was helpful.                                    .49       .45      -.43

Table 6. Factor Analysis of the 9-item Attitude Survey Questionnaire

In addition, descriptive analysis was computed for the 9-item attitude survey questionnaire. The results are shown in Table 7.


Physical Group
Question                                                        Mean   SD     Skewness  Kurtosis
1. The laboratory experiments complement the lectures.          3.35   0.53    0.11     -0.90
2. Conducting lab experiments increases your knowledge; you
   learn about things that you otherwise would not have
   learned from pure lecture.                                   3.55   0.60   -0.96      0.01
3. Conducting lab experiments made concepts easier to
   understand.                                                  3.40   0.55   -0.08     -1.01
4. Through doing the lab experiments you get an idea of how
   things work.                                                 3.60   0.59   -1.20      0.52
5. Lab experiments made the subject more interesting.           2.73   0.82    0.26     -0.90
6. Lab experiments made the subject less abstract.              2.18   0.81    0.57      0.20
7. The information provided was clear.                          3.18   0.55    0.10      0.16
8. Working with the program took up too much time.              2.58   0.71   -0.39     -0.31
9. Pre-lab instruction was helpful.                             3.15   0.53    0.15      0.43
Combined                                                        3.06   0.25   -0.39     -0.52

Simulation Group
1. The laboratory experiments complement the lectures.          3.15   0.58    0.00      0.00
2. Conducting lab experiments increases your knowledge; you
   learn about things that you otherwise would not have
   learned from pure lecture.                                   3.33   0.57   -0.12     -0.59
3. Conducting lab experiments made concepts easier to
   understand.                                                  3.23   0.62   -0.18     -0.45
4. Through doing the lab experiments you get an idea of how
   things work.                                                 3.38   0.63   -0.48     -0.58
5. Lab experiments made the subject more interesting.           3.08   0.69   -0.10     -0.83
6. Lab experiments made the subject less abstract.              3.03   0.66   -0.03     -0.57
7. The information provided was clear.                          3.00   0.75    0.00     -1.18
8. Working with the program took up too much time.              3.45   0.50   -0.21     -2.06
9. Pre-lab instruction was helpful.                             3.25   0.71   -0.40     -0.88
Combined                                                        2.99   0.40    0.90     -0.71

Table 7. Descriptive Results for the 9-item Attitude Survey Questionnaire


Based on these descriptive results, it appears that the two groups were only slightly different in their overall attitudes toward the laboratory experience (simulation M = 2.99; physical M = 3.06). However, analyzing each item revealed that on question 5 the simulation group felt more strongly that the lab experiments made the subject interesting (simulation M = 3.08; physical M = 2.73), and on question 8 the simulation group reported that working on the lab experiments did not take too much time (simulation M = 1.55; physical M = 2.53). These results support the notion that the simulation program is more interesting, cuts down on the time it takes to complete the laboratory assignments and makes the subject matter less abstract.

In addition, a 13-item survey was administered to the simulation group at the completion of the simulation program. As with the previous attitude survey instrument, factor analysis was used to identify underlying variables, or factors, that explain the pattern of correlations within this survey instrument (Table 8). The analysis resulted in two factors: Factor 1 was identified as positive attitudes toward the simulation, and Factor 2 as other. As a result, only Factor 1 was utilized for additional analysis (particularly for testing H05). An alpha reliability of .91 was established for Factor 1.


Question                                                        Factor 1  Factor 2  Factor 3  Factor 4
1. The simulation motivates me to learn.                         .81       .16       .001     -.32
2. The simulation is interesting.                                .70       .43      -.13      -.27
3. The simulation is a better tool than a regular physical
   laboratory.                                                   .67       .12       .18      -.28
4. The simulation is enjoyable.                                  .84       .01      -.21      -.22
5. It takes less time to do the lab experiments using the
   simulation.                                                   .20      -.57       .46       .34
6. The simulation is effective for laboratory use.               .65       .20       .30       .40
7. The simulation makes learning faster.                         .75       .10      -.27      -.24
8. The simulation is as effective as physical laboratory
   experiments.                                                  .30       .69       .39       .09
9. The simulation makes understanding of the conceptual
   theories more clear.                                          .85      -.31      -.00       .02
10. The simulation would be an excellent laboratory tool.        .86      -.02      -.04       .34
11. Doing the experiments with the simulation is motivating.     .85      -.26       .19       .21
12. More simulation programs like this one are needed in our
    educational system.                                          .74      -.48      -.06      -.02
13. The use of simulation technologies is an effective method
    of conducting laboratory activities.                         .89      -.11      -.06       .05

Table 8. Factor Analysis for the 13-item Attitude Survey Questionnaire

In addition, Table 9 provides descriptive results for each item of the instrument used to assess the simulation group's attitudes toward the simulation program. Items 2, 5, 6 and 8 in the survey were worded negatively to control for subject bias.


Question                                                        Mean   SD     Skewness  Kurtosis
1. The simulation motivates me to learn.                        3.08   0.69   -0.59      0.95
2. The simulation is interesting.                               3.30   0.56   -0.04     -0.50
3. The simulation is a better tool than a regular physical
   laboratory.                                                  2.93   0.73   -0.72      1.10
4. The simulation is enjoyable.                                 2.95   0.71   -0.37      0.28
5. It takes less time to do the lab experiments using the
   simulation.                                                  3.08   0.83   -0.71      0.22
6. The simulation is effective for laboratory use.              3.23   0.58   -0.03     -0.23
7. The simulation makes learning faster.                        3.05   0.75   -0.47      0.09
8. The simulation is as effective as physical laboratory
   experiments.                                                 3.13   0.65   -0.72      2.02
9. The simulation makes understanding of the conceptual
   theories more clear.                                         2.88   0.72   -0.66      0.97
10. The simulation would be an excellent laboratory tool.       3.35   0.58   -0.20     -0.64
11. Doing the experiments with the simulation is motivating.    3.13   0.72   -0.62      0.64
12. More simulation programs like this one are needed in our
    educational system.                                         3.28   0.64   -0.93      2.71
13. The use of simulation technologies is an effective method
    of conducting laboratory activities.                        3.28   0.68   -0.40     -0.75

Table 9. Descriptive Results for the 13-item Attitude Survey Questionnaire

Statistical Results Related to the Null Hypotheses

The following results were found with respect to the null hypotheses stated in Chapter 3:

H01: There is no significant difference (at the p = 0.05 level) between the physical group's and the simulation group's attitudes toward the laboratory experience as measured by the attitude survey at the completion of the post-test.

Result: H01 rejected.


As shown in Table 10, the two groups differ significantly in attitudes toward the laboratory experience as measured by the attitude survey (F = 10.55, p = .002). The simulation group reported a more positive attitude (M = 3.20) than the physical group (M = 3.07). More specifically, on the individual attitude questions, the simulation group found the lab experience significantly more interesting (F = 4.27, p = .042), less abstract (F = 26.36, p = .000) and less time-consuming (F = 40.25, p = .000).

Question   Group       N    Mean   SD     F        p
Q1         Physical    40   3.35   0.53
           Simulation  40   3.15   0.58   2.579    .112
Q2         Physical    40   3.55   0.59
           Simulation  40   3.33   0.57   2.961    .089
Q3         Physical    40   3.40   0.54
           Simulation  40   3.23   0.62   1.798    .184
Q4         Physical    40   3.60   0.59
           Simulation  40   3.38   0.62   2.726    .103
Q5         Physical    40   2.73   0.81
           Simulation  40   3.08   0.69   4.270    .042**
Q6         Physical    40   2.17   0.81
           Simulation  40   3.03   0.66   26.365   .000**
Q7         Physical    40   3.18   0.54
           Simulation  40   3.00   0.75   1.415    .238
Q8         Physical    40   2.58   0.71
           Simulation  40   3.45   0.50   40.249   .000**
Q9         Physical    40   3.15   0.53
           Simulation  40   3.25   0.70   0.510    .477
Overall    Physical    40   3.07   0.65
           Simulation  40   3.20   0.64   10.55    .002**

Table 10. Means and Significance Results for the Attitude Survey Questions for Both Groups

H02: There is no significant difference (at the p = 0.05 level) on post-test scores between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.


Result: H02 rejected. The two groups differ significantly on post-test scores (t = -38.8, df = 78, p = .00). The simulation group (M = 31.65) performed significantly higher than the physical group (M = 13.78). The results support the notion that the simulation treatment appears to improve the conceptual understanding of the students (Figure 5).

Group       N    Mean      Std. Deviation   Std. Error Mean
Physical    40   13.7750   1.14326          .18077
Simulation  40   31.6500   2.67515          .42298

Levene's test for equality of variances: F = 22.96, p = .00. t-test for equality of means, equal variances assumed: t = -38.8, df = 78, p (2-tailed) = .00, mean difference = -17.87, SED = .459, 95% CI [-18.79, -16.95]; equal variances not assumed: t = -38.8, df = 52.7, p = .00, mean difference = -17.87, SED = .459, 95% CI [-18.79, -16.95].

Figure 5. t-test: Comparison of Means for Post-test Measures

Initially, it was expected that the simulation group would perform as well as the physical group or slightly better; to the author's surprise, the simulation group performed much better than the physical group. There was no surprise, however, in the scores obtained from the physical group, considering the levels of the students who participated in the study. The post-test scores for the physical group were consistent with the history of the institution where this research took place.


However, the simulation program seemed to have helped the simulation group considerably, since their scores improved significantly.

H03: There is no significant difference between the simulation and physical laboratory groups' long-term retention of the concepts as measured by mean scores on a follow-up instrument.

Result: H03 rejected. The two groups differ significantly on the follow-up test scores (t = -18.93, df = 78, p = .00). The simulation group (M = 27.81) performed significantly higher than the physical group (M = 13.17). The results for H02 reveal that the simulation group performed significantly higher on the post-test than the physical group, and the results for H03 indicate that the simulation group also performed significantly higher on the follow-up. However, it is interesting to note that on the follow-up test the scores for the simulation group dropped, whereas the scores for the physical group remained stable.

Group       N    Mean      Std. Deviation   Std. Error Mean
Physical    40   13.1750   1.36132          .21524
Simulation  40   27.8125   4.69682          .74263

Levene's test for equality of variances: F = 20.57, p = .000. t-test for equality of means, equal variances assumed: t = -18.93, df = 78, p (2-tailed) = .000, mean difference = -14.63, SED = .77320, 95% CI [-16.17, -13.09]; equal variances not assumed: t = -18.93, df = 45.5, p = .000, mean difference = -14.63, SED = .77320, 95% CI [-16.19, -13.08].

Figure 6. t-test: Comparison of Means for Follow-up Measures


Post hoc analysis using paired-samples t-tests examined the differences within groups. The results revealed no significant difference in the physical group's scores between the post-test and the follow-up test (t = 2.80, p = .008). The simulation group's scores on the post-test were, however, significantly higher than their follow-up scores (t = 4.85, p = .000). These results support the fact that the simulation group's follow-up scores were still significantly higher than those obtained by the physical group, as discussed above. These results may imply that there is some educationally practical difference in the post-test means. However, it is interesting to see which types of questions showed a difference in performance between the two groups (Table 11).

Question   Group       N    Mean   SD     Std. Error Mean
Q1         Physical    40   0.73   0.45   0.07
           Simulation  40   0.75   0.44   0.07
Q2         Physical    40   0.73   0.45   0.07
           Simulation  40   0.75   0.44   0.07
Q3         Physical    40   2.05   1.01   0.16
           Simulation  40   3.18   0.71   0.11
Q4         Physical    40   1.95   0.93   0.15
           Simulation  40   3.13   0.72   0.11
Q5         Physical    40   2.25   1.17   0.19
           Simulation  40   3.30   0.88   0.14
Q6         Physical    40   2.25   1.10   0.17
           Simulation  40   3.18   0.90   0.14
Q7         Physical    40   2.30   1.04   0.16
           Simulation  40   3.53   0.68   0.11
Q8         Physical    40   2.43   0.96   0.15
           Simulation  40   3.40   0.63   0.10
Q9         Physical    40   0.85   1.25   0.20
           Simulation  40   2.23   1.73   0.27
Q10        Physical    40   0.85   1.25   0.20
           Simulation  40   2.20   1.70   0.27
Q11        Physical    40   2.90   0.98   0.16
           Simulation  40   3.48   0.99   0.16
Q12        Physical    40   2.93   0.89   0.14
           Simulation  40   3.48   0.99   0.16

Table 11. Mean Comparison of the Simulation and Physical Groups on Each Post-test Question


Table 11 shows that on all questions except #1 and #2, the group mean of the simulation group is higher than that of the physical group. It is impossible to declare with certainty from this study why the simulation group answered those questions more successfully, but one can speculate. It is entirely possible that the simulation program gave the simulation group a more secure grasp of the concepts involved. It would appear that they better understood the aspects of modulation and demodulation and the graphing of the related waves, which figured in every question. They could also see the relationship between the carrier wave and the modulated wave more clearly. It is also interesting to note that both groups could visualize the waves: the simulation group could see the change in the waves according to each variable on their PC screens, and the physical group could see the waves on the screen of the oscilloscope. However, since the simulation program provides more detail on each displayed wave, it might have provided a better mental image of the possible waves under various conditions.

H04: There is no significant difference on laboratory completion time between students performing physical experiments on a traditional communication systems topic as compared to those performing the same experiments using a computerized simulation program.

Result: H04 rejected. As shown in Figure 7, the two groups differ significantly on laboratory completion time (t = 8.67, df = 78, p = .000). The simulation group (M = 71.68) used significantly less laboratory time than the physical group (M = 90.28).


Group       N    Mean    Std. Deviation   Std. Error Mean
Physical    40   90.28   6.164            .975
Simulation  40   71.68   12.073           1.909

Levene's test for equality of variances: F = 12.63, p = .001. t-test for equality of means, equal variances assumed: t = 8.67, df = 78, p (2-tailed) = .000, mean difference = 18.60, SED = 2.143, 95% CI [14.3, 22.86]; equal variances not assumed: t = 8.67, df = 58.04, p = .000, mean difference = 18.60, SED = 2.143, 95% CI [14.3, 22.89].

Figure 7. t-test: Comparison of Lab Completion Time

H05: There is no significant correlation between the simulation group's attitude toward the use of the simulation and the students' post-test performance scores.

Result: H05 rejected. There is a significant positive relationship between the simulation group's attitudes toward the use of the simulation and their post-test performance (r = .69, p = .00), significant at the 0.01 level.

Results of Qualitative Data

Written Questionnaire

All simulation group students were given a questionnaire after the completion of the post-test to share their reactions to the use of the simulation program for conducting the laboratory experiments. The responses from the simulation group are found in Appendix N.


The responses showed that, in general, the simulation group liked using the simulation program and were willing to use it again. A few of the students noted the ease of use of the simulation program as compared to the physical lab equipment, and a few recognized that working with the simulation is less time-consuming and less frustrating.

Specifically, Question 1 asked whether the simulation program was effective as a laboratory tool. Overall, the students' reaction to this question was positive. The themes that emerged from Question 1 were ease of use, speed, visual detail, enjoyment and better understanding of concepts.

Question 2 asked whether simulation programs could be a substitute for a physical laboratory. Some students answered yes but indicated that students should be exposed to both. One student mentioned that simulation might not be suitable for all engineering subjects. A few students felt that nothing can substitute for real hands-on experience.

Question 3 asked whether a simulation program would be beneficial to online students. All students agreed that simulation could benefit those who cannot attend a physical laboratory if no other alternative is available. One student pointed out that one has to have good observational skills to work solely with simulation. In addition, on Question 4, when asked whether the simulation program should be incorporated into communication systems laboratories, all students responded positively.

Question 5 asked whether a simulation program should be used for topics in communication systems in place of the physical laboratory; the majority of students responded positively, but a few mentioned that they need more hands-on experiments. Their comments included combining the two methods for laboratory instruction. Also, on Question 6, regarding their choice of laboratory in the future, the majority of the students chose the simulated laboratory.

A few mentioned again that they needed both, or only hands-on work. In addition, one student mentioned that the physical laboratory allows more peer interaction. On question 7, when asked if the students experienced any problems with using the simulation software, all agreed that it was easy to use, with no problems.

Group Interview

Two group interviews were conducted, one with three students randomly selected from the simulation group and another with three students randomly selected from the physical group. The interviews were used to probe more deeply into the students' experiences, opinions, and beliefs about physical versus computer-simulated experiments; to examine how they perceived laboratory experiments in general and simulation experiments in particular, e.g., to explore their opinions on whether the computer simulation was more or less useful than the physical experiments; and to determine the aspects of the computer simulation that made it more (or less) difficult than the physical experiments.

All individual group interview transcripts were coded for the purpose of organizing the data. To make sense of the data, Miles and Huberman (1994) recommend organizing those initial codes into themes, also known as categories, the goal of which is to look for patterns in the data. Therefore, the following key will be used throughout the remainder of Chapter 4: S = Simulation Group, P = Physical Group, M = Male, F = Female. Codes 1-3 refer to participant numbers in each group. For example, SM1 refers to Simulation Group Male #1 and PF2 refers to Physical Group Female #2. The results are as follows.

Interview Themes

Concept Clarification

All six students in both groups were asked if the laboratory experiments affected the clarification of concepts. Students in both groups noted that the concept of signal transfer was either clarified or reinforced after performing the laboratory experiments. Students in both groups agreed that the laboratory experiments created an opportunity to go beyond the curriculum and gain a deeper understanding of the subject itself. All six students who were interviewed spoke about the fact that the laboratory experience clarified concepts that had seemed difficult prior to the experiments. Three quotes from the physical group students follow.

Yeah, I'm not a very good engineering student, engineering concepts are very abstract to me and I am a very visual learner and when I get to do the stuff, that makes it more real to me, it makes it more understandable. [PF1]

You get to see it from a different angle here in the lab and you can also apply the exact same concepts and see how they actually work. You know, that kind of "oh, I understand how this formula actually works." [PF3]

When you are actually doing it in the lab there are so many things going on that become a lot clearer when you actually look at it compared to somebody else trying to describe it to you on the board. [PF2]

On the other hand, one interesting comment from one student in the simulation group [SF1] was that the simulation avoided the tension of working with small objects such as capacitors and allowed her to focus on the concept itself. The same student mentioned that an advantage of the simulation is that it is always available, so she could use it in her free time again and again to gain extra knowledge on concepts that were not covered in the lecture or the lab experiments.

All the simulation group students indicated that the simulation was especially helpful in understanding the frequency domain concepts.

The biggest thing I think would be helping me see the transition to the frequency domain. I really wasn't sure in class but then when I used the simulation I thought I understood better how a signal looked once it was in the frequency domain. [SM1]

The same student [SM1] also mentioned that he was often able to recall a specific concept in the course from his experience working in the lab on that particular concept. The students in both groups were asked if the lab had any effect on their perceived retention. The following quote is from one physical student's response.

Oh, I would say definitely. Maybe a few years from now I'm not going to remember a formula or anything like that, but what I will remember though is the output of the analyzer. And maybe how it applied to our frequency and the frequency domain. [PM1]

Also, in the simulation group, all three students agreed that if they could see the process, they would remember it better. One student noted:

This was my first time working with a simulation program. But I liked it. I think it had some details that you won't gain by working in a lab. Because of those details, I think I remember the concepts better. And I will always have that visual image of the waves in my head. [SM1]

Two simulation group students explained how the simulation helped them gain a deeper understanding of the lecture concepts.

With the audio filter you could see the frequency being cut off as it goes along and that was kind of interesting to see because you actually see and hear, you know, how much frequency you are cutting off when you are sitting there messing with parameters. [SF1]

I had a better idea of the different properties that each modulation scheme would have by adjusting the parameters and seeing the effect on the screen. [SM2]
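The kind of display these students describe, the same signal viewed in both the time domain and the frequency domain, can be pictured with a short sketch. The snippet below is a hypothetical illustration in numpy of such a display for a single-tone AM signal; it is not the simulation program used in the study, and all parameter values are arbitrary.

```python
# Hypothetical illustration (not the study's simulation program): a
# single-tone AM signal viewed in the time domain and, via the FFT,
# in the frequency domain.
import numpy as np
import matplotlib.pyplot as plt

fs = 10_000                     # sampling frequency, Hz
t = np.arange(0, 0.1, 1 / fs)   # 100 ms of signal
fc, fm, m = 1_000, 50, 0.5      # carrier freq, message freq, modulation index

message = np.cos(2 * np.pi * fm * t)
am = (1 + m * message) * np.cos(2 * np.pi * fc * t)

freqs = np.fft.rfftfreq(len(t), 1 / fs)
spectrum = np.abs(np.fft.rfft(am)) / len(t)

fig, (ax_t, ax_f) = plt.subplots(2, 1)
ax_t.plot(t[:400], am[:400])    # envelope tracks the 50 Hz message
ax_t.set_xlabel("time (s)")
ax_f.plot(freqs, spectrum)      # carrier at 1 kHz, sidebands at 950 and 1050 Hz
ax_f.set_xlabel("frequency (Hz)")
plt.show()
```

Changing fm, fc, or m and re-running redraws both views, which is essentially the parameter-and-effect loop the interviewees describe.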

Visual Learning

All the students in the simulation group at some point during the interview mentioned the importance of having a visual representation of concepts. Despite the fact that the physical group could see the change of the waves on the oscilloscope, interestingly, none of its interviewees mentioned anything about the visual aspect of the lab. All simulation interviewees at some point claimed that they learn better visually and that the simulation was very helpful in converting abstract mathematical concepts into concrete illustrations that they could understand better. One student offered a different perspective on the advantages of having a visual experience in an engineering course that is highly mathematical.

The instructor talked about those things in class and when I actually saw it happen on the computer, then it all made sense. [SF1]

The student was then asked what she meant by "made sense."

It made perfect sense. I understood it in a deeper and more meaningful way. I could explain it to somebody else without skipping a beat.

Another student [SM2] mentioned that he realized it was something real, in contrast to his prior belief that modulation and demodulation were only abstract concepts. All three students agreed that the simulation helped in elaborating on the concepts because of the visual representation of the mathematical concepts.

Memorability

Students in both groups were asked if the lab affected the memorability of the content. Three students in the simulation group told us that there was a positive influence on their memorability, while one student in the physical lab group said that there was no impact. The following are quotes about the impact of the lab on memorability:

I mean when you are speaking about long term memory obviously repetition is a good thing. Even not just because of the lab. When you are doing experiments and actually try it out using real life instruments I think it really puts it in your memory a little more. It is really a hard question to answer but I think it has had an effect. I think it really helped it stick there. [SM1]

Generally I have a hard time remembering things after a semester or two but I think labs have always helped me remember a little better. [PF1]

Problem-solving Strategies

All students in both groups were asked if they used any problem-solving strategies when answering the questions on the exam. Surprisingly, none of them did. However, a student in the simulation group [SM1] had an interesting comment. He said that, while he was answering the questions during the exam, he used mental pictures of what he had seen in the lab when he did the assignments. Hence, it was easy for him to draw on those pictures and answer the exam questions. But he did not realize that this was really a strategy. One student in the physical group [PM2] noted that the problem of not using any strategy stems from the weakness of his educational background, which he felt placed him at a disadvantage.

Time Spent in the Laboratory

When the simulation students were asked to comment on the time spent in the lab, based on their past experience with physical laboratories, all three students pointed out that they liked the fact that they spent less time in the lab than they would have if they had done those experiments in a physical lab. They also mentioned at some point that it was less frustrating to work with the simulation.

Simulation in Place of Laboratory

When the simulation group was asked to comment on their thoughts on using simulation programs in place of the physical laboratory, all agreed that nothing can replace a real hands-on experience. However, the group also pointed out that simulation is a feasible alternative in situations where physical laboratory equipment is not available. One student also mentioned that simulation might not work for every lab experience in the engineering field.

Simulation Lab Experience

At the end of the interview, when the simulation group was given a chance to express their thoughts about the simulation lab, all three were happy to have a facility like the simulation program in their laboratory curriculum. All three were also happy that they had had a chance to have the simulated lab experience. One student was curious to know if more such simulated labs would be integrated into the curriculum.

Exit Questions

The post-test contained a cover sheet explaining to the students the problem-solving strategies that they could use to help them solve the problems on the test. After the completion of the exam, on the last page of the test, the students were asked a few open-ended questions regarding the strategies that they used to come up with their solutions. A summary of the comments showed that none of the students used any of the recommended problem-solving strategies. Only a total of 18 students answered one question, which asked if the laboratory experience improved their ability to answer the questions on the exam.

Three students reported that the lab experience did not help them much, and the remaining respondents agreed that the lab experience helped them with answering the exam questions. For details of the comments, refer to Appendix N.

Observation

During the experiment, the researcher had the opportunity to briefly observe both the simulation and physical lab activities. As mentioned earlier in the chapter, LinkSys hardware in the PC lab allowed the researcher to observe the students' monitors one at a time. An interesting observation of the simulation group was that students had different methods and styles of working with the simulation program. For example, before the treatment, the author had believed that a key strategy was being systematic: varying one variable at a time. However, other styles can be productive, such as varying several variables to rapidly look for unforeseen special cases to investigate first, or doubling one variable while halving another. Another observation was that one student ran the simulation several times before even starting the assigned experiments; when asked, he indicated that the first time through he looked at values, then at variables, and tried to remember the underlying formulae and understand the graphs.

The students were reminded again and again to use the simulation software only to complete the two laboratory experiments. They were reminded that they should not use the simulation to perform any additional experiments by entering various parameters for the variables. However, while monitoring the screens using LinkSys, a few students on a few occasions had to be reminded again to complete only the assigned experiments and not to work with the simulation beyond the requirements of the labs.

A third observation was that, despite the fact that the students were reminded that the experiments must be conducted individually, the majority of the students in the physical group started talking to each other and sharing information, whereas the students in the simulation group worked independently and did not talk or exchange any information. Every effort was made to remind the students in the physical group to work individually.

Based on the observation, an obvious concern with the simulation lab was that, despite the fact that it allowed the students to work individually and concentrate more on the interface, it offered little student-student interaction, which is sometimes needed in a laboratory setting. This might have had an impact on the results of this study, and it raises an important question to answer prior to making any decision on the laboratory delivery mode: how important is student collaboration for a specific laboratory experiment, and how can such collaboration be accommodated in an online environment? At this point, however, this is beyond the scope of this study and should be investigated in future studies.

Summary of the Quantitative Results

The analysis of the data does show some positive effects of using a computer-simulated laboratory to learn the complex concepts of modulation and demodulation. The results may support the notion that simulated laboratories could replace physical laboratories for some subject matters.

In addition, the results indicate that the simulation group reported a more positive attitude toward the laboratory experience than the physical group. In particular, on specific items such as time spent in the lab and student enjoyment, the mean of the simulation group was higher. It was also found that there was a positive correlation between the simulation group's attitude and their post-test scores.

Furthermore, the results showed that there was a significant difference between the simulation group and the physical group on their post-test scores and follow-up scores. However, no significant difference was found in the physical group's scores between the post-test and the follow-up test, whereas the simulation group's post-test scores were significantly higher than their follow-up scores. These results suggest that the physical group retained knowledge between the two tests better than the simulation group, although the simulation group's follow-up scores were still significantly higher than those obtained by the physical group, as discussed above. There was also a significant difference between the two groups on their lab completion time, in favor of the simulation group.

Thus, it appears one can conclude that the simulation program could be used, for some engineering subject matters, in place of a physical laboratory, and that it might help with conceptual understanding of the material. In addition, the use of simulation will reduce the amount of time students spend on the laboratory experiments. But the most dramatic result emerging from this study is that the simulation approach seems to have an initial effect on the understanding of the concepts but no effect on retaining the concepts. This is a very important result, and its implications will be discussed more fully in Chapter 5.

Summary of Qualitative Results

Following is a summary of the results of the interviews and personal observation.

- For students in both groups, the lab experiments were helpful in reinforcing their knowledge and understanding of the modulation and demodulation concepts.
- The simulated lab eliminated the frustration of a physical lab that focuses on making the equipment work, which does not contribute to learning.
- Many students had never used a simulation before and felt that the simulated lab was helpful and was really a replica of the oscilloscope screen.
- Some students expressed concerns in terms of loss of hands-on skills.
- Many students appreciated the fact that the lab experiments did not involve unnecessary calculations and repetitive procedures that did not actually contribute to their learning.
- Simulation students expressed the benefit of being able to see a mathematical equation in the form of a graph, or the theoretical concepts being presented visually.
- All students expressed that they understood the concepts of modulation and demodulation better after performing the laboratory experiments. A frequent comment was that the lab "ties everything together."
- All students mentioned that the simulation lab increased the interest level in the course.

- The lab experiments gave the students a chance to find answers to the what-if questions that were discussed in class.
- There were no expressions of boredom or disinterest in any part of the interviews. The students found the simulation to be motivating, easy to use, and less time-consuming.
- The students stated that they liked being able to translate the theoretical concepts into real examples on the laboratory equipment. In fact, a very few students manifested meta-cognition when they stated that the laboratory helped them to see what was described by the mathematics.
- A concern was that direct hands-on student interaction with the experimental equipment is of paramount importance for the educational effectiveness of the experimental experience.
- Another concern that was voiced repeatedly relates to the perceived difficulties in enforcing the independence of remotely performed student work.
- None of the students consciously used problem-solving strategies; either they did not use them, they used them without being aware of it, or they attributed not using them to a poor educational foundation.

Chapter 5

Discussion

Review of the Study

This experimental study compared the performance of two groups of students, one using the simulated laboratory and one using the physical laboratory to learn about modulation and demodulation as a topic in communication systems. These students were compared on the basis of their attitude toward the use of the laboratory experiments, their laboratory completion time, and their conceptual achievement scores with respect to modulation and demodulation concepts. The achievement scores were measured over a period of time, from post-treatment to three weeks later.

The purpose of this study was to examine an alternative to the use of physical laboratory activities in a communication systems laboratory. Specifically, this study examined whether computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation, and demodulation. The following research questions were examined:

Question 1. In terms of student conceptual learning, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 2. How does students' attitude toward the use of the simulation affect their post-test scores?

Question 3. How does the simulation group's attitude toward the laboratory experience differ from that of the physical group?

Question 4. In terms of completion time of the assigned laboratory experiments, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 5. In terms of student knowledge retention, how do simulation-based laboratory experiences compare to physical laboratory experiences?

Question 6. What are the perceptions of both groups on the use of laboratory experiments in general for learning the concepts?

Question 7. What is the students' perception of the use of simulation in place of a physical laboratory?

A mixed study of quantitative and qualitative research methods was applied to seek answers to these questions. An experimental research design was used to examine Questions 1-5, while a qualitative case study design was carried out to explore Questions 6-7. In the preceding chapter, the results and the statistical support for them were presented in detail. These results are briefly summarized here.

- Immediately after the treatment, there was a significant difference between the simulation group and the physical group in post-test scores, in favor of the simulation group.

- Three weeks later, there was a significant difference between the simulation group and the physical group in follow-up test scores, in favor of the simulation group.
- There was a significant difference between the simulation group and the physical group in lab completion time, in favor of the simulation group.
- There was a significant difference between the simulation group and the physical group in their attitude toward the laboratory experience, in favor of the simulation group.
- There was a positive correlation between the simulation group's attitude toward the simulation program and their post-test scores.

Discussion of Results

As shown in Chapter 4, there was a significant difference in the conceptual test scores between the two groups, in favor of the simulation group. The findings are inconsistent with those of Moslehpour (1993) and Hall (2000), who reported no significant differences in student achievement between students who simulate a laboratory exercise and those who perform the same laboratory exercise in a traditional hardware laboratory. The findings are also inconsistent with those of Choi and Gennaro (1987), which showed a significant difference (in favor of the physical group) between the two groups in the learning of volume displacement concepts. Based on those results, the researchers concluded that computer-simulated experiences were not as effective as hands-on experiences, which does not agree with the results obtained from this study.

One might ask what makes this study different from the previous ones, which found no significant difference between the two groups and/or a significant difference in favor of the physical group.

Based on observations made from past literature, the author believes that some missing links are evident in previous studies reported in the literature. The author has identified a few factors that may have contributed to the results of this study, namely simulation design and quality, experimental design, and type of learning.

Simulation Quality and Design: One contributing factor to the result of this study could be the alignment between the course objectives, the assessment procedures, the lectures, and the selection of a simulation program. Based on the results of this study, it is not unreasonable to suspect that the design of the computer simulation selected for this study met the specific learning objectives of the laboratory experiments. In fact, this was a very important factor at the initial stages of this study. For instance, simplicity and ease of use of the simulation program were among the factors pointed out during the interviews. Another contributing factor could be the quality of the simulation software in terms of the "realism" of the simulation model. Relevance of the simulation to the topic could also be an important factor. It can be argued that simulations should be used as a tool to advance a clear set of learning objectives, rather than as a game or classroom activity that is fun but has little relevance to the larger curriculum.

Types of Cognitive Learning: Previous studies have attempted to assess cognitive learning at the lower levels of Bloom's taxonomy, whereas this study investigated the effects of the simulated laboratory on learning at the higher levels (analysis, synthesis, and evaluation). Thus, it could be hypothesized that the use of simulation programs for laboratory purposes might prove more effective at higher levels of cognitive learning.

Experimental Design: Fenton, Pfleeger, and Glass (1994) pointed out that many empirical studies in engineering have poor statistical design (as cited by Perry et al., 2000). In addition, the author believes that controlled comparisons of randomly allocated groups of students, taught by the same instructor, represent the ideal research design, which previous studies lack. Thus, the experimental design employed in this study could be a contributing factor to the higher learning of the simulation group.

Considering the above factors and based on the results of this study, it can be concluded that a simulation program can be as effective as, or better than, a physical laboratory in certain areas of engineering subjects. And it is not unreasonable to claim that a simulated laboratory could be a feasible tool for some online engineering courses.

In addition, an interesting finding of this study was that the simulation group's conceptual test scores decreased noticeably from post-test to follow-up, whereas the physical group's scores dropped very little. These results may suggest to some that the simulation group demonstrated inferior knowledge retention over time. However, it is important to note that the physical group's initial post-test scores were very low; indeed, the mean group performance was a failing score. Under this condition, one must question how much knowledge was gained to begin with. In layman's terms, the simulation group had much more to lose than the physical group. Therefore, it is unreasonable to declare that the physical lab students had higher knowledge retention than the simulation group.

It is very difficult to declare with certainty from this study why the simulation group did much better on both the post-test and the follow-up test, but one can speculate. It is entirely possible that the simulation program gave the simulation group a more secure grasp of the concepts involved. It would appear that they better understood the aspects of modulation and demodulation and of graphing the related waves, which every question drew on. They could also see the relationship between the carrier wave and the modulated wave more clearly. It is also important to note that both the simulation and the physical group could visualize the waves: the simulation group could see the change in the waves according to each variable on their PC screen, and the physical group could see the waves on the screen of the oscilloscope. However, since the simulation program provided more detail on each displayed wave, it might have provided a better mental image of possible waves under various conditions and, as a result, a better understanding of the concepts in general.

In addition, it was mentioned a few times during the interviews that the simulation program seemed to eliminate the distractions caused by manipulating the equipment and tiny devices such as capacitors and resistors, and allowed the students to concentrate on the concept. In other words, the physical students had to deal with additional content (manipulating the apparatus) that was not tested. Such factors might have contributed to an increase in cognitive load, which likely interfered with the physical group's learning. As a result, it is not unreasonable to assume that conceptual simulation programs could be a feasible substitute for hands-on exercises when the purpose of the experiments is to understand the concepts and not to manipulate the equipment, since they help reduce unnecessary cognitive load.

For further investigation, the researcher compared the post-test score and the follow-up score for each student in both the simulation and the physical group. For each student in the physical group, the scores changed by only a few points, but in the simulation group the scores of five students dropped by more than ten points. When the scores for those students were eliminated from both the post-test and the follow-up, the result showed higher knowledge retention over time in favor of the simulation group. Therefore, one could suspect that between the post-test and the follow-up test some external variables might have impacted the scores of those five students. But even including those five students, it appears that the simulation program had some effect on the long-term retention of the material.

It could be argued that the novelty of the simulation experience might have contributed to the high scores in the simulation group. But it might also have influenced the reduction in the simulation group's long-term recall, although their scores were still significantly higher than those obtained by the physical group. Knowledge retention is a complex phenomenon and is impacted by many factors, one of which could be previous experience. It is not unreasonable to assume that previous physical laboratory experiences might have contributed to the physical group's knowledge retention, since they had worked with actual equipment. If this study were repeated with students who have had previous experience with simulations, the results might show an increase in the retention rate for the simulation group.

Some interesting trends are also noted by looking at the types of questions the simulation group and the physical group missed on the post-treatment test. Most students in both groups correctly answered questions 1 and 2 (see Appendix F).

These questions asked the students about the physical differences between AM and FM modulation, by recognizing them in question 1 and explaining them in question 2. Clearly, both groups understood these initial concepts. But, as discussed in Chapter 4, questions that extended these ideas to more general ones showed more differences.

Question one was a multiple-choice question that asked the students to analyze a waveform. In the physical group, 29 out of 40 students answered this question correctly, compared with 30 out of 40 students in the simulation group; the two groups therefore performed comparably. Question two asked students to explain the physical differences between AM and FM modulation. To answer this question, the students should have learned to evaluate the physical differences based on the appropriateness of each technique for a given situation. Both groups performed comparably on this question as well.

Questions three, four, and five required the students to integrate the modulation techniques with the physical characteristics of the signal wave, report the changes in the modulation, and understand the relationship between carrier and sampling frequency. On these questions, the simulation group performed better than the physical group. On questions six, seven, and nine, the students were asked to sketch diagrams. The purpose of these questions was to assess students' ability to recognize the appropriate changes in the signal waves and their effects on the modulation process. The simulation group performed better on these questions as well.

Questions eight, ten, and eleven were multiple-choice questions. On question eight, 33 out of 40 students in the simulation group answered correctly, compared with 21 out of 40 in the physical group. On question ten, only 13 students in the physical group answered correctly, as opposed to 17 students in the simulation group. And again, on question eleven, only 15 students in the physical group provided the correct answer, as opposed to 21 in the simulation group.
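For concreteness, the physical difference probed by questions 1 and 2 can be written out for the single-tone case. The snippet below is an illustration of that distinction only; it is not an item from the test (Appendix F) or from the lab materials, and the parameter values are arbitrary.

```python
# Illustration only (not from the test or lab materials): single-tone
# AM versus FM. AM varies the carrier's amplitude envelope; FM varies
# its instantaneous frequency while the envelope stays constant.
import numpy as np

fs = 50_000                      # sampling rate, Hz
t = np.arange(0, 0.02, 1 / fs)   # 20 ms window
fc, fm = 2_000, 100              # carrier and message frequencies, Hz
message = np.cos(2 * np.pi * fm * t)

m = 0.7                          # AM modulation index
am = (1 + m * message) * np.cos(2 * np.pi * fc * t)

beta = 5.0                       # FM modulation index; peak deviation = beta * fm
fm_sig = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))
# Instantaneous frequency of fm_sig is fc + beta * fm * cos(2*pi*fm*t):
# the zero crossings bunch and spread while the amplitude stays at 1.
```

Plotted side by side, am shows an envelope that tracks the message, while fm_sig keeps a constant envelope with varying zero-crossing spacing; this is the contrast the first two test questions probe.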

On all these questions, the simulation group performed better than the physical group. The educational objectives of the modulation and demodulation topic were designed to impart a higher order of skills rather than factual information. It is conceivable that the simulation program provided a learning mode that produced higher-order cognitive learning at the analysis, synthesis, and evaluation levels.

The results of the study also revealed that it took the simulation group less time than the physical group to complete the laboratory assignments. This concurs with the findings of Orlansky and String (1979) that simulation produced equal or better achievement in about 30% less time. The importance of this result lies in the fact that, in an engineering physical laboratory setting, conducting laboratory experiments can be costly, time-consuming, and difficult to schedule. The findings on this question therefore support the hypothesis that conducting laboratory experiments with a simulation program is less time-consuming than in a traditional physical laboratory.

Additionally, the results on the students' attitude toward the laboratory experience revealed a significant difference in favor of the simulation group. Specifically, when looking at the subscores of the attitude survey, the simulation group had a higher mean on specific items such as the ease of use of the simulation, time spent on the experiments, and ease of conducting the experiments. This concurs with Dobson and Hill's (1995) findings that a higher percentage of students rated the simulation package easier to use than the conventional lab exercise, that their simulation group strongly agreed that lab experiments conducted using the simulation package took much less time to complete, and that their simulation group appeared to find the lab assignments slightly easier than did those using the conventional equipment.

Finally, a positive correlation was found between the simulation group's attitude toward the use of the simulation and their scores on the post-test. As discussed in Chapter 4, when looking at the subscores of the attitude survey, the attitudes are positive. Specifically, the ratings were positive on the items about finding the simulation motivating, finding the simulation interesting, the feeling of understanding, and the suitability of the simulation program in place of the physical laboratory.

Discussion of Qualitative Results

The exit questions at the end of the post-test asked the students which problem-solving skills they used, if any, to solve the problems, whether they used any other strategies to solve the problems, and whether they found the laboratory instructions useful for solving the problems on the exam. Surprisingly, none of the students reported using the problem-solving strategies or any new strategies; only a few answered no to both questions. When asked during the interviews, a few students mentioned that they never think about their strategy and attributed that to the weakness of their educational background, which they felt placed them at a disadvantage.

The results of the qualitative study are consistent with the findings of Dobson and Hill (1995) indicating that (a) a higher percentage of the students rated the simulation package easier to use than the conventional lab exercise, (b) the simulation group strongly agreed that lab experiments conducted using the simulation package took much less time to complete, and (c) the simulation group appeared to find the lab assignments slightly easier than did those using the conventional equipment. But more interestingly, the findings also agree with Dobson and Hill's (1995) claims that the students surveyed voiced concerns about the loss of skill development if the conventional physical laboratory component were totally eliminated.

A majority of the students agreed that the simulation program would be a feasible alternative for online students, but that in a traditional classroom the students would benefit from a combination of simulation and physical laboratory. This, however, would be a subject for further study.

Summary

The quantitative part of this research supports the conclusion that whether the laboratory exercises are conducted in the traditional hardware laboratory or in the computer laboratory using simulation software, students will learn their lessons. But such a conclusion can only be made for laboratory experiments that are not hands-on intensive. In those cases, students who cannot attend laboratory classes on campus could take the same courses using computer simulation without fear that their experience or achievement would be somehow less than it would have been attending classes on campus.

At the same time, the qualitative research has uncovered several issues not explored by the quantitative research. Incorporating the recommendations acquired from the qualitative research into the laboratory pedagogy, especially elements of hardware experience to avoid a loss of hands-on skills, should help improve students' experience regardless of the environment in which the laboratory is conducted.

Finally, the challenge with the engineering curriculum is not whether laboratory experiments should be used, but rather (a) how to avoid eliminating laboratory work due to budget constraints, (b) how to maximize laboratory efficiency in terms of cost and time while increasing students' learning, and (c) how to maximize the accessibility of laboratories to on- as well as off-campus students.

Clearly, one alternative is the simulation program. While simulation programs may not be a feasible alternative for some topics in engineering, they will be suitable for others.

Limitations of the Study

Several limitations of the research method must be noted. One limitation involves the administration of the experimental treatment itself. Due to some legal issues at the university where the data were collected and the lack of a central video camera, videotaping of the students was not possible.

The study has several limitations to generalization. As reported in Chapter 1, one limitation was the nature of the demographic profile of the population from which the sample was drawn. Therefore, caution should be taken not to overgeneralize the results to all electrical engineering undergraduate students. In addition, caution must be exercised when generalizing the results to other laboratories. It is important to realize that the purpose of the laboratory experiments in this study was to increase the students' conceptual understanding, not their hands-on skills. Therefore, the results of this study can only be generalized to certain labs on certain topics.

Other limitations involve the administration of the experimental treatment itself. Unfortunately, despite the initial instructions to conduct the laboratory experiments individually, there was no way to make absolutely certain that students did not talk and share information during the laboratory session. As a result, the actual experience within each group may have impacted their scores at each level.

Despite the fact that the students had to be reminded and prodded again and again to work individually, the physical group seemed to have enjoyed the verbal interaction during the laboratory, which may have impacted their recall abilities. As a result, the interaction among the physical group could be a potential source of variance. The simulation group may have utilized the simulation exercise to input additional variables, which might have led to their scoring higher than the physical group. This, however, may be the reason their recall level dropped, since they did not experience verbal interactions with their classmates. In addition, the novelty of the simulation experience may have contributed to the high scores in the simulation group. It may also have contributed to the reduction in the simulation group's recall scores, although their scores were still significantly higher than those obtained by the physical group.

Implications for Practice

Curricula in engineering technology and engineering education are frequently billed as hands-on programs. Often, persons who like to work with their hands are attracted to an engineering technology degree program. Even so, the job market, especially the job market for engineering technology graduates, is requiring more computer-based design and problem-solving skills than ever before. At the same time, higher education is moving into distance education and, as a result, Internet delivery of credit courses has wide appeal to potential students who, for a variety of reasons, cannot attend on-campus classes. In addition, in engineering education, educators may not be able to provide students the opportunity to engage in hands-on activities due to cost and/or feasibility.

While some courses lend themselves easily to the Internet environment, engineering laboratory courses have always used expensive laboratory test and measurement equipment for conducting the experiments. Yet with computer simulation, students can duplicate most, if not all, of the laboratory experience on their personal computers.

Signal transmission, for instance, is a valuable concept in the communication systems area. Conducting laboratory experiments to enhance conceptual understanding of this topic is also very valuable, and elimination of such experiments due to cost or unavailability of equipment is not a feasible option. Therefore, using a simulation program to provide the means of conducting such experiments outside of the physical laboratory is an appealing prospect. It is undeniable that in the area of engineering education, computer simulation has been used frequently in a tutorial sense, but to use simulation as a replacement for the laboratory opens up limitless possibilities for the engineering curriculum.

From an analysis of the group interview transcripts, two immediate observations were made. First, all students interviewed favored the use of the laboratory experiments as part of the engineering curriculum. Second, all students agreed that simulation would be a feasible alternative to a physical laboratory for distance learners, but they felt that nothing can ever replace real-life experience. In addition, the students reported during the interviews and through the attitude survey that simulation is more effective in terms of time and ease of use.

The quantitative and qualitative outcomes of this study can be used specifically to refine the laboratory experience of engineering undergraduate students in programs that are preparing for the accreditation process of the Accreditation Board for Engineering and Technology (ABET 2000).

At the university of study, the results of this research will be incorporated into the assessment and feedback process required by ABET 2000.

In conclusion, on a practical level, computer simulation can provide engineering faculty with the flexibility to meet the ever-demanding needs of laboratory-based classrooms. For instance, under budget constraints, faculty may want to consider using computer simulation in lieu of expensive hands-on activities that require large amounts of consumable material and costly equipment. Faculty may also want to consider the time saved by using a simulation program. In addition, if laboratory space is at a premium, a simulation laboratory may help eliminate the need for physical laboratory space. Computer simulations can also be beneficial for allowing those who are absent to make up missed laboratory activities.

Though the use of a simulated laboratory in place of a physical laboratory may seem a feasible alternative, it is the personal opinion of this researcher that, whenever possible, real-life experiences should always supersede simulated experiences. Yet computer simulation holds promise in allowing students to engage in a variety of activities that otherwise may be unattainable due to cost, space, time, and place. Therefore, computer simulation should be considered as an alternative to hands-on activities in meeting educational goals if no better alternatives are available.

Recommendations for Further Study

It does appear that the potential of using simulated laboratories to replace some, but not all, physical laboratories merits continued research and attention. Several areas are worthy of further research. First, the answers to the research questions revealed that simulation might be a feasible alternative to a physical laboratory in some subject areas in engineering. However, it would be interesting to repeat this study with a larger pool of students. In addition, since this study was limited to one institution and one specific laboratory topic, in-depth quantitative and qualitative studies of other topics at other institutions should be conducted to help generalize the findings.

In addition, some emerging themes in the quantitative and qualitative sections of the study suggested some inconsistencies between students' attitude toward the simulated laboratory and their perception of its use in place of the physical laboratory. A few students suggested that both simulation and physical laboratories should be used for a deeper understanding of the concepts without losing hands-on skill. Thus, with a large number of students, it would be interesting to design a true experiment with random assignment to three groups: one with simulation, one with a physical lab, and one with both. It would be informative to compare all three. This researcher would speculate that the third group would have higher achievement scores, higher knowledge retention, and higher satisfaction with the treatment. Only additional research would show the relationships clearly. In addition, in a replication study, an attitude survey may be administered over time to measure changes in students' attitudes.

Another recommendation for a similar study is to consider the students' motivation as an additional variable.

Perhaps one could investigate whether there is a correlation between performance score and motivation. Perhaps motivation would render the use of simulation more effective. As mentioned in the earlier section, the novelty of the simulation experience may have contributed to the high scores in the simulation group. Future research should control for the novelty effect by recruiting students who have previous exposure to simulation programs.

In a replication study, student interaction may also be considered as an additional variable. The actual experience within each group may have impacted their scores at each level. The physical group seemed to have enjoyed the verbal interaction during the program, which may have impacted their recall abilities. The simulation group may have utilized the simulation exercise to input additional variables, which may have led to their scoring higher than the physical group. This, however, may be the reason their recall dropped, since they did not experience verbal interactions with their classmates. Only additional research would show whether the student-student interaction had any effects on students' learning and recall.

This study revealed some interesting results on retention. More work in this area, with studies spanning a greater period of time and presenting other laboratory experiments, would further advance this line of inquiry. Another line of inquiry that this study did not address is what types of students were helped most by the simulation or physical experience. Is it students with a preferred visual processing mode? Can the simulation help those students who prefer learning by doing hands-on experiments? Perhaps having the concepts presented both by hands-on physical work and by simulated work helps all students. But such claims can only be supported by further study in this area.

It would also be interesting to conduct these laboratories in a pure distance education environment, where the students in the physical group would receive the lecture and perform the labs in a school setting and the simulation group would receive the lecture and conduct the laboratory experiments from a remote site.

The quantitative and qualitative results have also led to new questions worth exploring. Do the students have limited meta-cognitive awareness regarding the possible impact of the laboratory? Does access to technology generate motivating factors for students, most of whom stated that the simulated lab made the laboratory experience more interesting, easier, and less time-consuming? Is it possible that the motivational influence of technology is a separate factor in this study, and can this variable be isolated? Is there a correlation between the ease of use of the simulation program and cognitive learning?

References

Alessi, S. M. (2000). Building versus using simulation. In J. M. Spector & T. M. Anderson (Eds.), Integrated & holistic perspectives on learning, instruction & technology: Improving understanding in complex domains (pp. 175-196). Dordrecht, The Netherlands: Kluwer.

Alessi, S. M. & Trollip, S. R. (2001). Multimedia for learning: Methods and development. Needham, MA: Allyn & Bacon.

Alhaabi, B., Anandapuram, S. & Hamza, M. K. (1998). Real laboratories: An innovative repartee for distance learning. Retrieved from: http://www.cse.fau.edu/~bassem/Publications/Pub-21-JOpenPraxis1998.PDF

Alkazemi, E. (2003). The effect of the instructional sequence of a computer simulation and a traditional laboratory on middle-grade students' conceptual understanding of electrochemistry. Unpublished doctoral dissertation, University of Florida.

Anderson, J., Greeno, J., Reder, L., & Simon, H. (2000, May). Perspectives on learning, thinking, and activity. Educational Researcher, 11-13.

Aotani, M. (1997). Distance education and Web in engineering and mathematics: Examples and projects at Harvard, Stanford and UC at Berkeley. WebNet Conference (AACE).

Armstrong, P. S. (1991). Computer-based simulation in learning environments: A meta-analysis of outcomes. Dissertation Abstracts International, 53(01), 100A. (UMI No. 215517).

Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. New York: Longman.

Bourne, J. R., Brodersen, A. J., Campbell, J. O., Dawant, M. M., & Shiavi, R. G. (1997). A model for on-line learning networks in engineering education. Retrieved from: http://www.aln.org/publications/jaln/v1n1/pdf/v1n1_bourne.pdf

Brent, M. (2002). Selecting a distance education school. Retrieved from: http://www.usdla.org/html/journal/APR02_Issue/article05.html

Buttles, S. (1992, Nov/Dec). A model for incorporating and evaluating use of computer laboratory simulation in the non-majors biology course. American Biology Teacher, 54(8), 491-494.

Campbell, J. O., Bourne, J. R., Mosterman, P. J. & Brodersen, A. J. (2002). The effectiveness of learning simulation for electronics laboratories. Journal of Engineering Education, 91(1), 81-87.

Carnevale, D. (2000, January 7). Survey finds 72% rise in number of distance education programs [Electronic version]. The Chronicle of Higher Education, A57.

Cherryholmes, C. (1966). Some current research on effectiveness of education simulation: Implications for alternative strategies. The American Behavioral Scientist, 10, 4-7.

Chien, C. C. (1997). The effectiveness of interactive computer simulation on college engineering student conceptual understanding and problem solving ability related to circular motion. Unpublished doctoral dissertation, The Ohio State University.

Choi, B., & Gennaro, E. (1987). The effectiveness of using computer simulated experiments on junior high students' understanding of the volume displacement concept. Journal of Research in Science Teaching, 24(6), 539-552.

Chou, C. H. (1998). The effectiveness of using multimedia computer simulation coupled with social constructivist pedagogy in a college introductory physics classroom (electricity, magnetism). Unpublished doctoral dissertation, Columbia University Teachers College.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53, 445-459.

Clark, R. E. (2001). What is next in the media and methods debate? In R. E. Clark (Ed.), Learning from media (pp. 327-337). Greenwich, CT: Information Age Publishers, Inc.

Coleman, J. N., Kinniment, D. J., Burns, F. P., & Kolemans, A. M. (1998). Effectiveness of computer-aided learning as a direct replacement for lecturing in degree-level electronics. IEEE Transactions on Education, 41(3), 177-184.

Committee on Education (2002). Examining the benefits and challenges of web-based education. Retrieved from: http://frwebgate.access.gpo.gov

Crocker, L. & Algina, J. (1986). Introduction to classical and modern test theory. Orlando, FL: Holt, Rinehart and Winston, Inc.

Dekkers, J. & Donatti, S. (1981). The integration of research studies on the use of simulation as an instructional strategy. Journal of Educational Research, 74(6), 424-427.

De Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulation of conceptual domains. Review of Educational Research, 68, 179-202.

Dobson, E. L., & Hill, M. (1995). An evaluation of the student response to electronics teaching using a CAL package. Computers and Education, 25(1-2), 13-30.

Edwards, L. D. (1995). Microworlds as representations. In A. A. diSessa, C. Hoyles & R. Noss (Eds.), Computers and exploratory learning (pp. 127-154). Berlin/Heidelberg: Springer-Verlag.

Engle, R. S., Weinstock, M. A., Campbell, J. P., & Sathianathan, D. (1996, Spring). Pipe flow simulation software: A team approach to solve an engineering education problem. Journal of Computing in Higher Education, 7(2), 65-77.

Fredriksen, J., White, B. & Gutwill, J. (1999). Dynamic mental models in learning science: The importance of constructing derivational linkages among models. Journal of Research in Science Teaching, 36(7), 806-836.

Felder, R. & Silverman, L. (1988, April). Learning and teaching styles in engineering education. Engineering Education, 674-681.

Gomes, V. G., Choy, B., Barton, G. W., & Romagnoli, J. A. (2000). Web-based courseware in teaching laboratory-based courses. Global Journal of Engineering Education, 4(1), 65-72.

Gredler, M. E. (1996). Educational games and simulation: A technology in search of a research paradigm. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology. New York: Macmillan Library Reference USA.

Grosso, M. R. (1994). The comparison of computer simulation and traditional laboratory exercises in a college freshman chemistry course. Unpublished doctoral dissertation, State University of New York at Buffalo.

Hall, T. (2000). Quantitative analysis of the effectiveness of simulated electronics laboratory experiments. Journal of Engineering Technology, 17(2), 60-66.

Harper, A. M., Taranto, S. E., Edwards, E. B., & Daily, O. P. (2000). An update on a successful simulation project: The UNOS liver allocation model. Proceedings of the 2000 Winter Simulation Conference, 1955-1962.

Herron, J. D. (1996). The engineering classroom. Washington, D.C.: American Engineering Society.

Jonassen, D. H. (1994). Learning with media: Restructuring the debate. Educational Technology Research and Development, 42(2), 31-39.

Kadiyala, M. & Crynes, B. L. (2000). A review of literature on effectiveness of use of information technology in education. Journal of Engineering Education, 82(2), 177-189.

Kozma, R. (2000). Reflections on the state of educational technology research and development. Educational Technology Research and Development, 48(1), 5-15.

Lee, J. (1999). Effectiveness of computer-based instructional simulation: A meta-analysis. International Journal of Instructional Media, 26, 71-85.

Li, Y. S., LeBoeuf, E. J. & Turner, L. H. (2003). Development of web-based mass transfer processes laboratory: System development and implementation. Computer Applications in Engineering Education, 11(1), 25-39.

Linn, M. & Songer, N. B. (1988). Curriculum reformation: Incorporating technology into science instruction. (ERIC Document Reproduction Service No. ED 303352).

Mandal, P., Wong, K. K. & Love, P. E. D. (2000). Internet-supported flexible learning environment for teaching system dynamics to engineering students. Computer Applications in Engineering Education, 8(1), 1-10.

Mayer, R. E. (1989). Models of understanding. Review of Educational Research, 59(1), 43-64.

Mayer, R. E. & Sims, V. K. (1994). For whom is a picture worth a thousand words? Extensions of a dual-coding theory of multimedia learning. Journal of Educational Psychology, 86(3), 389-401.

McCracken, G. (1988). The long interview. London: SAGE Publications.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: SAGE Publications, Inc.

Miller, C. S., Lehman, J. F., & Koedinger, K. R. (1999). Goals and learning in microworlds. Cognitive Science, 23(3), 305-336.

Min, R. (1995). Simulation technology and parallelism in learning environments. De Lier, Netherlands: Academic Book Center.

Mintz, R. (1993). Computerized simulation as an inquiry tool. School Science and Mathematics, 93(2), 76-80.


Moghaddam, F. M., Walker, B. R., & Harre, R. (2003). Cultural distance, levels of abstraction and the advantages of mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 111-134). Thousand Oaks, CA: Sage Publications.

Moslehpour, S. (1993). A comparison of achievement resulting from learning electronics concepts by computer simulation versus traditional laboratory instruction. Unpublished doctoral dissertation, Iowa State University.

Munro, A., & Towne, D. M. (1992). Productivity tools for simulation-centered training development. Educational Technology Research & Development, 40(4), 65-80.

Murphy, T., Gomes, V. G. & Romagnoli, J. A. (2002). Facilitating process control in a virtual laboratory environment. Computer Applications in Engineering Education, 10(2), 79-87.

Orlansky, J. & String, J. (1979). Cost-effectiveness of computer-based instruction in military training. IDA Paper. Institute for Defense Analyses, Alexandria, VA.

Pantelidis, V. S. (1993). Virtual reality in the classroom. Educational Technology, 33, 23-27.

Papert, S. (1993). The Children's Machine: Rethinking School in the Age of the Computer. New York: Basic Books.

Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage.

Perkins, D. N., & Unger, C. (1994). A new look in representations for mathematics and science learning. Instructional Science, 22, 1-37.

Perry, D., Porter, A. & Votta, L. (2000). Empirical studies of software engineering: A roadmap. In A. Finkelstein (Ed.), The Future of Software Engineering (pp. 345-355). ACM.

Pfundt, S. & Duit, A. (1994). Bibliography: Students' alternative frameworks and science education (4th ed.). Kiel: IPN at the University of Kiel.

Pierfy, D. (1977). Comparative simulation game research: Stumbling blocks and stepping stones. Simulation and Games, 8(2), 255-268.

Powell, R. M., Anderson, H., Van der Spiegel, J. & Pope, D. P. (2003). Using web-based technology in laboratory instruction to reduce costs. Computer Applications in Engineering Education, 10(4), 204-214.

Pushkin, D. B. (1998). Introductory students, conceptual understanding and algorithmic success. Journal of Engineering Education, 75(7), 809-810.


Rieber, L. P. (1989). The effects of computer-animated elaboration strategies and practice on factual and application learning in an elementary science lesson. Journal of Educational Computing Research, 5(4), 431-444.

Rieber, L. P. (1990a). Animation in computer-based instruction. Educational Technology Research and Development, 38(1), 77-86.

Rieber, L. P. (1990b). Using computer animated graphics in science instruction with children. Journal of Educational Psychology, 82(1), 135-140.

Rieber, L. P. (1991a). Effects of visual grouping strategies of computer-animated presentations on selective attention in science. Educational Technology Research and Development, 39(4), 5-15.

Rieber, L. P. (1991b). Animation, incidental learning and continuing motivation. Journal of Educational Psychology, 83(3), 318-328.

Rieber, L. P. (1992). Computer-based microworlds: A bridge between constructivism and direct instruction. Educational Technology Research and Development, 40(1), 93-106.

Rieber, L. P. (1995). A historical review of visualization in human cognition. Educational Technology Research and Development, 43(1), 45-56.

Rieber, L. P. (1996). Animation as feedback in computer-based simulation: Representation matters. Educational Technology Research and Development, 44(1), 5-22.

Rieber, L. P., Boyce, M. J. & Assad, C. (1990). The effects of computer animation on adult learning and retrieval tasks. Journal of Computer-Based Instruction, 17(2), 46-52.

Rieber, L. P., Smith, L., & Noah, D. (1998). The value of serious play. Educational Technology, 38(6), 29-37.

Savander-Ranne, C. & Kolari, S. (2003). Promoting the conceptual understanding of engineering students through visualization. Global Journal of Engineering Education, 7(2), 189-200.

Sechrest, L. (1979). Unobtrusive measurement today. San Francisco: Jossey-Bass.

Springer, L., Stanne, M. E. & Donovan, S. S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering and technology: A meta-analysis. Review of Educational Research, 69(1), 21.

St. Clair, S. (2000). The Instructional Objective Writing Assistant (IOWA): Addressing the need for learning objectives in the engineering classroom. Master of Science thesis, April 2000.


Surry, D. W. & Ensminger, D. (2001). What's wrong with media comparison studies? Educational Technology, 41(4), 32-35.

Tennyson, R. D. (1996). Concept learning. In E. de Corte and F. E. Weinert (Eds.), International Encyclopedia of Developmental and Instructional Psychology (pp. 381-385). Elsevier Science / Pergamon.

Thiagarajan, S. (1998). The myths and realities of simulation in performance technology. Educational Technology, 38(5), 35-41.

Thurman, R. A. (1993). Instructional simulation from a cognitive psychology viewpoint. Educational Technology Research & Development, 41(4), 75-79.

Van LeJeune, J. (2002). A meta-analysis of outcomes from the use of computer-simulated experiments in science education. Unpublished Ed.D. dissertation, Texas A&M University.

Van Weert, T. J. (1995). IFIP Working Group 3.1: Towards integration of computers into education. In J. D. Tinsley and T. J. Van Weert (Eds.), WCCE '95: Liberating the Learner (pp. 3-12). London: Chapman and Hall.

Wankat, P. C. (1999). An analysis of the articles in the Journal of Engineering Education. Journal of Engineering Education, 88(1), 37.

Watts, M. M. (2003). Technology: Taking the distance out of learning. New Directions for Teaching and Learning, 94, 1-3.

Weller, H. G. (1996). Assessing the impacts of computer-based learning in science. Journal of Research on Computing in Education, 28(4), 461-484.

Wiesner, T. F. & Lan, W. (2004). Comparison of student learning in physical and simulated unit operations experiments. Journal of Engineering Education, 93(3), 195-205.

Windschitl, M. A. & Andre, T. (1998). Using computer simulation to enhance conceptual change: The roles of constructivist instruction and student epistemological beliefs. Journal of Research in Science Teaching, 35, 145-160.

Wyatt, T. (2000). Development and evaluation of an educational software tool for geotechnical engineering. Ph.D. dissertation, Georgia Institute of Technology, May 2000.

Zywno, M. S. & Waalen, J. K. (2001). Student outcomes and attitudes in technology-enabled and traditional education: A case study. Global Journal of Engineering Education, 5(1), 49-56.


Appendices


Appendix A: Physical Laboratory Experiment Sheets

Lab Experiment I: AM Modulator

Objectives:
1. To observe the operation of an AM modulator.
2. To observe the operation of an AM peak detector (demodulator).

Materials Used:

Equipment:
1 protoboard
1 dual dc power supply (+12 V dc and -5 to +5 V dc)
1 audio signal generator (0-20 kHz)
1 standard oscilloscope (10 MHz)
1 assortment of test leads and hookup wire

Parts List:
1 XR-2206 function generator
1 1 k-ohm resistor
3 4.7 k-ohm resistors
2 0.001 µF capacitors
2 10 k-ohm resistors
1 10 µF capacitor
1 47 k-ohm resistor
2 1 µF capacitors

A Linear Integrated-Circuit AM Modulator/Demodulator

In this section the processes of AM modulation and demodulation are observed. In this experiment the XR-2206 function generator (Figure 1) is used to generate an AM waveform. Assemble the circuits according to the schematic for the linear integrated-circuit AM modulator shown in Figure 2. Confirm the operation of the circuit as discussed in lecture and plot the input, intermediate and output waveforms per the following instructions:

1. Connect the oscilloscope probe to pin 1 of the XR-2206 to observe the input waveform (modulating signal). Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.
2. Connect the oscilloscope probe to pin 11 of the XR-2206 to observe the carrier wave for this modulation. Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.


Appendix A (Continued)

3. Connect the oscilloscope probe to pin 3 of the XR-2206 to observe the output waveform (modulated signal). Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.
4. Using the frequency knob on the generator, change the frequency of the modulating signal and repeat steps 1, 2 and 3 above.
5. Using the three groups of resistor-capacitor combinations given to you, replace the RC components on pins 5, 6, 7 and 8 (for 5 kHz, 500 kHz and 1500 kHz carrier frequencies, respectively) to change the carrier frequency and repeat steps 1, 2, 3 and 4 above.

[Figure 1. XR-2206 block diagram]

[Figure 2. AM Modulator]


Appendix A (Continued)

Lab Experiment II: FM Modulator

Objectives:
1. To observe the operation of an FM modulator.
2. To observe the operation of an FM peak detector (demodulator).

Materials Used:

Equipment:
1 protoboard
1 dual dc power supply (+12 V dc and -3 to +3 V dc)
1 audio signal generator (0 Hz to 20 kHz)
1 standard oscilloscope (10 MHz)
1 assortment of test leads and hookup wire

Parts List:
1 XR-2206 function generator
1 1 k-ohm variable resistor
3 4.7 k-ohm resistors
2 0.001 µF capacitors
2 10 k-ohm resistors
1 10 µF capacitor
1 47 k-ohm resistor
2 1 µF capacitors

A Linear Integrated-Circuit FM Modulator/Demodulator

Procedure: In this section the processes of FM modulation and demodulation are observed. In this experiment the XR-2206 function generator (Figure 1) is used to generate an FM waveform. Assemble the circuits according to the schematic for the FM modulator shown in Figure 2. The output (modulated signal) is connected to and displayed on the screen of an oscilloscope. Confirm the operation of the circuit as discussed in lecture and plot the input, intermediate and output waveforms per the following instructions:

1. Connect the oscilloscope probe to pin 1 of the XR-2206 to observe the input waveform (modulating signal). Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.


Appendix A (Continued)

2. Connect the oscilloscope probe to pin 11 of the XR-2206 to observe the carrier wave for this modulation. Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.
3. Connect the oscilloscope probe to pin 3 of the XR-2206 to observe the output waveform (modulated signal). Using the Auto-Plot button on the oscilloscope, get a hard-copy plot of the waveform.
4. Using the frequency knob on the generator, change the frequency of the modulating signal and repeat steps 1, 2 and 3 above.
5. Using the three groups of resistor-capacitor combinations given to you, replace the RC components on pins 5, 6, 7 and 8 (for 5 kHz, 5 MHz and 100 MHz carrier frequencies, respectively) to change the carrier frequency and repeat steps 1, 2, 3 and 4 above.

[Figure 1. XR-2206 block diagram]


Appendix A (Continued)

[Figure 2. FM Modulator]


Appendix B: Simulated Laboratory Experiment Sheets

Lab Experiment I: AM Modulator

Objectives:
1. To observe the operation of an AM modulator.
2. To observe the operation of an AM peak detector (demodulator).

A Simulated AM Modulator/Demodulator

When a relatively high-frequency carrier signal is mixed in a nonlinear device with a relatively low-frequency modulating signal, amplitude modulation occurs. In this experiment a MATLAB-based simulator is used to generate an AM waveform. The output (modulated signal) is displayed on the screen of the computer monitor. The graphical user interface of the simulator allows you to change frequencies relevant to the operation of a mod/demodulator and display the results with the new setup. You can also change the type of mod/demodulation from a drop-down menu. Confirm the operation of the simulator as discussed in lecture and plot the input, intermediate and output waveforms per the following instructions:

1. Run and display the simulator program from the MATLAB interface. From the modulation-type drop-down menu, select AM to observe the input waveform (modulating signal), carrier wave and demodulated signal on the monitor of your computer. Using the Print Screen button on the keyboard, get a hard-copy plot of these waveforms.
2. Using the frequency drop-down menus on the screen, change the carrier and sampling frequencies and repeat step 1 above.
3. Repeat step 2 to cover a wide range of frequencies that are one decade apart.
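
For readers without access to the simulator itself, the following minimal MATLAB sketch reproduces the gist of the AM experiment using the Signal Processing Toolbox functions MODULATE and DEMOD, on which the simulator is built (see Appendix K). The message, carrier and sampling frequencies here are illustrative values, not the simulator's defaults:

    % Minimal sketch of the AM experiment (illustrative parameter values only).
    fs = 8000;                       % sampling frequency, Hz (assumed)
    t  = (0:1/fs:2)';                % two seconds of signal
    x  = sin(2*pi*1*t);              % 1 Hz sine message, like the simulator's "Sine" option
    fc = 200;                        % carrier frequency, Hz (assumed; must be < fs/2)
    y  = modulate(x, fc, fs, 'am');  % AM (double-sideband, suppressed carrier)
    xr = demod(y, fc, fs, 'am');     % demodulated ("reconstructed") message
    subplot(3,1,1); plot(t, x);  title('Message signal');
    subplot(3,1,2); plot(t, y);  title('Modulated signal');
    subplot(3,1,3); plot(t, xr); title('Demodulated signal');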


Appendix B (Continued)

Lab Experiment II: FM Modulator

Objectives:
1. To observe the operation of an FM modulator.
2. To observe the operation of an FM peak detector (demodulator).

A Simulated FM Modulator/Demodulator

When a relatively low-frequency modulating signal varies the instantaneous frequency of a relatively high-frequency carrier signal, frequency modulation occurs. In this experiment a MATLAB-based simulator is used to generate an FM waveform. The output (modulated signal) is displayed on the screen of the computer monitor. The graphical user interface of the simulator allows you to change frequencies relevant to the operation of a mod/demodulator and display the results with the new setup. You can also change the type of mod/demodulation from a drop-down menu. Confirm the operation of the simulator as discussed in lecture and plot the input, intermediate and output waveforms per the following instructions:

1. Run and display the simulator program from the MATLAB interface. From the modulation-type drop-down menu, select FM to observe the input waveform (modulating signal), carrier wave and demodulated signal on the monitor of your computer. Using the Print Screen button on the keyboard, get a hard-copy plot of these waveforms.
2. Using the frequency drop-down menus on the screen, change the carrier and sampling frequencies and repeat step 1 above.
3. Repeat step 2 to cover a wide range of frequencies that are one decade apart.
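
An analogous hedged sketch for the FM experiment differs only in the method string passed to MODULATE and DEMOD (the frequencies are again illustrative assumptions, not the simulator's settings):

    % Minimal FM counterpart of the AM sketch above.
    fs = 8000;  t = (0:1/fs:2)';     % assumed 8 kHz rate, two seconds
    x  = sin(2*pi*1*t);              % 1 Hz sine message
    fc = 200;                        % illustrative carrier frequency, Hz
    y  = modulate(x, fc, fs, 'fm');  % carrier's instantaneous frequency follows x
    xr = demod(y, fc, fs, 'fm');     % recovered message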


Appendix C: Pre-Lab Instruction

Good afternoon! My name is ____________. I am the lab assistant for this section of the course. Every one of you should have a folder with two laboratory experiment guidelines. If you don't have them, please let me know. For the next 20 minutes, I will be discussing with you some information and some rules that you need to know in order to complete the laboratory experiments, and then I will answer your questions.

You will have two hours to complete the two lab experiments. We will all start at the same time, so please write the start time on the time log sheet. I will stamp the ending time on that sheet when you turn in your completed lab experiments. Please work on each experiment individually. Do not share information, help each other, or ask any questions during the session. If you have technical problems with the lab equipment, please report the problem to me. If you have difficulty with the lab experiments, record the problem on a piece of paper, since I won't be answering questions that are related to the experiments. After completing the experiments, please stay in the lab to take a test.

Last but not least, while performing your laboratory assignments, please think about the mental strategies that you use to achieve the objectives of the lab. Remember that the key to understanding the signal transmission process is to grasp 'the big picture,' or how all the little things you learn are part of the complete description of how digital and analog signals interact. You will have to focus on details, but as you're doing so, periodically think about how they fit into the big picture. Understanding concepts is more important than memorizing facts. If you truly understand the concepts by performing these experiments, then you have a good grasp of the concept of modulation and demodulation.


Appendix D: Student Demographic and Background

Name: _________________ Date: __________

1. Gender: ___ Male ___ Female

2. Academic level: ____ Freshman ____ Sophomore ____ Junior ____ Senior

3. Age: ____ Less than 20 ____ 20-30 ____ 31-40 ____ Above 40

4. Have you taken any engineering course in which simulation software was used? ____ Yes ____ No
If yes, was it used:
____ as a tutorial to enhance lecture material
____ as a substitute for lecture
____ as a tool to enhance laboratory instruction
____ as a substitute for physical laboratory instruction

5. What grade did you earn in Circuit I? ____ A ____ B ____ C ____ D

6. Have you studied the concept of modulation and demodulation previously in another class? ____ Yes ____ No
If yes, do you understand that topic really well? ____ Yes ____ No


Appendix E: Attitude Survey Questionnaires

Survey Questionnaire for the Simulation Group

Please use the following scale to rate each statement and circle the number that best describes your answer (circle one).

SD = Strongly Disagree   D = Disagree   A = Agree   SA = Strongly Agree

1. The simulation motivates me to learn. 1 2 3 4
2. The simulation is dull and uninteresting. 1 2 3 4
3. The simulation is a better tool than the regular physical laboratory. 1 2 3 4
4. The simulation is enjoyable. 1 2 3 4
5. It takes less time to do the lab experiments using the simulation. 1 2 3 4
6. The simulation is not effective for laboratory use. 1 2 3 4
7. The simulation makes learning faster. 1 2 3 4
8. The simulation is not as effective as physical laboratory experiments. 1 2 3 4
9. The simulation makes understanding of the conceptual theories clearer. 1 2 3 4
10. The simulation would be an excellent laboratory tool. 1 2 3 4
11. Doing the experiments with the simulation is motivating. 1 2 3 4
12. More simulation programs like this one are needed in our educational system. 1 2 3 4
13. The use of simulation technologies is an effective method of conducting laboratory activities. 1 2 3 4


Appendix E (Continued)

Survey Questionnaire for Both Groups

Please use the following scale to rate each statement and circle the number that best describes your answer (circle one).

SD = Strongly Disagree   D = Disagree   A = Agree   SA = Strongly Agree

1. The laboratory experiments complement the lectures. 1 2 3 4
2. Conducting lab experiments increases your knowledge; you learn about things that you otherwise would not have learned from pure lecture. 1 2 3 4
3. Conducting lab experiments made concepts easier to understand. 1 2 3 4
4. Through doing the lab experiments you get an idea of how things work. 1 2 3 4
5. Lab experiments made the subject more interesting. 1 2 3 4
6. Lab experiments made the subject less abstract. 1 2 3 4
7. The information provided was clear. 1 2 3 4
8. Working with the program took up too much time. 1 2 3 4
9. Pre-lab instruction was helpful. 1 2 3 4


Appendix F: Conceptual Achievement Test

Directions

Before taking the exam, read the following statements, which provide some strategies you may use to solve the problems. Think about each problem and determine which strategy can help you arrive at a correct solution.

Ask yourself: do you understand the problem?
Try to remember whether you have solved a similar problem in the laboratory session.
Think about what information gained from the lecture and the laboratory session you need to solve this problem.
Create a picture in your head or on a piece of paper to help you understand the problem.
On a separate sheet of paper, jot down formulas and the information needed to solve the problem.
Think about how the concepts learned during the laboratory sessions can help you solve the problem.
Look back at your solution and check that it all makes sense.
Determine whether you have solid evidence to support your solution.


Appendix F (Continued)

Name: _______________________ Date: ____________________

1. Four waveforms are shown in the following figures. The first waveform is the source. Which waveform is the single-sideband suppressed-carrier AM modulation?
a. Source signal
b. [waveform plot]
c. [waveform plot]
d. [waveform plot]

2. Explain the physical differences between AM and FM modulation.
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________


Appendix F (Continued)

3. How does changing the carrier frequency and sampling frequency affect the AM modulation?
_______________________________________________________________
_______________________________________________________________

4. How does changing the carrier frequency and sampling frequency affect the FM modulation?
_______________________________________________________________
_______________________________________________________________

5. What is the relationship between carrier frequency and sampling frequency for optimal modulation?
_______________________________________________________________
_______________________________________________________________

6. Plot the modulated waveform of an AM modulator with a carrier frequency of 500 kHz and a modulating signal frequency of 10 kHz.


Appendix F (Continued)

7. Plot the modulated waveform of an FM modulator with a carrier frequency of 100 MHz and a modulating signal frequency of 10 kHz.

8. If the carrier frequency is set to 20 MHz, what can the maximum sampling frequency for a 6 kHz signal in an FM modulated signal be?
a. 6 kHz
b. 12 kHz
c. Greater than 20 kHz
d. Greater than 12 kHz
e. None of the above

9. Sketch the output envelope for an AM modulator with a carrier frequency of 500 kHz and a modulating signal of 10 kHz.

10. In AM modulation, which of the following stays constant?
a. Amplitude
b. Phase
c. Frequency
d. Both A & C


Appendix F (Continued)

11. In FM modulation, which of the following stays constant?
a. Amplitude
b. Phase
c. Frequency
d. Both A & C

12. Based on your observations made in the lab, provide an outline of the modulation and demodulation process.
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________

Exit Questions:

Did your laboratory experience improve your ability to answer the questions on this exam? If so, please do your best to explain how.
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________

Did you use any of the above strategies to solve the problems? If yes, which ones did you use? Did you find them useful?
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________


Appendix F (Continued)

Did you use any other strategy that was not listed? Please explain.
_______________________________________________________________
_______________________________________________________________
_______________________________________________________________


Appendix G: Instructor Rating Sheet for Open-ended Questions on Post-test

Grading Rubric

Written Questions:
4 - Demonstrates complete understanding: the answer is clearly and correctly stated.
3 - Demonstrates considerable understanding: the answer is correct, with minor omissions or inaccuracies.
2 - Demonstrates limited understanding: the answer is partially stated, and/or some evidence of a correct result is shown, or invalid assumptions are made.
1 - Demonstrates little or no understanding: the answer, if any, shows evidence of little understanding, or no attempt at solving the problem is made.

Plot Questions:
The plot is correct: ___ Yes (1 point) ___ No (0 points)

Figure G-1. Grading Rubric


Appendix H: Qualitative Instruments

Individual Questionnaire

Thank you for your participation in this study. The following survey is to assist in evaluating your opinion of the simulation program as it relates to its use, or non-use, as an alternative to the physical laboratory.

1. In what ways do you think the simulation program was effective as a tool for conducting the laboratory experiments?
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________

2. Do you think that simulation programs could be a substitute for physical laboratory activities?
___ Yes ___ No
Explain.
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________

3. Do you feel the simulation program would be beneficial to online students?
___ Yes ___ No
Explain.
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________


Appendix H (Continued)

4. Should the simulation program be incorporated into an online communication systems course?
___ Yes ___ No
Explain.
__________________________________________________________________
__________________________________________________________________

5. Should the simulation program be used instead of the physical laboratory for the communication systems course?
___ Yes ___ No
Explain.
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________

6. If you had a choice for conducting similar experiments, which would you choose?
___ Simulated laboratory ___ Traditional laboratory
Explain.
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________

7. Did you have any problems with using the simulation software?
__________________________________________________________________
__________________________________________________________________
__________________________________________________________________


Appendix H (Continued)

Sample Group Interview Questions for the Simulation Group

Background Information Questions
Have you ever participated in a study where simulation was used?

On Simulation
Did you think the simulation was helpful?
What were the difficulties that you had with using the simulation?
Did the use of the simulation help you in solving the problems on the exam?
Do you think the simulation can be a substitute for the physical laboratory?
If you had a choice for conducting similar experiments, which would you choose, simulation or physical laboratory?
Do you think online students can benefit from the use of simulation?
Do you feel you learn better when you have a hands-on experience?
Based on your prior experience with physical labs, what is your perception of the time spent using simulation versus the physical laboratory?

On Laboratory Experiments
Did the lab experiments help you understand the concept of modulation/demodulation better? If yes, in what way?
Did the laboratory experiments help you in solving the problems on the exam?

On Solving the Problems
When solving problems, do you ever use strategies? If yes, what strategies do you use?
Did you find the strategies provided to you before taking the exam useful?
Do you think it would be easier if you had a strategy before solving a problem?


Appendix H (Continued)

Sample Group Interview Questions for the Physical Group

On Laboratory Experiments
Did the lab experiments help you understand the concept of modulation/demodulation better? If yes, in what way?
Did the laboratory experiments help you in solving the problems on the exam?

On Solving the Problems
When solving problems, do you ever use strategies? If yes, what strategies do you use?
Did you find the strategies provided to you before taking the exam useful?
Do you think it would be easier if you had a strategy before solving a problem?


Appendix I: Consent Form

CONSENT FORM
AUTHORIZATION TO PARTICIPATE IN RESEARCH PROJECT

Consent is hereby given to participate in the study titled: A Comparison of Traditional Physical Laboratory and Computer Simulated Laboratory Experiences in Relation to Engineering Undergraduate Students' Conceptual Understanding of a Communication Systems Topic

Purpose
The purpose of this research is to explore the effects of using a simulation program for conducting modulation and demodulation laboratory experiments and to compare those effects with the traditional physical laboratory. This study is an effort to improve classroom instruction as well as online instruction in engineering courses.

Procedures
By taking part in this study, you will be asked to attend a lecture session, which will last one hour and 30 minutes, and you will then be given two laboratory assignments with a duration of two hours. At the completion of the lab assignments, you will be asked to take a 40-minute exam, and then to take 10 minutes to answer an attitude survey, which will measure your attitude toward the instructional method. You may then be randomly selected to answer some interview-type questions for the qualitative portion of the study. The interview will be paper-based: you will be asked to answer some interview-type questions to express your thoughts and feelings about the instructional tool in more detail. The duration of the experiment is five hours. You must be at least 18 years of age to participate in this study.

Benefits
The greatest potential benefit of your participation in this study is that you have the opportunity to influence the development of future online laboratory teaching methods.

Risks
As a participant, there are no known physical, psychological or emotional risks in taking part in this study.

Voluntary Participation/Withdrawal
Taking part in this study is voluntary. You may choose not to take part in this study, or, if you decide to take part and later change your mind, you can withdraw from the study. Withdrawal from the study will involve no penalty. It will not affect your grade in the course in any way. You are free not to answer or to skip any item or question. In addition, you will complete course assignments whether you agree to participate or not. If at any time you decide to stop participating, please contact Giti Javidi at 601-266-5949.


Appendix I (Continued)

Confidentiality
All information collected about you during the course of this study will be kept confidential to the extent permitted by law. Completed questionnaires and post-tests will be locked in the researcher's office. The researcher is the only person with access to this room except the cleaning personnel. To avoid any unauthorized access, these papers will be kept in a locked file. Any documents not used in the study will be destroyed. You will not be identified in the study by any identifying data. The questionnaires will utilize a non-identifying numbering system, so your responses will remain anonymous. In addition, any potentially identifiable descriptions of students shall be withheld from the study itself to protect your anonymity. The investigator and the instructor will not know who is participating and who is not until after the grades are turned in.

Questions
Whereas no assurance can be made concerning results that may be obtained (since results from investigational studies cannot be predicted), the researcher will take every precaution consistent with the best scientific practice. Participation in this project is completely voluntary, and subjects may withdraw from this study at any time without penalty, prejudice or loss of benefits. Questions concerning the research should be directed to Giti Javidi, (601) 266-5949. This project and this consent form have been reviewed by the Institutional Review Board, which ensures that research projects involving human subjects follow federal regulations. Any questions or concerns about rights as a research subject should be directed to the Chair of the Institutional Review Board, The University of Southern Mississippi, Box 5147, Hattiesburg, MS 39406, (601) 266-6820.

A copy of this form will be given to the participant. In conformance with the federal guidelines, the signature of the subject or parent or guardian must appear on all written consent documents. The University also requires that the date and the signature of the person explaining the study to the subject appear on the consent form.

__________________________________ ______________________
Signature of the Research Subject   Date

__________________________________ _______________________
Signature of the Person Explaining the Study   Date


Appendix J: Pilot Study

The pilot study consisted of 16 subjects (enrolled in an undergraduate Signal Processing course in the College of Engineering Technology) who signed the consent form (Appendix I) and were randomly assigned to either a hands-on treatment group or a computer simulation treatment group. Subjects met in the classroom and received the same set of lectures, instructions and experimental procedures as described in Chapter 3. Based on the data from the demographic survey, female students numbered 4, while male students numbered 12. Students ranged in age from 20-30, with the exception of one male student aged 31-40. Four students were at the junior level, while the remaining 12 were at the senior level. None of the students had any previous experience with any type of simulation. Students' average grade in Circuit I or any similar course was B.

Conceptual Achievement Test Reliability: As shown in Table J-1, Cronbach's alpha for the performance measure was .708 (based on standardized items).


Appendix J (Continued)

Reliability

Cronbach's Alpha: .655
Cronbach's Alpha Based on Standardized Items: .708
N of Items: 12

Item Statistics
Item   Mean     Std. Deviation   N
p1     .6875    .47871           16
p2     2.5625   1.03078          16
p3     2.1250   .80623           16
p4     1.9375   .77190           16
p5     1.0625   1.28938          16
p6     1.6250   .71880           16
p7     1.6250   .80623           16
p8     .0625    .25000           16
p9     1.3750   .88506           16
p10    .4375    .51235           16
p11    .3750    .50000           16
p12    2.6875   .60208           16

Item-Total Statistics
Item   Scale Mean if   Scale Variance if   Corrected Item-     Squared Multiple   Cronbach's Alpha
       Item Deleted    Item Deleted        Total Correlation   Correlation        if Item Deleted
p1     15.8750         17.583              -.021               .826               .671
p2     14.0000         14.800              .235                .691               .652
p3     14.4375         14.929              .345                .742               .626
p4     14.6250         15.450              .277                .858               .638
p5     15.5000         15.867              .019                .721               .724
p6     14.9375         13.796              .640                .749               .576
p7     14.9375         13.929              .523                .938               .592
p8     16.5000         16.933              .356                .617               .644
p9     15.1875         13.096              .601                .923               .570
p10    16.1250         16.117              .328                .748               .634
p11    16.1875         15.229              .577                .900               .606
p12    13.8750         16.650              .146                .592               .656

Table J-1. Performance measure statistics
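
The alpha coefficients in Table J-1 come from SPSS, but the raw (unstandardized) statistic is simple to verify independently. The following minimal MATLAB sketch computes Cronbach's alpha from an item-score matrix; the matrix X below is a random placeholder standing in for the actual 16 x 12 pilot data, which is not reproduced in this appendix:

    % Cronbach's alpha from an n-by-k item-score matrix (rows = subjects).
    X = randi([0 4], 16, 12);          % placeholder scores; substitute the real item data
    k        = size(X, 2);             % number of items (12)
    itemVar  = var(X);                 % variance of each item (column-wise)
    totalVar = var(sum(X, 2));         % variance of each subject's total score
    cronbachAlpha = (k/(k-1)) * (1 - sum(itemVar)/totalVar)   % raw coefficient alpha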


Appendix J (Continued)

Comparison of Group Means on Each Item: Group means for each item on the exam were compared for further analysis (Table J-2). On question 1 (analysis level), the simulation group had a mean of 0.37 (SD = 0.52), whereas the physical laboratory group had a mean of 1.0 (SD = 0), indicating that the physical laboratory group did better on this question. On question 5 (evaluation level), the simulation group had a mean of 1.75 (SD = 1.39), whereas the physical laboratory group had a mean of 0.37 (SD = 0.74), indicating that the simulation group did better on this question. On question 8 (evaluation level), the simulation group had a mean of 0.12 (SD = 0.35), whereas the physical laboratory group had a mean of 0.00 (SD = 0.00), indicating that the simulation group did better on this question and that no one in the physical laboratory group answered this question correctly.


Appendix J (Continued)

Item   Group        N   Mean     Std. Deviation   Std. Error Mean
P1     Simulation   8   .3750    .51755           .18298
       Lab          8   1.0000   .00000           .00000
P2     Simulation   8   2.3750   1.30247          .46049
       Lab          8   2.7500   .70711           .25000
P3     Simulation   8   2.5000   .75593           .26726
       Lab          8   1.7500   .70711           .25000
P4     Simulation   8   2.0000   .92582           .32733
       Lab          8   1.8750   .64087           .22658
P5     Simulation   8   1.7500   1.38873          .49099
       Lab          8   .3750    .74402           .26305
P6     Simulation   8   1.3750   .91613           .32390
       Lab          8   1.8750   .35355           .12500
P7     Simulation   8   1.5000   .92582           .32733
       Lab          8   1.7500   .70711           .25000
P8     Simulation   8   .1250    .35355           .12500
       Lab          8   .0000    .00000           .00000
P9     Simulation   8   1.2500   1.03510          .36596
       Lab          8   1.5000   .75593           .26726
P10    Simulation   8   .3750    .51755           .18298
       Lab          8   .5000    .53452           .18898
P11    Simulation   8   .5000    .53452           .18898
       Lab          8   .2500    .46291           .16366
P12    Simulation   8   2.6250   .51755           .18298
       Lab          8   2.7500   .70711           .25000

Table J-2. Group means for performance test items


Appendix J (Continued)

Independent Samples t-test for H01: In addition, the data obtained from the pilot test were used to test the first null hypothesis, which states that there is no significant difference (at the p = .05 level) on post-test scores between students performing physical experiments on a traditional communication systems topic and those performing the same experiments using a computerized simulation program. An independent-samples t-test was used to determine whether there were statistically significant differences between the two groups on performance test scores. As shown in Figure J-1, the post-test scores (simulation: M = 16.75, SD = 5.60; physical laboratory: M = 16.38, SD = 2.56) were not significantly different (t(14) = 0.172, p = .866). Therefore, the simulation group's performance was comparable to that of the physical laboratory group.

Group        N   Mean      Std. Deviation   Std. Error Mean
Simulation   8   16.7500   5.59974          1.97981
Lab          8   16.3750   2.55999          .90509

Levene's Test for Equality of Variances: F = 4.613, Sig. = .050

t-test for Equality of Means (TOTAL)
                              t      df      Sig. (2-tailed)   Mean Diff.   Std. Error Diff.   95% CI Lower   95% CI Upper
Equal variances assumed       .172   14      .866              .3750        2.17689            -4.29396       5.04396
Equal variances not assumed   .172   9.804   .867              .3750        2.17689            -4.48862       5.23862

Figure J-1. Independent samples t-test for performance
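
The same comparison is easy to reproduce in MATLAB with the Statistics Toolbox function ttest2. In the sketch below the two group vectors are random placeholder draws matching only the reported means and standard deviations; they are not the actual pilot scores:

    % Independent-samples t-test on the two groups' post-test totals.
    rng(1);                             % reproducible placeholder data only
    sim = 16.75 + 5.60*randn(8,1);      % stand-ins for the simulation group's totals
    lab = 16.38 + 2.56*randn(8,1);      % stand-ins for the physical-lab group's totals
    [h, p, ci, stats] = ttest2(sim, lab);    % h = 0 -> fail to reject H0 at alpha = .05
    fprintf('t(%g) = %.3f, p = %.3f\n', stats.df, stats.tstat, p);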


Appendix J (Continued)

In addition, the data from the pilot study were used to pilot-test the third hypothesis, which states that students performing the experiments using the computerized simulation program demonstrate a negative attitude toward the use of the simulation in place of physical laboratory equipment. As shown in Table J-3, the result of the pilot test revealed that the students in the simulation group had a positive attitude (M = 37.5, SD = 6.78) toward the use of the simulation program in place of physical laboratory equipment.

N: Valid 8, Missing 8
Mean: 37.5000   Median: 38.5000   Std. Deviation: 6.78233   Minimum: 23.00   Maximum: 46.00

Score              Frequency   Percent   Valid Percent   Cumulative Percent
23.00              1           6.3       12.5            12.5
34.00              1           6.3       12.5            25.0
38.00              2           12.5      25.0            50.0
39.00              1           6.3       12.5            62.5
41.00              2           12.5      25.0            87.5
46.00              1           6.3       12.5            100.0
Valid total        8           50.0      100.0
Missing (system)   8           50.0
Total              16          100.0

Table J-3. Descriptive statistics for student attitude


Appendix K: Simulation Program

An Overview of the Subject Matter

The increasing availability of fast personal computers is making simulation techniques effective for teaching in many areas. Many systems can easily be computer-simulated and their behavior analyzed under different working conditions. The accuracy of results may also increase compared with the hardware approach, where students need to read and document information from various instruments. The simulation used for the purpose of this study uses the MATLAB programming language to demonstrate ways in which communication systems can be simulated. A typical communication system consists of a transmitter, a channel and a receiver, as shown in Figure K-1. Also shown is the noise source, whose output is added to the modulated signal.

[Figure K-1. A Typical Communication System block diagram]


Appendix K (Continued)

Modulation

Modulation is an essential process in communication, since it enables multiple signals to be transmitted simultaneously over a common medium or communication channel. The process involves transferring the spectrum of the signal to be transmitted (i.e., the modulating signal) to a higher frequency, using a signal known as the carrier. For a sinusoidal carrier, its amplitude, frequency or phase can be varied by the modulating signal. When its amplitude is varied in accordance with the modulating signal, the form of modulation is known as amplitude modulation. If, on the other hand, its frequency or phase is varied, the result is frequency or phase modulation, respectively.

Description of the Simulation Program

The simulation program was developed in MATLAB, an interactive programming language for scientific and engineering computations. The basic units in MATLAB are matrices, which MATLAB enables to be easily manipulated; for example, they can be added, subtracted, multiplied, divided, transposed, etc. MATLAB has numerous toolboxes, which allow scientific and engineering mathematical operations to be carried out with a minimum amount of programming. This specific simulation was built on the Signal Processing Toolbox.
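
In standard textbook notation (supplied here for reference; these formulas are not taken from the simulator's documentation), with message signal m(t), carrier amplitude A_c, carrier frequency f_c, modulation index mu and frequency-deviation constant k_f, the two schemes used in this study take the forms:

    s_{\mathrm{AM}}(t) = A_c \left[ 1 + \mu\, m(t) \right] \cos(2\pi f_c t)

    s_{\mathrm{FM}}(t) = A_c \cos\!\left( 2\pi f_c t + 2\pi k_f \int_0^t m(\tau)\, d\tau \right)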


Appendix K (Continued)

The program allows the students to experiment with four different types of modulation scheme: Amplitude Modulation (AM), Frequency Modulation (FM), Phase Modulation (PM) and Amplitude Modulation Single Sideband (AMSSB). In this study, the students will only be experimenting with the amplitude and frequency modulations. The simulation uses MODULATE and DEMOD in the Signal Processing Toolbox to implement these schemes. The message signal is displayed in the top left plot, the modulated version in the middle left plot, and the demodulated version of the modulated signal (the "reconstructed" waveform) in the bottom left plot. The popup menus on the upper right of the figure control:

1. How to display the signals (upper popup). The choices include:
o Time: time-domain waveform
o Psd: power spectral density (frequency domain)
o Specgram: spectrogram (time AND frequency domain)

2. Which message signal to use (lower popup). The choices include:
o Speech: digitized speech waveform originally sampled at 7418 Hz
o Sine: 2 seconds' worth of a 1 Hertz sine wave
o Square: 2 seconds' worth of a 1 Hertz square wave
o Triangle: 2 seconds' worth of a 1 Hertz triangle wave

Fc and Fs are the carrier and sampling frequencies, respectively, in Hertz. In each modulation scheme, a "carrier signal" (cosine) of frequency Fc is altered in some way by the message signal:

AM: the amplitude of the carrier is the message signal (Figure K-2)
FM: the instantaneous frequency of the carrier is the message signal (Figure K-3)
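
A rough MATLAB sketch of the three display modes is shown below; the signal construction and parameter values are illustrative assumptions, not the simulator's own code:

    % Build an example modulated signal, then view it three ways.
    fs = 8000;  t = (0:1/fs:2)';                   % 2 s at an assumed 8 kHz rate
    y  = modulate(sin(2*pi*1*t), 200, fs, 'am');   % 1 Hz sine on an assumed 200 Hz carrier
    subplot(3,1,1); plot(t, y); title('Time');                 % time-domain waveform
    subplot(3,1,2); pwelch(y, [], [], [], fs); title('Psd');   % power spectral density
    subplot(3,1,3); spectrogram(y, 256, [], [], fs, 'yaxis');  % Specgram view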


Appendix K (Continued)

The "Play" buttons let the students listen to the message signals in speech, sine, square or triangle form. However, the speech form is recommended for the students to use in their experiments.


Appendix K (Continued)

[Figure K-2. An example of a modulated signal using AM modulation]


Appendix K (Continued)

[Figure K-3. An example of a modulated signal using FM modulation]


Appendix L: Research Instrument Validation Forms

Instrument Review Form

Thank you for agreeing to serve as a reviewer for the instructional material and experimental instruments provided to you in this package. The package includes the objectives of the lesson, the lecture material, the laboratory experiments, a post-laboratory exam and a copy of the simulation program. The purpose of the review is to test the material in terms of content validity. Please notice that there are two folders included in the package, each containing two sets of laboratory experiment guidelines. The folders are labeled as "simulation group" or "physical lab group." Other than these obvious differences, please comment on any observations that you make while reviewing these materials. Your feedback is greatly appreciated!

Giti Javidi

Figure L-1. Instrument validation form


Appendix L (Continued)

Element: Lesson
0 points: The lesson is not focused on the content area.
1 point: The lesson is loosely focused on the content area.
2 points: The lesson is focused on the content area.
3 points: The lesson is tightly focused on the content area.
Comments:

Element: Objectives
0 points: The objective(s) is (are) imprecise or unclear and do not identify the learning that will take place. The objectives do not address higher-order thinking skills.
1 point: Some of the objectives are clear and some are not. At least one objective addresses higher-order thinking skills.
2 points: Each objective is stated in terms of student behavior, identifies the learning that will take place, and is measurable and observable. More than one objective addresses higher-order thinking skills.
3 points: Each objective is stated in terms of student behavior, identifies the learning that will take place, and is measurable and observable. The objectives address higher-order thinking skills.
Comments:

Figure L-2. Grading Rubric


Appendix L (Continued)

Element: Laboratory Experiments
0 points: The laboratory experiments are disconnected from the lecture material and from each other. They are not focused on the objectives.
1 point: The laboratory experiments are focused on the objectives but are hard to follow and contain some errors.
2 points: The laboratory experiments are focused on the objectives and are hard to follow, but contain no errors.
3 points: The laboratory experiments are focused on the objectives and are easy to follow, with no errors.
Comments:

Element: Exam
0 points: The exam questions are not focused on the objectives and are irrelevant.
1 point: The exam questions are focused on the objectives but contain some errors.
2 points: The exam questions are focused on the objectives, are clear, and contain no errors.
3 points: The exam questions are focused on the objectives, are clear, and contain no errors. The questions are appropriate and sufficient.
Comments:

Figure L-2. Continued


Appendix M: Request for Permission for Observations and Interview

REQUEST FOR PERMISSION FOR OBSERVATION AND INTERVIEW

Title of project: A Comparison of Traditional Physical Laboratory and Computer Simulated Laboratory Experiences in Relation to Engineering Undergraduate Students' Conceptual Understanding of a Communication Systems Topic

Principal Investigator: Giti Javidi, gjavidi@vsu.edu

Dear _________________________________ (Student's Name):

This is to acknowledge that you have signed a consent form agreeing to participate in the study named above. I would like to thank you sincerely for your participation and offer of help. In the meantime, I would like to inform you that you have been selected for observation during the lab session designated for this study. I will conduct the observation and take notes while you are completing the lab experiments. The observation will also be followed by a group interview, which will be conducted several days later.

The conversations exchanged between the principal investigator and the students, as well as among the students during the group-interview session, will only be used for the purpose of this study. Only the investigator of this study has access to the data recorded during the observation and interview, which will be stored in a secure location in the investigator's office in the Hunter McDaniel Building.

If you agree to be observed and interviewed as part of the study, please give us your permission by signing this form. Should you have any questions or concerns regarding the techniques or procedures, please contact the principal investigator at the email address above. Thank you!

Participant
I, ______________________ (Print Name), understand the information given to me. I have received answers to any questions I may have had about the techniques and procedures indicated in this permission request form.

__________________________________ ______________________
Signature                           Date

Principal Investigator: I certify that the informed consent procedure has been followed and that I have answered any questions from the participant above as fully as possible.

__________________________________ _______________________
Signature                           Date


Appendix N: Student Responses

Simulation qualitative questionnaire responses

Question 1: In what ways do you think the simulation program was effective as a tool for conducting the laboratory experiments?

It has a more practical approach when a simulation program is used.
It helps have a better understanding of what is being taught.
Doing the simulation would help clear any questions or help the student get a better understanding.
I think the visual part of the simulation was an extremely great tool.
It allows you to get hands-on experience.
It reinforces the ideas that we learned in class.
It helps make what you are doing more fun. I think we would need more of it in the future.
It made it simpler, easier and less messy.
Seeing the graph and plugging in numbers.
It is effective because you learn the same concepts in less time, and if you have a problem, you can always go back and try again. You don't even have to be in the lab to do it.
I like working with computers, so I found it to be an effective tool for the purpose of these specific lab experiments.
It helped conduct the experiments faster, easier and more efficiently.
It helped understand the lecture better.
It was a quicker way to complete a lab.
It was time-consuming.
Gave me a better grasp of the concepts. Very visual.
The simulation gave a visual of the lecture in class.
The simulation shows greater details and it reinforces the lecture.
It helps you see the process. I have to get used to it.

Question 2: Do you think that simulation programs could be a substitute for physical laboratory activities? Explain.

Yes. But I think that we should be exposed to both.
No. I think physical labs are more effective, but a simulation along with a physical lab, or having a section be the simulation, would help the students more.
Yes. But not for all engineering subjects.
Yes. It is more interesting than a physical lab.


Appendix N (Continued)

Yes.
Yes. It is more helpful.
Yes, I found it as effective as the physical lab.
Yes. You know you have the correct output and then you can analyze it.
Yes. Because you learn quickly and efficiently.
Yes. I think that it would improve the understanding of the concept.
No, I like to work in a lab with real equipment.
Yes. I actually don't enjoy sitting in the lab; I'd rather do it at home or at my own time.
No. Physical laboratory is more hands-on.
I say yes and no, because a physical laboratory lets you interact with the material more than a simulation. But I also like the simulation, just not for every class.
Eliminates lots of work. For visual projects. But not hands-on.
Yes, it could be a substitute; it would be more accurate. Physical laboratories are boring and are not as straightforward as simulation programs.
Yes. It felt like a hands-on lab. It is much faster.

Question 3: Do you feel the simulation program would be beneficial to online students?

Yes. If they have good observation skills, it will be very helpful.
Yes. Because online students would at least have some type of practice besides the textbook or illustrations and printouts.
Yes. It will be a helpful tool for long-distance learners.
Yes. To get practical knowledge.
Yes.
Yes. It is a good tool if there are no other alternatives.
Yes. It gives you the same experience.
Yes. Because they get to do the same experiments without being present in the lab.
Yes. I think it will be beneficial to the online students because it will give them some practical experience. Seeing is always much better than reading or listening.
Yes. It will benefit those who are unable to come to class.
Yes. Because they can do it at home.
Yes. I like to work on my own.
Yes. If you can do it on the computer then it is more beneficial. It provides more opportunities to more people who cannot attend class.


Appendix N (Continued)

Yes. If they have no way of doing the physical experiments.
Yes. It is like doing lab outside of lab.
Yes. In an online setting you can do things with the simulation that you cannot learn otherwise.

Question 4: Should the simulation program be incorporated into an online communication systems course?

Yes. It will help online students keep up with the updated technology.
Yes.
Yes.
Yes. To get the same experience.
Yes.
Yes.
Yes. It is very easy to use.
Yes. Because it allows you to analyze the data.
Yes. Convenience.
Yes. Then more people may be motivated to take online engineering courses.
Yes. It will help the students learn how the communication system works.

Question 5: Should the simulation program be used instead of the physical laboratory for the communication systems course?

Yes. The industry will be using simulation programs anyway, so why not start using them at school.
No. Same answer as #2.
No. It should be combined.
Yes. It gets the students more involved.
Yes.
Yes. It is much easier to use.
Yes. Less time-consuming.
No. I believe in hands-on experiments.
No. I like to do more hands-on.
No. I like doing things more hands-on. But it would be better if I could do it at home. Nothing can replace a real experience. Both should be used.
Yes. The simulation lab is faster.
No. Physical laboratory is more hands-on. But for this subject, I would choose the simulation. Just not in general.


Appendix N (Continued)

Question 6: If you had a choice for conducting similar experiments, which would you choose?

Both. I am not sure, because I think we need experience in both labs.
Both. I truly like to have some traditional labs along with the simulation.
Simulated laboratory.
Simulated laboratory. More interesting.
Simulated laboratory. It is easier to use.
Simulated laboratory.
Simulated laboratory. It motivates more to actually do the labs.
Both. Simulated laboratory would be good if no other choice was available.
Simulated laboratory. But it really depends on the subject that is being studied too. For some subjects, you have to have hands-on activities.
It is hard to say. For these particular experiments, I would like to try both to see which one I like better.
Traditional lab. I like hands-on. I also like group experiments.
Hard to say.
Traditional lab.
Traditional lab.
Simulated laboratory is easier for me to understand.
Simulated laboratory because it is easier to understand.
Simulated laboratory. It is easy to use.
I'd rather learn things hands-on.

Question 7: Did you have any problems with using the simulation software?

Not at all. I was able to adjust to it and understand it as well.
No. It was very easy to use.
No!!
No.
No.
No.


Appendix N (Continued)

Exit survey responses

Question 1: Did your laboratory experience improve your ability to answer the questions on this exam? If so, please do your best to explain how.

Yes, this laboratory experience has shown me what kind of waveform is produced, as far as its shape.
Yes, because during the lab I had a visual aid and sound effects to help analyze each parameter change.
Yes, working with waves previously definitely helped my ability to answer, by helping me have a mental picture of the wave in my head.
Yes. I learned the concepts by doing the experiments.
Yes, it reinforced the concepts that were discussed in class.
Yes, I think the combination of laboratory and classroom teaching will better me in understanding the concepts.
Yes, it helped me to see the differences between the two signals and how they change over frequency, etc.
No. Because I was still unable to graph problems 6, 7 & 9.
I think answering the questions is easier when you do the lab first, because then you have a mental image.
Yes, because I actually had a visual showing the waves during modulation and demodulation during AM/FM radio, which helped me understand things better.
Not much.
Yes, it did improve my ability to answer questions, because while answering the questions I could see the simulation and the graphs in my mind.
Yes, because I could see how the waves changed.
Yes, I thought of the waves when I was answering the questions and related them to the concepts learned during lecture in class.
Yes, because it is visual.
Yes. Because it helped me understand frequency and amplitude.
Yes. But some things were still not clear.
No!


About the Author

Giti Javidi received a Bachelor of Science degree in Computer Science at the University of Central Oklahoma (UCO) and a Master of Science degree in Computer Science at the University of South Florida (USF). Prior to entering the Ph.D. program at USF, Giti Javidi worked as a Software Engineer at IBM, Tampa, FL, for four years, and taught Computer Science courses as an adjunct faculty member at USF's St. Petersburg and Lakeland campuses. After entering the Ph.D. program, she continued to teach full time as a faculty member in Computer Information Systems (CIS) at Tampa Technical Institute, Tampa, FL, for two years. Then, in order to gain experience in the area of Distance Education, she accepted a position at Educational Outreach, USF, for two years as the lead Multimedia and Instructional Designer. Prior to completion of her dissertation, Giti Javidi moved to Morehead, Kentucky, with her husband and two kids, where she held the position of Assistant Professor of Computer Information Technology at Morehead State University for one year. A year later, she and her family moved to Mississippi, where she served as a Visiting Faculty member in Engineering Education at the University of Southern Mississippi for another year. As of August 2004, Giti Javidi has been an Assistant Professor of Computer Science at Virginia State University.