USF Libraries
USF Digital Collections

Personal epistemological growth in a college chemistry laboratory environment


Material Information

Title:
Personal epistemological growth in a college chemistry laboratory environment
Physical Description:
Book
Language:
English
Creator:
Keen-Rocha, Linda S
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:

Subjects

Subjects / Keywords:
Chemistry education
Laboratory instruction
Microcomputer-based
Pedagogy
Intellectual development
Dissertations, Academic -- Science Education -- Doctoral -- USF   ( lcsh )
Genre:
non-fiction   ( marcgt )

Notes

Summary:
ABSTRACT: The nature of this study was to explore changes in beliefs and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) for future research. This research employed a mixed methodology, foregrounding qualitative data. The total population consisted of 56 students enrolled in several sections of a general chemistry laboratory course, with the qualitative analysis focusing on the in-depth interviews. A quantitative NOS and epistemological beliefs measure was administered pre- and post-instruction. These measures were triangulated with pre-post interviews to assure the rigor of the descriptions generated. Although little quantitative change in NOS was observed from the pre-post NSKS assessment, a more noticeable qualitative change was reflected by the participants during their final interviews. The NSKS results: the mean gain scores for the overall score and all dimensions, except for amoral, were found to be significant at p ≤ .05. However, there was a more moderate change in the population's broader epistemological beliefs (EBAPS), which was supported during the final interviews. The EBAPS results: the mean gain scores for the overall score and all dimensions, except for the source of ability to learn, were found to be significant at p ≤ .05. The participants identified the laboratory work as the most effective instructional feature, followed by the post-laboratory activities. The pre-laboratory was identified as the least effective feature. The participants suggested the laboratory work offered real-life experiences, group discussions, and teamwork, which added understanding and meaning to their learning. The post-laboratory was viewed as necessary in tying all the information together and being able to see the bigger picture. What one cannot infer at this point is whether these belief changes and beliefs about laboratory instruction are enduring or whether some participants are simply more adaptable than others to the learning environment. More research studies are needed to investigate the effects of laboratory instruction on student beliefs and understanding.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2008.
Bibliography:
Includes bibliographical references.
System Details:
Mode of access: World Wide Web.
System Details:
System requirements: World Wide Web browser and PDF reader.
Statement of Responsibility:
by Linda S. Keen-Rocha.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains 498 pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
aleph - 002000962
oclc - 319169334
usfldc doi - E14-SFE0002522
usfldc handle - e14.2522
System ID:
SFS0026839:00001




Full Text

PAGE 1

Personal Epistemological Growth in a College Chemistry Laboratory Environment

by

Linda S. Keen-Rocha

A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy
Department of Science Education
College of Education
University of South Florida

Major Professor: Dana L. Zeidler, Ph.D.
Kathy Carvalho-Knighton, Ph.D.
Elaine Howes, Ph.D.
Kofi Marfo, Ph.D.
Noreen Poor, Ph.D.

Date of Approval: May 9, 2008

Keywords: chemistry education, laboratory instruction, microcomputer-based, pedagogy, intellectual development, student images

Copyright 2008, Linda S. Keen-Rocha

PAGE 2

Dedication

I dedicate this dissertation to the most important people in my life.

To my husband and best friend, Michael Rocha: for your encouragement, love, support, and understanding.

To my daughters, Jennifer and Heather: for your friendship, hugs, love, and support.

To my dad and mom, Lawrence and Myrtle Keen: for encouraging me to be the best I could be.

To my sister, Debbie Allen: for being not only my sister, but a friend.

To the memory of my grandmother, Leah Keen: for your unconditional love, inspiration, and support.

PAGE 3

Acknowledgements

I wish to thank my committee members, Dr. Kathy Carvalho-Knighton, Dr. Elaine Howes, Dr. Kofi Marfo, and Dr. Noreen Poor for their feedback, support, and guidance through this dissertation.

Thanks to Dr. Kathy Carvalho-Knighton for providing me with a multitude of opportunities and experiences as a graduate research and teaching assistant in chemistry and chemistry education.

Foremost, I wish to thank my major professor and mentor, Dr. Dana Zeidler. His faith in my abilities, insights, and guidance have been indispensable to my education.

I thank all the teachers who have guided me in my learning experiences from Florida, Hawaii, Maryland, and Virginia. I express my gratitude for these individuals, who opened up the world by instilling in me a passion for learning, thinking, and knowledge.

A special thanks to Kristy Loman Chiodo for performing the interviews for this study, as well as to the students who participated in this study.

Finally, I thank my family (Michael, Jennifer, and Heather), my sister (Debra A.), and friends (Loretta H., Chyrisse T., and Linda M.) who supported all my efforts and gave me encouragement until the very end. And most of all, thanks to my two grandsons, Rod and Jake Eason, for all the hugs, kisses, and laughter you gave freely during the last few years of this experience.

PAGE 4

Table of Contents

List of Tables ..... vii
List of Figures ..... xi
Abstract ..... xii

Chapter One: The Problem ..... 1
    Introduction ..... 1
    Nature of the Study ..... 4
    Research Issues ..... 8
        Nature of Personal Epistemology ..... 8
        Development of Personal Epistemology ..... 9
        Constructivist Manner and Cognitive Disequilibrium ..... 10
        Nature of Science ..... 14
        Nature of Students' Images of Science ..... 14
        Nature of Learning Chemistry in the Laboratory ..... 15
    Problem Statement ..... 17
    Definitions ..... 20
    Possible Links Between PEB and NOS ..... 21
    Research Questions ..... 22
        Question 1 ..... 22
            Rationale ..... 22
        Sub-Question 1a ..... 24
            Rationale ..... 24
        Sub-Question 1b ..... 25
            Rationale ..... 25
        Question 2 ..... 26
            Rationale ..... 27
        Sub-Question 2a ..... 28
            Rationale ..... 28
        Sub-Question 2b ..... 30
            Rationale ..... 30
    Significance of the Study ..... 31
    Summary ..... 32

Chapter Two: Literature Review ..... 36
    Introduction ..... 36
    Models of Epistemological Development ..... 37
        Epistemological Intellectual Development ..... 37
        Perry's Model ..... 38
        Women's Ways of Knowing ..... 39

PAGE 5

        King-Kitchener Model of Reflective Judgment ..... 42
        Baxter-Magolda's Model of Epistemological Reflection ..... 44
        Kuhn's Model of Reasoning Skills ..... 48
    Multidimensional Models of Epistemological Beliefs ..... 50
        Epistemological Beliefs ..... 50
        Schommer-Aikins System of Independent Beliefs ..... 50
        Hofer and Pintrich's Epistemological Theories Model ..... 52
    Nature of Science ..... 57
        Defining the Nature of Science ..... 57
        Students' Images of Science ..... 57
        Student Understanding of the Nature of Science ..... 58
        Measuring the Understanding of the Nature of Science ..... 59
        Connections between the Nature of Science and Epistemology ..... 61
        Eliciting and Developing Students' Understanding of NOS ..... 61
    Research Methodology Issues ..... 63
        Personal Epistemological Beliefs Assessments ..... 64
        Personal Epistemological Beliefs in Science Assessments ..... 68
        Nature of Science Assessments ..... 72
    Applicability to College Science Education ..... 77
        Epistemological Orientations in the Sciences ..... 77
        Assessing Epistemological Levels in the Classroom ..... 80
        Promoting Epistemological Growth ..... 81
        Learning Tasks – Variety and Choice ..... 85
        Expectations – Communicating and Explaining ..... 86
        Modeling and Practice ..... 88
        Constructive Feedback ..... 92
        Learner-Centered Environment ..... 93
        Respecting Student Development Levels ..... 98
    The Laboratory in Chemistry Education ..... 100
        Introduction ..... 100
        Nature of Laboratory Instruction ..... 101
        Developmental Positioning of Chemistry Laboratory Instruction ..... 102
        Laboratory Instructional Methods ..... 104
        Laboratory Pedagogical Approaches ..... 106
        Pre-Laboratory ..... 107
        Personal Response System ..... 107
        Laboratory Work ..... 108
        Microcomputer-Based Laboratory Instruction ..... 109
        Post-Laboratory ..... 111
    Summary ..... 111

Chapter Three: Methods ..... 115
    Introduction ..... 115
    Research Questions ..... 120
        Elaboration of Research Questions ..... 120
    Context and Participants ..... 124
        Setting ..... 124
        Population Sample ..... 124

PAGE 6

    Research Instruments-Measures ..... 125
        Chemical Concepts Inventory ..... 125
        Personal Epistemological Beliefs Assessment ..... 126
        Nature of Scientific Knowledge Scale ..... 130
        Student Reflective Assessment of Laboratory Methods ..... 133
    Chemistry Laboratory Course Description ..... 135
        Introduction ..... 135
    Organization of Laboratory Instruction ..... 137
        Introduction ..... 137
        Pre-Laboratory Course Activities ..... 140
        Laboratory Work Course Activities ..... 143
        Post-Laboratory Course Activities ..... 145
    Data Collection ..... 148
        Researcher's Role ..... 148
        Phase One: Quantitative ..... 150
        Phase Two: Qualitative ..... 150
        Phase Three: Quantitative and Qualitative ..... 151
        In-Depth Semi-Structured Interviews ..... 151
    Summary of Data Collection ..... 155
        Introduction ..... 155
        Instruments ..... 156
        Semi-Structured Interviews ..... 156
    Data Analysis ..... 157
        Introduction ..... 157
        CCI Analysis ..... 158
            Quantitative ..... 158
        EBAPS Analysis ..... 159
            Quantitative ..... 159
            Qualitative ..... 160
        NSKS Analysis ..... 160
            Quantitative ..... 160
            Qualitative ..... 161
        Semi-Structured Interviews ..... 161
    Reliability and Validity in Qualitative Research ..... 162
        Introduction ..... 162
        Trustworthiness ..... 163
        Credibility ..... 163
        Applicability ..... 165
        Dependability ..... 165
        Confirmability ..... 166
    Summary ..... 166

Chapter Four: Quantitative Findings ..... 168
    Introduction ..... 168
    Characterization of Participants' Epistemological and NOS Beliefs ..... 169
    Research Question One and Sub-Questions ..... 169
    Description of Participants ..... 170
    Chemical Concepts Inventory Results ..... 170
    Epistemological Beliefs Assessment Physical Sciences Results ..... 172

PAGE 7

        Descriptive EBAPS Statistics – All Participants ..... 172
        EBAPS T-Test Results – All Participants ..... 175
        EBAPS Correlations – All Participants ..... 177
        EBAPS Results Interview Participants ..... 179
        Descriptive Statistics – Interview Participants ..... 179
        EBAPS T-Test Results – Interview Participants ..... 182
        EBAPS Correlations – Interview Participants ..... 184
    Nature of Scientific Knowledge Results ..... 187
        Descriptive NSKS Statistics – All Participants ..... 188
        NSKS T-Test Results – All Participants ..... 191
        NSKS Correlations – All Participants ..... 194
        Descriptive NSKS Statistics – Interview Participants ..... 195
        NSKS T-Test Results – Interview Participants ..... 198
        NSKS Correlations – Interview Participants ..... 201
    Discussion ..... 203
        Range of Initial Beliefs ..... 203
            RQ1 ..... 203
        Changes in NOS Beliefs ..... 206
            RQ1a ..... 206
        Changes in Personal Epistemological Beliefs ..... 209
            RQ1b ..... 209
    Summary ..... 212

Chapter Five: Development of Epistemological Beliefs ..... 215
    Introduction ..... 215
    Method of Analysis ..... 216
    Summary of EBAPS Overall Scores ..... 218
    Summary of EBAPS Interview Scores ..... 219
    Characterization of Epistemological Beliefs ..... 220
    Initial and Final Epistemological Beliefs Interviews ..... 221
    Responses to the Personal Epistemological Beliefs Probes ..... 222
        Structure of Scientific Knowledge ..... 222
        Nature of Knowing and Learning Science ..... 228
        Real-Life Applicability of Science ..... 235
        Evolving Scientific Knowledge ..... 240
        Source of Ability to Learn Science ..... 251
    Discussion ..... 257
        Changing Epistemological Beliefs ..... 257
            RQ1 ..... 257
            RQ1b ..... 260
    Summary ..... 269

Chapter Six: Development of NOS Beliefs ..... 272
    Introduction ..... 272
    Method of Analysis ..... 272
    Summary of NSKS Overall Scores ..... 273
    Summary of NSKS Interview Scores ..... 276
    Characterization of Nature of Science Beliefs ..... 277
    Initial and Final NOS Beliefs Interviews ..... 278

PAGE 8

    Responses to the Initial and Final NOS Beliefs Probes ..... 279
        Creative Dimension ..... 279
        Developmental Dimension ..... 284
        Parsimonious Dimension ..... 289
        Testable Dimension ..... 294
    Final NOS Interviews ..... 299
    Discussion ..... 305
        Changing NOS Beliefs ..... 305
            RQ1 ..... 305
            RQ1a ..... 309
    Summary ..... 316

Chapter Seven: Laboratory Instructional Features ..... 319
    Introduction ..... 319
    Method of Analysis ..... 319
    Characterization of Participants' Reflection of Laboratory Instruction ..... 321
    Participant Reflections of Laboratory Instruction ..... 324
    Reflective Comments of Laboratory Instructional Preferences ..... 326
    Final Interview Discussion of Instructional Methods ..... 329
        Final Interview Questions One and Two ..... 329
            Question One – Most Effective Instructional Feature ..... 330
            Question Two – Least Effective Instructional Feature ..... 333
        Final Interview Question Three – Promoting Learning ..... 337
        Final Interview Question Four – Laboratory Skills ..... 338
        Final Interview Question Nine – Laboratory Notebook ..... 340
        Final Interview Question Ten – Scientific Analysis ..... 342
        Reflections of Pre-Post Laboratory Experiences ..... 344
        Reflective Assessment – Bloom's Taxonomy ..... 348
        Reflections Laboratory Learning – Bloom's Taxonomy ..... 350
        Final Interview Question Eleven – Bloom's Taxonomy ..... 355
    Characterization of Participants' Epistemological Reflections ..... 359
    Epistemology and Instructional Methods ..... 359
    Final Interviews – Epistemological Beliefs and Instructional Methods ..... 362
        Structure of Scientific Knowledge ..... 363
        Nature of Knowing and Learning Scientific Knowledge ..... 365
        Real-Life Applicability of Scientific Knowledge ..... 368
        Evolving Scientific Knowledge ..... 370
        Source of Ability to Learn Scientific Knowledge ..... 373
    Characterization of Participants' NOS Reflections ..... 375
        NOS and Instructional Methods ..... 375
    Final Interview NOS Beliefs and Instructional Methods ..... 377
    Discussion ..... 379
        Essential Laboratory Pedagogy ..... 379
            RQ2 ..... 379
        Epistemological Beliefs and Laboratory Pedagogy ..... 381
            RQ2a ..... 381
        NOS Beliefs and Laboratory Pedagogy ..... 383
            RQ2b ..... 383
    Summary ..... 384

PAGE 9

Chapter Eight: Conclusions ..... 386
    Introduction ..... 386
    Overview of Dissertation ..... 386
    Major Findings of Study ..... 396
        Question One ..... 398
            RQ1 ..... 398
        Sub-Question 1a ..... 400
            RQ1a ..... 400
        Sub-Question 1b ..... 402
            RQ1b ..... 402
        Question Two ..... 404
            RQ2 ..... 404
        Sub-Question 2a ..... 407
            RQ2a ..... 407
        Sub-Question 2b ..... 409
            RQ2b ..... 409
    Limitations ..... 410
    Further Research ..... 412

References ..... 414

Appendices ..... 430
    Appendix A: Chemical Concepts Inventory ..... 431
    Appendix B: Epistemological Beliefs Assessment ..... 438
    Appendix C: Nature of Scientific Knowledge Scale ..... 445
    Appendix D: Initial Laboratory Work Questionnaire ..... 449
    Appendix E: Student Evaluation of Laboratory Instruction ..... 451
    Appendix F: Potential Interview Formats/Scripts ..... 456
    Appendix G: Sample Laboratory Work ..... 467
    Appendix H: Sample Pre-laboratory Activities ..... 470
    Appendix I: Keeping a Laboratory Notebook ..... 474
    Appendix J: Sample Pre-laboratory Discussion Activities ..... 476
    Appendix K: General Overview of Laboratory Reports ..... 481
    Appendix L: Consent Form ..... 485
    Appendix M: Chemical Concepts Inventory Key ..... 489
    Appendix N: EBAPS Scoring Scheme ..... 490
    Appendix O: NSKS Scoring Procedures ..... 497
    Appendix P: CCI-EBAPS-NSKS Interview Participants' Scores ..... 498

About the Author ..... End Page

PAGE 10

List of Tables

Table 1. Unidimensional Models of Epistemological Beliefs ..... 38
Table 2. Pedagogical Applications that Facilitate Epistemological Growth ..... 84
Table 3. Learner Epistemological Views of Educational Characteristics ..... 103
Table 4. Descriptors of Laboratory Instructional Methods ..... 104
Table 5. Basic Elements of Laboratory Notebook ..... 109
Table 6. Epistemological Beliefs Assessment Physical Sciences Scale ..... 128
Table 7. EBAPS Instrument Variables ..... 129
Table 8. Nature of Scientific Knowledge Scale ..... 133
Table 9. Topics of Laboratory Instruction ..... 137
Table 10. Anticipated Laboratory Course Outcomes ..... 138
Table 11. Organization of Laboratory Instruction ..... 139
Table 12. Relationship of Data Collection to Instruction ..... 147
Table 13. Data Collection Timeline ..... 149
Table 14. Interview Probe Questions ..... 154
Table 15. Probe Questions – Unpacking Interview Terms ..... 154
Table 16. EBAPS Coding – Subscales ..... 159
Table 17. Descriptive Statistics – Chemical Concepts Inventory Scores ..... 171
Table 18. Distribution of Participants' CCI Scores ..... 171
Table 19. Descriptive Statistics – EBAPS Scores – All Participants ..... 172
Table 20. Participant Shifts Between Epistemological Beliefs Levels ..... 174
Table 21. EBAPS Score Range – Pre-Post Count ..... 174

PAGE 11

Table 22. EBAPS T-Test Analysis – All Participants ..... 176
Table 23. EBAPS Paired Samples Correlations ..... 178
Table 24. Descriptive Statistics – EBAPS Scores – Interview Participants ..... 179
Table 25. Participant Shifts Between Epistemological Belief Levels ..... 181
Table 26. EBAPS Score Range – Pre-Post Count ..... 181
Table 27. EBAPS T-Test Analysis – Interview Participants ..... 183
Table 28. EBAPS Paired Samples Correlations ..... 186
Table 29. Descriptive Statistics – NSKS Scores – All Participants ..... 189
Table 30. NSKS Assessment Range ..... 190
Table 31. NSKS Beliefs Shifts Pre-Post Assessment – All Participants ..... 191
Table 32. NSKS T-Test Analysis – All Participants ..... 192
Table 33. NSKS Paired Samples Correlations ..... 195
Table 34. Descriptive Statistics – NSKS Scores – Interview Participants ..... 196
Table 35. NSKS Score Range – Pre-Post Count ..... 198
Table 36. NSKS Beliefs Shifts Pre-Post Assessment ..... 198
Table 37. NSKS T-Test Analysis – Interview Participants ..... 199
Table 38. NSKS Paired Samples Correlations ..... 202
Table 39. NSKS Percent Change ..... 206
Table 40. Demographic Statistics of Interview Participants ..... 217
Table 41. Descriptive Statistics EBAPS Scores – All Participants ..... 218
Table 42. Descriptive EBAPS Statistics – Interview Participants ..... 220
Table 43. EBAPS – Structure of Scientific Knowledge – Pre-Post Statistics ..... 224
Table 44. Participant Reflections – Structure of Scientific Knowledge ..... 226
Table 45. EBAPS – Nature of Knowing-Learning – Pre-Post Statistics ..... 230
Table 46. Participant Reflections – Nature of Knowing-Learning ..... 232

PAGE 12

Table 47. EBAPS – Real-Life Applicability of Science – Pre-Post Statistics ..... 237
Table 48. Participant Reflections – Real-Life Applicability of Science ..... 238
Table 49. EBAPS – Evolving Scientific Knowledge – Pre-Post Statistics ..... 243
Table 50. Participant Reflections – Evolving Scientific Knowledge ..... 245
Table 51. Descriptive EBAPS Statistics – Source of Ability to Learn Science ..... 254
Table 52. Participant Reflections – Source of Ability to Learn Science ..... 255
Table 53. Demographic Statistics – Interview Participants ..... 274
Table 54. Descriptive Statistics NSKS Scores – All Participants ..... 275
Table 55. Descriptive NSKS Statistics – Interview Participants ..... 277
Table 56. Descriptive NSKS Statistics – Creative Dimension ..... 282
Table 57. Participants' Interview Reflections – Creative ..... 283
Table 58. Descriptive NSKS Statistics – Developmental Dimension ..... 286
Table 59. Participants' Interview Reflections – Developmental ..... 288
Table 60. Descriptive NSKS Statistics – Parsimonious Dimension ..... 291
Table 61. Participants' Interview Reflections – Parsimonious ..... 293
Table 62. Descriptive NSKS Statistics – Testable Dimension ..... 296
Table 63. Participants' Interview Reflections – Testable ..... 298
Table 64. Final Interviews – Nature of Science ..... 303
Table 65. Demographic Statistics – Interview Participants ..... 322
Table 66. Descriptive Statistics – Interview Participants' Scores ..... 323
Table 67. Participants' Laboratory Instructional Preferences ..... 325
Table 68. Interview Participants' Laboratory Instructional Preferences ..... 326
Table 69. Participants' Reflections – Instructional Methods ..... 328
Table 70. Final Interview – Laboratory Instructional Features ..... 330
Table 71. Participants' Reflections – Effective Instructional Methods ..... 332

PAGE 13

Table 72. Participants' Reflections – Least Effective Instructional Methods ..... 335
Table 73. Interview Participants' Reflections – Promoting Learning ..... 338
Table 74. Interview Participants' Reflections – Laboratory Skills ..... 339
Table 75. Interview Participants' Reflections – Laboratory Notebook ..... 341
Table 76. Interview Participants' Reflections – Scientific Analysis ..... 343
Table 77. Reflections Pre-Post Laboratory Experiences Statements ..... 345
Table 78. Participant Assessment of Laboratory Cognitive Domains ..... 349
Table 79. Laboratory Activities in Terms of Bloom's Taxonomy ..... 350
Table 80. Participants' Reflections on Cognitive Domains ..... 355
Table 81. Descriptive Bloom's Taxonomy Statistics – Interview Participants ..... 357
Table 82. Interview Participants' Reflections – Bloom's Taxonomy ..... 358
Table 83. Participants' Reflections – Epistemology – Instructional Methods ..... 360
Table 84. Instructional Feature – Structure of Scientific Knowledge ..... 363
Table 85. Structure of Scientific Knowledge – Instructional Methods ..... 364
Table 86. Instructional Feature – Nature of Knowing and Learning Science ..... 365
Table 87. Nature of Knowing and Learning Science – Instructional Methods ..... 367
Table 88. Instructional Feature – Real-Life Applicability Scientific Knowledge ..... 368
Table 89. Real-Life Applicability – Instructional Methods ..... 369
Table 90. Instructional Feature – Evolving Scientific Knowledge ..... 370
Table 91. Evolving Scientific Knowledge – Instructional Methods ..... 372
Table 92. Instructional Feature – Source of Ability to Learn ..... 373
Table 93. Source of Ability to Learn – Instructional Methods ..... 374
Table 94. Participants' Reflections NOS – Instructional Methods ..... 376
Table 95. Instructional Feature – NOS Beliefs ..... 377
Table 96. Interview Participants' NOS Reflections – Instructional Methods ..... 378

PAGE 14

List of Figures

Figure 1. Graphic Summary of Personal Epistemology ..... 10
Figure 2. Graphic Summary of Pedagogical Factors Connected to Students' Epistemological Theories ..... 13
Figure 3. Graphic Summary of Pedagogical Applications that Facilitate Epistemological Growth ..... 85
Figure 4. Overview of the Organization of Chapter 3 ..... 118
Figure 5. General Context and Measures Overview ..... 119
Figure 6. NSKS Representative Placement Scale ..... 188
Figure 7. NSKS Beliefs – Range Scale ..... 278

PAGE 15

Personal Epistemological Growth in a College Chemistry Laboratory Environment

Linda S. Keen-Rocha

ABSTRACT

The nature of this study was to explore changes in beliefs and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) for future research. This research employed a mixed methodology, foregrounding qualitative data. The total population consisted of 56 students enrolled in several sections of a general chemistry laboratory course, with the qualitative analysis focusing on the in-depth interviews. A quantitative NOS and epistemological beliefs measure was administered pre- and post-instruction. These measures were triangulated with pre-post interviews to assure the rigor of the descriptions generated.

Although little quantitative change in NOS was observed from the pre-post NSKS assessment, a more noticeable qualitative change was reflected by the participants during their final interviews. The NSKS results: the mean gain scores for the overall score and all dimensions, except for amoral, were found to be significant at p ≤ .05. However, there was a more moderate change in the population's broader epistemological beliefs (EBAPS), which was supported during the final interviews. The EBAPS results: the mean gain scores for the overall score and all dimensions, except for the source of ability to learn, were found to be significant at p ≤ .05. The participants identified the laboratory work as the most effective instructional feature, followed by the post-laboratory

PAGE 16

activities. The pre-laboratory was identified as being the least effective feature. The participants suggested the laboratory work offered real-life experiences, group discussions, and teamwork, which added understanding and meaning to their learning. The post-laboratory was viewed as necessary in tying all the information together and being able to see the bigger picture.

What one cannot infer at this point is whether these belief changes and beliefs about laboratory instruction are enduring or whether some participants are simply more adaptable than others to the learning environment. More research studies are needed to investigate the effects of laboratory instruction on student beliefs and understanding.

PAGE 17

Chapter One: The Problem

Introduction

There is growing recognition in educational and psychological research regarding how learners' epistemologies play an important role in helping them construct knowledge. Epistemology, the study of knowing and knowledge, has been one of the major foundations of the philosophy of science education. Amid the fundamentals of epistemological research are questions relating to the nature and form of human knowledge and about the processes by which such knowledge is verified.

Science students, science educators, and scientists hold different images of learning science. Many of their own ideas about science and the construction of scientific knowledge differ. These differences are observed more often by students when engaged in learning environments in the physical sciences such as chemistry and physics. The most effective pedagogical techniques used in learning chemistry are those that create a cognitive conflict with an inadequate mental model held by a learner, leading to dissatisfaction with his or her current view. As learners move from secondary school through college, they experience a developmental progression in their attitudes toward knowing, learning, and teaching. Therefore, it is important for college science faculty, in their roles as instructors, to assume a new level of responsibility for understanding the various dimensions of epistemological beliefs of their students, as well as what beliefs they hold themselves. Pedagogical techniques designed to help science students attain the intellectual maturity they will need to function effectively as

PAGE 18

science professionals must attend to and promote the epistemological development of the learner.

Facilitating meaningful learning in college science education contexts has been the focus of many research studies, particularly within the body of literature concerning student learning. The image that researchers have about knowledge and knowing centers on a range of research avenues that include the following: epistemological beliefs (Schommer, 1990), epistemological theories (Hofer & Pintrich, 1997), reflective judgment (King & Kitchener, 1994), and epistemological reflection (Baxter Magolda, 2004). These areas are part of a larger body of research categorized as “personal epistemology” (Hofer & Pintrich, 2002).

The field of “personal epistemology” examines what learners believe about how knowing occurs, what counts as knowledge, where knowledge resides, how knowledge is constructed, and how knowledge is evaluated (Hofer, 2004). An extensive body of research indicates that educators need to focus on how epistemological beliefs influence student learning. Learning always requires the development of an epistemological perspective about the content within the context of a certain domain of knowledge (e.g., science). Epistemology as defined by Hofer and Pintrich (1997) concerns the nature and justification of human knowledge, while epistemological beliefs denote “the theories and beliefs they hold about knowing, and the manner in which such epistemological premises are part of and an influence on the cognitive processes of thinking and reasoning.”

Students have a range of images of science, also referred to as Nature of Science (NOS) beliefs. Abd-El-Khalick and Akerson (2004) suggest that students' understanding of the NOS is impacted by their personal epistemological beliefs, also known as worldview beliefs. Students' learning of the NOS is often mediated by motivational,

PAGE 19

cognitive, and worldview factors. Lederman (1998) defines NOS as the characteristics of the scientific enterprise that are accessible and relevant to one's everyday life, including the following aspects: creativity, culture, empirical basis, tentativeness, being theory-based, and being socially embedded. Therefore, learners' personal epistemology about the nature of scientific knowledge and knowing can be their domain-specific epistemology of science (Hogan, 2000). Ryder, Leach, and Driver (1999) studied undergraduate science students' images of science and suggested three main epistemological positions concerning the NOS: knowledge claims as description; knowledge claims as distinct from data, yet provable; and knowledge claims as going beyond the data. The range of images presented by science learners can offer a profile of the epistemological and sociological reasoning of each individual. Epistemological belief systems have been shown to affect students' conceptual understanding of how science connects to real-world problems that are embedded in socioscientific issues (Ryder et al., 1999; Zeidler, Walker, Ackett, & Simmons, 2001). Students have had a wide range of exposure to science, including K-12 education, undergraduate science, interactions with science instructors, televised scientific documentaries, and scientific issues reported through various forms of news media. These experiences with science give students episodic knowledge about science. According to Ryder et al. (1999), from a social reasoning perspective these episodic experiences of the world of science will form the basis of external and internal dialogue about science through which student images of science are constructed, sustained, and changed. In other words, depending on the context, the learner will draw on different forms of reasoning.

The remainder of this chapter presents the problem statement and the nature of the study, as well as introduces concepts and issues central to the research: the nature and development of personal epistemology, the role of student images of science, the nature

PAGE 20

of chemistry learning, the possible link between personal epistemology and NOS, the role of the laboratory instructional environment, and research methodology issues. In addition, the research questions are presented, followed by the study's significance for chemistry education research.

Nature of the Study

The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research. This study used a semi-naturalistic mixed-methods approach to investigate the following: whether students' personal epistemological and nature of science (NOS) beliefs change by the completion of a semester general chemistry laboratory course, and what laboratory pedagogical practices (pre-lab, laboratory work, or post-lab) students believe were essential to their understanding of the laboratory material. In addition, the study examined what laboratory pedagogical practices students believe influenced their personal epistemological and/or NOS beliefs.

The consensus among researchers is that quantitative and qualitative research, also known as mixed-methods research, can complement each other by providing richer insights and raising more interesting questions for future research than if only one method is considered (Gall, Borg, & Gall, 2003). By definition, mixed-methods research is research in which the researcher combines qualitative and quantitative techniques to answer research questions when the constructs and their measures can be specified in advance of data collection, but also uses qualitative methods to discover additional constructs that are relevant to the study's goals.

A mixed-methods approach to evaluation can increase both the reliability and validity of evaluation data. The validity of results can be strengthened by using more

PAGE 21

than one method to study the same phenomenon. This approach, called triangulation, is considered the main advantage of the mixed-methods approach.

A search of academic databases or the Internet would identify a variety of studies in the behavioral, educational, health, and social sciences that utilize a mixed-methods approach (Tashakkori & Creswell, 2007). These studies are considered “mixed” because they utilize qualitative and quantitative methods in one or more of the following ways: (1) two types of research questions (with both methods); (2) two types of data collection procedures (e.g., surveys and interviews); (3) two types of data (e.g., numerical and textual); (4) two types of data analysis (e.g., statistical and thematic); and (5) two types of conclusions (e.g., emic and etic representations, “objective and subjective,” etc.) (Libarkin & Kurdziel, 2002; Tashakkori & Creswell, 2007).

There were three data collection phases for this study, which are described in the methodology section. In the first phase, data was collected from the participants using a quantitative assessment to determine the participants' current understanding of chemistry knowledge, as well as surveys to determine their current personal epistemological beliefs about the physical sciences, current nature of science beliefs, and current beliefs about laboratory practical work. The second phase of data collection occurred during the semester course. During this phase, since the researcher was the instructor, an outside interviewer conducted the initial semi-structured interviews with volunteering participants to further examine their beliefs. In addition, the participants completed a laboratory instructional questionnaire after each laboratory experience to assess their reaction to the three broad areas of instructional methods associated with each laboratory activity (e.g., pre-laboratory, laboratory work, and post-laboratory). Data was collected regarding the participants' preferred laboratory instructional methods.

PAGE 22

The final phase of data collection occurred at the end of the semester. During this phase, the initial belief assessments concerning personal epistemological and NOS beliefs were re-administered. The data from the pre and post assessments and surveys was analyzed to determine if the participants' beliefs changed by the completion of the semester course. This was followed by an outside interviewer conducting a final semi-structured interview with those participating in the initial interviews. Data was collected regarding the participants' actual and preferred laboratory instructional method(s) and current personal epistemological and NOS beliefs.

Reliability usually measures the extent to which the results of an instrument or study would be replicated given the same sample. Reliability is an important precondition for establishing validity (Lincoln & Guba, 1985). However, the qualitative research tradition recognizes that participants and their interpretations of research instruments are dynamic. Therefore, exact replication of results is not an assumption of this study. Initial and final interviews were implemented to assist in checking the validity of the participants' scores on the EBAPS and NSKS. The initial scores of the interview participants were compared to their initial interview responses. This method was repeated with the final scores and interviews. The Cronbach alpha coefficient as well as Pearson correlations are reported and used as indicators of internal consistency and to describe the strength and direction of the linear relationship between the dimensions of each instrument.

This study was of an exploratory nature, intended to lay a foundation for focusing on more specific features of epistemological and NOS reasoning in light of specific instructional features (pre-lab, laboratory work, or post-lab) for future research. Therefore, the use of the word “growth” in the title of the dissertation may be a misnomer. It is a bit too presumptuous to infer growth patterns from two data points. The design of the study

The design of the study makes it difficult to explain the observed changes either as indicators of the general effects of instruction or of a particular form of instruction. In any event, there are not sufficient data to make definitive claims about "growth". The word "change" may be a more suitable term.

Descriptive statistics such as frequencies, means, and standard deviations were computed to summarize the participants' responses to the pre-post assessments. A paired-samples t-test (repeated measures) was used to compare the pre-post mean scores for the participants. In this dissertation, effect sizes (d) were calculated as the mean gain score (mean Time 2 minus mean Time 1) divided by the pooled standard deviation of Time 1 and Time 2. To interpret the effect size values, the following guidelines from Cohen (1988) were used: 0.20 = small effect, 0.50 = moderate effect, and 0.80 = large effect. Pearson product-moment correlation was used to determine the degree to which quantitative variables were linearly related.

The variability for the paired-samples t-test was calculated using eta squared, defined here as eta squared = t² / (t² + N - 1). Eta squared can range from 0 to 1 and represents the proportion of variance in the dependent variable that is explained by the independent variable. To interpret the eta squared values, the following guidelines from Cohen (1988) were used: 0.01 = small effect, 0.06 = moderate effect, and 0.14 = large effect. The data analysis is discussed further in chapters three and four.
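As a concrete illustration of the calculations described above, the short Python sketch below computes a paired-samples t-test, the effect size d (mean gain divided by the pooled standard deviation of Time 1 and Time 2), and eta squared. It is only an illustrative sketch, not the analysis code used in this study: the pre/post score vectors are hypothetical, and it assumes the NumPy and SciPy libraries are available.

    # Illustrative sketch only (hypothetical data, not the study's analysis script).
    import numpy as np
    from scipy import stats

    # Hypothetical paired pre-instruction (Time 1) and post-instruction (Time 2) scores
    pre = np.array([3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.6, 2.7])
    post = np.array([3.4, 3.0, 3.6, 3.3, 3.2, 3.5, 3.9, 3.0])
    n = len(pre)

    # Paired-samples (repeated measures) t-test
    result = stats.ttest_rel(post, pre)
    t_stat, p_value = result.statistic, result.pvalue

    # Effect size d: mean gain divided by the pooled standard deviation of Time 1 and Time 2
    pooled_sd = np.sqrt((np.var(pre, ddof=1) + np.var(post, ddof=1)) / 2)
    cohens_d = (post.mean() - pre.mean()) / pooled_sd

    # Eta squared: proportion of variance in the dependent variable explained by the
    # independent variable, eta^2 = t^2 / (t^2 + N - 1)
    eta_squared = t_stat**2 / (t_stat**2 + n - 1)

    print(f"t({n - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
    print(f"d = {cohens_d:.2f}, eta squared = {eta_squared:.2f}")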

8 Research Issues Nature of Personal Epistemology Personal epistemology has its origins in the theori es of cognitive development and the studies of student intellectual development (Hofer, 2004). Over the last twentyfive years, researchers have conceptualized persona l epistemology in two ways: as a cognitive developmental process that proceeds in a patterned, one-dimensional, developmental sequence (Baxter Magolda, 1992; King & Kitchener, 1994) and as a belief system (Schommer, 1994; Schraw, Bendixen, & Dunkle, 2002). Those who view personal epistemology as a developmental progressio n have suggested that learners move through a developmental sequence that reflects an evolving ability to coordinate the objective and subjective aspects of knowing (Ba xter Magolda, 1992; King & Kitchener, 1994; Kuhn & Weinstock, 2002). Accordin g to Pintrich (2002), many in the field hold the belief that the construct of persona l epistemology involves the nature of knowledge and knowing. This construct includes bel iefs about (1) the certainty of knowledge, (2) the justifications for knowing, (3) the simplicity of knowledge, and (4) the source of knowledge (Bendixen & Rule, 2004). Baxte r Magolda (2004) views these beliefs as the core of personal epistemology. The overarching purpose of this study is to investigate the nature of personal epistemology in the context of the learner’s views about thinking and beliefs about knowledge and know ing in science in general, and chemistry in particular. Figure 1 presents a graph ic organizer of the major themes related to core epistemological beliefs which are a ddressed in this section, and relevant to the main focus of this study.

9 Development of Personal Epistemology Over Time Since the 1960s, numerous studies have presented co untless links between epistemological beliefs and learning (Hofer & Pintr ich, 1997; Schommer & Walker, 1997). A learner’s individual epistemological beli efs have become the focus of research in the educational, particularly the psychological literature, and mathematical and science education. Research studies indicate the m ore learners believe that knowledge is simple, certain, and handed down by an authority figure, the more likely they are to generalize complex contextual information, perform poorly on assessments, misinterpret tentative conclusions, and seek single solutions wh en multiple solutions are more suitable (Schommer, 1990). In science education i nvestigations of learners’ belief systems in relation to scientific concepts have rev ealed that held beliefs will influence learners’ behavior and processing of information wh ile other studies have demonstrated that learners’ belief systems about their failures or successes affect their effort and performance (Kuhn, Amsel, & O’Loughlin, 1988). Ana lysis of the literature suggests that epistemological beliefs are multidimensional and mu ltilayered. That is, learners possess general beliefs about knowledge, as well as beliefs about academic forms such as scientific knowledge. The nature of this study was to explore and lay a found ation for focusing on more specific features of reasoning related to persona l epistemological beliefs in light of specific science laboratory instructional feature s for future research. This study investigated the development of personal epis temological beliefs in the context of whether students’ personal epistemological beliefs of science (chemistry) change by the completion of a semester general chemistry laborato ry course.

Figure 1. Graphic summary of personal epistemology: personal epistemology has origins in cognitive development theories and has been conceptualized in two ways, as a patterned developmental sequence and as a belief system; it examines how knowing occurs, what counts as knowledge, where knowledge resides, how knowledge is constructed, and how knowledge is evaluated.

Constructivist Manner and Cognitive Disequilibrium

Personal epistemological beliefs vary from naive (novice), dualistic beliefs in the existence of fixed truths to the sophisticated (expert), relativistic beliefs that knowledge is tentative, personal, and relative to a variety of contexts (Bransford, Brown, & Cocking, 2000). The term naive (novice) is used particularly in relation to learners who have an inclination to believe that truth is certain, absolute, and transferred by an authority. The term sophisticated (expert), on the other hand, is used in relation to learners who believe that truth is relative, changing, and actively constructed by the learner.

The consensus among researchers is that personal epistemologies may develop in a constructivist manner (Hofer & Pintrich, 1997; King & Kitchener, 1994), but the actual

11 process or mechanism is undefined. Bendixen and Rul e (2004) identified cognitive dissonance and personal relevance as two potential conditions for the mechanism of epistemological change. Cognitive dissonance, a ps ychological event, refers to the uneasiness felt when a discrepancy occurs between w hat the learner already knows and new information. Therefore, dissonance occurs when there is a need to accommodate new ideas. However, if learners are called upon to learn something which contradicts what they already think they know, particularly if they are committed to that prior knowledge, they are likely to resist the new learni ng unless it has personal relevance. Under these conditions having a share in the outcom e, an interest in the topic or emotional involvement may promote epistemological b elief change. Change in epistemological beliefs takes place when learners are challenged to reconstruct naive beliefs into more sophisticated w ays of knowing (Hofer & Pintrich, 1997). Evidence from some studies suggests that edu cation influences epistemological development (Perry, 1970; Schommer, 1993) specifica lly in college curricula that exposes the learner to a variety of educational vi ewpoints. Learners who develop expertise in knowing and learning through advanced education and life experiences may be more able to see multiple perspectives and offer tentative explanations when defending their perspectives of what constitutes kn owledge and beliefs. Exposure to advanced education and life experiences may cause c ognitive conflict that results in the reconstruction of naive epistemological beliefs int o more relativistic, sophisticated beliefs about knowing (Belenky, Clinchy, Goldberger, & Tarule, 1986; Scho mmer, 1994). However, other studies suggest that the realization of a sophisticated, critically aware view toward knowledge is rare even in adulthood (King & Kitchener, 1994; Kuhn, 1991) and that an advanced education may have a smaller effect than p redicted (Hofer &

12 Pintrich, 1997 ). Figure 2 provides a general summary of pedagogical factors that are theoretically linked to students’ epistemological t heories. The nature of this study was to explore and lay a f oundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research. This study investigated what laboratory pedagogical practices (e.g., preand postlaboratory activities, laboratory work) that students believe were essential to their understanding (cognitive dissonance) during the semester general chemistry laboratory learning experience. In addition, the study examined the ex tent that the laboratory pedagogical practices (e.g., preand postlaboratory activiti es, laboratory work) the students believe influenced their personal epistemological and NOS b eliefs about science during the semester general chemistry laboratory course.

Figure 2. Graphic summary of pedagogical factors connected to students' epistemological theories (elements depicted include classroom pedagogical practices, instructors' epistemological beliefs, students' epistemological theories, learner motivation, beliefs about learning, strategy selection, meaningful learning, and knowledge acquisition and transformation).

14 Nature of Science The phrase Nature of Science (NOS) is used in discu ssing issues such as what science is, how science works, the epistemological and ontological foundations of science, how scientists operate as a social group, and how society influences and reacts to scientific endeavors (Clough, 2006). According to Khishfe and Lederman (2005) there is no consensus among scholars on a specific definition for the NOS. The NOS in general refers to the epistemology of science, scie nce as a way of knowing, or the beliefs and values associated with the development of scientific knowledge (Abd-ElKhalick & Lederman, 2000; Lederman, 1992). T he characteristics of NOS include the concepts that scientific knowledge is tentative, em pirically based, subjective (theoryladen), to a certain extent the product of human in ference, imagination, and creativity, and socially and culturally embedded. Conceptions of the NOS have changed with developmen ts in different scientific disciplines. For instance, in physics there has be en a change from the classical deterministic conceptualization to a quantum indete rministic conceptualization of the discipline. These changes in the conceptions of t he NOS have mirrored shifts in emphasis and focus in the areas of the history, phi losophy, and sociology of science. Nature of Students’ Images of Science Science students develop images of science from an early age as a result of messages communicated through daily experiences, ed ucation and the media. These images of science profile the mental representation s of science that inform a learner’s decisions about how to respond within a scientific context (Leach & Driver, 1997). At the core of students’ images of science is their belief s and understanding about the Nature of Science (NOS).

15 Nave personal images of science have been identifi ed as a major obstacle to the achievement of conceptual change in science educati on (Bransford, et al., 2000; Schommer, 1993; Songer & Linn, 1991; Thoemer & Sodi an, 2002,). Lederman (1992) concluded from a review of the literature on studen ts’ understanding of the NOS that students’ views reflect misconceptions about the na ture of scientific knowledge. The NOS is a complex and theoretical concept that invol ves reflecting on the scientific enterprise in ways not encouraged by the usual text book-based science curriculum (Bell, 2001). Students’ images of science provide reference point s that enable them to act within a scientific environment (Ryder, et al., 199 9). Students can draw on these images when discussing science and in choosing an appropri ate course of action during a scientific task. This study investigated the development of the Natu re of Science (NOS) in the context of whether students’ NOS beliefs change by the completion of a semester general chemistry laboratory course. The nature of this study was to explore and lay a foundation for focusing on more specific features o f reasoning related to NOS beliefs in light of specific science laboratory instructional features for future research. Nature of Chemistry Learning in the Laboratory Chemistry is an experimental science. The social n ature of chemistry learning is established by the human interaction that occurs in the general chemistry laboratory, just as in any research or larger scientific community. In addition to the social nature is the perspective that knowledge is not transmitted from person to person but is constructed by student interactions through self-thought and co mmunication (Driver, 1989). The actual learning of chemistry requires that student’ s converse in order to have their views

16 accepted or rejected. In addition, this learning r equires that learners listen to and analyze the views of other learners as well as the experts. Laboratory instructional environments have had a lo ng standard and central role in the science curriculum. Laboratory instruction is viewed as an important component of undergraduate chemistry education. The value of chemistry laboratory instruction has been questioned on the grounds of both cost and mea ningful learning for many years. Although it has the potential to enrich the formati on of chemistry concepts by fostering inquiry, intellectual development, manipulative ski lls, and problem-solving skills, it often fails to reach its full potential (Hofstein, 2004). Literature reviews of laboratory instructional environments have found it can be a l earning environment in which very little meaningful learning takes place (Domin, 1999 ). The instructional activities are often “cookbook” in makeup with emphasis on collecting da ta using specific, detailed procedures with expected results. Almost no attent ion is placed on planning the investigation or analyzing data in order to interpr et results. That is, students spend more time determining if they have obtained the “right” answer than actually thinking about the chemistry principles being applied and developing m anipulative and observational skills (Johnstone & Al-Shuaili, 2001). Berg (2005) discusses how the learner’s epistemolog ical views of laboratory instruction can influence their cognitive processes The student view that knowledge is a set of accumulated facts and he or she is a recepto r of knowledge can create a conception of laboratory instruction as an illustra tion of facts and learning of procedures. The learner view that knowledge is an integrated se t of constructs in which the learner constructs knowledge can stimulate a conception of the laboratory activity as a situation where knowledge is generated and the learner is lea rning not only procedures, but also scientific methods.

The effect that experiences and instructional strategies within the educational setting have on a learner's personal epistemological beliefs and attitudes is a major research interest. By definition, attitudes convey our evaluation of someone or something, such as the notion "I like laboratory work" (Berg, 2005). Developing positive attitudes towards learning chemistry is one of the important goals of instruction. These can be divided into two affective aims: attitudes to chemistry (i.e., confidence, interest, motivation) and chemistry (scientific) attitudes. Attitudes are believed to be formed by affective, behavioral, and cognitive processes.

There is a need to know more about how learners make sense of the epistemological aspects of their instructional environments: for instance, what practices are most relevant, how those practices are interpreted through the students' existing beliefs and knowledge, and which beliefs are being altered in the process.

This study sought to gain insight into which laboratory pedagogical methods the students believe influenced their understanding of the material being presented, as well as their personal epistemological and NOS beliefs of science (chemistry), during a semester general chemistry laboratory course. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research.

Problem Statement

"To many students, a 'lab' means manipulating equipment but not manipulating ideas." (Lunetta, 1998, p. 250)

Epistemology is defined as a theory of knowledge.

As a subject of long-standing interest to philosophers, personal epistemology has become a topic of interest to educational psychologists and science educators (Hofer, 2001). Personal epistemological beliefs relate to the nature of knowledge and knowing. The two general areas that characterize research on personal epistemological beliefs are: (1) examining the nature of development and change in how learners think about knowledge and knowing, and (2) examining how these beliefs can facilitate or constrain learner achievement, learning, reasoning, and thinking. With interest in the subject growing, several questions have surfaced in the context of college science laboratory instruction.

What is personal epistemological development, and why is it important to college science laboratory instruction? First, what does one mean by personal epistemological development? Research in this area broadly addresses personal epistemological development as learners' thinking and beliefs about knowledge and knowing and usually includes some of the following ideas: the definition of knowledge, how knowledge is constructed, how knowledge is evaluated, the self and the learning process, and metacognition (Bendixen & Rule, 2004; Hofer, 2001).

Other important issues to address include the images of science that undergraduate science students hold, how and whether students' epistemological beliefs are linked to their images of science, how different instructional situations in the chemistry laboratory affect a learner's image of science and personal epistemology, what conceptual changes occur during instruction, and how student images of science affect that change. Perhaps even more important is why personal epistemology matters and what its implications are for student achievement. Are learners' epistemological beliefs a result of the instruction they receive, do these beliefs determine how instruction is received, or is there a symbiotic interaction between the two? Research dealing with the importance of personal epistemological development in learning chemistry has increased dramatically within the last decade. According to Hofer (2001)

19 epistemological perspectives play a significant rol e in learning experiences in which learners encounter new knowledge. Given the parallels between personal epistemology a nd NOS beliefs, it is easy to concede that a relationship must exist between the two. As both constructs deal with beliefs about knowledge, it may be rational to assu me that NOS is a part of the science beliefs component of personal epistemologies. Acco rding to Hogan (2000), research that defines learners’ knowledge about the NOS more as a belief, than as declarative knowledge overlaps with studies on the psychologica l construct of epistemology. Personal epistemologies can act as standards for ju dging the validity of knowledge claims (Hewson, 1985; Hofer & Pintrich, 1997). The refore, personal beliefs learners have about the nature of scientific knowledge and k nowing can be considered to be their domain-specific epistemology of science. This does not imply that all the knowledge a learner possesses about the scientific enterprise i s an epistemological belief. However, studies on the relationship between personal episte mologies and NOS are virtually nonexistent. What is unclear is what effect NOS in struction has on learners’ epistemological development. The way students approach and view the laboratory l earning environment is affected by students’ personal epistemological beli efs and images of science. As discussed earlier some students hold the conception that knowledge is a set of accumulated facts and view laboratory learning as a n illustration of facts and learning of routine procedures. On the other hand, the concepti on that knowledge is an integrated set of constructs and that students construct their own knowledge may promote a view of laboratory learning as an endeavor in which know ledge is generated and the student learns not only procedures, but also the nature of science (Berg, 2005).

Despite this research, most epistemological and NOS studies have investigated the college science classroom (e.g., lecture) (Dagher, Brickhouse, Shipman, & Letts, 2004; Hofer, 2004; Hofer, 2000; Wenk & Smith, 2004) and have only investigated general NOS and epistemological beliefs related to learning outcomes in the laboratory (Bell, 2004; Hofstein & Lunetta, 1982; Ryder et al., 1999; Wickman, 2003). It remains to be determined whether certain effective instructional practices are linked to the development of specific epistemological and NOS beliefs. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research. The major intent of this study was to determine whether students' NOS and personal epistemological beliefs about chemistry change by the completion of the semester course, as well as which laboratory classroom instructional practices the students believed were necessary to their understanding of the laboratory material and may have influenced their NOS and personal epistemological beliefs during a semester general chemistry course.

Definitions

Two constructs are central to this study's purpose: personal epistemological beliefs and nature of science. The constructs are defined to convey the meaning and the operational definition that is given to them.

Personal epistemological beliefs (PEB): Epistemology is a branch of philosophy that is directed toward theories of an individual's beliefs about the nature of knowledge and learning (Schommer, 1993). The core dimensions of personal epistemology include: (1) the nature of knowledge (structure and stability of knowledge) and (2) the nature of knowing (source and justification of knowledge claims). For the purpose of this study, personal epistemological beliefs will be defined as beliefs about the process of

21 knowing and the nature of knowledge as related to s cience and learning science (Hofer & Pintrich,1997). Nature of Science (NOS): NOS sometimes described a s images of science is a broad area of human endeavor which includes the val ues and beliefs inherent to scientific knowledge, and its development. The con sensus view of NOS objectives from science education scholars such as Lederman, Abd-Kh alick, Bell, and Schwartz (2002) is extracted from international science education s tandards documents. These scholars define NOS as involving aspects related to the foll owing terms: creative, empiricallybased, human imagination, inferences, tentative, th eory-laden, and socially and culturally embedded. For the purposes of this stud y, NOS refers to the epistemology of science or science as a way of knowing that include s the beliefs and values inherent to the development of scientific knowledge. Possible Links Between PEB and NOS According to Hofer (2002) personal epistemological beliefs deal with questions such as “how do we know what we know,” as well as a person’s beliefs about the nature of knowledge. Learners’ personal epistemological b eliefs are unlikely to be equally relevant or advanced across a variety of subject co ntexts. This implies a need for a specific subject focus when considering learners’ p ersonal epistemological beliefs. Similarly, NOS knowledge deals with learners’ perso nal epistemological values and beliefs inherent to scientific knowledge and its de velopment (Abd-El-Khalik, Lederman, Bell, & Schwartz, 2002). Both constructs deal with the beliefs about knowledge. Personal epistemological beliefs of science refer t o learners’ understanding of how scientific ideas are built, including their kno wledge about the process of knowing about scientific knowledge (Songer & Linn, 1991). In general, NOS refers to the epistemology of science, or science as a way of kno wing that includes the values and

22 beliefs inherent in the development of scientific k nowledge. Studies concerning learning science suggest that student beliefs about NOS and science learning influence achievement (Driver, Asoko, Leach, Mortimer, & Scot t, 1994; Jehng, Johnson, and Anderson, 1993; Schommer, 1990). The features of NOS can be useful in assisting lear ners to think about their epistemology. Investigating NOS can provide charac teristics that differentiate science knowing from other ways of knowing and explicitly a ssist learners examine their rationale in forming ideas (Duschl, Hamilton, & Grandy, 1992) Research Questions Question 1 What range of personal epistemological beliefs (dev elopment level), and images of chemistry (NOS) do undergraduate science student s have at the beginning of a general chemistry laboratory course? Rationale. Personal epistemologies are quite simply a learner’ s beliefs about the nature of knowledge (Hofer & Pintrich, 1997). Stud ies of personal epistemology attempt to determine how learners focus their conceptions o f knowledge and knowing and how these are used to develop an understanding of the w orld (Hofer, 2002). Indeed sophisticated epistemological beliefs are no t essential for survival. However, when considering credibility of sources, h ow to weigh evidence, and how to make decisions about the world, we see that each of these constructs depends on our underlying beliefs about knowledge. According to H ofer (2002), the importance of these beliefs can be seen in action everyday from selecti ng politicians and serving on juries, to the choices we make in our daily lives. Research has shown as well as having a conceptual understanding of science, the importance of students’ understanding the NOS. This understanding includes the

students' epistemological "values and beliefs inherent to scientific knowledge and its development" (Abd-El-Khalick, Lederman, Bell, & Schwartz, 2001). According to Ryder, Leach, and Driver (1999), knowledge relating to science can be viewed as involving two interrelated areas: the knowledge of science and the nature of science. The knowledge of science involves the concepts, ideas, laws, models, theories, and experimental procedures of science. The NOS may include the social and cultural aspects of science, how scientists decide what to investigate, how to interpret data once collected, and how to believe findings published in research journals.

Bringing undergraduate science students inside of science involves introducing both areas of knowledge. Research studies have identified two basic arguments supporting the significance of students' images of science for learning (Ryder et al., 1999). The first argument, from a learning perspective, is that evidence from studies suggests that students' approaches to learning are influenced by their images of the nature of the discipline (Leach, Ryder, & Driver, 1997; Schommer et al., 1992; Songer & Linn, 1991); for instance, students may hold the view that the endpoint of a laboratory investigation is the data collected rather than the interpretation of those data using theoretical insights. The second argument is from a "cultural perspective": when these science students graduate, they will be required to make decisions that require an understanding of the nature of science, such as critiquing a research paper, preparing documents on scientific issues, or informing the public on scientific evidence. It is possible for individuals to have epistemological beliefs that are both sophisticated (more relativistic) and naive (more dualistic) (Brownlee, 2002). Baxter Magolda (2002) suggests that direct observation or interview is the best way to investigate a subject's beliefs.

This study examined undergraduate science students' initial images of the NOS and personal epistemological beliefs of chemistry during a semester general chemistry

24 laboratory course using the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS) and the Nature of Scientific Knowledge Scale (NSKS). The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and N OS beliefs changes in light of specific science laboratory instructional features for future research. Sub-Question 1-a Do students’ images of the nature of chemistry (NOS ) change over the course of laboratory instruction by the completion of a semester general chemistry laboratory course? Rationale. According to Lunetta (1998), many students view lab oratory as a means of manipulating equipment but not manipulatin g ideas. The science laboratory has been thought of as one of the best places for t he building and refining of student images of scientific knowledge. The purpose of lab oratory instruction is to develop a student’s knowledge of the natural world, understan ding of scientific concepts, understanding of how scientists undertake empirical investigations to address a problem of interest, and the ability to use standard labora tory instruments and procedures in investigations (Leach, Millar, Ryder, Sere, Hammelev, Niedderer, & Tselfes, 1998; Millar, Le Marechal, & Tiberghien, 1998). Students carrying out laborat ory activities must draw upon understandings of the nature of the data, the scien tific claims, the ways in which these claims and data are related, and the purposes of us ing certain instruments, procedures and techniques. Encouraging learners to self-reflec t on their learning may provide insight into how laboratory instruction may influence their science images. The nature of this study was to explore and lay a foundation for focusing o n more specific features of

25 reasoning related to changes in NOS beliefs in light o f specific science laboratory instructional features for future research. This study sought to investigate if during instruction student images of the NOS (chemistry) c hange during a semester general chemistry laboratory course. Sub-Question 1-b Do students’ personal epistemological beliefs about science (development level) change by the completion of a s emester general chemistry laboratory course? Rationale. Bell (2004) explains in terms of epistemological ou tcomes that students develop images of science from their labor atory investigations and learn about their own learning. Studies involving student imag es of science indicate that these images influence student learning and participation during laboratory instruction (Sere, Leach, Niedderer, et al., 1998; Tibergein, Veillard Le Marechal, Buty, & MIllar, 2001; Ryder, et.al., 1998). Buehl and Alexander (2004) p oint out that as student beliefs became more sophisticated, the learning strategies they used also became more sophisticated. However, little is known about how science laboratory experiences and instruction develop students’ images of science the reby influencing their personal epistemological development. According to Hofer and Pintrich (1997) there is a c onsensus in the field of research on personal epistemological beliefs about a trend toward developmental progression, especially for those who experience a college education. Nevertheless, there is little agreement on what causes the change (Hofer & Pintrich, 1997; Hofer, 2000; Paulsen & Wells, 1998; Schraw, 2001). Studen t’s personal epistemological beliefs have been shown to influence attitudes and behavior in a variety of contexts,

26 including the academic areas (Schommer, 1990). A li terature review by Schommer (1994) described that “epistemological beliefs may help or hinder learning” as the beliefs “affect the degree to which learners: (1) actively engage in learning, (2) persist in difficult tasks, (3) comprehend written material, and (4) cop e with ill-structured domains.” Students' epistemological beliefs and images of sci ence affect their mindset, metacognitive practices, and study habits. Evidence from studies suggests that having a more m ature epistemology in science contributes to better learning of science c ontent (Hammer, 1994; Schommer, 1993; Songer & Linn, 1991). In addition, more matu re epistemologies in science are associated both with understanding how to evaluate competing evidence in science and understanding that the existence of uncertainty in science does not weaken science’s usefulness in decision making in light of controver sies (Schwab, 1962). Despite the importance of developing mature scientific epistemo logies, studies of college students repeatedly demonstrate that college students enter (and often leave) college with factbased views of knowledge and authority-based means of making decisions (Baxter Magolda, 1992; Hofer & Pintrich, 1997; King & Kitch ener, 1994). The nature of this study was to explore and lay a foundation for focus ing on more specific features of reasoning related to personal epistemological belie f changes in light of specific science laboratory instructional features for future resear ch. This study sought to investigate the extent student personal epistemological beliefs cha nge by the completion of laboratory instruction. Question 2 What laboratory pedagogical practices (e.g., prea nd postlaboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning expe rience?

27 Rationale. Supporting meaningful learning in chemistry require s the implementation of appropriate pedagogical practices Within the laboratory learning environment inquiry-based instruction, cooperative groups, self-reflection, use of learning technologies (e.g. MBL), preand post-lab oratory activities, and small-group discussions can facilitate the development of a stu dent’s personal epistemology (Drayton & Falk, 2002; Felder & Brent, 2004; Tapper 1999). However, interviews in an epistemological study of instructional strategies b y Hofer (2004) evoked a sense from the students that altering their personal epistemol ogical beliefs might also alter a sense of self. It appears that learners filter their per ceptions of instructional practices through their own epistemological perspectives. Learners need to be afforded the time necessary f or the “deep processing” of these principles with higher-order cognitive tasks (pedagogical strategies). Through the use of higher-order pedagogical strategies students are able to integrate their new experiences with prior knowledge, establish a conte xt for the laboratory instructional activity, and determine its relevance, all of which are characteristics of intellectual development (epistemological change) (Felder & Bren t, 2004). Science education research literature (Hofstein & Lunetta, 2004; Nati onal Research Council, 1997) emphasizes the importance of rethinking the role an d practice of laboratory instructional environments. According to Hofer (2004), we need to know more about how learners make sense of the personal epistemological aspects of their instructional environments, what pedagogical strategies are most salient, and h ow learners interpret those strategies through their lens of images and beliefs. In this study, NOS instruction was not purposively implemented, however several of the lab oratory activities offered inquirybased aspects necessary for NOS instruction and are indicated in chapter three. The nature of this study was to explore and lay a found ation for focusing on more specific

features of reasoning related to their learning and specific science laboratory instructional features for future research. This study explored the laboratory pedagogical practices students believe were essential to their understanding during the semester general chemistry laboratory learning experience.

Sub-Question 2-a

What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (development) during the semester general chemistry laboratory course?

Rationale. According to Hofer (2004), there is limited empirical evidence that explains what fosters changes in personal epistemological beliefs. However, it has been shown that students' perceptions of instructional practices are interpreted through the lens of their epistemological beliefs. Researchers agree that epistemological beliefs develop over time and that better-educated students are more advanced in terms of their epistemological beliefs (Schommer, 1994; Valanides & Angeli, 2005).

Developmental models suggest that disequilibrium introduced through educational pedagogy stimulates cognitive conflict and subsequent reorganization. Empirical studies have also identified connections between personal epistemological beliefs, critical thinking, and reasoning skills (Valanides & Angeli, 2005). For example, Kuhn (1991) showed that evaluative epistemologists were more likely than others to use counter-arguments and generate alternative perspectives. Studies suggest that epistemological beliefs can change when students work collaboratively and are given opportunities to reflect on their thinking and evaluate their beliefs, such as in a laboratory setting (Hofer, 2001; Valanides & Angeli, 2005).

Schwab (1978) provides a broad framework for thinking about what occurs in educational settings. Schwab describes four "commonplaces": the learner, the instructor, the learning environment in which learning takes place, and the subject matter. Three of the four commonplaces are addressed in this study, but narrowed to address the major constructs under investigation. More specifically, reference to the learner includes both the background of the student participants and exploration of their individual personal epistemological beliefs of science (chemistry). This study limited the focus to the laboratory environment and the subject matter of concern in chemistry. Various instructional elements that may carry an epistemological impact would need further investigation. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research.

Laboratory instructional pedagogy expected to have epistemological significance falls into one of three categories: pre-laboratory activities, laboratory work, and post-laboratory activities. Although pilot observations in other general chemistry laboratory classes suggest that each of these might carry epistemological meaning, we do not know how students make such interpretations. This suggests the need for a study that explores these instructional practices in context. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological beliefs in light of specific science laboratory instructional features for future research. This study sought to investigate and identify the laboratory instructional practices that students believed influenced their personal epistemological beliefs during the semester general chemistry laboratory course.

Sub-Question 2-b

What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

Rationale. As stated previously, the consensus definition of NOS is that it refers to the epistemology of science, science as a way of knowing, or the values and beliefs inherent to the development of scientific knowledge (Lederman, 1992; Tao, 2003). The delivery of science (chemistry) instruction in most classrooms today relies heavily on textbooks that suggest that scientific knowledge has evolved in a linear and comprehensive manner (Zeidler, Walker, Ackett, & Simmons, 2002). By engaging learners in activities that bring to light the characteristics of science (chemistry), a more comprehensive representation of the NOS can be explored.

According to Bell (2004), attempts to improve learners' understanding of the NOS fall into two generalized instructional categories: (1) implicit approaches, where gains in understanding NOS stem implicitly from process skills and/or inquiry-based instruction, and (2) explicit approaches, where specific aspects of the NOS are addressed purposively and explicitly, usually in the context of the history or philosophy of science or inquiry-based instruction. Studies suggest that explicit approaches appear to be more effective in facilitating understanding of the NOS (Abd-El-Khalick & Lederman, 2000; Khishfe & Abd-El-Khalick, 2002).

Coburn (2004) suggests that laboratory instruction can assist learners in developing an understanding of the NOS. However, learners will not learn about the NOS simply by performing a laboratory activity. Laboratory instruction can help learners understand the NOS if the activities are more open-ended and include reflective, active discussion sessions. In this study, NOS instruction was not purposively implemented,

31 however several of the laboratory activities offere d inquiry-based aspects necessary for NOS instruction and are indicated in chapter three. The nature of this study was to explore and lay a foundation for focusing on more s pecific features of reasoning related to NOS beliefs in light of specific science laborat ory instruction features for future research. This study sought to investigate and iden tify the laboratory instructional practices that students believed influenced their N OS beliefs during the semester general chemistry laboratory course. Significance of the Study Understanding the influences that learners’ persona l epistemologies and images of science have on their performance is one of the primary concerns of educational research. Previous research suggests that most col lege students are quite nave in their images and epistemological understandings of scienc e (Abd-El-Khalick & Lederman, 2000). Learners’ personal epistemological beliefs and images of science can be profoundly influenced by the instructional context or learning environment. There is some evidence that indicates learner beliefs can st rongly affect how they approach certain learning situations (Schommer, 1990). To h elp the learner advance from nave belief that knowledge is simple, absolute, and cert ain instructors should use pedagogical activities that provide opportunities for the learn er to discover that knowledge must be adapted, when applied and interpreted in different situations, thus revealing the dynamic and complex characteristics of the structure and na ture of knowledge (Paulsen & Feldman, 1999). The way a learner approaches and views the laborato ry is affected by the learner’s epistemological and NOS beliefs. The vie w that knowledge is a set of accumulated facts and the learner is a receptor of knowledge may lead to the view that laboratory is just an illustration of facts and lea rning of routine procedures. However, a

view that knowledge is an integrated set of constructs and that the learner constructs knowledge may promote a view of the laboratory as an endeavor in which knowledge is generated and the learners learn not only routine procedures, but also the nature of science.

Research in chemistry education focuses on understanding and improving chemistry learning. Research that focuses on understanding what goes on in chemistry courses is especially useful if one is trying to improve the teaching and learning of chemistry (Phelps, 1994). The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS belief changes in light of specific science laboratory instructional features for future research. In addition, the study explored and laid a foundation for focusing on more specific features of reasoning related to their learning and specific science laboratory instructional features for future research. This chapter describes the main purpose of this study as determining whether students' NOS and personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry course, as well as what laboratory pedagogical practices students believe influenced those changes during a semester general chemistry laboratory course.

Summary

This chapter presented the problem statement and the nature of the study, and introduced the concepts and issues central to the research: the nature and development of personal epistemology, the role of student images of science, the nature of chemistry learning, the possible link between personal epistemology and NOS, the role of the laboratory instructional environment, and research methodology issues.

In addition, the research questions were presented, followed by the study's significance for chemistry education research.

Chapter two presents a review of relevant studies in the science education and educational psychology literature focusing on the research questions described earlier in this chapter. Chapter two is divided into six main sections. The research literature includes reviews of: (1) models of personal epistemological development; (2) multidimensional models of personal epistemological development; (3) the nature of science; (4) research methodology issues; (5) the applicability to college science education; and (6) the nature of laboratory instruction.

Chapter three describes, in six sections, the design and methodology of the research study. Section one restates the purpose of the study, elaborates on the rationale behind the research questions, and presents an overview of the analysis, design, and methodology. Section two describes the context and participants of the setting. Section three discusses the research instruments, measures, and techniques, which include: (1) the Chemical Concepts Inventory (CCI), (2) the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS), (3) the Nature of Scientific Knowledge Scale (NSKS), (4) the Students' Reflective Assessment of Laboratory Methods, and (5) in-depth semi-structured interviews. Section four identifies the forms of pedagogical treatment involved in the laboratory instruction; this section offers an overview of the laboratory environment and pedagogy, including a discussion of the three general instructional features under consideration for this study: pre-laboratory, laboratory work, and post-laboratory. Section five summarizes data collection, giving a general overview of the phases of data collection and the researcher's role during the study.

Section six summarizes how the data were analyzed by describing the potential quantitative and qualitative analysis methods implemented for the study. The last section discusses aspects used in monitoring the reliability and validity of the data collection and analysis.

Chapter four presents a description of the participant sample, followed by the presentation of the quantitative analyses of the study's first research question and sub-questions. The questions are presented with the quantitative results of the analyses for all the participants (N = 56) and for the twenty who participated in the interviews. The results are discussed and related back to the key NOS and personal epistemological beliefs literature.

Chapter five presents a description of the development of the participants' personal epistemological beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-b. The characterization of personal epistemological beliefs with the results of the analyses of the participants' responses to interview probes is presented. The combination of interviews and quantitative measures provides a glimpse into some students' personal epistemological belief changes during the course of a semester and what the participants believed influenced their beliefs. The results are discussed and related back to the key personal epistemological beliefs literature.

35 influenced their beliefs. The results are discussed and related back to the key NOS beliefs literature. Chapter seven characterizes the findings of the ins tructional features of the second research question and sub-questions 2-a, and 2-b. The characterization of laboratory instruction with the quantitative and qu alitative results from the Student Evaluation of Laboratory Instruction Questionnaire as well as the results of the analyses of the participants’ responses to interview probes is presented. This provides a glimpse of the participants’ overall beliefs concerning the laboratory aspects of the semester course. The results are discussed and related back to the key laboratory education literature as well as the NOS and personal epistemo logical beliefs literature. Chapter 8 of this dissertation presents an overview of the study, limitations to the study, a summary of the major findings, and areas f or future research.

36 Chapter Two: Literature Review Introduction This study was primarily concerned with developing an understanding of the relation between a student’s images of science, per sonal epistemological beliefs and laboratory classroom instructional practices. The n ature of this study was to explore and lay a foundation for focusing on more specific feat ures of reasoning related to personal epistemological and NOS beliefs changes in light of specific science laboratory instructional features for future research. In add ition, the study explored and laid a foundation for focusing on more specific features o f reasoning related to learning and specific science laboratory instructional features for future research. Therefore, this chapter comprises a review of relevant studies in t he science education and educational psychological literature focusing on the research q uestions described in Chapter 1. The first and second section of the review consider s models of personal epistemological development beginning with a discus sion of five major uni-dimensional epistemological models of development followed by a description of two multidimensional models of epistemological beliefs. The third section is literature based on research c oncerning student’s images of science. The section begins with a research-based d efinition of NOS, followed by a discussion of how students view NOS. This section concludes with a general overview of NOS instruments and the nature of NOS and person al epistemology. The fourth section discusses research methodology i ssues related to the potential instruments used to assess students’ NOS and personal epistemological

37 beliefs. The discussion begins with a general over view of the types of instruments followed by two sections that review instruments cu rrently used to assess the aforementioned beliefs in general and in the domain of science. The fifth section relates to the applicability of p romoting epistemological growth in the college science classroom through the use of ce rtain pedagogical applications. The discussion begins with an overview of epistemologic al orientations in learning science followed by a description of assessing epistemologi cal levels in the classroom in order to promote epistemological growth. The remainder of t his section discusses six pedagogical applications identified in the literatu re that facilitate epistemological growth. The final section consists of a review of the liter ature on science laboratory instruction. The section begins with a description of the nature of laboratory instruction, how the developmental levels relate to laboratory i nstruction, and concludes with a discussion of science laboratory pedagogy and instr uction. Models of Epistemological Development Epistemological Intellectual Development The leading body of research in the area of persona l epistemology suggests that learners move through a patterned sequence of devel opment in their beliefs about knowledge and knowing as their ability to make mean ing develops (Hofer, 2001). Each of the five developmental models has its origins in the traditions of cognitive development. These models have similar origins an d parallel paths but significant differences as well. According to Hofer (2001), the se models share with the traditional models of cognitive development a constructivist, i nteractionist, cognitive developmental view of the learner’s developing understanding of t he world. This section reviews the five major uni-dimensional developmental models of epistemological beliefs: Perry’s Model (Perry, 1970 ), Belenky’s Ways of Knowing Model

(Belenky, Clinchy, Goldberger, & Tarule, 1986), the epistemological reflection model (Baxter Magolda, 1992, 2002, 2004), the Model of Reflective Judgment (King & Kitchener, 1994, 2002, 2004), and epistemological reasoning skills (Kuhn, 1991; Kuhn, Cheney, & Weinstock, 2000; Kuhn & Weinstock, 2002). Table 1 presents an overview of the five major developmental models covered in this section of the review.

Table 1
Uni-Dimensional Models of Epistemological Beliefs

Level       | Perry (1970)                 | Belenky et al. (1986)      | King and Kitchener (1994) | Baxter Magolda (1992) | Kuhn (1991)
Low         | Dualism                      | Silenced; Received Knowing | Pre-Reflective Thinking   | Absolute Knowing      | Realist; Absolutist
Medium      | Multiplism                   | Subjective Knowing         | Quasi-Reflective Thinking | Transitional Knowing  | Multiplist
Medium-High | Relativism                   | Procedural Knowing         |                           | Independent Knowing   | Evaluativist
High        | Commitment within Relativism | Constructed Knowing        | Reflective Thinking       | Contextual Knowing    |

Perry's Model

One of the most influential researchers in the area of epistemological beliefs was William Perry. However, Perry never conceptualized his groundbreaking work as the study of learners' epistemologies but as the intellectual and moral development of college learners. In the late 1950s and early 1960s, Perry conducted a longitudinal study of the interaction between the degree of reliance on outside authority and epistemology with white male Harvard liberal arts students over the course of their undergraduate education, using open-ended and relatively unstructured interviews. Upon analysis of these interviews, Perry noticed trends in the learners' descriptions of their educational experiences and developed a scheme for learners' intellectual development.

Perry determined that these learners moved through several positions in the various intellectual and moral challenges they encountered in college by adopting varied perspectives toward knowledge and learning (Pavelich & Moore, 1996). Perry associated these varied perspectives with different levels of educational experience. According to the study, learners, usually freshmen, proceed from blind acceptance of authority (there are right and wrong answers), referred to as dualism (Levels 1 and 2), to the belief that some authorities are right while others are wrong (Levels 3 and 4), known as multiplicity. The next position, contextual relativism (Level 5), constituted a major shift in the learner's epistemological thinking because learners now valued opinions supported by evidence in some context. Learners moving from Level 5 to Level 6 held a view that one actively and personally constructs knowledge. Finally, the position of commitment (Levels 7-9) is where the learner recognizes the need for commitment in one's beliefs and about the degree of reliance on outside authority. Learners in the dualistic stage (black-and-white) believe that external authorities can tell them the right answers to their questions, while more mature learners trust their own ability to make decisions. Piaget's influence on Perry's research includes the recognition that learning and development follow a linear sequence and that learning is stage-driven. Perry found that the students in his study entered college at a number of levels, including Level 1, and reached at least Level 6 upon graduation, with a few reaching Level 9 (Felder & Brent, 2004).

Women's Ways of Knowing

The Perry model has been challenged by Belenky et al. (1986) because its validation was based almost entirely on interviews with males and fails to account for gender differences in developmental patterns. In Women's Ways of Knowing, the authors Belenky et al. (1986) discuss the results of their study that examined women's perspectives of truth, knowledge, and authority. A diverse sample of 135 women, 90 of whom were college-educated, of different ages and varied ethnic and class

PAGE 56

backgrounds were interviewed, in a manner similar to the one conducted by Perry, on their life experiences as learners and as knowers. The interview approach of Belenky et al. (1986) differed from Perry's on several points. First, the initial interview question, "What stands out for you in your life over the last few years?", was much broader. Second, specific aspects of the participants were targeted, while Perry's questions were nondirective. Finally, the more educated participants received a more detailed series of questions with respect to ways of knowing. Transcripts of the interviews were examined to identify five different perspectives on knowing displayed by the subjects. Most of the perspectives had counterparts in the Perry model but differed in certain ways that the authors attributed to gender differences in patterns of intellectual development. Belenky et al. (1986) proposed a new classification model after initial attempts to apply Perry's model to the participants' responses failed. The levels of the Belenky model are silence (1), received knowing (2), subjective knowing (3), procedural knowing (4), and constructed knowing (5).

The silence level is characterized by women experiencing a passive, voiceless existence, listening solely to authority. Few women in the study, and none with college experience, fell into this category. At the second level, received knowing, women view knowing as originating outside the self and can memorize and repeat whatever the authorities say. A parallel to Perry's dualism exists; however, while dualists are often outspoken and sometimes confrontational with others about their ideas and attempt to align themselves with authority figures, received knowers are more concerned with getting along with others and tend to feel separated from authorities.

The third level, subjective knowing, rejects authorities and others as reliable sources of truth and analytical reasoning as a basis for judgment, relying instead on intuitive reaction and personal experience. With procedural knowing, the women
recognize that intuition can be wrong and replace it with observation, analysis, and other individuals' expertise, sometimes rigidly and inappropriately. Belenky et al. (1986) identified two gender-related patterns within procedural knowing: separate knowing and connected knowing. Separate knowing resembles the latter stages of Perry's multiplicity (Level 4). Separate knowers work hard to eliminate subjective feelings from their decision-making process. They rely on critical thinking to arrive at truth, subjecting all ideas and beliefs, including their own, to intense scrutiny and doubt. However, women who exhibit this pattern are less likely than men exhibiting this pattern to do their challenging in confrontational public forums. Connected knowers take the opposite approach and treat personal experience as the most reliable source of knowledge. Unlike subjective knowers, however, they believe that other individuals' experience is at least as valuable as their own, and they go to great lengths to understand and identify with others, honoring their points of view and ways of thinking and avoiding negative judgments. Thus, while doubt is the first response of separate knowers, it is the last resort of connected knowers.

The final level, constructed knowledge, acknowledges both intuition and the ideas of authorities and others as valid sources of knowledge. Individuals at this level make mature use of both objective logic and subjective feelings when making judgments. The individual may reject the idea of absolute truth at this level. The individual recognizes that all knowledge is contextual and that the knower plays a vital role in constructing it. This level resembles Level 5 (contextual relativism) of Perry's model.
King-Kitchener Model of Reflective Judgment

Subsequent models of learners' beliefs about knowledge and knowing resemble the stance proposed by Perry (1970) and Belenky et al. (1986), although they are based on populations more varied with regard to age and educational background. For instance, King and Kitchener (1994) sampled over 1,700 learners from a wide age range and concentrated on general epistemological beliefs that trigger reasoning in nonacademic contexts. In their efforts to understand the processes used in argumentation, King and Kitchener (2002) interviewed the participants over the course of 15 years. Participants were presented with four different, ill-structured tasks and a series of follow-up questions to assess various aspects of their beliefs about knowledge and the justification of those beliefs. Extensive testing and analysis of the RJM revealed that educational activities tended to improve reasoning on ill-structured activities and that older, more educated learners tended to receive higher scores. King and Kitchener found that learners' assumptions and beliefs about knowledge were related to how they chose to justify their beliefs.

In the 1980s, King and Kitchener (2002) used the data from their study to develop and validate a model of how the learner develops reflective judgment from late adolescence through adulthood. The Reflective Judgment Model (RJM) considers how the learner evaluates knowledge claims and justifies his or her beliefs about arguable issues (King & Kitchener, 2004). The model's levels, constructed from John Dewey's work on reflective thinking, closely parallel the first six levels of Perry's model. Dewey argued that reflective judgments are initiated when a learner recognizes that there is controversy about a problem that cannot be answered by formal logic alone, and involve careful consideration of one's beliefs in the presence of supporting evidence. The stages of the RJM closely echo those proposed by Perry (1970) and elaborate upon
epistemological views beyond relativism. The RJM describes a progression in the development of reflective thinking leading to the ability to make reflective judgments in seven stages within three levels. Each stage represents a qualitatively different epistemological perspective. The seven stages, grouped into three levels, include pre-reflective thinking (stages 1-3), quasi-reflective thinking (stages 4-5), and reflective thinking (stages 6-7).

King and Kitchener's three-stage pre-reflective thinking corresponds to Perry's dualism and multiplicity positions. Learners at the first two stages of pre-reflective thinking believe in the certainty of knowledge, believe that single correct answers exist for all questions, and make judgments based exclusively on direct observation and the word of authorities. Learners at the third stage of pre-reflective thinking accept the existence of uncertainty but believe that it is only temporary, and do not use evidence to make judgments about uncertain issues. King and Kitchener's two-stage quasi-reflective thinking resembles Perry's multiplicity position (Level 4). Quasi-reflective thinkers use evidence to make judgments about uncertain issues, but realize that one cannot know with certainty. Stage 4 quasi-reflective thinking is characteristic of the reasoning of a majority of college students (King & Kitchener, 2004). Learners at the lower stage (4) believe that all judgments are distinctive, with evidence being interpreted according to the learner's beliefs, and so the quality of the judgments cannot themselves be judged. Learners at the higher stage (5) of quasi-reflective thinking are moving toward the recognition that uncertainty is a part of the knowing process, the ability to see knowledge as an abstraction, and the recognition that knowledge is constructed, becoming more sophisticated in the use of evidence to justify conclusions. King and Kitchener's two-stage reflective thinking is analogous to Perry's positions on relativism (Levels 5-7). Reflective thinkers accept the doubt in decision-making but rarely experience
powerlessness. Reflective thinkers make judgments and decisions by carefully weighing all available evidence, the reasonableness of the solution, and the practical need for action.

Baxter Magolda's Model of Epistemological Reflection

Marcia Baxter Magolda (2002), a social constructivist, grounds her views of cognitive development in the constructive developmental tradition. Constructivists believe that knowledge is fundamentally subjective in nature, assembled from our perceptions and commonly agreed upon principles. According to this view, learners construct new knowledge rather than simply acquire it via memorization or through transmission. Learners construct meaning by assimilating information, relating it to their existing knowledge, and cognitively processing it. Social constructivists believe that this process works best through discussion and social interaction, allowing the learner to test and challenge his or her own understandings against those of others. For a constructivist, laws exist because they have been constructed by individuals from evidence, observation, and deductive or intuitive thinking, and, primarily, because certain communities (scientists) have collectively agreed on what constitutes valid knowledge.

According to Bock (1999), Baxter Magolda's research, which has a noticeably academic focus (Baxter Magolda, 2002), has contributed to our understanding of the development of complex reasoning among college students. Baxter Magolda's work was influenced by Perry's interest in understanding learners' viewpoints on learning in college as well as Belenky et al.'s (1986) reference to possible gender differences.

Beginning in 1986, Baxter Magolda conducted her longitudinal study by interviewing 101 first-year college students (51 females and 50 males) in an attempt to understand their "ideas about learning from a student perspective". The semi-structured interviews were conducted over the course of their undergraduate education, as well as
the year after their graduation, in hopes of examining learners' patterns of cognitive development in order to explain discrepancies between what she observed in those patterns and Perry's (1970) model of development. This study extended Perry's theoretical framework and King and Kitchener's (2002) reflective judgment model. Her recognition of the similarities between Perry's work and Belenky's theory of women's ways of knowing provided additional motivation for her to examine gender-related patterns of knowing (Bock, 1999).

Baxter Magolda's interview questions referred predominantly to classroom and learning experiences and allowed participants to voice their opinions freely. For instance, the opening question (i.e., "Tell me about the most significant aspect of your learning experience in the past year.") reflected an open-ended approach similar to Perry (1970) and Belenky et al. (1986) yet focused on learners' educational experiences. Baxter Magolda developed the Measure of Epistemological Reflection (MER), which consisted of short-answer questions, in order to triangulate the interview data.

Baxter Magolda identified six principles that contributed to both the process and the results of her study:

1. The making of meaning is influenced by each learner's worldview, by interaction with others, and by the context of the learner's experience.

2. Ways of knowing can best be understood through the principles of naturalistic inquiry, which protect the honesty of stories and experiences.

3. Reasoning patterns are not mutually exclusive and shift over time with changing contexts.

4. Patterns are not dictated by, but related to, gender.
5. Learner stories and interpretations cannot automatically be generalized to other contexts.

6. Ways of knowing and reasoning patterns within the learners were presented as levels in order to describe the predominant ways of knowing.

Baxter Magolda, like Belenky et al. (1986), tried unsuccessfully to apply Perry's model to participant responses. Therefore, she proposed her own model, the Epistemological Reflection Model. Even though Baxter Magolda's assessment of beliefs is academically focused, she addressed a number of beliefs that were not necessarily epistemological in nature (i.e., beliefs about the roles of the instructor and the learner, and about evaluation) in the development of her model. Baxter Magolda identified four knowledge stages that described the various levels of reasoning characterized in her Epistemological Reflection Model: absolute knowing, transitional knowing, independent knowing, and contextual knowing. According to this model, college students may be found at any of four developmental stages, exhibiting either of two gender-related patterns of behavior in all but the last stage.

Absolute learners believe that all knowledge that matters is certain, that all questions have one correct answer, and that authorities have the knowledge and the answers. Learners in this stage exhibit the receiving pattern, the lowest of the epistemological patterns, and function in a passive way. Learners at this level and pattern tend to be female. This pattern corresponds with Belenky's level of received knowledge (2) and King and Kitchener's early pre-reflective thinking stage (1). Learners in the mastery pattern of absolute knowing tend to be male and feel free to ask questions and challenge authority. This pattern corresponds with Perry's level of late dualism (2) and King and Kitchener's early pre-reflective thinking stage (1).
Learners at the transitional knowing stage believe some knowledge is certain and some is not. Authority figures have the responsibility to communicate the certainties, and the learners must make their own judgments regarding the uncertainties. In the impersonal pattern (male), learners make judgments using a logical procedure prescribed by authority figures. This pattern corresponds with Perry's stage of multiplicity subordinate (3) and King and Kitchener's late pre-reflective thinking stage (2). In the interpersonal pattern (female), learners collect ideas but base judgments on intuition and personal feelings. This pattern corresponds with Belenky's level of subjective knowledge (3) and King and Kitchener's late pre-reflective thinking stage (2).

The uncertainty of some knowledge is accepted at the stage of independent knowing. Learners take responsibility for their own learning rather than relying heavily on authorities or personal feelings. In the individual pattern (male), learners rely on objective logic and critical thinking. This pattern corresponds with Perry's multiplicity stage (Level 4), Belenky's level of procedural knowledge, separate pattern (4), and King and Kitchener's stage of quasi-reflective thinking (4-5). Learners in the inter-individual pattern (female) rely on caring, empathy, and understanding of others' positions as bases for judgments. This pattern corresponds with Belenky's level of procedural knowledge, connected pattern (4), and King and Kitchener's stage of quasi-reflective thinking (4-5).

Contextual learners (male and female) believe that all knowledge is contextual and individually constructed. This shift alters both the source and the process of knowing (Baxter Magolda, 1992). They use all sources of evidence and remain open to changing their decisions if new evidence is presented. This pattern corresponds with Perry's levels of contextual relativism (5-7), Belenky's level of constructed knowledge (5), and King and Kitchener's stage of reflective thinking (6-7).
Kuhn's Model of Reasoning Skills

Kuhn's argumentative model (1991) pertains more to general knowledge beliefs. Kuhn (1991) studied beliefs about knowledge in her attempt to understand the reasoning that occurs in everyday life by presenting three ill-structured problems (i.e., What causes learners to fail in school? What causes unemployment? What causes prisoners to return to crime?) to a cross-sectional group ranging in age from the teens to the sixties. The key features of Kuhn's design included the broader sample of participants and the fact that each age group included 40 participants with gender and educational level (college and noncollege) equally represented. Kuhn individually interviewed each participant twice, for between 45 and 90 minutes each time. The participants were asked to explain how they came to hold a view and to justify their position with supporting evidence. In addition, the participants produced opposing views, provided a rebuttal to each opposing view, and then offered a remedy for the problem. Lastly, the participants were asked to reflect on the reasoning presented. The model she proposed from this study closely corresponds to the epistemological models developed by Perry (1970) and King and Kitchener (2002). In Kuhn's model (1991, 2000; Kuhn & Weinstock, 2002), learners shift from a realist to an absolutist to a multiplist, and then to an evaluativist belief about knowledge and knowing.

At the realist level, assertions are copies of an external reality, reality is directly knowable, knowledge comes from an external source and is certain, and critical thinking is unnecessary. This level is consistent with Perry's early dualism (1), Belenky's level of silence (1), and King and Kitchener's early pre-reflective thinking stage (1-2). According to the absolutist belief, knowledge is absolute, certain, nonproblematic, right or wrong, and does not need to be justified since it originates from authority. This belief depicts epistemological thinking in childhood, and it can appear at
later ages. At the absolutist level, assertions are facts that are correct or incorrect, critical thinking is a vehicle for comparing assertions to reality and determining their truth or falsehood, and the dimensions of reality and knowledge remain unaltered. This pattern is consistent with Perry's late dualism (1), Belenky's level of received knowledge (2), King and Kitchener's late pre-reflective thinking stage (2-3), and Baxter Magolda's absolute knowing (1).

The third level, the multiplist, views assertions as opinions freely chosen by and accountable only to their owners; reality is not directly knowable, knowledge is generated by human minds and is uncertain, and critical thinking is irrelevant. From the multiplist view, knowledge is regarded as unclear and distinctive, since each learner has his or her own views and truth. This view is typical of adolescence. This pattern is consistent with Perry's multiplicity (3-4), Belenky's level of subjective knowledge (3), King and Kitchener's quasi-reflective thinking stage (4-5), and Baxter Magolda's transitional knowing (2).

The final level, the evaluativist, considers assertions as judgments that can be evaluated and compared according to criteria of argument and evidence; critical thinking is valued as a vehicle that promotes sound assertions and enhances understanding, while the dimensions of reality and knowledge remain unchanged. An evaluativist position incorporates and organizes both the objective and subjective aspects of knowing. A learner with an evaluativist view believes that two individuals may hold viewpoints that are both "right," but one viewpoint can be "more right" than the other in that it is better supported. This more sophisticated point of view develops well into adulthood, leading to a mature understanding of the nature and justification of knowledge that involves active processes of reflection and thinking (Mason, 2003). This pattern is consistent with Perry's relativism and portions of commitment within relativism (5-7), Belenky's level of
procedural knowledge and portions of connected knowledge (4-5), King and Kitchener's late quasi-reflective thinking and portions of reflective thinking (5-7), and Baxter Magolda's independent knowing as well as portions of contextual knowing (3-4).

Multidimensional Models of Epistemological Beliefs

Epistemological Beliefs

Current epistemological beliefs research (Hofer and Pintrich, 1997; Schommer, 1990) has challenged portions of the aforementioned models for their stage-like, unidimensional characteristics. The proposed multidimensional models suggest that personal epistemology is a collection of beliefs about knowing and learning that may be more independent of one another rather than progressing in a developmental sequence. The central alternative models of epistemological beliefs, independent epistemological beliefs (Schommer, 1990; Schommer-Aikins, 2002) and epistemological theories (Hofer & Pintrich, 1997; Hofer, 2000), are outlined below.

Schommer-Aikins System of Independent Beliefs

A second approach to understanding personal epistemology was pioneered by Schommer (1990) using a more quantitative methodology than that of her colleagues. Schommer's (1990) interest in how learners' beliefs about the nature and acquisition of knowledge impacted their approach to learning led her to dispute the one-dimensional conception of beliefs. Instead, she held that learners' epistemological beliefs are a multilayered system of beliefs composed of separate dimensions or elements. Schommer proposed a model of five different epistemological elements related to the certainty, source, and structure of knowledge, as well as control and speed in the acquisition of knowledge (Schommer, 1990). The first three elements (i.e., certainty, source, and structure) evolved from Perry's model, whereas control and speed of knowledge acquisition were drawn from Dweck and Leggett's (1988) work on beliefs
about intelligence and Schoenfeld's (1983) work on learners' beliefs about mathematical learning.

To assess these multiple elements, Schommer (1990) developed a written (paper-and-pencil) quantitative measure, the Schommer Epistemological Questionnaire (SEQ). The SEQ consisted of 63 short statements that characterized epistemological beliefs, rated on a five-point Likert scale. In 1990, a total of 263 college students responded to the SEQ. Three educational psychologists reviewed and categorized the statements into 12 subsets reflective of the five elements proposed by Schommer. A factor analysis indicated that the 12 subsets loaded onto four independent factors, reflective of four of the five proposed elements, excluding the source of knowledge. The first factor, Innate or Fixed Ability, characterized the learners' control over knowledge acquisition, with positions ranging from ability being fixed at birth to ability being a skill that can be learned. The second factor, Simple Knowledge, characterized the structure of knowledge, from knowledge being isolated to being interrelated. The third factor, Quick Learning, characterized the speed at which the acquisition of knowledge occurs: quickly, gradually, or not at all. Finally, the fourth factor, Certain Knowledge, characterized beliefs on a continuum from knowledge being absolute to knowledge being tentative and evolving.

Schommer verified the factors in succeeding studies with large samples of high school and college students (Schommer, 1993; Schommer et al., 1992). As did Perry, Schommer found evidence of developmental trends in learners' beliefs. For instance, in a cross-sectional study, first-year high school students believed more in the simplicity and certainty of knowledge, the innateness of ability, and the quickness of learning than did high school seniors (Schommer, 1993). Therefore, the younger learners held less sophisticated and more naïve views than the older learners.
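For illustration, the brief sketch below runs an exploratory factor analysis on simulated Likert-style subset scores. The sample size, the twelve subsets, and the four-factor solution merely mirror the numbers reported for the SEQ; the data are randomly generated stand-ins, and nothing here reproduces the actual instrument, its items, or its scoring key.

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulated data: 263 respondents x 12 subset scores on a 1-5 scale
# (hypothetical stand-ins for the 12 SEQ item subsets).
subset_scores = rng.integers(1, 6, size=(263, 12)).astype(float)

# Standardize each subset so that no single scale dominates the solution.
z = (subset_scores - subset_scores.mean(axis=0)) / subset_scores.std(axis=0)

# Extract four factors, mirroring the four-factor solution described above.
fa = FactorAnalysis(n_components=4, random_state=0).fit(z)

# Loadings: rows are subsets, columns are factors. Substantive labels such as
# Fixed Ability or Certain Knowledge come from inspecting which subsets load
# on which factor, not from the algorithm itself.
loadings = fa.components_.T
print(np.round(loadings, 2))

With real questionnaire data, researchers typically examine the pattern of loadings (often after rotation) before attaching substantive labels to the factors.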
In an earlier study, Schommer et al. (1992) explored the relationship between epistemological beliefs and comprehension, specifically focusing on how beliefs about the structure of knowledge related to the comprehension of integrated text material. Primarily freshman and sophomore college students read a highly integrated text from a statistics book. Measures assessing mastery of the material, prior knowledge, and use of study strategies were administered, as well as a measure of the learners' confidence in understanding the passage. A regression analysis revealed that learners who believed that learning occurs quickly or not at all tended to draw oversimplified conclusions from the text and performed poorly on the mastery test due to an overestimation of their comprehension (Schommer, 1990).

Subsequent factor analyses have replicated the four factors (Schommer, Crouse, & Rhodes, 1992). Schommer's quantitative approach to the study of personal epistemology may have contributed to the increase in research on personal epistemology. The SEQ has allowed researchers to measure and identify more distinctly the relation between epistemology and learning.

Hofer and Pintrich's Epistemological Theories Model

Challenges exist to some of the views in both the developmental models and the independent beliefs model. Hofer and Pintrich's (1997) model of epistemological theories consists of elements of both the developmental models and the independent beliefs model. Hofer and Pintrich (1997) proposed that a learner's beliefs about knowledge and knowing are organized into personal theories, as structures of interrelated propositions that are interconnected and logical. This view preserves the multidimensionality of epistemological beliefs but implies more integration among a learner's perspectives. Hofer and Pintrich (2002) view the nature of personal epistemology as including the
learners' cognition and beliefs about the nature of learning, intelligence, instruction, classrooms, domain-specific beliefs about disciplines, and beliefs about the self.

Hofer and Pintrich (1997) reviewed at length the research related to epistemological beliefs. The review describes three key areas of research, which included investigations regarding how learners interpret their learning experiences (Belenky et al., 1986; Perry, 1970); the influence of epistemological beliefs on reasoning (King & Kitchener, 1994; Kuhn, 1991); and the idea of multidimensional beliefs (Schommer, 1994). In this review, Hofer and Pintrich (1997) questioned Schommer's characterization of the factors related to speed and the control of knowledge. Hofer and Pintrich believed the factors related to the dimensions Quick Learning and Innate Ability were reflective of learners' beliefs about intelligence. As an alternative, Hofer and Pintrich (1997) categorized learners' epistemological beliefs into four dimensions. This model includes dimensions related to the nature of knowledge (what the learner believes knowledge is) and the nature of knowing (how the learner comes to know). Within the area of the nature of knowledge, Hofer (2000a) identifies the dimensions certainty of knowledge and simplicity of knowledge, and within the nature of knowing, the dimensions source of knowledge and justification for knowledge.

The least developed epistemological dimension, certainty of knowledge, is the degree to which learners believe that knowledge is fixed (low level) or fluid (high level). Belief that knowledge is fluid and open to interpretation is a key factor of King and Kitchener's (1994) reflective thinking stage (6-7) and Kuhn's (1991) evaluativist level. Simplicity of knowledge is the degree to which learners believe that knowledge consists of an accumulation of facts (low level) or is a system of related constructs (high level). According to Hofer (2000a), at the
lower level, knowing is seen as concrete, discrete, and knowable facts, while at the higher level learners see knowing as contextual, contingent, and relative. This dimension is reflective of Schommer's (1990) model, in which knowing is viewed on a continuum from an accumulation of facts (naïve) to highly interrelated concepts (sophisticated).

The first dimension of the nature of knowing, source of knowledge, considers the degree to which learners believe that knowledge is transmitted from external sources (low level), while other learners believe that knowledge is internally constructed. At the lower levels of other epistemological models (Baxter Magolda, 1992; Belenky et al., 1986; King and Kitchener, 1994; Kuhn, 1991; Perry, 1970), knowing originates outside the self and resides in external authority. The developmental turning point is the ability of the self to construct knowledge. The most developed epistemological dimension, justification for knowledge, is the degree to which learners rely upon external authority, while other learners believe that knowledge relies on personal evaluation and integration. This dimension considers how learners evaluate knowledge claims, use evidence, use authority and expertise, and evaluate experts (Hofer, 2000a). At the higher levels within the models, learners use rules of inquiry and begin to evaluate and integrate the views of experts.

Hofer's (2000) study had two purposes: (1) to assess the dimensions of personal epistemology across models, through the development of a new instrument; and (2) to examine whether learners recognize disciplinary differences in epistemological beliefs. Additional research questions were explored, such as the extent to which choice of academic major related to discipline-specific epistemological beliefs, gender differences, and the relation between grades and general and discipline-specific epistemological beliefs.
A total of 326 first-year college students enrolled in an introductory psychology course participated. Each participant was given a shortened version of the Schommer general epistemological beliefs questionnaire and two identical forms of a newly developed epistemological beliefs instrument designed to assess the four dimensions, the Discipline-Focused Epistemological Beliefs Questionnaire (DEBQ), with one form labeled "Psychology" and one labeled "Science" (Hofer, 2000). The new measure consisted of 27 items referring to the discipline as the frame of reference, to which learners responded using a 5-point Likert scale.

In order to examine the dimensionality of epistemological theories, exploratory factor analyses of the psychology and science DEBQ data were conducted, revealing four similar factors for both disciplines. In this factoring, certainty of knowledge and simplicity of knowledge did not emerge as separate dimensions and instead were representative of one cluster of beliefs about knowing (Hofer, 2000). Justification for knowledge and source of knowledge did appear as factors and appear to represent two distinct positions, but not the range Hofer (2000) had expected. Finally, an additional unexpected factor emerged related to the "attainment of truth."

With respect to the discipline differences research question, the study indicated significant differences in learners' beliefs about psychology and science. In other words, learners considered science knowledge to be more certain and unchanging, which suggests that first-year college students are capable of making epistemological distinctions. Additionally, for science, authority and expertise were viewed as the source of knowledge, and truth was perceived as being more attainable than for psychology (Hofer, 2000).

Hofer's (2004) qualitative, exploratory case study focused on the epistemology of instructional practices as interpreted by students in two versions of college chemistry,
general and organic chemistry, each with different underlying epistemological assumptions. Her study combined observations of classes and interviews with students in order to provide several sources of evidence and contribute to triangulation of the data. Hofer's qualitative study addressed epistemological issues using four dimensions within two clustered central areas: the nature of knowledge (what one believes knowledge is) and the nature of knowing (how one comes to know). The nature of knowledge cluster included the dimensions certainty of knowledge and simplicity of knowledge. The nature of knowing cluster included the dimensions source of knowledge and justification for knowing. The four dimensions as described by Hofer (2004) are discussed in the following paragraphs.

The dimension certainty of knowledge is the degree to which one views knowledge as certain (fixed) or more fluid. At lower levels, absolute truth exists with certainty, while at higher levels knowledge is tentative, evolving, and modified in interchange with peers. For simplicity of knowledge, at the lower levels knowledge is viewed as discrete, knowable facts, while at higher levels students see knowledge as contextual, contingent, and relative. This dimension describes a range of beliefs that moves from viewing knowledge as an accumulation of facts to seeing knowledge as highly interrelated concepts (Schommer, 1990, 1994). Source of knowledge refers to the locus of knowledge, perceived either as originating outside the self and residing in external authority or as actively constructed by students in interaction with the learning environment and peers (Baxter Magolda, 1992; Belenky et al., 1986). The dimension justification of knowledge involves how students evaluate knowledge claims, including the use of evidence, the use of authority and expertise, and their evaluation of experts. Students may justify their beliefs through authority, through observation, on the basis of what feels right, or through the evaluation of authority, evidence, and expertise with the
assessment and integration of the views of experts (King & Kitchener, 1994). At lower levels, students justify beliefs through observation or authority.

Nature of Science

Defining the Nature of Science

In the past, debates about the definition of the NOS have centered on epistemological and sociological questions. However, over the past ten years researchers have studied the aspects of the nature of science and recently agreed on its elements (McComas et al., 1998; Driver et al., 1996). The literature identifies several issues that characterize the NOS and define science as a discipline: 1) scientific knowledge is durable, yet tentative; 2) empirical evidence is used to support ideas in science; 3) social and historical factors play a role in the construction of scientific knowledge; 4) laws and theories play a central role in developing scientific knowledge, yet they have different functions; 5) accurate record keeping, peer review, and replication of experiments help to validate scientific ideas; 6) science is a creative endeavor; and 7) science and technology are not the same, but they impact each other (McComas, 2004; Lederman, 2004; Leach et al., 1996).

Students' Images of Science

Influences upon students' actions and learning during laboratory investigations include their personal images of science and of learning. Leach et al. (1998) use the phrase "images of science" to refer to the descriptions of the epistemology and sociology of science used by learners in specific contexts for specific purposes. Leach and colleagues' laboratory instruction study found that learners draw upon images of science to explain the purposes of empirical investigation, relationships between data and knowledge claims, and relationships between knowledge claims and experimental design, analysis, and interpretation of data. Three categories of learners' images of
science were determined. The first image of science classifies learners with a data-focused view, in which learners appear to view the process of data collection as a simple matter of describing the real world. The second image, used by other learners, involves a radical relativist view, in which learners appear to view the process of drawing conclusions as so problematic that it is never possible to select one explanation as being better than another. The final image, used by some learners, is a theory-and-data-linked view, in which data, theory, and methodological aspects of laboratory instruction are viewed as interrelated, each being able to influence the other. Other research supports the aforementioned view that learners develop a range of images about science rather than a cohesive view (Linn & Hsi, 2000; Bell & Linn, 2002). This perspective echoes Strike and Posner's (1992) belief that learners have complex cognitive images about science based on their varied experiences and sources of knowledge.

Student Understanding of the Nature of Science

Studies of learner understanding of the NOS tend to arrive at the same basic finding: learners need to experience cognitive dissonance in order to abandon prior conceptions of the NOS. When learners were presented with discrepant events, their notions of the NOS began to conform to professional scientists' understanding of the nature of science (Clough, 1997). Hogan (2000) suggests that researchers can gain a better understanding of how learners view the nature of science by dividing their knowledge into two categories: distal knowledge, how students understand formal scientific knowledge, and proximal knowledge, how learners understand their own personal beliefs and commitments in terms of science. In another study of learner understanding of the NOS, it was found that a learner's views depended greatly on moral and ethical issues, rather than on newly presented material (Zeidler, Walker, Ackett
& Simmons, 2002). Instead of changing their existing notions of the nature of science, learners tended to hang on to their prior understandings even when presented with conflicting information. Undergraduate science majors were found to change their conceptions of the NOS during a study that offered the learners many opportunities to discover conflicting information (Ryder et al., 1999). Therefore, it appears from the research that learners will change their conceptions of the NOS from naive to more sophisticated through exposure to discrepant information.

Measuring the Understanding of the Nature of Science

According to Lederman (1992), early research into learners' conceptions of the NOS consisted of forced-choice instruments that provided little insight into the conceptions underlying learners' responses. Many of the instruments used in studies regarding the NOS tended to be objective, pencil-and-paper assessments, which subsequently gave way to more descriptive instruments.

There are several studies of learners' images of science in the literature that are based upon the use of pencil-and-paper assessments and closed-response questions. In a recent study reported by Leach et al. (1998), the focus was upon the images of science that influence students' learning during laboratory activities. The implications from the study were that many learners do not recognize the epistemological basis of routine algorithmic procedures used for data analysis, and this can lead learners to take inappropriate actions; that learners are likely to view knowledge claims as emerging directly from the logical analysis of data and not recognize how particular theories and models assist in shaping scientists' ways of evaluating and interpreting data; and that some learners draw strong conclusions from empirical investigations based on inconclusive evidence.
According to a study by Lederman and Zeidler (1987), the NOS refers to the values and assumptions inherent to the development of scientific knowledge. In the study, these values and assumptions were identified with Rubba's (1977) six categories of the nature of scientific knowledge, explained in his nature of scientific knowledge scale. According to these categories, scientific knowledge is amoral, creative, developmental, parsimonious, testable, and unified. Learners' beliefs about how scientific knowledge fits into these categories reflect their understanding of the NOS.

In the 1990s, researchers argued that traditional paper-and-pencil assessments would not be adequate to fully explain what needs to be known about learner conceptions of the NOS (Carey et al., 1989; Carey & Smith, 1993; Lederman, Wade & Bell, 1998; Smith et al., 2000). Researchers responded by conducting interviews along with the questionnaires or by including several open-ended questions on the questionnaires in order to obtain more descriptive data. Another approach to probing learners' images of science, reported by Carey et al. (1989), is to pose questions about particular laboratory activities that the learners are carrying out. To assess learners' understanding of the NOS, Carey and colleagues (1989) developed the "Nature of Science" interview to probe for an abstract, definitional understanding of the key elements of the process of scientific inquiry. This instrument assesses learners' understanding of the nature of the following: science, scientific ideas, a hypothesis (prediction), and an experiment. Several versions of an instrument originally developed by Lederman, the Views of Nature of Science (VNOS), have been used mostly by researchers who focus on preservice teachers. Other instruments have been developed to be more descriptive in explaining learner achievement in the nature of science, such as Scientific Inquiry Capabilities and Scientific Discovery (Zachos, Hick, Doane & Sargent, 2000). Although the objective, pencil-and-paper assessments have been
altered to include more description of mechanisms, there is still a need for improved assessments regarding the nature of science.

Connections Between the Nature of Science and Epistemology

Hofer (2002) explains personal epistemology as dealing with questions such as "how do we know," as well as an individual's personal beliefs about the nature of knowledge. In similar fashion, NOS knowledge deals with learners' epistemological "values and beliefs inherent to scientific knowledge and its development" (Abd-El-Khalick et al., 2002). With the similarities in these two constructs, it is easy to accept that a relationship must exist between them. Since both constructs deal with beliefs about knowledge, one can place NOS as the science subcomponent of personal epistemology. Exposure to the features of the NOS can be useful in helping learners think about their epistemology. Examining the nature of science can supply characteristics that distinguish science from other ways of knowing and explicitly help learners examine their rationale in forming ideas.

Eliciting and Developing Students' Understanding of NOS

Instructors often overlook the importance of NOS instruction (Abd-El-Khalick et al., 1998; Bell et al., 2000). Recent thinking in NOS instruction is that it has to be targeted rather than relied on as a by-product of general science learning.

Abd-El-Khalick and Khishfe (2002) categorize the methods used to enhance learners' images of science into the following three categories: 1) historical, 2) implicit, and 3) explicit-reflective. Learners, like scientists, interpret new science experiences from a framework consisting of their experiences and prior knowledge.

The historical method suggests incorporating the history of science into science instruction to augment learners' views of the NOS. Contextualizing the NOS means integrating historical science examples that are tied to the fundamental concepts taught
in the science discipline. Using historical examples illustrates the challenges and complexities that scientists and the scientific community experience (Clough, 2006). However, according to Abd-El-Khalick and Khishfe (2002), two national studies produced conflicting results regarding the effectiveness of this method.

The implicit method suggests that learners will develop NOS conceptions simply by participating in inquiry-based activities (Lederman & Abd-El-Khalick, 1998; Abd-El-Khalick & Khishfe, 2002). This pedagogical approach relies on implicit NOS messages embedded within the activities. Research does not support the instructor view that planning inquiry laboratory activities that reflect NOS will result in students' noting the implicit messages (Lederman, 1992; Moss et al., 2001; Khishfe & Abd-El-Khalick, 2000).

The explicit method is needed to directly target NOS, while providing for reflective activities to enhance learners' understandings in an effort to develop coherent, overarching NOS frameworks (Abd-El-Khalick et al., 2000; Southerland et al., 2003). The essential role of explicit NOS instruction that draws learners' attention to specific NOS ideas is clearly identified in the literature (Bell et al., 1998; Lederman, 1998; Abd-El-Khalick & Lederman, 2000a; Clough, 2006). Explicit instruction is not didactic instruction, but a thoughtful process resulting in learners reflecting on NOS phenomena through class discussion embedded within instruction (Abd-El-Khalick, 2000). According to several studies, the best way to teach NOS concepts is through the use of an explicit, reflective instructional approach (Akerson et al., 2000; Lederman & Abd-El-Khalick, 2000; Khishfe & Lederman, 2005). In order for the instruction to be explicit, the instructor cannot rely on learners picking up the ideas on their own. Learners are dependent on explicit means of targeting NOS through activities, discussion, and writing. In order for instruction to be reflective, learners need to consider what they know
about a topic in order to change their minds and continue learning. These instructional methods require that the learners be made aware of how their conceptions vary from the scientific way of knowing (Settlage et al., 2003).

Research Methodology Issues

Even today, researchers struggle to find a means to assess NOS and personal epistemological beliefs. Most of the NOS and epistemological beliefs instruments that exist were developed from studies done in the 1950s and 1960s (Duell and Schommer-Aikins, 2001). In the process of studying models aimed at mapping the structure and the development of NOS and epistemological beliefs, researchers have created qualitative and quantitative measurement instruments, which range from interviews to task performances to paper-and-pencil questionnaires. The validity of the instrument used is an important consideration for weighing the results yielded by the studies, as the instruments themselves necessarily reflect a particular conceptualization of the construct, which consequently constrains the particular dimensions that emerge.

Researchers must follow the basics of assessment administration. The researcher should take great care when giving instructions to avoid influencing participants. If the initial instrument is presented along with other instruments, the order of the assessments should be counterbalanced. Any form of assessment, whether qualitative or quantitative, can be rendered invalid if it is not properly carried out or properly analyzed.
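As a concrete illustration of the counterbalancing advice above, the sketch below rotates participants through every possible presentation order of a set of instruments. The instrument names and the number of participants are placeholders chosen only for the example.

from itertools import cycle, permutations

# Placeholder instrument names; substitute whatever measures are administered.
instruments = ["epistemology questionnaire", "NOS questionnaire", "interview prompts"]

# Enumerate every presentation order, then assign participants to orders in
# rotation so that each order is used about equally often.
orders = cycle(permutations(instruments))
assignments = {f"participant_{i:02d}": next(orders) for i in range(1, 13)}

for participant, order in assignments.items():
    print(participant, "->", ", ".join(order))

With a larger set of instruments, a Latin square (each instrument appearing once in each serial position) is often used instead of the full set of permutations.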
The measuring instruments associated with general personal epistemological beliefs generally fall into two categories: uni- and multidimensional. The difference between the instruments lies in the assumed relationship among the different epistemological beliefs. Unidimensional theories consider epistemological beliefs to be mutually correlated, while multidimensional theories consider epistemological beliefs to be independent of one another, and thus free to vary. According to Schraw (2001), no attempt has been made to determine whether uni- or multidimensional theories are more accurate, although empirical findings currently provide more support for the multidimensional viewpoint. However, most researchers agree that using a variety of research methods and instruments in a fruitful and positive manner may further clarify and validate the measures.

The history of the development of assessments associated with NOS began in the early 1960s. The first assessments emphasized a quantitative approach (Lederman et al., 1998). With few exceptions, the instruments developed prior to 1980 allowed for easy grading and a quantified measure of learners' understanding.

Empirical studies of learners' beliefs about the nature and validation of knowledge can present particular barriers to researchers, as most NOS and personal epistemological beliefs are not directly apparent but suppressed from view. For instance, most learners do not discuss NOS and personal epistemological questions about knowledge and may have conflicting beliefs about knowledge and knowing, making it difficult to ask direct NOS or epistemological questions.

The following sections present a general overview of several instruments used over the past 30 years in assessing general personal epistemological beliefs, science epistemological beliefs, and NOS beliefs.

Personal Epistemological Beliefs Assessments

Perry and his colleagues created the Checklist of Educational Views (CLEV) to identify students on a continuum as dualistic or relativistic thinkers. The CLEV was administered to a random sample of 313 freshmen in 1954 and again to these same students a year and a half later. Subsequently, Perry and his colleagues conducted 366 interviews, which included 67 four-year recordings. Perry provided evidence for inter-rater reliability of the interviews as well as validity of the CLEV to assess students'
beliefs about knowledge (Perry, 1968/1999). Criticisms of Perry's work include that he worked with a male sample of students and that his sample was limited to an elite, private institution. Variability in school setting and subject gender would help to determine the degree to which instruction drives or hinders epistemological development.

Belenky et al. (1986) utilized the phenomenological approach with long, open-ended interviews that allowed the interviewer and participant to openly reflect upon their beliefs. This qualitative approach differed greatly from Perry's in that the technique developed into the theory, rather than the hypothesis driving the methodological approach. Interviews were conducted with 135 women from nine institutions ranging from coed adult education programs to private liberal arts colleges. Interviews were 2-5 hours in length, and all were recorded and transcribed into a 5,000-page report. The interviews took the form of a case study that allowed the subjects to "tell their whole story" without the researcher imposing any preconceived hypothesis onto the subject. Interview questions were broad in nature and open-ended, and subjects were encouraged to respond based upon their own points of view. Specific questions to assess Perry's nine positions also were embedded into the interviews.

Results from the interviews were coded by blind reviewers who attempted to classify the data into Perry's nine positions. It was found that these data, from women and more specifically women from diverse backgrounds, did not fit neatly into the Perry scheme (Duell & Schommer-Aikins, 2001). This led to the Women's Ways of Knowing model put forth by Belenky et al. The methods employed provide great insight into an individual's beliefs about knowledge and the social construction of those beliefs. However, conducting this type of interview is a long and arduous process that requires a skilled interviewer and ample time. Belenky et al. do not report evidence for reliability and
validity of the interview as a research instrument for assessing epistemological development (Duell & Schommer-Aikins, 2001).

Baxter Magolda developed the Measure of Epistemological Reflection (MER) to conduct her research. This instrument consists of a standardized, open-ended questionnaire interview and a standardized rating protocol. Questions in the instrument focus on beliefs as well as justifications for beliefs, specifically beliefs about the certainty of knowledge as well as the implications these beliefs have for decision making, what the role of the learner should be, what the role of peers should be in the learning process, what the role of the instructor should be, and what role evaluation plays in the learning process. The drawback to using this instrument is that interpretation is time-consuming and requires a knowledgeable rater.

King developed the Reflective Judgment interview to assess student beliefs about what can and cannot be known, how people come to know something, and the certainty of knowledge. Specifically, the interview identifies into which of the seven previously discussed stages an individual falls. The interview is composed of four ill-structured problems in the areas of physical science, social science, history, and biology that illustrate alternative or opposing conceptions of a dilemma. Each problem is based on an area of current interest with which the sample is likely to be familiar. For each problem, the subjects are asked probing questions that elicit an explanation and defense of their judgment about the issue. They also are asked to explain in what way they know their opinion is true. Subjects are encouraged to expand fully on their responses (Duell & Schommer-Aikins, 2001).

Inter-rater reliability of this instrument ranges from moderate to high and is also ensured by training and certification of the interviewers and scorers. The interview also
has fared well on validity measures. However, King and Kitchener caution that since no contextual support is offered to the participants during the interview, it may actually be measuring the individual's functional level, defined by Fischer and Pipp (1984) as a person's cognitive capacity when there is no available support, and thus may underestimate his or her ability to think reflectively. When contextual support is provided, individuals are able to perform closer to their upper limit, which is referred to as their optimal level. Fischer and Pipp (1984) refer to the space between the functional level and the optimal level as the developmental range (King & Kitchener, 2004).

Due to limitations of the Reflective Judgment interview, Kitchener, Wood, and Jensen (1999) developed a paper-and-pencil measure for the Reflective Judgment Model. This measure is composed of two components. The first focuses on the student's ability to differentiate between more and less sophisticated approaches to a dilemma. The second addresses the level of sophistication of the approaches that individuals see as similar to their own. Current reliability and validity measures appear to be similar to those of the Reflective Judgment interview (Duell & Schommer-Aikins, 2001).

Schommer developed a questionnaire to assess the five belief dimensions discussed in her theory. Subsets of items were created to assess beliefs in multiple ways and were written in both a positive and a negative valence for the following aspects: the certainty of knowledge, the relationship between hard work and success, the ability of individuals to learn how to learn, the innateness of learning ability, the speed with which learning takes place, the importance of effort, the value of multidisciplinary approaches, and the role of authority figures. The instrument is composed of 63 items to which subjects respond on a 5-point Likert scale. There is evidence to support the reliability, content validity, and predictive validity of the instrument.
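Because the questionnaire mixes positively and negatively worded items, scoring requires reverse-coding the negative-valence statements before subscale scores are computed. The sketch below shows the usual arithmetic on a 5-point scale; the item numbers and responses are hypothetical and do not correspond to the actual item key.

def reverse_code(rating: int, scale_max: int = 5) -> int:
    """Map a rating on a 1..scale_max scale to its mirror image (5 -> 1, 4 -> 2, ...)."""
    return scale_max + 1 - rating

def score_items(raw: dict[int, int], negative_items: set[int]) -> dict[int, int]:
    """Return item scores with negatively worded items reverse-coded."""
    return {item: reverse_code(r) if item in negative_items else r
            for item, r in raw.items()}

# Hypothetical responses to five items, where items 2 and 5 are negatively worded.
raw_responses = {1: 4, 2: 5, 3: 2, 4: 3, 5: 1}
print(score_items(raw_responses, negative_items={2, 5}))  # {1: 4, 2: 1, 3: 2, 4: 3, 5: 5}

After reverse-coding, items belonging to the same belief dimension are typically averaged so that higher scores consistently indicate the same end of the dimension.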
Schommer cautions that while this instrument is useful for identifying strengths in an individual's epistemology, additional instruments may be needed for a more penetrating view into specific dimensions of interest to the researcher (Duell & Schommer-Aikins, 2001).

Kuhn and her colleagues created a 15-item questionnaire to evaluate the Argumentative Reasoning Model. While acknowledging the value of qualitatively rich responses from long interviews, they believe this instrument to be practical for assessing epistemology across judgment domains and age groups. At the writing of this review, there is evidence of concurrent validity, but nothing has been reported on issues of reliability. This instrument is still a work in progress (Duell & Schommer-Aikins, 2001).

Personal Epistemological Beliefs in Science Assessments

If we want to understand whether our students are learning both process and scientific thinking, we need to find some way to probe the state of their personal epistemological beliefs about science. More important to the study of personal epistemological beliefs, however, there is evidence that learners' beliefs about the value of knowledge in a particular academic domain are related to their decision to pursue courses in that domain (Buehl & Alexander, 2004; Schommer et al., 2003). Early epistemological beliefs studies were guided by the assumption that beliefs were domain general. Domain-specific epistemological beliefs have become the focus of an emerging line of research.

In 1995, Schommer and Walker addressed domain specificity by testing the domain generality of personal epistemological beliefs across two academic domains: mathematics and social sciences. With the use of an instrument developed by Schommer (1990), two experiments were performed. In experiment one, participants were asked to complete a survey about personal epistemological beliefs while thinking about either mathematics (e.g., algebra and geometry) or social sciences (e.g., psychology and sociology). In the second experiment, two design changes were made:
the addition of domain reminders to the survey and the addition of a control group. Results indicated that the participants were able to keep a specific domain in mind while completing the survey. The majority of the participants demonstrated a consistent level of epistemological sophistication.

Epistemological assessments geared toward the domain of science include the Maryland Physics Expectation (MPEX) survey, the Views about Science Survey (VASS), the Colorado Learning Attitudes about Science Survey, and the Epistemological Beliefs about the Physical Sciences (EBAPS). Development of the aforementioned instruments incorporates aspects of the personal epistemological belief theories developed by Schommer (1990), with their multiple dimensions, and Hofer and Pintrich's (2002) view that personal epistemology includes learners' cognition and beliefs about the nature of learning, classrooms, domain-specific beliefs about disciplines, and beliefs about the self.

The Maryland Physics Expectation (MPEX) survey was developed in the 1990s by Redish, Saul, and Steinberg of the Maryland Physics Education Research Group (PERG) as part of a project to study the attitudes, beliefs, and expectations of students that have an effect on what they learn in an introductory calculus-based physics course. Students are asked to agree or disagree, on a five-point Likert scale from strongly agree to strongly disagree, with 34 statements about how they view physics and how they think about their work in their physics course. The focus of the survey was not on students' attitudes in general, such as their epistemologies or beliefs about the nature of science and scientific knowledge, but rather on their expectations. By expectations, the authors mean what students ask themselves: "What do I expect to have to do in order to succeed in this class?"

The MPEX items were validated with hours of interviews, listening to students talk about each item, how they interpreted it, and why they chose the answer they did. In
addition, the uniformity of the favorable MPEX responses was validated by offering the survey to a series of expert physics instructors and asking what answers they would want their students to give on each item (Redish, 1998).

A second survey on student beliefs toward science was developed by Ibrahim Halloun and David Hestenes (Halloun, 1996). The Views about Science Survey (VASS) comes in four forms: biology, chemistry, mathematics, and physics. The physics survey has 30 items, while the chemistry survey has 50 items. Each item offers two responses, and students respond to each item on an eight-point Likert scale. This eight-point scale has been found to confuse students, which affects the reliability and validity of the instrument. In addition to items that probe expectations, the survey includes items that attempt to probe a student's epistemological stance toward science. The VASS is designed to probe student characteristics on six attitudinal dimensions: three scientific (structure of scientific knowledge, methodology of science, and approximate validity of scientific results) and three cognitive (learnability, reflective thinking, and personal relevance).

According to Redish (2003), both the MPEX and the VASS suffer from the problem of probing what learners think they think rather than how they function. In addition, they have the problem that for many items the "answer the instructor wants" is reasonably clear, and learners might choose those answers even if that is not what they believe. In the Epistemological Beliefs Assessment for Physical Science (EBAPS), Elby et al. (1999; Redish, 2003) attempt to overcome the aforementioned problems by presenting several formats, including Likert-scale items, multiple-choice items, and "debate" items. Many EBAPS items attempt to provide context-based questions that ask students what they would do rather than what they think. The design of the EBAPS is similar to the multi-dimensional models of Schommer and Hofer discussed earlier. The
EBAPS contains 17 agree-disagree items on a five-point scale, six multiple-choice items, and seven debate items, for a total of 30 items. The EBAPS examines epistemological beliefs along the following five axes: (1) Structure of knowledge, (2) Nature of learning, (3) Real-life applicability, (4) Evolving knowledge, and (5) Source of ability to learn. The statistics of the EBAPS, the instrument chosen to assess personal epistemological beliefs in this study, are discussed further in chapter three.

Another way in which the EBAPS differs from the MPEX is by construction: the MPEX probes a combination of students' epistemological beliefs about knowledge and their expectations about their physics course. Redish et al. (1998) designed the MPEX to probe both epistemology and expectations, whereas the EBAPS was constructed to probe epistemology alone, to the extent that it can be teased apart from expectations.

The dimensions of the EBAPS are similar to those discussed by Schommer (1990) and Hofer (2004) in describing their multi-dimensional beliefs theories. For instance, the first EBAPS dimension, structure of knowledge, which probes students' beliefs concerning whether science is a coherent body of knowledge or a loose collection of perceived facts, parallels both Hofer's and Schommer's epistemological dimension of the simplicity of knowledge. By their definition, simple knowledge suggests a range of beliefs from knowledge as isolated, unambiguous bits to knowledge as highly interrelated concepts (Hofer & Pintrich, 1997).

The second dimension of the EBAPS, nature of knowing and learning, probes learners' views on whether learning science is propagated from authority or self-constructed. This dimension is similar to Hofer's and Schommer's dimension of the source of knowledge, which is further described as the locus of knowledge, ranging from knowledge acquired from authority figures to knowledge derived from empirical evidence and reason.

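Because the agree-disagree portion of the EBAPS, like the Schommer questionnaire described earlier, uses fixed-response items with both favorably and unfavorably worded statements, it may help to sketch how such responses are commonly aggregated into dimension scores before turning to the remaining axes. The sketch below is illustrative only: the item-to-axis assignments, the set of reverse-scored items, and the responses are invented for the example, and the published EBAPS applies its own weighted rubric to its Likert, multiple-choice, and debate items. The scoring actually used in this study is described in chapter three.

    # Illustrative only: aggregating Likert-type responses into dimension scores.
    # The item-to-axis mapping and the reverse-scored item set below are invented
    # for this sketch; they are not the published EBAPS (or NSKS) scoring rubric.
    from statistics import mean

    AXES = {
        "structure_of_knowledge":  [1, 6, 11],
        "nature_of_learning":      [2, 7, 12],
        "real_life_applicability": [3, 8, 13],
        "evolving_knowledge":      [4, 9, 14],
        "source_of_ability":       [5, 10, 15],
    }
    REVERSE_SCORED = {6, 8, 10, 12}  # hypothetical negatively worded items

    def item_score(item, raw, scale_max=5):
        """Return the item score, reverse-coding negatively worded items."""
        return (scale_max + 1 - raw) if item in REVERSE_SCORED else raw

    def axis_scores(responses):
        """Average the (reverse-coded where needed) item scores within each axis."""
        return {axis: mean(item_score(i, responses[i]) for i in items)
                for axis, items in AXES.items()}

    # One hypothetical participant's 1-5 responses, keyed by item number.
    participant = dict(enumerate([4, 3, 5, 2, 4, 2, 3, 1, 4, 2, 5, 3, 4, 3, 5], start=1))
    print(axis_scores(participant))

Under these assumptions, a higher mean on an axis simply reflects responses closer to the pole scored as sophisticated for that axis; a real analysis would also have to handle the multiple-choice and debate formats and any instrument-specific weighting.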
The third dimension of the EBAPS, real-life applicability, probes learners' beliefs concerning whether science is relevant to everyone's life or whether it is an exclusive concern of scientists. This dimension considers learners' views of the applicability of scientific knowledge as distinct from the learners' own desire to apply science to real life. Hofer's dimension of justification for knowing considers how individuals justify what they know and whether it is relevant.

The fourth dimension of the EBAPS, evolving knowledge, probes the extent to which learners' beliefs navigate between absolutism (thinking all scientific knowledge is set in stone) and extreme relativism (making no distinction between reasoning and mere opinion). In this dimension, the approximate validity of scientific results is probed by determining whether learners view scientific knowledge as approximate, tentative, and refutable rather than absolute, exact, and final. This dimension corresponds to the certainty of knowledge dimension discussed by Hofer and Schommer, which ranges from absolute to continually dynamic.

The final EBAPS dimension, source of ability to learn, probes learners' epistemological beliefs about the efficacy of hard work and good study strategies in learning science, as distinct from their self-confidence and other beliefs about themselves. In other words, it asks whether science is learnable by anyone willing to make the effort, not just by a few talented individuals. Schommer refers to this dimension as innate ability.

Nature of Science Assessments

In general, NOS refers to the epistemology of science, science as a way of knowing, or the values and beliefs inherent to the development of scientific knowledge (Lederman, 1992). NOS traditionally has been treated as a declarative knowledge outcome and measured by objective instruments, as discussed earlier. Although the
validity of the assessment instruments described below has been criticized, they are presented here as the most valid attempts to assess understandings of the NOS (Lederman, et al., 2002; Lederman, et al., 1998).

Cooley and Klopfer's (1961) Test on Understanding Science (TOUS) is used as one of a series of tests. Some researchers criticize the TOUS; one criticism is that a few of its items do not relate to a learner's conception of scientific knowledge and are more relevant to the institution of science and the profession of scientists (Lederman, et al., 1998). In addition, some argue that the TOUS loads strongly on a verbal factor and that the difficulty of some items decreases their meaning for students. Nevertheless, Lederman, et al. (1998) suggest that the TOUS is an excellent initial assessment tool for those interested in assessing understandings of the NOS.

The Nature of Science Scale (NOSS) developed by Kimball (1967-1968) is used to determine whether or not science instructors have the same view of science as scientists. Kimball's validation samples included scientists, science teachers, philosophy majors, and science majors. A criticism of the NOSS is that its development and validation with a sample of college graduates make it inappropriate for high school populations.

The Science Understanding Measure (SUM), based on the TOUS, was developed by Coxhead and Whitefield (1975). The purpose of the SUM is the informative and diagnostic analysis of groups of students in the 11 to 14 age range. The SUM involves five areas: scientists as people, science and society, the role and nature of experiments, theories and models in science, and the unity and interrelatedness of the sciences.

Rubba and Anderson (1978) developed the Nature of Scientific Knowledge Scale (NSKS) to assess secondary students' understanding of the nature of scientific knowledge in relation to their science epistemological beliefs. The NSKS's six subscales
are amoral, creative, developmental, parsimonious, testable, and unified. Although the NSKS has drawn little criticism from other researchers, it does possess potentially significant wording problems (Lederman, 1998). For example, there are some pairs of statements that differ only in that one is stated in the positive and the other in the negative. This redundancy could encourage participants to check their answers on previous items when they read similarly worded items later in the questionnaire, which could affect reliability estimates (an illustrative internal-consistency calculation is sketched below). However, the NSKS is considered to be a valid and reliable measure of NOS by virtue of its focus on one or more ideas that have traditionally been considered under the label of NOS (Lederman, et al., 1998). This instrument was used in this study to further assess students' epistemological beliefs concerning the nature of scientific knowledge. The statistics of the NSKS will be discussed in further detail in chapter three.

The Views on Science-Technology-Society (VOSTS), developed by Aikenhead and Ryan (1992), is an instrument dealing with STS topics. The content of the VOSTS statements is defined by the domain of STS content appropriate for high school students. The VOSTS conceptual scheme included science and technology, the influence of society on science/technology, the influence of science/technology on society, the influence of school science on society, characteristics of scientists, the social construction of scientific knowledge, the social construction of technology, and the nature of scientific knowledge.

For the past decade, interviews and other qualitative methodologies have been more widely used to assess students' knowledge about NOS. Some researchers have become aware of the importance of using qualitative methodologies to determine how students interpret the language of items as well as how researchers interpret students' written language (Lederman & O'Malley, 1990).

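The reliability concerns raised above for the NSKS, whose similarly worded positive and negative statement pairs could inflate internal-consistency estimates, are usually quantified with Cronbach's alpha. The sketch below shows how such an estimate is computed for a Likert-type scale; the response matrix is invented for illustration and is not data from this study, whose reliability results are reported in chapter three.

    # Minimal sketch of an internal-consistency (Cronbach's alpha) estimate for a
    # Likert-type scale. The response matrix below is invented for illustration.
    import numpy as np

    def cronbach_alpha(scores):
        """scores: participants x items matrix of Likert responses."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                          # number of items
        item_vars = scores.var(axis=0, ddof=1)       # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses: 6 participants x 4 items on a 1-5 scale.
    responses = np.array([[4, 5, 4, 4],
                          [3, 3, 2, 3],
                          [5, 5, 4, 5],
                          [2, 3, 3, 2],
                          [4, 4, 5, 4],
                          [3, 2, 3, 3]])
    print(round(cronbach_alpha(responses), 2))

Values near 1 indicate strongly covarying items; redundant item pairs of the kind noted for the NSKS can push the estimate upward without necessarily reflecting more consistent beliefs, which is one reason interview triangulation is valuable.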
Lederman, et al. (2002) developed a new open-ended instrument, the Views of Nature of Science Questionnaire (VNOS), which in combination with individual semi-structured interviews seeks to provide a meaningful assessment of learners' NOS views. The VNOS has three versions, all of which are open-ended. The most frequently used versions are the VNOS–B, with seven items, and the VNOS–C, with ten items. Each instrument aims to elucidate students' views about several aspects of the "nature of science" (NOS). These NOS aspects include the following: (1) Empirical NOS; (2) Tentative NOS; (3) Inferential NOS; (4) Creative NOS; (5) Theory-laden NOS; (6) Social and cultural NOS; (7) Myth of the "Scientific Method"; and (8) Nature of, and distinction between, scientific theories and laws.

Lederman, et al. (2002) suggest that the VNOS–B and the VNOS–C be administered under controlled conditions (e.g., a classroom setting) and with sufficient time. The authors suggest that the instruments not be used for summative assessments (i.e., final determination of student conceptions or views) and that users inform the students that there are no right or wrong answers. The researchers strongly recommend that administration of the VNOS be followed with individual interviews to ensure the validity of the instrument.

The VNOS–B was tested for construct validity. It was administered to two groups of nine participants each: a novice group and an expert group. Analysis of the interviews identified clear differences between the expert and novice responses regarding NOS. The instrument was further modified and expanded for the VNOS–C. A panel of five experts examined the items for content validity, and the items were modified accordingly. Profile comparisons indicated that interpretations of participants' views as explained on the VNOS–C were similar to those expressed by participants during individual interviews (Lederman, et al., 2002).

Many researchers focus on assessment of students' conceptions of the NOS. The question is how knowledge about NOS helps students learn science and why NOS should be a goal of science instruction. Driver, et al. (1996) answered this question by suggesting five arguments supporting the inclusion of the NOS in the science curriculum: understanding the NOS helps students make sense of science and manage the technological objects and processes they encounter; make sense of socio-scientific issues and participate in decision-making processes; appreciate science as a major element of contemporary culture; understand the norms of the scientific community, which embody moral commitments; and learn science content more successfully.

Indeed, evidence suggests that knowledge of the NOS assists students in learning science content, enhances understanding of science, enhances interest in science, enhances decision making, and enhances instructional delivery (McComas, Almazroa, & Clough, 1998). For example, Songer and Linn (1991) found that students with dynamic views of science acquired a more integrated understanding of thermodynamics than those with static views. The dynamic view of science means that scientific knowledge is tentative, whereas the static view means that science is a group of facts that are best memorized.

Applicability to College Science Education

Epistemological Orientations in the Sciences

As learners go through college, they undergo a developmental progression in their attitudes toward knowing, learning, and teaching. The seven epistemological models described in this review, developed by Perry (1970), Belenky et al. (1986), King and Kitchener (2002), Baxter Magolda (2002), Kuhn (1991), Schommer-Aikins (1990), and Hofer and Pintrich (1997), outline the course of this progression. The models differ somewhat, but they paint a more or less coherent image of epistemological progression.

Doing science depends on mature habits of mind, such as questioning assumptions and not taking information at face value. A learner with developed epistemological beliefs in science knows how to evaluate controversies and to accept the existence of uncertainty. Real science is all about testing accepted knowledge and challenging authority, accepting the inescapability of uncertainty and vagueness, and then, in due course, committing to theories and models based on the best available evidence while acknowledging that those theories and models will eventually have to be revised or rejected as better evidence emerges. Unfortunately, despite significant progress in science curriculum reform in recent years, many courses are still taught in what some identify as a "dualistic mode," emphasizing facts and well-established principles and procedures and not introducing multiplicity until the learner's junior or senior year through the use of case studies or the involvement of the learner in research or design experiences.

Many learners enter college at the level of absolute knowing (Baxter Magolda, 1992), believing that knowledge is certain, that authorities have the knowledge and the responsibility to communicate it, and that the learner's job is to absorb and repeat it. As they experience their college courses and extracurricular activities, the learners may
progress through some or all of several successive stages in which they gradually relinquish their belief in the certainty of knowledge and the omniscience of authorities. They recognize the need to make judgments based on evidence and become increasingly skilled at gathering and analyzing that evidence. Science majors at the level of absolute knowing view science as a collection of known facts. According to Palmer and Marra (2004), these students have trouble understanding the instructor's use of evidence as the basis of judgments or decisions and are essentially incapable of gathering and using evidence for their own judgments.

An extensive research base supports the reflective judgment model (King & Kitchener, 2002, 2004) and records the progression in levels of college students from the freshman to the senior year. The data closely match the previously cited studies of science and engineering students based on the Perry model. On average, learners enter college at the level of pre-reflective thinking (dualism), basing their judgments on unconfirmed beliefs and the declarations of authorities, and leave at the quasi-reflective thinking level (multiplicity), beginning to seek and use evidence to support their judgments. Studies indicate very few graduates reach the level of reflective thinking (contextual relativism). Research using the King-Kitchener model found that only advanced doctoral students were consistently able to reason reflectively (Felder & Brent, 2004).

Later studies of epistemological development on the Perry scale have reached less gratifying conclusions. In particular, most learners majoring in science are found to be between Levels 2.5 and 3.5, and less than one-third make it as far as Level 5 (Pavelich & Moore, 1996; Wise, Lee, Litzinger, Marra, & Palmer, 2004). Studies by Jehng, Johnson, and Anderson (1993) and Paulsen and Wells (1998) show that learners in science are more likely than learners in the social sciences and humanities to believe in the certainty of
knowledge and in authority as its source. However, those in the field of science would view those beliefs as mistaken.

Science majors at the level of transitional knowing have begun to view science as a set of theories and facts with exceptions (Palmer & Marra, 2004). Learners in the impersonal pattern take comfort in the objective nature of science and are bewildered if this view is contradicted by their instructor. Many learners in the interpersonal pattern turn away from science, switching to the arts or humanities, because they begin to view it as cold, inhuman, dogmatic, manipulative, and the enemy of subjective knowing (Felder & Brent, 2005).

There are two patterns of development described in the epistemological models, one more characteristic of women than of men and the other more characteristic of men than of women, but contextual knowing is the endpoint of both patterns. The contextual mindset of learners at the stage of contextual knowing influences how these individuals view science. At Baxter Magolda's (1992) earlier levels, science is seen as a collection of objective facts that are either known and understood now or will be known and understood eventually if the correct investigation procedures are followed (Palmer & Marra, 2004). Contextual knowers, in contrast, correctly view science as a collection of approximate models of reality that the scientist must play a part in constructing. These learners' skepticism and willingness to challenge what is currently known and to question the assumptions at the core of all claims, their tolerance of vagueness, their receptiveness to using both logic and intuition in their investigations, and their unwillingness to transfer judgments made in one context to another context without critical evaluation could define a first-rate scientist.

It is clear that instructional programs wishing to prepare graduates to be expert scientists should be designed to promote the epistemological development of their students. Unfortunately, many science courses emphasize facts and well-established
procedures and do not routinely call on learners to confront the uncertainty of knowledge and the need to make evidence-based judgments in the face of that uncertainty. The result is that most learners graduating from college do not progress much beyond the epistemological level at which they entered.

Assessing Epistemological Levels in the Classroom

Numerous instruments have been developed to measure epistemological beliefs. These instruments, as discussed earlier, fall into two types: uni- and multidimensional. Educators may want to consider the following questions in order to select the measurement tool most appropriate for evaluating their own learners in a classroom setting. First, consider the issues of age, ethnicity, and gender of the participants to be assessed. According to Duell and Schommer-Aikins (2001), four conceptual issues the educator may want to take into account in choosing an instrument are: (1) Is the theory behind the instrument credible? (2) Does the instrument measure the epistemological dimension(s) relevant to the educator's goals? (3) Is the educator comfortable with the format of the instrument? and (4) Among the instruments, which one has the strongest evidence of reliability and validity?

Initial epistemological beliefs measurement methods involved conducting and transcribing open-ended interviews and using trained raters to assign levels to the interviewees. Interview transcription and analysis remains the most reliable and valid approach to assessment, but the difficulty and expense of this approach have motivated efforts to design questionnaires and multiple-choice instruments that can be administered inexpensively to large numbers of learners.

Alternatives to interviews, in which learners write essays on topics derived from the interview protocols, include the Measure of Intellectual Development (MID) for the Perry model (Pavelich & Moore, 1996) and the Measure of
Epistemological Reflection (MER) for the Baxter Magolda model (Baxter Magolda, 1992). Likert-scale instruments that assess learner levels on the Perry and the King and Kitchener models, respectively, include the Learning Environment Preferences (LEP) questionnaire and the Reflective Thinking Appraisal (Felder & Brent, 2004). Although these assessments have the desired advantages of low cost and ease of administration, the ratings obtained using them tend to be one or two levels lower than those obtained with interviews and correlate moderately at best with the interview ratings.

The instrument used to collect the data should be reliable (consistent results are obtained in repeated assessments) and valid (the instrument measures what it is intended to measure). The validity and reliability of epistemological development assessments are critically important if the results are to be used to design balanced instruction that addresses the needs of all learners. Reliability and validity data are readily available for some of the instruments discussed, while for others they are difficult to find (Felder & Brent, 2005).

Promoting Epistemological Growth

Promoting epistemological growth requires challenging learners' beliefs about the nature of knowledge, the role of authorities, and the procedures that should be used to make judgments. This requirement poses a problem for instructors. In most college science classes, learners are likely to be found at all levels of epistemological development, from absolute knowing through contextual knowing. Instruction that might be ideally suited to learners at one level could be ineffective or counterproductive for learners at another.

One of the key principles for promoting epistemological growth is effective instruction. The instructor needs to consider the learner's epistemological beliefs and how he or she learns. Some instructors teach without having much formal knowledge of
how learners learn. The instructor's role is primarily that of a facilitator or coach, encouraging the learners to achieve the target attitudes and skills and providing constructive feedback.

It may not be enough simply to help learners reflect on their epistemological beliefs. The learning environment may also need to be changed so that learners are required to engage in constructivist learning behaviors that may then influence their epistemological beliefs. In particular, assessment is a key factor in determining an individual's learning behavior and beliefs about learning in particular contexts. Assessments need to focus on the development of understanding and the application of theory to personal situations and experiences rather than on the reproduction of facts. However, over-assessment can reduce learners' motivation to understand concepts and encourage them to rote-learn material.

Instructional conditions should provide the student with the challenge, reflection, and support needed to promote epistemological development. Recommendations for classroom environments that enhance development across epistemological positions have included encouraging learner questions and comments, instructor recognition of learner reactions, and increased emphasis on learner participation (Baxter Magolda, 1987). This development may be fostered by curricular methods that validate the learner as a knower, situate learning within the learners' experience, and create chances for learners to construct meaning with others (Hofer, 2001). King and Kitchener (2002) suggest providing opportunities for learners to discuss and analyze ill-structured problems, teaching the skills of gathering and evaluating data, engaging learners in the discussion of controversial issues, and assisting them in examining their assumptions about knowledge and how it is gained. In addition, instructors need to show respect for
learners' beliefs regardless of developmental level and to provide feedback and support on both a cognitive and an affective level.

Many learners have difficulties learning within the conventional structure of a general chemistry course. Chemistry is traditionally taught in two specific settings, the lecture hall and the laboratory. Traditional pedagogy leaves little room for doing anything but moving quickly digested information from textbooks to testing. There are few protective measures in traditional pedagogy to examine whether actual learning takes place, unless one assumes that correct responses to exam questions indicate learner understanding (Coppola & Jacobs, 2001). Furthermore, traditional laboratory activities are not actual inquiry experiments; instead, they verify observations that have been known and repeated hundreds of times. Although many instructors have experimented with promising pedagogical techniques in the classroom or laboratory, few have treated this work with the same level of respect that they give their research.

The literature on pedagogical instruction in science suggests six pedagogical applications that may provide the balance of challenge, reflection, and support needed to promote epistemological growth and a deep approach to learning (Bruning, et al., 2004; Felder & Brent, 2004; Louca, Elby, Hammer, & Kagey, 2004; NRC, 1999; NRC, 1997; Palmer & Marra, 2004; Prince, 2004; Smith, Sheppard, Johnson, & Johnson, 2005). The pedagogical applications are listed in Table 2, and Figure 3 provides a general overview of the pedagogical applications that facilitate epistemological growth in the classroom. The remainder of the review discusses these applications and offers suggestions for implementing them.

Table 2

Pedagogical Applications that Facilitate Epistemological Growth

1. Learning Tasks – Variety and Choice
2. Expectations – Communicating and Explaining
3. Modeling and Practice
4. Constructive Feedback
5. Learner-Centered Environment
6. Respect for Student Development

Figure 3. Summary of pedagogical applications that facilitate epistemological growth. (The figure is not reproduced here; it maps the pedagogical conditions to strategies such as varying problem types and task levels, allowing choices on tasks, instructional objectives, study guides, exams, inductive learning, active learning, cooperative learning, constructive feedback, and modeling and practice.)

Learning Tasks – Variety and Choice

The use of a variety of instructional tasks is key to promoting learning. Assigning a variety of learning tasks is the only way to ensure that all learners are confronted with tasks far enough above their current developmental level to challenge them but not so far above it that they become discouraged. Variety and choice enable instructors both to challenge the learners' epistemological beliefs and to ensure that learners are confronted with tasks that require a deep approach to learning (Chin & Brown, 2000; Clow, 1998).

In selecting a task that encourages learners to employ a deep approach to learning, a number of factors should be considered. According to Clow (1998), several studies identified the following key factors that facilitate a deep approach:

1. The activity should be perceived by the learners as interesting and relevant.
2. Learners should have autonomy over learning and study methods.
3. If the workload is excessive, learners will resort to a surface approach.
4. The task should not increase the anxiety of the learner.
5. Learners should not feel threatened by the task in any way.
6. Learners should be actively involved in the task.
7. Learners should interact with each other, as peer learning can be very powerful.
8. Learners should have, and take, time to reflect on the task afterwards. They need to consider what they have learned, how they learned it, and how it fits with their prior knowledge.
9. The context of the task should be relevant to the subject material.
10. Learners should have some choice over learning tasks and how the tasks are assessed.

There are several ways to provide variety and choice in learning tasks. The first is to offer a variety of high-level problems. Science problems come in a wide range of types, such as closed-ended problems with one correct solution, open-ended problems with multiple solutions, theoretical problems, and applied problems, while others call for library research, problem formulation, or critical thinking. For example, provide learners with data from a real or hypothetical experiment, such as salt on a roadway retarding ice formation, and call on learners to explain the results in terms of the course concepts. Other tasks (Garratt, 1998) might be based on the interpretation of a graph or figure, the creation of a concept map, or a short thought-provoking question such as: "Consider several beakers of tap or pure water at different temperatures. How do their pH values compare? Explain."

In order to promote a deep approach to learning, assign high-level problems that the learner perceives as relevant to the subject matter. In addition, have some of the problems relate to the learners' backgrounds, career goals, concerns, and interests by using socioscientific issues such as environmental science, the genome project, and alternative fuels (Sadler, Chamber, & Zeidler, 2002; Zeidler, 1984).

Provide learners with some choice over the task by allowing them to select from alternative tasks or alternative problems on homework and exams and to decide how some tasks will be graded. Providing some choice helps minimize how often learners are forced to work at levels too high or too low for their level of development (Felder & Brent, 2005).

Expectations – Communicating and Explaining

There are numerous reasons why learners find chemistry difficult to learn. For instance, when we instruct, we make assumptions about what our students know (Garratt,
1998), but we rarely analyze those assumptions in detail for ourselves. Often the assumptions we make are wrong, as we may not know what the students were supposed to learn in their previous courses, and students may think they know more than they do. Learners are helped to overcome their problems with learning (misconceptions) if they have a clear understanding of what is expected of them, what goals we set for them, and what goals they set for themselves.

Course objectives are broad statements reflecting general course goals and outcomes, while learning objectives are targeted statements about expected learner performance. Usually, learning objectives are competency-based, as they designate exactly what learners need to do to demonstrate mastery of course material. Therefore, learning objectives should be stated in terms of learner outcomes. Instructional objectives should be brief, clear, specific statements of what learners will be able to do at the conclusion of the task.

According to Felder and Brent (2004), instructional objectives are statements of observable behaviors that demonstrate learners' abilities, attitudes, knowledge, and understanding. Instructional objectives have two parts: an action verb and a content area. Use the action verb to specify the desired learner performance, followed by a specific description of the course-specific goal. Instructional objectives assist in maintaining a learner-centered emphasis and usually take one of the following forms: "The learner will be able to..." or "On the next exam, the learner may be called upon to..." The action verb may involve a range of skills or cognitive processes at various levels of thinking, such as define, calculate, outline, list, predict, compare and contrast, design, and model. It is important to examine various levels of cognitive understanding. Bloom's (1956) taxonomy of educational objectives breaks down the cognitive domain into six levels. Levels 1-3, known as lower-level skills, include
knowledge, comprehension, and application, while levels 4-6, identified as higher-level skills, are analysis, synthesis, and evaluation. The best way to promote the development of higher-level skills is to include high-level tasks in the instructional objectives. Learners learn more effectively when they know what they are working towards.

Learners value and expect transparency in the way their knowledge will be assessed. Therefore, write instructional objectives that include both knowledge of content and mastery of the skills you wish the learners to develop. Felder and Brent (2005) suggest including some higher-level problem-solving skills (e.g., analysis, critical thinking) and process skills (e.g., oral communication, teamwork). Make the objectives as detailed and specific as possible, list all the different tasks the learner will be expected to do, and make course tasks, homework, and exams consistent with the objectives. Students wish to see clear relationships between lectures, laboratory activities, and learning tasks and what they are expected to demonstrate they know and can do. The instructional objectives can be valuable if they are shared with the learners in the form of study guides, as they reveal to learners what they are responsible for on the exam. When learners have a clearer understanding of what is expected of them, that clarity leads to a greater chance of better learner performance (NRC, 1999).

Modeling and Practice

Learners acquire skills most effectively through practice and modeling. No matter how often learners see a skill demonstrated, they rarely master it until they have practiced it repeatedly and received feedback on how to improve. In other words, the only way a skill is developed is by trying something, seeing how well or poorly it works, reflecting on how to do it differently, and then trying it again and seeing if it works better. Effective modeling and practice in instruction can challenge the learner's beliefs and promote epistemological growth.

One of the least effective methods of modeling thinking and problem solving used in traditional instruction is to transcribe fully worked-out problems on the board, on a projector, or in a PowerPoint show. Instead, give students incompletely specified problems and have them itemize what they know and what they need to know, and then decide how they will determine the unknowns. Ask students to make up problems related to the course content that require high-level skills.

Reform movements in chemistry education have sought to engage learners by promoting active learning and providing contemporary situations that illustrate abstract concepts inside and outside the classroom. Introducing computers to a course can often result in a boost to students' learning. Interactive technologies (e.g., Blackboard, WebCT, and the World Wide Web) remotely deliver animations, on-line quizzes, simulations, and tutorials at a time and pace dictated by the learner. More significantly, the learners can have these experiences whenever and wherever they wish (Clow, 1998).

Give the learners something to do in class instead of having them listen passively. For instance, at several points during a 50-minute class, ask the students to answer a question, sketch a concept map, solve part of a problem, or interpret an observation, first individually and then in groups of three or four, for 30 seconds to two minutes. After the activity, call on a few individuals for responses before opening the floor to volunteers.

Problem-solving skills and speed in problem solving are developed through practice and feedback. Learners need to be given sufficient experience working with mathematical and scientific models. According to Taber (2000), this means that problem sets have to be structured to ensure that learners are able both to achieve success and to develop their skills by applying scientific principles in higher-level tasks and contexts.

King and Kitchener (2002) describe an ill-defined problem (e.g., global warming, the ozone layer) as one that has more than one acceptable solution, while a well-defined problem has only one correct answer (e.g., solving a quadratic equation). To understand science as it is practiced, rather than only solving problems from a textbook, the learner needs to engage in problem posing. After posing a problem, learners need to experience open-ended problem solving in the classroom or laboratory setting. Real scientific problems do not have answers in the back of the textbook.

Research on problem solving has received a great deal of attention. Although several models have emerged, most are quite similar and can be summarized as a five-stage sequence: (1) identifying the problem, (2) representing the problem, (3) selecting an appropriate strategy, (4) implementing the strategy, and (5) evaluating the solutions (Bruning, et al., 2004).

Obstacles to effective problem solving can be eliminated by increasing the occurrence of this type of learning through practice. Learners who persist in trying different approaches, even those that do not result in a final solution, are practicing problem solving. Five conditions discussed by Farmer, Farrell, and Lehman (1991) that enhance problem solving include: (1) the problem must be a problem, an obstacle, to the learner; (2) the learner must have a clearly defined, attainable goal; (3) the relevant prerequisite rules and concepts must be recalled by the learner; (4) there must be cues to help the learner recall rules and approaches; and (5) the instructor must stress the nature and expectations of the task.

Perform demonstrations and have the learners predict the outcomes beforehand. The best demonstrations generate incorrect predictions resulting from misconceptions. Giving learners evidence that their mental pictures may be wrong can
promote cognitive dissonance and demystify authority, leading to epistemological change (Felder & Brent, 2005).

Provide visual illustrations, as most learners get a great deal more out of visual information than verbal information. Show pictures, sketches, concept maps, and computer simulations of course-related material. Take the class to the local wastewater treatment plant and point out the chemistry of the system (e.g., acidity, alkalinity, chlorine chemistry, pH levels, and stoichiometry). Instructors should give repeated practice in high-level tasks in class and as homework before including these tasks on assessments such as exams. The more we challenge learners to assess their own knowledge and skills accurately, the more confident they will become as learners.

However, challenge alone is not sufficient. Without appropriate support to help learners deal with the changes they are being called upon to make, they may decide to stay at their current level or even retreat to a lower developmental level. Letting go of fundamental and firmly held beliefs, even beliefs about learning, is one of the hardest tasks faced by students. College science instructors frequently adopt a sink-or-swim mentality, teaching at a high level and forcing the learners to either adapt or drop out. However, a more effective approach is to include modeling of epistemological ways of thinking.

Modeling, also referred to as monitoring, is the metacognitive process of keeping track of, regulating, and controlling a mental process, considering past, present, and planned mental actions. Ask learners to pause and reflect on present learning (e.g., Why am I doing this?) and past learning (e.g., What did you learn?) to deepen their problem-solving approach and improve understanding.

Therefore, it is essential that learners develop self-reflection skills and suitable beliefs about learning and knowledge, not only for their own sake but because these
skills and views may be related to improvements in their conceptual understanding. Tremendous growth has occurred in research about learning and the role that epistemological reflection plays in the learner's construction of knowledge and beliefs. Researchers recognize that learners' beliefs about the nature of knowledge and learning play an important role in their success as well as in their ability to reflect on how they learn.

Reflection promotes knowledge integration and refers to both metacognition and sense making. Reflection provides a method for fostering knowledge integration by helping learners to expand their repertoire of ideas, differentiate among them, and make connections between them. The process of reflection may help learners identify weaknesses in their current understanding and thus motivate them to revisit, test, and reformulate the links and connections among their ideas, leading to a more coherent, robust, and integrated understanding.

Constructive Feedback

Learners in any classroom cover a range of levels of epistemological development. Studies have shown that learners' intellectual development can be strongly influenced by their affective states. Zusho, Pintrich, and Coppola (2003) believe emotion drives a learner's attention, which in turn drives learning and memory. Learners who are depressed or angry may not take in and process information effectively. Furthermore, an accepting and supportive classroom atmosphere has been found to enhance both academic and intellectual development. Studies that support findings related to social and cultural influences have been important in offering instructors pedagogical recommendations to facilitate epistemological growth in their students (Felder & Brent, 2004; Wolters & Pintrich, 1998).

Providing appropriate feedback is essential if learners are to remain motivated. A feature of effective feedback is that it will improve the learner's confidence, not only in
the quality of the work being produced but also in their ability to progress. Instructors should seek to respond positively to learner answers to questions or contributions to discussion by picking out those aspects which can be treated as partially correct and leading the discussion towards a better response. For example, when learners share uninformed opinions during class discussions, the instructor can demonstrate effective and respectful ways to challenge erroneous assumptions or misconceptions. An important benefit of using positive feedback is that it often leads to deeper learning.

Learner-Centered Environment

According to constructivist models, learning is not a spectator sport. Researchers believe the most identifiable goal of epistemological growth is a decreasing reliance on authority for all the answers. Numerous studies suggest that a learner-centered environment can accomplish this goal (Hammer & Elby, 2003; Herron & Nurrenbern, 1999; Hogan, 1999; NRC, 1999). This is achieved by involving learners, individually and in groups, in learning tasks that require them to take more responsibility for their learning than the traditional approach requires.

Studies from the National Research Council (NRC, 1997) have reported that learner-centered environments are an essential element of a quality learning experience. Learner-centered environments are defined by the NRC as "environments that pay careful attention to the knowledge, skills, attitudes, and beliefs that learners bring to the educational setting." The learner-centered approach places more responsibility on the learner by expecting her or him to come to class prepared and ready to work at the challenging task of refining conceptual understanding and problem-solving skills. In a study performed by Nolen (2003), the classroom learning environment was a significant predictor of both satisfaction and achievement in science.

Currently, the most relevant instructional implication of constructivist epistemology is that pedagogical strategies that facilitate the construction of knowledge and are learner-centered should be favored over those that do not (Smith, Sheppard, Johnson, & Johnson, 2005). Students learn by using auditory, kinesthetic, and visual approaches (Bunce, 2001). Many pedagogical strategies that foster, encourage, and facilitate the construction of knowledge using these approaches have emerged over the years, such as active learning, case-based learning, cooperative or collaborative learning, hands-on learning, and inductive learning. All of these strategies attempt to create an environment where learners are actively thinking and applying knowledge, as opposed to passively listening to an instructor present the material.

J. W. Layman (1996) explains how classroom instruction can change as the instructor and learner move from instructor-centered pedagogy to learner-centered pedagogy:

"The previously dominant view of instruction as direct transfer of knowledge from instructor to student does not fit the current perspective... The present view places the learner's constructive mental activity at the heart of all instructional exchanges... This does not mean that students are left to discover everything for themselves, nor that what they discover and how they choose to describe and account for it are left solely to them. Instruction must provide experiences and information from which learners can build new knowledge. Instruction helps to focus those processes so that the resulting knowledge is both valid and powerful. Valid in the sense of describing the world well... and powerful in the sense of being useful and reliable for those students in many diverse settings."

Inductive learning is based on the claim that knowledge is built primarily from a learner's prior learning experiences and interactions. Inductive learning is an effective method for motivating students to learn a topic and for addressing instructional expectations (Felder & Brent, 2004). Inductive learning approaches such as guided inquiry, problem-based learning, and the case study method have learners confront problems before they are given all the concepts needed to solve them (Bruning, et al., 2004; DiPasquale, Mason, & Kolkhorst, 2003; O'Sullivan & Copper, 2003; Leonard, 2000).

The instructor using the inductive learning approach begins by exposing learners to concrete instances of a concept. An effective way to motivate learners when using this method is for the instructor to tell the learners up front what the material has to do with their everyday lives. Subsequently, learners are encouraged to observe patterns, raise questions, and make generalizations from their observations. This approach can push learners toward independence and the ability to relinquish their misconceptions.

Active learning is instruction that engages learners in any course-related activity other than passively watching and listening to a lecture. This in-class instruction involves learners working individually or in small groups on tasks related to the instructional objectives, such as answering questions, brainstorming, formulating questions, solving short problems, or troubleshooting (Felder & Brent, 2004). The idea behind active learning is that learners acquire skills through active practice and feedback. Therefore, the more practice they get at engaging in an activity, the better they are likely to understand the concepts associated with the activity. Numerous studies support the positive effects on knowledge and skill acquisition of interspersing active
learning in a lecture class (NRC, 1999; Leonard, 2000; Olmstead, 1999; O'Sullivan & Copper, 2003).

Cooperative learning is one of the most widely used and researched pedagogical methods (NRC, 1997). Hofer (2001) suggests that one way to promote critical thinking skills and conceptual change is to encourage learners to work together in cooperative settings in which they discuss and evaluate their own beliefs and how their beliefs affect learning. A number of studies have found that cooperative learning environments help learners develop the skills and beliefs needed to think critically (Lord, 1994; Schraw, 2001). Macgregor, Cooper, Smith, and Robinson (2000) performed a synthesis of forty-eight interviews with instructors teaching undergraduate classes across the United States who incorporated small-group activities into their large classes. The instructors incorporating small-group learning activities in their large classes provided extensive empirical evidence and a theoretical rationale for cooperative learning. For instance, the studies suggested that cooperative learning promotes cognitive elaboration, enhances critical thinking, provides feedback, and promotes social and emotional development.

In cooperative groups, learners work with peers to help incorporate new knowledge. Some instructors use this approach in laboratory settings, lectures, or recitation sessions. In general, cooperative learning requires certain characteristics of team members: individual accountability, individual responsibility, interpersonal skills, and positive interdependence. The important aspect of these learning groups is that they are designed to challenge learners' current knowledge and require learners to seek new knowledge, compare and contrast prior knowledge, or apply knowledge that has just been presented (Bunce, 2001). The questions posed by team members reflect where the learners are in the learning process, rather than where the instructor assumes they are. In a cooperative activity, learners can compare and contrast concepts such as
heat and temperature in discussing the gas laws. Discussion among the team members helps learners confront their own understanding, or lack of it. After the discussion, presentations of each team's rationale assist learners in expressing the concepts and provide practice with the concepts, a chance to critique presentations, and time to assimilate the new knowledge (Bunce, 2001).

The project titled the National Survey of Student Engagement (NSSE, 2004) strengthens educators' and researchers' understanding of how learners perceive classroom-based learning as an element in the larger issue of learner engagement in their college education. Smith, et al. (2005) suggest that NSSE findings are a valuable tool for colleges to track how successful their academic practices are in engaging their student bodies. The NSSE project is based on the premise that learner engagement, the frequency with which learners participate in activities that represent effective educational practice, is meaningful and necessary for the quality of education. The annual survey of freshmen and seniors asks learners how often they have, for instance, participated in projects that require integrating ideas from various sources, used e-mail to communicate with classmates and instructors, asked questions in class or contributed to class discussions, or tutored other classmates. Learner responses are organized around five benchmarks: (1) Level of academic challenge, (2) Active and collaborative learning, (3) Student-faculty interaction, (4) Enriching educational experiences, and (5) Supportive campus environment.

One of the pleasing revelations of the NSSE findings was the significant number of learners engaged in various forms of active and collaborative learning activities. The shift from passive, instructor-dominated pedagogy to active, learner-centered environments promises to have desirable effects on learning. Student-centered learning environments take learners to deeper levels of understanding and meaning, encouraging
them to apply what they are learning to real life. Regression analyses of responses from 61,000 students across 459 colleges indicate that learners who scored higher on the deep learning scale were more satisfied with their overall educational experience. According to the latest findings, seniors, full-time students, and students at liberal arts colleges scored higher on the deep learning scale. However, learners majoring in the physical sciences and engineering scored lowest, due primarily to relatively low integrative and reflective learning scores (NSSE, 2004). To some degree, the findings from the NSSE corroborate previous research showing that learners majoring in the physical sciences and engineering use deep approaches to learning less often than learners from other fields (Felder & Brent, 2005; Zeegers, 2001).

Respecting Student Development Levels

The social environment in a classroom can have a profound effect on the quality of learning that takes place. If learners believe that an instructor is concerned about them and has a strong desire for them to learn the concepts, the effects on their attitudes and motivation to learn can be intense. Learners in any classroom cover a range of levels of epistemological development. The instructor should not only respect and be sensitive to all learners but also encourage learners to use their skills and talents. Presentation of course content in a nonbiased manner, a willingness to entertain competing viewpoints, a reflective and composed response to confrontation and controversy, and sensitivity to learners with different needs and from varying backgrounds encourage the learner and improve the quality of instruction.

Asking learners to change their epistemological beliefs is asking a lot of them. Instructors must complement their challenges to learners' beliefs with measures that convey that they care about the learners and are willing to help them. Ways of establishing an
Ways of establishing an atmosphere of respect and caring include learning students' names, being available, and, when using student-centered learning methods, explaining how and what they are doing. To foster the developmental level of each learner, the learning activities to be performed in and out of class should be chosen carefully. For instance, learners at Perry's Level 5, Belenky's level of procedural knowing, and Hofer and Pintrich's level of source of knowledge might thrive in a classroom environment based on cooperative and inquiry-based learning, in which learners face high-level, open-ended problems and are given guidance by the instructor when it is needed but are otherwise left to find their own way. Level 4 learners might do well in this environment even if they feel uncomfortable at first, eventually promoting their progression to Level 5. However, Level 2 and Level 3 learners might find such an environment uncomfortable enough to derail their learning. For example, open-ended questions that do not have unique, well-defined solutions may present a major challenge to learners at the lower levels of epistemological development; these problems usually require a higher epistemological belief level and a deep approach to learning.

Nevertheless, the answer is not to instruct entirely in a manner that learners at Level 2 would find comfortable, such as presenting facts and formulas in lectures, assigning only single-answer problems involving those facts and formulas, and putting similar problems on the exams. Level 2 and Level 3 learners would not experience any epistemological growth because of it, and learners at the higher levels would be bored. The solution is to provide an appropriate selection of challenges to learners at all levels.
The Laboratory in Chemistry Education

Introduction

For years, the science laboratory has been thought of as the best place for the building and articulation of students' images and understanding of the nature of science (Vhurumuku, et al., 2004). The fundamental assumption has been that, by being involved in laboratory work, students would come to develop and assimilate the implied images of the nature of science, resulting in meaningful learning. According to Markow and Lonning (1998), meaningful learning in the college chemistry laboratory is based on the notion that laboratory instruction should lead to an understanding of concepts rather than rote learning and fact verification. Students need to view the laboratory as a place to construct new knowledge and not simply as a place to verify the textbook.

There are several pedagogical models to support meaningful learning in chemistry, such as laboratory instruction. Research on the role of the laboratory in science teaching is based on more than 30 years of experience with all facets of the chemistry curriculum (Lazarowitz & Tamir, 1994; Bell, 2004; Hofstein, 2004). Numerous studies have been reported on laboratory instruction and its effectiveness for acquiring scientific knowledge, developing scientific skills, and motivating students (Tiberghien, et al., 2001; Hofstein, et al., 2005). Over the years an attempt has been made to evaluate the domains that characterize laboratory work, with studies focusing on the following features: (1) modes of learning, instruction, and assessment in the chemistry laboratory, (2) modes of assessing students' performance in the chemistry laboratory, (3) assessing students' attitudes towards chemistry laboratory work, and (4) assessing students' perceptions of the laboratory classroom learning environment (Hofstein, 2004).
The Nature of Laboratory Instruction

Numerous studies suggest that laboratory instruction is an effective and efficient teaching strategy for attaining some chemistry learning goals. According to Hofstein (2004), effective laboratory activities help students (1) construct their chemistry knowledge, (2) develop communication, cooperation, psychomotor, and thinking skills, (3) develop positive attitudes, and (4) learn to "think scientifically."

For students to become successful in scientific inquiry, direct experience with laboratory apparatus and materials may be a necessary precursor (Millar, 2004). Practical laboratory work provides students with experience of chemical phenomena, giving concrete meaning to, for example, ideas of chemical reactions by performing real reactions with laboratory tools. Too often, however, students find chemistry difficult when in the laboratory they make observations at the macroscopic level but the instructors expect them to interpret their findings at the microscopic level (Gabel, 1999; Newton, 2000).

The laboratory is a complex learning environment in which students interact with each other, with the lab activity, with the laboratory equipment or instruments, and with the instructor. These interactions include affective, cognitive, and psychomotor components. Often students do not have time to think about and reflect on their observations during laboratory instruction (Domin, 1999). However, a critical component of the laboratory instructional environment is encouraging students to reflect on the chemistry concepts that can guide their inquiry.

Laboratory investigations can be an ideal environment for meaningful learning when appropriate instructional techniques are built into the curriculum design. For example, cooperative learning techniques and active learning techniques such as pre-laboratory preparation, post-laboratory small-group discussions, peer evaluations, and concept mapping can promote higher-order thinking and positive attitudes (Cooper, 1995; NRC, 1996).
In laboratory investigations, discussions play a meaningful role in developing students' understanding of scientific ideas (Driver, et al., 1994; Millar, 2004).

Developmental Positioning in Chemistry Laboratory Instruction

According to several of the epistemological models (e.g., Perry, King-Kitchener, Baxter Magolda), experiential learning and concrete examples are important supports for learners at the dualist level. The laboratory can provide learners the opportunity to make connections between abstract ideas from lecture and the world of atoms, measurements, molecules, and solutions. While highly structured traditional lab activities support dualists, these activities can become mere exercises in "verifying the truth." Conversely, more challenging lab activities such as discovery, inquiry, or problem-based work may appear too unstructured to dualists and present more risk of accidents (Finster, 1989, 1991).

According to Finster (1989, 1991), if most general chemistry students are at a late dualist-absolute knowing level, then the most productive instruction will occur at the early multiplicity-transitional knowing level. Learners with a dualist perspective may have difficulty in the laboratory environment unless they know exactly what they are supposed to do, why they are there, and what data they are supposed to collect. Progressing from a more structured laboratory environment (dualist-absolute knowing) to one of less structure (late multiplist-early relativist) can encourage personal epistemological growth. Table 3 summarizes how learners at different epistemological levels view aspects of the educational process.
Table 3
Learner Epistemological Views of Educational Characteristics (adapted from Finster, 1991, p. 753)
Levels (positions): Dualism (Absolute Knowing), Multiplicity (Transitional Knowing), Early Relativism (Independent Knowing), Contextual Relativism (Contextual Knowing)

Nature of Knowledge
  Dualism: Knowledge is known; right and wrong answers; a collection of facts; quantitative.
  Multiplicity: Much knowledge is known, but uncertainty exists in some areas; knowledge is contextual.
  Early Relativism: All opinions are equal; knowledge is contextual; authority guides.
  Contextual Relativism: Knowledge is complex and contextual; no absolute truth, although right and wrong can exist; quality is more important than quantity.

Role of Instructor
  Dualism: Source of knowledge; absolute authority; role is to dispense knowledge.
  Multiplicity: Source of the right way to find truth; viewed as dogmatic; models the process toward truth.
  Early Relativism: Models the way they want us to think; uses evidence.
  Contextual Relativism: Source of expertise; guide or consultant; mutuality of learning is desired.

Role of Learner
  Dualism: Receives information; demonstrates information on assessments; works hard.
  Multiplicity: To learn how to learn truth; to express oneself.
  Early Relativism: To learn how they want us to think.
  Contextual Relativism: Exercises the intellect; applies "rules of adequacy" to information, judgments, and perspectives.

Role of Peers
  Dualism: Not legitimate sources of knowledge.
  Multiplicity: Not authorities; can assist or be ignored, since all opinions are equal.
  Early Relativism: Sources of diversity of thought and perspective.
  Contextual Relativism: Sources for learning and diversity.

Evaluation Issues
  Dualism: Right is good, wrong is bad; assessments should be clear-cut and objective.
  Multiplicity: Is the assessment fair, and how does one answer if there is no "right" answer?; hard work is not the standard.
  Early Relativism: Show independent or relativistic thought.
  Contextual Relativism: Evaluation of self and of work are separate; assessments offer feedback for improvement; the quality of the answer is important.

Intellectual Tasks
  Dualism: Learn basic information; distinguish right from wrong; provide explanations.
  Multiplicity: Compare and contrast; distinguish content from process; improved analysis.
  Early Relativism: Use supporting evidence in analysis; examine assumptions and processes; relate to real life.
  Contextual Relativism: Relate learning between different contexts; consider relationships and complexity; conceptual change.

Sources of Challenge and Frustration
  Dualism: Ambiguity, multiple perspectives, uncertainty; disputes between authorities.
  Multiplicity: Recognizing that uncertainty is not temporary; the qualitative; which answer is "really right."
  Early Relativism: Accepting responsibility for learning; thinking independently; listening to authority.
  Contextual Relativism: Choice or commitment; choosing between alternatives; scholarly work.

Sources of External Support
  Dualism: High degree of structure; concrete examples; experiential learning; presence of an authority for truth.
  Multiplicity: Decreased structure; diversity; clear assignments involving process; access to an authority for help.
  Early Relativism: Open class atmosphere; prefers diversity; presence of an authority to help evaluate.
  Contextual Relativism: Diversity of options; comfortable moving across contexts; intellectual mastery.
Laboratory Instructional Methods

Throughout the history of chemistry education, four different methods of laboratory instruction have been established (Table 4). Domin (1999) identifies the four instructional methods as (1) expository (traditional-verification), (2) inquiry, (3) discovery, and (4) problem-based. These methods are distinguished according to three descriptors: approach (deductive or inductive), procedure (given or student generated), and outcome (predetermined or undetermined).

Table 4
Descriptors of Laboratory Instructional Methods (Domin, 1999, p. 543)

Method          Outcome         Approach    Procedure
Expository      Predetermined   Deductive   Given
Inquiry         Undetermined    Inductive   Student generated
Discovery       Predetermined   Inductive   Given
Problem-based   Predetermined   Deductive   Student generated

Expository instruction, also termed traditional or verification instruction, is the most common and most heavily criticized laboratory instructional method (Domin, 1999; Berg, 2005). Within this learning environment the instructor defines the topic to be investigated, relates the outcome, and directs the actions of the students. The predominant feature of this method is its "cookbook" nature: the students repeat the instructor's directions or follow the procedure in a course lab manual, are aware of the expected outcome, and then compare their results against it. This approach has been criticized for placing little emphasis on thinking, for being an ineffective means of conceptual change, and for being unrealistic in its portrayal of the nature of science.

Studies suggest that two reasons exist for the inability of traditional laboratory instruction to produce more than minimal meaningful learning (Hodson, 1996; Domin, 1999; Shiland, 1999; Berg, 2005).
First, in traditional laboratory instruction students spend more time determining whether they obtained the correct results than they spend thinking, planning, and organizing the experiment. Second, traditional laboratory activities are designed to exercise only the lower-order cognitive skills of Bloom's taxonomy of educational objectives: knowledge, comprehension, and application (Domin, 1999; Berg, 2005).

An alternative to traditional laboratory instruction is an open-inquiry approach. In this inductive method of instruction the students formulate the problem within a given area, and the outcome is undetermined (Domin, 1999; Berg, 2005). This gives the students ownership of the activity while requiring them to relate the investigation to previous work, state the purpose, predict the results, generate the experimental procedure, and perform the investigation. This laboratory instructional method is designed to improve students' ability to use formal thought, to improve their attitudes toward science, and to give them the opportunity to engage in an authentic investigative process. Properly designed inquiry laboratory activities facilitate the development of the higher-order cognitive skills of Bloom's taxonomy: analysis, synthesis, and evaluation (Domin, 1999; Berg, 2005). However, the inquiry method has been criticized for placing too much emphasis on the scientific process at the cost of content, and for being time consuming.

The discovery or guided-inquiry approach is inductive, with the instructor guiding the student towards discovering a desired outcome. In discovery learning students are given a general outline of possible procedures or perhaps no more than a statement of goals. This laboratory instructional method has been criticized for sharing some of the weaknesses of the traditional method and for being time consuming. Properly designed discovery laboratory activities facilitate the development of the lower-order cognitive skill of application and the higher-order cognitive skills of analysis, synthesis, and evaluation (Domin, 1999; Berg, 2005).
In problem-based instruction, the instructor provides a problem and the required reference material while guiding the students toward a solution. Using a deductive approach, students working in this instructional environment must apply their understanding of a relevant concept to devise an experimental pathway to the solution. This requires students to think about what they are doing and why they are doing it. The method is time consuming and places high demands on both students and instructors. Properly designed problem-based laboratory activities facilitate the development of the higher-order cognitive skills of Bloom's taxonomy: analysis, synthesis, and evaluation (Domin, 1999; Berg, 2005).

Laboratory Pedagogical Approaches

The latest trend in chemistry laboratory pedagogy is to demand more work from learners before the laboratory session in order to develop a prepared mind. This emphasis on mental preparation, and on how the mind can improve the acquisition of motor skills in the laboratory, can be addressed through a pre- or post-laboratory discussion or assignment, or both. Mental preparation administered in the form of pre-lab or post-lab questions, summaries, or imaginary practice is effective for learning and places minimal demands on the instructor (DeMeo, 2001). All too often learners view laboratory work as unconnected, and it is here that pre- and post-lab assignments or discussions can be particularly useful, both to identify and subsequently to reinforce links to what they already know (Bodner, 1986; Byers, 2002).
Pre-Laboratory

The implementation of pre-laboratory pedagogy has undergone some changes over the years, but the underlying idea remains that it prepares the learner's mind for learning and applying new concepts and physical skills (DeMeo, 2001). According to Johnstone and Al-Shuaili (2001), learner pre-laboratory preparation should not be just "read the lab manual before you come to the laboratory." Some learners skip preparing for laboratory because they believe they can survive without doing it. Preparing the learner for laboratory with a pre-lab session may encourage deeper thinking about the experiments before they are carried out. The pre-laboratory should prepare the learner to be an active participant in the laboratory.

Personal Response System

The Personal Response System (PRS), unofficially known as "the clicker," is a technology that allows for electronic interaction and real-time student feedback (Burnstein & Lederman, 2001). This portable remote-control device allows students to register their answers to multiple-choice questions anonymously; the system tallies the responses and shows a histogram of the results. Faculty can use these data in any number of ways to adjust their classroom teaching based on student responses. The benefits to both faculty and students can be great.

The PRS can benefit faculty in three areas: teaching, research, and service (Fitch, 2004). The most commonly stated goal of student response systems is to improve student learning in the following areas: (1) improved class attendance and preparation, (2) clearer comprehension, (3) more active participation during class, (4) increased peer or collaborative learning, (5) better learning and enrollment retention, and (6) greater student satisfaction. A second basic goal is to improve teaching effectiveness. With PRS, immediate feedback is easily available from all students on the pace, content, interest, and comprehension of the activity, lecture, or discussion.
The PRS allows the instructor to see immediately how the whole class collectively responds to the questions, thereby allowing the instructor to adjust class activities and discussions based on what is clear and what is not clear to the students.

The student benefits include allowing students (1) to respond to questions in private with no pressure to get the right answer, (2) to view immediately how the whole class collectively responds to the questions, and (3) to discuss the question and responses with classmates, who can sometimes articulate new material in a way that the expert (i.e., the instructor) might not be thinking.

Laboratory Work

Any piece of laboratory work requires students to employ procedures. However, instructors cannot expect students to use procedures effectively if these are not taught explicitly, explained, and used in a variety of contexts. Once the procedures are understood, students have powerful tools to use in designing experiments. Experimental design is a particularly effective context for teaching epistemological knowledge (Tiberghien, et al., 1998).

During laboratory work there should be a constant interaction between the collection of data (i.e., measurements, observations) and theory. Laboratory notebooks are often used as a formative assessment tool. The use of laboratory notebooks as part of instruction is supported by many researchers who advocate writing in science to enhance learner understanding of scientific content and processes as well as general writing (Keys, et al., 1999; Shepardson & Britsch, 2000; Bass, et al., 2001). The laboratory notebook also trains learners to fulfill another scientific requirement, the provision of a clear and accurate written record of procedures, results, and discussion. The particularly common and egregious habit of recording results and performing calculations on scraps of paper or paper towels is actively discouraged.
Instead, learners should be instructed to treat the laboratory notebook as an integral part of each laboratory exercise, in which the pre-lab write-up prepares them for the exercise and results are entered during each laboratory session. Laboratory notebooks should, at a minimum, consist of the elements listed in Table 5. The conclusion and discussion should be based on the laboratory results and accompanied by a brief discussion of their chemical significance. Learners are encouraged to record any problems encountered during the procedure and to comment on their effect on the results, with recommendations for avoiding similar problems in future laboratory exercises.

Table 5
Basic Elements of the Laboratory Notebook
1. General-Introduction-Purpose
2. Predictions
3. Procedural
4. Results-Calculations
5. Discussion-Conclusion
6. References

Microcomputer-Based Laboratory Instruction

Microcomputer-based laboratory (MBL) instruction has been used in chemistry laboratory education since the early 1980s (Barton, 2005; Pienta & Amend, 2004, 2002; Nakhleh, 1994; Friedler & Tamir, 1984). MBLs are tools that use microcomputers for data acquisition, analysis, and display. Students use probes and software to direct the computer to collect, record, and graph scientific data much as research scientists do (Pienta & Amend, 2004; Newton, 2000). MBLs can support and enhance meaningful learning in scientific inquiry. They assist in learners' knowledge construction and help develop concepts and skills such as graphing, collaboration, and scientific reasoning (Pienta & Amend, 2004; Nachmias & Linn, 1990). The value of the MBL learning environment lies in increasing the student's ability to analyze and interpret data.
Students can repeat experiments, thereby generating more data for analysis, manipulate the parameters of investigations, and study graphs by using MBL modeling tools (Pienta & Amend, 2004; Newton, 1997; Settlage, 1995; Lazarowitz & Tamir, 1994).

MBLs allow students to devote more time to observation, reflection, and discussion. Students performing a traditional bench laboratory investigation can require twice as much time as those performing the investigation with an MBL system. The MBL therefore allows students more time to discuss, plan, and take responsibility for their study processes (Pienta & Amend, 2004; Domin, 1999). However, according to Pienta and Amend (2004), students without an appropriate conceptual understanding of chemistry may fail to observe the phenomenon under investigation. MBLs, therefore, may not promote learning for all students (Atar, 2002).

The instructional effectiveness of MBL is connected to the pedagogical method employed, and the activities designed around the MBL must be carefully structured. Learners who spend their time doing little more than watching the MBL hardware log data and prepare graphs can experience poorer learning outcomes (Malina & Nakhleh, 2001; Newton, 1997; Linn, 1995). In addition, learners need time to become familiar and confident with the probes and software.

Learners' interactions with the instructor are important in maximizing the potential benefits of MBL use (Pienta & Amend, 2004; Barton, 1997; Newton, 1997). The instructor should engage learners in discussions with their peers about the meaning of their data and graphs. This encourages learners to reflect on that meaning and improves their ability to think more deeply (Barton, 1997). In addition, asking learners prompting questions such as "How do you know when the reaction has finished?" or "If you dilute the solution, how does this affect the reaction time?" can significantly affect their interpretations of the data (Pienta & Amend, 2004; Rogers, 1997).
Post Laboratory

Data processing and the development of conclusions provide students opportunities to develop conceptual and epistemological understanding. Data processing can be treated as an algorithm, or as an epistemological opportunity for students to develop a sense of the confidence that can be attributed to data and the uses to which data can be put (Tiberghien, et al., 1998). The use of post-laboratory discussions to facilitate reflection and promote the consolidation of learning appears to be consistent with current learning theories. Facilitating post-lab discussions in peer groups encourages deeper reflection about the results (Byers, 2002). Post-laboratory tasks or discussions should deal with applications, extensions, implications, and possible connections with other areas of chemistry.

Laboratory reports need to be more than filling in blanks in an established pattern. While most learners initially need guidance in formatting a laboratory report, the challenge is in forcing the learner to examine chemistry from more than a "body of knowledge" perspective. Constantly addressing issues such as experimental limitations, and the fact that science does not always present a clear, single answer, can promote analysis by the learner in the form of "thinking about thinking." The technical writing experience can also be helpful for science majors, who will probably be writing scientific articles in the future (Wimpfheimer, 2004).

Summary

Researchers exploring learners' personal epistemological development and images of the nature of science have identified several individual constructs, instructional factors, and social factors that may influence whether positive learning changes will occur.
There is a great deal of research available on the topic of the nature of science and epistemological beliefs in the classroom. However, much of that research is limited in scope, looking at preservice teachers and students in K-12. There is limited research on the connections between NOS and personal epistemological belief development of college science students in a laboratory environment.

The purpose of this chapter was to describe the theoretical and conceptual frameworks and the empirical research pertinent to the development of students' images of science and epistemological beliefs. Research literature regarding the following areas was presented: (1) models of epistemological development; (2) multidimensional models of epistemological development; (3) the nature of science; (4) the applicability to college science education; and (5) the laboratory in chemistry education, examined to gain an understanding of previous studies.

Sections one and two, the scope of the review, described related theories including Perry's Scheme of Intellectual and Ethical Development, Baxter Magolda's Epistemological Reflection Model, King and Kitchener's Reflective Judgment Model, and Hofer and Pintrich's Epistemological Theories Model. They also provided information on assumptions and on validity and reliability issues of the theories, and reviewed literature related to these theories.

Section three reviewed the literature on the Nature of Science, also referred to as students' images of science. The review discussed the controversy over the definition of the NOS, the images of science that students draw upon during laboratory activities, the need for students to experience cognitive dissonance in order to change their NOS beliefs, the instruments used to measure understanding of the NOS, thoughts on the connections between NOS and epistemology, and how to elicit and develop students' understanding of NOS in the classroom.
Section four discussed research methodology issues, with the major focus on the potential assessment tools used in studying NOS and personal epistemological beliefs. It provided an overview of a few of the instruments used to assess these beliefs.

Section five discussed the literature on how the constructs in sections one, two, and three apply to college science education. It provided an overview of studies that describe epistemological orientations in learning science, as well as ways to assess students' epistemological levels in a classroom setting. In addition, it reviewed literature concerning pedagogical applications that can be used in the classroom to promote epistemological growth.

The final section of this chapter presented an overview of the laboratory in chemistry education. This section of the literature review elaborated on the nature of laboratory instruction, epistemological development in laboratory instruction, and the history of laboratory instructional methods. The review ended with an overview of potential pedagogical approaches used in laboratory instruction.

Chapter three describes, in six sections, the design and methodology of the research study. Section one restates the purpose of the study, elaborates on the rationale behind the research questions, and presents an overview of the analysis, design, and methodology. Section two describes the context and participants of the setting. Section three discusses the research instruments, measures, and techniques, which include (1) the Chemical Concepts Inventory (CCI), (2) the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS), (3) the Nature of Scientific Knowledge Scale (NSKS), (4) the Students' Reflective Assessment of Laboratory Methods, and (5) in-depth semi-structured interviews. Section four identifies the forms of pedagogical treatment involved in the laboratory instruction and offers an overview of the laboratory environment and pedagogy.
Included is a discussion of the three general instructional features under consideration in this study: pre-laboratory, laboratory work, and post-laboratory. Section five summarizes data collection, giving a general overview of the phases of data collection and the researcher's role during the study. Section six summarizes how the data were analyzed by describing the quantitative and qualitative analysis methods implemented for the study. The last section discusses the procedures used in monitoring the reliability and validity of the data collection and analysis.
Chapter Three: Methods

Introduction

The nature of this study was to explore, and to lay a foundation for, more specific features of reasoning related to changes in personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research. The primary focus of this mixed methods study was two-fold: (1) to determine whether college science students' NOS and personal epistemological beliefs change as a result of completing a general chemistry laboratory course, and (2) to explore the possible influences of laboratory classroom instructional practices on those changes in beliefs. This chapter is divided into five sections. The first addresses the general research design, including the research instruments, data collection procedures, and data scoring procedures. Following this are sections discussing the recruitment and characteristics of the study's participants. The chapter concludes with the procedures for analyzing and informing the data, described as they pertain to the research questions in the present study. Figure 4 presents an overview of the organization of chapter three.

Because science educators studying images of science and instructional strategies and educational psychologists studying personal epistemological beliefs use differing research methods, a semi-naturalistic mixed-methods triangulation embedded approach was employed in this study. This approach represents one of the traditional models of a mixed-methods triangulation design: the researcher collects and analyzes the different data sets separately, and the qualitative data then play a supportive, secondary role (Creswell, 1999; Caracelli & Greene, 1997).
The qualitative results are embedded within the quantitative data to better interpret the findings, serving a supportive, secondary role. This model is used to compare and inform quantitative results with qualitative findings.

Reliability usually measures the extent to which the results of an instrument or study would be replicated given the same sample, and it is an important precondition for establishing validity (Lincoln & Guba, 1985). However, the qualitative research tradition recognizes that participants and their interpretations of research instruments are dynamic; exact replication of results is therefore not an assumption of this study. Initial and final interviews were implemented to assist in checking the validity of the participants' scores on the EBAPS and NSKS. The initial scores of the interview participants were compared to their initial interview responses, and this method was repeated with the final scores and interviews. The Cronbach alpha coefficient as well as Pearson correlations are reported and used as indicators of internal consistency and to describe the strength and direction of the linear relationship between the dimensions of each instrument.

A combination of assessment tools developed and validated in previous studies within the two different disciplines was used to determine whether students' NOS and personal epistemological beliefs change following the completion of a general chemistry laboratory course and to explore the possible influences of laboratory classroom instructional practices on those changes in beliefs.

Descriptive statistics such as frequencies, means, and standard deviations were computed to summarize the participants' responses to the pre-post assessments. A paired-samples t-test (repeated measures) was used to compare the pre-post mean scores for the participants. The variability for the paired-samples t-test was calculated by computing eta squared.
The effect size (d) was interpreted using the guidelines from Cohen (1998). In this dissertation, effect sizes were calculated as the mean gain score (mean Time 2 minus mean Time 1) divided by the pooled standard deviation of the Time 1 and Time 2 scores. To interpret the effect size values, the following guidelines from Cohen (1998) were used: 0.20 = small effect, 0.50 = moderate effect, and 0.80 = large effect. Pearson product-moment correlation was used to determine the degree to which quantitative variables were linearly related. To compare individual student performance on the pre- and post-assessment, the normalized (Hake) gain factor was calculated.

The variability for the paired-samples t-test was calculated using the formula for eta squared. Eta squared can range from 0 to 1 and represents the proportion of variance in the dependent variable that is explained by the independent variable. To interpret the eta squared values, the following guidelines from Cohen (1998) were used: 0.01 = small effect, 0.06 = moderate effect, and 0.14 = large effect. Variability is defined here as t-squared divided by t-squared plus the sample size minus one, that is, eta squared = t^2 / (t^2 + N - 1). The data analysis is discussed further in chapters three and four. The remainder of this chapter discusses the research design. Figure 5 presents an overview of the general context and measures that were applied in this study.
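As a concrete illustration of these computations, the sketch below works through the paired-samples t statistic, the effect size d, eta squared, and the normalized (Hake) gain on a small set of invented pre/post scores. It is not the analysis code used in this study: the data, variable names, and maximum possible score are hypothetical, and the Hake gain is computed with the standard definition, g = (post - pre) / (maximum possible - pre), which the text itself does not spell out.

    # Illustrative sketch only -- invented data, not the dissertation's analysis.
    # Computes the paired-samples t statistic, Cohen's d, eta squared, and the
    # normalized (Hake) gain for hypothetical pre/post scores on a 0-100 scale.
    import math
    from statistics import mean, stdev

    pre  = [52, 61, 48, 70, 65, 58, 73, 49]    # hypothetical pre-test scores
    post = [60, 66, 55, 78, 70, 64, 80, 57]    # hypothetical post-test scores
    max_score = 100                            # assumed maximum possible score

    n = len(pre)
    diffs = [b - a for a, b in zip(pre, post)]

    # Paired-samples t: mean difference divided by its standard error.
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))

    # Effect size (d): mean gain divided by the pooled standard deviation
    # of the Time 1 and Time 2 scores, as described in the text.
    pooled_sd = math.sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2)
    d = (mean(post) - mean(pre)) / pooled_sd

    # Eta squared as defined in the text: t^2 / (t^2 + N - 1).
    eta_squared = t ** 2 / (t ** 2 + n - 1)

    # Normalized (Hake) gain, assuming the standard definition
    # g = (post - pre) / (maximum possible - pre), averaged over students.
    hake_gain = mean((b - a) / (max_score - a) for a, b in zip(pre, post))

    print(t, d, eta_squared, hake_gain)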


Figure 4 Overview of the organization of chapter three (a flow diagram linking the research questions, context and participants, research instruments, treatment, data collection, and data analysis).
Figure 5 General context and measures overview (context and participants: setting and sample population; quantitative instruments: CCI, EBAPS, NSKS, and the assessment of laboratory instruction; qualitative instruments: initial lab skills questionnaire, laboratory methods questionnaire, and semi-structured initial and final interviews with probes of student responses).
Research Questions

RQ1. What range of personal epistemological and NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

RQ1a. Do students' images of the nature of chemistry (NOS) change by the completion of a semester general chemistry laboratory course?

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?

RQ2. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning experience?

RQ2a. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (chemistry) during the semester general chemistry laboratory course?

RQ2b. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

Elaboration of Research Questions

The questions that guided this study deal with students' personal epistemological beliefs about science, students' images of science (NOS), and laboratory pedagogical practices as discussed in the literature review (Chapter 2). The construct of personal epistemology involves the nature of knowledge and knowing. The NOS refers to the epistemology of science, science as a way of knowing, and the beliefs and values inherent in the development of scientific knowledge.
Laboratory science experiences are where students interact with materials to observe phenomena. Certain laboratory pedagogical practices might improve both students' images of science and their epistemological beliefs. Together, the research questions prescribe an investigation that explores whether students' NOS and personal epistemological beliefs about science (chemistry) change by the completion of a semester chemistry laboratory course and, if so, what laboratory instructional strategies students believe influenced their understanding of the material and changed their NOS and personal epistemological beliefs. The nature of this study was to explore, and to lay a foundation for, more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research.

Research question 1 focused on students' current NOS and personal epistemological beliefs. Leach, et al. (1998) indicate there is a good deal of evidence that the images of science and epistemological beliefs that students hold can hinder performance during laboratory work (Sere, et al., 1993; Ryder, et al., 1997). Research question 1's associated sub-questions consider whether, and if so how, students' current NOS and epistemological beliefs change following the completion of the semester course. There are a number of studies of students' images of science and epistemological beliefs in the literature; however, few of these relate to high school or college students or to the images of science or epistemological beliefs that students might draw upon during laboratory work (Leach, et al., 1998).

As discussed in chapter two, studies have been conducted to examine the personal epistemological beliefs of both instructors and students in domain-specific areas such as history, mathematics, and science. However, there seems to be a lack of agreement on its definition even when one refers to personal epistemological beliefs in a particular subject (Elder, 1999; Paulsen & Wells, 1998).
Currently, a point of understanding is that such beliefs are multi-dimensional rather than unidimensional in nature (Schommer, 1994; Hofer, 2001). In this study, NOS and personal epistemological beliefs about science and learning science were examined with the EBAPS and NSKS, discussed in chapter two and later in this chapter. The EBAPS assesses students' beliefs concerning the learning and nature of scientific knowledge. For the purpose of this study, science epistemological beliefs also include beliefs about the nature of scientific knowledge as proposed in the model developed by Rubba and Anderson (1978); therefore, the NSKS was used to supplement the NOS beliefs assessed in the NOS portion of the EBAPS. The present study examined whether students' NOS and personal epistemological beliefs change by the completion of the course, using pre- and post-surveys and interviews.

Research question 2 explored what laboratory pedagogical practices (e.g., pre-laboratory activities, laboratory work, post-laboratory activities), as discussed in chapter two, students believe influenced their understanding of the material during the semester laboratory course. Heavy attrition within science can restrict the flow of students pursuing careers in the STEM (science, technology, engineering, and mathematics) fields, because academic performance in courses such as chemistry and physics is interpreted by students and advisors alike as a reliable predictor of ultimate success as a science major. Pedagogical instructional strategies are therefore critical so that students with a desire to succeed can achieve their educational goals. The chemistry curriculum is influenced by the accreditation criteria developed by the American Chemical Society, and reform movements in chemistry have sought to engage students by promoting active learning and providing contemporary situations that illustrate abstract concepts (American Chemical Society, 1999). Effective instruction usually integrates several instructional pedagogies in order to motivate and facilitate learning at the individual level (Smith, et al., 2005; Prince, 2004).
In science laboratories students carry out experiments that are often intended either as an activity in doing experimental research or as support for understanding the theory discussed in lecture. Both purposes require the learner to make links between scientific theories and the scientific phenomena and equipment. However, students in science laboratory courses often manipulate only the equipment and do not manipulate the ideas (Gunstone, 1996). In laboratory instruction it is therefore imperative to include pre- and post-laboratory activities requiring students to make predictions and give explanations (Hofstein & Lunetta, 1982).

Research question 2's associated sub-questions considered whether students believe any of the laboratory pedagogical practices influenced their NOS or personal epistemological beliefs about chemistry. According to Rollnick, et al. (2001), university chemistry departments rarely question the importance of laboratory work as an essential component of the experiences of undergraduate science students. However, the relationship of NOS and personal epistemological beliefs to laboratory pedagogical practices has rarely been addressed (Leach, et al., 1998; Sere, et al., 1998; Tiberghien, et al., 1998; Sere, 2002; Wickman, 2003). In chapter two, laboratory pedagogical practices are discussed in relation to learning in a laboratory environment. The present study used semi-structured interviews and a laboratory pedagogical questionnaire to examine what laboratory pedagogical practices students believe influenced their understanding of the material or changed their NOS or personal epistemological beliefs.
Context and Participants

Setting

The setting for the study was a rapidly growing, fiscally autonomous, urban campus of a major university in Florida with approximately 5,000 students enrolled in 45 undergraduate and graduate degree programs through the Colleges of Arts & Sciences, Business, and Education. Participants in this study were registered for General Chemistry 2045 Laboratory, a one-semester course at the university. The 16-week semester general chemistry course included a separate 3-hour lecture and a 3-hour laboratory section each week, with a maximum of twenty students per laboratory section. The prerequisites for the course are high school chemistry or physical science and college algebra. The lecture sections were taught by two different professors; the laboratory sections were taught by the researcher and several other graduate teaching assistants. The study was conducted in the campus general chemistry laboratories during the Fall semester of 2006.

Population Sample

Fifty-six undergraduate students between the ages of 18 and 45, representing five intact chemistry laboratory sections in the Fall semester of 2006, participated in this study. The participants included freshmen, sophomores, juniors, and seniors from different study programs (majoring in environmental science, biology, chemistry, marine science, nursing, and teacher education).

Overall, the mean age of the participants was 21 years, with a range of 18 to 45 years. Approximately 64% of the participants were female and 36% were male. Forty-six percent of the participants were freshmen, 21% sophomores, 18% juniors, 9% seniors, and 7% had no college rank. All but five of the 56 participants had taken a high school chemistry and biology course.
Seventy-seven percent of the participants were majoring in science, with 13% undecided.

A sample of 20 participants from the total sample of 56 volunteered for and participated in the initial and final interviews. The mean age of the interviewed participants was 22 years, with a range of 18 to 45 years. Approximately 85% of these participants were female and 15% were male. Forty percent were freshmen, 25% sophomores, 25% juniors, and 10% had no college rank. All 20 of these participants had taken a high school chemistry and biology course. Ninety percent were majoring in science, with 10% undecided.

Research Instruments – Measures

Chemical Concepts Inventory

The Chemical Concepts Inventory (CCI) is the label given to an assessment that explores learners' mental models, their qualitative images, and their understanding of concepts related to how chemistry works (see Appendix A). Research supports the observation that learners can often solve mathematical problems in chemistry while holding poor or incorrect mental models of the fundamental concepts behind the mathematics (Pavelich, et al., 2004). The design of the CCI was modeled after Treagust (1988) and Odom and Barrow (1995). College-level general chemistry courses cover many concept areas in a semester; therefore the CCI was designed to cover a wide sampling of concepts from general chemistry. The content validity was checked using the Context Matrix used by the American Chemical Society test development team (Russell & Hill, 1989).

The CCI has shown statistically significant (p < 0.001) correlations between students' scores on the inventory prior to a course of instruction and their performance on labs, quizzes, and exams in the course, as well as a statistically significant correlation with students' final performance. These correlations range from 0.144 to 0.165, with all values significant at the p < 0.001 level.
The CCI's overall Cronbach alpha reliability coefficient ranges from 0.75 to 0.86 for high school and college science students (Russell & Hill, 1989).

The CCI was used to better understand the chemistry background (prior knowledge) of the participants. This assessment of a learner's current chemical concept knowledge was given at the beginning of the study as a pre-assessment of students' images of chemistry concepts (i.e., to determine prior knowledge). The CCI is comprised of 22 multiple-choice questions, several of them paired: the first question asks about a chemical or physical effect, while the second asks for the learner's reasoning about the observed effect. A second type of question asks students to explain more completely why they chose a particular response as well as why they discarded the remaining responses. The final common form of question asks students to define a basic chemical concept such as boiling or evaporation.

Descriptive statistics of the CCI, such as frequencies, means, and standard deviations, were computed to summarize the participants' level of prior knowledge about chemistry. Interview participants were selected on a volunteer basis.

Personal Epistemological Beliefs Assessment

Personal epistemological beliefs in science refer to students' understanding of how scientific ideas are built up, including their knowledge about the process of knowing about scientific knowledge (Songer & Linn, 1991). Students' personal epistemology and their understanding of how chemical ideas are built do influence their learning. Studies have shown that learners' prior knowledge influences their ideas and that learners generally hold a surprisingly wide range of ideas that are resistant to change (Taber, 2002; Gabel, 1998; Fensham, 1994).
The personal nature of learners' epistemologies has a significant impact on their learning. In a study by Carey, et al. (1989), learners' understanding of the NOS was challenged and improved through experiments designed to encourage the learners to build, reflect on, and test their own scientific theories, resulting in significant improvement in the learners' level of understanding. Gobert and Discenna (1997) identified a statistically significant correlation between each learner's epistemology and his or her use of models in making inferences about scientific phenomena.

In order to probe the epistemological beliefs of learners taking a physical science (i.e., chemistry, physical science, or physics), the multi-dimensional Epistemological Beliefs Assessment for the Physical Sciences (EBAPS) was administered at the beginning and end of the study (Elby, et al., 1999). The EBAPS, discussed in chapter two (see Appendix B), is designed to assess personal epistemological beliefs about learning science and the nature of scientific knowledge in five dimensions: the structure of knowledge, the nature of learning, real-life applicability, evolving knowledge, and the source of ability to learn (Elby, 2001). Each item was scored on a scale of 0 (least sophisticated) to 4 (most sophisticated). Table 6 identifies the score range for each level of epistemological sophistication, which was used to classify each participant's initial and final level of belief. The EBAPS items are a mix of Likert-type ratings of agreement or disagreement and hypothetical conversations to which students respond using multiple-choice answers to indicate how closely their own views match those of the conversation participants. Table 7 identifies each dimension and describes the reasoning behind it, as discussed in chapter two. The EBAPS's overall Cronbach alpha reliability coefficient for high school and advanced chemistry and physics students ranges between 0.73 and 0.83.
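Since Cronbach alpha coefficients are reported for each instrument, a brief illustration of how such a coefficient is obtained may be helpful. The sketch below is hypothetical: the Likert responses are invented and the calculation simply applies the usual formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores); it is not part of the study's analysis.

    # Illustrative sketch only -- the Likert responses are invented.
    # Cronbach's alpha: (k / (k - 1)) * (1 - sum(item variances) / variance(totals)).
    from statistics import variance

    # rows = respondents, columns = items of one hypothetical scale
    responses = [
        [4, 5, 4, 3],
        [3, 4, 4, 4],
        [2, 3, 3, 2],
        [5, 5, 4, 5],
        [3, 3, 2, 3],
    ]

    k = len(responses[0])                                  # number of items
    item_variances = [variance(item) for item in zip(*responses)]
    total_variance = variance([sum(row) for row in responses])

    alpha = (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
    print(round(alpha, 2))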


Table 6
Epistemological Beliefs Assessment for Physical Sciences Scale

Sophistication Level        Score Range    Scaled Score Range
Extremely Sophisticated     3.5 – 4.0      87 – 100
Highly Sophisticated        3.0 – 3.4      75 – 86
Moderately Sophisticated    2.4 – 2.9      60 – 74
Poorly Sophisticated        1.6 – 2.3      40 – 59
Unsophisticated             0 – 1.5        0 – 39

Potential epistemological beliefs instruments that were eliminated because they are aimed specifically at physics rather than chemistry students were Halloun and Hestenes' (1998) Views About Science Survey (VASS) and the Maryland Physics Expectation survey (MPEX) by Redish et al. (1998). Another instrument that was eliminated was Schommer's (1990) Epistemological Questionnaire (EQ), which probes learners' epistemological stances toward physical science only to the extent that epistemological stances are stable beliefs or theories that do not depend heavily on disciplinary context (Elby & Hammer, 2001, 2002). Some of the instruments mentioned above are discussed in more detail in chapter two. The EBAPS was used to answer research question 1 concerning students' personal epistemological beliefs about science at the beginning and end of the semester course.
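As an illustration of how Table 6 can be applied, the short function below maps a participant's mean EBAPS item score to a sophistication label. It is a sketch only: the function name is invented, and scores that fall in the narrow gaps between the listed ranges (e.g., 2.95) are assigned to the lower adjacent category, a choice the table itself does not dictate.

    # Illustrative sketch only -- the function name is invented.
    # Maps a participant's mean EBAPS item score (0 least to 4 most
    # sophisticated) to the sophistication levels listed in Table 6.
    def ebaps_level(mean_score: float) -> str:
        if mean_score >= 3.5:
            return "Extremely Sophisticated"
        if mean_score >= 3.0:
            return "Highly Sophisticated"
        if mean_score >= 2.4:
            return "Moderately Sophisticated"
        if mean_score >= 1.6:
            return "Poorly Sophisticated"
        return "Unsophisticated"

    print(ebaps_level(2.7))    # Moderately Sophisticated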


Table 7
EBAPS Instrument Variables (adapted from Elby, et al., 1999)

Structure of scientific knowledge: Is physics and chemistry knowledge a bunch of weakly connected pieces without much structure, consisting mainly of facts and formulas? Or is it a coherent, conceptual, highly structured, unified whole?

Nature of knowing and learning: Does learning science consist mainly of absorbing information? Or does it rely crucially on constructing one's own understanding by working through the material actively, by relating new material to prior experiences, intuitions, and knowledge, and by reflecting upon and monitoring one's understanding?

Real-life applicability: Are scientific knowledge and scientific ways of thinking applicable only in restricted spheres, such as a classroom or laboratory? Or does science apply more generally to real life? These items tease out learners' views of the applicability of scientific knowledge as distinct from the learner's own desire to apply science to real life, which depends on the learner's interests, goals, and other non-epistemological factors.

Evolving knowledge: This dimension probes the extent to which learners navigate between the twin perils of absolutism (thinking all scientific knowledge is set in stone) and extreme relativism (making no distinctions between evidence-based reasoning and mere opinion).

Source of ability to learn: Is being good at science mostly a matter of fixed natural ability? Or can most people become better at learning (and doing) science? As much as possible, these items probe students' epistemological views about the efficacy of hard work and good study strategies, as distinct from their self-confidence and other beliefs about themselves.

Descriptive statistics such as frequencies, means, and standard deviations were computed to summarize the participants' responses to the pre-post assessment. To compare individual student performance on the pre- and post-assessment, the normalized (Hake) gain factor was calculated. A paired-samples t-test (repeated measures) was used to compare the pre-post mean scores for the participants. The variability for the paired-samples t-test was calculated using eta squared (Appendix B). The effect size (d) was interpreted using the guidelines from Cohen (1998). Pearson product-moment correlation was used to determine the degree to which quantitative variables were linearly related. These correlation analyses helped address the first research question.
The data analysis is discussed in more detail later in this chapter as well as in chapters 4-7.

Nature of Scientific Knowledge Scale

To assess learners' initial and final images of science, Rubba and Anderson's (1978) Nature of Scientific Knowledge Scale (NSKS) was administered (see Appendix C). In addition, the NSKS was used to supplement and support the portions of the EBAPS that deal with the nature of scientific knowledge related to personal epistemological beliefs. This instrument, discussed in chapter two, is a 48-item Likert-scale, forced-response instrument consisting of five choices (strongly disagree, disagree, neutral, agree, and strongly agree). The NSKS's six subscales are amoral, creative, developmental, parsimonious, testable, and unified (see Table 8). The NSKS is considered a reliable and valid pencil-and-paper measure of the NOS as it focuses on one or more of the characteristics of the NOS. When the NSKS was administered to high school and college students, the reliability ranged from .65 to .89. The construct validity of the NSKS was examined by testing an anticipated difference in understanding of the nature of scientific knowledge between two groups of college freshmen with different educational backgrounds (Rubba, 1977). For reliability, the NSKS's overall Cronbach alpha coefficient is 0.89 for advanced chemistry students (grade 12); a coefficient was also reported for biology and chemistry students in grades 9, 10, and 11.

Even though the NSKS has received little criticism from other researchers, according to Lederman (1998) it does possess significant wording problems.
131 questionnaire, and could inflate reliability estima tes and misplace confidence in the validity of the questionnaire. The scores of the negatively worded items in the NSKS were reversed so that all items have the same respo nse scale. The range of scores for each dimension is 8 to 40 p oints. For each dimension, a score of 24 points indicates a neutral (N) position or combination of realist and instrumentalist views on NOS while a score between 25 and 40 is within the accepted view of the nature of science (Instrumentalist-I), and a score between 8 and 23 is within the unaccepted NOS view (Realist-R). The overall sc ore for all six dimensions ranges from 48 to 240 points. A score of 144 (141-147) on the overall scale score is considered neutral (N) while scores ranging from 145 and 240 ( 148-240) are within the accepted view of the nature of science (instrumentalist), an d scores ranging from 143 and 48 (48140) are within the unaccepted view (realist) Initial research into learners’ images of science ( i.e. NOS) consisted of forcedchoice survey responses that provide little insight into the conceptions underlying learners’ responses (Lederman, 1992). Lately resea rchers have turned to semistructured interview assessments to probe students’ images of science. To further assess students’ images of science adapted versions of interview protocols such as the “Nature of Science” interview developed by Carey et al. (1989) will be used during interviews (see Appendix F). The adapted versions will be adjusted based on the student responses to the NSKS. The original intervi ew by Carey et al. (1989) is composed of 21 questions with the following themes: the goals of science; the types of questions that scientists ask; the nature of experi ments, hypotheses, and theories; the influence of theories and ideas on experiments; and processes of theory change (Thoermer & Sodian, 2002; Sandoval & Morrison, 2003 ). This assessment was used to

This assessment was used to answer research question one concerning students' NOS beliefs and as additional support for the NOS aspects of the EBAPS.

Descriptive statistics such as frequencies, means, and standard deviations were computed to summarize the participants' responses to the pre-post assessment. The scores of the negatively worded items in the NSKS were first reversed so that all items have the same response scale. To compare individual student performance on the pre- and post-assessment, the normalized (Hake) gain factor was calculated. A paired-samples t-test (repeated measures) was used to compare the pre-post mean scores for the participants. The variability for the paired-samples t-test was calculated using eta squared. The effect size was interpreted using the guidelines from Cohen (1998). A Pearson product-moment correlation was performed to determine the degree to which the quantitative variables were linearly related. These correlation analyses helped address the first research question. The data analysis is discussed in more detail later in this chapter as well as in chapters 4-7.
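To make the calculations named above concrete, the following minimal sketch (written in Python rather than the SPSS workflow actually used in the study) shows how the descriptive statistics, normalized (Hake) gain, paired-samples t-test, eta squared, Cohen's d, and Pearson correlation could be computed from paired pre- and post-assessment scores. The function name, the default maximum score, and the use of NumPy/SciPy are illustrative assumptions, not part of the study's procedures.

import numpy as np
from scipy import stats

def pre_post_statistics(pre, post, max_score=240):
    """Summarize paired pre/post scores (max_score = 240 for the full NSKS)."""
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    n = len(pre)

    # Descriptive statistics for each administration.
    summary = {"pre_mean": pre.mean(), "pre_sd": pre.std(ddof=1),
               "post_mean": post.mean(), "post_sd": post.std(ddof=1)}

    # Normalized (Hake) gain per student: actual gain over maximum possible gain.
    hake_gain = (post - pre) / (max_score - pre)

    # Paired-samples (repeated measures) t-test comparing pre and post means.
    t_result = stats.ttest_rel(post, pre)
    t_stat, p_value = float(t_result.statistic), float(t_result.pvalue)

    # Eta squared: proportion of variance explained, t^2 / (t^2 + N - 1).
    eta_squared = t_stat ** 2 / (t_stat ** 2 + n - 1)

    # Cohen's d: mean gain divided by the pooled SD of the two administrations.
    pooled_sd = np.sqrt((summary["pre_sd"] ** 2 + summary["post_sd"] ** 2) / 2)
    cohens_d = (summary["post_mean"] - summary["pre_mean"]) / pooled_sd

    # Pearson product-moment correlation between pre and post scores.
    r, r_p = stats.pearsonr(pre, post)

    return summary, hake_gain.mean(), (t_stat, p_value), eta_squared, cohens_d, (r, r_p)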

Table 8
Nature of Scientific Knowledge Scale (Rubba & Anderson, 1978)

Amoral. Scientific knowledge provides man with many capabilities, but does not instruct him on how to use them. Moral judgment can be passed only on man's application of scientific knowledge, not on the knowledge itself.

Creative. Scientific knowledge is a product of the human intellect. Its invention requires as much creative imagination as does the work of an artist, a poet, or a composer. Scientific knowledge embodies the creative essence of the scientific inquiry process.

Developmental. Scientific knowledge is never "proven" in an absolute and final sense. It changes over time. The justification process limits scientific knowledge as probable. Beliefs which appear to be good ones at one time may be appraised differently when more evidence is at hand. Previously accepted beliefs should be judged in their historical context.

Parsimonious. Scientific knowledge tends toward simplicity, but not to the disdain of complexity. It is comprehensive as opposed to specific. There is a continuous effort in science to develop a minimum number of concepts to explain the greatest possible number of observations.

Testable. Scientific knowledge is capable of public empirical test. Its validity is established through repeated testing against accepted observations. Consistency among test results is a necessary, but not a sufficient, condition for the validity of scientific knowledge.

Unified. Scientific knowledge is born out of an effort to understand the unity of nature. The knowledge produced by the various specialized sciences contributes to a network of laws, theories, and concepts. This systematized body gives science its explanatory and predictive power.

Students' Reflective Assessment of Laboratory Methods

An initial survey (see Appendix D), adapted from the Curriculum Innovation Fund of the University of Manchester (2002), was administered during the first laboratory session to gauge what participants believed about laboratory practical work and how they rated their current laboratory skills.

The second student questionnaire (see Appendix E), adapted from several sources (Byers, 2002; Berg, 2003; Jalil, 2006), was used to assess a learner's reaction to the three broad areas of instructional methods associated with each laboratory activity (e.g., pre-laboratory activities, laboratory work, and post-laboratory activities). The students were probed further on their comments during the interviews (see Appendix F). The comments were compared and further evaluated against their responses on the EBAPS and NSKS and in the interview sessions.

The questions covered three broad areas:
1. The learner's general evaluation of laboratory instruction in the three broad areas of instructional methods associated with each laboratory activity (e.g., pre-laboratory activities, laboratory work, and post-laboratory activities).
2. The learner's perceptions of the pre- and post-laboratory activities in relation to laboratory work.
3. A cognitive domain self-assessment (reflection) of their learning outcomes from the laboratory activity.

The first area of the questionnaire probed the pedagogical features of laboratory instruction. The students were asked to evaluate how helpful they found each of the pedagogical features with respect to understanding and necessity of the laboratory learning experience. The pedagogical features are defined in the following three categories: (1) pre-laboratory activities, (2) laboratory work, and (3) post-laboratory activities. This section of the questionnaire was used to assist in answering research question two. The overall frequencies of responses were calculated and reported.

The second area of the questionnaire probed students' perceptions regarding the following four aspects of laboratory work: (1) understanding the laboratory work, (2) enjoyment in performing the laboratory work, (3) achievement in conducting the laboratory work, and (4) difficulty in doing the laboratory work. Students were asked to choose one statement for each aspect that best described their own position regarding that aspect. This section was used to assist in answering research question two by clarifying which laboratory instructional strategy (pre- or post-lab) the students found most beneficial. The overall frequencies of responses were calculated and reported, as sketched below.
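As an illustration only (the response labels below are placeholders, not the actual questionnaire wording, and Python is used here purely as a sketch), tallying the overall frequencies of responses for one questionnaire item amounts to a simple count-and-percentage summary:

from collections import Counter

# Hypothetical responses from several participants for one pedagogical feature.
responses = ["very helpful", "helpful", "helpful", "somewhat helpful", "not helpful"]

frequencies = Counter(responses)
total = len(responses)
for category, count in frequencies.most_common():
    print(f"{category}: {count} ({count / total:.0%})")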

The third area of the questionnaire was formulated using Bloom's categories in the cognitive domain (Berg, 2003). The learners were asked to describe the kind of learning they believed they gained in a particular laboratory activity. The participants evaluated their own learning outcomes on the scale very much, a lot, some, a little, or nothing for each of the Bloomian categories. This area of the questionnaire was used to assist in answering research question two concerning students' understanding of the laboratory material. The overall frequencies of responses were calculated and reported.

The goal of the questionnaires was to elicit general information on students' views of the three laboratory pedagogical features (pre-laboratory, laboratory work, post-laboratory) used during the semester course. Section one of the questionnaire, related to students' preferences for instructional tools within the three pedagogical features, was transformed into a quantitative form. Based on students' responses, five levels were used in this study as follows: level 1, least essential; level 2, somewhat essential; level 3, essential; level 4, very essential; and level 5, extremely essential. Level 1 was represented by 1 point and level 5 by 5 points. The goal of the open-ended questions, questions 7 and 8 on the questionnaire, was to elicit additional information on the instructional methods and on students' NOS and personal epistemological beliefs. The responses to these questions, along with the interviews, were compiled and organized to address the second research question.

Chemistry Laboratory Course Description

Introduction

The core ideals and pedagogy for the course laboratory outcomes are identified and discussed in the following section. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research.

The instructor acted as a facilitator during the laboratory sessions. The tone of the sessions was set for active student learning with the use of a student-centered pre-laboratory discussion. The instructor quite often relinquished control of the laboratory session to the students. The instructor moved from group to group, interacting with the students several times during the laboratory work session. The instructor asked guiding questions and redirected students to interact with other student laboratory pairs in their laboratory groups during the laboratory work.

All the students participated in the nine laboratory activities during the semester of the study. The exercises are presented in chronological order in Table 9. The instructor facilitated the laboratory sessions as in previous semesters, with no changes made to the original presentation or format. None of the pedagogical techniques were designed or changed in order to elicit changes in NOS or personal epistemological beliefs. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS belief changes in light of specific science laboratory instructional features for future research.

The laboratory activities occurred once a week during a three-hour lab period, with Lab 7 conducted as a dry lab. The chemistry department CHM 2045 laboratory manual was used in this study. Examples of portions of the pre-laboratory, laboratory work, and post-laboratory activities are located in Appendices G-K. The manual combines several versions of instruction: expository instruction, where the entire experiment is described with explicit instructions enabling participants to carry out an exercise after it is explained or demonstrated, and modified inquiry instruction, where the experiment is less structured, giving the student an opportunity to participate in the investigative plan. All the required chemicals and equipment not located in assigned laboratory drawers were made available for the participants.

Table 9
Topics of Laboratory Exercises

Chemistry 2045 Laboratory Experiments
1. Laboratory Orientation (LO)
2. Data Analysis & Physical Properties (DP*)
3. Matter Lab (ML)
4. Chemical Reactions-Stoichiometry (CRS*)
5. Activity Series Redox (ASR*)
6. Atomic Fingerprints (AF)
7. Molecular Shapes (MS)
8. Thermodynamics – Enthalpy (TE*)
9. Molar Volume (MV)

Organization of Course Laboratory Instruction

Introduction

The anticipated laboratory course outcomes are identified in Table 10. However, the anticipated outcomes were not specifically identified or predicted at the beginning of the study as possibly influencing the NOS or personal epistemological reasoning changes. The outcomes are based on normal laboratory objectives as well as standard laboratory activities. Whether the outcomes influenced the participants' beliefs is only considered during the post-interviews. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological and NOS beliefs in light of specific science laboratory instructional features for future research.

An overview of the organization of laboratory instruction is presented in Table 11. Student-centered pre- and post-laboratory assignments and discussions were introduced into the laboratory experiments. In this study, the instructional categories with specific pedagogical methods used in laboratory instruction that were compared are: (1) pre-laboratory activities (i.e., quiz, procedural flowcharts, interactive introductory PowerPoint, and predictions) and group discussion, (2) laboratory work activities (i.e., microcomputer techniques, traditional bench techniques, laboratory notebook recording, reflective questioning, and peer interaction), and (3) post-lab activities (i.e., analyzing qualitative and quantitative data, post-laboratory discussion, and writing a laboratory report).

Table 10
Anticipated Laboratory Course Outcomes

Conceptual and theoretical knowledge
  Clarifying and illustrating scientific theory
  Arouse curiosity and stimulate interest
  Connect chemistry to real world

Generic skills
  Academic culture
  Computer skills
  Cooperative learning
  Critical analysis
  Ethical behavior
  Knowledge skills
  Leadership
  Problem solving
  Proper use of references
  Self-regulation
  Team work
  Time management

Practical and scientific skills
  Apply statistical tests
  Deductive reasoning
  Develop manipulative skills
  Develop safe laboratory skills
  Error analysis
  Form predictions
  Interpret findings
  Make observations
  Proper use of equipment/instruments
  Properly present data
  Record and report observations
  Test predictions experimentally
  Trouble-shoot laboratory procedures

Table 11
Organization of Laboratory Instruction

Treatment | Potential Activities

Pre-laboratory
  Prior to class: Blackboard online quizzes, pre-laboratory questions, laboratory notebook
  In class: PRS – PowerPoint, pre-laboratory discussion

Laboratory Work
  1) Traditional bench work, microcomputer-based techniques, or a combination
  2) Recording of qualitative and quantitative data in the laboratory notebook
  3) Interact-reflect-discuss with lab partner(s)

Post-laboratory
  1) Post-lab discussion in class or online
  2) Written analysis of activity with results
  3) Student Reflective Assessment of Laboratory

Learners participated in pre-lab assignments to be completed prior to the lab meeting, then a pre-lab in-class discussion with their laboratory peer groups guided by the instructor, followed by the instructor clarifying experimental equipment and procedures, a brief overview by the instructor of any new equipment if necessary, and concluding with the learner performing the experiment. During the laboratory work the participants recorded data and reflected on the data at the end of class if time permitted. If time did not permit a post-lab discussion at the end of the session, the participants met outside the normal classroom time with their laboratory peer groups, or during a scheduled chat session on the course website guided by the instructor, and wrote a final report. Students worked in pairs and in teams of 2-4 pairs per group.

For laboratory activities 2 and 5 a Basic Lab Report (BLR) was written by each student or group of students. Each individual student wrote a Formal Lab Report (FLR) for Labs 4 and 8. For the remaining laboratory activities the students completed their analysis directly in their laboratory notebooks as per the laboratory notebook guidelines (see Appendix I).

The relationship of data collection to instruction is described in Table 12.

Pre-Laboratory Course Activities

The necessity for some form of pre-laboratory preparation is patently obvious. Pre-laboratory activities were used as a means to decrease the information overload on students. A learner entering a laboratory environment without some form of preparation is likely to spend excessive time in fruitless frustration, routine, and non-learning (Johnstone & Al-Shuaili, 2001). For this course the pre-laboratory activities were twofold. The participants performed out-of-class pre-laboratory activities prior to class, which were followed by 15-45 minutes of in-class pre-laboratory activities.

For each laboratory topic, students performed pre-laboratory activities located in the lab manual or on the course website and turned them in at the beginning of the laboratory period. Examples of portions of the pre-laboratory activities are located in Appendices H and J. The pre-lab prepared students through a series of online and pencil-and-paper exercises from the laboratory manual introducing and assessing prior knowledge of concepts, terms, and laboratory procedures.

The online activities designed for the laboratory portion of the course involved pre-laboratory activities. Each week, before entering the laboratory class, the students went online using the Blackboard course site, viewed a pre-laboratory presentation (see Appendix J), and took an online pre-laboratory quiz (Appendix H). The major advantage of the online pre-laboratory preparation is the consistency of laboratory preparation. Every student viewed the same presentation for a particular laboratory experiment. Therefore, the variability of the quality of the pre-laboratory presentation is removed.

In the pre-lab work students were asked to complete the online pre-lab practice quiz and pre-lab questions prior to the laboratory meeting. The pre-laboratory quiz and questions consist of a number of exercises of several types. Some focused on a particular type of calculation important in the experiment or on safety considerations, such as whether to wear goggles and gloves. Some focused on an important organizational or laboratory technique used in the experiment, while others introduced important terms, concepts, or nomenclature needed in the experiment. The week prior to the laboratory activity the students met in laboratory peer groups to discuss the factors they believed influenced the parameters they measured or observed, via e-mail, course web-site chat, or by holding a group discussion.

The pre-lab discussion was normally held during the first 15-45 minutes of the laboratory session and included a short PowerPoint interactive quiz using the PRS clicker during laboratory activities 2, 4, 5, and 8. The pre-lab discussion consisted of the students cooperatively engaging in peer laboratory group discussions, demonstrations, and activities on procedural as well as conceptual issues, including use of available classroom technology (e.g., MBL computer probes, interactive PowerPoint introduction using PRS clickers). For laboratory activities 3, 6, 7, and 9 the students were given a brief overview of the procedure and safety concerns during the first 5-10 minutes of the class, with further lab discussion occurring during and after the laboratory work. This was done to assist in determining whether the students preferred a detailed pre-lab discussion prior to the laboratory work or after the completion of the laboratory work. This assisted in assessing students' reflections on section two of the laboratory reflective questionnaire.

The following is a list of portions of the course pre-laboratory discussion. The discussion was guided, but not explicitly directed, by the instructor.
- Discuss pre-laboratory questions as a class
- Discuss safety and procedural concerns as a class
- Decide what data to gather and how to accomplish it with their partner and laboratory groups
- The groups collaboratively prepare a class data table on the front board
- Determine who should be responsible for individual tasks, such as collecting original data, performing replications, etc.

The students discussed their pre-laboratory questions at the start of class. After completing the assigned reading for the laboratory experiment, each student came to the laboratory with the pre-laboratory questions from the laboratory manual completed and a list of any other questions they might want to discuss. The instructor allowed the students the opportunity and the time to discuss these questions with each other. The students formed groups and decided on a specific question they would like answered. For instance: How does the limiting reagent affect the percent yield? Other questions often presented for consideration included: (1) Are there any other safety considerations? (2) What procedures will be followed or changed? and (3) What information will need to be gathered? These questions set the stage for the laboratory work interactions that took place during the experiment. The instructor used these questions to set up the framework for the experiment.

Laboratory Work Course Activities

During the laboratory work, student laboratory pairs and groups organized themselves and worked together to collect experimental information in a collaborative manner. The instructor moved among the laboratory groups keeping the students on task, asking guiding questions, and redirecting student questions to other laboratory pairs in their groups, such as: (1) How do we measure a certain variable? and (2) What is your goal in performing this step? As each laboratory pair generated data, the information was recorded on the class data table on the front board as well as in their laboratory notebooks. During the laboratory work the instructor attempted to guide the students in making meaning by examining patterns or trends occurring during the experiment. For instance, questions were asked that encouraged reasoning, such as:
- What did you find when you did this earlier?
- What will happen if you increase the amount of this substance?
- How does this relate to the group data?

Once or twice during the laboratory work the instructor stopped the activity to go over questions concerning the concepts, data, and procedures, because some students had the same questions or problems. The students were asked during the course of the laboratory session how the laboratory activity related back to the concepts, in order to help them connect theory with process. The students studied their results as well as the class results to determine whether they needed to repeat steps to replace inconsistent data. The class data were pre-analyzed and discussed as a group prior to the end of the laboratory session or afterwards during online chats.

During laboratory work the students engaged in activities that solely implemented traditional bench methods, some that combined traditional bench methods with microcomputer-based technology, and some that relied heavily on microcomputer-based technology (see Table 12). Laboratory activity 1 (LO) introduced the students to the equipment, safety regulations, and basic scientific format of laboratory science. The format for the physical property portion of Lab 2 (DP) was more structured, as this was many of the students' first experience with the Vernier microcomputer-based laboratory (MBL) system and sensors. Students were provided a basic outline of the microcomputer-based program (Vernier) and sample data and calculations for a similar situation, followed by an overview of what they might observe. The students were then guided through how to perform an analysis and interpretation with a typical MBL system using sample data. The measurement activity engaged students in collecting and analyzing quantitative data. Labs 3 and 4 (ML, CRS) are progressively less structured, providing students with general traditional bench procedural options, a statement of objectives, safety considerations, and a review of the basic concepts related to the lab. The fifth lab (ASR) called on students to perform traditional bench chemistry in the form of an analysis of chemical reactivity. Lab 6 (AF) presents in part real-life chemistry, with the learners engaged in traditional methods to study the concepts surrounding the electromagnetic spectrum and the atom. The only activity performed that used neither traditional bench nor MBL methods is Lab 7 (MS), which dealt with molecular geometry. Labs 8-9 (TE, MV) both involved student use of the MBL.

In this course a learner's laboratory notebook, discussed in chapter two, is defined as a set of entries written by the learner that reflect investigative experiences within the chemistry laboratory. Thus the laboratory notebooks reflect both learning and instruction as it occurs.

Students followed the general laboratory notebook guidelines located in the laboratory manual (see Appendix I). Students were taught proper notebook organization, as well as how to record their procedures, the observations they made as they performed their laboratory work, and the ideas that they had related to the work. In this course students collected records of their laboratory investigations in their laboratory notebooks and later transformed these data into figures, graphs, tables, and schemas, interpreted their results, and made knowledge claims.

During this portion of instruction the students were expected to interact with their assigned lab partner, as well as with the other members of their laboratory team at their assigned laboratory station (see Appendix J). Laboratory work provided students opportunities to learn from their mistakes, problem solve in an experimental environment, and improve their laboratory skills. Performing more than one trial, collecting class data, and interacting with their laboratory team can introduce important aspects of real science, such as the collaboration of a community of scientists.

Post-laboratory Course Activities

The class data were pre-analyzed and discussed as a group prior to the end of the laboratory session or afterwards during online chats. The post-lab discussion was held during the last 15-45 minutes of the laboratory session, or during a set time scheduled by the students at a location on campus, or during the week online at a set scheduled time prior to the next laboratory session. This discussion consisted of the students cooperatively engaging in peer laboratory group discussions of their results and the class data and discussing procedural as well as conceptual issues that may have related to their final analysis. The students examined the pooled data and looked for trends.

The discussions included some of the following elements, depending on whether the discussion was held during the laboratory session or outside later in the week: (1) Warming up – planning the discussion: the participants evaluated the purpose of the discussion, the duration, technical details, etc., and decided how they wished to proceed; (2) Discussion: free discussion within the laboratory group, during which the instructor, if present, intervened only if the group seemed to need help; and (3) Summaries: from time to time during the discussion the participants summarized certain points. This brought more clarity to the discussion and more validity to the data, and gave students the chance to clarify misunderstandings.

The structure of the discussions was seldom so rigid that there were clear lines between the aforementioned elements. The discussions did not consist of specific questions; rather, a collection of some possible questions was offered for the students to consider (Appendix H). During this portion of the laboratory instruction participants were asked to reflect on what they could claim, the evidence for the claim, how their results compared to others', and what connections could be made between lecture and lab based on their results. The students were encouraged to make explicit associations among claims, data, evidence, and observations.

After each laboratory the students assessed the laboratory instructional methods using the Student Reflective Assessment of Laboratory Methods Questionnaire (see Appendix F). The students were required to write four laboratory reports using the laboratory report guidelines located in the laboratory manual. For laboratory activities 2 and 5 the students wrote a report using the BLR format, and they wrote a FLR for Labs 4 and 8 (Appendix K). For the remaining laboratory activities the students performed a brief analysis in their laboratory notebooks.

Table 12
Relationship of Data Collection to Instruction

Week(s) | Data Collection | Instruction (per week) | Instructional Method(s)
1-3 | Chemical Concepts Inventory Pre-Assessment (CCI); Epistemological Beliefs Pre-Assessment for Physical Sciences (EBAPS); Nature of Scientific Knowledge Scale (NSKS) | Lab-1 Laboratory Introduction, Lab Notebook (LNB) (TB-MBL) | Expository & Discovery
2-6 | Initial Interviews; Student Assessments of Laboratory Methods | Lab-2 Data Analysis and Physical Properties (TB-MBL), BLR-1 | Discovery
4 | Student Assessments of Laboratory Methods | Lab-3 Matter Lab (TB), LNB | Discovery
5 | Student Assessments of Laboratory Methods | Lab-4 Chemical Reactions-Stoichiometry (TB), FL-1 | Expository & Discovery
8 | Student Assessments of Laboratory Methods | Lab-5 Activity Series Redox (TB), BLR-2 | Discovery & Inquiry
9 | Student Assessments of Laboratory Methods | Lab-6 Atomic Fingerprints (TB), LNB | Expository & Discovery
10 | Student Assessments of Laboratory Methods | Lab-7 Molecular Shapes (MS) Dry Lab, LNB | Expository
11-12 | Student Assessments of Laboratory Methods; EBAPS (post) | Lab-8 Thermodynamics (TB-MBL), FL-2 | Discovery & Inquiry
12-13 | Student Assessments of Laboratory Methods; NSKS (post) | Lab-9 Molar Volume (MBL), LNB | Discovery
13 | Final Interviews | Lab Review | -
14-16 | Final Interviews | Lab Practical | -

Data Collection

The data collection process in this study occurred in three phases. During the first phase data were collected regarding students' initial NOS and personal epistemological beliefs as well as their prior skills and knowledge related to chemistry. During phase two, initial interviews were performed with the twenty volunteers from the population sample (n=56) concerning their NOS and personal epistemological beliefs about science. In addition, during this phase student laboratory instruction reflections were collected (n=56). Phase three involved post-administration of the NOS and personal epistemological beliefs assessments (repeated measures) as well as final interviews with the twenty volunteers concerning what laboratory instructional strategies students believed influenced their understanding of the laboratory material, as well as their NOS and personal epistemological beliefs about chemistry. Each phase is described briefly in the following section and in greater detail later in the chapter in regard to the setting and sample, context, materials used, and the procedures for the measures. The quantitative phase includes a discussion of the survey instruments utilized for this study. The qualitative phase describes the interview process and what questions were asked. A data collection timeline is described in Table 13.

Researcher's Role

There is a certain element of bias that this researcher brings to the study as the major laboratory instructor. Threats to the validity and integrity of the data were minimized as described below and at the end of this chapter. The course was presented and taught in the same manner it had been taught during the prior two years by the researcher. The instructor facilitated the laboratory sessions as in previous semesters, with no changes made to the original presentation or format.

The initial assessments (CCI, EBAPS, NSKS, laboratory skills questionnaire) were administered and collected by the researcher and a graduate student teaching within the department. The aforementioned method of instrument administration was repeated with the post-administration of the EBAPS and NSKS at the end of the semester study. The laboratory instruction questionnaires were collected from the students each week after each laboratory session. The participants placed the questionnaires in a labeled envelope out of the view of the researcher to avoid any conflict of interest with the researcher's role as the instructor. Analysis (coding) of the questionnaires occurred after the completion of the semester, when grades had been assigned and entered. The interviews were performed by a trained outside interviewer (a graduate student) within the education department to avoid interference with data collection and interpretation. The reliability and validity issues are discussed further at the end of this chapter.

Table 13
Data Collection Timeline

Week(s) | Data Collection | Sample Size
1-3 | Chemical Concepts Inventory Pre-Assessment (CCI); Epistemological Beliefs Pre-Assessment for Physical Sciences (EBAPS); Nature of Scientific Knowledge Scale (NSKS) | 56
2-3 | Initial Interviews | 20
2-14 | Student Assessments of Laboratory Methods | 56
14-15 | EBAPS and NSKS (post) | 56
15-16 | Final Interviews | 20

Phase One: Quantitative

During the first phase the researcher presented a general orientation of the study during the introductory session of the first week of the laboratory course. The Chemical Concepts Inventory (CCI), the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS), the Nature of Scientific Knowledge Scale (NSKS), and a laboratory skills questionnaire were prepared as a survey package to be completed by all participants and administered by a graduate student teaching within the department (see Appendices A-D). The Epistemological Beliefs Assessment for the Physical Sciences (EBAPS) and the Nature of Scientific Knowledge Scale (NSKS) were administered as pre- and post-assessments to all participants. In addition to these instruments, the package contained an invitational letter describing the study and a participant consent form with a page requesting demographic information (see Appendix L).

The CCI was used to examine the participants' prior knowledge in chemistry. The EBAPS was used to examine the participants' initial beliefs at the beginning of the semester course and their final personal epistemological beliefs about the physical sciences upon completion of the course. The NSKS was used to examine the participants' initial NOS beliefs at the beginning of the semester course and their final NOS beliefs upon completion of the course, as well as to provide supplemental support for their epistemological beliefs. The laboratory skills questionnaire was used to examine the participants' views and skills concerning laboratory work.

Phase Two: Qualitative

The survey results were compared and contrasted with the results of the second phase of the data collection process, which included qualitative data collection with initial semi-structured interviews of the participant volunteers (Appendix F), to further assess students' initial NOS and personal epistemological beliefs.

During the course of laboratory instruction, data concerning laboratory instructional strategies were collected and analyzed with the use of the Student Evaluation of Laboratory Instruction Questionnaire.

The type of interview used in this study was a semi-structured interview. Interview participants were selected on a volunteer basis. The interviews were structured because they were planned, taped, and driven by some guidelines. On the other hand, they were semi-structured because the interviewer used probes and follow-up questions based on the responses of the interviewee. The participants entered into a dialogue with the interviewer, allowing one to listen to the data for clues about students' beliefs, experiences, and perceptions that provided data to address the problem and research questions (Hatch, 2002).

Phase Three: Quantitative and Qualitative

The final phase involved participants retaking the EBAPS and NSKS surveys in order to determine if there was a change in their beliefs by the completion of the semester course. In addition, those participating in the initial interviews participated in a final end-of-semester interview. During this interview the interviewer collected data to assess further whether the participants' NOS or personal epistemological beliefs had changed and what role the laboratory instructional strategies played in those belief changes.

In-Depth Semi-structured Interviews

The primary purpose of the interviews was to clarify the epistemological and NOS beliefs held by the participants so that these beliefs could be compared to the assessment instruments. Rather than being bounded by only the measuring instruments, the interviews enabled me to gain a clearer understanding of the participants' beliefs and thoughts.

Participants were interviewed by an outside interviewer at two points during the semester: before the end of the first month of the semester and during the final month. The interviews were guided by an interest in hearing individuals use their own words to express their personal views. The interviews were semi-structured (Appendix F), with the primary questions pre-planned and standardized to minimize interviewer effects. The questions were presented in the same general sequence, but the interviews varied slightly depending on the student responses. In addition, the probe questions varied depending on student responses.

The participants were interviewed in a university office by an outside interviewer. The initial interviews lasted between 15 and 20 minutes and were audio-taped for transcription purposes. The final interviews lasted between 30 and 45 minutes and were also audio-taped. Interview times were extended as needed to allow the participants to express their ideas.

The initial interview protocols (Appendix F) included general questions and/or statements exploring participants' initial NOS and personal epistemological beliefs. The interviewer presented the participant with a particular question and asked the participant to offer a position. For instance, one question related to personal epistemological beliefs involved the participant reacting to the following: "Science is a weakly connected subject consisting mainly of facts and formulas without much structure versus being a strongly connected and highly structured subject." Another question related to participants' NOS beliefs was "Scientific knowledge is a changing and evolving body of concepts and theories." A full account of these questions is provided in Appendix F. These questions were open-ended to encourage the participants to explain their beliefs.

The probe questions used during the initial interviews by the interviewer included those listed in Tables 14 and 15 (Appendix F). The probe questions were adapted from King and Kitchener (1994) and Carey et al. (1989).

The probe questions were designed to elicit ratable data from the students, so that they would explain more completely why they had chosen a particular response as well as why they had discarded the remaining responses.

The final interview protocols (Appendix F) included questions and/or statements exploring participants' NOS and personal epistemological beliefs by the end of the course, as well as questions pertinent to instructional practices as experienced by the students. Once again the interviewer presented each interview participant with questions such as "What instructional feature (pre-lab, laboratory work, or post-lab) was the most effective in promoting your learning in this course?", "How would you rank the following aspects of each instructional feature (least essential to extremely essential)?", and "What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the evolving knowledge of science in this course?"

For the final interviews, similar probes were used for select questions (Appendix F) related to the initial and final EBAPS and NSKS survey results, along with comments from the students' reflective laboratory assessment questionnaires, in an attempt to see if participants could explain, in some cases, why they might have changed their answers from the beginning of the semester for those questions to which they responded differently during the initial assessment(s) or interview.

In addition to audio-taping and transcribing, notes and observations were taken during each interview by the outside interviewer. Short summaries of each interview were composed in order to provide a contextual background for each interview.

Grounded theory analytical procedures were used to inductively analyze the participants' interview responses. These procedures involved (1) the simultaneous collection and analysis of interview data, (2) comparative methods of analysis whereby participants' responses were compared among one another and within each participant, and (3) the integration of a theoretical framework.

The data analysis is discussed in more detail later in this chapter.

Table 14
Interview Probe Questions (King & Kitchener, 1994, p. 1020)

Probe Questions
1. What do you think about this statement?
2. How did you come to hold that point of view or answer?
3. On what do you base that point of view or answer?
4. Can you ever know for sure that your position on this issue is correct? How or why not?
5. When two people differ about matters such as this, is it the case that one opinion is right and one is wrong? If yes, what do you mean by "right"? If no, can you say that one opinion is in some way better than the other? What do you mean by "better"?
6. How is it that people have such different points of view about this subject?
7. How is it possible that experts in the field disagree about this subject?

Table 15
Probe Questions – Unpacking Interview Terms (Carey et al., 1989)

What do you mean by ________?
Answer, Helps, Theory, Conclusion, Learn, Truth, Discover, Procedure, Try again, Equipment, Proof, Try out, Explanation, Test, Understand

Summary of Data Collection

Introduction

Qualitative and quantitative data collection mixed-measures were employed in three phases during this study of fifty-six students in three sections of a first-semester general chemistry laboratory class taught by the researcher and other graduate students. A consent form (Appendix L) was signed and collected from each participant before the administration of the instruments. The students were guaranteed that all the data they provided would be kept strictly confidential, so that only the researcher(s) would have access to the personal data.

The data gathered during the study were analyzed to determine the answers to the two main research questions and their sub-questions. The questions in general focused on the initial and final personal epistemological and NOS beliefs held by the participants, as well as the role that the instructional features (pre-lab, laboratory work, or post-lab) played in their learning and beliefs. The major sources of data gathered throughout the study included:
- Participants' pre- and post-responses to the EBAPS and NSKS;
- Participants' responses to the open-ended laboratory questionnaire; and
- Transcriptions from initial and final semi-structured interviews with the participants.

Grounded theory analytical procedures were used to inductively analyze the participants' interview responses. These procedures involved (1) the simultaneous collection and analysis of interview data and (2) comparative methods of analysis whereby participants' responses were compared among one another and within each participant.

Instruments

The researcher and a department graduate student administered the initial study instruments (CCI, EBAPS, NSKS, and initial lab questionnaire) to each class at a prearranged time during the first laboratory session. The participants were informed by the researcher of the purpose of the instruments and were provided instructions about how to answer them. The researcher informed the students that they needed to give honest responses. After the students completed the instruments, their answers were collected by the researcher and the graduate student to be analyzed at a later date. The data of cases that dropped the course or that failed to complete a majority of the components of the study were discarded prior to further analysis. During the last two laboratory activities the EBAPS and NSKS were re-administered to all remaining participants. The relationship of data collection to instruction was identified in Table 12.

Semi-Structured Interviews

Following completion of the initial surveys, students were selected from volunteers in the study sample to participate in the initial interviews in order to gain a deeper understanding of the patterns of student responses to certain assessment questions. Approximately 35% of the students (N=20) from the participating general chemistry laboratory courses volunteered and participated in the interviews. The interviews were held on campus at scheduled times outside of the normal laboratory class period. The participants from the initial interview participated in the final interview to determine if their NOS or epistemological beliefs had changed and the extent to which, if at all, laboratory instruction influenced those changes. The interviews were audio-recorded for transcription and further analysis.

During the interviews, the interviewer presented the participant with particular question(s), pre-determined from the responses on the surveys, and asked the participant to offer a position (Appendix F).

For instance, some of the questions asked the participants whether they could attempt to explain or expand their answer and why they might have changed their answer from the beginning of the semester for those questions to which they responded differently during the initial survey or interview. Following an articulation of a rationale, the participants were asked to explain more completely why they chose a particular response as well as why they had discarded the remaining responses. The participants were given a chance to reflect on their position and clear up any misinterpretations. This method allowed for initial member checking.

In addition to audio-taping and transcribing, notes and observations were taken during each interview by the outside interviewer. Short summaries of each interview were composed in order to provide a contextual background for each interview. These summaries were used as member checks.

Data Analysis

Introduction

A mixed-methods descriptive approach to data analysis, using both quantitative and qualitative data, was used to analyze and then compare the data in order to generate the most rigorous description of the participants' images of science and epistemological beliefs and the influence that laboratory instruction may have had on changing those images or beliefs. This approach is necessary when quantitative measures (CCI, EBAPS, and NSKS) are employed. It allowed a numerical assessment of students' beliefs and understanding as opposed to making predictions or inferences.

The data analysis was performed with the Statistics Package for the Social Sciences (SPSS) software, version 15. Descriptive statistics such as frequencies, means, and standard deviations were computed to summarize the participants' responses to the pre-post assessments. A paired-samples t-test (repeated measures) was used to compare the pre-post mean scores for the participants.

The variability for the paired-samples t-test was calculated using eta squared. The effect size (d) was interpreted using the guidelines from Cohen (1998). In this dissertation, effect sizes were calculated as the mean gain score (mean at Time 2 minus mean at Time 1) divided by the pooled standard deviation of the Time 1 and Time 2 scores. To interpret the effect size values, the following guidelines from Cohen (1998) were used: 0.20 = small effect, 0.50 = moderate effect, and 0.80 = large effect. Pearson product-moment correlations were used to determine the degree to which quantitative variables were linearly related.

The variability for the paired-samples t-test was calculated using the formula for eta squared. Eta squared can range from 0 to 1 and represents the proportion of variance in the dependent variable that is explained by the independent variable. To interpret the eta squared values, the following guidelines from Cohen (1998) were used: 0.01 = small effect, 0.06 = moderate effect, and 0.14 = large effect. Variability is defined here as t-squared divided by t-squared plus the sample size minus one; that is, eta squared = t^2 / (t^2 + N - 1).

CCI Analysis

Quantitative. The CCI was administered to all participants pre-instruction as a means of gauging the participants' prior chemistry knowledge. The data analysis was performed with the Statistics Package for the Social Sciences (SPSS) software through the use of descriptive statistics (frequencies, means, and standard deviations) to summarize the participants' responses to all quantitative assessments. The scantron forms were scanned using the CCI Key (see Appendix M), and the data were stored on a CD in a locked filing cabinet.

EBAPS Analysis

Quantitative. The EBAPS was administered to all the participants both pre-instruction and post-instruction as a means of quantitatively gauging the individual and overall changes in personal epistemological beliefs concerning the learning of science and the nature of scientific knowledge during instruction. Each item on the EBAPS was scored on a scale of 0 (least sophisticated) to 4 (most sophisticated) (see Appendix N). The scoring scheme is non-linear to take into account question-by-question variations (see Tables 7, 8, and 16) in whether, for instance, neutrality is more or less sophisticated. A subscale score is the average of the learner's scores on every item in that subscale. When an item within a given subscale is left blank, the average is calculated without that item included. Multiplying through by 25 allows one to report subscale scores on a scale of 0 to 100. The total score is the average of students' scaled scores on all 30 items (Elby et al., 1999). The data analysis was performed with the Statistics Package for the Social Sciences (SPSS) software and Microsoft Excel. Further statistical analysis was performed as needed and discussed earlier in this chapter. Refer to Appendix N for the EBAPS Scoring with Excel Template (Elby et al., 1999).

Table 16
EBAPS Coding Subscales (adapted from Elby et al., 1999)

EBAPS Subscale (color coding) | Items
Structure of Knowledge (red) | 2, 8, 10, 15, 17, 19, 20, 23, 24, 28
Nature of Learning (orange) | 1, 7, 11, 12, 13, 18, 26, 30
Real-life Applicability (green) | 3, 14, 19, 27, 28
Evolving Knowledge (blue) | 6, 29
Source of Ability to Learn (purple) | 5, 9, 16, 22, 25
No subscale (black) | 4, 21
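To make the subscale-averaging scheme just described concrete, the following minimal Python sketch (an illustration, not the Elby et al. scoring template actually used) groups already-converted 0-4 item scores into the Table 16 subscales and scales them to 0-100. The function name and data structures are assumptions; the non-linear item-by-item conversion itself is handled by the Excel template in Appendix N and is not reproduced here.

EBAPS_SUBSCALES = {
    "Structure of Knowledge": [2, 8, 10, 15, 17, 19, 20, 23, 24, 28],
    "Nature of Learning": [1, 7, 11, 12, 13, 18, 26, 30],
    "Real-life Applicability": [3, 14, 19, 27, 28],
    "Evolving Knowledge": [6, 29],
    "Source of Ability to Learn": [5, 9, 16, 22, 25],
}

def ebaps_scores(item_scores):
    """item_scores: dict mapping item number (1-30) to a 0-4 score, or None if left blank."""
    answered = {item: score for item, score in item_scores.items() if score is not None}
    subscales = {}
    for name, items in EBAPS_SUBSCALES.items():
        values = [answered[i] for i in items if i in answered]
        # Average the 0-4 scores of the answered items, then multiply by 25 (0-100 scale).
        subscales[name] = 25 * sum(values) / len(values) if values else None
    # Total score: average of the scaled scores over the answered items.
    overall = 25 * sum(answered.values()) / len(answered)
    return subscales, overall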

Qualitative. The purpose of the EBAPS was to qualitatively gauge the interview participants' initial epistemological understanding and any changes in their epistemological development. Using the qualitative data transcribed from the interview sessions and the results from the EBAPS, participants' levels of epistemological belief development were tentatively identified and offered as support for initial and final beliefs with the Epistemological Beliefs Assessment for Physical Sciences scale (Table 7).

NSKS Analysis

Quantitative. The NSKS was administered to all the participants both pre-instruction and post-instruction as a means of quantitatively gauging the individual and overall changes in NOS beliefs during instruction. Composite scores (i.e., the addition of subscale scores) of learner change for the three NSKS subscales that distinguish between the instrumentalist and realist positions in learners' images of science were used in the study (see Appendix O). Subscales are composed of eight items, four positive and four negative, corresponding to each of the factors in a Model of the Nature of Scientific Knowledge, i.e., the amoral, creative, developmental, parsimonious, testable, and unified subscales. Subscale scores are calculated by summing the appropriate eight items of a given subscale after reflecting the scores of the negative items. Following this scoring scheme, a maximum score of 40 points for each subscale and 240 points for the entire NSKS is possible. Further statistical analysis was performed as needed.

The range of scores for each subscale is 8 to 40 points. For each subscale, a score of 24 points indicates a neutral position, while a score between 25 and 40 is within the accepted view of the NOS, or that of an instrumentalist, and a score between 8 and 23 is within the unaccepted view of science, or that of a realist. The overall score for all six subscales ranges from 48 to 240 points. A score of 144 on the overall scale is considered neutral, while scores ranging from 145 to 240 are within the accepted view of the nature of science, moving towards instrumentalism, and scores ranging from 48 to 143 are within the unaccepted view, moving towards realism (see Appendix O).
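The following minimal sketch (in Python, offered only as an illustration of the mechanics described above) reverse-scores negatively worded items, sums the eight items of a subscale, and classifies an overall 48-240 score using the stated cutoffs. The item numbers and the identity of the negatively worded items in the example are placeholders; the actual scoring key is given in Appendix O.

def reflect(response):
    # Reflect a 1-5 Likert response so negatively worded items share the same direction.
    return 6 - response

def nsks_subscale_score(responses, negative_items):
    """responses: dict mapping item number -> 1-5 response for the eight items of one subscale."""
    total = 0
    for item, response in responses.items():
        total += reflect(response) if item in negative_items else response
    return total  # ranges from 8 (lowest) to 40 (highest) per subscale

def classify_overall(overall_score):
    """Classify the 48-240 overall NSKS score using the cutoffs described above."""
    if overall_score < 144:
        return "realist (unaccepted view)"
    if overall_score == 144:
        return "neutral"
    return "instrumentalist (accepted view)"

# Example with hypothetical data: one subscale whose items 3 and 7 are negatively worded.
example_subscale = nsks_subscale_score(
    {1: 4, 3: 2, 7: 1, 12: 5, 18: 4, 25: 3, 33: 4, 41: 5},
    negative_items={3, 7},
)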

Qualitative. The purpose of the NSKS was to qualitatively gauge the interview participants' initial understanding of the NOS and any changes in their NOS beliefs. Using the qualitative data transcribed from the interview sessions and the results from the NSKS, along with the EBAPS, the participants' level of development was tentatively identified with the NSKS scale located in Appendix O.

Semi-Structured Interviews

To ensure the reliability of the coding scheme, the coding scheme and data were given to other colleagues following complete coding by the principal researcher. Those researchers coded the data, and the results were compared to ensure that another person would code the data the same way. After the first repetition of the other researchers coding the data, the coding scheme was revised, simplified, and clarified.

Initial interviews were conducted after the administration of the CCI, EBAPS, and NSKS in order to gain a deeper understanding of the patterns of student responses to certain assessment questions. Initially the interview participants were to be selected on the basis of their scores (high, middle, low) on the CCI and their responses on the NSKS and the EBAPS questionnaire; however, due to participants' busy schedules and a small sample size (N=56), volunteers were requested. Approximately 35% of the students (N=20) from the participating general chemistry laboratory courses volunteered to participate in the interviews. Interview methods are discussed below, in the section of this chapter titled Data Collection, and in Appendix F. Students were asked questions directly pertaining to their NOS and personal epistemological beliefs during the initial interview.

During the final interview they were asked to reflect on their beliefs as related to instruction. The interviewer probed the students' responses and comments concerning components of the laboratory questionnaire. The data obtained from the interviews were used to explore possible student experiences and beliefs that led to specific responses and/or changes on the NSKS, EBAPS, and laboratory questionnaire.

Grounded theory analytical procedures were used to inductively analyze the participants' interview responses. These procedures involved (1) the simultaneous collection and analysis of interview data, (2) comparative methods of analysis whereby participants' responses were compared among one another and within each participant, and (3) the integration of a theoretical framework. To analyze the interviews, the researcher read through both sets of transcripts, making preliminary notes regarding patterns that emerged from individual participants. The dimensions of the EBAPS and NSKS were used to develop the coding patterns. The data collected from the sets of interview responses were coded using the dimensions of the instruments (EBAPS and NSKS) discussed in chapters two and three. The transcribed interview data were read looking for patterns, relationships, and other themes within the dimensions. Entries were coded according to the patterning identified, while keeping a record of which entries went with which element of the patterns. In other words, the data were read and then chunked based on common language. The coding scheme is discussed further in subsequent chapters.

Reliability and Validity in Qualitative Research

Introduction

The importance of providing checks and balances to maintain acceptable standards is a necessary component of any research inquiry. In effect, the need for rigorous data collection and analytic methods has to be addressed.

The traditional method of judging the rigor of a research inquiry is by the use of several of the following six strategies: prolonged engagement, triangulation, peer debriefing and support, member checking, negative case analysis, or auditing (Padgett, 1998; Guba & Lincoln, 1989; Lincoln & Guba, 1985).

Trustworthiness

Researchers who frame their studies in an interpretive model think in terms of trustworthiness as opposed to the conventional criteria of internal and external validity, reliability, and objectivity (Denzin & Lincoln, 1994; Lincoln & Guba, 1985; Padgett, 1998). Lincoln and Guba (1985) suggest that the "trustworthiness" of a qualitative study allows a researcher and audience to evaluate the value of the results. Denzin and Lincoln (1994) suggest that four factors be considered in establishing the trustworthiness of findings from qualitative research: credibility, transferability, dependability, and confirmability. An inter-rater or peer check on the coding of the interview responses by a minimum of two raters checked reliability.

Credibility

Credibility refers to the confidence one can have in the truth of the findings and can be established by various methods. Three credibility methods are triangulation, member checking, and negative case analysis. With respect to triangulation, data from multiple sources through multiple methods (i.e., interviews, surveys, and reflective questions), non-participant observation, and document reviews were employed. Triangulation is a form of corroboration that allows the researcher to be more confident of the study's conclusions. Triangulation of the outcomes produced by the initial and final interviews and the Student Assessments of Laboratory Methods questionnaire was used to assess the influence of the laboratory instructional methods, as well as the EBAPS and NSKS pre-post assessments.

EBAPS and NSKS pre-post assessments. This procedure was particularly important in addressing the sub-questions of research question 1 and research question 2.

Prolonged engagement means being present at the site where the study is being done long enough to build trust with the participants, experience the scope of variation, and overcome distortions due to the presence of the investigator at the site. This may involve an entire year or longer, or it could mean as little as a month or a semester. If the investigator is on site long enough to see the range of things to be expected, the results produced will be more credible. This study lasted for one semester.

Persistent observation is a practice that checks depth of experience and understanding. To be persistent, the investigator must explore details of the phenomena under study to a deep enough level to decide what is important and what is irrelevant, and focus on the most relevant aspects.

In studies of this nature (involving repeated measures), completing the initial responses to an instrument could influence responses on the repeated administration of the instrument. A testing effect can occur when the pre-assessment itself influences the post-assessment. The reliability of the assessment instruments may also change with the human ability to measure differences (due to experience, fatigue, etc.). Therefore, initial and final interviews were implemented to assist in checking the validity of the participants' scores on the EBAPS and NSKS. The initial scores of the interview participants were compared to their initial interview responses. This method was repeated with the final scores and interviews.

Interviews, observations, and surveys are time-consuming, but they were the main data-gathering methods. During the field observations and interviews the researcher simply cannot afford to rush through or skirt around the issues.


Member checking involves checking the accuracy of facts and observations as data collection transitions into data analysis. Crosschecking encourages self-awareness and self-correction. All interview participants were shown transcribed summaries of their initial and final interviews to verify their accuracy. After the initial analysis of the study, feedback on some of the findings was obtained through peer reviews from individuals in the field who did not participate. Individuals from the research site were asked to confirm the accuracy of the observations as well as comment on whether the interpretations ring true and are meaningful. This process provided participant validation of the findings.

Applicability

Applicability or transferability means, in essence, that other researchers can apply the findings of the study to their own. To provide for applicability, the study presents the findings with "thick" descriptions of the participants, the data collection procedures, the analytic procedures, and the emergent patterns.

Dependability

According to Denzin and Lincoln (1994), dependability refers to the stability of the findings over time, and confirmability to the internal coherence of the data in relation to the findings, interpretations, and recommendations. The logic for selecting participants and events to observe, interview, and include in the study was clearly presented. A technique for assessing dependability is the dependability audit, in which an independent auditor reviews the activities of the investigator. Once again, this was accomplished with a peer review.


Confirmability

Confirmability refers to the quality of the results, in other words the degree to which qualitative data and their interpretations can be authenticated. The techniques used for establishing credibility, such as data triangulation, investigator triangulation, and member checking, are important for building confirmability. An audit trail can be used to accomplish dependability and confirmability simultaneously (Lincoln & Guba, 1985; Padgett, 1998). The audit trail for this study includes detailed notes regarding data collection, data analysis, and any modifications made.

Summary

This chapter described the planned design and methodology of the research study. The purpose of this study was to explore the theoretical and conceptual frameworks and describe the empirical research pertinent to the development of student images of science and epistemological beliefs during the course of laboratory instruction. Section one restated the purpose of the study, elaborated on the rationale behind the research questions, and presented an overview of the analysis, design, and methodology. Section two described the context and participants of the setting. Section three discussed the research instruments, measures, and techniques, which include: (1) the Chemical Concepts Inventory, (2) the Epistemological Beliefs Assessment for the Physical Sciences, (3) the Nature of Scientific Knowledge Scale, (4) the Students' Reflective Assessment of Laboratory Methods, and (5) in-depth semi-structured interviews. Section four identified the forms of treatment (pedagogy) involved in the laboratory instruction; it offered an overview of the laboratory environment followed by a discussion of the three general areas under consideration for this study: pre-laboratory, laboratory work, and post-laboratory. Section six of this chapter summarized how data were collected during the study, with a general overview of the phases of data
collection and the researcher's role during the study. Section seven briefly summarized how the data were analyzed. In addition, this chapter described the potential quantitative and qualitative analysis methods implemented. The final section discussed the aspects used in monitoring the reliability and validity of the data collection and analysis.

Chapter four presents a description of the participant sample followed by the presentation of the quantitative analyses of the study's first research question and sub-questions. The questions are presented with the quantitative results of the analyses for all the participants (N=56) and for the twenty who participated in the interviews. The results are discussed and related back to the key NOS and personal epistemological beliefs literature.


Chapter Four: Quantitative Findings

Introduction

Given the mixed-methods nature of this study's findings, the presentation of the data is necessarily embedded in a description of the findings in chapters four, five, six, and seven. This chapter presents a description of the participant sample followed by the presentation of the quantitative analyses of the study's first research question and sub-questions. The questions are presented with the quantitative results of the analyses for all the participants (N=56) and for the twenty who participated in the interviews. The results are discussed and related back to the key laboratory education literature as well as the NOS and personal epistemological beliefs literature.

Chapter five presents a description of the development of the participants' personal epistemological beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-b. The characterization of personal epistemological beliefs is presented with the results of the analyses of the participants' responses to interview probes. The combination of interviews and quantitative measures provides a glimpse into changes in students' personal epistemological beliefs during the course of a semester and what the participants believed influenced their beliefs.

Chapter six presents a description of the development of the participants' NOS beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-a. The characterization of NOS beliefs is presented with the results of the analyses of the participants' responses to interview probes. The
combination of interviews and quantitative measures provides a glimpse into participants' NOS belief changes during the course of a semester and what the participants believed influenced their beliefs.

Chapter seven characterizes the findings for the instructional features of the second research question and sub-questions 2-a and 2-b. The characterization of laboratory instruction is presented with the quantitative and qualitative results from the Student Evaluation of Laboratory Instruction Questionnaire as well as the results of the analyses of the participants' responses to interview probes. This provides a glimpse of the participants' overall beliefs concerning the laboratory aspects of the semester course.

The final chapter of this dissertation (Chapter 8) concludes by presenting some implications for theory and pedagogy, limitations of the study, a summary of the key findings, and areas for future research.

Characterization of Participants' Epistemological and NOS Beliefs

Research Question 1 and Sub-Questions

The first research question and sub-questions lent themselves to quantitative data analysis. They are:

RQ1. What range of personal epistemological and NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

RQ1a. Do students' images of the nature of chemistry (NOS) change by the completion of a semester general chemistry laboratory course?

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?


Quantitative results regarding pre-post semester NOS and personal epistemological beliefs toward science are presented and discussed briefly in this chapter. Further discussion is presented in chapters five and six.

Description of Participants

A sample of 56 undergraduate students at a major university in Florida volunteered and participated in the study. All participants were enrolled in the first semester of a two-semester general chemistry laboratory course during the fall semester of 2006. Students who agreed to participate signed the participant consent form (Appendix L). Overall, the mean age of the participants was 21 years, with a range of 18 to 45 years of age. Approximately 64% of the participants were female and 36% were male. Overall, 46% of the participants were freshmen, 21% sophomores, 18% juniors, 9% seniors, and 7% with no college rank. All but five of the 56 participants had taken a high school chemistry and biology course. Seventy-seven percent of the participants were majoring in science, with 13% undecided.

A sample of 20 participants from the total sample of 56 volunteered and participated in the initial and final interviews. Overall, the mean age of the interviewed participants was 22 years, with a range of 18 to 45 years of age. Approximately 85% of the participants were female and 15% were male. Overall, 40% of the participants were freshmen, 25% sophomores, 25% juniors, and 10% with no college rank. All of the 20 participants had taken a high school chemistry and biology course. Ninety percent of the participants were majoring in science, with 10% undecided.

Chemical Concepts Inventory Results

The Chemical Concepts Inventory (CCI) discussed in chapter three is the prior knowledge assessment that was administered to explore the participants' prior mental models and their qualitative images of how chemistry works (see Appendix A).


Descriptive statistics of the CCI pre-assessment chemistry knowledge scores of the fifty-six participants are outlined in Table 17, including means, standard deviations, and ranges of scores.

Table 17
Descriptive Statistics – Chemical Concepts Inventory Scores

N     Minimum   Maximum   Mean    Std. Deviation
56    31.00     100.00    68.96   15.264
20    45.00     86.00     67.55   10.247

As shown in Table 17, the CCI pre-knowledge assessment scores for the participants (N=56) ranged from 31.00 to 100.00. The participants had a mean score of 68.96 with a standard deviation of 15.264. Using the laboratory instructional grading scale for the course, the number of participants scoring within each range is indicated in Table 18. The scores appear to be normally distributed, with the largest group (16) scoring in the 65-74 range.

As also shown in Table 17, the CCI pre-knowledge assessment scores for the interviewed participants (N=20) ranged from 45.00 to 86.00. These participants had a mean score of 67.55 with a standard deviation of 10.247. Using the laboratory instructional grading scale for the course, the number of interviewed participants scoring within each range is indicated in Table 18. The scores appear to be normally distributed, with the largest group (7) scoring in the 65-74 range.

Table 18
Distribution of Participants' CCI Scores

Score Range   Participants (N=56)   Participants (N=20)
85-100        8                     1
75-84         11                    5
65-74         16                    7
55-64         12                    5
0-54          9                     2
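
The descriptive statistics in Table 17 and the binned distribution in Table 18 can be reproduced directly from a list of raw CCI percentage scores. The snippet below is a minimal sketch of that tabulation; the function names, the pandas/numpy approach, and the example scores are illustrative only (the study's analyses were carried out in SPSS).

```python
import pandas as pd

def cci_summary(scores):
    """Descriptive statistics (Table 17 style) for a set of CCI scores."""
    s = pd.Series(scores, dtype=float)
    return {
        "N": int(s.count()),
        "Minimum": s.min(),
        "Maximum": s.max(),
        "Mean": round(s.mean(), 2),
        "Std. Deviation": round(s.std(ddof=1), 3),  # sample SD, as SPSS reports
    }

def cci_distribution(scores):
    """Counts per grading-scale range (Table 18 style)."""
    bins = [0, 55, 65, 75, 85, 101]            # 0-54, 55-64, 65-74, 75-84, 85-100
    labels = ["0-54", "55-64", "65-74", "75-84", "85-100"]
    binned = pd.cut(pd.Series(scores), bins=bins, labels=labels, right=False)
    return binned.value_counts().reindex(labels)

# Hypothetical example with made-up scores:
sample = [68, 72, 45, 86, 90, 55, 63, 77, 70, 66]
print(cci_summary(sample))
print(cci_distribution(sample))
```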


Epistemological Beliefs Assessment for Physical Science Results

Descriptive Statistics – All Participants

Participants' initial and final personal epistemological beliefs over the course of a semester were assessed using the Epistemological Beliefs Assessment for Physical Science (EBAPS). The EBAPS, discussed in chapters two and three (see Appendices B and N), is designed to assess personal epistemological beliefs in five dimensions: the structure of knowledge, the nature of learning, real-life applicability, evolving knowledge, and the source of ability to learn (Elby, 2001). Prior to data analysis, a check on the accuracy of data entry and missing data for the data set was done through SPSS frequencies. Each item is scored on a scale of 0 (least sophisticated) to 4 (most sophisticated). Descriptive statistics of the EBAPS pre- and post-assessment scores (N=56) of all the participants are outlined in Table 19, including means, standard deviations, and ranges of scores for each dimension as well as the overall score. Pre- and post-assessment scores for all participants are located in Appendix P.

Table 19
Descriptive Statistics – EBAPS Scores – All Participants

Dimension                            Pre-Mean   SD      Range       Post-Mean   SD      Range
Structure of Knowledge (A-1)         2.172      0.460   1.15-3.20   2.488       0.502   1.15-3.65
Nature of Knowing & Learning (A-2)   2.511      0.469   1.15-3.44   2.760       0.551   1.63-3.94
Real-life Applicability (A-3)        2.665      0.694   0.75-4.00   2.978       0.643   1.75-4.00
Evolving Knowledge (A-4)             2.357      0.687   1.00-4.00   2.804       0.788   0-4.00
Source of Ability to Learn (A-5)     2.896      0.730   0.80-4.00   3.107       0.721   1.20-4.00
Overall Score                        2.514      0.352   1.58-3.23   2.771       0.388   1.28-3.55

As shown in Table 19, the pre-assessment overall EBAPS scores for the participants (N=56) ranged from 1.58 to 3.23. The participants had a mean pre-
assessment score of 2.514 with a standard deviation of 0.352. The results also indicate that participants' EBAPS post-assessment scores ranged from 1.28 to 3.55. The participants had a mean post-assessment score of 2.771 with a standard deviation of 0.388. These results seem to suggest that the laboratory instructional experience had a small but positive effect on some of the participants' personal epistemological beliefs. However, each instructional method (pre-laboratory, laboratory work, post-laboratory) included multiple pedagogical components (i.e., quiz, MBL, laboratory notebook, and analysis paper) that may or may not have influenced the participants' epistemological beliefs. Taking into consideration that the range of possible scores is 0 to 4, the results suggest that the participants were not homogeneous in terms of their overall epistemological stage either before or after the laboratory instruction, as 22 participants improved their epistemological beliefs by the end of the semester course (see Tables 20-21 and Appendix P).

As indicated in Tables 20 and 21, one participant shifted from moderately sophisticated (2.85) to extremely sophisticated (3.55). Approximately 10 participants moved into the highly sophisticated belief level (3.0-3.4) by the end of the semester course, while two participants' scores dropped from highly sophisticated beliefs (3.02 and 3.23) to moderately sophisticated beliefs (2.80 and 2.95). Twenty-four of the participants remained in the moderately sophisticated belief range (2.4-2.9) with small changes in their individual dimension scores. Four participants remained in the poorly sophisticated beliefs range (1.6-2.3), while two participants' scores dropped from moderately sophisticated (2.47 and 2.50) to poorly sophisticated beliefs (1.83 and 2.38).


Table 20
Participant Shifts between Epistemological Belief Levels

Shift (Pre → Post)   Number of Participants
M → E                1
H → H                1
H → M                2
M → H                7
M → M                24
M → P                2
M → U                1
P → H                5
P → M                9
P → P                4

Table 21
EBAPS Score Ranges – Pre-Post Count

Sophistication Level           Score Range   Scaled Score Range   Pre-Count (N=56)   Post-Count (N=56)
Extremely Sophisticated (E)    3.5-4.0       87-100               0                  1
Highly Sophisticated (H)       3.0-3.4       75-86                3                  13
Moderately Sophisticated (M)   2.4-2.9       60-74                35                 35
Poorly Sophisticated (P)       1.6-2.3       40-59                17                 6
Unsophisticated (U)            0-1.5         0-39                 1                  1

The overall average score for the EBAPS at the beginning of the semester course for all participants was 2.514, indicating a moderately sophisticated level of epistemological beliefs. Among them, the highest score was 3.23, indicating highly sophisticated epistemological beliefs, and the lowest score was 1.58, indicating a poor level of sophistication in epistemological beliefs. It is worth noting, however, that for the pre-assessment overall score, only 3 of 56 students scored above 3.00, indicating highly sophisticated epistemological beliefs, while 18 of 56 participants scored below 2.40, indicating initially poor to unsophisticated epistemological beliefs. The majority of participants scored between 2.42 and 2.61, indicating moderately sophisticated epistemological beliefs.

By the end of the semester, the overall average EBAPS post-score for all the participants was 2.771. The highest post-score was 3.55, indicating extremely sophisticated epistemological beliefs, and the lowest score was 1.28, a decrease from the initial lowest score of 1.58, falling into the unsophisticated range of epistemological beliefs. Again it is worth noting that for the post-assessment overall score, 14 of 56 students scored above 3.00, with 1 of the 14 scoring 3.55, while only 7 of 56 students scored below 2.40. The majority of the participants scored between 2.66 and 2.87, indicating moderately sophisticated epistemological beliefs.
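
The sophistication bands in Table 21 amount to a simple lookup from a participant's overall EBAPS score to a level label. The sketch below illustrates that mapping; the cutoffs follow Table 21, while treating the scaled score as the 0-4 score expressed as a percentage of 4, and the function name itself, are assumptions made only for illustration.

```python
def ebaps_level(overall_score: float) -> tuple[int, str]:
    """Map a 0-4 EBAPS overall score to a scaled (0-100) value and a
    sophistication label, using the cutoffs listed in Table 21."""
    scaled = round(overall_score / 4 * 100)   # assumed conversion to the scaled range
    if overall_score >= 3.5:
        label = "Extremely Sophisticated (E)"
    elif overall_score >= 3.0:
        label = "Highly Sophisticated (H)"
    elif overall_score >= 2.4:
        label = "Moderately Sophisticated (M)"
    elif overall_score >= 1.6:
        label = "Poorly Sophisticated (P)"
    else:
        label = "Unsophisticated (U)"
    return scaled, label

# Example: the reported pre- and post-means for all participants
print(ebaps_level(2.514))   # (63, 'Moderately Sophisticated (M)')
print(ebaps_level(2.771))   # (69, 'Moderately Sophisticated (M)')
```

Applying this mapping to each participant's overall pre- and post-score is what produces the pre- and post-counts of Table 21.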


EBAPS T-Test Results – All Participants

Paired samples t-tests were conducted for each axis mean score and the overall mean score to compare the pre- and post-mean scores of the participants. Statistically significant (p ≤ 0.05) differences were found in four of the five dimensions (structure of knowledge, nature of learning, real-life applicability, and evolving knowledge) and in the overall score. In this dissertation, effect sizes were calculated from the mean difference score (mean Time 2 – mean Time 1) divided by the pooled standard deviation of Time 1 and Time 2. The results were analyzed by comparing pre- and post-test scores, the Hake gain (also called the Hake factor), and the maximum possible gain. The Hake gain is a normalized gain defined as

g = actual gain / maximum possible gain = (posttest score – pretest score) / (maximum score – pretest score)

The results are presented in Table 22.
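
The quantities reported in Table 22 below — the paired t statistic, an effect size d, the normalized (Hake) gain, and the eta-squared index discussed after the table — can all be computed from the paired pre/post scores. The following sketch assumes paired scores on the 0-4 EBAPS scale and follows the pooled-standard-deviation definition of d given above; exact values depend on the convention chosen, and the names, example data, and scipy/numpy approach are illustrative rather than the study's actual SPSS procedure.

```python
import numpy as np
from scipy import stats

def paired_effect_sizes(pre, post, max_score=4.0):
    """Paired t-test plus the effect-size measures used in this chapter.

    pre, post : paired scores for the same participants (pre- and post-assessment).
    max_score : maximum possible score on the instrument scale.
    """
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    n = len(pre)

    t_stat, p_value = stats.ttest_rel(pre, post)          # paired-samples t-test

    # Cohen's d: mean difference divided by the pooled SD of Time 1 and Time 2
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2)
    cohens_d = (post.mean() - pre.mean()) / pooled_sd

    # Hake (normalized) gain: actual gain over maximum possible gain
    hake_gain = (post.mean() - pre.mean()) / (max_score - pre.mean())

    # Eta squared from the t statistic: t^2 / (t^2 + df), df = n - 1
    eta_squared = t_stat ** 2 / (t_stat ** 2 + (n - 1))

    return {"t": t_stat, "p": p_value, "d": cohens_d,
            "hake_gain": hake_gain, "eta_squared": eta_squared}

# Hypothetical example with made-up paired scores:
pre_scores = [2.1, 2.4, 2.6, 2.3, 2.8, 2.5]
post_scores = [2.5, 2.7, 2.9, 2.4, 3.1, 2.8]
print(paired_effect_sizes(pre_scores, post_scores))
```

As a check on the eta-squared expression, plugging the reported overall-score result (t = -4.568, df = 55) into t²/(t² + df) gives approximately 0.28, matching the hand-calculated value reported after Table 22.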


There was a statistically significant increase in the structure of knowledge dimension scores from pre-assessment (M=2.172, SD=0.460) to post-assessment (M=2.488, SD=0.502), t(55) = -4.248, p = 0.000, d=0.57 (a medium, statistically significant effect). There was a statistically significant increase in the nature of learning dimension scores from pre-assessment (M=2.511, SD=0.469) to post-assessment (M=2.760, SD=0.551), t(55) = -2.988, p = 0.004, d=0.40 (a small but statistically significant effect). There was a statistically significant increase in the real-life applicability dimension scores from pre-assessment (M=2.665, SD=0.694) to post-assessment (M=2.978, SD=0.643), t(55) = -2.809, p = 0.007, d=0.38 (a small but statistically significant effect). There was a statistically significant increase in the evolving knowledge dimension scores from pre-assessment (M=2.357, SD=0.687) to post-assessment (M=2.804, SD=0.788), t(55) = -4.064, p = 0.000, d=0.54 (a medium, statistically significant effect). There was not a statistically significant increase in the source and ability to learn dimension scores from pre-assessment (M=2.896, SD=0.730) to post-assessment (M=3.107, SD=0.721), t(55) = -1.790, p = 0.079, d=0.24 (a small, not statistically significant effect). There was a statistically significant increase in the overall scores from pre-assessment (M=2.514, SD=0.352) to post-assessment (M=2.771, SD=0.388), t(55) = -4.568, p = 0.000, d=0.61 (a medium, statistically significant effect).

Table 22
EBAPS T-Test Analysis – All Participants

Dimension                       Pre-Mean   Post-Mean   Gain   t-Value   p-Value   Effect Size   Eta²
Structure of Knowledge (A-1)    2.172      2.488       0.27   -4.248    0.000*    0.57          0.25
Nature of Learning (A-2)        2.511      2.760       0.16   -2.988    0.004*    0.40          0.14
Real Life Applicability (A-3)   2.665      2.978       0.19   -2.809    0.007*    0.38          0.13
Evolving Knowledge (A-4)        2.357      2.804       0.33   -4.064    0.000*    0.54          0.23
Source/Ability to Learn (A-5)   2.896      3.107       0.11   -1.790    0.079     0.24          0.055
Overall Score (Tot)             2.514      2.771       0.17   -4.568    0.000*    0.61          0.28
*significant at p ≤ 0.05

The average gain score of all participants was between 0.17 and 0.27 on a scale of 0 to 4.00, or 4-6 points on a scale of 0-100. The paired t-test shows that this gain score represents a statistically significant mean difference between the pretest and posttest, with t = -4.568, p < 0.001. This indicates a moderately significant increase in the sophistication level of several participants' epistemological beliefs over the course of the semester, with an effect size of d=0.61. The results suggest that some of the participants
in general improved their personal epistemological beliefs during the course of the semester.

Eta squared is the proportion of the total variance that is attributed to an effect. In other terms, it is a variance-proportion estimate that can be positively biased and overestimate the true effect. However, it is usually calculated when performing a paired-samples t-test as an additional indicator of effect size (Pallant, 2003). The eta squared index (hand calculated in this case) indicates that 28% of the variability in the pre- and post-overall scores may be explained in part by the semester of laboratory instruction. So while there is a statistical difference, the practical difference is moderate and warrants further investigation.

EBAPS results (Table 22) show a significant increase in the structure of knowledge, nature of learning, real-life applicability of science, and evolving knowledge dimensions. The participants seem to struggle with the source of ability to learn science. In summary, based on the EBAPS results: (1) the mean gain scores for the overall test and all dimensions, except for the source of ability to learn, were found to be significant at p ≤ .05, and (2) the data suggest that laboratory instruction possibly effected a change in the students' epistemological beliefs.

EBAPS Correlations – All Participants

The differences between participants' responses on the pre-assessment and the post-assessment were tested as follows. To check the pattern of internal relationships between dimensions, between dimensions and overall scores, and between pre- and post-overall scores, Pearson's correlations between the pre- and post-assessment dimensions were calculated. Table 23 shows the correlation coefficients and the p-level of these correlations.

The EBAPS (N=56) has good internal consistency, with a Cronbach's alpha coefficient of 0.703. The correlations shown in Table 23 indicate that the pre- and post-
assessments for 14 out of the 16 pairs were significantly correlated, either at the .05 or .01 level, providing additional support for the reliability of the instrumentation.

The relationship between the EBAPS dimensional (Axis) mean scores and the overall pre- and post-assessment mean scores was investigated using the Pearson product-moment correlation coefficient. All of the initial means of the five EBAPS dimensions (Axes) significantly correlated with the initial total overall mean score at the 0.01 level (r(55) = 0.579, 0.709, 0.556, 0.421, and 0.647, respectively). All of the post means of the five EBAPS dimensions (Axes) significantly correlated with the post total overall mean score at the 0.01 level (r(55) = 0.682, 0.721, 0.507, 0.383, and 0.683, respectively).

Table 23
EBAPS Paired Samples Correlations (N=56)

Pair             Correlation   Significance
Sum A1in-Totin   0.579**       0.000
Sum A2in-Totin   0.709**       0.000
Sum A3in-Totin   0.556**       0.000
Sum A4in-Totin   0.421**       0.001
Sum A5in-Totin   0.647**       0.007
Sum A1F-TotF     0.682**       0.000
Sum A2F-TotF     0.721**       0.000
Sum A3F-TotF     0.507**       0.000
Sum A4F-TotF     0.383**       0.004
Sum A5F-TotF     0.683**       0.000
Sum A1in-A1F     0.332*        0.012
Sum A2in-A2F     0.266*        0.046
Sum A3in-A3F     0.226         0.093
Sum A4in-A4F     0.386**       0.003
Sum A5in-A5F     0.262         0.051
Sum Totin-TotF   0.356**       0.007
**Correlation is significant at the 0.01 level. *Correlation is significant at the 0.05 level.
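
Cronbach's alpha and the paired Pearson correlations reported in Table 23 follow standard formulas. The sketch below shows one way to compute them, assuming a participants-by-items response matrix and paired score vectors; the item-variance formula for alpha and scipy's pearsonr are standard, but the data shapes, names, and example values are hypothetical rather than the study's actual data.

```python
import numpy as np
from scipy import stats

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_participants x n_items) array of responses."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of participants' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def paired_correlation(x, y):
    """Pearson r and p-value for a pair of score vectors (e.g., A1 pre vs. overall pre)."""
    r, p = stats.pearsonr(x, y)
    return round(r, 3), round(p, 3)

# Hypothetical example: 6 participants x 4 items, plus two paired score vectors
responses = np.array([[3, 2, 3, 4],
                      [2, 2, 1, 2],
                      [4, 3, 4, 3],
                      [1, 2, 2, 1],
                      [3, 3, 2, 3],
                      [2, 1, 2, 2]])
print("alpha:", round(cronbach_alpha(responses), 3))
print("r, p:", paired_correlation([2.1, 2.4, 2.6, 2.3, 2.8, 2.5],
                                  [2.5, 2.7, 2.9, 2.4, 3.1, 2.8]))
```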


The structure of knowledge (A1) and nature of learning (A2) dimension pre- and post-means were significantly correlated at the 0.05 level, while the pre- and post-means of the evolving knowledge dimension (A-4) were significantly correlated at the 0.01 level (r(55) = 0.332, 0.266, and 0.386, respectively). The pre- and post-total mean scores were significantly correlated at the 0.01 level (r(55) = 0.356). However, the results indicated a lack of significant correlations between the pre- and post-mean scores of the real-life applicability and source of ability to learn dimensions (r(55) = 0.226 and 0.262, respectively).

EBAPS Results – Interview Participants

Descriptive Statistics – Interview Participants

Interviewed participants' initial and final personal epistemological beliefs over the course of a semester were assessed using the EBAPS. Prior to data analysis, a check on the accuracy of data entry and missing data for the data set was done through SPSS frequencies. Each item is scored on a scale of 0 (least sophisticated) to 4 (most sophisticated). Descriptive statistics of the EBAPS pre- and post-assessment scores (N=20) of the interviewed participants are outlined in Table 24, including means, standard deviations, and ranges of scores for each dimension as well as the overall score. Pre- and post-assessment scores for all interviewed participants are located in Appendix P.

Table 24
Descriptive Statistics – EBAPS Scores – Interview Participants

Dimension                            Pre-Mean   SD      Range       Post-Mean   SD      Range
Structure of Knowledge (A-1)         2.090      0.407   1.20-2.90   2.512       0.558   1.65-3.50
Nature of Knowing & Learning (A-2)   2.569      0.351   1.56-3.06   2.935       0.549   1.75-3.94
Real-life Applicability (A-3)        2.788      0.480   1.50-3.50   3.138       0.594   2.00-4.00
Evolving Knowledge (A-4)             2.150      0.587   1.33-3.33   2.783       0.669   1.67-4.00
Source of Ability to Learn (A-5)     3.000      0.554   1.60-3.80   3.210       0.617   2.00-4.00
Overall Score                        2.537      0.266   1.88-2.98   2.867       0.125   2.08-3.55


As shown in Table 24, the pre-assessment overall EBAPS scores for the interviewed participants (N=20) ranged from 1.88 to 2.98. The participants had a mean pre-assessment score of 2.537 with a standard deviation of 0.266. The results also indicate that the interviewed participants' EBAPS post-assessment scores ranged from 2.08 to 3.55. The participants had a mean post-assessment score of 2.867 with a standard deviation of 0.125. These results seem to suggest that the laboratory instructional experience had a small but positive effect on some of the participants' personal epistemological beliefs. However, each instructional method (pre-laboratory, laboratory work, post-laboratory) included multiple pedagogical components (i.e., quiz, MBL, laboratory notebook, and analysis paper) that may or may not have influenced the participants' epistemological beliefs. Taking into consideration that the range of possible scores is 0 to 4, the results indicated that some of the participants were homogeneous in terms of their overall epistemological stage prior to and after the laboratory instruction, while 9 of the participants improved their epistemological beliefs (see Tables 25-26 and Appendix P).

As indicated in Tables 25 and 26, one participant shifted from moderately sophisticated (2.85) to extremely sophisticated (3.55). Approximately 6 participants moved into the highly sophisticated belief level (3.0-3.4), 3 from moderately sophisticated and 3 from poorly sophisticated beliefs, by the end of the semester course. Nine of the participants remained in the moderately sophisticated belief range (2.4-2.9) with small changes in their individual dimension scores, while 3 participants moved from poorly sophisticated to moderately sophisticated beliefs. One participant remained in the poorly sophisticated beliefs range (1.6-2.3).


Table 25
Participant Shifts between Epistemological Belief Levels

Shift (Pre → Post)   Number of Participants
M → E                1
M → H                4
M → M                9
P → H                2
P → M                3
P → P                1

The overall average score for the EBAPS at the beginning of the semester course for the interviewed participants was 2.537, indicating a moderately sophisticated level of epistemological beliefs. Among them, the highest score was 2.98, at the upper end of the moderately sophisticated range, and the lowest score was 1.88, indicating a poor level of sophistication in epistemological beliefs. It is worth noting, however, that for the pre-assessment overall score, none of the 20 interviewed participants scored above 3.00, indicating most of them began the semester with moderate or poor beliefs, while 6 of 20 participants scored below 2.40, indicating initially poor to unsophisticated epistemological beliefs. The majority of participants scored between 2.41 and 2.66, indicating moderately sophisticated epistemological beliefs.

Table 26
EBAPS Score Ranges – Pre-Post Count

Sophistication Level           Score Range   Scaled Score Range   Pre-Count (N=20)   Post-Count (N=20)
Extremely Sophisticated (E)    3.5-4.0       87-100               0                  1
Highly Sophisticated (H)       3.0-3.4       75-86                0                  6
Moderately Sophisticated (M)   2.4-2.9       60-74                14                 12
Poorly Sophisticated (P)       1.6-2.3       40-59                6                  1
Unsophisticated (U)            0-1.5         0-39                 0                  0

By the end of the semester, the overall average score for all the interviewed participants was 2.867. The highest score was 3.55, indicating extremely sophisticated epistemological beliefs, and the lowest score was 2.08, in the range of poorly sophisticated epistemological beliefs. Again it is worth noting that for the post-assessment overall score, 7 of 20 students scored above 3.00, with 1 of the 7 scoring 3.55, while only 1 of 20 students scored below 2.40. The majority of the participants scored in the range 2.70 to 3.03, indicating moderate to highly sophisticated epistemological beliefs.


EBAPS T-Test Results – Interview Participants

Paired samples t-tests were conducted for each axis mean score and the overall mean score to compare the pre- and post-mean scores of the interviewed participants. Statistically significant (p ≤ 0.05) differences were found in four of the five dimensions (structure of knowledge, nature of learning, real-life applicability, and evolving knowledge) and in the overall score. In this dissertation, effect sizes are calculated from the mean difference score (mean Time 2 – mean Time 1) divided by the pooled standard deviation of Time 1 and Time 2. The results were analyzed by comparing pre- and post-test scores, the Hake gain (also called the Hake factor), and the maximum possible gain. The Hake gain is a normalized gain defined as

g = actual gain / maximum possible gain = (posttest score – pretest score) / (maximum score – pretest score)

The results are presented in Table 27.

There was a statistically significant increase in the structure of knowledge dimension scores from pre-assessment (M=2.090, SD=0.407) to post-assessment (M=2.512, SD=0.558), t(19) = -4.064, p = 0.001, d=0.91 (a large, statistically significant effect). There was a statistically significant increase in the nature of learning dimension scores from pre-assessment (M=2.569, SD=0.351) to post-assessment (M=2.935, SD=0.549), t(19) = -2.905, p = 0.009, d=0.65 (a medium but statistically significant effect).


Table 27
EBAPS T-Test Analysis – Interview Participants

Dimension                       Pre-Mean   Post-Mean   Gain   t        p        Effect Size   Eta²
Structure of Knowledge (A-1)    2.090      2.512       0.39   -4.064   0.001*   0.91          0.47
Nature of Learning (A-2)        2.569      2.935       0.23   -2.905   0.009*   0.65          0.24
Real Life Applicability (A-3)   2.788      3.138       0.20   -2.580   0.018*   0.58          0.26
Evolving Knowledge (A-4)        2.150      2.783       0.55   -4.371   0.000*   0.98          0.50
Source/Ability to Learn (A-5)   3.000      3.210       0.11   -1.213   0.240    0.27          0.072
Overall Score                   2.537      2.867       0.21   -4.169   0.001*   0.93          0.48
*significant at p ≤ 0.05

There was a statistically significant increase in the real-life applicability dimension scores from pre-assessment (M=2.788, SD=0.480) to post-assessment (M=3.138, SD=0.594), t(19) = -2.580, p = 0.018, d=0.58 (a medium, statistically significant effect). There was a statistically significant increase in the evolving knowledge dimension scores from pre-assessment (M=2.150, SD=0.587) to post-assessment (M=2.783, SD=0.669), t(19) = -4.371, p = 0.000, d=0.98 (a large, statistically significant effect). There was not a statistically significant increase in the source and ability to learn dimension scores from pre-assessment (M=3.000, SD=0.554) to post-assessment (M=3.210, SD=0.617), t(19) = -1.213, p = 0.240, d=0.27 (a small, not statistically significant effect). There was a statistically significant increase in the overall scores from pre-assessment (M=2.537, SD=0.266) to post-assessment (M=2.867, SD=0.353), t(19) = -4.169, p = 0.001, d=0.93 (a large, statistically significant effect).

The average gain score of the interviewed participants ranged from 0.21 to 0.33 on a scale of 0 to 4.00, or 5-8 points on a scale of 0-100. The paired t-test shows that this gain score represents a statistically significant mean difference between the pretest and posttest, with t = -4.169, p < 0.001. This indicates a large, significant increase in the sophistication
level of several participants' epistemological beliefs over the course of the semester, with an effect size of d=0.93. The results suggest that some of the interviewed participants in general improved their personal epistemological beliefs during the course of the semester.

Eta squared is the proportion of the total variance that is attributed to an effect. In other terms, it is a variance-proportion estimate that can be positively biased and overestimate the true effect. However, it is usually calculated when performing a paired-samples t-test as an additional indicator of effect size (Pallant, 2003). The eta squared index (hand calculated in this case) indicates that 48% of the variability in the pre- and post-overall scores may be explained in part by the semester of laboratory instruction. So while there is a statistical difference, the practical difference is moderate and warrants further investigation.

EBAPS results (Table 27) show a significant increase in the structure of knowledge, nature of learning, real-life applicability of science, and evolving knowledge dimensions for the interviewed participants. The interviewed participants seem to struggle with the source of ability to learn science, as did the other 36 participants. In summary, based on the EBAPS results: (1) the mean gain scores for the overall test and all dimensions, except for the source of ability to learn, were found to be significant at p ≤ .05, and (2) the data suggest that laboratory instruction possibly effected a change in the students' epistemological beliefs.

EBAPS Correlations – Interview Participants

The differences between interviewed participants' responses on the pre-assessment and the post-assessment were tested as follows. To check the pattern of internal relationships between dimensions, between dimensions and overall scores, and between pre- and post-overall scores, Pearson's correlations between the pre- and post-assessment
dimensions were calculated. Table 28 shows the correlation coefficients and the p-level of these correlations.

The EBAPS (N=20) has good internal consistency, with a Cronbach's alpha coefficient of 0.716. The correlations shown in Table 28 indicate that 10 of the 16 pre- and post-assessment correlations were significant, either at the .05 or .01 level, providing additional support for the reliability of the instrumentation. The smaller sample size (N=20) may have contributed to the lack of correlation between some of the pre- and post-dimension scores.

The relationship between the EBAPS dimensional (Axis) mean scores and the overall pre- and post-assessment mean scores was investigated using the Pearson product-moment correlation coefficient. Three of the initial means (structure of knowledge, nature of knowing and learning, and real-life applicability) of the five EBAPS dimensions (Axes) significantly correlated with the initial total overall mean score at the 0.01 level (r(19) = 0.590, 0.740, and 0.674, respectively). Source of ability to learn significantly correlated with the initial overall mean score at the 0.05 level (r(19) = 0.489). Only evolving knowledge did not correlate with the initial overall mean score (r(19) = 0.105).


Table 28
EBAPS Paired Samples Correlations (N=20)

Pair             Correlation   Significance
Sum A1in-Totin   0.590**       0.006
Sum A2in-Totin   0.740**       0.000
Sum A3in-Totin   0.674**       0.001
Sum A4in-Totin   0.105         0.658
Sum A5in-Totin   0.489*        0.029
Sum A1F-TotF     0.807**       0.000
Sum A2F-TotF     0.798**       0.000
Sum A3F-TotF     0.514*        0.020
Sum A4F-TotF     0.163         0.492
Sum A5F-TotF     0.475*        0.034
Sum A1in-A1F     0.575**       0.008
Sum A2in-A2F     0.279         0.234
Sum A3in-A3F     0.379         0.099
Sum A4in-A4F     0.474*        0.035
Sum A5in-A5F     0.129         0.587
Sum Totin-TotF   0.373         0.105
**Correlation is significant at the 0.01 level. *Correlation is significant at the 0.05 level.

Two of the post means (structure of knowledge, A1, and nature of knowing and learning, A2) of the five EBAPS dimensions (Axes) significantly correlated with the post total overall mean score at the 0.01 level (r(19) = 0.807 and 0.798, respectively). Real-life applicability and source of ability to learn significantly correlated with the final overall mean score at the 0.05 level (r(19) = 0.514 and 0.475, respectively). Evolving knowledge (A4) did not correlate with the final overall mean score (r(19) = 0.163).

The structure of knowledge (A1) dimension pre- and post-means were significantly correlated at the 0.01 level (r(19) = 0.575), while the evolving knowledge (A4) dimension pre- and post-means were significantly correlated at the 0.05 level (r(19) = 0.474). However, the results indicated a lack of significant correlations between the pre- and post-mean scores of the nature of knowing and learning, real-life applicability, and source of ability to learn dimensions (r(19) = 0.279, 0.379, and 0.129,
respectively). Once again, the lack of correlation may be attributed to the small sample size.

Nature of Scientific Knowledge Results

The Nature of Scientific Knowledge Scale (NSKS; Rubba & Anderson, 1978), discussed in chapters 2 and 3, was used as a supplementary source for research questions one and two regarding changes in participants' understandings of scientific literacy, in particular nature of science issues (see Appendix O for scoring instructions). The NSKS contains 24 positively and 24 negatively written item statements, with eight statements in each of six subscales. The response alternatives for each item are in a Likert-style format: strongly agree, agree, neutral, disagree, and strongly disagree.

The six dimensions of the instrument reflect different aspects of the nature of science. These dimensions measure participants' understandings of the amoral, creative, developmental, parsimonious, testable, and unified nature of science. The amoral dimension reflects that "scientific knowledge provides humans with many capabilities but not how to use them"; the creative dimension reflects that "scientific knowledge is partially a product of human creativity"; the developmental dimension reflects that "scientific knowledge is tentative"; the parsimonious dimension reflects that "scientific knowledge moves toward being comprehensive and simplistic"; the testable dimension reflects that "scientific knowledge is capable of empirical test"; and the unified dimension reflects that "the specialized sciences contribute to an interrelated network of laws, theories, and concepts" (Meichtry, 1992; Rubba & Anderson, 1978).

The range of scores for each dimension is 8 to 40 points. For each dimension, a score of 24 points indicates a neutral (N) position or a combination of realist and instrumentalist views on NOS, while a score between 25 and 40 is within the accepted
view of the nature of science (Instrumentalist-I), and a score between 8 and 23 is within the unaccepted NOS view (Realist-R). The overall score for all six dimensions ranges from 48 to 240 points (Figure 6). A score of 144 (141-147) on the overall scale is considered neutral (N), while scores ranging from 145 to 240 (148-240) are within the accepted view of the nature of science (instrumentalist), and scores ranging from 48 to 143 (48-140) are within the unaccepted view (realist).

Realist (unaccepted NOS view, 48) ---------------- Neutral (144) ---------------- Instrumentalist (accepted NOS view, 240)

Figure 6 NSKS Representative Placement Scale

Descriptive NSKS Statistics – All Participants

Participants' pre- and post-scores concerning their nature of science (NOS) beliefs over the course of a semester were assessed using the NSKS. The NSKS, discussed in chapters two and three (see Appendix C), is designed to assess NOS beliefs in six dimensions: amoral, creative, developmental, parsimonious, testable, and unified. Each dimension is scored on a scale of 8 (realist-R, unaccepting of NOS views) to 40 (instrumentalist-I, accepting of NOS views). The overall NSKS score is the sum of all six dimensions, ranging from 48 to 240. Prior to data analysis, a check on the accuracy of data entry and missing data for the data set was done through SPSS frequencies. Before calculating the dimension (subscale) scores for both the pre- and post-assessments, scores for the negatively worded items were reversed using SPSS 15.0.

Descriptive statistics of the NSKS pre- and post-assessment scores (N=56) of all the participants are outlined in Table 29, including means, standard deviations, and ranges of scores for each dimension as well as the overall score. Pre- and post-assessment scores for all participants are located in Appendix P.
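
The NSKS scoring just described — reverse-coding the negatively worded items, summing eight items per subscale, and locating the overall score on the realist-neutral-instrumentalist scale of Figure 6 — can be expressed compactly in code. The sketch below is a minimal illustration under those assumptions: a 1-5 coding of the Likert responses, the 141-147 neutral band for the overall score, and the function and variable names are all assumptions made for illustration (the study performed this step in SPSS 15.0).

```python
def reverse_code(response: int) -> int:
    """Reverse a 1-5 Likert response (1<->5, 2<->4, 3 stays 3)."""
    return 6 - response

def subscale_score(responses, negative_items):
    """Sum one NSKS subscale of eight 1-5 responses.

    responses      : dict mapping item number -> raw response (1-5)
    negative_items : set of item numbers that are negatively worded
    Returns a score between 8 and 40.
    """
    total = 0
    for item, value in responses.items():
        total += reverse_code(value) if item in negative_items else value
    return total

def overall_view(overall_score: int) -> str:
    """Classify a 48-240 overall NSKS score using the bands of Figure 6."""
    if overall_score <= 140:
        return "Realist (unaccepted NOS view)"
    if overall_score <= 147:
        return "Neutral"
    return "Instrumentalist (accepted NOS view)"

# Hypothetical example: one eight-item subscale with items 2 and 5 negatively worded
example = {1: 4, 2: 2, 3: 5, 4: 3, 5: 1, 6: 4, 7: 4, 8: 3}
print(subscale_score(example, negative_items={2, 5}))   # 32
print(overall_view(148))                                 # Instrumentalist (accepted NOS view)
```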


Table 29
Descriptive Statistics – NSKS Scores – All Participants

Dimension             Pre-Mean   SD      Range     Post-Mean   SD      Range
Amoral (D-1)          23.643     3.205   18-38     24.196      2.713   18-31
Creative (D-2)        22.893     2.095   18-27     23.670      2.288   18-32
Developmental (D-3)   23.625     1.950   19-27     24.768      2.123   19-31
Parsimonious (D-4)    24.625     2.378   20-31     26.321      2.552   20-33
Testable (D-5)        24.196     1.986   19-28     24.982      2.004   21-34
Unified (D-6)         23.643     1.494   20-28     24.411      1.735   20-28
Overall Score         142.482    7.027   122-158   148.375     7.845   118-169

As shown in Table 29, the pre-assessment overall NSKS scores for the participants (N=56) ranged from 122 to 158. The participants had a mean pre-assessment score of 142.482 with a standard deviation of 7.027. The results also indicate that participants' NSKS post-assessment scores ranged from 118 to 169. The participants had a mean post-assessment score of 148.375 with a standard deviation of 7.845. These results seem to suggest that the laboratory instructional experience had a small but positive effect on some of the participants' NOS beliefs. However, each instructional method (pre-laboratory, laboratory work, post-laboratory) included multiple pedagogical components (i.e., quiz, MBL, laboratory notebook, and analysis paper) that may or may not have influenced the participants' NOS beliefs. In addition, explicit NOS instruction, discussed in chapter 2, was not included or monitored during this particular study. Taking into consideration that the range of possible overall scores is 48-240, the results indicated that some of the participants were not homogeneous prior to and after the semester of laboratory instruction in terms of their overall NOS beliefs, as 32 of the 56 participants moved toward the accepted-NOS-views score range (see Tables 30-31 and Appendix P).

As indicated in Tables 30 and 31, eleven participants' overall scores shifted from non-acceptance of NOS views (R) to neutral views (N). Approximately 4 participants moved from non-acceptance (R) of NOS views to acceptance of NOS views (I) by
the end of the semester course, while no participants' scores dropped from acceptance of NOS views (I) to non-acceptance (R). Seventeen of the participants moved from having neutral (N) views of NOS to accepting views of NOS (I). Six participants' scores remained in the neutral range (N), while 5 participants remained in the non-accepted views (R) of NOS range with minor changes in their individual dimension scores.

Table 30
NSKS Assessment Ranges

Belief Dimension      R-Pre   R-Post   N-Pre   N-Post   I-Pre   I-Post
Amoral (D-1)          30      21       9       11       17      24
Creative (D-2)        30      25       11      13       15      18
Developmental (D-3)   26      12       9       12       21      32
Parsimonious (D-4)    18      4        13      8        25      45
Testable (D-5)        16      8        13      17       27      31
Unified (D-6)         22      16       21      16       13      24
Overall Score         20      5        23      16       13      35

The overall average score for the NSKS at the beginning of the semester course for all participants was 142.482, indicating that most participants' NOS beliefs lay within the unaccepted NOS views. Among them, the highest score was 158, indicating acceptance of NOS views, and the lowest score was 122, suggesting non-acceptance of NOS views. For the pre-assessment overall scores, 13 of 56 students scored above 147, indicating an acceptance of NOS views, while 20 of 56 participants scored below 141, indicating initial non-acceptance of NOS views. The majority of participants scored from 141 to 147, considered the neutral range, indicating they held a mixture of accepted and non-accepted NOS views.


Table 31
NSKS Beliefs Shift Pre-Post Assessment – All Participants

Dimension             R→R   N→N   I→I   R→N   R→I   N→I   N→R   I→R   I→N
Amoral (D-1)          14    2     13    8     8     4     3     3     1
Creative (D-2)        15    4     8     7     7     3     5     5     2
Developmental (D-3)   9     2     16    7     10    5     2     1     4
Parsimonious (D-4)    3     2     21    3     12    11    0     1     3
Testable (D-5)        7     7     22    5     4     5     1     0     5
Unified (D-6)         7     5     4     7     8     12    4     5     4
Overall Score         5     6     13    11    4     17    0     0     0

By the end of the semester, the overall average score for all the participants was 148.375, indicating a slight shift from non-accepted views toward neutral views of NOS. The highest score was 169, indicating an acceptance of NOS views, and the lowest score was 118, in the range of non-acceptance of NOS views. Again it is worth noting that for the post-assessment overall score, 16 of 56 students scored in the neutral range of NOS views, while 5 participants' scores remained in the unaccepted NOS views range. The majority of the participants (35) scored in the accepted range of NOS views.

NSKS T-Test Results – All Participants

Paired samples t-tests were conducted for each dimension mean score and the overall mean score to compare the pre- and post-mean scores of the participants. Statistically significant (p ≤ 0.05) differences were found in five of the six dimensions (creative, developmental, parsimonious, testable, and unified) and in the overall score. In this dissertation, effect sizes are calculated from the mean difference score (mean Time 2 – mean Time 1) divided by the pooled standard deviation of Time 1 and Time 2. The results were analyzed by comparing pre- and post-test scores, the Hake gain (also called the Hake factor), and the maximum possible gain. The Hake gain is a normalized gain defined as

g = actual gain / maximum possible gain = (posttest score – pretest score) / (maximum score – pretest score)
The results are presented in Table 32.

There was not a statistically significant increase in the amoral dimension scores from pre-assessment (M=23.643, SD=3.205) to post-assessment (M=24.196, SD=2.713), t(55) = -1.414, p = 0.163, d=0.19 (a small, not statistically significant effect). There was a statistically significant increase in the creative dimension scores from pre-assessment (M=22.893, SD=2.095) to post-assessment (M=23.670, SD=2.288), t(55) = -2.262, p = 0.028, d=0.30 (a small but statistically significant effect).

Table 32
NSKS T-Test Analysis – All Participants

Dimension       Pre-Mean   Post-Mean   Gain     t-test   p-value   Effect Size   Eta²
Amoral          23.643     24.196      0.0338   -1.414   0.163     0.19          0.035
Creative        22.893     23.670      0.0470   -2.262   0.028*    0.30          0.085
Developmental   23.625     24.768      0.0700   -4.021   0.000*    0.54          0.227
Parsimonious    24.625     26.321      0.1103   -5.401   0.000*    0.72          0.346
Testable        24.196     24.982      0.0500   -2.537   0.014*    0.34          0.104
Unified         23.643     24.411      0.0470   -2.695   0.009*    0.36          0.117
Overall Score   142.482    148.375     0.0604   -8.152   0.000*    1.00          0.547
N = 56. *significant at p ≤ 0.05

There was a statistically significant increase in the developmental dimension scores from pre-assessment (M=23.625, SD=1.950) to post-assessment (M=24.768, SD=2.123), t(55) = -4.021, p = 0.000, d=0.54 (a medium, statistically significant effect). There was a statistically significant increase in the parsimonious dimension scores from pre-assessment (M=24.625, SD=2.378) to post-assessment (M=26.321, SD=2.552), t(55) = -5.401, p = 0.000, d=0.72 (a medium, statistically significant effect). There was a statistically significant increase in the testable dimension scores from pre-assessment (M=24.196, SD=1.986) to post-assessment (M=24.982, SD=2.004), t(55) = -2.537, p = 0.014, d=0.34 (a small but statistically significant effect). There was a statistically significant increase in the unified dimension scores from pre-assessment (M=23.643, SD=1.494) to post-assessment (M=24.411, SD=1.735), t(55) = -2.695, p = 0.009, d=0.36 (a small but statistically significant effect). There was a statistically significant increase in
the overall scores from pre-assessment (M=142.482, SD=7.027) to post-assessment (M=148.375, SD=7.845), t(55) = -8.152, p = 0.000, d=1.00 (a large, statistically significant effect).

The average gain score of all participants ranged from 0.0604 to 5.750 on a scale of 8-40, or approximately 4.471 to 7.351 points on a scale of 48-240. The paired t-test shows that this gain score represents a statistically significant mean difference between the pretest and posttest, with t = -8.152, p < 0.001. This indicates a moderately significant increase toward the acceptance of NOS views for several participants over the course of the semester, with an effect size of d=1.00. The results suggest that some of the participants in general changed their NOS beliefs during the course of the semester.

Eta squared is the proportion of the total variance that is attributed to an effect. In other terms, it is a variance-proportion estimate that can be positively biased and overestimate the true effect. However, it is usually calculated when performing a paired-samples t-test as an additional indicator of effect size (Pallant, 2003). The eta squared index (hand calculated in this case) indicates that 55% of the variability in the pre- and post-overall scores may be explained in part by the semester of laboratory instruction. So while there is a statistical difference, the practical difference is moderate and warrants further investigation.

NSKS results (Table 32) show a significant increase in the creative, developmental, parsimonious, testable, and unified dimensions for some of the participants. However, the participants seemed to struggle with the amoral dimension. In summary, based on the NSKS results: (1) the mean gain scores for the overall test and all dimensions, except for amoral, were found to be significant at p ≤ .05, and (2) the data suggest that laboratory instruction possibly effected a change in the students' NOS beliefs.


NSKS Correlations – All Participants

The differences between participants' responses on the pre-assessment and the post-assessment were tested as follows. To check the pattern of internal relationships between dimensions, between dimensions and overall scores, and between pre- and post-overall scores, Pearson's correlations between the pre- and post-assessment dimensions were calculated. Table 33 shows the correlation coefficients and the p-level of these correlations.

The NSKS (N=56) has good internal consistency, with a Cronbach's alpha coefficient of 0.729. The correlations shown in Table 33 indicate that 18 of the 19 pre- and post-assessment correlations were significant, either at the .05 or .01 level, providing additional support for the reliability of the instrumentation.

The relationship between the NSKS dimensional mean scores and the overall pre- and post-assessment mean scores was investigated using the Pearson product-moment correlation coefficient. All of the initial means of the six NSKS dimensions (D) significantly correlated with the initial total overall mean score at the 0.01 level (r(55) = 0.646, 0.556, 0.471, 0.522, 0.361, and 0.557, respectively). All of the final means of the six NSKS dimensions (D) significantly correlated with the final total overall mean score at the 0.01 level (r(55) = 0.547, 0.677, 0.647, 0.633, 0.594, and 0.365, respectively).

The amoral, developmental, and parsimonious dimensions as well as the overall NSKS pre- and post-mean scores were significantly correlated at the 0.01 level (r(55) = 0.521, 0.457, 0.547, and 0.741, respectively), while the creative and testable dimensions' pre- and post-means were significantly correlated at the 0.05 level (r(55) = 0.266 and 0.325, respectively). However, the results indicated a lack of significant correlation between the pre- and post-mean scores of the unified dimension (r(55) = 0.135).


Table 33
NSKS Paired Samples Correlations (N=56)

Pair             Correlation   Significance
Sum D1in-Totin   0.646**       0.000
Sum D2in-Totin   0.552**       0.000
Sum D3in-Totin   0.471**       0.000
Sum D4in-Totin   0.522**       0.000
Sum D5in-Totin   0.361**       0.006
Sum D6in-Totin   0.557**       0.000
Sum D1F-TotF     0.547**       0.000
Sum D2F-TotF     0.677**       0.000
Sum D3F-TotF     0.647**       0.000
Sum D4F-TotF     0.633**       0.000
Sum D5F-TotF     0.594**       0.000
Sum D6F-TotF     0.365**       0.006
Sum D1in-D1F     0.521**       0.000
Sum D2in-D2F     0.266*        0.047
Sum D3in-D3F     0.457**       0.000
Sum D4in-D4F     0.547**       0.000
Sum D5in-D5F     0.325*        0.014
Sum D6in-D6F     0.135         0.322
Sum Totin-TotF   0.741**       0.000
**Correlation is significant at the 0.01 level. *Correlation is significant at the 0.05 level.

Descriptive NSKS Statistics – Interview Participants

Interviewed participants' (N=20) pre- and post-scores concerning their nature of science (NOS) beliefs over the course of a semester were assessed using the NSKS. The NSKS, discussed in chapters two and three (see Appendix C), is designed to assess NOS beliefs in six dimensions: amoral, creative, developmental, parsimonious, testable, and unified. Each dimension is scored on a scale of 8 (realist-R, unaccepting of NOS views) to 40 (instrumentalist-I, accepting of NOS views). The overall NSKS score is the sum of all six dimensions, ranging from 48 to 240. Prior to data analysis, a check on the accuracy of data entry and missing data for the data set was done through SPSS frequencies. Before calculating the dimension (subscale) scores for both the pre- and post-assessments, scores for the negatively worded items were reversed using SPSS 15.0.

Descriptive statistics of the NSKS pre- and post-assessment scores (N=20) of the interviewed participants are outlined in Table 34, including means, standard deviations,
and ranges of scores for each dimension as well as the overall score. Pre- and post-assessment scores for participants are located in Appendix P.

Table 34
Descriptive Statistics – NSKS Scores – Interview Participants

Dimension             Pre-Mean   SD      Range     Post-Mean   SD      Range
Amoral (D-1)          23.150     2.368   20-28     24.350      1.954   20-28
Creative (D-2)        22.550     2.089   18-25     24.100      1.971   20-28
Developmental (D-3)   24.000     1.654   20-26     24.700      1.418   22-27
Parsimonious (D-4)    24.550     2.114   21-29     26.700      2.105   23-31
Testable (D-5)        24.050     2.089   19-27     24.300      1.418   21-26
Unified (D-6)         23.750     1.333   21-26     24.750      1.333   23-28
Overall Score         141.650    4.196   132-149   148.900     3.960   142-155

As shown in Table 34, the pre-assessment overall NSKS scores for the interviewed participants (N=20) ranged from 132 to 149. The participants had a mean pre-assessment score of 141.650 with a standard deviation of 4.196. The results also indicate that participants' NSKS post-assessment scores ranged from 142 to 155. The participants had a mean post-assessment score of 148.900 with a standard deviation of 3.900. These results seem to suggest that the laboratory instructional experience had a small but positive effect on some of the participants' NOS beliefs. However, each instructional method (pre-laboratory, laboratory work, post-laboratory) included multiple pedagogical components (i.e., quiz, MBL, laboratory notebook, and analysis paper) that may or may not have influenced the participants' NOS beliefs. In addition, explicit NOS instruction, discussed in chapter 2, was not included or monitored during this particular study. Taking into consideration that the range of possible overall scores is 48-240, the results indicated that some of the participants were homogeneous prior to and after the semester of laboratory instruction in terms of their overall NOS beliefs (see Tables 35-36 and Appendix P).
As indicated in Tables 35 and 36, five participants' overall scores shifted from non-acceptance of NOS views (R) to neutral views (N). Three participants moved from non-acceptance (R) of NOS views to acceptance of NOS views (I) by the end of the semester course, while no participants' scores dropped from acceptance of NOS views (I) to non-acceptance (R). Nine of the participants moved from having neutral (N) views of NOS to accepting views of NOS (I), while two participants' scores remained in the neutral range (N) with minor changes in their individual dimension scores.

The interviewed participants' overall average score for the NSKS at the beginning of the semester course was 141.650, indicating that most participants held neutral NOS beliefs. Among them, the highest score was 149, indicating acceptance of NOS views, and the lowest score was 132, suggesting non-acceptance of NOS views. For the pre-assessment overall scores, only 1 of the 20 interviewed participants scored above 147, indicating an acceptance of NOS views, while 8 of 20 participants scored below 141, indicating an initial non-acceptance of NOS views. The majority of participants (11) scored from 141-147, considered the neutral range, indicating that they held a mix of accepted and non-accepted NOS views.

By the end of the semester, the overall average score for all the interviewed participants was 148.900. The highest score was 155, earned by 2 participants, indicating acceptance of NOS views, and the lowest score was 142, scored by 2 participants, which falls in the neutral range. Again it is worth noting that for the post-assessment overall score, 13 of 20 students scored in the range of acceptance of NOS views, with the remaining 7 scoring in the neutral range. Therefore the majority of the participants scored in the acceptance of NOS views range by the end of the semester.
Table 35 NSKS Score Range – Pre-Post Count (N=20)

Belief Dimension     R-Pre   R-Post   N-Pre   N-Post   I-Pre   I-Post
Amoral (D-1)         11      6        4       5        5       9
Creative (D-2)       12      5        3       6        5       9
Developmental (D-3)  6       3        5       8        9       9
Parsimonious (D-4)   5       1        8       3        7       16
Testable (D-5)       8       4        3       5        9       11
Unified (D-6)        7       3        10      7        3       10
Overall Score        8       0        11      7        1       13

Table 36 NSKS Belief Shifts – Pre-Post Assessment

Dimension            R→R   N→N   I→I   R→N   R→I   N→I   N→R   I→R   I→N
Amoral (D-1)         5     2     5     3     3     1     1     0     0
Creative (D-2)       3     1     4     5     4     1     1     1     0
Developmental (D-3)  2     2     6     3     2     2     1     0     2
Parsimonious (D-4)   1     2     7     1     3     6     0     0     0
Testable (D-5)       3     1     8     3     1     1     1     0     2
Unified (D-6)        1     2     1     3     2     7     1     1     2
Overall Score        0     2     1     5     3     9     0     0     0

NSKS T-Test Results – Interview Participants

Paired samples t-tests were conducted for each dimension mean score and the overall mean score to compare the pre- and post-mean scores of the participants. Statistically significant (p ≤ 0.05) differences were found in three of the six dimensions (creative, parsimonious, and unified) and in the overall score. In this dissertation, effect sizes are calculated from the mean difference score (mean Time 2 minus mean Time 1) divided by the pooled standard deviation of Time 1 and Time 2. The results were analyzed by comparing pre- and post-test scores, the Hake gain (also called the Hake factor), and the maximum possible gain. The Hake gain is a normalized gain defined as g = (actual gain) / (maximum possible gain) = (posttest score − pretest score) / (maximum possible score − pretest score). The results are presented in Table 37.
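For readers who wish to reproduce the two quantities just defined, the Python sketch below shows one plausible implementation of Cohen's d from the pooled pre/post standard deviations and of the normalized (Hake) gain. It is illustrative only: the study's analyses were run in SPSS, the exact pooling convention used for the tabled effect sizes is not spelled out here, so the d value printed below may differ somewhat from Table 37, and the Hake gain uses the overall NSKS means from Table 34 with a maximum possible score of 240.

def cohens_d(mean_pre, mean_post, sd_pre, sd_post):
    # Mean difference divided by the pooled SD of Time 1 and Time 2
    # (equal-n pooling assumed; other conventions give slightly different values).
    pooled_sd = ((sd_pre ** 2 + sd_post ** 2) / 2) ** 0.5
    return (mean_post - mean_pre) / pooled_sd

def hake_gain(mean_pre, mean_post, max_score):
    # Normalized gain: actual gain divided by the maximum possible gain.
    return (mean_post - mean_pre) / (max_score - mean_pre)

# Overall NSKS values from Table 34: pre 141.650 (SD 4.196), post 148.900 (SD 3.960).
print(cohens_d(141.650, 148.900, 4.196, 3.960))
print(hake_gain(141.650, 148.900, 240))   # about 0.074, as reported in Table 37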
There was not a statistically significant increase in the amoral dimension scores from pre-assessment (M=23.150, SD=2.368) to post-assessment (M=24.350, SD=1.954), t(19) = -2.074, p = 0.052, d = 0.46 (small, not statistically significant effect). There was a statistically significant increase in the creative dimension scores from pre-assessment (M=22.550, SD=2.089) to post-assessment (M=24.100, SD=1.971), t(19) = -2.747, p = 0.013, d = 0.61 (medium, statistically significant effect).

Table 37 NSKS T-Test Analysis – Interview Participants

Dimension        Pre Mean   Post Mean   Gain     t-test   p-value   Effect size   Eta²
Amoral           23.150     24.350      0.0712   -2.074   0.052     0.46          0.185
Creative         22.550     24.100      0.8882   -2.747   0.013*    0.61          0.284
Developmental    24.000     24.700      0.0438   -1.853   0.079     0.41          0.153
Parsimonious     24.550     26.700      0.1391   -4.060   0.010*    0.91          0.464
Testable         24.050     24.300      0.0157   -0.677   0.506     0.15          0.024
Unified          23.750     24.750      0.0615   -2.297   0.033*    0.51          0.217
Overall Score    141.650    148.900     0.0737   -7.623   0.000*    1.00          0.753
*significant at p ≤ 0.05

There was not a statistically significant increase in the developmental dimension scores from pre-assessment (M=24.000, SD=1.654) to post-assessment (M=24.700, SD=1.418), t(19) = -1.853, p = 0.079, d = 0.41 (small, not statistically significant effect). There was a statistically significant increase in the parsimonious dimension scores from pre-assessment (M=24.550, SD=2.114) to post-assessment (M=26.700, SD=2.105), t(19) = -4.060, p = 0.010, d = 0.91 (large, statistically significant effect). There was not a statistically significant increase in the testable dimension scores from pre-assessment (M=24.050, SD=2.089) to post-assessment (M=24.300, SD=1.418), t(19) = -0.677, p = 0.506, d = 0.15 (small, not statistically significant effect). There was a statistically significant increase in the unified dimension scores from pre-assessment (M=23.750, SD=1.333) to post-assessment (M=24.750, SD=1.333), t(19) = -2.297, p = 0.033, d = 0.51 (medium, statistically significant effect). There was a statistically significant increase in the overall scores from pre-assessment (M=141.650, SD=4.196) to post-assessment (M=148.900, SD=3.960), t(19) = -7.623, p = 0.000, d = 1.00 (large, statistically significant effect).
The average gain score of the interviewed participants ranged from 0.0737 to 6.85 on a scale of 8 to 40, or 5.259 to 9.241 points on a scale of 48 to 240. The paired t-test shows that this gain represents a statistically significant mean difference between the pretest and posttest, with t = -7.623, p < 0.001. This indicates a large, statistically significant increase in the sophistication level of several participants' NOS beliefs over the course of the semester, with an effect size of d = 1.00. The results suggest that some of the interviewed participants, in general, improved their NOS beliefs during the course of the semester.

Eta squared is the proportion of the total variance that is attributed to an effect. In other terms, it is considered a variance proportion estimate that can be positively biased and overestimate the true effect. However, it is usually calculated when performing a paired-samples t-test as an additional indicator of effect size (Pallant, 2003). The eta squared index (hand calculated in this case) indicates that 75% of the variability in the pre- and post-overall scores may be explained in part by the semester of laboratory instruction. So while there is a statistical difference, the practical difference is moderate and warrants further investigation.

NSKS results (Table 37) show a significant increase in the creative, parsimonious, and unified dimension scores for the interviewed participants. The interviewed participants seem to struggle with the amoral, developmental, and testable dimensions. In summary, based on the NSKS results: (1) the mean gain scores for the overall score and three dimensions (creative, parsimonious, and unified) were found to be significant at p ≤ .05, and (2) the data suggest that laboratory instruction possibly effected a change in the participants' scores in these three NSKS dimensions.
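The paired-samples t-tests and the hand-calculated eta squared reported above can be reproduced with standard statistical software; the Python/scipy sketch below is a minimal illustration. The pre/post arrays are invented placeholder scores (the participants' actual scores are in Appendix P), and eta squared is computed from the t statistic as t² / (t² + df), which reproduces the tabled overall value of approximately .75 when t = -7.623 and df = 19.

import numpy as np
from scipy import stats

# Placeholder pre/post scores for one NSKS dimension (N = 20); not the study's data.
pre = np.array([23, 24, 22, 25, 23, 24, 22, 21, 24, 23,
                25, 22, 23, 24, 22, 23, 25, 24, 23, 22])
post = np.array([25, 24, 23, 27, 24, 26, 23, 24, 25, 24,
                 26, 23, 25, 25, 24, 25, 26, 25, 24, 23])

t_stat, p_value = stats.ttest_rel(pre, post)       # paired-samples t-test
df = len(pre) - 1
eta_squared = t_stat ** 2 / (t_stat ** 2 + df)     # additional effect-size index
print(f"t({df}) = {t_stat:.3f}, p = {p_value:.4f}, eta squared = {eta_squared:.3f}")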
NSKS Correlations – Interview Participants

The differences between interview participants' responses on the pre-assessment and the post-assessment were tested as follows. To check the pattern of internal relationships between dimensions, between dimensions and overall scores, and between pre- and post-overall scores, Pearson's correlations between the pre- and post-assessment dimensions were calculated. Table 38 shows the correlation coefficients and the p-level of these correlations. The correlations in Table 38 indicate that only 2 of the 19 pre- and post-assessment pairs were significantly correlated, at either the .05 or .01 level. The smaller sample size (N=20) may have contributed to the lack of correlation between the pre- and post-dimension scores and between the dimension scores and the overall scores.

The relationship between the NSKS dimensional mean scores and the overall pre- and post-assessment mean scores was investigated using the Pearson product-moment correlation coefficient. None of the initial or post means of the six NSKS dimensions significantly correlated with the initial or final overall mean score. As suggested previously, this lack of correlation may be due to the small sample (N=20). The testable dimension pre- and post-means were significantly correlated at the 0.01 level (r(19) = 0.616), while the overall pre- and post-means were significantly correlated at the 0.05 level (r(19) = 0.457). However, the results indicated a lack of significant correlations between the pre- and post-mean scores of the remaining 5 dimensions, amoral, creative, developmental, parsimonious, and unified (r(19) = 0.295, 0.229, 0.404, 0.370, and 0.067, respectively). Once again the lack of correlation may be due to the small sample size.
Table 38 NSKS Paired Samples Correlations (N=20)

Pair               Correlation   Significance
Sum D1in-Totin     0.440         0.052
Sum D2in-Totin     0.239         0.310
Sum D3in-Totin     0.409         0.073
Sum D4in-Totin     0.408         0.074
Sum D5in-Totin     0.188         0.427
Sum D6in-Totin     0.266         0.257
Sum D1F-TotF       0.404         0.077
Sum D2F-TotF       -0.025        0.916
Sum D3F-TotF       -0.070        0.769
Sum D4F-TotF       0.292         0.211
Sum D5F-TotF       0.028         0.906
Sum D6F-TotF       0.014         0.954
Sum D1in-D1F       0.295         0.206
Sum D2in-D2F       0.229         0.332
Sum D3in-D3F       0.404         0.077
Sum D4in-D4F       0.370         0.108
Sum D5in-D5F       0.616**       0.004
Sum D6in-D6F       0.067         0.780
Sum Totin-TotF     0.457*        0.043
**Correlation is significant at the 0.01 level
*Correlation is significant at the 0.05 level
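The Pearson correlations in Tables 33 and 38 can be computed in the same way as the t-tests above; the sketch below, again using invented placeholder scores rather than the study's data, shows the calculation for a single pre/post pair of dimension scores.

import numpy as np
from scipy import stats

# Placeholder pre/post testable-dimension scores for 20 participants.
pre_testable = np.array([24, 25, 23, 26, 22, 24, 25, 23, 27, 24,
                         23, 25, 24, 26, 22, 25, 24, 23, 26, 24])
post_testable = np.array([25, 25, 24, 26, 23, 24, 26, 23, 26, 25,
                          24, 25, 25, 26, 23, 25, 24, 24, 26, 24])

r, p = stats.pearsonr(pre_testable, post_testable)  # Pearson product-moment correlation
print(f"r = {r:.3f}, p = {p:.3f}")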
Discussion

Range of Initial Beliefs

RQ1. What range of personal epistemological and NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

Participants' initial scores on the Epistemological Beliefs Assessment for Physical Science (EBAPS) represent a range of beliefs from unsophisticated to highly sophisticated, with the majority falling into the moderately sophisticated range (2.4-2.9). No participants scored in the top sophistication level, extremely sophisticated, meaning that there were no participants at the beginning of the semester course who held the high level of epistemological beliefs theorized in the models (Baxter-Magolda, 1986; Schommer, 1990; Hofer & Pintrich, 1997; Perry, 1970). Most of the participants' initial scores fell in the range of late dualism to late multiplicity (levels 2-4) in Perry's model and in the absolute knowing to transitional knowing range of Baxter Magolda's model. The average EBAPS overall score of 2.514 would place the participants in the early multiplicity stage or transitional knowing stage of epistemological development. This gives some support to Perry's and Baxter Magolda's findings that students, depending on their year in college and other factors such as age and gender, begin as dualists or multiplists.

Participants at level 2, or absolute knowing, usually perceive the world, especially scientific knowledge, from a dualistic viewpoint. They divide scientific knowledge into either right or wrong answers based on what is known to authority. These participants' beliefs are guided by obedience to authority and hard work. Participants at level 3, or transitional knowing, acknowledge the existence of diversity of opinion and the uncertainty of scientific knowledge and are considered relativistic students. This shift represents an increase in tolerance of uncertainty, with notions of right and wrong having meaning only in context, and uncertainty becomes legitimate (Moore, 2002).
The results of the study support an initial personal epistemological belief range (1.58-3.23) of unsophisticated to highly sophisticated at the beginning of the semester course, with the majority of the participants falling at the low end of moderately sophisticated beliefs (2.514), or multiplicity. However, according to the multidimensional epistemological beliefs models of Schommer (1994) and Hofer and Pintrich (1997), beliefs are a system of independent distributions. In other words, students may be sophisticated in some beliefs but not necessarily sophisticated in other beliefs. According to Schommer (1994), there are multiple dimensions to be considered and thought of independently as well as in various combinations (Hofer & Pintrich, 1997).

The EBAPS measured the participants' beliefs in five dimensions: structure of scientific knowledge, nature of knowing and learning science, real-life applicability of science, evolving scientific knowledge, and the source of ability to learn science. The participants initially held naïve beliefs about the structure of scientific knowledge (2.172) and evolving knowledge in science (2.357). These average scores suggest a dualistic perspective about the structure of scientific knowledge. Participants holding this view see scientific knowledge as right or wrong and authority as always correct. At the beginning of the semester course participants held low moderately sophisticated beliefs about the nature of knowing and learning science (2.511). This average score suggests an early multiplist view of the nature of knowing and learning science. Here the participants are beginning to recognize that diversity and uncertainty are possible and truth is knowable. However, the participants scored slightly higher in real-life applicability of science (2.665), moving toward the mid-range of moderately sophisticated beliefs, or multiplicity. The highest initial average score was in the source of ability to learn science (2.896), which lies at the high end of the moderately sophisticated beliefs range, or multiplicity.
Here participants are inclined to believe that there are no absolute answers, that all views are equally valid, and that each individual has a right to his or her own opinion. The distribution of average scores within each epistemological dimension corresponds with Schommer's (1994) and Hofer and Pintrich's (1997) views that beliefs are better described in terms of distributions rather than a single point along a continuum as described in the uni-dimensional models (Baxter Magolda, 1986; Belenky et al., 1986; King & Kitchener, 1994; Kuhn, 1991; Perry, 1970).

Participants' initial scores on the Nature of Scientific Knowledge Scale, NSKS (Rubba & Anderson, 1978), represent a range of beliefs from realist to instrumentalist, with the majority falling into the neutral range (141-147). No participants scored at the high end of the scale (240) of accepted views of the nature of science (NOS), meaning that there were no participants at the beginning of the semester course who held the high level of NOS beliefs theorized in the NOS model (Abd-El-Khalick & Lederman, 1998; Lederman, Wade & Bell, 1998; Ryder, Leach & Driver, 1999). A majority of the participants' initial scores fell in the neutral range and at the high end of the relativist range. According to Hogan (2000), students have mixed views about the NOS, suggesting that some indicate a view of science as dynamic while others indicate a view of science as static.

Learners at many age levels seem to understand that scientific knowledge changes but tend to see change as a "right" idea replacing a "wrong" one. However, they do not believe that theories as a whole change (Driver et al., 1996; Khishfe & Abd-El-Khalick, 2002; Lederman & O'Malley, 1990; Linn & Songer, 1993). Learners do not recognize these theoretical changes and view scientific knowledge as trouble-free and providing right answers (Carey et al., 1989; Driver et al., 1996). Students believe that getting the "right" answer relies on proper implementation of the scientific method (Hogan, 1999; Linn & Songer, 1993; Millwood & Sandoval, 2004).
Changes in NOS Beliefs

RQ1a. Do students' images of the nature of chemistry (NOS) change by the completion of a semester general chemistry laboratory course?

Participants' final scores on the Nature of Scientific Knowledge Scale, NSKS (Rubba & Anderson, 1978), represent a range of beliefs from a "high-end" realist to a "low-end" instrumentalist, with the majority of the participants falling into the "low-end" instrumentalist (148.375) range. This suggests that some of the participants moved toward the acceptance of NOS views during the course of the semester. Within each dimension, shifts occurred from realist views (non-acceptance of NOS views) to neutral views (acceptance of some NOS views) and instrumentalist views (acceptance of NOS views). For this study the desired shift for the participants was towards the instrumentalist views. As shown in Table 39, there was an overall improvement towards the acceptance of NOS views by the end of the semester course.

Table 39 NSKS Percent Change

Dimension        R Pre   R Post   R Change   N Pre   N Post   N Change   I Pre   I Post   I Change
Amoral           54%     37%      -17        16%     20%      +4         30%     43%      +13
Creative         53%     47%      -6         20%     23%      +3         27%     30%      +3
Developmental    46%     21%      -25        16%     22%      +6         38%     57%      +19
Parsimonious     34%     7%       -27        23%     14%      -9         43%     79%      +36
Testable         27%     13%      -14        23%     30%      +7         50%     57%      +7
Unified          39%     28%      -11        38%     29%      -9         23%     43%      +20
*R = Realist, N = Neutral, I = Instrumentalist; Pre = pre-assessment percentage, Post = post-assessment percentage, Change = percentage-point change
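Each cell of Table 39 is simply the pre- or post-assessment percentage of participants falling in a belief category, together with the percentage-point change. The brief sketch below shows that arithmetic with hypothetical counts for one dimension; the study's own tallies for the interview group appear in Table 35.

def percent_change(pre_counts, post_counts):
    # Convert realist/neutral/instrumentalist counts to percentages and
    # percentage-point changes, as laid out in Table 39.
    n_pre, n_post = sum(pre_counts.values()), sum(post_counts.values())
    table = {}
    for category in ("R", "N", "I"):
        pre_pct = round(100 * pre_counts[category] / n_pre)
        post_pct = round(100 * post_counts[category] / n_post)
        table[category] = (pre_pct, post_pct, post_pct - pre_pct)
    return table

# Hypothetical counts for one dimension (not the study's exact tallies).
print(percent_change({"R": 30, "N": 9, "I": 17}, {"R": 21, "N": 11, "I": 24}))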
The study shows that some participants became more accepting of NOS views for the dimensions related to the importance of experimental tests and observations, the tentativeness of scientific knowledge, the simplicity of scientific knowledge, and the unity of nature on the NSKS. As shown in Table 39, most participants in this study had limited problems with the acceptance of NOS views for the parsimonious dimension of the NSKS, that scientific knowledge tends toward simplicity (79%). Some of the participants realized the importance of experimental tests and/or observations, that scientific knowledge is tentative, and the unity of nature on the NSKS. For example, 78% of the students understood that scientific laws, theories, and concepts should be stated as simply as possible. Additionally, the NSKS dimension of developmental states that scientific knowledge is never "proven" and changes over time. Fifty-seven percent of the participants agreed that today's scientific laws, theories, and concepts may have to be changed in the face of new evidence. Seventy-nine percent of the participants thought that scientific knowledge needs to be capable of experimental testing.

Many participants in this study agreed with the model on the testable and unified nature of scientific knowledge. They believe that scientific knowledge must be subject to testing and that the interaction of the various disciplines of science contributes to the overall understanding of the nature of science.

However, many participants were confused on the amoral, creative, and unified levels of the NOS on the NSKS. Within the amoral dimension, participants' final scores reflected a minimal change from the "high-end" of realist to the neutral range. By the end of the semester course, 43% of the participants reported that even if the applications of a scientific theory are judged to be bad, we should not judge the theory itself. This result shows that some of the participants seem to realize the difference between scientific theory itself and the applications of the theory. However, the participants thought that moral judgment needs to be placed on both the applications of scientific knowledge and the knowledge itself. This suggests that many of the participants did not understand that the cause of some mistakes is not scientific knowledge itself, but how humans make use of scientific knowledge.
That may be why 37% of the participants indicated that certain pieces of scientific knowledge are good and others are bad. This result suggests that the participants could not clearly distinguish between scientific knowledge and the applications of scientific knowledge in moral judgment.

The creative dimension involves the aspect that scientific knowledge is a product of the human intellect and is a tenet scientists want students to believe. Only 30% of the participants in this study believed that scientific knowledge expresses the creativity of scientists and represents imaginative thoughts, whereas almost one half of the participants (47%) thought that "scientific theories are discovered, not created by man". Two possible answers probably can shed some light on this controversial problem. First, these participants believed that scientific theories are not created by man; the theories are just discovered by man. In this view, participants thought that scientific theories are already there and are just waiting for man to discover them. Second, these participants may not realize the difference between creativity and discovery. In this view, the problem is related to the meanings of words, not to knowledge of the NOS. Lederman (1992) stresses that even though scientific knowledge is at least partially based on and/or derived from observations of the natural world, it involves human imagination and creativity. He stated that science involves the invention of explanation, which requires a great deal of creativity.

The unified dimension of the NSKS is the belief that scientific knowledge is born out of an effort to understand the unity of nature, and that the knowledge produced by biology, chemistry, and physics contributes to a network of laws, theories, and concepts. Forty-three percent of the participants indicated that there are similarities among biology, chemistry, and physics.
According to Lederman (1992), references to the NOS as part of a science curriculum topic have appeared throughout the 20th century. However, increased emphasis in this area began in the 1960s, culminating in the inclusion of the nature of science as a key topic in the scientific literacy curriculum focus that has predominated over the last 20 years.

The measure of participants' understanding of the NOS was included primarily because of the view that students often do not have an adequate understanding of the NOS, which is a critical component for scientific literacy (Lederman et al., 2002; Schwartz & Crawford, 2003) and success in the science fields. It is also a small component of the major research focus of epistemological beliefs. The EBAPS variables structure of knowledge and evolving knowledge presented some questions related to NOS (see Appendix N). The influence of NOS on students' epistemological beliefs as related to science and learning science needs to be investigated further.

Changes in Personal Epistemological Beliefs

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change, if any, by the completion of a semester general chemistry laboratory course?

Participants' final scores on the Epistemological Beliefs Assessment for Physical Science (EBAPS) represent a range of beliefs from unsophisticated to extremely sophisticated, with the majority still falling into the moderately sophisticated range (2.4-2.9) at the end of the semester course. One participant scored in the top sophistication level, extremely sophisticated, while several moved from moderately sophisticated to highly sophisticated by the end of the course. This adds support to the research on epistemological beliefs theorized in the models (Perry, 1970; Baxter-Magolda, 1986; Schommer, 1990; Hofer & Pintrich, 1997) that some change in beliefs occurs as learners interact with the educational environment and respond to new learning experiences by either integrating them into their existing cognitive frameworks or accommodating the frameworks themselves.
This suggests that change is brought about through cognitive disequilibrium; however, in this study cognitive disequilibrium was not directly monitored. As learners with naïve personal epistemological beliefs encounter the complex and uncertain information presented in higher education science courses, these complexities and uncertainties bring about a change that results in a maturing of their epistemological beliefs. The learner will therefore move from a dualistic level (1-2) to, at a minimum, the beginnings of a relativistic level (5-6) by their senior year of college.

Most of the participants' final EBAPS scores fell in the range of early to late multiplicity (levels 3-4) in Perry's model and in the transitional knowing range of Baxter Magolda's model. The average EBAPS overall score of 2.771 would place the participants in the middle of the multiplicity stage or transitional knowing stage of epistemological development. This gives some support to findings that students, depending on their year in college and other factors such as age and gender, will progress in a positive manner toward higher epistemological beliefs at different rates (Perry, 1970; Baxter Magolda, 1986; Moore, 2002). As in Perry's study, not all the participants in this study began in the dualistic stage, nor did all the participants improve in their beliefs. This is due in part to the shortness of the study, which spanned a single semester, whereas many of the studies discussed covered longer periods of time.

The results of the study support a final personal epistemological belief range (1.28-3.55) of unsophisticated to extremely sophisticated by the end of the semester course, with the majority of the participants falling in the mid to upper range of moderately sophisticated beliefs (2.771), or multiplicity. Participants at the higher end of level 3, multiplicity or transitional knowing, make the departure from looking for certainty from an authority figure to accepting that some things in science will never be known and that one's own opinion is important.
According to Moore (2002), the beginning of participant ownership of ideas and knowledge emerges.

The EBAPS measured the participants' end-of-course beliefs in five dimensions: structure of scientific knowledge, nature of knowing and learning science, real-life applicability of science, evolving scientific knowledge, and the source of ability to learn science. The participants moved from naïve beliefs about the structure of scientific knowledge (2.172) to more moderately sophisticated beliefs (2.488) during the course of the semester. This move from a dualistic view to one of multiplicity suggests that some growth occurred in participants' views, from seeing the structure of scientific knowledge as an accumulation of concrete, discrete facts to viewing it as an interrelated network of strongly connected and highly structured concepts.

At the beginning of the semester course participants held low moderately sophisticated beliefs about the nature of knowing and learning science (2.511), whereas by the end of the semester their beliefs had moved slightly (2.760) into the mid-range of moderately sophisticated beliefs. This final average score suggests a move towards a mid-range multiplist view of the nature of knowing and learning science, meaning that the participants are beginning to recognize that diversity and uncertainty are possible and truth is knowable.

The participants scored slightly higher in real-life applicability of science (2.978), moving toward the high range of moderately sophisticated beliefs, or multiplicity. Students moved from accepting diversity and uncertainty as legitimate but temporary to believing that all views are equally valid, with a shift to the self as an active maker of meaning.

The greatest increase from initial to post scores was seen in the dimension of evolving knowledge in science (2.804). For some of the participants, the degree to which they viewed scientific knowledge as fixed (set in stone) or fluid (tentative) changed during the course of the semester.
This change suggests that some participants began to view scientific knowledge as approximate, tentative, and refutable rather than absolute, exact, and final.

Once again the highest final average score was in the source of ability to learn science (3.107), which places some of the participants at the low end of the highly sophisticated beliefs range. This average score models Perry's (1970) position involving relativism and Baxter Magolda's (1986) position of independent knowing. Here participants are inclined to take more responsibility for their own learning rather than relying heavily on authority, and to acknowledge that some viewpoints are more valid than others. This move from multiplism or transitional knowing to relativism or independent knowing is considered a significant development in an individual's epistemological beliefs (Moore, 2002).

As stated earlier, the distribution of average final scores within each epistemological dimension corresponds with Schommer's (1994) and Hofer and Pintrich's (1997) views that beliefs are better described in terms of distributions rather than a single point along a continuum as described in the uni-dimensional models (Baxter Magolda, 1986; Belenky et al., 1986; King & Kitchener, 1994; Kuhn, 1991; Perry, 1970).

Summary

This chapter presented and discussed the quantitative findings related to research question 1 and sub-questions 1a and 1b concerning the range of personal epistemological and NOS beliefs at the beginning of the semester chemistry laboratory course and the change, if any, in the range of both beliefs at the end of the semester course.

There was a 5.8% average increase in the overall NSKS participant scores by the end of the semester course. The average NSKS score was 142 at the beginning of the semester, placing more of the participants on the relativist end of the NSKS scale, holding non-accepted NOS views.
However, by the end of the semester the average NSKS score was 148, placing more of the participants on the instrumentalist end of the NSKS scale, holding accepted NOS views. Three categories of NOS beliefs as indicated on the NSKS showed the highest improvement within the group of participants. The greatest improvement in scores was seen in the variables parsimonious, unified, and developmental. The participants presented higher levels of mature beliefs within the variables of parsimonious, developmental, and testable.

There was a 6.4% average increase in the overall EBAPS participant scores by the end of the semester course. The average EBAPS score was 2.514 at the beginning of the semester, placing more of the participants in the moderately low sophistication level of epistemological beliefs. However, by the end of the semester the average EBAPS score was 2.771, placing more of the participants in the moderately high sophistication level of epistemological beliefs. Three categories of the EBAPS evaluating epistemological beliefs showed the highest improvement within the group of participants. The greatest improvement in scores was seen in the variables evolving knowledge, structure of knowledge, and real-life applicability. The participants presented higher levels of mature beliefs within the variables of source of ability to learn, evolving knowledge, and real-life applicability.

The next chapter presents a description of the development of the participants' personal epistemological beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1b. The characterization of personal epistemological beliefs and any changes in those beliefs that may have resulted, along with analyses of the participants' responses to interview probes, will be presented.
The combination of interviews and quantitative measures will provide a glimpse into participants' epistemological belief changes during the course of a semester and what the participants believed influenced their beliefs.
Chapter Five: Development of Epistemological Beliefs

Introduction

Chapter five presents a description of the development of the participants' personal epistemological beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1b. The characterization of the participants' personal epistemological beliefs as related to science is discussed with the use of the participants' responses to interview probes. The combination of the interviews and the quantitative measures previously discussed in chapter four will provide a glimpse into the participants' personal epistemological belief changes during the course of the semester.

Because the major objective of this research was to determine if students' personal epistemological beliefs change over the course of a semester in a laboratory instructional setting, the next step looks closely at the epistemological data. These descriptions will be generated from the pre- and post-EBAPS test data and, more importantly, the participants' responses during the initial and final interviews. The results are discussed and related back to the key personal epistemological beliefs literature. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to personal epistemological belief changes in light of specific science laboratory instructional features for future research.
Method of Analysis

This analysis was conducted in a multi-layered, multi-stage process, through reading and sorting participants' responses to epistemological questions, both general in nature and specific to the course. The analyses below are organized by the EBAPS dimensions (axes): structure of knowledge, nature of knowing and learning, real-life applicability, evolving knowledge, and source of ability to learn. The aforementioned dimensions (axes) served as the major theme codes, giving a framework from which first-order themes originally derived from the participants' verbatim quotations or raw data themes could be analyzed. Within each dimension (axis), the responses to interview and reflective questions regarding personal epistemological beliefs at the beginning and end of the semester are presented. The intent of this analysis is to expand the theoretical understanding of the dimensions (axes) of personal epistemology in science and the continuum of beliefs, as expressed in context. Illustrative quotes have been selected from the interviewed participants as representative of the range of beliefs along the continuum. Table 40 presents a demographic overview of the interview participants with their participant identification numbers. Quotes are identified with the letters ST followed by the participant's identification number (Table 40). The final interview quotes follow the initial interview quotes (In) and are identified in bold text and coded with the letter F.
Table 40 Demographic Statistics of Interview Participants

ID   Sex   Age   Major                     College Year
1    F     19    Pre-Pharmacy              Fr
2    F     21    Psychology                So
3    F     21    Biomedical Science        Jr
4    M     24    Electrical Engineering    So
5    M     22    Environmental Science     Jr
6    F     27    Marine Science            None
7    F     20    Biomedical Sciences       Jr
8    M     18    Undeclared                Fr
9    F     18    Environmental Science     Fr
10   F     20    Environmental Science     So
11   F     19    Nursing                   Fr
12   F     18    Undecided                 Fr
13   F     18    Pre-Pharmacy              Fr
14   F     19    Pre-Pharmacy              Fr
15   F     20    Biology                   So
16   F     18    Environmental Science     Fr
17   F     24    Physical Ed               Jr
18   F     20    Athletic Training         Jr
19   F     19    Biomedical Sciences       So
20   F     45    Masters Nursing           None

The main research questions that guided this portion of the study were:

RQ1. What range of personal epistemological beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?
Summary of EBAPS Overall Scores

Using the overall scores on the EBAPS (Table 41), discussed in chapter four, to measure relative increases or decreases in epistemological understandings, the results show that forty-five participants increased their total scores while ten participants' scores decreased by the end of the semester course. One participant's score remained unchanged from the pre-test to the post-test. The total overall mean score between the pre-test and the post-test resulted in an average increase of 0.26 (6.5 points).

What is clear is that several of the participants' overall scores did show some improvement in epistemological beliefs by the end of the semester course. Nineteen of the fifty-six participants improved their EBAPS scores by 6.5 points or less (0.26), while twenty-five improved their scores by more than 6.5 points (8-35 points; 0.32-1.40). Therefore, 79% of the participants improved their EBAPS scores. The remaining twelve either had no change in their score or lost points. Whether this lack of improvement was in any way influenced by laboratory instruction or outside factors will be presented later in chapter seven.

Table 41 Descriptive Statistics EBAPS Scores – All Participants

Dimension                             Pre-Mean (N=56)   Pre-Mean (N=20)   Post-Mean (N=56)   Post-Mean (N=20)
Structure of Knowledge (A-1)          2.172             2.090             2.488              2.512
Nature of Knowing & Learning (A-2)    2.511             2.569             2.760              2.935
Real-life Applicability (A-3)         2.665             2.788             2.978              3.138
Evolving Knowledge (A-4)              2.357             2.150             2.804              2.783
Source of Ability to Learn (A-5)      2.896             3.000             3.107              3.210
Overall Score                         2.514             2.537             2.771              2.867
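The gain bands used throughout this chapter (a decrease, a gain of 0.26 (6.5 points) or less, and a gain of more than 0.26) can be expressed as a simple classification rule; the Python sketch below applies it to a few invented pre/post overall EBAPS scores, purely for illustration.

def gain_band(pre, post, threshold=0.26):
    # Classify an EBAPS overall-score change into the bands used in this chapter.
    diff = post - pre
    if diff < 0:
        return "decrease"
    if diff == 0:
        return "no change"
    return "gain of 0.26 or less" if diff <= threshold else "gain above 0.26"

# Invented pre/post pairs for illustration only.
for pre, post in [(2.45, 3.00), (2.60, 2.75), (2.80, 2.80), (2.70, 2.55)]:
    print(pre, post, "->", gain_band(pre, post))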
Summary of EBAPS Interview Scores

As for the interview participants (N=20), 85% improved their EBAPS scores by the end of the semester (Table 42). Eight participants improved their scores by 6.5 points (0.26) or less, while another nine improved their scores by more than 6.5 points (0.35-1.40; 9-35 points). Three of the interview participants showed no overall gain in their scores. As stated earlier, whether the improvements or lack of improvements were in any way influenced by laboratory instruction or other possible factors will be presented later in chapter seven.

Student five had the lowest overall EBAPS pre-test score of 1.88 (47), followed by student 10 (2.05; 51). Although 85% of the interview participants showed an increase in total EBAPS scores, student ten had the largest total score increase (1.40; 35) for the entire population sample (see Table 42). Student ten improved her sophistication in all five dimensions of the EBAPS. Student five put forth great effort to gain understanding during instruction but showed only a small quantitative increase in overall epistemological sophistication, as measured from the EBAPS pre-test to the post-test.

Student 16's pre- and post-test scores were the highest of the interview participants (2.85 (71) and 3.55 (89), respectively). This was an above-average increase of 17.5 points (0.70), suggesting a marked improvement in the sophistication of her epistemological beliefs. In addition, students 1, 6, 8, 14, and 15 all improved their epistemological beliefs, scoring in the highly sophisticated level by the end of the semester course. This marked improvement supports the basic theory of the epistemological belief models discussed in chapter two that some students undergo a developmental progression in their epistemological beliefs (Perry, 1970; Belenky et al., 1986; King & Kitchener, 2002; Baxter Magolda, 2002; Kuhn, 1991; Schommer-Aikins, 1990; Hofer & Pintrich, 1997).
Table 42 Descriptive EBAPS Statistics – Interview Participants

ID   Gender   CCI   EBAPS Pre   EBAPS Post   Difference
1    F        72    2.70        3.13         0.43***
2    F        76    2.35        2.55         0.20**
3    F        81    2.38        2.97         0.59***
4    M        67    2.70        2.62         -0.08*
5    M        86    1.88        2.08         0.20**
6    F        63    2.37        3.12         0.75***
7    F        63    2.32        2.77         0.45***
8    M        72    2.83        3.22         0.39***
9    F        45    2.53        2.60         0.07**
10   F        72    2.05        3.45         1.40***
11   F        58    2.80        2.98         0.18**
12   F        63    2.63        2.78         0.15**
13   F        49    2.63        2.48         -0.15*
14   F        65    2.48        3.02         0.54***
15   F        76    2.98        3.12         0.14**
16   F        77    2.85        3.55         0.70***
17   F        65    2.50        2.45         -0.05*
18   F        76    2.63        2.77         0.14**
19   F        67    2.52        2.87         0.35***
20   F        58    2.65        2.80         0.15**
*decrease in score; **gain of 0.26 (6.5 points) or less; ***gain of more than 0.26

Characterization of Epistemological Beliefs

Although the EBAPS assessment serves the purpose of finding out if, and in what categories, students' beliefs are changing, we needed a way to explore how these beliefs changed during the semester. Using a set of probe questions, initial and final interviews were conducted to ascertain whether, and if so how, participants' epistemological beliefs changed during the semester of laboratory instruction.

Key areas that appeared to provide opportunities for participants to make inferences about their beliefs included the initial and final interviews. The initial interviews lasted approximately 15-20 minutes and focused on the five dimensions (axes) of the EBAPS and four of the NSKS dimensions to be discussed in chapter 6.
The final interviews lasted 30-45 minutes and focused on the EBAPS beliefs discussed in chapters 4 and 5, the NOS beliefs discussed in chapters 4 and 6, the EBAPS dimensions in relation to the instructional features/practices discussed in chapter 7, and general NOS beliefs in relation to the instructional features/practices discussed in chapter 7. The following discussion presents an overview of the responses by the interview participants to the personal epistemological beliefs probes during the initial and final interviews. The discussion is organized with the use of the five EBAPS dimensions.

Initial and Final Epistemological Beliefs Interviews

During the initial and final interviews, five questions related to the multi-dimensional axes of the EBAPS (structure of scientific knowledge, nature of knowing and learning science, real-life applicability of science, evolving scientific knowledge, and source of ability to learn science) were used to probe the participants (Appendices B & N). These were designed to investigate the participants' epistemological beliefs.

The interview participants were asked to elaborate on the questions in order to invoke their thoughts about the EBAPS variables. The questions themselves were meant to look at different areas of epistemological beliefs within the EBAPS. According to Wood and Kardash (2000), one must be aware of the interconnectedness of epistemological beliefs and the placement of questions into specific categories based on the assessment tool implemented. The interconnectedness of the variables of epistemological beliefs is established by the answers of the interview participants. These answers can often display different epistemological categories within one question. This suggests that one cannot fully isolate these variables and can only search for evidence in the participants' reflections and interviews.

This study investigated the change from the beginning to the end of the semester within each of the five categories of epistemological beliefs identified in the EBAPS.
First, the overall participant scores were compared to those of the interview subjects. After a comparison between interview subjects and the overall class based on quantitative scores, an attempt was made to briefly look at what might have changed, using the qualitative data from the interviews based on the epistemological beliefs within each variable.

Responses to the Personal Epistemological Beliefs Probes

On the subsequent pages, portions of the initial and final interview responses are presented and discussed concerning the participants' epistemological beliefs. The interview probes were designed using the EBAPS variables discussed in previous chapters. Each variable interview probe will be presented and discussed separately.

Structure of Scientific Knowledge

In the current literature on personal epistemology, the dimension structure of scientific knowledge is viewed as operating on a continuum that ranges from viewing scientific knowledge as an accumulation of concrete, discrete, knowable facts without much structure to viewing it as an interrelated network of strongly connected and highly structured concepts that are contextual, contingent, and relative.

Within this dimension the overall participant (N=56) pre-test mean was 2.172 (54.3) while the post-test mean was 2.488 (62.2) (see Table 41), with 36 participants improving their scores. The pre- and post-mean scores of the interviewed participants (N=20) were 2.090 (52.2) and 2.512 (62.8), respectively, with 16 participants improving their scores. This was also a category that quantitatively showed an above-average (> 0.32 or 7.9 points) increase in 30 of the 56 participants' scores and 10 of the 20 interviewed participants' scores. The gain on the "structure of knowledge" dimension is an indicator that some participants are moving away from a view of science as disconnected facts to one of science as a coherent body of knowledge.
Although increases were observed quantitatively (Table 43) for a majority of the interview participants, the difference in their understandings is best reflected in their initial and final interview responses in Table 44. In order to query participants' understanding of the structure of scientific knowledge, the interview question asked whether science (chemistry) was a weakly connected subject without much structure or a strongly connected and highly structured subject. Although initially the majority of the interviewed participants believed that science was a strongly connected and highly structured subject (ST 2, 3, 6, 8, 10, 12, 15-17, and 20), two felt science consisted mainly of learning facts and formulas (ST 7 and 14). Several participants initially indicated they believed that the structure of scientific knowledge was a combination, having structure but also involving many facts and formulas (ST 1, 4-5, 9, 11, 13, 18, and 19).

When comparing participants' initial interview comments with their initial EBAPS scores for the structure of scientific knowledge, several mirror each other. For instance, participants 1, 4, 7, 11, and 14 had initial scores in the poorly sophisticated range and reflected that range in their interview statements that science (chemistry) is a lot of facts and formulas, while participants 3, 15, 16, and 18 all had initial scores in the moderately sophisticated range and reflected that range in their interview statements that science (chemistry) was strongly connected and highly structured. Therefore, the majority of the participants' EBAPS scores were supported by their initial interview statements.
Table 43 EBAPS Structure of Knowledge Pre-Post Statistics

ID   Pre    Post   Difference
1    2.30   2.80   0.50***
2    1.80   2.05   0.25**
3    2.90   2.95   0.05**
4    2.20   1.85   -0.35*
5    1.20   1.90   0.70***
6    1.95   2.95   1.00***
7    1.65   1.80   0.15**
8    2.25   3.10   0.85***
9    2.00   1.95   -0.05*
10   2.00   3.50   1.50***
11   2.10   2.95   0.85***
12   1.95   2.60   0.65***
13   1.65   1.65   0.00*
14   1.75   2.05   0.30***
15   2.50   2.90   0.40***
16   2.65   3.40   0.75***
17   2.00   2.60   0.60***
18   2.50   2.15   -0.35*
19   1.85   2.40   0.55***
20   2.60   2.70   0.10**
*decrease or no change in score; **gain of 0.26 (6.5 points) or less; ***gain of more than 0.26

The final interviews reflect a shift in a few of the participants' beliefs. At the beginning of the semester course, 10 of the 20 interviewed participants believed that science was a strongly connected and highly structured subject (ST 2, 3, 6, 8, 10, 12, 15-17, and 20). By the end of the course, 17 of the participants held the belief that science is strongly connected and highly structured (ST 1-4, 6-13, and 16-20), while none of the participants felt science consisted mainly of learning facts and formulas. Two participants still indicated they believed that the structure of scientific knowledge was a combination, having structure but also involving many facts and formulas (ST 5 and 15).

When comparing participants' final interview comments with their final EBAPS scores (Table 43) for the structure of scientific knowledge, the majority of the participants' scores and interview comments mirror each other, while others present opposite views.
For instance, participants 2, 4-5, 7, 9, 13-14, and 18 had final EBAPS scores at the high end of the poorly sophisticated range, closer to the moderately sophisticated range. However, the majority of these participants reflected moderate beliefs in their final interview statements, as shown in Table 44; most of them stated that they believed science to be strongly connected and highly structured. This difference could be attributed to several factors, such as being distracted during the administration of the EBAPS (resulting in incorrect bubbling of an answer choice), misinterpretation of the EBAPS questions and/or answer selections, or their personal experiences in the chemistry lecture and laboratory course during the semester. Participants 12, 17, and 19 all had final scores in the moderately sophisticated range and reflected that range or higher in their final interview statements that science (chemistry) was strongly connected and highly structured. Participants 1, 3, 6, 8, 10-11, 15-16, and 20 had final EBAPS scores reflecting a moderately high to high epistemological belief that science is highly structured and strongly connected. All of these participants except participant 15 reflected that belief in their final interviews. Participant 15 felt that scientific knowledge has gray areas where it can be weakly connected in some areas and strongly connected in others.
Table 44 Participants' Reflections – Structure of Scientific Knowledge (N=20)

Initial and Final Epistemological Beliefs Interview Question 1
Structure of Scientific Knowledge – Science (chemistry) is a weakly connected subject consisting mainly of facts and formulas without much structure versus being a strongly connected and highly structured subject.

ST-1: "Mainly facts and formulas because everything has a reason for why it happens, i.e., chemical reactions. I would also say that it's coherent and highly structured. Almost everything in chemistry can be explained and applied." (In) "I would say it's strongly connected and highly structured. It all works together." (F)

ST-2: "I disagree. As I learn more chemistry I find the facts are connected. For instance the different types of characteristics in atoms and how they interact with each other to form new compounds. I believe chemistry can be understood. It is highly structured." (In) "Scientific knowledge is always changing but it is always connected. I think it is a combination. It is highly connected and structured but it allows for flexibility." (F)

ST-3: "I don't think science is a bunch of weakly connected pieces. My view of chemistry is that it is a very specific science. That it has strong foundations, rules and practices. I think it has a lot of structure. The difficulty is becoming familiar and comfortable with it." (In) "I definitely think it is strongly connected and highly structured." (F)

ST-4: "Absorbing information yes, but also practicing mathematical skills to better understand given information. For instance stoichiometry and balancing, finding properties, etc. all goes beyond just absorbing info. You definitely build upon your own individual knowledge base when studying sciences. Relating new experiences to old ones, and reflecting upon your own personal understanding is going on all the time." (In) "I would say the latter, strongly connected and highly structured." (F)

ST-5: "Yes, I do agree with all of this statement. All the facts and formulas in chemistry make more sense when they are connected. Chemistry is structured so that everything fits like a puzzle." (In) "I kind of fall in the middle because using formulas can involve absorbing while other concepts are actually truly structured." (F)

ST-6: "I think everything in science connects. For instance the periodic table. I'd say it's highly structured." (In) "It's more toward that all sciences are connected and structured." (F)

ST-7: "I agree, chemistry is a lot of work and memorization. There is a lot of formulas. Chemistry is a broad topic and refers to an abundance of information therefore it can not be placed in one category." (In) "I think scientific knowledge is all connected. One just needs to understand the knowledge to see how it is connected." (F)

ST-8: "I disagree, the knowledge of chemistry can be applied to many real life situations and is more than just facts and formulas. I would say chemistry knowledge is coherent and conceptual in the sense that it is all logically connected with basic concepts and ideas." (In) "Strongly connected and highly structured." (F)

ST-9: "No. Although chemistry is mainly composed of facts and formulas, there is structure behind it based on proven facts and experiments. I believe that it is highly structured." (In) "I think it's strongly connected and highly structured." (F)
ST-10: "I disagree because I think it has as much structure as math. I think it is a pretty organized body of knowledge. I think it's just as important and just as unified as any other kind of science and I think it's probably the most important science being that it is the basis for anything else, including biology." (In) "I say weakly connected without much structure is not something you normally think about as having to do with scientific knowledge, but in the beginning of any kind of scientific theory you don't really have all the pieces yet, so it would be weakly connected without much structure because it's something that hasn't been completely explored." (F)

ST-11: "Chemistry is much more than weakly connected pieces, although it has facts and formulas. There are many experiments that have taken place to support theories that are now helping us to improve things in the world i.e. technology. Chemistry is a combination, it is based on certain concepts but it is a unified whole knowledge." (In) "Strongly connected and highly structured." (F)

ST-12: "I think that when you go through the textbook it seems that science is mainly facts and formulas. However, when you perform the labs you see a lot more about how everything in science is tied together. I think it is a whole knowledge and unified knowledge because the concepts interrelate." (In) "All science is connected so it's strongly connected and highly structured." (F)

ST-13: "I think that chemistry knowledge is more about understanding how chemistry works. It is a lot about facts and formulas but it is more in depth than that. Chemistry is highly structured and conceptual in some aspects." (In) "I think it's strongly connected and highly structured." (F)

ST-14: "I don't think that it is weakly connected but I do believe that chemistry, to me, looks like facts and formulas. I am far from seeing the big picture. Highly structured knowledge, it seems like there is a detailed explanation for every formula or equation." (In) "I fall in the middle. However I think it's more toward being strongly connected and highly structured." (F)

ST-15: "No, I believe it is a very structured science with a lot to learn and understand besides memorizing facts and formulas. Highly structured, because you have to learn everything in steps to understand chemistry as a whole. I think science is easier to understand when it is highly structured." (In) "I think there were some gray areas where it was weakly connected and other areas where it was strongly connected and highly structured especially during laboratory activities." (F)

ST-16: "I would say no because chemistry is actually very based on theories. It's not weakly connected. It's all interrelated. And, it has a lot of structure. It's theoretical and it is a unified whole knowledge as it is all interrelated." (In) "Strongly connected and highly structured." (F)

ST-17: "No, I have used chemistry in a lot of my other classes, mostly biochemistry. But you need to have the basic understanding of chemistry to understand that, and the roles that chemistry plays in our lives. I see how it relates to other concepts and sciences. I guess that is the 'whole knowledge, unified' part to me." (In) "More structured and connected." (F)

ST-18: "Well, from what I've learned so far in chemistry everything is connected. Yes, it is facts and formulas, but it is more conceptual. It's understanding how it works, analyzing things as well. Well, I agree to this comment because it is highly structured, but yet there is room for interpretation." (In) "I would say strongly connected and highly structured." (F)


ST-19: "I believe that there is much more to chemistry and that everything does connect in some way. The facts and formulas help to explain why things happen the way they do. I do believe that chemistry knowledge is a unified whole science." (In) "I think it's definitely strongly connected and highly structured." (F)

ST-20: "I strongly disagree with that. I think it is based on a bunch of very painstakingly researched, much interconnected data. I think it is painstakingly structured with many facts and formulas that are difficult to keep track of. I agree with most of the statement. I think there is still a lot of theory out there that needs further unification. There is a lot of long standing knowledge that has been verified for years." (In) "I lean toward it being strongly connected and highly structured." (F)

Nature of Knowing and Learning Science

In the current literature on personal epistemology, the dimension of the nature of knowing and learning science is viewed as operating on a continuum that ranges from viewing learning science as consisting mainly of absorbing information, such as facts, to relying on constructing one's own understanding by working through the material actively, by relating new material to prior experiences, knowledge, and intuitions, and by reflecting upon and monitoring one's understanding.

Within this dimension the overall participant (N=56) pre-test mean was 2.511 (62.8) while the post-test mean was 2.760 (69.0) (see Table 41), with 30 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 2.569 (64.2) and 2.935 (73.4), respectively, with 14 participants improving their score. This category quantitatively showed an average (> 0.25 or 6.2 points) increase in the scores of 28 of the 56 participants, and of 12 of the 20 interviewed participants.
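The percentages shown in parentheses beside the axis means, and the asterisk flags used in Tables 45, 47, and 49, follow directly from the arithmetic described above. As a minimal illustration only, assuming that each EBAPS axis score runs from 0 to 4 and that the footnote convention of the tables is applied as stated, the conversion and flagging might be sketched as follows; the function names are hypothetical and are not part of the EBAPS instrument or the study's procedures.

```python
# Minimal sketch (assumption: each EBAPS axis score ranges from 0 to 4).
# percent() reproduces the parenthetical values in the text, e.g. 2.511 -> 62.8.
# gain_flag() applies the footnote convention of Tables 45, 47, and 49:
#   *   decrease in score or no change
#   **  gain of about 0.26 (6.5 points)
#   *** gain greater than 0.26
def percent(axis_score: float) -> float:
    """Convert a 0-4 EBAPS axis score to a 0-100 percentage."""
    return round(axis_score / 4 * 100, 1)

def gain_flag(pre: float, post: float) -> str:
    """Label a pre-to-post change with the asterisk convention."""
    diff = post - pre
    if diff <= 0:
        return "*"
    return "***" if diff > 0.26 else "**"

# Example using the overall pre/post means reported for this dimension.
print(percent(2.511), percent(2.760))   # 62.8 69.0
print(gain_flag(2.438, 3.938))          # *** (participant 10 in Table 45)
```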


The gain on the "nature of knowing and learning science" dimension is an indicator that some participants are moving away from a view that learning science is just about absorbing information and learning facts to one of constructing one's own knowledge by using prior knowledge, experiences, and intuition in order to reflect upon and monitor one's own understanding.

Although increases were observed quantitatively (Table 45) with a majority of the interview participants, the difference in their understandings is best reflected in the interview responses in Table 46. In order to query participants' understanding of the nature of knowing and learning science, the initial interview question asked whether learning science (chemistry) consisted mainly of absorbing information, or whether learning science relies on constructing one's own understanding, working actively through the material, relating new material to prior experiences and/or intuitions and/or knowledge, and reflecting upon and monitoring one's understanding. The majority of the interviewed participants believed that the nature of scientific knowledge was a combination of absorbing information as well as constructing one's own knowledge (ST 1, 3-6, 8-12, 14, 17, and 19-20). Several felt the nature of scientific knowledge consisted mainly of absorbing information (ST 7 and 18) while the remaining participants (ST 2, 13, and 15-16) indicated they believed that the nature of scientific knowledge was a result of constructing one's own knowledge through connecting prior experiences with new learning experiences.


When comparing participants' initial interview comments with their initial EBAPS scores (Table 45) for the nature of scientific knowledge, some of the initial scores for this axis are mirrored in the participants' initial interview comments while others were not. For instance, participants 1, 3, 6, 8-12, 17, and 19-20 all had initial scores in the moderately sophisticated range and reflected that range in their interview statements that the nature of scientific knowledge was a combination of absorbing information as well as constructing one's own knowledge. While participants 15 and 16 both had initial scores at the high end of the moderately sophisticated range, they reflected highly sophisticated views in their initial interview, stating that the nature of scientific knowledge was a result of constructing one's own knowledge through connecting prior experiences with new learning experiences. Even though participants 7 and 18 scored in the moderately sophisticated belief range, their comments in the initial interview reflected the belief that the nature of scientific knowledge involved mainly absorbing material.

Table 45 EBAPS Nature of Knowledge – Pre-Post Statistics

ID   Pre     Post    Difference
1    2.813   3.375    0.562***
2    2.375   2.813    0.438***
3    2.438   3.313    0.875***
4    3.063   2.938   -0.125*
5    1.563   1.813    0.250**
6    2.438   2.938    0.500***
7    2.375   2.813    0.438***
8    2.813   2.813    0.000*
9    2.813   2.750   -0.063*
10   2.438   3.938    1.50***
11   2.813   2.938    0.125**
12   2.688   2.500   -0.188*
13   3.000   2.438   -0.562*
14   2.063   3.188    1.125***
15   2.813   3.438    0.625***
16   2.813   3.563    0.750***
17   2.500   1.750   -0.750*
18   2.438   3.563    1.125***
19   2.813   3.130    0.317***
20   2.313   2.688    0.375**
* decrease in score or no change; ** 0.26 (6.5 points) gain in score; *** > 0.26 gain in score

By the end of the semester course the majority of the interviewed participants continued to hold the belief that the nature of scientific knowledge was a combination of absorbing information as well as constructing one's own knowledge (ST 5, 8-9, 12-14, 17, and 19-20).


Two participants expressed the belief that the nature of scientific knowledge consisted mainly of absorbing information (ST 18 and 20) while the remaining participants (ST 1-4, 6-7, 10-11 and 15-16) indicated they believed that the nature of scientific knowledge was a result of constructing one's own knowledge through connecting prior experiences with new learning experiences.

When comparing participants' final interview comments with their final EBAPS scores (Table 45) for the nature of scientific knowledge, once again some of the final scores for this axis are mirrored in the participants' final interview comments while others were not. For instance, participants 8-9 and 12-13 had final EBAPS scores in the moderately sophisticated range and reflected that range in their interview statements that the nature of scientific knowledge was a combination of absorbing information as well as constructing one's own knowledge. While participants 2 and 7 both had final EBAPS scores at the high end of the moderately sophisticated range, they reflected highly sophisticated views in their final interview, stating that the nature of scientific knowledge was a result of constructing one's own knowledge through connecting prior experiences with new learning experiences. Even though participant 20 scored in the moderately sophisticated belief range, her comments in the final interview reflected the belief that the nature of scientific knowledge involved mainly absorbing material. The final scores of participants 1, 3-4, 6, 10-11, and 15-16 were in the highly sophisticated range, reflecting their final interview belief that the nature of scientific knowledge involved constructing one's own knowledge. The final EBAPS scores of participants 5 and 17 suggested that the nature of scientific knowledge mainly involved absorbing facts and formulas. However, their final interview comments suggested they believed the nature of scientific knowledge was a combination of absorbing information as well as constructing one's own knowledge.


This difference could be attributed to several factors, such as distraction during the administration of the EBAPS resulting in incorrect bubbling of an answer choice, misinterpretation of the EBAPS questions and/or answer selections, as well as the participants' personal experiences in the chemistry lecture and laboratory course during the semester. A major difference between the EBAPS scores and final interview comments was reflected in participants 18 and 20. Even though their final EBAPS scores reflected moderately to highly sophisticated beliefs, respectively, about the nature of scientific knowledge, their final interview comments suggested otherwise. Both participants believed that the nature of scientific knowledge leaned more toward consisting of absorbing and memorizing information and facts.

Table 46 Participants' Reflections – Nature of Knowing-Learning (N=20)
Initial and Final Epistemological Beliefs Interview Question-2
Nature of Knowing and Learning Science – Learning science (chemistry) consists mainly of absorbing information, or learning science relies on constructing one's own understanding, working actively through the material, relating new material to prior experiences/intuitions/knowledge, and reflecting upon and monitoring one's understanding.
Quotation Comments

ST-1: "No, I think it's a combination of absorbing information and applying it to real life. I think that's why labs help when learning chemistry. You need to see how chemistry works in the world by relating it to yourself. Also, you can only learn the material by thinking of it in your own terms." (In) "It would be developing your own understanding. Everything in science is connected." (F)

ST-2: "Not alone, you have to be able to apply what you learn. One can learn more with hands on than trying to beat it in your head by memorizing. Yes, you to find a way to translate chemistry into your own language so that you can learn and apply." (In) "You can memorize all of you want but if you can't apply it you are going to struggle with chemistry." (F)

ST-3: "Of course one has to absorb the information, but a key to learning science is being able to analyze the data and form conclusions. In addition, critical thinking is necessary. One has to absorb the information, analyze, reflect and draw conclusions so that it can be applied later on. I think that learning chemistry relies on understanding material and being able to relate it to other experiences. I would also say that it involves reflecting and monitoring understanding." (In) "The least effective way for me to learn science is by absorbing or memorizing information in order to just remember facts. But by applying prior knowledge helped me to really understand." (F)


ST-4: "Absorbing information yes, but also practicing mathematical skills to better understand given information. It all goes beyond just absorbing information. You definitely build upon your own individual knowledge base when studying sciences. Relating new experiences to old ones, and reflecting upon your own personal understanding is going on all the time." (In) "I would say constructing one's own knowledge." (F)

ST-5: "No, I do not agree because it is not just absorbing the information. The information must be experienced in lab. It is more of a doing experience." (In) "I am probably in the middle again. Some things like learning the chemical formulas would be constructing knowledge while learning to use the temperature probe involved absorbing information." (F)

ST-6: "Yes and no. I mean, you have to absorb and memorize a large quantity of facts and formulas. However, you need to be able to apply it. So, it's not all just absorbing the material. But you're not going to know how to apply it unless you practice the concepts. You definitely have to work through the material." (In) "I lean more toward relating new material to prior knowledge. You can learn memorize all of the material that you want but you may not be able to apply it. One needs to know how to think and solve problems versus just trying to memorize." (F)

ST-7: "Yes, chemistry involves a lot of facts and formulas. A person must get past one problem in order to proceed to the next. In order to obtain the maximum of information one must do all of the following: one must know what they are studying, how to work through the problems and relate the information to other areas." (In) "I think one has to construct one's own views to understand science. I think using real life experiences help to understand science." (F)

ST-8: "Yes, but I also believe experiencing that information through laboratory work also plays a big role in learning chemistry. The relation of new material to past experiences and one's own knowledge and understanding of the subject is essential in the analysis and understanding of new found data." (In) "A combination of absorbing information and constructing knowledge. One still needs to relate the new concepts to prior knowledge so you can combine the knowledge to construct understanding." (F)

ST-9: "Yes. Because you will have to use that information later on as it never goes away. It always comes back to the basics in science. You have to build up your understanding and shape it. You also have to be able to analyze and understand each method." (In) "I fall in the middle as I believe some people learn better by memorizing while another group understand better by like rewriting or rephrasing it in their mind." (F)

ST-10: "I think in the beginning your probably absorbing information, but once you learn basic rules and how to apply them to chemistry then you can absorb less and apply more. Once you learn something and you understand it then you check yourself every time you apply it. You build knowledge as you learn different steps, it's like it's a building block." (In) "You can only absorb and memorize so much. For instance in math and science you have building blocks. You have to understand an earlier concept to understand or move onto the next concept or formula. I think that you have to construct your own understanding by relating new material to prior knowledge, experiences and actively work with the new material." (F)

ST-11: "True, but the information absorbed can be used to develop new concepts in the long run. You have to understand how to do conversions and learn the basics of chemistry in order to go any farther in the subject. Yes one works actively through the material so that it can be learned and used to gain further knowledge and if you relate new material to past experiences and knowledge you may then understand why something did not go right." (In) "Constructing one's own knowledge helped with my learning." (F)


ST-12: "I agree with that because you do have to memorize a lot of formulas and understand a lot of concepts and how to calculate. I think you have to create your own understanding but you can do that by working through the material. For instance you can read the book and create your own understanding of the text. In addition, when you perform the experiment you learn more through working with the material." (In) "Well, learning science is connected to one's prior knowledge." (F)

ST-13: "It is about learning and understanding the information. You have to develop your own understanding. You need to be able to reflect on everything you did and know as well as relate things you know to things you are learning." (In) "A combination of absorbing information and constructing knowledge in order to understand." (F)

ST-14: "Yes, comprehending detail is very important. Working through the material as well as relating prior knowledge is how I rely on learning chemistry. It can be very difficult trying to relate prior knowledge when it has been years without lab experience. Slowly small things come back to me both in the lecture as well as the lab." (In) "I would say absorbing and memorizing information because that's how I learn. However, prior knowledge and prior experiences play a role. But, I'm starting to recognize chemistry in every day life and I never did that before." (F)

ST-15: "It is not enough to just absorb information. If you cannot apply the information then there is no point in absorbing the information. All of those are essential to learning science." (In) "You have to relate your prior knowledge to completely understand." (F)

ST-16: "No. You can't just memorize chemistry. You have to actually understand the concepts behind it in order to learn. Chemistry takes practice to understand it. Everything you have learned prior is connected to your new learning." (In) "I believe in using one's prior knowledge in order to construct one's own understanding." (F)

ST-17: "I think a student could easily make it through a basic chemistry course by absorbing the information, but to actually learn it you need to conceptualize it and understand it, especially if they plan on taking any other science classes." (In) "In lecture it involves absorbing information or knowledge while in laboratory one applies prior and new knowledge for understanding." (F)

ST-18: "It is a lot of memorization but, it's also something I like about science. Sometimes there isn't a yes or a no answer. You can analyze it but if one experiences something they'll probably remember it." (In) "I would say it consists mainly of absorbing, memorizing information and facts." (F)

ST-19: "Yes, I agree because you must learn the basic material to move onto the harder material. This is why labs are good because they make you think and reflect on why certain things happened." (In) "Learning science requires you absorb the information. However, the repetition helps one understand." (F)

ST-20: "I think it is learning in action. This makes it a more realistic experience. I have been able to relate it to a lot of things I do at work. Especially the reactions, pH blood gasses and IV fluids that I am administering. I think it is an action science. I think you have to construct your own conceptual framework so that you can understand the material that is there. Hopefully the material can be interrelated with our life, experiences and prior knowledge." (In) "As it pertains to this course I'd say I lean more toward it consisting mainly of absorbing and memorizing the information." (F)


Real-Life Applicability of Science

In the current literature on personal epistemology, the dimension of the real-life applicability of science is viewed as operating on a continuum that ranges from the view that science is applicable to everyone's life inside and outside the classroom or laboratory to the view that it is an exclusive concern of the scientific world.

Within this dimension the overall participant (N=56) pre-test mean was 2.665 (66.6) while the post-test mean was 2.978 (74.4) (see Table 40), with 35 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 2.788 (69.7) and 3.138 (78.4), respectively, with 16 participants improving their score. This was also a category that quantitatively showed an above average (> 0.31 or 7.8 points) increase in the scores of 32 of the 56 participants, and of 13 of the 20 interviewed participants. The gain on the "real-life applicability of science" dimension is an indicator that some participants are moving away from the view that science only belongs in the realm of scientists to one that science is applicable to everyone's daily lives.

Although increases were observed quantitatively with a majority of the interview participants (Table 47), the difference in their understandings is best reflected in the initial and final interview responses in Table 48. In order to query participants' understanding of the real-life applicability of science, the initial and final interview question asked whether scientific knowledge and scientific ways of thinking applied only to the classroom and laboratory settings, not to real life.


In the initial interview the majority of the participants stated that they believed that science is always applicable to their everyday life (ST 1, 3-9, 11-13, and 15-20) while 3 participants (ST 2, 10, and 14) indicated that in certain cases it applied more to a classroom or laboratory setting. None of the participants felt that science was only applicable to a classroom or laboratory environment.

When comparing participants' initial interview comments with their initial EBAPS scores (Table 47) for the real-life applicability of science, most of the initial scores for this axis are reflected in the participants' initial interview comments. For instance, participants 1, 4, 8-9, 11-13, and 15-20 all had initial scores in the moderately to highly sophisticated range and reflected that range in their interview statements that scientific knowledge applied to real life, not just the classroom or laboratory setting. Participants 3, 5, and 7 had initial EBAPS scores in the poorly sophisticated range; however, in their initial interview they each stated that scientific knowledge was applicable to real-life situations. As suggested earlier, this discrepancy could have been due to several factors including misinterpretation of the question and/or possible answers or incorrect bubbling of an answer choice, as well as their personal experiences in the chemistry lecture and laboratory course during the semester.


Table 47 EBAPS Real Life Applicability – Pre-Post Statistics

ID   Pre    Post   Difference
1    3.88   3.50    0.12**
2    3.13   2.13   -1.00*
3    2.25   2.50    0.25**
4    2.63   2.63    0.00*
5    1.50   2.00    0.25**
6    2.25   3.63    1.38***
7    2.25   2.63    0.38***
8    3.00   3.25    0.25**
9    2.88   3.38    0.50***
10   2.50   3.25    0.75***
11   2.63   3.38    0.75***
12   3.50   2.50   -1.00*
13   3.38   3.38    0.00*
14   2.88   3.38    0.50***
15   3.25   4.00    0.75***
16   2.88   3.75    0.87***
17   3.00   3.63    0.63***
18   2.88   2.50   -0.38*
19   2.63   3.50    0.87***
20   3.00   3.88    0.88***
* decrease in score or no change; ** 0.26 (6.5 points) gain in score; *** > 0.26 gain in score

The final interviews reflected a shift for two of the participants (ST 10 and 14) from a view that scientific knowledge is applicable more often in the classroom or laboratory to one that it is often applicable to real life. The majority of the interviewed participants in the final interview still supported the belief that science is always applicable to their everyday life (ST 1, 3-9, 11-13 and 16-20) while 1 participant (ST 15) indicated that in certain cases it applied more to a classroom or laboratory setting and in other cases to real life. None of the participants felt that science was only applicable to a classroom or laboratory environment.

When comparing participants' final interview comments with their final EBAPS scores (Table 47) for the real-life applicability of science, most of the final scores for this axis are mirrored in the participants' final interview comments.


For instance, participants 1, 3-4, 6-14, and 16-20 all had final EBAPS scores in the moderately to highly sophisticated range and reflected that range in their interview statements that scientific knowledge applied to real life, not just the classroom or laboratory setting. The final scores of participants 2 and 12 decreased from highly sophisticated to poorly and moderately sophisticated beliefs, respectively. However, this sophistication level was not reflected in their final interview comments, as they each stated that scientific knowledge was applicable to real-life situations. As suggested earlier, these discrepancies may have been due to several factors including misinterpretation of the questions and/or possible answers or incorrect bubbling of choice, as well as their personal experiences in the chemistry lecture and laboratory course during the semester.

Table 48 Participants' Reflections – Real Life Applicability of Science (N=20)
Initial and Final Epistemological Beliefs Interview Question-3
Real-life Applicability of Science – Scientific knowledge and scientific ways of thinking apply only to the classroom and laboratory settings, not to real life.
Quotation Comments

ST-1: "It applies to the real world more than anything else. Everything in the world is linked to science such as all matter is made up of elements." (In) "It definitely applies to everyday life." (F)

ST-2: "No, people are able to apply it outside. However, the scientific way of thinking is more investigative, therefore more accurate. You acquire more thinking and problem solving skills. Not enough people are able to apply the knowledge." (In) "It depends. The reason is I believe one could get away with not applying chemistry to anything in life. However, my neighbor is gifted in chemistry. We'll be sitting there talking about diet. He discusses how the foods chemically work with my body to lose the weight. You can apply it but whether you do apply it depends on whether you want to or not." (F)

ST-3: "I would imagine that it would apply to real life. Chemistry involves a lot of analytical thinking, which can be applied in every aspect of life. It also involves a lot of detail and specific and accurate data/results which can also apply to life outside the lab. Well there is the more broad approach that could be used in problem solving. Just the other night I was talking to my boyfriend and he spilled olive oil. We started discussing how soap breaks up the compound of the oil. My mom and I were discussing how penicillin was made and how it cured so many people." (In) "I think it would be easy for one to think chemistry is just in the classroom and doesn't have anything to do with real life. Both professors state that when they view a traffic light they picture the LEDs firing or when the weather changes they check their tires to see if air (gas) needs to be added as gases expand and contract as the temperature changes. Now I see that science is part of our daily life." (F)


ST-4: "I disagree; there are real life situations where scientific ways of thinking are used. Diagnosing a problem with your car is one off the top of my head that I just used this weekend. I would say scientific knowledge is used a lot in the lab/classroom as well as real life, but would lean a little more towards being used more in a lab or classroom." (In) "I would say to real life." (F)

ST-5: "No, I believe that classroom and lab have an effect on real life. Like how to take care of the environment for getting rid of PCB's." (In) "First one has to understand the concepts which require learning experiences in the classroom. Then one can go out and apply it to the real world." (F)

ST-6: "No, there are scientific ways in every way of thinking. It's about evaluating things. We use science in everything. It's about gathering data or ideas and combining it all. No matter what you do you always kind of look at all your options to make a decision and even in everyday life. Maybe there are several possible methods that might work and you try them all if necessary." (In) "I use science every day. It is not restricted to the classroom as we use it in all kinds of daily situations. For instance the heat transfer concept involved in a hot water heater." (F)

ST-7: "Not true, science is everywhere. People can apply scientific ways of thinking to everyday activities. When cleaning it is important to know what you can and cannot mix and how much solution is needed. Being able to understand that is basic science." (In) "I think scientists and non-scientists use science everyday." (F)

ST-8: "No, scientific thinking is used in every day life. For instance, most of the household products we find in our homes could be made in the chemistry lab very easily. We just tend to overlook things the way they can be seen scientifically." (In) "Applies to real life situations." (F)

ST-9: "No, it applies to everyday life. Having an understanding of science that we learn inside the classroom or laboratory, allows us to understand the science in everyday life." (In) "It applies to real life situations." (F)

ST-10: "Well, that depends on your career choice. I think it does apply to real life. Other than that, I think that it would apply mostly to your career choice. Scientific knowledge and ways of thinking would be most helpful in your career if it has to do with a scientific field, but I think in real life it does apply but not as much as it would in a career." (In) "I don't agree. Scientific knowledge isn't restricted to just the classroom or laboratory. Once you have learned scientific knowledge in a classroom one can apply it to everyday life." (F)

ST-11: "No, we use chemistry in our everyday lives; the air we breathe and the things we eat and drink, they all have to do with chemistry." (In) "I believe it applies to the real world. Everything in life deals with science. For instance from starting the car, producing electricity, and eating, everything relates to chemical processes." (F)

ST-12: "No because you can use scientific thinking in your everyday life like reading the back of a shampoo bottle to see the ingredients. That involves scientific thinking. However, they are more specifically used in the classroom or like at a pharmacy." (In) "It applies to everyday situations. For instance the demonstration of how fireworks are produced relates to real life." (F)

ST-13: "I do not agree with that because scientific knowledge is used all over the world in everyday life, not just in the classrooms and laboratories. Scientific knowledge is used in everyday life. For instance when people are cooking, the use of temperature and how things react with each other. I think that we all use knowledge and scientific ways of thinking." (In) "Scientific knowledge is applicable to both." (F)


ST-14: "I know science does apply to real life. I am just not aware of it at all times. Something as simple as a physical change." (In) "It includes real life situations as chemistry is everywhere. It's not only restricted to the classroom or lab. For instance it is involved in such things as one's diet and health." (F)

ST-15: "No, science applies to everyday life. Life involves strategically taking apart all the pieces and figuring them out and being able to analyze situations as one does in the classroom or lab. In lab you need to analyze your data and, if something went wrong, sort through and figure out what went wrong. For instance if you took apart the brakes on your bike to fix them then put them back on and they do not work, one has to figure out what went wrong to fix the brakes and avoid that same problem the next time." (In) "Some aspects are restricted to the classroom but more are applicable to everyday life." (F)

ST-16: "No. Everything is chemistry. Some real life examples are the desk, traffic lights with LEDs, blinkers, and gas to run our cars." (In) "Think about the chemicals you clean with in your home. That's science being used in an everyday real life situation. I don't know if you've seen that commercial on TV, the chemistry one." (F)

ST-17: "It definitely applies to real life, science is all around us. It's in the plants outside, the weather, the food we eat, its everywhere so there is no way we can say its limited to a lab." (In) "Applies to everyday real life situations as demonstrated in laboratory." (F)

ST-18: "That's not true. You deal with science everyday in just starting your car. Even though some people don't realize its science, everything involves science." (In) "Applies to everyday real life situations. For instance you become more familiar with chemicals and one can apply that knowledge to household cleaning supplies, learning that some of them are more hazardous than others." (F)

ST-19: "No. I disagree because science is all around us everyday. For instance the air we breathe and the water in which we swim or drink. Making plastic and we use plastic everyday. We just do not often think of it in that way, that it's science." (In) "We learned to understand why science related to real life. For instance the activity with fireworks can be applied outside the classroom or laboratory." (F)

ST-20: "Those are the very things that are my real life. We are science: biology, physics, chemistry, mathematics. For instance, analyzing my patient's blood work post-operatively. Deciding which IV fluids should be hung, determining if their urine output is sufficient, whether they need a fluid challenge, monitoring their vital signs and their physiological changes to determine if they are stable or going into shock postoperatively, determining a blood loss, adjusting my ventilator setting based on blood gasses." (In) "I lean strongly to it applying to everyday real life." (F)

Evolving Scientific Knowledge

In the current literature on personal epistemology, the dimension of evolving scientific knowledge is viewed as operating on a continuum that ranges from viewing scientific knowledge as absolute, "set in stone," to viewing it as changing and dynamic. This dimension also considers the justification and source of knowledge in terms of the evaluation of evidence and the opinion of experts.


Within this dimension the overall participant (N=56) pre-test mean was 2.357 (58.9) while the post-test mean was 2.804 (70.1) (see Table 41), with 29 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 2.150 (53.8) and 2.783 (69.6), respectively, with 14 participants improving their score. This was also a category that quantitatively showed an above average (> 0.45 or 11 points) increase in the scores of 27 of the 56 participants, and of 12 of the 20 interviewed participants. The gain on the "evolving scientific knowledge" dimension is an indicator that some participants are moving away from a realist view of science being "set in stone" to a more instrumentalist position that science changes over time.

Although increases were observed quantitatively (Table 49) with a majority of the interview participants, the difference in their understandings is best reflected in the interview responses in Table 50. In order to query participants' understanding of evolving knowledge in science, the initial interview question asked them to react to the following: (A) All scientific knowledge is set in stone. (B) There is no difference between scientific evidence-based reasoning and mere opinion. (C) Sometimes different science instructors give different explanations for scientific events/concepts/phenomena. When 2 instructors explain the same thing differently, can one be more correct than the other? Explain. (D) When 2 explanations are given for the same situation, how would you go about deciding which explanation to believe? Please give details and examples. (E) Can one ever be sure of which explanation to believe? If so, how can you? If not, why not?


Initially, nineteen of the interviewed participants (ST 1-8 and 10-20) agreed that scientific knowledge is not set in stone, that there is a difference between opinion and evidence-based reasoning, that one explanation can be more justified than another but not necessarily incorrect, and that one needs some type of supporting documents other than a textbook in order to determine which explanation to believe. However, one participant (ST 9) felt that scientific knowledge is set in stone and would use the textbook as the first source for deciding which explanation to believe.

When comparing participants' initial interview comments with their initial EBAPS scores (Table 49) for their understanding of evolving knowledge in science, a few of the initial scores for this axis are mirrored in the participants' initial interview comments. For instance, ten participants (ST 2, 4-6, 9, 14-15, and 18-20) all had initial scores in the moderately to highly sophisticated range that aligned with their initial interview reflections that scientific knowledge is not set in stone, that there is a difference between opinion and evidence-based reasoning, that one explanation can be more justified than another but not necessarily incorrect, and that one needs some type of supporting documents other than a textbook in order to determine which explanation to believe. However, some of the scores did not reflect the participants' interview comments and vice versa. The initial EBAPS scores for the remaining participants (ST 1, 3, 7-8, 10-13, and 16-17) fell in the poorly sophisticated and unsophisticated range while their initial interview comments suggest moderately to highly sophisticated beliefs. For instance, all of the aforementioned participants stated that scientific knowledge was not set in stone and was constantly changing as technology improved. However, participant 9 scored in the moderately sophisticated range (2.67), which conflicted with her interview statement that scientific knowledge is set in stone. As suggested earlier, discrepancies between EBAPS scores and interview statements may have been due to several factors including misinterpretation of the questions and/or possible answers or incorrect bubbling of choice, as well as their personal experiences in the chemistry lecture and laboratory course during the semester.


The final interviews reflected a shift in only one of the participants' beliefs (ST 5), from totally supporting the view that scientific knowledge is not set in stone to a more moderate position that in some cases knowledge may be set and unchanging. The remaining participants' final interview reflections remained unchanged from their initial interview.

Table 49 EBAPS Evolving Knowledge – Pre-Post Statistics

ID   Pre    Post   Difference
1    2.00   1.67   -0.33*
2    2.33   1.67   -0.66*
3    1.67   2.67    1.00***
4    3.00   3.00    0.00*
5    2.67   2.67    0.00*
6    2.67   3.00    0.33**
7    2.00   3.00    1.00***
8    2.00   3.00    1.00***
9    2.67   2.67    0.00*
10   1.33   2.67    1.34***
11   2.00   3.33    1.33***
12   1.33   1.67    0.34***
13   1.67   2.33    0.66***
14   2.67   3.67    1.00***
15   3.33   4.00    0.67***
16   1.33   2.67    1.34***
17   1.33   2.67    1.34***
18   2.33   4.00    1.67***
19   2.33   2.33    0.00*
20   2.33   3.00    0.67***
* decrease in score or no change; ** 0.26 (6.5 points) gain in score; *** > 0.26 gain in score

However, when comparing participants' final interview comments with their final EBAPS scores (Table 49), there were decreases, increases, or no change in participant scores.


For instance, two participants' (ST 1-2) EBAPS scores decreased; however, they still both supported the view that scientific knowledge is not set in stone, that there is a difference between opinion and evidence-based reasoning, that one explanation can be more justified than another but not necessarily incorrect, and that one needs some type of supporting documents other than a textbook in order to determine which explanation to believe in their final interviews. The final EBAPS scores of participants 4-5, 9, and 19 remained unchanged, as did their views from the beginning of the semester. The final scores of participants 3, 6-8, 10-18, and 20 all increased by the end of the semester, supporting their interview views that scientific knowledge is not set in stone, that there is a difference between opinion and evidence-based reasoning, that one explanation can be more justified than another but not necessarily incorrect, and that one needs some type of supporting documents other than a textbook in order to determine which explanation to believe. As suggested earlier, discrepancies between EBAPS scores and interview statements may have been due to several factors including misinterpretation of the questions and/or possible answers or incorrect bubbling of choice, as well as their personal experiences in the chemistry lecture and laboratory course during the semester.

The tentativeness of scientific knowledge, the differences between opinion and evidence-based reasoning, and the need for evidence are the concepts that some participants struggled with throughout the course, as indicated in the pre-post interviews. The need to perform the laboratory activities prior to the lecture discussion of the concepts and theories surrounding the material may improve the participants' views on evolving knowledge. The participants, knowing the basis of the theories surrounding the laboratory concepts, may have tried to fit the data to the theory instead of considering the probable reasons for the data, making a "perfect" fit.


Table 50 Participants' Reflections – Evolving Knowledge (N=20)
Initial and Final Epistemological Beliefs Interview Question-4
Evolving Knowledge – A) All scientific knowledge is set in stone. B) There is no difference between scientific evidence-based reasoning and mere opinion. C) Sometimes different science instructors give different explanations for scientific events/concepts/phenomena. When 2 instructors explain the same thing differently, can one be more correct than the other? Explain. D) When 2 explanations are given for the same situation, how would you go about deciding which explanation to believe? Please give details and examples. E) Can one ever be sure of which explanation to believe? If so, how can you? If not, why not?
Quotation Comments

ST-1: A) "I disagree. Theories change all the time. It was once thought that God created everything but science has brought up the theory of evolution." B) "False, because opinions could be founded on ignorance whereas if it's scientific evidence based, it's concrete truth." C) "Not really. I think that everyone learns differently and how one teacher explains it to a student could be much clearer than if another teacher explained it. Both teachers would be equally correct." D) "When you're converting one unit to another. One teacher could tell you to move the decimal place and another could tell you to multiply by factors of 10 depending on the conversion. I think it depends on the student, but I just move the decimal place. I think it's easier that way. Another student might think the other way would be easier." E) "I don't think there is going to be a "right" way. If you can get the same answer both ways, one should just use the method that is easiest for them." (In) "I don't believe scientific knowledge is set in stone. I think that science is experimental. There might be a theory that is disproved by something or somebody. So, everything is sort of coming and going. In 100 years we could believe completely different things than what we believe today." (F)

ST-2: A) "No, one of the main cornerstones of science is that one can disprove something. Therefore science is always changing." B) "No, there is a lot of difference. Opinion is based on a belief system designed by family life, religion, society and science if you think that way. Scientific evidence is based on experiments performed many times with many counter experiments to disprove, which is always changing." C) "One could be more right than the other, however, in nature there are many ways something can happen. For example, the dinosaurs." D) "Some say the dinosaurs became extinct because of meteors crashing to earth and because of the dust suffocated the dinosaurs, others believe the meteor hit, killed plant life etc. You can decide on what you conceive as more believable based on your life experiences, research the topic to find many other points of view, and draw your own conclusions." E) "I think one can only be sure within themselves, what you believe is up to you, if you are in position though to, you need to prove it." (In) "I don't believe science is set in stone as it is constantly changing. I don't believe that we are supposed to know everything. What happened 100 years ago may not apply to now." (F)


ST-3: A) "Well, the first thing that I think of is how much has changed scientifically over the course of history. More than thousands of new discoveries have been made throughout time; if it was set in stone there would be no room for more knowledge. I think there are certain core principles but there is always room for growth and more development." B) "Well that makes me question their results. Opinions can be given perhaps in the form of a hypothesis, but evidence based reasoning should always be separated from opinion to gain true scientific results." C) "I think that it would depend on the concept being explained. If it was the result of an experiment and one professor analyzed it differently, I wouldn't know if he/she would be less correct than the other." D) "Yes, that could be tricky. Well at that point, I would have to use different resources available to come to a decision. I could look it up at the library, online, or ask other students or professors, for examples." E) "I think so, once you have come to your own conclusion, you would have to choose to reject one explanation and accept the other. If they were in fact so different to begin with." (In) "Whenever I hear this I always think back years ago to when they thought the earth was flat. Then over time that view changed. I think there are definitely some things that are set in stone so I'm kind of in the middle. There are ground rules." (F)

ST-4: A) "Not true. New discoveries are being made all the time. Physicists/astronomers and other scientists work on problems without solutions all the time." B) "I disagree. Evidence-based reasoning includes experimental data to prove a point whereas opinions are not necessarily as factually supported." C) "I don't know about being "more correct", but I think instructors vary in their clarity and explanation of a topic. So I wouldn't say it's a matter of correctness most of the time, but rather a degree of clarity and success at teaching or conveying ideas." D) "I suppose some trial and error with data collection should be used. Also trying to get some third party explanations would be good as well. Just going to other sources for information or explanations. Other professors or info sources like the library or Internet." E) "I think yes, if you physically try to justify an answer or explanation through experimental trial and error yourself, and come up with identical data and conclusions. Also getting verification through interrelated concepts that support the initial topic." (In) "I don't think science is set in stone. I think there's a difference between mere opinion and evidence-based reasoning. Evidence based reasoning involves testing a theory or making observations based on experimental procedures. Then coming up with data and results that explain what's happening." (F)

ST-5: A) "No, I do not believe scientific knowledge is set in stone because the matter on the earth and in the universe still have mysteries to solve no matter how big or small they are to science." B) "No, there is a difference. Reasoning is based on what is truly understood and opinion is based on one's own perceptions." C) "No, the instructors may have learned different things throughout their lives through their own instructors in the past. I must just accept the right one that is easiest for me to understand." D) "The explanation that I go for is the one in the simplest form." E) "Yes, because one person might go the long hard way and end up with an answer, and another person might use the short easy way and still end up with the same answer as the first person." (In) "I'm in the middle again as everything in life changes. For instance the information on black holes in space. On the other hand science is merely human thought. Some knowledge may change while other knowledge may not change." (F)


ST-6: A) "No. Science is always changing. New things are always being discovered. So, I wouldn't say that it's set in stone. Well, a formula might be set in stone, but there are always new discoveries. Things change such as the number of planets. Life's always changing." B) "Yes. Mere opinion could be based on anything but scientific evidence is where you just have supporting evidence that backs up your opinion by experiments. You can have an opinion about anything and not really have anything to back it." C) "No. If they're explaining the same thing, I don't think one is more correct than the other. Maybe one has a way of explaining something that you can understand better. Everybody understands concepts differently." D) "Well, I would just go with the one that made sense to me and use that." E) "Well, I think it's important to always question things. I never just believe something without questioning. Then one can decide which one has the most evidence to back it and that makes sense to you. Scientific evidence that offers support for that explanation. So if you have two explanations, you come up with points that can support each and decide which one has more evidence." (In) "It is not set in stone; things are always changing and things are evolving. Things that may have been true before may not be true now. Opinions are based on personal beliefs while evidence-based reasoning involves discovery." (F)

ST-7: A) "No, science is changing every day. What was considered fact years ago is not necessarily fact today. Like the earth being flat; that was once the case however with the increase of knowledge and information we found that that was not the case." B) "There is a clear difference between scientific evidence and a mere opinion. An opinion is something someone personally believes and does not necessarily need to be proven. While scientific evidence has been researched and can be proven." C) "One might be easier to understand than the other but not necessarily more correct." D) "Whichever one I understood better; I would go with the explanation that best described the situation in a more scientific manner that justifies its reasons with facts and hypothesis." E) "Not really, you just have to go with your best judgment." (In) "I don't think it is set in stone as science changes all the time. If it is based on opinion there may or may not be some facts to support the opinion." (F)

ST-8: A) "No, scientific knowledge should be tested to the point of exhaustion in order to determine its truth." B) "No, there is a big difference. With scientific evidence-based reasoning, you have specific data and supporting evidence to reason your conclusions. A person's opinion has no evidence, it's just a hypothetical conclusion based on what one thinks." C) "I don't think it's an issue of correctness. I think that each professor knows what they are trying to get across and merely achieves this by their own explanation and method." D) "I would believe the one that was closest to my own understanding and knowledge. If the two explanations are very similar and differ just slightly, I would look to a third source, say another instructor, or through research on the subject. Testing the data presented to me by searching for what others have concluded about it." E) "Yes, if the right explanation has been found to be absolute truth through observation or experimentation." (In) "I wouldn't say all scientific knowledge is set in stone. I don't think it is all set in stone; as technology progresses things are modified." (F)


ST-9: A) "Yes, it is set in stone. It has been researched and studied by professionals who have been able to scientifically prove it." B) "Yes, there is a difference. Evidence is something that has been proven, to back up a theory. Opinion is one's own personal belief on whether or not something is true." C) "They can be equally correct. Everyone understands science differently and has a different way of explaining it. Therefore although different explanations may be given, they can still mean the same in the end." D) "Whichever one I could relate to better or I would create my own so that I could understand it more clearly." E) "By referring to the text book. Well if 2 explanations are given, if you are unsure about them just look up the concept in the text book and read that. Or you could ask the professor to explain it to you as an individual." (In) "I believe that scientific knowledge is set in stone." (F)

ST-10: A) "I disagree with that because I think all scientific knowledge is based on theories which although are accepted as basic truth are always apt to change." B) "There is a difference because one is educationally based reasoning and then the other is just an opinion probably based on ones own beliefs or religion or whatever, they happen to think. Whereas one is actually applying knowledge which is different than just throwing out an opinion that may or may not be accurate." C) "Possibly. I couldn't really say unless I knew what they were talking about. I think there are different explanations for all sorts of things and it doesn't mean that necessarily one's more correct than another." D) "Probably look at critics of both points of view and decide then which would be either the less critiqued one or the most reasonable seeming explanation. Well, I think then in order to decide whether you believe something or not, you have to look at both sides. You have to look at their critiques and you have to look at the support or look at the research. I find a lot of information that comes from either schools or classrooms or other online classes." E) "I feel whichever one presented the information in the most factual manner and with limited opinion." (In) "Even though most scientific knowledge is well backed up it is still based on theory. So, nothing is set in stone. There's a distinction between evidence based reasoning and mere opinion. Evidence based reasoning is a result of experiments and theories." (F)

ST-11: A) "New knowledge is showing up every day." B) "Wrong, scientific based evidence is generally proven through numerous experiments while opinions are only what someone thinks and hasn't necessarily been experimented with." C) "No, both are equally correct; since science is changing all the time, there are no real right answers. One may base his explanation on one theory while the other bases his on another theory." D) "I would experiment with both ways that were explained to me and see which one I better understand." E) "No, science is always changing. Both instructors could be right; they just explain the concept differently." (In) "I don't believe all of science is set in stone as science is changing all the time. There is room for change." (F)


ST-12: A) "No, because most of the knowledge we have about science is represented by theories. So everything is subject to change." B) "Yes and no, because you can formulate an accepted theory based on clear scientific evidence but if someone else were to look at that same data they might interpret something different and that's where the mere opinion comes in." C) "No, because as long as both the instructors know what they are talking about and give valid explanations of a concept then they both can be right; they just explain it in different ways." D) "I would believe the one that I can relate to the most. For instance if one instructor gives an explanation that I understand then I am going to believe that one. It's like whatever explanation is easier for me to wrap my head around." E) "No, not really, because everyone is going to interpret data in different ways. Many different explanations might be believable; it just depends on who wrote the interpretation." (In) "No, science is not set in stone. Everyone has their own opinion. Scientists interpret the data in different ways." (F)

ST-13: A) "I don't think that is true, because many new scientific things are being discovered, explored and changed. Scientific knowledge is not set in stone. Scientists are discovering new things everyday about science." B) "An opinion is the way someone feels while scientific evidence-based reasoning is more about fact, what is already known, and has been researched." C) "I think that each person is different and views things differently. That's why it could be explained different. I think one can be better than the other depending on the reasoning." D) "I would go with whichever one made more sense to me or matched my reasoning. For instance if teacher A and B were explaining different ways to do a problem I would try both methods and whichever one worked for me I would use. Whichever method makes more sense to me." E) "I don't think that you can ever be sure on which explanation to believe unless you see how it works for yourself, and honestly believe that there is no other way that would work." (In) "Again I am in the middle; however I do not believe all scientific knowledge is set in stone. There is a difference between opinion and evidence based reasoning." (F)

ST-14: A) "Change occurs." B) "One should take evidence-based reasoning over mere opinion." C) "If they are explaining the same thing and their bottom line is the same then I don't think that one would be more correct than the other." D) "I would review both explanations and believe the one that makes the most sense to me. It may not be the right one, but if it's the one I understand the most then I will believe that one." E) "No, one can not ever be sure because an explanation is based on someone else's studies not your own, evidence based over opinion." (In) "No, not all scientific knowledge is set in stone. Evidence based reasoning is like the experiment itself and supposition so I would say there's a difference between that and mere opinion. You need evidence to back up reasoning." (F)

ST-15: A) "No, scientific knowledge is constantly changing as new discoveries are made." B) "There is a big difference; opinion is not supported by any evidence but scientific reasoning has evidence. Evidence to support the facts." C) "If one has the evidence to support their reasoning then yes, one can be more correct than the other. If it is merely opinion based then no one can be more correct than the other." D) "Researching, gathering information on the two different explanations and finally drawing a conclusion based on the information gathered and your own thoughts/opinions." E) "It depends. If one finds enough evidence to support one of the explanations then yes, but if it turns out to be just opinion-based then no." (In) "I have always believed that science knowledge changes. For example since the discovery of the atom knowledge has changed." (F)


ST-16: A) "No. It's not because with the new technologies we have today we can disprove something from before. For instance the controversy with Pluto about not being a planet. So, it's not set in stone." B) "I would say there is a difference. If it's scientific evidence it's been peer reviewed by others and a mere opinion would just be an individual opinion." C) "No. If it's like a concept or an event, they could both be wrong or they could both be right because they could interpret it differently. For instance when someone gets in a car accident and one person describes it occurring in one way while another person says it happened differently. They could both be right or wrong. It's just in the way they interpret it or the way they experienced it." D) "If possible I would try to experience the same situation. I would research it so that I could try and figure it out. For instance read other lab reports." E) "I would say it depends as you can't ever be sure of which explanation to believe. For me it's hard to believe everything about the atom because it's so small." (In) "I believe there is a distinction between evidence based reasoning and mere opinion. I definitely don't believe that scientific knowledge is set in stone. I believe that science is always evolving." (F)

ST-17: A) "No, new information is always being discovered resulting in change. Such as when they discover or create new elements that change the periodic table." B) "One would definitely need to make that distinction. When it has to do with science most of the concepts should be based on evidence not on opinion." C) "Not necessarily more correct but some students will respond better to one or the other instructor based on their learning style." D) "If I was having a difficult time deciding which to believe I would research the topic and see which one was either correct or made more sense to me. I would use either the text or look online. I believe the text would be more reliable." E) "I suppose unless they have witnessed it or if there is a lot of believable evidence supporting it." (In) "I have a strong belief that science is not set in stone." (F)

ST-18: A) "Nothing is set in stone even theories." B) "No. Because an opinion is what someone thinks. If you have evidence then it is viewed as true." C) "One may make more sense to you than the other. Again one can base it on their different life experiences." D) "I would try to relate it to an experience that I've had so I would understand it better." E) "I would say it's hard to really be sure which explanation you're supposed to believe. Because again, you may not know exactly what is right and wrong. Nothing is set in stone. You can always do your own research using other books. However, even books are not always correct. It would be actually something that they feel is correct." (In) "I don't think scientific knowledge is set in stone because it changes everyday. Someone can develop a new theory or add to an old one. There is a distinction between evidence based reasoning and mere opinion." (F)


ST-19: A) "No not all scientific knowledge is set in stone because there are theories and new material found daily in the world." B) "No there is a difference because evidence based reasoning is based upon knowledge where as an opinion is your views." C) "Yes there can be more than one way because one instructor may explain the concept using a different method than the other instructor and still both are correct." D) "I would believe the one that matches the textbook. I would use whatever way is easiest and makes the most sense for me personally." E) "Yes all you need to do is look it up on the internet or in your textbook. Usually these sources will tell you what explanation is right." (In) "Well, I don't think all scientific knowledge is set in stone. I think there's a distinction between evidence based reasoning and opinion." (F)

ST-20: A) "There is always room for enlightenment. Some things are as they should be and just seem to fall in to place." B) "There is a vast difference between someone who has verifiable proof to validate outcomes and someone telling you that they know something." C) "Possibly. I have heard different theologians do the same. It certainly makes you think though and it would make you research it a little harder to find the "true" meaning, to find the more correct answer." D) "I would have to research it. Get out the books, get on the net. Perhaps both of them are not incorrect perhaps both of them are looking at different aspects of the same situation." E) "Whichever one proves itself over the test of time. For instance drug trials. Certain drugs are given over a period of time work better under certain conditions. If someone tells me they know something I may listen to what they have to say but I am not going to risk anything of importance on something that someone cannot prove to me by research studies, statistics, and repetitive results." (In) "I don't believe that it's all set in stone. I think that science involves evidence based reasoning." (F)

Source of Ability to Learn Science

In the current literature on personal epistemology the dimension, source of ability to learn science, is viewed as operating on a continuum that ranges from viewing that learning science takes natural ability to viewing that anyone with effort and self-confidence can learn science.

Within this dimension the overall participant (N=56) pre-test mean was 2.896 (72.4) while the post-test mean was 3.107 (77.7) (see Table 40), with 29 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 3.000 (75.0) and 3.210 (80.2), respectively, with 12 participants improving their score. This was also a category that quantitatively showed an above average (> 0.21 or 5.3 points) increase in the scores of 27 of the 56 participants and 10 of the 20 interviewed participants.
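The parenthetical values above are simply the same means rescaled to a 0-100 metric. A brief worked conversion, assuming the EBAPS dimension scores range from 0 to 4 (an assumption consistent with the tabled values rather than something restated on this page), is:

\[ \text{score}_{0\text{-}100} = \frac{\text{score}_{0\text{-}4}}{4} \times 100, \qquad \frac{2.896}{4} \times 100 \approx 72.4, \qquad \frac{0.21}{4} \times 100 \approx 5.3 \ \text{points}. \]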


The gain on the "source of ability to learn science" dimension is an indicator that some participants are moving away from the view that one must have a natural ability to learn science toward the view that, if one puts forth the effort and has self-confidence, anyone can successfully learn science.

Although increases were observed quantitatively (Table 51) for a majority of the interview participants, the difference in their understandings is best reflected in the interview responses in Table 52. In order to query participants' understanding of the source of ability to learn science, the initial and final interview question asked whether being good at learning and doing science is mostly a matter of fixed natural ability, so that most people cannot become better at learning and doing science. Initially the majority of the interviewed participants (ST 1-4, 6, 8-9, 14-15 and 19-20) expressed the belief that the ability to learn science was a combination of the desire to learn, some natural ability, and/or working hard. The remaining participants (ST 5, 7, 10-13, and 16-18) supported the belief that anyone can learn science. Their interview comments reflected the idea that one only needs the desire and the willingness to work hard to be successful in learning science.

When comparing participants' initial interview comments with their initial EBAPS scores (Table 51) for their understanding of the source of ability to learn science, a majority of the initial scores for this axis are reflected in the participants' initial interview comments. For example, six participants (ST 5, 11-13 and 16-17) had initial EBAPS scores in the highly to extremely sophisticated range that aligned with their initial interview reflections that one only needs the desire and the willingness to work hard to be successful in learning science. Ten of the participants (ST 1, 2, 4, 6, 8-9, 14-15, and 19-20) scored in the moderately or highly sophisticated belief range, which supported their interview belief that the ability to learn science is a combination of the desire to learn, some natural ability, and/or working hard.


The final interviews reflected a shift in three of the participants' beliefs (ST 5, 15, and 19). Participant five's beliefs changed from the view that one only needs the desire and the willingness to work hard to be successful in learning science to the view that the ability to learn science is a combination of the desire to learn, some natural ability, and/or working hard. By the end of the semester the other two participants' beliefs (ST 15 and 19) moved from the view that the ability to learn science is a combination of the desire to learn, some natural ability, and/or working hard to the view that one only needs the desire and the willingness to work hard to be successful in learning science. These belief changes may have been due to their own personal experiences with chemistry in the lecture and laboratory during the semester. The final interviews with the remaining participants did not reveal any belief changes concerning the ability to learn science.

However, when comparing participants' final interview comments with their final EBAPS scores (Table 51), a few decreases in participant scores were noted. For example, five participants' (ST 5, 15, 17-18, and 20) EBAPS final scores decreased. One score decrease is reflected in participant five's final interview, where she shifts from believing that one only needs the desire and the willingness to work hard to be successful in learning science to believing that the ability to learn science is a combination of the desire to learn, some natural ability, and/or working hard. The other score decrease was participant 15; however, in her final interview she moved from the belief that the ability to learn science is a combination of the desire to learn, some natural ability, and/or working hard to the belief that one only needs the desire and the willingness to work hard to be successful in learning science. The beliefs of the remaining participants with score decreases did not change from the initial to final interview. The majority of the participants with no change or increases in their final EBAPS scores maintained their initial beliefs in the final interviews.


As suggested earlier, discrepancies between EBAPS scores and interview statements may have been due to several factors, including misinterpretation of the questions and/or possible answers, incorrect bubbling of a choice, as well as personal experiences in the chemistry lecture and laboratory course during the semester.

Table 51 EBAPS Source of Ability to Learn Science – Pre-Post Statistics

ID   Pre    Post   Difference
1    3.00   3.80    0.80***
2    2.60   3.40    0.80***
3    1.60   3.00    1.40***
4    2.60   2.80    0.20**
5    3.00   2.40   -0.60*
6    2.80   3.40    0.60***
7    2.60   3.80    1.20***
8    3.20   4.00    0.80***
9    2.40   3.00    0.60***
10   2.20   3.20    1.00***
11   3.60   3.40   -0.20*
12   3.60   4.00    0.40***
13   3.80   3.20   -0.60*
14   3.40   4.00    0.60***
15   3.60   2.60   -1.00*
16   3.20   4.00    0.80***
17   3.20   2.80   -0.40*
18   2.80   2.00   -0.80*
19   3.40   3.20   -0.20*
20   3.40   2.20   -1.20*

* decrease in score or no change   ** ≤ 0.26 (6.5 points) gain in score   *** > 0.26 gain in score
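To make the legend concrete, the short sketch below shows how each row's gain category could be assigned from the pre- and post-scores. It is only an illustrative reconstruction in Python; the function name and the example rows are for demonstration and are not part of the study's analysis.

    # Illustrative sketch: assign the Table 51 gain categories from pre/post
    # EBAPS scores, using the legend's cutoffs (* = decrease or no change,
    # ** = gain of 0.26 or less, *** = gain greater than 0.26).
    def gain_category(pre: float, post: float) -> str:
        diff = round(post - pre, 2)
        if diff <= 0:
            return "*"        # decrease in score or no change
        elif diff <= 0.26:    # roughly 6.5 points on the 0-100 metric
            return "**"
        return "***"          # above-average gain

    # A few rows from Table 51, keyed by participant ID.
    examples = {4: (2.60, 2.80), 5: (3.00, 2.40), 12: (3.60, 4.00)}
    for pid, (pre, post) in examples.items():
        print(pid, round(post - pre, 2), gain_category(pre, post))
    # Expected output: 4 0.2 **, 5 -0.6 *, 12 0.4 ***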


Table 52 Participants' Reflections – Source of Ability to Learn Science

Initial and Final Epistemological Beliefs Interview Question 5 – Source of Ability to Learn: Being good at learning and doing science is mostly a matter of fixed natural ability so most people cannot become better at learning and doing science.

Quotation Comments

ST-1: "In part everyone can improve their skills in science or any other subject just by diligently studying the material and relating the material to their lives." (In) "I think if you want to learn science you can. I do believe however that some are born with the natural ability to learn. If you want to learn something you may have to work hard at it while it may be easier for others." (F)

ST-2: "Some people get it others don't. Chemistry and math are harder subjects for me to grasp. I have to really work at it, and sometimes it doesn't show. Some have to do it a lot more." (In) "I believe it is a combination. For instance I really have to work hard at succeeding in math and science. History, English, psychology, and music are my passion. I don't have to constantly go over the content for those courses like I do for science and math. It is in part a natural draw." (F)

ST-3: "I definitely think that in order to be a really good scientist you have to have a passion for what your doing, but I don't think natural ability is the only component. Discipline, attention to detail, diligence, all of those characteristics should be applied and taken into consideration as well." (In) "I am kind of split on this statement. I put in a lot of time and hard work which is why I'm probably more successful than some other people. One can probably really learn science if they really put your mind to it. I do feel I do have a little more natural ability." (F)

ST-4: "I think to some degree one can be naturally gifted at learning sciences. However, you can improve your understanding concepts by repetition and practicing concepts. The amount of time needed to learn a new concept will vary from person to person." (In) "I think that most anyone can learn science if they put a lot of time, effort and hard work into it." (F)

ST-5: "No, if someone want to learn science and has an interest they can even if they do not have a natural ability. Some are better at just learning the material without doing anything else." (In) "I believe that most individuals can learn science if they want to." (F)

ST-6: "You definitely have to work through the material especially if you don't understand it. Everyone has their own way to understand and learn science." (In) "I think some people are better at it than others." (F)

ST-7: "No I do not believe in natural ability. I believe a person must obtain information through working hard. Some people may be able to grasp the concepts quicker than other. However, it is not because of natural ability but due to their intellect. People can learn whatever they want it just takes practice and take and time." (In) "I think if you want to learn science you can. I don't believe in natural ability. If you are motivated and spend time on anything you can learn it. I don't think motivation is the same as natural ability. Natural ability means if your parents can learn something you should be able to learn it. However my parents know nothing about science so I'm not born with natural ability." (F)


ST-8: "No, natural ability plays a small role. A person who is hard working is more likely to succeed in chemistry through practice and familiarization. Chemistry is coherent and logically connected material. Therefore if a person continues to practice the person will become better at learning/doing science." (In) "Most individuals can learn science if they want to." (F)

ST-9: "Yes and no. I believe that some individuals are born with a greater sense of knowledge in certain fields such as chemistry but I believe that everyone is capable of learning and understanding it. With practice one can understand science better." (In) "I believe that all individuals can learn science if they want to. However, for some I think it's much easier. I think some people are born with the ability to think analytically. I have a harder time learning science." (F)

ST-10: "I don't think that's right at all. They told my mom that she wouldn't be good at math or science when she graduated high school. She studied really hard and became a chemical engineer. She is like the shining light that makes me realize that you don't have to be naturally good at something to be able to do it." (In) "If you want to learn science you can learn it. I know a number of people who aren't naturally able to think scientifically and they've managed to learn and understand science." (F)

ST-11: "No, if you take the time to study science you can learn it and eventually become good at it. If someone really wants to learn science all they really have to do is sit down and read to understand the general concepts." (In) "I don't think you have to have natural ability to learn science. I just believe one needs to work hard. I don't understand a lot of the chemical reactions however I'm really good at math. So I reread everything and learn science with hard work." (F)

ST-12: "No I think that if you just study and really try to learn the best way you can then you can be good at anything. Some people might catch on faster and have a natural ability but that doesn't mean others can't learn science." (In) "I think that anyone can learn science if they want to. I don't think just because one is not naturally good at science that they can't learn it." (F)

ST-13: "If someone works hard enough they can become better at learning and doing science. It depends on how much time and effort they are willing to put into improving their learning." (In) "Anyone can learn science you just have to use your prior knowledge and work hard." (F)

ST-14: "Yes natural ability is always a plus but dedicating oneself to understanding the material helps. People can become better at science over time and with repetition. For example in the lab using the same techniques when performing certain tasks. More repetition results in perfecting the task resulting in more reliable results." (In) "I would say most individuals can learn science if they want to. Because I don't think of myself as an intelligent but if I dedicate myself I can learn anything." (F)

ST-15: "Natural ability helps but being good at something involves the student's own willpower. One needs to be able to sort through the information and understand it. For instance, if you wanted to be better at rollerblading, you would have to practice, practice, and practice. The same goes for science. Sitting around and not doing anything about it won't get you anywhere." (In) "You have to want to learn science. If you open your mind and believe you can then anyone can learn science." (F)

ST-16: "No, that's not true. It takes practice and studying so you can understand it. Some people are better at learning science but it's because they work hard to understand it. Practicing problems in the book and going to lab class help in understanding the concepts." (In) "I lean towards most individuals can learn science if they want to. I know if they tried hard enough they could learn and understand it. Laziness keeps some from trying hard enough." (F)


ST-17: "I think anyone can learn science so I say it's not a natural ability. You would need the desire and motivation and work hard." (In) "Since I have taken a number of other science courses I would say anyone can learn science if they really want to." (F)

ST-18: "No, everything just takes practice. You might not be the smartest person, but if you know how to apply yourself and you constantly work at it, it's not impossible, it's just more difficult." (In) "From my own experiences when I even mention that I am a science major other students react by saying they can't imagine taking chemistry. I think one has to have an interest. I just really like science so learning has never been that big of a deal for me." (F)

ST-19: "Yes it's true that some people learn things easier but everyone can learn any kind of material if they set their minds to it. It's just that some people might need to study for hours where as others can read through the material once and already understand it." (In) "I believe that an individual can learn science if they want to. You have to put more time and work into learning science as well as be able to think. If people aren't willing to try and learn then they're not going to be successful." (F)

ST-20: "Then there would be no point in trying to learn or do better at anything. Perhaps everyone is not meant to be a scientist or a physicist or a doctor but we can all be better at anything." (In) "I lean slightly toward it being more a natural ability. You can learn a lot the harder you work. But, I think if you don't have some natural ability to understand the concepts that you can work all you want and you're still not going to get it." (F)

Discussion

Changing Epistemological Beliefs

RQ1. What range of personal epistemological beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

Participants' initial scores on the Epistemological Beliefs Assessment for Physical Science (EBAPS) represent a range of beliefs from unsophisticated to highly sophisticated, with the majority falling into the moderately sophisticated range (2.4-2.9). No participants scored in the top sophistication level, extremely sophisticated, meaning that there were no participants at the beginning of the semester course who held the high level of epistemological beliefs theorized in the models (Perry, 1970; Baxter-Magolda, 1986; Schommer, 1990; Hofer & Pintrich, 1997).


Most of the participants' initial scores fell in the range of late dualism to late multiplicity (levels 2-4) in Perry's model and in the absolute knowing to transitional knowing range of Baxter Magolda's model. The average EBAPS overall score of 2.514 would place the participants in the early multiplicity stage or transitional knowing stage of epistemological development. This gives some support to Perry and Baxter Magolda's findings that students, depending on their year in college and other factors such as age and gender, begin as a dualist or multiplist.

In the current literature on personal epistemology the dimension, structure of scientific knowledge, is viewed as operating on a continuum that ranges from viewing scientific knowledge as an accumulation of concrete, discrete, knowable facts without much structure to viewing it as an interrelated network of strongly connected and highly structured concepts that are contextual, contingent and relative. The initial EBAPS scores of the participants (N=56) resulted in 8.9% of the participants beginning the semester with highly to extremely sophisticated beliefs about the structure of scientific knowledge. Only one of the interview participants (N=20) initially scored in the highly sophisticated level for this dimension. In the initial interviews 50% of the participants believed that the structure of scientific knowledge involved interrelated concepts.

In the current literature on personal epistemology the dimension, nature of knowing and learning science, is viewed as operating on a continuum that ranges from viewing learning science as consisting mainly of absorbing information such as facts to relying on constructing one's own understanding by working through the material actively, by relating new material to prior experiences, knowledge, and intuitions, and by reflecting upon and monitoring one's understanding. The initial EBAPS scores of the participants (N=56) resulted in 19.6% of the participants beginning the semester with highly to extremely sophisticated beliefs about the nature of knowing and learning scientific knowledge.


Two of the interview participants' (N=20) initial EBAPS scores fell in the highly sophisticated level for this dimension. In the initial interviews 10% of the participants believed that the nature of knowing and learning scientific knowledge involved interrelating concepts and constructing one's own knowledge.

In the current literature on personal epistemology the dimension, real-life applicability of science, is viewed as operating on a continuum that ranges from the view that science is applicable to everyone's life inside and outside the classroom or laboratory to the view that it is an exclusive concern of the scientific world. The initial EBAPS scores of the participants (N=56) resulted in 39.3% of the participants beginning the semester with highly to extremely sophisticated beliefs about the real life applicability of scientific knowledge. Eight of the interview participants' (N=20) initial EBAPS scores fell in the highly sophisticated level for this dimension. In the initial interviews 80% of the participants believed that the real life applicability of scientific knowledge included life outside the classroom or laboratory.

In the current literature on personal epistemology the dimension, evolving scientific knowledge, is viewed as operating on a continuum that ranges from viewing scientific knowledge as absolute, "set in stone," to viewing it as changing and dynamic. This dimension also considers the justification and source of knowledge in terms of the evaluation of evidence and the opinion of experts. The initial EBAPS scores of the participants (N=56) resulted in 23.2% of the participants beginning the semester with highly to extremely sophisticated beliefs about the evolving nature of scientific knowledge. Two of the interview participants' (N=20) initial EBAPS scores fell in the highly sophisticated level for this dimension. In the initial interviews 70% of the participants believed that scientific knowledge changes and evolves over time.


In the current literature on personal epistemology the dimension, source of ability to learn science, is viewed as operating on a continuum that ranges from viewing that learning science takes natural ability to viewing that anyone with effort and self-confidence can learn science. The initial EBAPS scores of the participants (N=56) resulted in 55.3% of the participants beginning the semester with highly to extremely sophisticated beliefs about the source of ability to learn scientific knowledge. Twelve of the interview participants' (N=20) initial EBAPS scores fell in the highly sophisticated level for this dimension. In the initial interviews 30% of the participants believed that anyone can learn science.

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?

Thirty-nine percent of the participants (N=56) improved their EBAPS scores by the end of the semester, resulting in a shift in their epistemological beliefs toward a more sophisticated level. Fifty percent of the interview participants (N=20) improved their EBAPS scores by the end of the semester, likewise shifting their epistemological beliefs toward a more sophisticated level. This shift suggests that personal epistemological beliefs can change over time. However, the character of the participants' personal epistemological beliefs is better reflected in their interview responses than in their scores.

Prior studies concerning learners' personal epistemological beliefs conducted with college students indicate that their personal epistemological beliefs can change during the college years (Baxter Magolda, 1992; Perry, 1981). Perry's (1968) investigation found that entering college freshmen believe knowledge is certain and provided by authority, while college seniors believed that knowledge is complex and tentative and is derived through reason. Schommer (1997) conducted a longitudinal study to determine whether high school students' epistemological beliefs changed over time.


Using the questionnaire Schommer (1990) developed, she found that students' epistemological beliefs changed between students' freshman and senior years in high school in all four dimensions. These findings support the idea that epistemological beliefs develop over time. However, a student's beliefs about the structure of scientific knowledge may develop independently from his or her beliefs about the stability of scientific knowledge (i.e., evolving). Therefore, examining the dimensions of epistemological beliefs rather than epistemological beliefs as a coherent whole may allow a clearer picture of how beliefs change.

In this study the structure of scientific knowledge is described in terms ranging from isolated bits of knowledge to interrelated concepts. Participants' views ranged from viewing the structure of scientific knowledge as discrete, concrete, knowable facts to seeing the structure of scientific knowledge as relative, contingent and contextual. From the data it is clear that although 54% of the participants (N=56) experienced an increase in sophistication on this dimension of epistemological beliefs, the changes were not complete enough for all participants to become sophisticated. In the initial interviews 50% of the participants believed that the structure of scientific knowledge involved interrelated concepts. By the end of the semester 80% of the interview participants (N=20) reflected improved epistemological beliefs concerning the structure of scientific knowledge in their interview statements. Participants' views may have been related to their beliefs about the processes of knowing and the nature of scientific knowledge. For example, if a student believes that scientific knowledge consists of factual information, the student may believe that recalling the information constitutes knowing. As a result the student may believe that learning scientific knowledge consists of memorizing information and not understand how the knowledge interrelates. However, if a student believes that scientific knowledge is complex, resulting from interpretation of evidence, then the student may believe that scientific knowledge involves interrelated concepts.


Participating in a laboratory environment where interpretation of evidence was used as an instructional tool may have influenced the participants' epistemological beliefs.

Prior studies such as Songer and Linn (1991) suggest that students' classroom experiences may impact their beliefs about the structure of scientific knowledge. They suggest that students may not integrate material presented in science courses if they believe that scientific knowledge consists of isolated principles. Additionally, learners may not develop a consistent historical view of science if science is taught as a collection of fairly unrelated facts and ideas. Learners need to understand that scientific knowledge is best described as a set of strongly integrated and highly structured concepts rather than a series of weakly connected isolated ideas. Understanding that scientific knowledge is a set of strongly integrated and highly structured concepts is associated with a highly sophisticated belief in the coherence of scientific knowledge. For instance, learners should understand the principles that underlie scientific investigation, such as causality, explanation, and using experiments to determine causality or construct scientific explanations.

According to Linn and Hsi (2000), research on students' views on the structure of scientific knowledge suggests that students develop a repertoire of ideas about scientific knowledge rather than a cohesive view. In another study some college students expressed beliefs that scientific knowledge was a collection of separate pieces of knowledge, such as formulas and symbols, that only experts could understand. However, other students believed that the structure of scientific knowledge was an integrated body of knowledge made up of concepts from which one could construct one's own understanding (Hammer, 1994).


Elder (2002) suggests that the relatively sophisticated idea that scientific knowledge is a coherent system of concepts develops later than other epistemological beliefs about science constructs, such as the belief that scientific knowledge evolves.

The nature of knowing and learning scientific knowledge can be described in terms ranging from the view that learning science consists mainly of absorbing information such as facts to relying on constructing one's own understanding by working through the material actively, by relating new material to prior experiences, knowledge, and intuitions, and by reflecting upon and monitoring one's understanding. In this study participants' ideas about the nature of knowing and learning scientific knowledge were viewed in terms of absorbing facts or constructing one's own knowledge. From the data it is clear that although 50% of the participants experienced an increase in sophistication on this dimension of epistemological beliefs, the changes were not complete enough for all participants to become sophisticated. In the initial interviews 10% of the participants believed that the nature of knowing and learning scientific knowledge involved interrelating concepts and constructing one's own knowledge. By the end of the semester 50% of the interview participants (N=20) reflected improved epistemological beliefs concerning the nature of knowing and learning scientific knowledge in their interview statements. Participants' views may have been related to their learning strategies and the belief that science is mainly facts to be memorized. Participants tended to equate learning scientific knowledge with practicing problems or generating scientific knowledge in the laboratory.

In a prior study Songer and Linn (1991) investigated eighth grade students' strategies for learning science in combination with their study of students' views about the nature of knowing and learning science. They found that some of the students who held static beliefs about the nature of knowing and learning science preferred the use of memorization as their approach to learning science.


However, other students who held dynamic beliefs about the nature of knowing and learning science approached learning via efforts to create meaningful understanding. If a learner believes that the nature of learning and knowing scientific knowledge is complex as a result of interpretation of evidence, then the learner may believe that learning science requires mental effort to understand the complexities and interrelationships of scientific knowledge (Roth & Roychoudhury, 1994; Schommer & Walker, 1995).

The real life applicability of scientific knowledge can be described in terms ranging from only applicable in the classroom or laboratory to applicable to everyday life. From the data it is clear that although 57% of the participants experienced an increase in sophistication on this dimension of epistemological beliefs, the changes were not complete enough for all participants to become sophisticated. In the initial interviews 80% of the participants believed that the real life applicability of scientific knowledge included life outside the classroom or laboratory. By the end of the semester 90% of the interview participants (N=20) reflected improved epistemological beliefs concerning the real life applicability of scientific knowledge in their interview statements. Participants' views may have been related to their scientific literacy. The more experiences participants had with applying scientific knowledge to their daily lives, the more sophisticated their epistemological beliefs. Participants in this study tended to describe the real life applicability of science in terms of examples of how scientific knowledge applied to real life. Several described how specific science concepts related to everyday life, such as checking the gas pressure in one's tires with temperature changes in the weather, personal diet, and health.

Studies involving the epistemological viewpoints of both "public science knowledge" and "personal understandings of science" are found throughout the research literature.


"Public science knowledge" may be defined as scientific knowledge that harbors consensus within a community of scientists. Epistemological viewpoints on "public science knowledge" address the processes involved in generating public science knowledge and the justification of its reliability. A citizen's interest in science occurs within specific social decision-making purposes, including personal matters such as health care, safety risks at work, fabric choices, and protesting the building of an industrial plant. The citizen who wishes to engage in decision-making about an issue has to learn some science.

Studies that address the epistemological viewpoints of scientific knowledge used by students from K-16 have been reported. Given the variety of methods used, the findings are quite similar (Lederman & O'Malley, 1990; Aikenhead & Ryan 1992; Meyling, 1997). Perhaps the most significant point to emerge from these studies is that students do indeed develop epistemological viewpoints of public science knowledge because of their interactions with science during their education and everyday life.

According to Cobern (2000), many citizens, including students, find science disconnected from everyday life and thinking. They view science as a "school" subject, not an important part of everyday life. Even in a college science course only a fraction of the information generated by scientific knowledge is taught during a semester course. Therefore, it is important for science courses to prepare learners to be able to think critically about science related issues that may impact their everyday life (Carey & Smith, 1993).

Evolving scientific knowledge can be described in terms ranging from viewing scientific knowledge as absolute to viewing it as changing and dynamic. In this study participants' ideas about the nature of evolving scientific knowledge were viewed in terms of "set in stone" to constantly evolving.


From the data it is clear that although 48% of the participants experienced an increase in sophistication on this dimension of epistemological beliefs, the changes were not complete enough for all participants to become sophisticated. In the initial interviews 70% of the participants believed that scientific knowledge changes and evolves over time. By the end of the semester 90% of the interview participants (N=20) reflected improved epistemological beliefs concerning the evolving nature of scientific knowledge in their interview statements. Participants' ideas in this study about evolving scientific knowledge (e.g., certainty) and the justification of scientific knowledge tend to be described in terms of whether they understand knowledge to be verified by authority (e.g., first hand source) or via evidence (second hand source). Participants' views about evidence were related to their ideas about the certainty of knowledge. Some suggested that evidence is related to how or why ideas in science might change over time. Other participants suggested that scientific knowledge is associated with both sources of evidence. In terms of first hand sources, participants indicated that one can obtain information from investigations such as experiments, direct experiences with situations, or from tools. Participants suggested textbooks and the Internet as second hand sources.

The idea that scientific knowledge changes over time to be consistent with evidence from data and/or new reasoning, and that scientific knowledge can change through growth or revision, should have an effect on a learner's epistemological beliefs. In addition, the idea that, because scientists are influenced by their prior knowledge, multiple explanations can be produced from the same set of data would seem to have the potential to affect learners' epistemological beliefs.

Studies have shown that learners' prior scientific knowledge does influence their ideas about the certainty and justification of knowledge. In addition, learners generally hold a wide range of ideas about science that are resistant to change (Fensham, 1994; Gabel, 1998; Taber, 2002a). Learners' views of scientific knowledge develop over time.


They are shaped and influenced by a variety of factors such as home, media, school, and technology. Learners who have the ability to critically examine the results of scientific literature, rather than simply accept the interpretations of "authority figures," have a better understanding of the formation of scientific knowledge. According to Carey and Smith (1993), science courses should prepare learners to value "the kind of knowledge that is acquired through a process of careful experimentation and argument." Nevertheless, studies show that regardless of taking science courses, some learners do not understand that scientific knowledge is always evolving and constructed through theoretical interpretations of evidence (Ryan & Aikenhead, 1992).

The source of ability to learn scientific knowledge can be described in terms ranging from viewing that learning science takes natural ability to viewing that anyone with effort and self-confidence can learn science. In this study participants' ideas about the nature of one's ability to learn scientific knowledge were viewed in terms of the role natural ability played in a participant's success. From the data it is clear that although 48% of the participants experienced an increase in sophistication on this dimension of epistemological beliefs, the changes were not complete enough for all participants to become sophisticated. In the initial interviews 30% of the participants believed that anyone can learn science. By the end of the semester 45% of the interview participants (N=20) reflected improved epistemological beliefs concerning the source of ability to learn scientific knowledge in their interview statements. Participants' ideas in this study about the source of ability to learn science ranged from the belief that some natural ability is required to the belief that all one needs to be able to learn science is motivation and the desire to work hard. One underlying theme is the attitude a student has about learning science and their ability to learn science. Expected achievement is another variable that appeared to heavily influence learners' beliefs about their source of ability to learn science.


As would be expected, positive attitudes toward science lead to better results on achievement measures of science capability (Weinburgh, 1998). A student's attitude toward science is more likely to influence achievement in science than achievement is to influence attitude (Schibeci & Riley, 1986). For instance, Steiner and Sullivan (1984) found that organic chemistry students who received a grade of a C or lower more frequently self-reported themselves as worried or anxious about the subject. Steiner and Sullivan (1984) found that the best predictor for success (C+ or better) is a positive attitude towards chemistry. This belief is characterized by claiming an interest and confidence in learning organic chemistry.

The organizing role of prior scientific knowledge and understandings in gaining new scientific knowledge and skills includes not only epistemological beliefs but other aspects of knowledge structures and patterns of reasoning, such as attitudinal beliefs and reasoning abilities. For instance, there is evidence indicating that students' scientific epistemological beliefs play an important role in determining their learning orientations towards science and their ways of organizing cognitive structures of scientific knowledge. There is also evidence indicating the importance of scientific epistemological beliefs for conceptual change (Perry, 1970; Posner et al., 1982; King & Kitchener, 1994). The epistemological beliefs of middle and high school students were determined to relate to the ability to learn, speed of learning, and stability of knowledge. The study found that if a student believes in quick learning, it may affect problem-solving strategies over time (Schommer-Aikins et al., 2005).

Hofer and Pintrich (1997) suggested that "beliefs about learning and teaching are related to how scientific knowledge is acquired and in terms of the psychological reality of the network of individuals' beliefs, beliefs about learning and teaching are probably intertwined."


According to Hofer and Pintrich (1997), there is continuing speculation that college educational experiences may serve as the force for change in personal epistemological beliefs, but limited research has been performed to refute or support the idea. Hofer (1994) compared the epistemological beliefs of college students who experienced two different forms of calculus instruction over a semester course. Some students experienced instruction that emphasized active learning, cooperative learning, and problem solving, while other students experienced instruction as lectures and demonstrations of problem sets. Results indicated significant differences in the epistemological beliefs of the students, with those students experiencing active learning, cooperative learning, and problem solving scoring higher. However, interpretations of these results are limited because student beliefs were not assessed prior to instruction.

An understanding of epistemological beliefs is important because it may reveal that college students are being influenced by unconscious and initial beliefs about the nature of knowledge and learning. Pintrich (2002) suggested that epistemology is developmental. Development is the goal of education. Therefore, part of the goal of education should be to promote epistemological development.

Summary

In summary, the overall findings of the study (N=56) in answering research question 1, sub-question b (Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?) were as follows:

1. Noticeable increase in posttest scores with a statistically significant medium effect size of 0.61.
2. The mean gain score is lowest for source of ability to learn and highest for evolving knowledge.
3. The overall mean gain score increased by 4-6 points on a scale of 0-100.


4. The mean gain scores for four of the EBAPS dimensions and the overall score are significant at p ≤ 0.05.
5. The mean gain score for source of ability to learn is not significant at p ≤ 0.05.

In summary, the findings related to the interview participants of the study (N=20) in answering research question 1, sub-question b (Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?) were as follows:

1. Noticeable increase in posttest scores with a statistically significant medium effect size of 0.93.
2. The mean gain score is lowest for source of ability to learn and highest for evolving knowledge.
3. The overall mean gain score increased by 5-8 points on a scale of 0-100.
4. The mean gain scores for four of the EBAPS dimensions and the overall score are significant at p ≤ 0.05.
5. The mean gain score for source of ability to learn is not significant at p ≤ 0.05.

Not unexpectedly, given the literature on epistemological beliefs, the participants in the study showed a moderately significant change in their overall epistemological beliefs and in four of the five dimensions, the exception being the source of ability to learn scientific knowledge. This lack of development may not be so surprising, since the source of ability to learn scientific knowledge may be influenced by the participant's own self-efficacy and prior experiences learning science.

Overall, minimal to moderate gains were made for the participants (N=56) within the EBAPS dimensions. The participants overall had quantitative scores that were mixed, with four dimensions showing increases. Slightly better results were obtained from the interview subjects quantitatively in terms of increased sophistication of epistemological beliefs.
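As an illustration of how the significance tests and effect sizes summarized above could be obtained, the sketch below runs a paired-samples t-test on pre/post overall scores and computes a standardized mean gain (Cohen's d for paired data). It is only a sketch: the score lists are placeholders, and the effect-size formula is an assumption, since the exact computation is not restated in this section.

    # Illustrative sketch only: paired t-test and standardized mean gain for
    # pre/post overall scores. The numbers below are placeholders, not study data.
    from statistics import mean, stdev
    from scipy import stats

    pre  = [2.4, 2.6, 2.5, 2.8, 2.3, 2.7]   # placeholder pre-test overall scores
    post = [2.7, 2.8, 2.6, 3.0, 2.5, 2.9]   # placeholder post-test overall scores

    t_stat, p_value = stats.ttest_rel(post, pre)   # paired-samples t-test
    gains = [b - a for a, b in zip(pre, post)]
    effect_size = mean(gains) / stdev(gains)       # Cohen's d for paired data (assumed formula)

    print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {effect_size:.2f}")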


The interview participants had increases within the same four dimensions, the exceptions being participants 4, 13, and 17.

With the interview participants, it seemed they either held a belief or did not, as only minimal to moderate growth could be seen qualitatively within the interviews over time. Although increases were seen quantitatively, these may well be insignificant. It seems apparent that some participants have very naive epistemological beliefs, while most possess moderately sophisticated beliefs and a few, surprisingly, have highly sophisticated beliefs. The naive views are to be expected since the development of sophisticated beliefs is normally seen only during the latter college years, as described in Perry's work (1970).

The next chapter presents a description of the development of the participants' NOS beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-a. The characterization of NOS beliefs, and any changes in those beliefs that may have resulted, will be presented with analyses of the participants' responses to interview probes. The combination of interviews and quantitative measures will provide a glimpse into participants' NOS belief changes during the course of a semester and what the participants believed influenced their beliefs. The results are discussed and related back to the key laboratory NOS beliefs literature.


Chapter Six: Development of NOS Beliefs

Introduction

Chapter six presents a description of the development of the participants' NOS beliefs through the presentation of qualitative analyses of the study's sub-question 2-b. The characterization of the participants' NOS beliefs is discussed with the use of the participants' responses to interview probes. The combination of the interviews and quantitative measures previously discussed in chapter four will provide a glimpse into participants' NOS belief changes during the course of a semester.

Another objective of this research was to determine if participants' NOS beliefs change over the course of a semester in a laboratory instructional setting, so the next step looks closely at the NOS data. These descriptions will be generated from the NSKS assessment and, more importantly, the participants' responses during the initial and final interviews. No specific explicit NOS pedagogical methods or instruction were included in the semester laboratory course. The nature of this study was to explore and lay a foundation for focusing on more specific features of reasoning related to NOS belief changes, in light of specific science laboratory instructional features, for future research.

Method of Analysis

This analysis was conducted in a multi-layered, multi-stage process, through reading and sorting participants' responses to NOS questions, both general in nature and specific to the course. The analysis below is organized by four of the six NSKS assessment dimensions (axes): creative, developmental, parsimonious, and testable.


The aforementioned dimensions (axes) served as the major theme codes, giving a framework from which first-order themes, originally derived from the participants' verbatim quotations or raw data themes, could be analyzed. Within each dimension (axis), the responses to interview and reflective questions regarding NOS beliefs are presented. The intent of this analysis is to expand the theoretical understanding of the NSKS dimensions (axes) as related to the NOS and the continuum of beliefs, as expressed in context. Illustrative quotes have been selected from the interviewed participants as representative of the range of beliefs along the continuum. Table 53 presents a demographic overview of the interview participants with their participant identification numbers. Quotes are identified with the letters ST followed by the participant's identification number (Table 53). Figure 7 represents the scale used to identify each participant's range of NOS beliefs.

The main research questions that guided this portion of the study were:

RQ1. What range of NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

RQ1a. Do students' NOS beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?

Summary of NSKS Overall Scores

Using the overall scores on the NSKS (Table 54) discussed in chapter four to measure relative increases or decreases in NOS understandings, the results show forty-four participants (N=56) increased their total scores, while seven participants' scores decreased and five scores remained unchanged from the pre-test to the post-test. The total overall mean score between the pre-test and the post-test resulted in an average increase of 5.9 points. The overall average increase within the dimensions was 0.96 points.


Table 53 Demographic Statistics – Interview Participants

ID   Sex   Age   Major                    College Year
1    F     19    Pre-Pharmacy             Fr
2    F     21    Psychology               So
3    F     21    Biomedical Science       Jr
4    M     24    Electrical Engineering   So
5    M     22    Environmental Science    Jr
6    F     27    Marine Science           None
7    F     20    Biomedical Sciences      Jr
8    M     18    Undeclared               Fr
9    F     18    Environmental Science    Fr
10   F     20    Environmental Science    So
11   F     19    Nursing                  Fr
12   F     18    Undecided                Fr
13   F     18    Pre-Pharmacy             Fr
14   F     19    Pre-Pharmacy             Fr
15   F     20    Biology                  So
16   F     18    Environmental Science    Fr
17   F     24    Physical Ed              Jr
18   F     20    Athletic Training        Jr
19   F     19    Biomedical Sciences      So
20   F     45    Masters Nursing          None

Figure 7 NSKS Belief Scale

Realist --------------------------- Neutral --------------------------- Instrumentalist
(48) (unaccepted NOS view)          (144)                               (240) (accepted NOS view)

Realist – absolute; theories are either true or false
Instrumentalist – subjective; theories are tools
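The anchor points in Figure 7 follow directly from the instrument's scoring if one assumes the NSKS consists of 48 Likert items each scored from 1 to 5 (an assumption inferred from the endpoints shown, not restated on this page):

\[ 48 \times 1 = 48 \ (\text{realist end}), \qquad 48 \times 3 = 144 \ (\text{neutral}), \qquad 48 \times 5 = 240 \ (\text{instrumentalist end}). \]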


What is clear is that several of the participants' overall scores did show some improvement in their NOS beliefs by the end of the semester course. Fifteen of the fifty-six participants improved their NSKS scores by 5.0 points or less, four improved by the average gain of 6.0 points, while twenty-five improved their scores by more (7-18 points). Therefore, 78% of the participants improved their NSKS scores. For the entire population (N=56), participant fifty-two had an overall increase of 18 points, followed by participant twenty-two with a 15-point increase. In addition, twenty of the original fifty-six participants moved toward the instrumentalist (acceptance of NOS views) end of the NSKS scale, with seven coming from the interview participants. The lowest overall (N=56) NSKS pre-test score was 122 (ST-17). None of the participants increased their scores in all six NSKS dimensions. The overall average increase within the dimensions was 1.1 points. The remaining twelve either had no change or a decrease in their score. Whether improvement or lack of improvement was in any way influenced by laboratory instruction or outside factors will be presented later in chapter seven.

Table 54 Descriptive Statistics – NSKS Scores – All Participants

Dimension             Pre-Mean Score (N=56)   Post-Mean Score (N=56)   Pre-Mean Score (N=20)   Post-Mean Score (N=20)
Amoral (D-1)          23.643                  24.196                   23.150                  24.350
Creative (D-2)        22.893                  23.670                   22.550                  24.100
Developmental (D-3)   23.625                  24.768                   24.000                  24.700
Parsimonious (D-4)    24.625                  26.321                   24.550                  26.700
Testable (D-5)        24.196                  24.982                   24.050                  24.300
Unified (D-6)         23.643                  24.411                   23.750                  24.750
Overall Score         142.482                 148.375                  141.650                 148.900


Summary of NSKS Interview Scores

As for the interview participants (N=20), 78% improved their NSKS score by the end of the semester (Table 55). Six participants improved their scores by 6.0 points or less, while another twelve improved their scores by more than 6.0 points. Two of the interview participants' NSKS post scores decreased by 2.0 points. Whether the improvements or lack of improvements were in any way influenced by laboratory instruction or other possible factors will be presented later in chapter seven.

Participant fourteen had the lowest overall NSKS pre-test score of the interview participants at 132, followed by participant seventeen with 136. Although 78% of the interview participants showed an increase in total NSKS scores, participants one, ten, and nineteen had the largest total score increases of 12 points each. Interview participants two, three, and twelve improved their scores in five of the six dimensions, with the majority improving their scores in four of the six dimensions.

Participant six had the highest pre-test score (149), and participants one and nine had the highest post-test scores of the interview participants, with both scoring 155, placing them at the instrumentalist (accepting of NOS views) end of the NSKS scale (Figure 7). However, all three of the aforementioned students' pre-test scores placed them at the realist end of the NOS scale, indicating that their initial beliefs did improve concerning the NOS. Twelve of the twenty interview participants moved from either a realist or neutral position in regard to NOS towards an instrumentalist perspective during the course of the semester. This was an above average increase of 9.4 points, suggesting a marked improvement in the sophistication of their NOS beliefs.


Table 55 Descriptive NSKS Statistics – Interview Participants

ID   Gender   NSKS Pre   NSKS Post   Difference
1    F        143        155         12***
2    F        144        153         9***
3    F        138        148         10***
4    M        138        149         11***
5    M        144        151         7***
6    F        149        151         2**
7    F        143        152         9***
8    M        147        145         -2*
9    F        147        155         8***
10   F        141        153         12***
11   F        143        149         6**
12   F        138        150         12***
13   F        146        144         -2*
14   F        132        142         10***
15   F        140        145         5**
16   F        143        148         5**
17   F        136        142         6**
18   F        143        148         5**
19   F        140        152         12***
20   F        138        146         8***

* decrease in score   ** gain of 6.0 points or less   *** gain of more than 6.0 points

Characterization of Nature of Science Beliefs

Although the NSKS assessment serves the purpose of finding out if, and in what categories, students' beliefs are changing, we needed a way to explore how these beliefs changed during the semester. Using a set of probing questions, initial and final interviews were conducted to ascertain whether, and if so how, participant nature of science (NOS) beliefs changed during the semester of laboratory instruction.

Key areas that appeared to provide opportunities for participants to make inferences about their beliefs included the initial and final interviews. The initial interviews lasted approximately 15-20 minutes and focused on four of the NSKS dimensions. The final interviews lasted 30-45 minutes and focused on general NOS beliefs.


The following discussion will present an overview of the responses by the interview participants to the NOS beliefs probes during the initial and final interviews. The discussion is organized around four of the six NSKS dimensions.

Initial and Final NOS Beliefs Interviews

Many of the instruments used in NOS studies originated as objective, pencil-and-paper assessments which subsequently changed into more descriptive instruments. Researchers argued that traditional paper-and-pencil assessments are not adequate in fully explaining what one needs to know about students' conceptions of NOS. Researchers responded to this argument by conducting interviews, surveys, and offering open-ended questions (Lederman, et al., 1998). While the quantitative data offer an opportunity to examine and compare participants' understanding of NOS in a generalized way, the interviews offer a chance to investigate and describe more fully the range of participant positions with respect to understanding NOS.

During the initial interview, questions related to four of the six multi-dimensional axes of the NSKS (creative, developmental, parsimonious, and testable) were used to probe the participants (Appendices C, F & O). The questions were designed to investigate the participants' NOS beliefs. The interview participants were asked to elaborate on the questions in order to evoke their thoughts about the NSKS variables. The questions themselves were meant to look at different areas of NOS beliefs within the NSKS.

During the final interview, participants were presented with an ill-structured scenario problem from King and Kitchener (1994). The reflective judgment scenario problem (Appendix F) incorporates the four dimensions from the initial interview with the focus being on the developmental dimension.


The study investigated the changes from the beginning to the end of the semester within four (creative, developmental, parsimonious, and testable) of the six dimensions of NOS beliefs identified in the NSKS. First the overall participant scores were compared to those of the interview subjects. After a comparison between interview subjects and the overall class based on quantitative scores, an attempt was made to briefly look at what might have changed using the qualitative data from the interviews based on the NOS beliefs within each variable.

Responses to the Initial and Final NOS Beliefs Probes

On the subsequent pages portions of the initial and final interview responses are presented and discussed concerning the participants' NOS beliefs. The interview probes were designed using the NSKS variables discussed in chapters two and three. Each variable's interview probe will be presented and discussed separately.

Creative Dimension

In the current NOS literature the dimension relating to the creativity involved in scientific endeavors is viewed on a continuum that ranges from viewing scientific knowledge as a totally lifeless, rational, and orderly activity to viewing it as an endeavor that requires human imagination and creativity through the invention of explanations based on observations. In addition, this dimension considers whether scientific models and theories are a product of the human imagination and whether they accurately represent reality. According to Rubba and Anderson (1978) scientific knowledge is a product of the human intellect. The invention of scientific knowledge requires as much creative imagination as does the work of an artist, composer, or poet. Scientific knowledge represented by models and theories exemplifies the creative spirit of the scientific inquiry process.


Within this dimension the overall participant (N=56) creative pre-test mean was 22.89 while the post-test mean was 23.67 (Table 54), with 17 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 22.55 and 24.10, respectively, with 14 participants improving their score. For the overall population this was a category that quantitatively showed a below average increase of 0.80 points when compared to the overall average increase of 0.96 (N=56). The participant (N=56) with the highest pre-post score change of 11 points as well as the highest post score was student 52 with a score of 32. This moved the student from the realist end of the NSKS scale to the instrumentalist end by the end of the semester.

Initially, 60% of the interview participants' (N=20) scores (ST 1-5, 7, 10-12, 14, 17, and 19) suggested they held naïve (realist) views that scientific knowledge is not a product of human imagination. However, by the end of the semester only 25% of the participants' NSKS scores (ST 6, 8, 12, 15, and 17) fell in the realist range. The initial NSKS scores of 15% of the interview participants (ST 6, 8, and 16) fell in the neutral range, suggesting they held a combination of naïve and expert beliefs concerning the role creativity plays in the nature of science. By the end of the semester 25% of the participants (ST 2-3, 7, 11, and 16) scored in the neutral range. The initial scores of five of the participants (ST 9, 13, 15, 18, and 20) suggested they held an appropriate (instrumentalist) view on the role that creativity plays in the nature of science. Ten participants (ST 1, 4-5, 9-10, 13-14, and 18-20) scored in the instrumentalist range by the end of the semester.

For the interview participants, however, the overall increase in post creative scores was above the average, with an average increase of 1.55 points (Table 56). The highest pre-post score changes within the interview participants, 6 points each, belonged to students 10 and 11. In addition, student 10 had the highest post score of 28. This moved the student from the neutral section of the NSKS scale to the instrumentalist end by the end of the semester.


Approximately 60% of the interview participants improved their score on the "creative" dimension, moving them into a higher range on the NSKS scale. This suggests that a portion of the participants are moving away from a realist view that science does not require creativity toward a more instrumentalist view.

Although some increases were observed quantitatively with ten of the twenty interview participants, the difference in their understandings is best reflected in the initial interview responses in Table 57. In order to query participants' understanding of the creative dimension of NOS, the initial interview question asked participants to respond to the following: "whether scientific theories and models are products of the human mind and may or may not accurately represent reality." This question assessed participants' understanding that scientific knowledge is created from human imagination and logical reasoning, that this creation is based on observations and inferences of the natural world and developed into scientific theories and models, and that scientific models and theories, as creations of the human mind, may or may not accurately represent reality.

Generally, several of the interview participants (ST 6, 8, 10, 12, 14, and 16-19) agreed in some part that theories and models are products of the human mind, may or may not model aspects of reality, and are needed to assist in understanding scientific knowledge. Other participants (ST 2-5, 7, 11, 15, and 20) agreed that theories and models are products of the human mind and come close to being copies of reality. Some participants (ST 1, 9, and 13) did not believe that scientific theories and models were products of human imagination, holding instead that they are based on facts and represent reality. Participants often credited theories and models solely to the accumulation of new observations or data and/or the development of new technologies. However, one participant (ST 12) considered change that results from reinterpretation of existing data from a different perspective.


Table 56 Descriptive NSKS Statistics – Creative Dimension

ID   Pre     Post    Difference
1    23.00   26.00   3.00***
2    23.00   24.00   1.00***
3    21.00   24.00   3.00***
4    23.00   25.00   1.00***
5    23.00   25.00   2.00***
6    24.00   22.00   -2.00*
7    21.00   24.00   3.00***
8    24.00   20.00   -4.00*
9    25.00   26.00   1.00***
10   22.00   28.00   6.00***
11   18.00   24.00   6.00***
12   19.00   22.00   3.00***
13   25.00   26.00   1.00***
14   22.00   25.00   3.00***
15   25.00   20.00   -5.00*
16   24.00   24.00   0.00*
17   21.00   23.00   2.00***
18   25.00   25.00   0.00*
19   22.00   25.00   3.00***
20   25.00   25.00   0.00*

* decrease or no change in score   ** gain of 0.96 points or less   *** gain of more than 0.96 points

When comparing participants' initial interview comments with their initial NSKS scores for the creative dimension of NOS, some of the participants' scores mirrored their reflections while others did not. For instance, participant 1 had an initial score in the realist range and reflected that range in her interview statements that theories and models are based on facts and are not products of the human mind. Participants 2-5, 7, and 11 all had initial scores in the realist range, but their interview comments suggested that theories and models are products of the human mind and come close to being copies of reality. Participant twelve had an initial NSKS score in the realist range; however, during the interview she suggested that theories and models are products of the human mind, may or may not model aspects of reality, and are needed to assist in understanding scientific knowledge.


Participant thirteen had an initial NSKS score that reflected an instrumentalist view, yet in her initial interview she held the belief that scientific theories and models accurately represented reality and were not products of the human mind. These discrepancies between NSKS scores and interview statements could be attributed to several factors, such as distraction during the administration of the NSKS resulting in incorrect bubbling of an answer choice, misinterpretation of the NSKS questions and/or answer selections, or the participants' personal experiences in the chemistry lecture and laboratory course during the semester.

Table 57 Participants' Interview Reflections – Creative (N=20)

Initial NOS Beliefs Interview Question 1: Creative
There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statement: Scientific theories and models are products of the human mind and may or may not accurately represent reality.

Quotation Comments

ST 1: "False, theories are based on facts. For instance theories have been tested and show consistent results. Therefore, a fact is something that is proven by testing."
ST 2: "I believe that the theories and models are based on some reality. The human mind questions and tries to figure out what happened. Scientists question what is proposed and try to disprove the theory. Sometimes this changes the way science presents an idea. It is a product of the mind, but was stimulated from reality."
ST 3: "I think that theories do originate from human minds. Someone has to discover and create theories. I do think that they can accurately represent reality."
ST 4: "I think the models are products of the human mind and are reflective of our best understanding of science. Therefore, they represent reality as accurately as can be reflected at the current time. Theories and models are subject to change as information and knowledge evolves."
ST 5: "Yes, models and theories are produced by the human mind. They represent some aspects but not all things are truly revealed. So scientists make the best guess as to how it applies."
ST 6: "They are products of the human mind. But, they help one understand the concepts. Theories might not accurately or perfectly describe the actual concept but it's the best replication one has to help in understanding the concept. For instance, when one views the atomic models and orbitals via diagrams. The diagrams may not reflect the actual atom, but it's the best thing that we have to represent it. That is our reality."
ST 7: "More or less, scientific theories begin as products of the human mind. However the ultimate goal of a theory is to become a fact and be able to represent reality."


ST 8: "I believe the statement is true. We cannot always replicate a scientific theory into a perfect model."
ST 9: "Theories have been proven, so they can apply to some aspects of reality."
ST 10: "I think it has to do with the human mind and the way that we interpret scientific knowledge. For the most part theories and models are based on observations and experiments performed by several scientists. When science is replicated by others then it becomes part of a theory. So, in that way it's not just a product of the human mind, it's only a product of the human mind in the way we interpret it."
ST 11: "Theories and models are accurate but they are also products of the human mind. Theories and models are created after someone conducts an experiment."
ST 12: "It all depends on how someone interprets the information. It may be accurate and it might not be accurate. If one scientist looks at scientific data from an experiment and a different scientist looks at the same data their own knowledge and opinions will be reflected in the theories that they make and the explanations they give. So theories and models may be products of the human mind and may or may not be accurate."
ST 13: "I think that scientific theories and models do accurately represent reality. They are based on evidence and not just made up from the human mind."
ST 14: "True, theories are produced by the human mind. However, there is plenty of room for error as it does not accurately represent reality."
ST 15: "Yes and no. One may never know for sure if theories and models are accurate or whether they represent reality."
ST 16: "Yes. For instance one scientist starts with a research concept and then others may research the same topic and add knowledge to support or not support it. It's developed in the human mind but it may somewhat accurately represent what we know. For example the atomic theory, we haven't totally disproved it."
ST 17: "True enough as theories and models are products of the human mind but based on physical evidence."
ST 18: "I agree. There are scientific theories from the 17th century that we look at and wonder what we were thinking at the time. However, it gave one a basis to prove if it was correct or incorrect. So, they might be accurate for the time until someone can prove that they were incorrect."
ST 19: "Yes they are products of the human mind. But when scientists make theories they are based on evidence-based reasoning and are generally accurate until proven false."
ST 20: "Although many scientific laws have eventually been proven many theories are yet to be proven. It is through the great imagination of brilliant minds that we have any scientific facts at all."

Developmental Dimension

In the current NOS literature the developmental dimension of scientific knowledge is viewed as operating on a continuum that ranges from viewing scientific knowledge as absolute, "set in stone," to viewing it as changing and dynamic. According to Rubba and Anderson (1978) scientific knowledge is never "proven" in the absolute and final sense. Scientific knowledge is limited by the justification process, rendering it as probable.


Scientific beliefs that appear to be true at one time may be assessed differently when additional evidence is available. Formerly accepted scientific beliefs should be judged in their historical context.

Within this dimension the overall participant (N=56) developmental pre-test mean was 23.62 while the post-test mean was 24.76 (Table 54), with 28 participants improving their score. The pre- and post-mean scores of the interviewed participants (N=20) were 24.00 and 24.70, respectively, with 7 participants improving their score. This was also a category that quantitatively showed an above average increase of 1.14 points when compared to the overall average increase of 0.96 (N=56). This above average increase occurred in 19 of the 56 participants' scores, and in 6 of the 20 interviewed participants' scores. The participant with the highest pre-post score change of 8 points as well as the highest post score was student 29 with a score of 31. This moved the participant from the realist end of the NSKS scale to the instrumentalist end by the end of the semester.

For the interview participants, however, the overall increase in post NSKS developmental scores (Table 58) was below average, with a 0.70-point average increase. The highest pre-post score change within the interview participants was participant 14 with a 5-point increase (20 to 25). This moved the participant from the neutral section of the NSKS scale into the instrumentalist range by the end of the semester. Participants 2 and 15 had the highest post scores, each with 27, remaining in the instrumentalist range (Figure 7). Approximately 45% of the interview participants improved their score on the "developmental" dimension. This gain suggests that some participants are moving toward the belief that scientific knowledge is not "set in stone," which represents a more instrumentalist point of view.

Initially, 35% of the interview participants' (N=20) scores (ST 3-7, 14, and 18) suggested they held the naïve (realist) view that scientific knowledge is "set in stone."


However, by the end of the semester only 15% of the participants' NSKS scores (ST 5, 13, and 18) fell in the realist range. The initial NSKS developmental scores of 20% of the interview participants (ST 12, 16, and 19-20) fell in the neutral range, suggesting they held a combination of naïve and expert beliefs concerning the tentativeness of scientific knowledge. By the end of the semester 30% of the participants (ST 3-4, 7, 17, and 19-20) scored in the neutral range. The initial scores of nine of the participants (ST 1-2, 8-11, 13, 15, and 17) suggested they held an appropriate (instrumentalist) view that scientific knowledge is tentative and evolving. Eleven participants (ST 1-2, 6, 8-12, and 14-16) scored in the instrumentalist range by the end of the semester.

Table 58 Descriptive NSKS Statistics – Developmental Dimension

ID   Pre     Post    Difference
1    26.00   26.00   0.00*
2    26.00   27.00   1.00***
3    22.00   24.00   2.00***
4    23.00   24.00   1.00***
5    22.00   22.00   0.00*
6    22.00   25.00   3.00***
7    23.00   24.00   1.00***
8    26.00   25.00   -1.00*
9    26.00   26.00   0.00*
10   25.00   25.00   0.00*
11   25.00   25.00   0.00*
12   24.00   26.00   2.00***
13   25.00   22.00   -3.00*
14   20.00   25.00   5.00***
15   25.00   27.00   2.00***
16   24.00   26.00   2.00***
17   25.00   24.00   -1.00*
18   23.00   23.00   0.00*
19   24.00   24.00   0.00*
20   24.00   24.00   0.00*

* decrease or no change in score   ** gain of 0.96 points or less   *** gain of more than 0.96 points

Although increases were observed quantitatively with some of the interview participants, the difference in their understandings is best reflected in the interview responses in Table 59.


In order to query participants' understanding of the developmental dimension of NOS in relation to the tentativeness of scientific knowledge, the initial interview question asked the participants to react to the following statement: "Scientific knowledge is a changing and evolving body of concepts and theories." This question assessed participants' understanding that scientific knowledge is subject to change with new observations and with the reinterpretations of existing observations.

The majority of interview participants (ST 1-2, 4-20) agreed at least in part that scientific knowledge is a changing and evolving body of concepts and theories. Only one participant (ST 3) felt that science was exact with set rules and laws, while allowing that it was possible, as new things were discovered, that scientific concepts could change. Participants often credited the changes in scientific knowledge to the accumulation of new observations or data and/or the development of new technologies. However, one participant (ST 6) considered change that results from reinterpretation of existing data from a different perspective.

When comparing participants' initial interview comments with their initial NSKS scores for the developmental dimension of NOS, some of the participants' scores mirrored their reflections while others did not. For instance, several participants (4-7, 14, and 18) had initial scores in the realist range and yet their interview statements suggested that scientific knowledge does evolve and change over time. Other participants (ST 12, 16, and 19-20) had initial scores in the neutral range, but their interview comments suggest they hold the belief that scientific knowledge does evolve and change over time. The remaining participants (1-2, 8-11, 13, 15, and 17) had initial scores in the instrumentalist range that correlated with their interview reflections that scientific knowledge does evolve and change over time. These discrepancies between NSKS scores and interview statements could be attributed to several factors, such as distraction during the administration of the NSKS resulting in incorrect bubbling of an answer choice, misinterpretation of the NSKS questions and/or answer selections, or the participants' personal experiences in the chemistry lecture and laboratory course during the semester.


Table 59 Participants' Interview Reflections – Developmental (N=20)

Initial NOS Beliefs Interview Question 2: Developmental
There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statement: Scientific knowledge is a changing and evolving body of concepts and theories.

Quotation Comments

ST 1: "I agree. Theories change when knowledge is advanced. At one point the world had no knowledge of evolution, or antibiotics and now they do through scientific research and developments."
ST 2: "I believe that is accurate. Scientific knowledge changes due to new technology. This new technology spawns new theories and new twists on old theories. Therefore scientific knowledge is always changing."
ST 3: "First science is often referred to as being very exact with set laws and rules, which I believe is true. However, I would also imagine that as new things are discovered different concepts may be introduced."
ST 4: "I definitely agree. I think scientific knowledge evolves. I don't feel that everything in this universe is understood or currently clear."
ST 5: "Yes, I believe that scientific knowledge is changing, but not so immediately like the next day. The change might be over a period of months to years."
ST 6: "I would say yes. Scientific knowledge is always changing and evolving. Scientists can develop new ways to think about old knowledge. From this develop different theories."
ST 7: "I believe that scientific knowledge is and will always be changing."
ST 8: "Yes I think scientific knowledge is evolving in the sense that more facts, concepts, and theories are added and discovered over time. In other words, some old concepts can be tested and proven false. For instance, when they believed that everything was made of the elements of earth, fire, water, and air. We now know this to be false because of new scientific knowledge and concepts."
ST 9: "Yes, newer theories/concepts are being discovered all the time. Because our world continues to evolve, therefore so does science."
ST 10: "I agree. We could discover something that would change our views about some entire body of knowledge as a whole. I think scientific knowledge will always evolve."
ST 11: "Yes, experiments show new findings when they are conducted and show new things which weren't known before."
ST 12: "Yes because new scientific knowledge can add to current theories. Scientists can develop new theories through their research."
ST 13: "I agree because it seems like scientific knowledge is changing all the time when scientists find new evidence and add to theories."


289 Table 59 (Continued) ST 14: “True. Changes in scientific knowledge are alway s occurring. Changes in concepts and theories occur when scientist develop better ex planations.” ST 15: “Yes, new discoveries are constantly being made.” ST 16: “Yes. Scientific knowledge has been evolving sin ce the beginning of time. With the development of new technologies scientific knowledg e has been evolving.” ST 17: “I agree completely. There are constantly new dev elopments changing what we know to be true.” ST 18: “Yes, for instance someone just discovered a new element. Everyday scientists are discovering new things.” ST 19: “Yes, science is changing everyday. However, the new things we learn are usually from things that have happened over a gradual perio d of time.” ST 20: “Yes there are always new discoveries. It is ever changing and ever evolving but there are still many scientific standards and bench marks that hold fast.” Parsimonious Dimension In the current NOS literature on the parsimonious d imension, evolving scientific knowledge is viewed as operating on a continuum tha t ranges from the view that scientific knowledge attempts to achieve simplicity of explanation as opposed to complexity. According to Rubba and Anderson (1978) scientific knowledge tends toward simplicity but not the disdain of complexity. Scie ntific knowledge is comprehensive as opposed to specific. There is a continuous effort to develop a minimum number of scientific concepts to explain the greatest number of possible observations. The ultimate goal of science is to develop an understanding of t he natural universe which is free of biases. Within this dimension the overall participant (N= 56) pre-test mean was 24.62 while the post-test mean was 26.32 (Table 54) with 36 participants improving their score. The preand post-mean scores of the interviewed pa rticipants (N=20) were 24.55 and 26.70, respectively with 10 participants improving their score. This was also a category that quantitatively showed an above average increas e of 1.70 points when compared to the overall average increase of 0.96 (N=56). This above average increase occurred in 33 of the 56 participants, and 15 of the 20 intervi ewed participants’ scores. The gain on


However, for the majority of interview participants (Table 60) the overall increase in post parsimonious scores was above the average of 1.70 points. The highest pre-post score change within the interview participants was participant 15 with an increase of 6 points. This moved the student from the realist section of the NSKS scale to the instrumentalist end by the end of the semester. In addition, participant 5 had the highest post score of 31. With approximately 45% of the participants improving their score on the "parsimonious" dimension, this suggests that some of the participants are moving toward the belief that scientific knowledge attempts to achieve simplicity of explanation rather than complexity.

Initially, 25% of the interview participants' (N=20) scores (ST 2, 4, 10, and 14-15) suggested they held the naïve (realist) view that scientific knowledge attempts to achieve complexity of explanation and that it is specific as opposed to comprehensive. However, by the end of the semester only 5% of the participants' NSKS scores (ST 14) fell in the realist range. The initial NSKS scores of 35% of the interview participants (ST 1, 3, 8, 13, and 16-18) fell in the neutral range, suggesting they held a combination of naïve and expert beliefs concerning the parsimonious nature of science. By the end of the semester 10% of the participants (ST 10 and 17) scored in the neutral range. The initial scores of eight participants (ST 5-7, 9, 11-12, and 19-20) suggested they held an appropriate (instrumentalist) view concerning the parsimonious nature of science. Seventeen participants (ST 1-9, 11-13, 15-16, and 18-20) scored in the instrumentalist range by the end of the semester.


Table 60 Descriptive NSKS Statistics – Parsimonious Dimension

ID   Pre     Post    Difference
1    24.00   27.00   3.00***
2    22.00   27.00   5.00***
3    24.00   27.00   3.00***
4    23.00   26.00   3.00***
5    26.00   31.00   5.00***
6    27.00   25.00   -2.00*
7    27.00   29.00   2.00***
8    24.00   26.00   2.00***
9    27.00   29.00   2.00***
10   21.00   24.00   3.00***
11   29.00   25.00   -4.00*
12   25.00   29.00   4.00***
13   24.00   25.00   1.00***
14   21.00   23.00   2.00***
15   23.00   29.00   6.00***
16   24.00   26.00   2.00***
17   24.00   24.00   0.00*
18   24.00   26.00   2.00***
19   25.00   29.00   4.00***
20   27.00   27.00   0.00*

* decrease or no change in score   ** gain of 0.96 points or less   *** gain of more than 0.96 points

Although increases were observed quantitatively with a majority of the interview participants, the difference in their understandings is best reflected in the interview responses in Table 61. In order to query participants' understanding of the parsimonious dimension of NOS in relation to the simplicity rather than complexity of scientific knowledge, the initial interview question asked the participants to react to the following statement: "The ultimate goal of science is to gather all the complex facts about natural phenomena." This question assessed participants' understanding that scientific knowledge tends toward simplicity, is comprehensive, and that there is an effort to develop a minimum number of concepts in order to develop an understanding of the natural world which is free of biases.


Several of the interview participants (ST 3, 5, 9-13, and 20) suggested that the ultimate goal of science was not to gather all the complex facts but to understand them and how they apply to the world. Other participants (ST 6-8 and 15-19) believed that the ultimate goal of science was to gather all the complex facts as well as understand how they applied to the world. The remaining participants (ST 1-2, 14, and 17) felt that the ultimate goal of science was to gather all the complex facts about natural phenomena. Participants often credited the goal of science solely to the exploration and understanding of the world and natural phenomena. However, one participant (ST 14) considered the gathering of complex facts, rather than theories, as the goal.

When comparing participants' initial interview comments with their initial NSKS scores for the parsimonious dimension of NOS, some of the participants' scores mirrored their reflections while others did not. For instance, participants 2, 4, and 14 had initial scores in the realist range and reflected that range in their interview statements that the ultimate goal of science was to gather all the complex facts about natural phenomena. Participant 15 also had an initial score in the realist range but suggested that the ultimate goal of science was to gather all the complex facts as well as understand how they applied to the world. Participant 10 had an initial NSKS score in the realist range; however, during the interview she suggested that the ultimate goal of science was not to gather all the complex facts but to understand them and how they apply to the world. Other participants (ST 5, 9, 11-12, and 20) had initial NSKS scores that reflected an instrumentalist view and reflected those views in their interview comments. These discrepancies between NSKS scores and interview statements could be attributed to several factors, such as distraction during the administration of the NSKS resulting in incorrect bubbling of an answer choice, misinterpretation of the NSKS questions and/or answer selections, or the participants' personal experiences in the chemistry lecture and laboratory course during the semester.


Table 61 Participants' Interview Reflections – Parsimonious (N=20)

Initial NOS Beliefs Interview Question 3: Parsimonious
There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statement: The ultimate goal of science is to gather all the complex facts about natural phenomena.

Quotation Comments

ST 1: "That's true. Except I think it's sort of a fruitless goal. We will never know everything. But, that's what we're aiming for."
ST 2: "I believe that is the ultimate goal in one fashion or another."
ST 3: "I do not know enough to absolutely know whether this is right or wrong. However, my feeling is that science does not only want to gather facts, but also analyze them and know what they mean. So gathering facts of natural phenomena is one thing, but also applying it to life is another."
ST 4: "I would say yes, that is a good description of scientific goals. Naturally occurring things can be tested with hands on experimental techniques."
ST 5: "No, I do believe that we need to gather all the natural forms."
ST 6: "To gather everything about the world we live in and try to figure things out. How everything works and how it's all interconnected and how it relates to each other."
ST 7: "I agree science is about understanding the world and its make-up."
ST 8: "True, and to explain these facts."
ST 9: "No. Science also involves other sources that you wouldn't find naturally."
ST 10: "I don't necessarily believe that. I think that science is to gather the facts about everything. The purpose of science is to gather knowledge about medicines, things for the future, different kinds of tools, and whatever we need to know. I don't think it necessarily is natural phenomena."
ST 11: "No, the goal of science is to keep gaining new knowledge. So the world in its evolution can keep going on."
ST 12: "No because science is not only used to figure out natural phenomena it is also conducted for everyday purposes like making medicine."
ST 13: "I think that the ultimate goal of science is to explore ideas and develop theories. Also to find out what is real and not real and how things work, and not just about natural phenomena."
ST 14: "True. Facts, I believe are more important than theories. Although science is full of theories there are plenty of facts to back up natural phenomena."
ST 15: "Yes, and to understand it."
ST 16: "Well, I don't think we'll ever gather all the facts about natural phenomena, but ultimately yes, I would say that's the goal."
ST 17: "Yes. I'm trying to think of a type of science that doesn't deal with that but even the biological sciences do, because living creatures are natural phenomena too."


294 Table 61 (Continued) ST 18: “Yes, I agree with that. I think the reason peop le study science is to understand why things are the way they are. Why the sky is bl ue is always a constant question. They want to figure out why things are the way they are. ” ST 19: “Yes, I would say this is the goal of science bec ause it’s to find how things work and you need to do this by consistently gathering facts .” ST 20: “I believe the ultimate goal of science is to und erstand the function and actions of the world we live in how and why everything occur s the way that it does. How can we live with it or use it to make things better.” Testable Dimension In the current NOS literature the testable dimensio n is viewed as operating on a continuum that ranges from the view that scientific knowledge needs not to be capable of experimental test as opposed to it is capable of empirical tests. According to Rubba and Anderson (1978) scientific knowledge is capable of public empirical tests. Scientific knowledge’s validity is established through repeate d testing against accepted observations. Consistency among results is require d, but not a sufficient condition for the validity of scientific knowledge. There is no one way to do science therefore there is no universal step-by-step scientific method. Within this dimension the overall participant (N =56) testable pre-test mean was 24.20 while the post-test mean was 24.98 (Table 54) with 13 participants improving their score. The preand post-mean scores of the intervi ewed participants (N=20) were 24.05 and 24.30, respectively with 10 participants improv ing their score. This was also a category that quantitatively showed a below average increase of 0.78 points when compared to the overall average increase of 0.96 (N =56). The participant (N=56) with the highest pre-post score change of 10 points as w ell as the highest post score was student 52 with a score of 34. This moved the stud ent from the neutral end of the NSKS scale to the instrumentalist end by the end of the semester.


Initially, 35% of the interview participants' (N=20) scores (ST 2, 5, 7, 13-14, and 17-18) suggested they held naïve (realist) views that scientific knowledge need not be capable of experimental test and that the scientific method does offer the real truth. However, by the end of the semester only 15% of the participants' NSKS scores (ST 4, 5, and 17) fell in the realist range. The initial NSKS scores of 15% of the interview participants (ST 4 and 15-16) fell in the neutral range, suggesting they held a combination of naïve and expert beliefs concerning the role that empirical evidence and the scientific method play in the nature of science. By the end of the semester 30% of the participants (ST 2-3, 12-13, 16, and 18) scored in the neutral range. The initial scores of ten of the participants (ST 1, 3, 6, 8-12, and 19-20) suggested they held an appropriate (instrumentalist) view on the role that empirical evidence and the scientific method play in the nature of science. Eleven participants (ST 1, 6-11, 14-15, and 19-20) scored in the instrumentalist range by the end of the semester.

However, for the majority of interview participants (Table 62) the overall increase in post testable scores was above the average increase of 0.79 points. The highest pre-post score change within the interview participants was 4 points (student 7), and student 7 also shared the highest post score of 26. This moved the student from the realist section of the NSKS scale to the instrumentalist range by the end of the semester. Approximately 20% of the participants improved their score on the "testable" dimension. This suggests that a small portion of the participants are moving away from a realist view that scientific knowledge need not be capable of empirical tests to a more instrumentalist view that scientific knowledge is capable of empirical tests.

Although some increases were observed quantitatively with ten of the twenty interview participants, the difference in their understandings is best reflected in the initial interview responses in Table 63.


In order to query participants' understanding of the testable dimension of NOS, the initial interview question asked participants to respond to the following: "The scientific method will eventually let people learn the real truth about the natural world and how it works." This question assessed participants' understanding that scientific knowledge is based on and/or derived from observations of the natural world, that there is no universal step-by-step scientific method, and that science cannot answer all questions.

Table 62 Descriptive NSKS Statistics – Testable Dimension

ID   Pre     Post    Difference
1    25.00   25.00   0.00*
2    23.00   24.00   1.00***
3    27.00   24.00   -3.00*
4    24.00   23.00   -1.00*
5    21.00   22.00   1.00***
6    25.00   25.00   0.00*
7    22.00   26.00   4.00***
8    26.00   25.00   -1.00*
9    26.00   25.00   -1.00*
10   27.00   26.00   -1.00*
11   25.00   26.00   1.00***
12   26.00   24.00   -2.00*
13   22.00   24.00   2.00***
14   22.00   25.00   3.00***
15   24.00   25.00   1.00***
16   24.00   24.00   0.00*
17   19.00   22.00   3.00***
18   23.00   24.00   1.00***
19   25.00   26.00   1.00***
20   25.00   25.00   0.00*

* decrease or no change in score   ** gain of 0.96 points or less   *** gain of more than 0.96 points

Several of the interview participants (ST 1, 3, 6, 10-11, 14, 16, and 20) agreed in some part that scientific knowledge is based on and/or derived from observations of the natural world and that science cannot answer all questions. However, none of the aforementioned participants reflected on the aspects of the scientific method. Only two participants (ST 5 and 7) mentioned that the scientific method was a tool and could not give us all the scientific knowledge about the world.


Other participants' (ST 8-9, 12, 15, and 17-19) interview statements reflected a realist view, with the belief that the scientific method will eventually let people learn the real truth about the natural world and how it works. Some participants credited the scientific method with all the current scientific knowledge.

When comparing participants' initial interview comments with their initial NSKS scores for the testable dimension of NOS, few of the participants' scores mirrored their reflections. For instance, several participants (2, 17, and 18) had initial scores in the realist range and reflected that range in their interview statements that the scientific method will eventually let people learn the real truth about the natural world and how it works. Participants 4 and 16 had initial scores in the neutral range, which were reflected in their interview comments suggesting that some things will not be made clear by the scientific method while others will. Other participants (1, 3, 6, 10-11, and 20) all had initial NSKS scores in the instrumentalist range; however, their interview statements reflected a combination of beliefs concerning the testable dimension, including that we will never know the real truth about everything and that the scientific method allows for advances in scientific knowledge. Participants 5 and 7 had initial scores reflecting realist views; however, their interview statements suggested that the scientific method is just a tool and does not give us all the scientific knowledge about the world. The initial NSKS scores of four participants (8-9, 12, and 19) reflected an instrumentalist view; however, their interview reflections suggested they believed that the scientific method will eventually let people learn the real truth about the natural world and how it works. These discrepancies between NSKS scores and interview statements could be attributed to several factors, such as distraction during the administration of the NSKS resulting in incorrect bubbling of an answer choice, misinterpretation of the NSKS questions and/or answer selections, or the participants' personal experiences in the chemistry lecture and laboratory course during the semester.


Table 63 Participants' Interview Reflections – Testable (N=20)

Initial NOS Beliefs Interview Question 4: Testable
There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statement: The scientific method will eventually let people learn the real truth about the natural world and how it works.

Quotation Comments

ST 1: "I don't think there will ever be a time where we know absolutely everything about the world and how it works. Although we make advances in our knowledge of the world all the time using the scientific method the world is always changing."
ST 2: "I believe eventually through persistence humans will be able to figure out how the natural [world] works in scientific terms. However, I don't know if the world will be ready to accept what science will offer."
ST 3: "I don't have enough personal experience yet to make an absolute choice. Scientific method is about obtaining scientific results, but I am not sure whether it will let us learn about the real truth of the world."
ST 4: "Depends on your definition of 'eventually'. I think that some things won't be made totally clear by scientific method any time soon. However some things could become clearer in the very near future."
ST 5: "No, the scientific method is a nice tool, but can not give us all the knowledge."
ST 6: "It might and it might not. We keep learning more and it definitely helps. It starts us on the right track for questioning it and finding out as much as we can."
ST 7: "I believe that if the knowledge is out there we may be able to acquire it. However some things may never be discovered using the scientific method like the big bang theory."
ST 8: "Yes. Through observation and use of the scientific method one can learn truths of the natural world."
ST 9: "Yes, because that's what all science is based on."
ST 10: "Well, despite how much the scientific method is used to support science, I still think that because much of science is based on theory that it won't necessarily be the real truth."
ST 11: "There is no real truth. No one really knows how the world works as new things are discovered everyday. However we do gain new knowledge [of] the world with the scientific method."
ST 12: "Yes because it is how we have learned what we know so far. Therefore unless a more advanced method of thinking is established then the scientific method offers a perspective on how it works."


299 Table 63 (Continued) ST 13: “I think that the scientific method helps one expla in and understand more about how the world works and why.” ST 14: “No, the scientific method may try to teach peopl e the truth about the natural world, but other factors may stand in the way.” ST 15: “Yes if they are willing to learn the real truth. ” ST 16: “I wouldn’t say it’s necessarily the real truth b ecause again, it’s about theory. The scientific method helps us learn a lot about the na tural world and how it works, but not the complete real truth. For example there’s many spec ies we haven’t discovered in the ocean.” ST 17: “I agree. I think the only opposing argument is i n the world of theology but you can't really argue it once they have all the facts.” ST 18: “Yes, I agree with this because the scientific me thod is way of analyzing situations. If everyone follows the method then the data will b e consistent. Each scientist might look at the data differently, but they will all have used t he same standard.” ST 19: “Yes the scientific method could possibly tell us the truth about the world and how it works. It is a step by step way of proving how som ething works.” ST 20: “The real truth is probably far too difficult for most people to understand. But most people can have a basic understanding of how the na tural world works. I am not sure what my definition of the real truth is.” ST 20: “The real truth is probably far too difficult for most people to understand. But most people can have a basic understanding of how the na tural world works. I am not sure what my definition of the real truth is.” Final NOS Interviews During the final interview, participants were prese nted with an ill-structured scenario problem from King and Kitchener (1994). T he reflective judgment scenario problem (Appendix F) incorporates some aspects of t he four NSKS dimensions from the initial interview with the focus being on the devel opmental (tentativeness of scientific knowledge) and testable (empirical basis) dimension s. The following NOS characteristics served as a basis of comparison during the analysis of the post NSKS scores and final NOS inte rview: (1) Scientific knowledge is tentative since it is subject to change with new ob servations and with the reinterpretations of existing observations; (2) Sci entific knowledge is empirically based because it is based on and/or derived from observat ions; (3) Scientific knowledge is subjective due to prior experiences and beliefs of scientists. Scientific knowledge is theory-laden as interpretations of data are filtere d through existing theories; and (4)


The overall average score on the NSKS at the beginning of the semester course for all participants (N=56) was 142.482, indicating most participants' NOS beliefs fell in the unaccepted NOS views range. By the end of the semester, the overall average score for all the participants was 148.375, indicating a slight shift from non-accepted (realist) views to a blend of neutral and instrumentalist views of NOS.

The interviewed participants' overall average score (N=20) on the NSKS at the beginning of the semester course was 141.650, indicating the group as a whole held nearly neutral NOS beliefs. Initially, 70% of the interview participants' (N=20) NSKS scores (ST 1, 3-4, 7, 10-12, and 14-20) suggested they held naïve (realist) NOS views. However, by the end of the semester 85% of the participants' NSKS scores (ST 1-12, 15-16, and 18-20) fell at the beginning of the instrumentalist range. By the end of the semester, the overall average score for all the interviewed participants was 148.900, placing them at the edge of neutral and instrumentalist views of NOS. For the majority of interview participants (ST 1-5, 7, 9-12, 14, 17, and 19-20) the overall increase in post NSKS scores was above the average increase of 5.9 points (Table 55). The highest score was 155, earned by two participants (ST 1 and 9), indicating acceptance of NOS views, and the lowest score was 142, scored by two participants (14 and 17) in the realist range. Again it is worth noting that for the post-assessment overall score, 17 of 20 students scored in the range of acceptance of NOS views, with one participant (13) scoring in the neutral range and the remaining two scoring in the realist range. Therefore the majority of the participants scored in the acceptance of NOS views range by the end of the semester (Table 55).
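The post-score breakdown quoted above (17 participants on the accepted side, one neutral, and two realist) can be reproduced directly from the post scores in Table 55. The short tally below is an illustrative check only; it assumes, as Figure 7 suggests, that totals above the 144 neutral anchor count as accepted (instrumentalist) NOS views and totals below it as realist views, and the variable names are hypothetical.

    # Illustrative check of the breakdown reported above; post scores copied from Table 55.
    # Assumption: 144 is the neutral anchor, higher totals count as instrumentalist and
    # lower totals as realist (the study's exact boundary rules are not restated here).
    from collections import Counter

    post_scores = [155, 153, 148, 149, 151, 151, 152, 145, 155, 153,
                   149, 150, 144, 142, 145, 148, 142, 148, 152, 146]

    def side_of_scale(score: int) -> str:
        if score > 144:
            return "instrumentalist (accepted NOS views)"
        if score < 144:
            return "realist"
        return "neutral"

    print(Counter(side_of_scale(s) for s in post_scores))
    # Counter({'instrumentalist (accepted NOS views)': 17, 'realist': 2, 'neutral': 1})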


Although increases were observed quantitatively with seventeen of the twenty interview participants, the difference in their understandings is best reflected in the final interview responses in Table 64. In order to query participants' understanding of NOS, the final interview asked participants to respond to the following scenario: "Some scientists believe that explanations of chemical phenomena, such as atomic theory, are accurate and true descriptions of atomic structure. Other scientists say that we cannot know whether or not these theories are accurate and true, but that scientists can only use such theories as working models to explain what is observed." This scenario problem probes participants' understanding that scientific knowledge is tentative and has an empirical basis, the role a scientist's subjectivity and creativity plays, and the sense in which theories and models are inferred explanations that may or may not represent reality.

Scientific knowledge, while reliable and durable, is never absolute or certain. This knowledge, including facts, theories, and laws, is subject to change. Several of the interview participants (ST 4, 6-7, 10-12, 14-16, and 18-20) illustrated their belief in the tentativeness of scientific knowledge in their final interview comments (Table 64). The participants reported that scientific knowledge changes because of new observations or evidence and that there were many questions still unanswered. The remaining participants did not mention the tentativeness of scientific knowledge in their responses.

Science's necessary reliance on empirical evidence is what distinguishes it as a way of knowing from other disciplines. Science is at least partially based on observations. In relation to the empirical basis of NOS, 50% of the participants (4, 6, 9-12, 14, 16, and 19-20) identified scientific knowledge such as theories as being derived from observations or evidence. The remaining participants did not directly mention evidence or observations in their reflections.


According to Lederman et al. (2002), scientific knowledge is theory-laden. Scientists' theoretical commitments, beliefs, prior knowledge, training, experiences, and expectations actually influence their work. Science is influenced and driven by currently accepted scientific theories. In the final interview, several participants (ST 1-2, 6, 12, 16, and 18-20) suggested that science is theory-laden and that a scientist's beliefs play a role in science. Participants one and sixteen mentioned that scientists can and do disagree and that neither side is necessarily correct or incorrect. Participant three felt conflicted over scientists' beliefs about theories. She tended to support the view that scientists believe that explanations of chemical phenomena are accurate and true descriptions. The remaining participants mentioned theories but not the influence that scientists have on scientific theories.

Theories are inferred explanations for observable phenomena. Scientific theories are often based on a set of assumptions or axioms. Theories serve to explain large sets of seemingly unrelated observations. Scientific models are created to describe aspects of a theory and are useful in giving predictions and explanations. Scientific models are based on available data and are not copies of reality. The final interview statements of 70% of the participants (ST 4, 6-12, 14-17, and 19-20) agreed with the second statement, that scientists can only use such theories as working models to explain what is observed. However, one participant (ST 13) described theories as being accurate and true.

When comparing participants' final interview comments with their final NSKS scores, few of the participants' scores mirrored their reflections. For instance, two participants (14 and 17) had final NSKS scores in the realist range but reflected instrumentalist views in their interview statements that theories are working models and scientific knowledge is tentative. Participant thirteen had a final NSKS score in the neutral range but reflected a realist view in her interview comments, suggesting that theories are accurate and true.


Other participants (3-4, 8, 11, 15-16, 18, and 20) all had final NSKS scores at the low end of the instrumentalist range. The majority of these participants (4, 8, 11, 15-16, and 20) reflected a moderate to higher level of thinking and tended to agree with the second statement, that we cannot know whether or not these theories are accurate and true but that scientists can only use such theories as working models to explain what is observed. As stated earlier, participant three was conflicted but supported the first part of the statement, that theories are accurate and true descriptions of scientific knowledge. The remaining participants (1-2, 5-7, 9-10, 12, and 19) had final NSKS scores reflecting a higher instrumentalist view. However, several participants (1-2, and 5) held more moderate views of NOS in their interview statements, while the remaining participants (7, 9-10, 12, and 19) reflected a higher level of acceptance of NOS views.

Table 64 Final Interviews – Nature of Science (N=20)

Final NOS Interview Question
Some scientists believe that explanations of chemical phenomena, such as atomic theory, are accurate and true descriptions of atomic structure. Other scientists say that we cannot know whether or not these theories are accurate and true, but that scientists can only use such theories as working models to explain what is observed. What do you think about this statement? How did you come to hold that point of view or answer? On what do you base that point of view or answer?

Quotation Comments

ST 1: "I think the statement shows that scientists can disagree and neither one of them are necessarily incorrect. The goal of science is to learn more about scientific knowledge. So, I think that both of these scientists are correct in believing what they believe until somehow it's disproven."
ST 2: "I really think that we use theories to help explain current scientific knowledge. If the theory is disproven we need to be able to go back to the beginning of the theory and reevaluate. We will never really know until we prove or disprove it."
ST 3: "I think it is conflicting what scientists think about theories. I think theories are not a law. It seems a theory offers a little more room to maneuver. It would be difficult for me to say one is correct and one is not correct. I would need to know more about each person's case. I do agree more with the first scientist."


Table 64 (Continued)

ST 4: "I agree with the second statement that theories can be used as working models. This is because our understanding of how the universe works is still evolving. There are many questions that are still unanswered and scientific knowledge is always changing."

ST 5: "Both statements could be true. However, we don't have all the scientific knowledge yet. Exact truth in science is not fully formed."

ST 6: "I lean more toward using theories as working models. As some scientists say we don't know if they are accurate and true but the models represent to the best of our ability what we consider to be true from what we observed. Scientific knowledge is changing. We don't know if we'll ever absolutely understand everything and all the variables. There are still unknown information that may result in changes in theories."

ST 7: "I agree that theories are used as models. Not all scientific knowledge can be proved or disproved. One must have some kind of support/evidence."

ST 8: "I agree with the second statement that theories can be used as working models of what could be. I think a theory could be accepted as some part of the truth but it is still a theory and not completely a fact. Example of fact? I am 5 foot 7. Example of theory? Evolution."

ST 9: "I can see the truth in both statements. I would probably be grouped in the category where you can't know whether or not theories are accurate and true, but that you use the theories as working models to explain what is observed. However, some theories have been around for a long time and have not been disproven."

ST 10: "I think many concepts in science involve the use of a theory with a working model to explain the theories and/or what we've observed. This is because scientific knowledge is always changing. Things that were considered factual may now be considered completely or partially incorrect."

ST 11: "I agree with the second statement. We can't know whether theories are true or false. Science is changing all the time. There is always room for error."

ST 12: "Theories are considered by some to be true until someone disproves them. The second statement suggests that one can explain what is observed by using a model. For instance Lewis-Dot structures show the electrons are organized in a certain way. So one can use the model but eventually there might be evidence that contradicts the theory so one would need a new model. So theories aren't set, they can always change if somebody discovers some new evidence."

ST 13: "I think that theories are accurate and true. This we know because of evidence."

ST 14: "Models explain what's being observed from theories. But, I do think that there are things that can be changed in the models as science is not set in stone. I understand that through experiments there's repetition and that's what supports theories."

ST 15: "Theories should be used as working models. The scientific process is not set in stone."

ST 16: "I'm sure there are some scientists that believe that theories are accurate and true. Others say that they're not necessarily accurate and true because you can't exactly prove it. I would say that I agree with the scientists that believe that theories are working models. One cannot know whether the theories are accurate and true. For instance we can't see the atom itself. We can't say the theory is set in stone. It's just a theory to explain something we presently believe based on some evidence."

ST 17: "Theories are not necessarily accurate or true. I believe they are more of an accepted working model."

ST 18: "Well they might be accurate for the time, but they're set in stone. For instance 100 years from now scientists could replace the current knowledge of the atomic theory with new knowledge. Therefore it's not like it'll always be accurate. For now it is."


Table 64 (Continued)

ST 19: "Well, theories are based off evidence. So theories can be accurate and true, but they can also be proven wrong if new evidence is discovered. Scientists would use the theories as working models to explain what is observed. Even if they don't believe it's true or accurate they can still use it to disprove the theory."

ST 20: "I think the second statement is more accurate. Over time they have added to the atomic theory as new structures have been discovered. They've done further tests and discovered that things were different than they thought them to be. So, I think models depict that uncertainty."

Discussion

Changing NOS Beliefs

RQ1. What range of NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

The overall average score on the NSKS at the beginning of the semester course for all participants (N=56) was 142.482, indicating that most participants' NOS beliefs lay in the non-accepted range. Among them, the highest score was 158, indicating acceptance of NOS views, and the lowest score was 122, suggesting non-acceptance of NOS views. For the pre-assessment overall scores, 13 of 56 students scored above 147, indicating an acceptance of NOS views, while 20 of 56 participants scored below 141, indicating initial non-acceptance of NOS views. The majority of participants scored from 141 to 147, considered the neutral range, indicating that they held a mixture of accepted and non-accepted NOS views.

The interviewed participants' overall average score on the NSKS at the beginning of the semester course was 141.650, indicating that most of these participants held neutral NOS beliefs. Among them, the highest score was 149, indicating acceptance of NOS views, and the lowest score was 132, suggesting non-acceptance of NOS views.


For the pre-assessment overall scores, only 1 of the 20 interviewed participants scored above 147, indicating an acceptance of NOS views, while 8 of 20 participants scored below 141, indicating an initial non-acceptance of NOS views. The majority of the interviewed participants (11) scored from 141 to 147, considered the neutral range, indicating that they held a mixture of accepted and non-accepted NOS views.

In general, the initial findings indicate that the participants of the study did not possess an adequate understanding of NOS at the beginning of the semester. Various studies since the 1960s have concluded that misconceptions concerning the NOS among students are common (Moss, 2001; Brickhouse, et al., 2000; Walker, et al., 2000; Griffiths & Barry, 1993; Mackay, 1971; Colley & Klopfer, 1963).

In the current literature on NOS, the creativity dimension is viewed as operating on a continuum that ranges from viewing scientific knowledge as a totally lifeless, rational, and orderly activity to viewing it as an endeavor that requires human imagination and creativity through the invention of explanations based on observations. In addition, this dimension considers whether scientific models and theories are a product of the human imagination and whether they accurately represent reality. The initial NSKS scores of the participants (N=56) showed 26.8% of the participants beginning the semester with instrumentalist views of the role creativity plays in the nature of science, while 60.0% held realist views. Only 25% of the interview participants (N=20) initially scored in the instrumentalist range for this NOS dimension. In the initial interviews, 20% of the participants believed that scientific models and theories are products of the human imagination and may or may not represent reality.

Generally, students possess misconceptions about the role creativity plays in obtaining scientific knowledge. Studies show that, in general, students do not believe that scientific knowledge is a product of human imagination (Lederman & Abd-El-Khalick, 2000; Lederman, 1999).


Lederman's (1999) study concluded that the 10th-grade students believed that creativity and imagination played a limited role in the development of scientific knowledge. Lederman and Abd-El-Khalick's (2000) study found that 70% of the college students did not refer to creativity, imagination, models, or theories in their explanations. Walker, et al., (2000) reported that students in their senior year of college perceived science as a rote and clinical process.

In the current literature on NOS, the developmental dimension is viewed as operating on a continuum that ranges from viewing scientific knowledge as absolute, "set in stone," to viewing it as changing and dynamic. The initial NSKS developmental scores of the participants (N=56) showed 37.5% of the participants beginning the semester with instrumentalist views about the role development plays in the nature of science, while 35.0% held realist views. Nine of the interview participants' (N=20) initial NSKS development scores fell in the instrumentalist range. In the initial interviews, 95% of the participants believed that scientific knowledge changes and evolves over time.

Some students hold misconceptions pertaining to the developmental nature of science. Studies have shown that a portion of students hold the misconception that the truth of scientific knowledge is beyond doubt and does not change over time (Walker, et al., 2000; Meichtry, 1993). However, other studies have shown that students believed that scientific knowledge is tentative (Moss, 2001; Lederman, 1986).

In the current literature on NOS, the parsimonious dimension of science is viewed as operating on a continuum that ranges from the view that scientific knowledge attempts to achieve simplicity of explanation to the view that it tends toward complexity. The initial NSKS scores of the participants (N=56) showed 44.6% of the participants beginning the semester with instrumentalist views, while 32.1% held realist views concerning the parsimonious nature of NOS. Eight of the interview participants' (N=20) initial NSKS parsimonious scores fell in the instrumentalist range.


In the initial interviews, 40% of the participants believed in some part that the ultimate goal of science is not to gather all the complex facts but to understand them and how they apply to the world.

Studies suggest that students believe that scientific knowledge is specific rather than comprehensive (Lederman, 1986; Rubba & Anderson, 1978; Mackay, 1971). Another study suggested that students believe scientists follow the scientific method (Lederman & Abd-El-Khalick, 2000).

In the current literature on NOS, the testable dimension is viewed as operating on a continuum that ranges from the view that scientific knowledge need not be capable of experimental test to the view that it is capable of empirical tests. In addition, there is no one way to do science; therefore, there is no universal step-by-step scientific method. The initial NSKS scores of the participants (N=56) showed 48.2% of the participants beginning the semester with instrumentalist views, while 28.6% held realist views concerning the testable nature of NOS. Ten of the interview participants' (N=20) initial NSKS scores fell in the highly sophisticated level for this dimension. In the initial interviews, 30% of the participants believed that the scientific method is just a tool and that it does not give us all the scientific knowledge about the world.

According to McComas and Olson (1998), scientists require replicability and truthful reporting. A large majority of students in a study performed by Lederman and Abd-El-Khalick (2000) demonstrated inadequate views of the empirical NOS.

According to Sandoval (2003), there are broadly consistent findings from NOS studies. Most learners appear to believe that scientific knowledge is an accumulation of facts about the world, rather than explanations about the world created by scientists. Learners seem to believe that the ideas that scientists generate and test are descriptions of the actual world.


They tend to see experimentation as a straightforward process of proving ideas right or wrong, and to assume that experiments yield answers to questions directly. A majority of learners have a hierarchical view of the relationship between hypotheses, theories, and laws based upon their degree of certainty rather than their scope and purpose. In other words, learners view hypotheses as guesses, theories as well-tested hypotheses, and laws as indisputably proven theories. Learners seldom see scientists as creative, except in the limited sense of needing to be clever to devise experiments. They do not recognize that scientists use their imaginations to generate theoretical ideas. In addition, learners tend to view historical scientific knowledge as uniformly wrong and current scientific knowledge as right, rather than viewing scientific knowledge developmentally.

RQ1a. Do students' NOS beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?

By the end of the semester, the overall average score for all the participants (N=56) was 148.375, indicating a slight shift from non-accepted views to neutral views of NOS. The highest score was 169, indicating an acceptance of NOS views, and the lowest score was 118, in the range of non-acceptance of NOS views. It is worth noting that, for the post-assessment overall score, 16 of 56 students scored in the neutral range of NOS views, while 5 participants' scores remained in the non-accepted range. The majority of the participants (35) scored in the accepted range of NOS views. The results also indicate that participants' NSKS post-assessment scores ranged from 118 to 169. This suggests that NOS beliefs can improve, even if only minimally, over the course of a semester. The possible impact that instruction may have had on the changes is discussed in chapter seven.

By the end of the semester, the overall average score for all the interviewed participants (N=20) was 148.900.


The highest score was 155, earned by 2 participants, indicating acceptance of NOS views, and the lowest score was 142, also earned by 2 participants, in the realist-neutral range of NOS views. Again it is worth noting that, for the post-assessment overall score, 13 of 20 students scored in the range of acceptance of NOS views, with the remaining 7 scoring in the neutral range. Therefore the majority of the interviewed participants scored in the acceptance range of NOS views by the end of the semester. Once again, this suggests that NOS beliefs can improve, even if only minimally, over the course of a semester. The possible impact that instruction may have had on the changes is discussed in chapter seven.

In a longitudinal study performed by Ryder, et al., (1999), undergraduate science majors were found to change their overall NOS beliefs. Students showed development in their ideas about the relationship between data and knowledge claims, the lines of scientific inquiry, and science as a social activity. Another longitudinal study, performed by Moss, et al., (2001) with pre-college students, examined their understanding of the nature of science at the beginning and the end of the academic year. Only minimal changes were noted by the end of the study.

Lederman et al., (1997) state that the important question concerning an individual's understanding of NOS should center on the limits of one's understandings. The current study highlighted the limits of participants' understandings via the descriptions and dialogue presented in the previous sections. Portions of the reflective passages from the interviews were presented on the basis of the model of NOS using the NSKS dimensions. A goal of this study was to communicate, often in participants' own voices, key comments representative of their NOS beliefs.

In this study, the creativity of scientific knowledge is described as being created from the human mind and logical reasoning. This creation is based on observations and inferences of the natural world.


The final mean score (N=56) for the overall understanding of the creative dimension was 23.67, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 30.4% of the participants (N=56) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. The final mean score (N=20) for the overall understanding of the creative dimension was 24.10, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 50% of the participants (N=20) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 75% of the interview participants (N=20) reflected positive beliefs concerning the role creativity plays in NOS. Therefore, by the end of the study, participants in both groups showed improvement in their creativity NOS views.

The creative and imaginative nature of scientific knowledge is explained by Lederman, et al., (2002) as being empirical. The development of scientific knowledge involves making observations. In addition, generating scientific knowledge involves human imagination and creativity. It involves the invention of explanations and theoretical objects. These scientific objects are functional theoretical models rather than copies of reality. By the end of a study by Khishfe and Lederman (2006), only 5% of the study population still demonstrated naïve views concerning the role creativity plays in NOS. Some of the participants in this study acknowledged a role of creativity in the form of human imagination, and some made connections between creativity, inference, and subjectivity. According to Ziman (1995), pattern recognition is linked to subjectivity and is a mainstay of all scientific knowledge and practice.

Historically, and by definition in this study, the developmental dimension views scientific knowledge as uncertain and always changing.


With regard to the developmental nature of science, it was found that the participants did possess a better understanding of the tentativeness of scientific knowledge by the end of the semester. The final mean score (N=56) for the overall understanding of the developmental dimension was 24.77, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 39.3% of the participants (N=56) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 57.1% of the participants (N=56) reflected instrumentalist beliefs concerning the role development plays in NOS. The final mean score (N=20) for the overall understanding of the developmental dimension was 24.70, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 35% of the participants (N=20) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 45% of the interview participants (N=20) reflected instrumentalist beliefs concerning the role development plays in NOS. Therefore, by the end of the study, participants in both groups showed improvement in their developmental NOS views.

Scientific knowledge is both tentative and durable. Having confidence in scientific knowledge is reasonable, while realizing that such knowledge may be abandoned or modified in light of new evidence or reconceptualization of prior evidence and knowledge. The history of science reveals both evolutionary and revolutionary changes. A moderate percentage of the participants in this study understood that scientific knowledge is subject to review and change and that today's scientific laws, theories, and concepts may have to be changed in the face of new evidence.


By the end of a study by Khishfe and Lederman (2006), only 5% of the study population still demonstrated naïve views of the tentativeness of scientific knowledge. In a study by Brickhouse et al., (2000), approximately 47% of the college students interviewed believed that theories do not change, while 90% of the participants in Abd-El-Khalick and Lederman's (2000) study did not seem to believe that scientific knowledge is tentative. In addition, Walker et al. (2000) stated that some high school and college students in their study thought that science theory is static. However, the students in both Lederman's (1986) and Moss, et al., (2001) studies believed that scientific knowledge is tentative.

By definition in this study, the parsimonious dimension views scientific knowledge as comprehensive as opposed to specific, and as tending toward simplicity. With regard to the parsimonious nature of science, it was found that some participants did possess a better understanding of the parsimonious nature of scientific knowledge by the end of the semester. The data show that the final mean score for the overall understanding of the parsimonious aspect of science was the highest among the dimensions. The final mean score (N=56) for the overall understanding of the parsimonious dimension was 26.32, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 46.4% of the participants (N=56) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 80.4% of the participants (N=56) reflected instrumentalist beliefs concerning the role parsimony plays in NOS. The final mean score (N=20) for the overall understanding of the parsimonious dimension was 26.70, and a wide range of levels of understanding was exhibited for this dimension.


From the data it is clear that although 50% of the participants (N=20) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 80% of the interview participants (N=20) reflected instrumentalist beliefs concerning the role parsimony plays in NOS. Therefore, by the end of the study, approximately 80% of the participants in both groups exhibited informed parsimonious NOS views. Some of the participants disagreed that there is a continuous effort in science to develop a minimum number of laws and concepts to explain the greatest possible number of observations. Furthermore, only a minority of the participants knew that scientific knowledge is comprehensive as opposed to specific; more supported the belief that scientific knowledge is specific as opposed to comprehensive. The poor performance of the participants in the parsimonious dimension corresponds with the results obtained from Lederman's (1986) studies, where Grade 10 students were found to hold misconceptions on the parsimonious subscale.

By definition in this study, the testable dimension views scientific knowledge as empirical and based on observations made using the senses and tools/instruments with a variety of methodologies. With regard to the testable nature of science, it was found that some participants did possess a better understanding of the testable aspects of scientific knowledge by the end of the semester. The final mean score (N=56) for the overall understanding of the testable dimension was 24.98, and a wide range of levels of understanding was exhibited for this dimension. From the data it is clear that although 25% of the participants (N=56) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 55.4% of the participants (N=56) reflected instrumentalist beliefs concerning the role testability plays in NOS. The final mean score (N=20) for the overall understanding of the testable dimension was 24.30, and a wide range of levels of understanding was exhibited for this dimension.


From the data it is clear that although 25% of the participants (N=20) experienced an increase in overall belief range for this dimension of the NSKS, the changes were not complete enough to improve the NOS beliefs of all participants. By the end of the semester, 55% of the interview participants (N=20) reflected instrumentalist beliefs concerning the role testability plays in NOS. Therefore, by the end of the study, approximately 50% of the participants in both groups exhibited improvement in their testable NOS views.

Scientists conduct investigations for a variety of reasons. Different types of questions propose different types of scientific investigations. Different scientific fields utilize different methods, central theories, and standards to advance scientific knowledge and understanding. There is no single universal step-by-step scientific method that all scientists follow. Scientists investigate research questions using their prior knowledge, persistence, and creativity. Scientific knowledge is gained in a range of ways, including analysis, observation, theory, journal research of prior investigations, and experimentation (McComas, et al., 1998). By the end of a study by Khishfe and Lederman (2006), half of the study population had improved their empirical NOS views.

The final NOS interviews revealed that some of the participants still held several misconceptions pertaining to various aspects of NOS while others improved. Overall, by the end of this study, some participants acknowledged that scientific knowledge is subject to change, recognized that scientific knowledge involves human imagination, that there is no universal scientific method, that scientific knowledge has an empirical basis, that there are areas of scientific knowledge that are more certain than others, and that models of theories do not necessarily represent reality. The findings suggest the need to foster a better understanding of NOS.


Summary

In summary, the overall findings of the study (N=56) in answering research question 1, sub-question a (Do students' NOS beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?), were as follows:

1. There was a noticeable increase in posttest scores, with a statistically significant large effect size of 1.00.
2. The mean gain scores are lowest for the amoral dimension and highest for the parsimonious dimension.
3. The mean gain score for the overall score increased by 5.89 points, moving from a realist view toward an instrumentalist view of NOS.
4. The mean gain scores for five of the NSKS dimensions and for the overall score are significant at p ≤ 0.05.
5. The mean gain score for the amoral dimension is not significant at p ≤ 0.05.

In summary, the findings related to the interview participants of the study (N=20) in answering the same research question were as follows:

1. There was a noticeable increase in posttest scores, with a statistically significant large effect size of 1.00.
2. The mean gain score is lowest for the testable dimension and highest for the parsimonious dimension.
3. The mean gain score for the overall score increased by 7.25 points, moving from a realist view toward an instrumentalist view of NOS.
4. The mean gain scores for three of the NSKS dimensions and for the overall score are significant at p ≤ 0.05.
5. The mean gain scores for the amoral, developmental, and testable dimensions are not significant at p ≤ 0.05.
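The gain-score and effect-size figures above are reported without the underlying computation. The sketch below is a minimal illustration, not the study's analysis code, of how a paired pre/post mean gain and one common standardized effect size (mean gain divided by the standard deviation of the gains) could be computed; the score lists are hypothetical placeholders, and the study may have used a different effect-size formula.

```python
# Minimal sketch (not the study's analysis code): paired pre/post mean gain
# and one common standardized effect size. The scores are hypothetical.
from statistics import mean, stdev

pre_scores = [142, 138, 145, 140, 144, 139]    # hypothetical pre-instruction NSKS overall scores
post_scores = [150, 147, 149, 148, 151, 146]   # hypothetical post-instruction NSKS overall scores

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = mean(gains)

# One common effect size for a paired design: mean gain / SD of the gains.
# Other variants divide by the pooled or pretest standard deviation instead.
effect_size = mean_gain / stdev(gains)

print(f"mean gain = {mean_gain:.2f}, effect size = {effect_size:.2f}")
```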


Not unexpectedly, given the literature on NOS beliefs, the participants (N=56) in the study showed a minimal but significant change in their overall NOS beliefs and in five of the six dimensions, the exception being the amoral dimension. This lack of development may not be so surprising, since the amoral dimension of scientific knowledge may be influenced by participants' own views of moral judgments and prior experiences learning science.

Overall, minimal gains were made by the interview participants (N=20) within the NSKS dimensions. These participants' quantitative scores were mixed, with only three of the six dimensions showing increases. Slightly better results were obtained from the entire population (N=56) quantitatively in terms of increased sophistication of NOS beliefs: the participants (N=56) showed increases within five of the six dimensions.

With the interview participants, it seemed they either held a belief or did not, as only minimal to moderate growth could be seen qualitatively within the interviews over time. Although increases were seen quantitatively, these may well be insignificant. It seems apparent that some participants have very naïve (realist) NOS beliefs, while most possess neutral NOS beliefs and a few, surprisingly, hold instrumentalist beliefs. The naïve views are to be expected, since the development of NOS beliefs is normally seen after encountering NOS instruction and during the college years. Even then, many students fail to fully accept NOS views.

Chapter seven presents the findings of the study's second research question, sub-questions 2-a and 2-b. The characterization of epistemological and NOS beliefs, and any changes in those beliefs that may have resulted from laboratory instruction, will be presented.


The combination of interviews, reflective questionnaires, and quantitative measures will provide a glimpse into participants' beliefs over the course of the semester, in particular their overall beliefs concerning the laboratory aspects of the semester course. The results are discussed and related back to the key laboratory education literature as well as the NOS and personal epistemological beliefs literature.


Chapter Seven: Laboratory Instructional Features

Introduction

Chapter seven characterizes the findings on the instructional features addressed by the study's second research question, sub-questions 2-a and 2-b. The characterization of laboratory instruction, with the quantitative and qualitative results from the Student Evaluation of Laboratory Instruction Questionnaire as well as the results of the analyses of the participants' responses to interview probes, will be presented. This will provide a glimpse of the participants' overall beliefs concerning the laboratory aspects of the semester course.

This study was of an exploratory nature, intended to lay a foundation for focusing on more specific features of epistemological and NOS reasoning in light of specific instructional features (pre-lab, laboratory work, or post-lab) for future research. The results are discussed and related back to the key laboratory education literature as well as the NOS and personal epistemological beliefs literature.

Method of Analysis

This analysis was conducted in a multi-layered, multi-stage process, through reading and sorting participants' responses to laboratory instruction questions, both general in nature and specific to the course. The analyses below are organized by the responses from the participants to the Student Evaluation of Laboratory Instruction Questionnaire (Appendix E) and the final interviews.


The first part of the analysis presents the participants' (N=56) reflections on the laboratory instructional features (e.g., pre- and post-laboratory activities, laboratory work) gathered through the student questionnaire (section 1) and the final interview (N=20) responses, together with the responses to the final interview questions that evaluated participants' views on several other aspects of the laboratory instruction, such as the role they played in promoting their own learning, the skills obtained during the laboratory course, and the role and significance of the laboratory notebook and scientific analysis. The second part of the analysis presents participant (N=56) responses to the second section of the student questionnaire, probing their perceptions of the pre- and post-laboratory experiences. The third part of the analysis presents the participants' (N=56) reflective responses and final interview (N=20) responses on their believed learning gains, using Bloom's Taxonomy. The last section of this analysis presents the participants' (N=56) reflective responses and final interview (N=20) responses on whether the instructional features influenced their epistemological or NOS beliefs. The interview participants' epistemological beliefs analysis was performed using the EBAPS dimensions (axes): structure of knowledge, nature of knowing and learning, real-life applicability, evolving knowledge, and source of ability to learn, in relation to the three laboratory instructional features. The reflective responses were evaluated using the six NSKS dimensions: amoral, creativity, developmental, parsimonious, testable, and unified. The final NOS interview was evaluated using the three laboratory instructional features. The aforementioned dimensions (axes) served as the major theme codes, giving a framework from which first-order themes, originally derived from the participants' verbatim quotations or raw data themes, could be analyzed. Within each dimension (axis), the responses to interview (N=20) and reflective questions from section one of the student questionnaire (N=56) regarding NOS and personal epistemological beliefs at the beginning and end of the semester are presented.


The intent of this analysis is to expand the theoretical understanding of the dimensions (axes) of personal epistemology in science and the continuum of beliefs, as expressed in context. Illustrative quotes have been selected from the interviewed participants as representative of the range of beliefs along the continuum. The demographics for all the participants (N=56) are presented in Appendix P. Table 65 presents a demographic overview of the interview participants with their participant identification numbers; quotes are identified with the letters ST followed by the participant's identification number (Table 65). Table 66 presents the descriptive statistics of the CCI, NSKS, and EBAPS scores for the interview participants.

The main research questions that guided this portion of the study were:

RQ2. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning experience?

RQ2a. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (development) during the semester general chemistry laboratory course?

RQ2b. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

Characterization of Participants' Reflections on Laboratory Instruction

Chemistry is a laboratory science; therefore, chemistry instruction would not be complete without some laboratory component. But in a discipline as wide-reaching as chemistry, the natural questions are what should be taught and how it should be taught. Learning chemistry can take place in the chemistry laboratory. The chemistry laboratory is a venue almost unique to chemistry learning, and it can provide another dimension to the instructional goal of promoting student learning.


McComas (1991) points out that while other subjects or academic domains, such as architectural drafting, computer programming, drama, finance, and home economics, involve students interacting with materials, it is the science laboratory that is most closely associated with "experimentation, problem solving and questioning".

Table 65 Demographic Statistics – Interview Participants

ID  Sex  Age  Major                   College Year
1   F    19   Pre-Pharmacy            Fr
2   F    21   Psychology              So
3   F    21   Biomedical Science      Jr
4   M    24   Electrical Engineering  So
5   M    22   Environmental Science   Jr
6   F    27   Marine Science          None
7   F    20   Biomedical Sciences     Jr
8   M    18   Undeclared              Fr
9   F    18   Environmental Science   Fr
10  F    20   Environmental Science   So
11  F    19   Nursing                 Fr
12  F    18   Undecided               Fr
13  F    18   Pre-Pharmacy            Fr
14  F    19   Pre-Pharmacy            Fr
15  F    20   Biology                 So
16  F    18   Environmental Science   Fr
17  F    24   Physical Ed             Jr
18  F    20   Athletic Training       Jr
19  F    19   Biomedical Sciences     So
20  F    45   Masters Nursing         None


Laboratory instruction is a cornerstone of many science programs, as it allows students to be actively involved in their learning. Effective laboratory instruction requires engaging the minds of the students so that they can think about the laboratory instructional experience in such a way as to evaluate their understanding in relation to what is experienced (Domin, 2007). This involves creating opportunities for reflection (Tien et al., 2007) as well as argumentation (Driver, 1995; Osborne et al., 2004), such as with the reflective laboratory instructional questionnaire used in this study (Appendix E). According to the National Research Council (2006), both are necessary, and to be effective they must be explicitly linked to a specific laboratory experience. When to implement these opportunities for maximal effect depends on the instructional methods or style used.

Table 66 Descriptive Statistics – Interview Participants' Scores

ID  CCI  EBAPS Pre  EBAPS Post  NSKS Pre  NSKS Post
1   72   2.70       3.13        143       155
2   76   2.35       2.55        144       153
3   81   2.38       2.97        138       148
4   67   2.70       2.62        138       149
5   86   1.88       2.08        144       151
6   63   2.37       3.12        149       151
7   63   2.32       2.77        143       152
8   72   2.83       3.22        147       145
9   45   2.53       2.60        147       155
10  72   2.05       3.45        141       153
11  58   2.80       2.98        143       149
12  63   2.63       2.78        138       150
13  49   2.63       2.48        146       144
14  65   2.48       3.02        132       142
15  76   2.98       3.12        140       145
16  77   2.85       3.55        143       148
17  65   2.50       2.45        136       142
18  76   2.63       2.77        143       148
19  67   2.52       2.87        140       152
20  58   2.65       2.80        138       146
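As a reading aid for Table 66, the following sketch applies the NSKS overall-score ranges described earlier in this chapter (below 141 non-accepted, 141-147 neutral, above 147 accepted) to a few pre/post pairs from the table. It is illustrative only; the helper function and its labels are the editor's, not part of the study's instrumentation or analysis.

```python
# Minimal sketch, assuming the NSKS overall-score ranges stated earlier:
# below 141 = non-accepted, 141-147 = neutral, above 147 = accepted.
def nsks_range(score: int) -> str:
    """Map an NSKS overall score onto the belief ranges used in this study."""
    if score < 141:
        return "non-accepted"
    if score <= 147:
        return "neutral"
    return "accepted"

# A few pre/post pairs taken from Table 66, purely for illustration.
pre_post = {"ST 1": (143, 155), "ST 3": (138, 148), "ST 9": (147, 155), "ST 13": (146, 144)}

for pid, (pre, post) in pre_post.items():
    print(f"{pid}: {nsks_range(pre)} -> {nsks_range(post)}")
```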


The laboratory instructional features of this study, discussed in chapters two and three, include the pre-laboratory, laboratory work, and the post-laboratory. Pre-laboratory work usually involves expectations or requirements that students prepare on their own time prior to the actual laboratory work. Pre-laboratory activities can stimulate students to think, recall prior information, practice basic calculations, learn the safety procedures, or check that experimental procedures have been read and understood. Laboratory work allows students to develop practical skills by learning to use the tools and conventions of science, work as a member of a scientific team, add to their understanding of the nature of science (NOS), and experience the ambiguity and complexity of empirical work. Post-laboratory activities are the student's opportunity to report and reflect on what occurred during laboratory work. Post-laboratory work usually involves writing up the laboratory experiment, performing calculations using data from the experiment, comparing class data, discussing the results between teams, answering open-ended writing assignments, and performing analysis of data and errors. All of these instructional features can encourage students to connect and revise prior knowledge, thereby leading to an improved grasp of the topic, and can improve motivation and learning.

Participant Reflections on Laboratory Instruction

Section one of the Student Evaluation of Laboratory Instruction Questionnaire was used to evaluate participants' beliefs on how helpful they found each of the instructional components and pedagogical features with respect to their understanding and the necessity of the laboratory learning experience. This section of the reflective student questionnaire (Appendix E) was used to assess participants' reactions to the three major instructional components of laboratory instruction (pre-laboratory, laboratory work, and post-laboratory) implemented during the semester course.


The three instructional components were sub-divided into the five main pedagogical tools or approaches used during the course (e.g., pre-laboratory – lab manual, quiz, questions/flowcharts, discussion, and technology). The results for all the participants (N=56) and for the interview participants (N=20) are presented in Tables 67 and 68, respectively. The participants reflected further by responding to a probe question concerning the instructional methods used in this course and how they compared with other science laboratory courses they had experienced.

The vast majority (65%) of participants (N=56) clearly indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material. Strong participant support was shown for the post-laboratory, with 59% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material. The pre-laboratory was ranked third, with 44% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material.

Table 67 Participants' Laboratory Instructional Preferences

Instructional Category  Least Essential  Somewhat Essential  Essential  Very Essential  Extremely Essential
Pre-laboratory          3.0%             13.0%               40.0%      16.0%           28.0%
Lab Work                2.0%             5.0%                28.0%      23.0%           42.0%
Post-laboratory         4.0%             6.0%                33.0%      24.0%           33.0%
N=56

The interview participants (N=20) ranked the three instructional features the same as all the participants (N=56), with laboratory work being the most essential, followed by the post-laboratory, and lastly the pre-laboratory. The majority (83%) of interview participants clearly indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material.


Once again, strong participant support was shown for the post-laboratory, with 72% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material. The pre-laboratory was ranked third, with 46% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material.

Table 68 Interview Participants' Laboratory Instructional Preferences

Instructional Category  Least Essential  Somewhat Essential  Essential  Very Essential  Extremely Essential
Pre-laboratory          5.0%             14.0%               35.0%      22.0%           24.0%
Lab Work                3.0%             5.0%                9.0%       19.0%           64.0%
Post-laboratory         7.0%             5.0%                16.0%      22.0%           50.0%

Reflective Comments on Laboratory Instructional Preferences

Participant comments were generally positive. Some of the participants' (N=56) reflective comments are listed in Table 69. The majority of the participants commented that certain aspects of the pre-laboratory, such as generating a procedural flow chart, were beneficial, while some of the pre-laboratory questions were unnecessary. The laboratory manual, laboratory notebook, and the technology tools were commented on most often in the participants' reflective comments. The laboratory manual was viewed as quite useful and detailed enough for effective use by the participants. The laboratory notebook received more positive responses from the participants as the semester progressed, whereas a few participants felt that their "real-time" data during laboratory work could have been easily recorded on regular notebook paper. This type of naïve comment may suggest that some participants did not have prior experience in their other laboratory courses with practicing "real-time" data collection in a permanent document, or did not understand the importance of recording data as it occurs. Students repeatedly stated that they used a wider range of technology-enhanced equipment (e.g., Blackboard, MBL) than they would normally use. The technology-enhanced approach allows students to perform several trials, which is more of a challenge when using traditional bench laboratory methods due to time constraints.


As for the post-laboratory, the majority of the participants' reflections were positive, except for a few who felt it was not crucial to their learning. This type of novice response suggests that some students either did not like to write or did not understand the importance of analyzing and reporting scientific data to share with the scientific community. However, a majority of the participants offered positive overall comments concerning the laboratory instructional methods, as noted in Table 69.

During the actual lab work, participants' minds are engaged not on the underlying theories and principles, but on the procedural aspects of the activity. The cognitive demand placed on working memory in trying to understand and follow the given methods allows for little, if any, cognitive resources to be devoted toward thinking about the concepts involved in the activity. Participants partaking in an MBL laboratory activity were most cognitively engaged while they were in the laboratory conducting the activity. This is indicated by the use of the terms 'frustrating' and 'challenging' to describe the activities. These terms indicate that participants were, at some point in the lab work, in a state of cognitive dissonance which they had to think through to reestablish cognitive equilibrium.

In the case of laboratory instruction, a majority of the participants in this study perceived understanding to develop outside of the laboratory, after the lab work was completed, when they had the opportunity to reflect on what they had done, while others felt the post-lab was not crucial and simply a review of the material. The aforementioned attitudes reflect those of both novice and expert participants. The post-lab analysis included the writing of the laboratory report, which related to specific concepts addressed during a specific laboratory activity. For laboratory instruction, the post-lab activity is crucial for conceptual development; it may be the only opportunity the students get to reflect on what was done in the lab.


Research by Keys (2000) has shown that the process of laboratory report writing can stimulate science learning provided that "the students actively deliberated and reflected on science content as part of the writing process itself."

Table 69 Participants' Reflections – Instructional Methods (N=56)

Pre-lab
ST-6: "The pre-lab assignment helps out the most. Writing the procedural flowchart really helped me understand the process."
ST-9: "I think making the procedural flowcharts really help me. The flowcharts offer a clearer picture of what I am going to be doing before I get into the lab. However, I could have done without some of the pre-lab questions."
ST-16: "I feel that some of the procedures for completing the pre-lab were a bit overly extensive, such as creating a flowchart for each procedure."

Laboratory Manual
ST-30: "I think that the instruction manual is very thorough and helpful compared to other ones."
ST-49: "I really like the detail of the lab manual."
ST-53: "The lab manual background and instructions seem to be better than the ones I used in high school."

Laboratory Notebook
ST-1: "I do not feel the lab notebook is necessary----simple notebook paper would do."
ST-12: "The laboratory notebook set-up is how it is used in other labs. It is a good way of organizing the chemistry lab information."
ST-18: "Well, I have never used a lab notebook before and I really don't think it helps. It just makes everything twice as much work."
ST-28: "The lab notebook can be easily formatted and organized…easy to look up data/analysis from previous labs."
ST-54: "The lab notebook is an organized way to record data and observations."

Technology
ST-7: "I think the instructional methods used to assist students in lab are all very helpful. Blackboard is a great tool. MBL useful but frustrating."
ST-8: "Using Blackboard and having online discussions. It was nice to see how other classmates viewed the lab and the data collected. It also was a quick way of clarifying questions."
ST-26: "The technology is way more present in this course than others I have experienced. This is crucial for majors in the scientific field."

Post-Lab
ST-24: "Post lab is very important to analyze and understand what we did."
ST-26: "Post lab I don't think is that crucial to the concepts except for review purposes."
ST-50: "Post labs are needed to evaluate your data and understand the meaning of the lab."


Table 69 (Continued)

Overall
ST-4: "I believe the pre-lab and post-lab activities are vital to the lab. It gives the participant a better understanding of the experiment to be performed."
ST-45: "Overall I find the instructional methods have accelerated my learning compared to my other lab classes."
ST-49: "I feel in the lab you almost are forced to learn the material through constant exposure. This helps me learn better, I've not had this before."
ST-53: "I had never been exposed to these instructional methods in any of my prior science laboratory activities. In my past science labs, we never performed any pre-lab activities or even maintained a laboratory notebook. Our technology was also severely limited, and post-labs were pointless to say the least. I am glad to apply these new methods to my lab work because now I feel like I'm actually retaining information and learning from the activity, as opposed to just going through the motions."

Final Interview Discussion – Instructional Methods

Final Interview Questions One and Two

Final interview questions one and two were used as a tool to determine which instructional feature (pre-laboratory, laboratory work, or post-laboratory) the participants considered the most effective and least effective in promoting their learning during the semester course. As discussed in the introduction to chapter seven, as well as in chapters two and three, the instructional features are divided into three general methods: pre-laboratory, laboratory work, and post-laboratory. Table 70 summarizes the interview participants' overall responses (%) to final interview questions one and two. Tables 71 and 72 present the interview participants' extended responses to questions one and two.

By the end of the semester course, two of the three instructional features, laboratory work (40%) and the post-laboratory (40%), were selected by the participants as the most effective in promoting their learning during the semester course, while the pre-laboratory instructional feature (65%) was selected as the least effective.


Table 70 Final Interview – Laboratory Instructional Feature

Instructional Category  Most Effective  Least Effective
Pre-laboratory          15.0%           65.0%
Lab Work                40.0%           5.0%
Post-laboratory         40.0%           25.0%
Other                   5.0%            5.0%
N=20

Question One – Most Effective Instructional Feature

In order to query participants' views concerning the instructional features (pre-lab, laboratory work, or post-lab), they were asked which feature they found to be most effective in promoting their learning during the course (Table 71).

The pre-laboratory was identified by only 15% of the interview participants (ST 7, 14, and 19) as being the most effective instructional feature. These participants stated in the final interview that the pre-laboratory feature offered them a preview of the concepts and methods to be encountered during the laboratory work, thereby decreasing their frustration levels. This supports Barnes and Thornton's (1998) study, which found that if students are better prepared prior to attending lab, they will be able to improve their rationale behind the laboratory processes being presented. They found that students in their study felt that the pre-laboratory made performing the lab and writing the post-lab report easier. Students who do not prepare may be unable to fully engage in the completion of the laboratory work and thereby reduce their opportunity to learn. Students in Wyatt's (2003) online pre-lab study indicated satisfaction with the pre-lab exercises. However, the majority of participants in this study indicated that they found the pre-laboratory activities to be the least essential to their learning.


Participants (1-2, 9-10, 12-13, 17, and 20) indicated that laboratory work allowed them to experience the different aspects of the topic, whether it was use of the equipment, teamwork, or seeing how things really occurred. Laboratory work allowed the participants to complete or use a procedure in a given situation in the chemistry laboratory and to take the new information gained to solve different types of problems. This supports Byers' (2002) view that laboratory work remains essential to the development of a range of practical skills as well as offering the learner an opportunity to understand what scientists do. On the other hand, students involved in laboratory learning often only manipulate equipment and do not get around to manipulating the ideas (Gunstone & Champagne, 1990).

The post-laboratory engaged 40% of the participants (ST 4-6, 8, 11, 15-16, and 18) in reflecting on everything they experienced from the pre-laboratory and laboratory work together. The participants emphasized the connection between post-laboratory analysis and understanding concepts introduced in the pre-laboratory activities. It served as a tool for organizing, clarifying, and synthesizing their thoughts. The post-laboratory activities led those participants to an improved understanding of the material presented. This supports the idea that communicating science with clarity and understanding is crucial to science students (Koprowiski, 1997; Rivard, 1994). According to Herrington (1997), the act of writing a post-laboratory report should allow students opportunities to organize, develop, and explain scientific concepts. Writing a post-laboratory report helped participants connect data, scientific equations, and scientific knowledge with the observations made during the laboratory work.

Participant three felt all of the instructional features were effective for her learning during the course. She believed that each offered a different but useful perspective on her learning experience. The literature suggests that, if designed properly, the entire laboratory experience has the potential to play an important role in attaining cognitive skills (Hofstein, et al., 2004).


Table 71 Participants' Reflections – Effective Instructional Methods (N=20)

Final Interview Question 1
What instructional feature (pre-lab, laboratory work, or post-lab) was the most effective in promoting your learning in this course?

Pre-lab
ST-7: "I think all of them worked together effectively. But if I had to choose I would pick the pre-lab. I wouldn't struggle in lab when I did the pre-lab."
ST-14: "I would say the pre-lab. The pre-lab offered a lot of background information. I definitely never went to lab without my pre-lab done. So the pre-lab helped me so I was prepared."
ST-19: "Well, it's hard to decide between the pre-lab and the actual lab itself because the pre-lab prepared you to perform the lab. I'd say probably the pre-lab as it offered one an overview."

Lab Work
ST-1: "The laboratory work because it gave me a chance to actually physically do things. This allowed me to see how the concepts applied and how it effects real world situations."
ST-2: "The laboratory work itself because you had to apply all of the concepts and ideas to the actual hands on experience in order to get the experiments to follow through and get results."
ST-9: "Probably the actual laboratory work because it put everything to use. It allowed you to see how the concepts applied. I also thought the pre-lab was really helpful because it gave you a heads-up beforehand. However, actually performing it was the most helpful."
ST-10: "Definitely the laboratory work because it offered a real time experience."
ST-12: "The laboratory work was the most effective. It was easier to understand the material when we performed the lab. The instructor would go over the pre-lab before we performed the lab in order to clarify any questions. The post-lab I thought was also effective. It allowed you to analyze what one did during laboratory work."
ST-13: "The laboratory work because when we did the experiments it allowed one to see how the concepts applied."
ST-17: "The laboratory work as it allowed you to actually do it yourself. I learn best with hands-on experience. I get better grades doing the work."
ST-20: "I'll say laboratory work. Well, because it gave me hands-on visual learning. It allowed me to apply the pre-lab concepts. I could actually see what happened, how it happened, and why it happened."

Instructional Issue: Post-Lab
ST-4: "I think the post-lab. After performing the lab and actually analyzing the data I could look back over the experience and all the processes that we performed during the lab. This is when I gained the most knowledge and understanding of what we were doing during the lab."
ST-5: "I would say both the laboratory work and the post-lab. If I had to pick between the two I would pick the post-lab as it was more effective. The post-lab allowed one to understand the data, proper use of the formulas, and how everything tied together."
ST-6: "If I had to pick one out of the three I would pick the post-lab. The post-lab because you could tie all the results together and explain why things occurred. The pre-lab and lab work are obviously important, but the post-lab is most important because it brings everything you did together."
ST-8: "I would have to say the post-lab. After I did the pre-lab I didn't know a lot about the concepts, but I had a better idea after the lab work. But when I did the post-lab I was able to evaluate everything and learn the most and see what happened during the lab work."
ST-11: "The post-lab was the most effective. The post-lab allowed me to go back and look at the data and to analyze everything."
ST-15: "Post-lab was the most effective. It forced me to sit down and understand what occurred during laboratory work. Performing the formal write-ups helped organize and analyze the data."
ST-16: "I would have to say post-labs. After the lab experience one could understand the data performing the post-lab analysis."
ST-18: "I would say the post-lab. I would be confused until we had a pre-lab discussion. Once we had performed the lab I gained a clearer understanding of the concepts. However, my overall understanding occurred during the post-lab analysis."

Instructional Issue: Overall
ST-3: "For me I found all equally effective depending on the experiment. For some of the labs I initially learned more from doing the pre-lab and bench work. While during other labs I learned more from the actual final analysis."

Question Two – Least Effective Instructional Feature

In order to query participants' views concerning the instructional features (pre-lab, laboratory work, or post-lab), they were asked which feature they found to be least effective in promoting their learning during the course (Table 72).

The pre-laboratory was identified by 65% of the interview participants (ST 2, 4-6, 8, 11-13, 15-18, and 20) as being the least effective instructional feature. These participants stated in the final interview that the pre-laboratory feature increased their frustration levels.

They felt that the pre-laboratory activities did little to offer a perspective of what to expect and provided limited understanding of the concepts. Others felt that the pre-laboratory activities were time consuming and unnecessary. These views may in part be due to time management issues. According to Johnstone and Al-Shuaili (2001), many students ignore the importance of pre-laboratory preparation because they feel that they can survive without performing it. Pre-laboratory activities ease the transition into new experiences by allowing students to familiarize themselves with the experiment. In addition, students may gain a clearer understanding of what is expected of them during laboratory work (Koehler & Orvis, 2003; McKelvey, 2000; Nicholls, 1999). Effective preparation may result in reduced anxiety and increased student confidence.

Only one participant (ST 14) indicated that she found laboratory work to be the least effective. She stated that she felt more comfortable with book and written-style learning than with hands-on work. Here the participant lacked an awareness of the aim of laboratory work. Firsthand laboratory science experience is seen as a key way to improve students' understanding and appreciation of the way science works; however, other studies show that laboratory activities provide little improvement in understanding the methods of science (National Research Council, 2006; Driver et al., 1996; Lederman, 1992; Gunstone & Champagne, 1990; Tobin, 1990).

Twenty-five percent of the participants (ST 1, 3, 9-10, and 19) found the post-laboratory to be the least effective instructional tool. These participants suggested that the post-laboratory experience was too repetitive and extra work. In some cases, participants who did not understand the point of the pre-laboratory or the data collected during laboratory work felt lost when attempting to analyze the results. Students need to learn how to negotiate scientific understanding by communicating those understandings within the context of scientific discourse (Prain & Hand, 1996).

The post-laboratory analysis gives students an opportunity to engage in authentic discourse and make connections between their findings and the relevant science concepts while learning to reflect, synthesize, and generate new ideas (Keys, 2000; Keys et al., 1999).

Table 72 Participants' Reflections – Least Effective Instructional Methods
Final Interview Question 2: What instructional feature (pre-lab, laboratory work, or post-lab) was the least effective in promoting your learning in this course?

Instructional Issue: Pre-lab
ST-2: "The pre-lab because you were more worried about getting it done rather than understanding it. You had to turn in your pre-lab the day you performed the lab. You could turn it in late but you would get points deducted."
ST-4: "I would have to say the pre-lab. Initially going into a lab, the pre-lab was always the most difficult to me. I would have to seek some help. I think working with a partner or in a team during a pre-lab sharing made things a little bit easier."
ST-5: "The pre-lab because you really don't know exactly what you are going to be doing or what type of technology you are going to be using. Even though the formulas were there they weren't effective until after performing the labs."
ST-6: "If I had to pick one it would be the pre-lab. Simply because you are doing it before discussing it. Most of the time you do the pre-lab on your own. It wasn't until the pre-lab discussion that we ended up understanding."
ST-8: "If I had to choose it would be the pre-lab. I learned more doing the lab and post-lab. Pre-lab was helpful but not as helpful as the others."
ST-11: "Probably the pre-lab because I hadn't done any of the laboratory work yet, so it was harder for me to get the correct answers. I found them all very effective but the pre-lab was probably the least effective because it was harder for me to do it without actually doing the lab work first."
ST-12: "The pre-lab because some of the questions I understood them better after performing the lab."
ST-13: "Probably the pre-lab. Once we did the laboratory work and then the post-lab we understood more about the pre-lab."
ST-15: "The pre-lab. I tried to 'slide by' but not understanding held me back during the lab work. I remember asking my lab partner for help so she explained it to me as we went through the lab. She also helped me with the post-lab and then the pieces came together."
ST-16: "I have to say it would be the pre-lab, even though I don't think it was entirely non-effective. Well, because when you're first learning about the concepts or what the subject is you're like feeling it out. The laboratory work allows you to view the concepts in action and the post-lab helps one understand."

ST-17: "The pre-lab. It gave you a background but I don't think it was 100% necessary. It was good coming into the lab knowing a little bit about what you had to do but I don't think anyone would have been that much worse off without it."
ST-18: "I'd say the pre-lab. I don't do well when I just read the material. I like more of a hands-on approach. So, for me it was harder to just read it and be able to understand it right away."
ST-20: "I'll have to say the pre-lab. The post-lab really pulls together everything that you've learned. The concepts that you've experienced in the pre-lab work do help you somewhat understand what you're going to be doing, how you're going to be doing it, and why you're going to be doing it. The post-lab helps you completely understand. It helps you analyze the ideas of what was really happening and why it happened and it really puts it together for you. You know, it really allows things that you may not have realized or recognized before, or thought about before, to come to life."

Instructional Issue: Lab Work
ST-14: "I would say the lab work. The lab work is probably what most of my classmates would find most effective because it's hands-on. However I'm more of a book learner. But, I would say that the pre-lab was the most effective and then the post-lab because that's what tied everything together. The lab work was kind of just like a visual aid."

Instructional Issue: Post-Lab
ST-1: "I'd say the post-lab. The pre-lab helped me to initially understand the information and what I was going to be doing. The lab work helped me to demonstrate it so I could understand it better and the post-lab just reiterated it."
ST-3: "I would say the post-lab. Although, it is still important, I'm not discounting it at all. This is probably because if I didn't already understand the concept in the pre-lab and didn't get it after performing the lab then the post-lab would be more difficult for me. Sometimes I could look back and say oh that's why this occurred. But it is more important for me to get it first and then I could apply my knowledge."
ST-9: "The post-lab. Even though it was somewhat effective. I guess it was just me. After I was done with the experiment I wanted to move on to the next. The post-lab just seemed as if we were repeating the information."
ST-10: "The post-lab. The pre-lab gave you an overview of the lab. The post-lab was repetitive."
ST-19: "The post-lab. Sometimes it confused me. I would think I knew what I was doing. However, when I would get to the post-lab I did not understand. I would get confused instead of getting any clarity."

Instructional Issue: Overall
ST-7: "I don't think any of the instructional features were least effective. The pre-lab gave me an initial understanding, the lab work was hands-on learning, and the post-lab helped me understand the other two."

Final Interview Question Three – Promoting Learning

Final interview question three was used as a tool to determine what the participants thought they could have done differently to promote their learning during this semester course. The major response themes from the participants were spending more time on the course and/or on the pre-laboratory activities. Table 73 presents some of the interview participants' extended responses to question three.

Participants' self-efficacy and ability to self-regulate may have influenced their accomplishments and persistence when performing the laboratory tasks. According to Bandura (1977), self-efficacy beliefs influence performance accomplishments and the persistence demonstrated in the pursuit of challenging tasks. In addition, self-efficacy has been shown to have a mediating role in student achievement. Participants' perceptions of self-efficacy influenced their instructional activity choices. They may have avoided those laboratory instructional tasks in which they lacked confidence and engaged in laboratory tasks in which they expected to experience success. Educational psychology studies describe the ability to take responsibility for and to self-direct one's learning as self-regulation of learning (Zimmerman et al., 1992; Zimmerman, 1990). Participants who actively controlled their study time, study environment, and persistence were more successful in accomplishing the tasks.

Table 73 Interview Participants' Reflections – Promoting Learning
Final Interview Question 3: What could you have done differently to promote your learning?

ST 2: "I needed more time as chemistry is a complicated and difficult subject. I'm not a science and math person. In the beginning of the semester I was taking 15 credit hours and I was working 30 hours. I ended up having to drop one of my classes and my grades did get better. If I had more time I would actually do all of the reading. I would look for key words to answer the questions and find the relationship between the concepts and data."
ST 3: "I think I was on top of the material. I mean I always did things as best as I could and tried to be as detailed as possible when preparing for class. You really had to understand the first steps otherwise you did not understand the later steps. Additionally, when I was not sure about a question or problem I would research it online. I would always try to figure out things on my own before I asked the professor. I would look for things online and really look at the question before I just gave up."
ST 5: "The major strategy that I used to promote my learning involved reading the material. I had taken an introductory chemistry course and adapted that to what I was doing in lab. There were technical issues, not only Blackboard, but time issues."
ST 6: "I needed to spend more time on preparing. I did all of the bookwork. I should have gone over the pre-lab more before and after performing the lab. Performing the post-laboratory reports allowed me to reread and I understood those more."
ST 7: "I think if I would have read more it would have helped. I took notes but I needed to devote more time."
ST 8: "I should have done more background reading before the lab. Time was also a big factor. Strategies I used to study for the lab included doing the pre-lab to the best of my ability and studying for the quiz. I would just read."
ST 15: "I could have tried to understand the pre-lab. I slacked off a little when it came to doing the pre-laboratory activities. Especially if I didn't understand. I tried to get by without doing much work. I do the same thing in other classes."
ST 16: "I would have to say try harder on the pre-labs. The pre-labs were kind of like going in blindly. Sometimes I had to review the lectures before I did the pre-lab. So I guess I should have tried harder on the pre-labs. For instance I should have read the sections in the laboratory manual and gone over the laboratory PowerPoint slides on the Blackboard site. Unfortunately I didn't."
ST 17: "I could have spent more time on preparing for lab. I really didn't get to focus a whole lot on my other classes as this class took up so much time. Strategy for studying was reviewing the course work and reading."
ST 19: "I usually don't watch the lectures before the lab. So if I'd watch the lectures before or read the material then it probably would have made the lab easier. I don't really think I have a learning strategy. I just do what I'm told to do. Sometimes tutoring helped and when we would email each other back and forth. However, this campus only offers tutoring twice a week for one hour each time."

Final Interview Question Four – Laboratory Skills

Final interview question four was used as a tool to determine what the participants believed were the most important skills they learned in the semester chemistry laboratory course.

The major response themes from the participants were use of laboratory equipment and technology, trial and error, analysis, and time management/organization. Table 74 presents some of the interview participants' extended responses to question four.

Laboratory experiences should aim to encourage learners to gain the following: manipulative skills, observational skills, the ability to interpret experimental data, and the ability to plan experiments (Johnstone & Al-Shuaili, 2001). This supports the participants' views on the skills they gained during the course, such as being able to make observations, the proper use of laboratory equipment and tools, improved organization, and being able to analyze the information obtained during laboratory work.

Table 74 Interview Participants' Reflections – Laboratory Skills
Final Interview Question 4: What is (are) the most important skill(s) you learned in chemistry laboratory?

ST 1: "Learning how to use the laboratory equipment probably as improper use can affect the experimental results. In addition, how to use the computer lab software."
ST 2: "Triple checking your work is an important skill. You want to properly measure so that your data is accurate and precise."
ST 3: "I think really looking at what is going on during the laboratory work. Learn that you really need to follow the instructions and if you don't do it right to do it again. Use trial and error and if you make a mistake just repeat the trial and avoid making the same mistake. Critical thinking is an important skill."
ST 4: "One of the most important skills was learning how to properly use the equipment."
ST 5: "The most important laboratory skill was how to use the laboratory equipment and technology."
ST 6: "I would say performing the analysis and pulling the concepts together."
ST 7: "Following the directions and proper use of the laboratory equipment are important skills. They are important because when you do not properly use the equipment the accuracy of the data is impacted."
ST 8: "Probably learning to become more organized was the most important laboratory skill I learned during the course. I became better at organizing the information and analysis."
ST 15: "The most important skills were the use of trial and error, using Excel, and how to analyze data."
ST 17: "The most important laboratory skill I learned was time management."
ST 19: "The most important skills involved learning how to use the laboratory tools and organization. Organization is the key skill to this course. You must understand the prior material before you could move on to the next activity."
ST 20: "I would say the hands-on experiments, the safety skills, proper use of the equipment and the analytical processing."

Final Interview Question Nine – Laboratory Notebook

Final interview question nine was used as a tool to determine what the participants believed the role and significance of the laboratory notebook to be in any scientific workplace. What is a laboratory notebook? In the context of this chemistry laboratory course, the lab notebook was viewed as a history of the work accomplished during the semester.

Each participant recorded the work they performed for lab assignments, carefully recording what they did and learned along the way. The major response themes from the participants on why one might want to keep a laboratory notebook were to provide a record of why and how experiments were performed, to collect data in real time, to interpret results, and to provide information to others. Table 75 presents some of the interview participants' extended responses to question nine. The use of laboratory notebooks as a tool is supported by the participants' interview responses. A number of the participants advocated its necessity in the course as well as in the scientific workplace. However, several felt the laboratory notebook was time consuming and repetitive.

The use of laboratory notebooks as an instructional tool is supported by a number of researchers who advocate writing in science to enhance student understanding of scientific content and processes, as well as general writing skills (Bass, Baxter & Glaser, 2001; Keys, Prain, Hand & Collins, 1999; Rivard & Straw, 2000). The information written into a laboratory notebook is used for several purposes. The most important is that the pages of the laboratory notebook preserve the experimental data and observations with unambiguous statements of "the truth" as observed by the scientist (Kanare, 1985). The major goal is to write with detail and clarity so that other scientists can pick up the laboratory notebook and repeat the work.

Students need to realize that the laboratory notebook is the prime source of information when one is required to write an analysis (Aschbacher & Alonzo, 2004).

Table 75 Interview Participants' Reflections – Laboratory Notebook
Final Interview Question 9: Describe the role and significance of the laboratory notebook in any scientific workplace (e.g., classroom, research laboratory, hospital, pharmacy).

ST 2: "The lab notebook is an essential tool for anyone involved in a science field. It allows one to record raw data and maintains a train of thought of what happened during the experience. As a psychology major I found it was easier to use a notebook."
ST 3: "I definitely understand the purpose of a lab notebook is to have everything written down and recorded for future use. The role or significance I can understand in a hospital as you are dealing with a patient and there could be confusion if one did not record the information. However, sometimes I just wanted to burn this lab notebook and say can't we just write it on a piece of paper. But I understand the significance of it as it is really important to have everything recorded. The difference between writing it in a lab notebook rather than just on a piece of paper is that one could lose the piece of paper."
ST 4: "Well, the significance I think would be recording the real-time data and observations that you make whether it is in the classroom, or lab, or hospital or pharmacy. One can go back after you've left a certain situation to see what you've written down and help you evaluate a situation at a later time."
ST 5: "The notebook is the first one I have done. One reason to keep a notebook is to avoid future mistakes."
ST 6: "The role is to keep track of all of your data as it is very important so you can look back to look at your data and your procedures. It is a summary of what you learned and what you experienced. The difference is if it was just recorded on regular paper you could misplace the papers thereby losing the data."
ST 7: "It's a notebook in which you can write down all of your observations and perform calculations. It is an important way to keep your notes all together so if you need to refer back to your data it is easily accessible."
ST 9: "In the classroom it's important and significant so you can go back and refer to your work. After you're done doing the laboratory you can see what you've done and process the information. For the research laboratory, their main goal is so after they're done performing the experiment they can go back for specific details, see if there were mistakes and where the mistakes might have occurred."
ST 11: "The laboratory notebook helped me as I could refer back to it throughout the semester. You can use it for future experiments such as in a research hospital so one can see the differences and similarities in their results."
ST 12: "The lab notebook is used to record data and observations. It helps with post-laboratory activities such as analysis. The uses are the same for a research laboratory or hospital. Recording information in the notebook improves the accuracy."
ST 15: "I think it is a very organized way to keep track of what you are doing. It is very important as you want to see why you did something and evaluate what went right and wrong."
ST 17: "I guess I can see the point in using it in the classroom more so than in a research lab or in a hospital. Several students didn't know the structure as we had never used a lab notebook. I think it is good to use it in the classroom as a tool."

ST 18: "The purpose was to record your data. I think it gives you practice for the future. I know in the field that I'm looking at there are all types of government forms you have to fill out. So, I think it gives you practice efficiently recording what you were doing and the results. It also allows others to see what you're doing. Therefore if they wanted to continue where you left off they would have some guidelines and initial results so they could pick up from there and continue on."
ST 19: "The lab notebook is where you record data and observations. Research labs and hospitals also need to record information. This allows them to keep patient information organized and written down accurately for future evaluation."
ST 20: "To have a place where you can record significant data and findings. This is necessary so that one can later utilize the information for further clarification."

Final Interview Question Ten – Scientific Analysis

Final interview question ten was used as a tool to determine what the participants believed the role and significance of the scientific report or analysis to be in any scientific workplace. The goal of scientific writing is effective communication. A good scientific report does more than present data; it demonstrates the writer's comprehension of the concepts behind the data. In the scientific community, one of the most basic goals is the development and application of new knowledge. Writing scientific reports and papers is the easiest and most effective way to share information with the scientific and medical community. However, scientific papers come under great scrutiny as they are reviewed, tested, and retested time and time again. Published scientific papers act as influential vessels in an attempt to validate the researcher's data and interpretations. In time the results may become accepted as scientific fact.

Each participant prepared post-laboratory reports. As discussed in chapters two and three, there were three types of laboratory reports: the basic laboratory report (BLR), the formal laboratory report (FLR), and the laboratory notebook (LNB).

The major response themes from the participants were to share one's results with others in the scientific community and to learn from the results. Table 76 presents some of the interview participants' extended responses to question ten.

Scientific analysis in the form of laboratory reports should give learners the ability to think, talk, and write scientifically. Effective scientific communication requires learners to use scientific language to reflect the scientific process. Several of the participants viewed scientific analysis as a method to relive and reflect on the laboratory work, thereby bringing structure to their thinking. Participants frequently described the laboratory analysis as a way to revisit the laboratory work and put everything in perspective. Keys's (2000) findings suggest that scientific writing promotes scientific thinking by helping learners to explore relationships between evidence and knowledge claims.

Table 76 Interview Participants' Reflections – Scientific Analysis
Final Interview Question 10: Describe the role and significance of the scientific laboratory report/analysis in any scientific workplace (e.g., classroom, research laboratory, hospital, pharmacy).

ST 2: "The report is significant in research as it lays out the experimental results and discussion. It allows others performing the same type of research to read and learn from other research. In this course it forced you to explain and hopefully understand the overall experience."
ST 3: "It relates what the scientist did through the entire process. It offers an analysis of the results. The report discusses what the results could possibly mean and discusses any potential error. It is an all-encompassing way to analyze information and present it to others. For research it is useful if they are doing similar work."
ST 5: "You would be presenting your results to other scientists."
ST 6: "The lab report is important because it presents an analysis of your data. This allows one to understand the results and explain what may have gone wrong. It pulls everything together. The report is important as it allows others to read and learn from the results. It's a way to learn and share with the science community."
ST 7: "The significance of the report is so you and others have an understanding of the results, what you performed, and summarize your conclusions."
ST 8: "I think it is very important to organize your data and results into a report. It describes what happened so when others read it they can learn from what you did."

ST 10: "In the classroom I think it is good practice. For the real world they're important if you're going into anything where people rely on your reports and your analysis. I think the reports are practice for the real world. Reports and analysis are always important, but specifically for science."
ST 11: "The analysis is useful to others in the same field. It assists those who read it by showing the results in a clear concise format."
ST 12: "The lab report is a way to 'wrap it all up'. Your research is presented in a special format that everyone else can read. This allows others to see what you've done and learn from it."
ST 15: "In the classroom it is very important as it allows for conclusions to be drawn. The research lab would do it in order to share their results with others. The report is used in the same manner at a research hospital or pharmacy. It is used to publish findings to help the scientific community as a whole."
ST 17: "The research experience is worthless without being able to analyze the data and report the results."
ST 20: "The report is a way to provide information and allow access to data that you have found during the process. It is a method that allows you to share and disseminate knowledge to other people about your work. Then others that may be working in the same area or interested in what you're working on may gain insights. When you share that knowledge other people may be able to learn from your experience."

Reflections of Pre-Post Laboratory Experiences

Section two of the Student Evaluation of Laboratory Instruction Questionnaire probed students' perceptions regarding the following four aspects of laboratory work: achievement in conducting the experiment, difficulty of doing the experiment, enjoyment in doing the experiment, and understanding the experiment. Each of these topics included three self-explanatory statements, except the difficulty topic, which had four (Appendix E). Participants were asked to choose the one statement for each topic that best described their own position regarding that topic.

The questionnaire results were tabulated so that participant responses (choosing statement A, B, C, or D) could be expressed as a percentage. Table 77 shows the results of the questionnaire concerning participants' preferences for the instructional method of teaching experiments in the laboratories.
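
The tabulation itself is straightforward. The sketch below is not taken from the study; the function name, the Counter-based tally, and the illustrative response list are assumptions added here to show one way such A/B/C/D choices could be converted to the percentages reported in Table 77.

    # Hypothetical sketch: tallying one topic's statement choices (A, B, C, or D)
    # and expressing each as a percentage of the 56 respondents.
    from collections import Counter

    def choice_percentages(responses):
        """Return each statement choice as a whole-number percentage of all responses."""
        counts = Counter(responses)
        total = len(responses)
        return {choice: round(100 * counts[choice] / total) for choice in sorted(counts)}

    # Invented responses for the "achievement" topic; the real data came from
    # the questionnaire described above (Appendix E).
    achievement = ["A"] * 31 + ["B"] * 19 + ["C"] * 6   # 56 participants
    print(choice_percentages(achievement))              # {'A': 55, 'B': 34, 'C': 11}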

As can be seen from Table 77, 55% of the participants reported that they felt a sense of achievement when they participated in a pre-lab discussion prior to performing the experiment, while 34% indicated that they felt a sense of achievement when they performed the experiment first and then participated in a post-lab discussion. A small percentage (11%) felt there was no clear difference.

The sense of achievement for those preferring to perform the experiment first may derive in part from participants' overcoming the attitude of having to have the "right" answer and rising to the challenge of the difficulty they initially experienced with the experiments to being able to perform the activity with minimal assistance. For those participants, their enjoyment of the laboratory experience improved over the course of the semester.

Table 77 Reflections Pre-Post Laboratory Experiences Statements (N=56)

Topic and Statements*               Percentage (%)
Achievement
  A. Experiment first               55
  B. Explanation first              34
  C. No difference                  11
Difficulty
  A. Experiment first               72
  B. Explanation first              5.0
  C. No difference                  14
  D. Same difficulty                9.0
Enjoyment
  A. Experiment first               21
  B. Explanation first              60
  C. No difference                  19
Understanding
  A. Experiment first               33
  B. Explanation first              48
  C. No difference                  19
*Appendix E

From Table 77, 72% of the participants indicated that it was more difficult to perform an experiment before it was discussed, especially when it came to the methods and equipment, with which many were not familiar due to lack of laboratory experience.

Approximately 14% of the participants felt at the beginning of the semester that it was a challenge to perform an experiment prior to a discussion but eventually preferred to perform the experiment first and follow up with a post-lab discussion. A small percentage (2.0%) felt it was more difficult to perform an experiment after it was discussed, while 9.0% indicated there was no clear difference.

Early in the semester, when the participants began laboratory experiments without a detailed pre-laboratory discussion, their difficulties were noticeable, as the laboratory manual was not designed to be used independently. In several experiments the laboratory manual states "instructor will demonstrate," so when the participants asked to be shown, they were usually directed to a step in the procedure, to the diagrams, or to a mock set-up of the laboratory equipment at the front of the laboratory. This pushed some of the participants to act more independently while completely frustrating others. However, many of the participants appeared to gain independence to varying degrees as the semester progressed.

As can be seen from Table 77, 60% of the participants indicated that they enjoyed the laboratory experience more if they participated in a pre-lab discussion prior to performing the experiment, while 21% indicated that they enjoyed lab better when they performed the experiment first and then participated in a post-lab discussion. A small percentage (19%) felt there was no clear difference.

Many participants, if not most, were willing to obtain the raw data and then leave the laboratory as quickly as possible, as indicated by the preference (60%) for performing the experiment after a detailed pre-lab discussion. Enjoyment during the laboratory work period may contribute to the participants' achievement and understanding. Creating enjoyment is one way to avert the "take the data and run" scenario.

Performing the experiment prior to a discussion allows participants more time to ask questions and think, thereby contributing to an improved understanding of the concepts underlying the laboratory activity.

As indicated in Table 77, 48% of the participants indicated that they understood better if they participated in a pre-lab discussion prior to performing the experiment, while 33% indicated that they understood better when they performed the experiment first and then participated in a post-lab discussion. A small percentage (19%) felt there was no clear difference.

There are several factors that could explain the understanding results if one considers Bloom's Taxonomy (Jalil, 2006). When the experiment is discussed prior to performing it, the instructor is addressing higher levels of learning (e.g., analysis) without properly addressing the knowledge level (1st level). This means that some participants may not know what the instructor is talking about when the discussion connects the theory to practice. Some participants may misunderstand, leading to misconceptions. Here some participants preferred acting as receivers of information, demonstrating that they were able to repeat experiments. This preference may have been due to one or a combination of the following: lack of prior experience in laboratory problem solving, cook-book laboratory experiences, or personal lack of confidence.

Performing the experiment first would be considered the natural process of learning, as one begins with observation, which is the first level in Bloom's taxonomy: knowledge. When the experiment is performed first and then discussed, this promotes better visualization of the underlying concepts of that experiment. This approach may facilitate critical thinking, encourage use of prior knowledge, and assist students in seeking additional information. The participants had to think more independently, make judgments, and interpret the laboratory manual.

When participants are allowed to discover answers on their own, retention improves, and deeper understanding develops (Jalil, 2006).

Reflective Assessment – Bloom's Taxonomy

Section three of the reflective questionnaire was used as a tool to estimate learning gains or outcomes due to laboratory instruction. Bloom's cognitive taxonomy separates into six major domains: knowledge, comprehension, and application, all considered lower-order cognitive skills, and analysis, synthesis, and evaluation, considered higher-order cognitive skills. This taxonomy was applied to the analysis of the reflective self-assessment questionnaires and interviews. The questionnaire gave more quantitative data and the interviews more qualitative information.

In the study, participants (N=56) completed a self-evaluation of their overall learning gains/outcomes in the cognitive domains of Bloom's Taxonomy due to the laboratory instruction. The questionnaire question was, in general, formulated as follows: "Which description best describes the kind of learning/understanding you have gained by doing this laboratory activity?" The participants were given the Bloom categories in the cognitive domains (knowledge, comprehension, application, analysis/synthesis, and evaluation) to characterize their learning gains/outcomes. To assist the participants in understanding the meaning of each domain, keywords were provided. Two examples are knowledge (to recall, describe, identify facts, terms, or phenomena) and analysis (to analyze, troubleshoot, and distinguish concepts through reasoning). The participants evaluated their own learning gains/outcomes on the scale nothing, a little, some, a lot, or very much for each of the cognitive domains. Table 78 summarizes participants' overall self-assessments of the cognitive domains.
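
The dissertation reports the result of this self-evaluation as an "overall average choice" per domain (Table 78) without spelling out the coding, so the sketch below is only an illustration: it assumes the five verbal ratings are coded 1-5, averages them, and rounds the mean back to a letter/label pair. The scale values, function name, and sample ratings are all assumptions, not the study's procedure.

    # Illustrative sketch only: collapsing one domain's verbal ratings into an
    # "overall average choice," assuming an ordinal 1-5 coding of the scale.
    from statistics import mean

    SCALE = {"nothing": 1, "a little": 2, "some": 3, "a lot": 4, "very much": 5}
    LABELS = {1: "A - nothing", 2: "B - a little", 3: "C - some",
              4: "D - a lot", 5: "E - very much"}

    def overall_choice(ratings):
        """Round the mean coded rating and map it back to a letter/label pair."""
        return LABELS[round(mean(SCALE[r] for r in ratings))]

    # Invented ratings for one domain (the real study had 56 respondents).
    knowledge_ratings = ["a lot"] * 30 + ["very much"] * 12 + ["some"] * 14
    print(overall_choice(knowledge_ratings))   # "D - a lot", as in Table 78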

Table 78 Participant Assessment of Laboratory Cognitive Domains

Cognitive Domain        Overall Average Choice
Lower Order
  Knowledge             D - a lot
  Comprehension         D - a lot
  Application           C - some
Higher Order
  Analysis              C - some
  Synthesis             B - a little
  Evaluation            B - a little

It is clear that, with regard to knowledge and comprehension, there are no differences between the overall average participant selections of "a lot," and this is supported by the data in Table 79 for all six laboratory activities. The application category was rated "some" overall by the participants, which is supported by three of the selected laboratory activities noted in Table 79. Regarding the higher-order categories of analysis, synthesis, and evaluation, the ratings made by the participants varied depending on the activity and instruction. The overall average choice for the category of analysis was "some," which is supported by the data in Table 79 for three of the six selected activities. This rating suggests that the participants felt they gained more in the area of analysis from the technology-based, microcomputer-based (MBL) laboratory activities. For the categories of synthesis and evaluation, participants selected "a little" as their overall average choice. Participants indicated there was little to no gain in their learning at the synthesis and evaluation level during lab activity 3 (Matter Lab) or activity 7 (Molecular Shapes). Participants did have to compare, contrast, and justify solutions in laboratory activities 3 and 7, but not to the extent they had to in other activities.

Table 79 Laboratory Activities in Terms of Bloom's Taxonomy

                      Laboratory Activity
Cognitive Domain    2 *DP   3 ML   4 CRS   7 MS   8 *TE   9 *MV
Knowledge           RG      RG     RG      RG     RG      RG
Comprehension       RG      RG     RG      RG     RG      RG
Application         RG      SRG    SRG     SRG    RG      RG
Analysis            RG      SRG    SRG     SRG    RG      RG
Synthesis           RG      SRG    RG      SRG    RG      SRG
Evaluation          SRG     NRG    SRG     NRG    SRG     SRG

RG indicates the skill was required and gained, as identified by participants
SRG indicates the skill was somewhat required and gained, as identified by participants
NRG indicates the skill was not at all required or gained, as identified by participants
*Technology-based (MBL) activities

Reflections Laboratory Learning – Bloom's Taxonomy

Knowledge involves lower-order thinking and includes those behaviors that emphasize the recognition or recall of ideas, material, or phenomena (Domin, 1999). This involves such skills as defining terms, identifying objects, or stating procedural steps. Remembering, recalling, and recognizing knowledge is essential for the further development of meaningful learning, as this knowledge is used in more complex tasks. Recognizing knowledge involves retrieving it from long-term memory in order to compare it with presented information. Recalling knowledge involves retrieving it from long-term memory. Instruction at the knowledge level promotes retention of the presented material in much the same form as it was taught (Anderson & Krathwohl, 2001). The student's role at the knowledge level is to read, listen, observe, take notes, recall information, and ask and respond to questions. Some of the keywords used to evaluate participants' comments were to learn, to remember, and to understand. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of knowledge. The participants recalled, remembered, and/or recognized chemistry knowledge.

For example, participants recognized their knowledge of chemical reactions and of steps in how to use instruments, organize data tables, and repeat density calculations.

Comprehension also involves lower-order thinking and includes those behaviors that emphasize grasping the meaning of informational materials (Domin, 1999). This involves skills such as explaining a concept, interpreting a graph, or generalizing data. When the goal of instruction is to promote knowledge transfer, the focus shifts to comprehension (Anderson & Krathwohl, 2001). Some of the keywords used to evaluate participants' comments were to explain, to describe, and to understand. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of comprehension. The participants constructed meaning from the laboratory instruction through graphic, oral, and written communication. For instance, participants could give examples, restate ideas in their own words, and explain experimental concepts. Here the participants built connections between the "new" knowledge to be gained and prior knowledge. This new knowledge is integrated with existing cognitive frameworks and mental models (Anderson & Krathwohl, 2001).

Being able to interpret, exemplify, classify, summarize, infer, compare, and explain knowledge is essential for the further development of meaningful learning. During comprehension students may begin to convert information from one form to another. For instance, converting a graph into words involves interpretation skills, while exemplifying occurs when a student can give a specific example of a concept. Inferring involves finding patterns, while comparing involves detecting similarities and differences between two or more ideas.

Classifying, inferring, and comparing occur when a student recognizes that something belongs to a certain category, as in knowing the differences between elements, compounds, and mixtures. Explaining occurs when the student can construct a cause-and-effect model of a system, such as correlating the colors of spectral lines with their wavelengths.

Application is considered by some to be lower-order thinking and by others to be the lowest level of higher-order thinking. For the purpose of this study it was considered the transitional level from lower- to higher-level thinking. Application involves both lower-order and higher-order thinking and includes those behaviors that emphasize the ability to use learned material in new and concrete situations (Domin, 1999). To apply knowledge means completing or using a procedure in a given situation. This involves skills such as problem solving, utilizing concepts in novel situations, and constructing graphs. Some of the keywords used to evaluate participants' comments were to apply, to solve, and to predict. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of application. The participants constructed meaning from the laboratory instruction by being able to execute and/or implement a task with some degree of understanding of the problem and the procedure. For instance, participants felt confident in applying the learned concepts and the mathematics to other situations. Here the participants applied information in new and concrete situations to solve problems. This ability to apply knowledge is used with other cognitive processes such as understanding and creating (Anderson & Krathwohl, 2001).

Analysis is the lowest level of higher-order thinking and includes those behaviors that emphasize the ability to break down material into its component parts (Domin, 1999). Analysis of knowledge involves identifying pertinent data, identifying inconsistencies, and establishing relationships between items. Learning to analyze is considered one of the most important objectives in science instruction.

Some of the keywords used to evaluate participants' comments were to distinguish, to analyze, and to differentiate. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of analysis. The participants constructed meaning from the laboratory instruction by being able to distinguish relevant from irrelevant parts and determine how the elements of a situation fit or function within a structure relating to chemistry. For instance, participants felt confident in being able to analyze scientific error, differentiate between types of chemical reactions, and distinguish molecular shapes as a result of their experiences during laboratory instruction. Here the participants used the cognitive processes of differentiating, organizing, and attributing new information in terms of relevance or importance (Anderson & Krathwohl, 2001).

Synthesis involves higher-order thinking and includes those behaviors that emphasize the ability to put parts together to form a new whole (Domin, 1999). Synthesis of knowledge can involve checking consistencies, formulating a hypothesis, proposing a plan for an experiment, or proposing alternatives. Synthesis involves students making judgments based on criteria and standards using the cognitive processes of checking and critiquing (Anderson & Krathwohl, 2001). Criteria factors include consistency, effectiveness, efficiency, and quality. The standards can be either qualitative or quantitative. Checking includes detecting fallacies within a product by determining whether the product has internal consistency, for instance when a student tests whether data support a hypothesis or conclusion or whether presented material contains parts that contradict one another. Some of the keywords used to evaluate participants' comments were to create, to design, and to compare. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of synthesis.

The participants constructed meaning from the laboratory instruction by being able to distinguish relevant from irrelevant parts and determine how the elements of a situation fit or function within a structure relating to chemistry. For instance, participants felt confident in being able to create a strategy to determine what errors occurred in trials, speculate about why certain unexpected results occurred that did not support the hypothesis, and make judgments on whether the data supported the chemistry concepts, as a result of their experiences during laboratory instruction.

The final higher-order thinking domain, evaluation, includes those behaviors that emphasize the ability to judge the value of material based on definite criteria (Domin, 1999). Evaluation of knowledge can include judging the value of data, judging the value of experimental results, and justifying conclusions. Evaluation involves students putting material together or reorganizing it into a coherent whole or new pattern that allows them to build a model of chemistry phenomena. At this level students may judge the value of material using one or all of the following cognitive processes: generating, planning, and producing (Anderson & Krathwohl, 2001). Some of the keywords used to evaluate participants' comments were to justify, to conclude, and to compare/contrast. Table 80 presents some examples of the participants' reflective comments concerning the cognitive domain of evaluation. For instance, participants compared and contrasted class experimental data, justified the resulting end product(s), and generated conclusions as a result of their experiences during laboratory instruction.

Table 80 Participants' Reflections on Cognitive Domains (N=56)

Knowledge
ST 7: "I did gain knowledge on how to use instruments and determine an unknown substance."
ST 11: "I can recall how to figure density; how to organize a table."
ST 12: "I can describe how you would balance redox reactions."
ST 53: "I learned how to recognize chemical reactions in the lab. The experiments showed the information in a more visual illustrated process."

Comprehension
ST 8: "Participating in the experiment allows me to better understand and explain in my own words what was done."
ST 16: "I could easily explain to someone the differences between elements, compounds, and mixtures."
ST 26: "Performing experiments isn't crucial for knowledge as much as comprehension. Experience is important for comprehension."
ST 53: "I was able to comprehend the correlation between the colors of spectral lines and their wavelengths."

Application
ST 11: "Applying all of the different theories and formulas to actual problems."
ST 45: "I feel confident that I can predict whether chemical reactions will happen or not."
ST 52: "I learned to apply the concept of the Law of Conservation of Mass to the experiment and real life."
ST 54: "I can solve the enthalpy equations and can calculate heat and temperature changes."

Analysis
ST 12: "I now hold the ability to analyze various chemical reactions and determine whether they are a certain classification of reaction."
ST 45: "I know how to analyze the shape of the molecule to determine hybridization and determine polarity."
ST 53: "I was able to distinguish the shapes and geometry of certain molecules by analyzing the number of bonds, lone pairs, and electron groups."
ST 54: "I learned to analyze my error which occurred during the experiment."

Synthesis
ST 15: "Create a strategy to figure out what went wrong in the first trial."
ST 19: "I believe I can design experiments to collect and analyze raw data."
ST 43: "I was able to speculate as to why certain unexpected results occurred."
ST 45: "This lab not only supports the ideal gas law, but paves my way to learning other gas laws such as Boyle's or Charles's."

Evaluation
ST 9: "I was able to compare and see the differences between different chemical reactions."
ST 11: "Comparing information from all the groups in the class."
ST 12: "I can justify why each substance got separated from the mixture the way it did."
ST 53: "I was able to compare previous measured data to the experimental data and draw appropriate conclusions."

Final Interview Question Eleven – Bloom's Taxonomy

Final interview question 11 was used as a tool to determine which three of the six cognitive domains in Bloom's Taxonomy the interview participants felt they utilized most often during the semester course.

As discussed in the previous section, Bloom's Taxonomy is divided into six domains further classified into two levels of thinking: lower-order (knowledge, comprehension, and application) and higher-order (analysis, synthesis, and evaluation). Table 81 summarizes the interview participants' responses to the interview question concerning which three cognitive domains they utilized most often in the semester course. Table 82 presents some examples of participants' responses when asked to expand on their original answer and explain why, and during which instructional feature, they felt they used those particular cognitive domains the most.

By the end of the semester course, 85% of the interview participants identified the cognitive domain of application as the skill that was used most often during the semester course. This domain is the transitional level from lower-order to higher-order thinking in Bloom's Taxonomy model. Participants indicated that application skills were used most often during laboratory work and post-laboratory analysis for performing calculations and writing laboratory reports. Application allowed the participants to complete or use a procedure in a given situation in the chemistry laboratory and take the new information gained to solve different types of problems.

Eighty percent of the interview participants identified comprehension as the second domain used most often during the semester course. Comprehension, a lower-order thinking skill, was necessary for pre-laboratory and laboratory work in order to grasp the meaning of and classify informational materials.
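
These percentages follow directly from tallying the selection matrix in Table 81. The short sketch below is illustrative only; the selections dictionary, function name, and two sample participants are invented stand-ins for the actual twenty interview records.

    # Illustrative sketch: counting how many interview participants marked each
    # Bloom domain among their three choices, expressed as a percentage.
    from collections import Counter

    DOMAINS = ["knowledge", "comprehension", "application",
               "analysis", "synthesis", "evaluation"]

    def domain_percentages(selections):
        """selections maps a participant ID to the three domains that participant marked."""
        counts = Counter(domain for picks in selections.values() for domain in picks)
        n = len(selections)
        return {d: round(100 * counts[d] / n) for d in DOMAINS}

    # Two invented participants; the study interviewed twenty.
    selections = {
        "ST-1": ("knowledge", "comprehension", "application"),
        "ST-2": ("comprehension", "application", "analysis"),
    }
    print(domain_percentages(selections))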

Table 81 Descriptive Statistics of Interview Participants (N=20)
Final Interview Question 11: Which three of the six learning skill levels in Bloom's Taxonomy did you utilize most often in this course?
(Matrix of the 20 interview participants by the six cognitive domains of knowledge, comprehension, application, analysis, synthesis, and evaluation; each participant marked the three domains used most often.)

The cognitive domain of knowledge was identified by 60% of the participants as the third skill level used most often during the semester course. The lower-order thinking domain of knowledge requires retrieving relevant knowledge from long-term memory. As suggested by the participants, knowledge was essential during the pre-laboratory and laboratory work components of instruction in order to perform more complex tasks as the semester progressed.

By the end of the semester course, the three higher-order thinking domains were identified as being used the least by the participants. Forty-five percent of the interview participants identified the cognitive domain of analysis as the higher-order thinking skill used most often during the semester course. This domain requires breaking down materials into component parts and determining how they relate to one another and to the overall purpose in chemistry.

Participants indicated that analysis skills were used most often during post-laboratory analysis for performing calculations and writing laboratory reports. Analysis allowed the participants to examine the information (data) and develop conclusions by making inferences and using evidence to support their conclusions.

The interview participants identified the cognitive domain of synthesis as the skill used the least (5.0%) during the course of the semester. Synthesis requires a student to apply knowledge and skills to produce alternatives. During the semester course participants did indicate that they had to create strategies and speculate about unexpected results (data).

Twenty-five percent of the interview participants felt they often used the cognitive domain of evaluation during the semester course. This domain required the participants to reorganize their models into a functional whole. Participants indicated that evaluation skills were necessary for post-laboratory analysis.

Table 82 Interview Participants' Reflections – Bloom's Taxonomy
Final Interview Question 11: Which three of the six learning skill levels in Bloom's Taxonomy did you utilize most often in this course?

ST-2: "I would have to say comprehension, application, and analysis. In every lab report we had to describe and interpret our results. We had to classify and arrange our results so that whoever picked up our notebooks could understand."
ST-4: "I think you use all six of them to a certain level. I believe that comprehension, the understanding of information and grasping the meaning, was used a lot from lab to lab. Application was used writing lab reports. You had to demonstrate understanding of the data, perform calculations, and be able to draw conclusions from what you observed. Analysis involved breaking down what you did in order to report the information."
ST-5: "Knowledge you would get from the course lectures, which corresponded to the lab. So by watching the PowerPoint lectures from the lecture and lab helped one understand what was going on in lab. Comprehension would involve further understanding during the laboratory work, which reinforced the material visually showing you what was happening. Application because of the calculations that we have to do in lecture then actually applying that in the lab."
ST-6: "Knowledge, application, and comprehension were used. There are some things you have to memorize but you have to understand the application so you can apply it. One also gained an understanding of the information by tying it all together."

ST-9: "Knowledge as it included collecting and examining information. Knowledge was gained from the pre-lab. Application was used performing calculations. We also had to be able to classify, such as when identifying types of chemical reactions. Analysis was the major component of the post-lab. This included analyzing your results and categorizing everything."
ST-10: "When you write a post-lab report you're evaluating all the experimental data. Comprehension was necessary because in order to perform the lab you had to have some understanding of the concepts. You also had to be able to apply the formulas and calculations to gain an overall understanding."
ST-12: "You have to know certain terms that are used in the labs. You have to know what the formulas are and how they are used. You have to know how to collect data. Comprehension you have to know how to interpret your data and discuss it. For instance when you're doing the post-lab you have to analyze what you did during the experiment. You have to explain the data and organize your results into tables."
ST-16: "Comprehension because it helped you understand the theories behind the lab. It helps in grasping the meaning of what you did. Application is to apply the collected information to what you already know. Synthesis helped me to understand the material from before and apply it to the next lab."
ST-18: "Application it was used in the pre-lab and during laboratory work. You would apply the pre-lab to the experiment. The analysis involved using equations that dealt with the lab. You'd take the data that you got from the experiment and evaluate it. Evaluation included taking the experimental data as a whole and being able to draw a conclusion."

Characterization of Participants' Epistemological Reflections

Epistemology and Instructional Methods

Section one, question eight of the Student Evaluation of Laboratory Instruction Questionnaire was used to evaluate participants' beliefs concerning what they learned, if anything, about epistemological beliefs with respect to the laboratory instructional methods. This section of the reflective student questionnaire (Appendix E) was used to assess participants' reactions to the three major instructional components (e.g., pre-laboratory, laboratory work, and post-laboratory) of laboratory instruction implemented during the semester course with the EBAPS dimensions (structure of knowledge, nature of knowing and learning, real-life applicability, evolving knowledge, and source of ability to learn). The results for all the participants (N=56) and the interview participants (N=20) are presented in Tables 83 and 84-93, respectively.

Evidence from the reflective open-ended responses on the student questionnaire indicated that some participants (N=56) perceived epistemological messages in their instruction (Table 83). For the EBAPS dimension structure of scientific knowledge, participants' reflections suggest that they believe scientific knowledge to be structured and connected. For the EBAPS dimension nature of knowing and learning scientific knowledge, participants' reflections suggest that they believe learning scientific knowledge requires making connections with prior knowledge. For the EBAPS dimension real-life applicability of scientific knowledge, participants' reflections suggest that scientific knowledge is relevant and visible in our daily lives. For the EBAPS dimension evolving scientific knowledge, participants' reflections suggest that they believe scientific knowledge is not set in stone, that error occurs, and that results do not always match the concepts. For the EBAPS dimension source of ability to learn scientific knowledge, participants' reflections suggest that scientific knowledge can be learned by anyone through practice and that one learns by doing.

Table 83 Participants' Reflections – Epistemology and Instructional Methods
EBAPS Variable / Reflective Written Comments
What have you learned, if anything, concerning your epistemological beliefs about science with respect to the instructional methods?

Structure of Scientific Knowledge
ST-27: "That knowledge of scientific principles and definitions helps during instruction. As the course progresses I see how scientific knowledge is highly structured and connected from one lab to the next."
ST-46: "I have learned that chemistry is more than explosions. That it is the building blocks for everything. Scientific knowledge is connected from one topic to the next."
ST-52: "I've learned that chemistry involves large quantities of hands-on work and descriptive observations. These observations are connected to the science concepts."
ST-53: "I believe scientific knowledge is attained through a series or process. Through these activities we can make connections between the concepts and data."

Nature of Knowing & Learning Science
ST-9: "I have learned how to analyze the lab results and other information by working actively through the material. I gather the information in combination with the results to form a well thought out conclusion."
ST-49: "I have learned that two different instructors could explain the same concept two different ways but still be correct. That I have to relate what I learn in lab to my prior knowledge."
ST-50: "Knowing what the lab is all about is very essential in order to be able to comprehend the material and then apply it to prior knowledge. Being able to analyze the results assists in synthesizing and creating other ideas. Being able to evaluate your results and sum it up involves constructing one's own understanding."
ST-53: "The instructional methods facilitated my understanding of this aspect of chemistry by compartmentalizing it in various successive sections. For instance during the laboratory work I used my prior knowledge to construct understanding about the new material encountered."
ST-54: "I have learned that it is very important to have pre- and post-labs. They allow you to reflect before and after the laboratory work. It is important that the instructor allow you to do things on your own. In other words construct your own understanding."

Real-Life Applicability of Science
ST-11: "It is a phenomenal event for chemistry is happening everywhere and at every moment. Being able to see the chemistry concepts working in everyday life makes them more relevant."
ST-13: "I've taken chemistry before, but this course has increased my perspective on how chemistry is seen in our daily life."

Evolving Scientific Knowledge
ST-7: "I have learned that there is no exact, right answer in science. That science is always changing and the laboratory results may or may not support the current knowledge."
ST-8: "From this course I have learned that science has error but strives to be as precise and accurate as humanly possible. However, it changes and does not always occur as predicted."
ST-15: "Experiments do not always go according to plan. For instance some of the predictions did not concur with some of the results. This supports the idea that science is not set in stone."
ST-16: "I have learned that science can often require many attempts/experiments to obtain supportive results. Sometimes you have to repeat an experiment if it does not go according to plan or if you want to try a different method. The results do not always support the science concepts as error does occur."

ST-45: "I learned why some % yields are above or below 100%. This supports my belief that the results from laboratory experiments are not exact and error is anticipated. The results obtained should be repeatable."
ST-52: "I've learned that chemistry involves performing and recording specific observations such as viewing whether a change or no change occurs during the procedure. The results may or may not support the scientific concepts."

Source of Ability to Learn Science
ST-4: "Through practice I learned that I can apply the concepts from lecture to lab."
ST-11: "I feel that I have achieved my goal when I perform experiments and understand the results. I enjoy learning new concepts and theories involving chemistry."
ST-15: "Fun is an important part of 'instructional methods,' especially in learning science. The more interesting the subject the more likely one is to remember and understand the information. Performing the lab and then analyzing the results improves my understanding."
ST-49: "You can learn science if you do it. After I have performed the experiment I understand a concept much more easily."
N=56

Final Interviews – Epistemological Beliefs and Instructional Methods

During the final interviews, five questions related to the multi-dimensional axes of the EBAPS (structure of scientific knowledge, nature of knowing and learning science, real-life applicability of science, evolving scientific knowledge, and source of ability to learn science) were used to probe the participants' views on which instructional feature influenced their beliefs (Appendices B & N). The interview participants were asked to elaborate on the questions in order to invoke their thoughts about the EBAPS variables and the instructional features (e.g., pre-laboratory, laboratory work, and post-laboratory). These answers can often display different epistemological categories within one question. This suggests that one cannot fully isolate these variables and only search for evidence in the participants' reflections and interviews.

Structure of Scientific Knowledge

Strong support was shown by 45% of the participants (N=20), indicating that they found the post-laboratory work to be the most effective in influencing their epistemological beliefs about the structure of scientific knowledge (Table 84). Moderate participant support was shown for the laboratory work, with 30% indicating that it had a moderate influence on their laboratory experience and understanding of the structure of scientific knowledge. Three participants (15%) indicated that none of the instructional features influenced their beliefs for this dimension. The pre-laboratory was ranked fourth, with 10% suggesting that it had influenced their beliefs about the structure of scientific knowledge.

Table 84 Instructional Feature – Structure of Scientific Knowledge
Instructional Category / Most Effective
Pre-laboratory 10.0%
Lab Work 30.0%
Post-laboratory 45.0%
Other 15.0%
N=20

Evidence from the final interview responses indicated that some participants (N=20) perceived epistemological messages in their instruction (Table 85). For the EBAPS dimension structure of scientific knowledge, interview participants suggested that they believe scientific knowledge to be structured and connected. Participants 9 and 12 identified the pre-laboratory as the instructional method that influenced their beliefs about the structure of scientific knowledge because it assisted in making connections between the concepts and the rest of the laboratory experience. The participants (ST 1, 3, 13, 15, and 18-19) selecting the laboratory work as having the most influence on their structure of scientific knowledge beliefs expressed that during
laboratory work they could begin to tie all the concepts from the pre-laboratory together with what occurred during the lab. The majority of the participants (ST 4-5, 7-8, 11, 14, 16-17, and 20) described the post-laboratory feature as having the most influence on their structure of scientific knowledge views. These participants felt that the post-laboratory experience allowed them to see how all the concepts and results from the pre-lab and laboratory work were structured and connected, improving their understanding. Three participants (ST 2, 6, and 10) expressed that none of the instructional methods influenced their beliefs concerning this dimension. They identified prior science learning experiences as having a major influence.

Table 85 Structure of Scientific Knowledge – Instructional Methods
Final Epistemological Beliefs Interview Question-1: Structure of Scientific Knowledge – What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Structure of Scientific Knowledge in this course?
Instructional Issue / Quotation Comments

Pre-laboratory
ST 9: "The pre-lab as I could begin to see how the concepts discussed were connected to the overall lab concepts."
ST 12: "The pre-lab because it connected and related the concepts to the lab to be performed."

Laboratory Work
ST 1: "The laboratory work because you are actively engaged in learning and connecting the results to the concepts."
ST 3: "The laboratory work as you could see how it connected to the pre-lab concepts."
ST 13: "The laboratory work because I could tie the material from the pre-lab to what happened during the lab."
ST 15: "The laboratory work as it clarified the gray areas."
ST 18: "The laboratory work because you could make connections with the pre-lab."
ST 19: "The laboratory work because you can observe the connections."

Post-Laboratory
ST 4: "The post-lab because you're connecting concepts that you've learned from previous labs."
ST 5: "The post-lab, this is where you actually try and connect the information."
ST 7: "The post-lab because it connects the pre-lab and laboratory work concepts together. So you can see how it is connected."

ST 8: "The post-lab. After you evaluated the data you could see the bigger picture of how everything was intertwined."
ST 11: "The post-lab as it helped me put it all together."
ST 14: "The post-lab because you attempt to understand why things work the way they do and draw a conclusion."
ST 16: "Post-lab because it helped strongly connect everything."
ST 17: "The post-lab because it tied all the concepts together."
ST 20: "All three of them influenced that belief. However I would choose the post-lab."

Overall
ST 2: "I don't believe so. I think I understood it in elementary school. I've always been taught that theories should be proven; my 3rd grade teacher beat it into our heads."
ST 6: "Not one in particular. I developed the belief a long time ago when I was first studying science, I guess."
ST 10: "None of the instructional methods. I just started taking kind of some more hands-on science classes."
N=20

Nature of Knowing and Learning Scientific Knowledge

Strong support was shown by 50% of the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs about the nature of knowing and learning scientific knowledge (Table 86). Moderate participant support was shown for the post-laboratory, with 25% indicating that it influenced their laboratory experience and understanding of the nature of knowing and learning scientific knowledge. One participant indicated that none of the instructional features influenced her beliefs for this dimension. The pre-laboratory was ranked third, with 20% suggesting that it had a moderate influence on their beliefs.

Table 86 Instructional Feature – Nature of Knowing and Learning Science
Instructional Category / Most Effective
Pre-laboratory 20.0%
Lab Work 50.0%
Post-laboratory 25.0%
Other 5.0%
N=20

Evidence from the final interview responses indicated that some participants (N=20) perceived epistemological messages in their instruction (Table 87). For the EBAPS dimension nature of knowing and learning scientific knowledge, interview participants suggested that they believe connecting prior scientific knowledge with new concepts is important. Four participants (ST 2, 8, 10, and 20) identified the pre-laboratory as the instructional method that influenced their beliefs about the nature of knowing and learning scientific knowledge because it allowed them to connect their prior knowledge with the new knowledge being presented during the rest of the laboratory experience. The majority of the participants (ST 3-7, 11-12, 14, 17, and 18) selected the laboratory work as having the most influence on their nature of knowing and learning scientific knowledge beliefs. They expressed that during laboratory work they could apply and begin to tie all their prior and current concepts together with what occurred during the lab. A few participants (ST 1, 13, 15-16, and 19) described the post-laboratory feature as having the most influence on their nature of knowing and learning scientific knowledge views. These participants felt that the post-laboratory experience allowed them to take all the scientific knowledge gained from the pre-lab and laboratory work, as well as their prior knowledge, and construct their own understanding. One participant (ST 9) expressed that none of the instructional methods influenced her beliefs concerning this dimension. She suggested that everyone has their own method of learning that works for them.

Table 87 Nature of Knowing and Learning Science – Instructional Methods
Final Epistemological Beliefs Interview Question-2: Nature of Knowing and Learning in Science – What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Nature of Knowing and Learning in Science in this course?
Instructional Issue / Quotation Comments

Pre-lab
ST 2: "The pre-lab as I had to construct my own understanding and think outside the box."
ST 8: "Pre-lab because you need to relate the new things you learn to the previous material."
ST 10: "Probably the pre-lab as that is where you're first introduced to the new material and you build on what you've done previously."
ST 20: "Pre-lab."

Lab Work
ST 3: "I think the lab work was the most effective for me. I wanted to be able to spit out more than facts and really understand."
ST 4: "I would say the lab work because you use real world situations to relate to what you see and what's going on during the labs."
ST 5: "The laboratory work influenced my beliefs. Using my prior knowledge helped me understand while doing the lab."
ST 6: "The lab work because as things occur you have to be able to think the results through."
ST 7: "The lab work is when you are actually constructing knowledge as you work and begin to understand."
ST 11: "Laboratory work because it helped me to expand my knowledge learning."
ST 12: "The lab work really helped in explaining the concepts."
ST 14: "I would say the lab work. Prior knowledge and experiences were important."
ST 17: "The lab work as it actually allowed you to build on the lecture material."
ST 18: "It would be the laboratory work because even if you had prior experiences or knowledge you're still learning a new concept. It allowed you to use the new concept."

Post-Lab
ST 1: "The post-lab because it summarizes most of your findings in order to show your understanding. You use prior science knowledge."
ST 13: "I would say the post-lab because you have to put the knowledge together and draw a conclusion."
ST 15: "Post-lab because it forced you to use the information from the experiment and relate it to the concepts."
ST 16: "Post-lab. It required some learning and understanding on my own. It involves using prior knowledge."
ST 19: "I think the post-lab because you're trying to answer and understand why and what happened in lab."

Overall
ST 9: "None of the methods. I've always felt some people learn better by memorizing and others understand better by rewriting or rephrasing it in their mind."
N=20

Real-Life Applicability of Scientific Knowledge

Strong support was shown by 65% of the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs about the real-life applicability of scientific knowledge (Table 88). Minimal participant support was shown for the post-laboratory, with 15% indicating that it somewhat influenced their laboratory experience and understanding of the real-life applicability of scientific knowledge. Three participants indicated that none of the instructional features influenced their beliefs for this dimension. The pre-laboratory was ranked fourth, with one participant suggesting that it had influenced her beliefs.

Table 88 Instructional Feature – Real-Life Applicability
Instructional Category / Most Effective
Pre-laboratory 5.0%
Lab Work 65.0%
Post-laboratory 15.0%
Other 15.0%
N=20

Evidence from the final interview responses indicated that some participants (N=20) perceived epistemological messages in their instruction (Table 89). For the EBAPS dimension real-life applicability of scientific knowledge, interview participants suggested that scientific knowledge occurs in and is relevant to everyday life. One participant (ST 13) identified the pre-laboratory as the instructional method that influenced her beliefs about the real-life applicability of science because the activities presented examples in the readings. The majority of the participants (ST 1, 4-5, 8-9, 11-12, and 14-19) selected the laboratory work as having the most influence on their real-life applicability of scientific knowledge beliefs. They expressed that during laboratory work certain experiments or demonstrations involved concepts that could be applied to real
life, such as the atomic theory and how fireworks work. A few participants (ST 6, 10, and 20) identified the post-laboratory feature as having the most influence on their real-life applicability of scientific knowledge views. These participants described the post-laboratory instructional method as a way to apply what they had learned in the course to their daily lives. Three participants (ST 2-3 and 7) expressed that none of the instructional methods influenced their beliefs concerning this dimension. They identified prior science learning experiences as having a major influence.

Table 89 Real-Life Applicability – Instructional Methods
Final Epistemological Beliefs Interview Question-3: Real-Life Applicability of Science – What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Real-Life Applicability of Science in this course?
Instructional Issue / Quotation Comments

Pre-lab
ST 13: "Pre-laboratory. As there were examples in the reading."

Lab Work
ST 1: "The laboratory work which included demonstrations of science things that happen in real life such as fireworks and tire pressure."
ST 4: "I think the lab work. We made observations about the chemistry of light, the role of the gas laws in tire pressure, neon lights, and many other things related to real life."
ST 5: "The instructional feature that influenced my belief the most was the laboratory work."
ST 8: "Probably the lab work. I would see how these processes apply outside of the lab in real life situations."
ST 9: "The laboratory work. When we did laboratory work we performed activities or experiments where we could see for instance how fireworks are created."
ST 11: "Probably the most influential is laboratory work because it demonstrated the different things that go on during real life such as a chemical reaction or phase change."
ST 12: "I would explain how the laboratory work relates to real life in the post-lab. We were doing this one lab in the laboratory where we studied light and how fireworks are made. I think the lab work because when you actually would do it you were actually experiencing the reality."
ST 14: "I would say the lab work just because that's when you actually see it."
ST 15: "The laboratory work offered us some experiences with materials used in everyday life. We did reactions with dish soap and chemicals used in fireworks."
ST 16: "I would say the laboratory work. We had to work with light spectrums, the sun and fireworks. It shows that chemistry is everywhere."

ST 17: "Laboratory work because it ties in to how things actually happen."
ST 18: "I'll go with the laboratory work. We had to practice safety just like if we had a job in science or one using chemicals."
ST 19: "I'd say the laboratory work because we would do labs and demonstrations that involved concepts like the science of fireworks."

Post-Lab
ST 6: "The post-lab because we would be able to see the connections between lab and everyday life. For instance how heat is transferred via your hot water heater."
ST 10: "Probably the post-lab because you see that it can apply at home."
ST 20: "Post-lab. I think that's where everything connects together and you gain some insight into how it applies to our life. You realize the difference it really has made."

Overall
ST 2: "None of the instructional features applied."
ST 3: "All of the methods, even the course lecture and discussion portion of the lab. I learned gradually over time that chemistry is all around."
ST 7: "None of the features. I think scientists and people everywhere use it. I believed it before I came into the classroom."
N=20

Evolving Scientific Knowledge

Strong support was shown by 35% of the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs about evolving scientific knowledge (Table 90). Moderate participant support was shown for the post-laboratory, with 30% indicating that it was somewhat effective in influencing their laboratory experience and understanding of evolving scientific knowledge. Five participants indicated that none of the instructional features influenced their beliefs for this dimension. Once again the pre-laboratory was ranked fourth, with 10% suggesting that it had a minimal influence on their beliefs.

Table 90 Instructional Feature – Evolving Scientific Knowledge
Instructional Category / Most Effective
Pre-laboratory 10.0%
Lab Work 35.0%
Post-laboratory 30.0%
Other 25.0%
N=20

Evidence from the final interview responses indicated that some participants (N=20) perceived epistemological messages in their instruction (Table 91). For the EBAPS dimension evolving scientific knowledge, interview participants suggested that the laboratory experience challenged them to compare the concepts to their results and decide what explanation to believe. Two participants (ST 3 and 9) identified the pre-laboratory as the instructional method that influenced their beliefs about the evolving nature of scientific knowledge, as the pre-laboratory involved reviewing the theories that applied and how they had developed and changed over time. The majority of the participants (ST 8, 11, 13-16, and 19) selected the laboratory work as having the most influence on their evolving scientific knowledge beliefs. They expressed that during laboratory work they would consider the theories that applied and compare what they expected to happen with what actually happened. From this comparison some could see that scientific knowledge changes. Six participants (ST 1, 5, 7, 12, 18, and 20) described the post-laboratory feature as having the most influence on their evolving scientific knowledge views. These participants felt that the post-laboratory experience allowed them to compare the results to the theories and understand the changes or differences. Several participants (ST 2, 4, 6, 10, and 17) expressed that none of the instructional methods influenced their beliefs concerning this dimension. They identified prior science learning experiences as having a major influence.

Table 91 Evolving Scientific Knowledge – Instructional Methods
Final Epistemological Beliefs Interview Question-4: Evolving Scientific Knowledge – What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Evolving Knowledge of Science in this course?
Instructional Issue / Quotation Comments

Pre-laboratory
ST 3: "Probably pre-lab as I got a better picture or idea of how the concepts for the lab developed."
ST 9: "The pre-lab. Reading about the theories and looking back on the different hypotheses showed that scientists changed their minds over time."

Lab Work
ST 8: "The laboratory work. When carrying out the experimental process it can challenge what is considered set in stone."
ST 11: "Laboratory work."
ST 13: "Laboratory work."
ST 14: "Laboratory work because it offered supporting evidence."
ST 15: "I always thought science changed over time. However, laboratory work helped validate my belief."
ST 16: "The lab work. For example the lab where we studied the Law of Conservation of Mass."
ST 19: "I would say the lab work because it would reinforce the concepts about how they change."

Post-Lab
ST 1: "The post-lab helped me decide if something was right or wrong. After I studied the results I could predict why it happened and how the theory might have been contradicted."
ST 5: "The post-lab. Showed that things can change."
ST 7: "It would be the post-lab as you formed conclusions based on your results that did not always match the expected. So science is not set in stone and there are different possibilities."
ST 12: "The post-lab most because that's when you interpret your results. Everybody interprets their results differently so part of it will be somewhat based on their opinion."
ST 18: "I would say the post-lab. It gives you the opportunity to evaluate the new concepts and see if there is evidence to support the concepts."
ST 20: "I would say post-lab influenced or supported my belief."

Overall
ST 2: "My belief was established in 3rd grade."
ST 4: "I'm not so sure if any of them influenced my beliefs."
ST 6: "My belief was developed when I was a child in early science classes."
ST 10: "None of the instructional features influenced me. I held that belief in about 7th grade."
ST 17: "None influenced my belief because I have had many science classes."
N=20

Source of Ability to Learn Scientific Knowledge

Strong support was shown by 35% of the participants (N=20), indicating that they found the post-laboratory to be the most effective in influencing their epistemological beliefs about the source of ability to learn scientific knowledge (Table 92). Moderate participant support was shown for the pre-laboratory, with 25% indicating that it was moderately effective in influencing their laboratory experience and understanding of the source of ability to learn scientific knowledge. Four participants indicated that none of the instructional features influenced their beliefs for this dimension. The laboratory work was ranked third, with 20% suggesting that it was moderately effective in influencing their beliefs.

Table 92 Instructional Feature – Source of Ability to Learn
Instructional Category / Most Effective
Pre-laboratory 25.0%
Lab Work 20.0%
Post-laboratory 35.0%
Other 20.0%
N=20

Evidence from the final interview responses indicated that some participants (N=20) perceived epistemological messages in their instruction (Table 93). For the EBAPS dimension source of ability to learn scientific knowledge, interview participants suggested that they believe anyone can learn science; some just have to work harder. Five participants (ST 1-3, 12, and 14) identified the pre-laboratory as the instructional method that influenced their beliefs about the source of ability to learn scientific knowledge because it prepared and assisted them in making connections between the concepts and the rest of the laboratory experience. Four participants (ST 4, 7, 11, and 13) identified laboratory work as having the most influence on their source of ability to
learn scientific knowledge beliefs. They expressed that the hands-on experience was an effective way for them to tie all the concepts together. The majority of the participants (ST 6, 8-9, 15-16, and 19-20) described the post-laboratory feature as having the most influence on their source of ability to learn scientific knowledge views. These participants felt that the post-laboratory experience allowed them to apply the concepts and results from the pre-lab and laboratory work, thereby improving their understanding. The remaining participants (ST 5, 10, and 17-18) expressed that none of the instructional methods influenced their beliefs concerning this dimension. They identified motivation and effort as having a major influence.

Table 93 Source of Ability to Learn – Instructional Methods
Final Epistemological Beliefs Interview Question-5: Source of Ability to Learn Science – What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Source of Ability to Learn Science in this course?
Instructional Issue / Quotation Comments

Pre-lab
ST 1: "The pre-lab because if you just read and do the work anyone can be successful."
ST 2: "The pre-lab because it helped me understand the underlying concepts. You have to be prepared to learn."
ST 3: "The pre-labs because the concepts are introduced. I had a difficult time if I did not do all of the pre-lab."
ST 12: "The pre-lab because you are introduced to the concepts. Plus it doesn't matter if you think you are good at science or not, you still can learn by doing the work."
ST 14: "I would say pre-lab because that is where I gained most of the basic knowledge which helped me to better understand the material."

Lab Work
ST 4: "Laboratory work because I learn better when there are hands-on activities."
ST 7: "Laboratory work because doing the activities helped me to understand the concepts."
ST 11: "Laboratory work. I would reread the pre-laboratory material and go over the laboratory work data, then I would understand what happened."
ST 13: "Laboratory work influenced my beliefs."

Post-Lab
ST 6: "Post-lab because you learn more after experiencing and thinking about it."
ST 8: "All of them influenced my beliefs but the post-lab more than the other two. You learn by evaluating your results and drawing a conclusion."
ST 9: "The post-lab, but I already held the belief that all individuals can learn science."
ST 15: "I would say the post-lab. It helped to connect my prior knowledge with what I learned during the laboratory work."
ST 16: "Post-lab because learning science involves analyzing the information."
ST 19: "The post-lab because it really makes you think."
ST 20: "Post-lab. When doing the post-lab you are trying to process all the data. You are trying to find out why certain things happened."

Overall
ST 5: "The laboratory work and the post-lab. Both features allowed you to learn through experience."
ST 10: "None of the instructional features. I think it's based on your effort and motivation."
ST 17: "None of the instructional features. I would say anyone can learn science."
ST 18: "I do not think any of the features influenced me. I really like learning science."
N=20

Characterization of Participants' NOS Reflections

NOS and Instructional Methods

Section one, question eight of the Student Evaluation of Laboratory Instruction Questionnaire was used to evaluate participants' beliefs concerning what they learned, if anything, about NOS beliefs with respect to the laboratory instructional methods. This section of the reflective student questionnaire (Appendix E) was used to assess participants' reactions to the three major instructional components (e.g., pre-laboratory, laboratory work, and post-laboratory) of laboratory instruction implemented during the semester course with four of the NSKS dimensions (creative, developmental, parsimonious, and testable). The results for all the participants (N=56) and the interview participants (N=20) are presented in Tables 94-96.

Evidence from the reflective open-ended responses on the student questionnaire indicated that a few participants (N=56) perceived NOS messages in their instruction
(Table 94). For the NSKS dimension creative, participants' reflections suggest that they believe science involves imagination. For the NSKS dimension developmental, participants' reflections suggest that they believe scientific knowledge develops over time. For the NSKS dimension parsimonious, participants' reflections suggest that scientific knowledge is tied together by overlapping concepts. For the NSKS dimension testable, participants' reflections suggest that they believe scientific knowledge is gained through multiple trials and observations and that error does occur.

Table 94 Participants' Reflections – NOS and Instructional Methods
NOS Variable / Reflective Written Comments
What have you learned, if anything, concerning your NOS beliefs about science with respect to the instructional methods?

Creative
ST-11: "Science is a phenomenal event because it is happening everywhere and at every moment. You have to use your imagination to gain an understanding."
ST-13: "I've taken other science courses where we had to design experiments. This course continues to show how chemistry concepts involve imagination."

Developmental
ST-9: "I have learned that science requires gathering the information over time to form a well thought out conclusion."
ST-32: "I have learned that you can perform multiple experimental trials that support a theory for many years and with one opposing test disprove the theory."
ST-45: "I have learned how to support the Law of Conservation of Mass."

Parsimonious
ST-5: "The nature of science is based on many laws and concepts that are tied together."
ST-9: "I have learned about chemical reactions and the properties that set the different chemicals apart. The rules for this are simple and can be applied to other situations."

Testable
ST-7: "I have learned that there is no exact right answer in science."
ST-8: "I have learned that science is based on trial and error."
ST-11: "It is much easier to understand the nature of science by doing hands-on lab experiments than by simply reading."
ST-15: "Predictions did not always correspond with the results."
ST-16: "Science requires many attempts to obtain results. Sometimes you have to repeat an experiment if it does not go according to plan."
ST-45: "I learned that laboratory experiments do not always produce expected results."
ST-52: "I've learned that science involves making multiple observations."

Final Interview – NOS Beliefs and Instructional Methods

During the final interview, one question related to NOS beliefs was used to probe the participants' views on which instructional feature influenced their beliefs (Appendices B & N). The interview participants (Table 96) were asked to elaborate on the question in order to invoke their thoughts about NOS and the instructional features (e.g., pre-laboratory, laboratory work, and post-laboratory). These answers can often display different NOS categories within one question. This suggests that one cannot fully isolate these variables and only search for evidence in the participants' reflections and interviews. Table 95 presents the participants' instructional preferences in relation to their NOS beliefs.

Table 95 Instructional Feature – NOS Beliefs
Instructional Category / Most Effective
Pre-laboratory 20.0%
Lab Work 70.0%
Post-laboratory 5.0%
Other 5.0%
N=20

Extremely strong support (ST 2-6, 8, 10-12, 14-15, and 17-19) was shown by 70% of the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their NOS beliefs (Table 95). Minimal participant support (ST 7, 9, 13, and 20) was shown for the pre-laboratory, with 20% indicating that it was moderately effective in influencing their laboratory experience and understanding of NOS. One participant (ST 1) indicated that the post-laboratory feature influenced her NOS beliefs. Only one participant (ST 16) felt none of the instructional features influenced her NOS beliefs. She indicated that the lecture and textbooks influenced her beliefs.

Table 96 Interview Participants' NOS Reflections – Instructional Methods
Final NOS Interview Question-2: What instructional feature (pre-lab, laboratory work, or post-lab), if at all, do you believe influenced your beliefs about the Nature of Science in this course?
Instructional Issue / Quotation Comments

Pre-lab
ST 7: "Pre-laboratory influenced my beliefs but my point of view was formed in earlier science courses."
ST 9: "The pre-lab included reading over the history of different scientists and any theories they influenced."
ST 13: "Pre-lab as it offered explanations of theories."
ST 20: "The pre-lab presented information on the related concepts and theories."

Lab Work
ST 2: "Laboratory work because we used the theories and concepts to help explain what was going on just like scientists do."
ST 3: "Laboratory work because we could repeat reactions and observations to add support to our conclusions."
ST 4: "The laboratory work because in most cases you're able to visually see what's going on."
ST 5: "The laboratory work. You have to understand the theory and apply it to results."
ST 6: "If anything it is the lab work because you get to see it doesn't always happen as expected. There are variables that cause changes."
ST 8: "Probably the laboratory work. While doing the bench work in the laboratory you distinguish between theory and fact while testing your hypothesis."
ST 10: "The lab work because sometimes the results do not line up completely with what you thought they would be. So you have to consider if it's wrong or whether you made a mistake."
ST 11: "Laboratory work because you experience the error that can occur in science."
ST 12: "The laboratory work because you expect an experiment to produce certain results but when we actually performed it sometimes something completely different happened."
ST 14: "The lab work."
ST 15: "The lab work because it is how scientific knowledge is collected, used and interpreted."
ST 17: "The laboratory work as we could see things as they happened."
ST 18: "I would say the laboratory work."
ST 19: "Probably the lab work. It reinforced my thinking. For instance we made predictions before performing the lab. Then we would actually do the lab and find out if our predictions were correct."

Post-Lab
ST 1: "The post-lab because that is where we discuss the results and whether they support the theory."

Overall
ST 16: "None of the features. I was influenced by the course lecture and textbook."

Discussion

Essential Laboratory Pedagogy

RQ2. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning experience?

The majority (65%) of participants (N=56) indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material. This supports research indicating that laboratory work can provide learners with a good opportunity to apply their newly acquired knowledge and gain new skills through first-hand experience (Johnstone, 1997; Millar, 2002). When learners engage in laboratory work they can test, rethink, and reconstruct their own ideas and thoughts (Cimer, 2007; Kirschner, 1992). Dawe (2003) suggests that positive outcomes may be a result of the learners' gaining ownership over the concepts during laboratory work. The post-laboratory followed, with 59% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material. The pre-laboratory was ranked third, with 44% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material.

The interview participants (N=20) ranked the three instructional features the same as all the participants (N=56), with laboratory work being the most essential, followed by post-laboratory, and lastly pre-laboratory. The majority (83%) of interview participants indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material. The development of interpretation, measurement, observation, and prediction skills is dependent on laboratory work. However, laboratory experiences do not guarantee that the aforementioned skills will be achieved. More emphasis should be placed on what
the student should gain from the overall experience. Once again, strong participant support was shown for the post-laboratory, with 72% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material. The pre-laboratory was ranked third, with 46% indicating that it was either very or extremely essential to the laboratory experience and understanding of the material.

By the end of the semester course, two of the three instructional features, laboratory work (40%) and post-laboratory (40%), were selected by the participants (N=56) as the most effective in promoting their learning during the semester course, while the pre-laboratory instructional feature (65%) was selected as the least effective.

Fifty-five percent of the participants (N=56) reported that they felt a sense of achievement when they participated in a pre-lab discussion prior to performing the experiment, while 34% indicated that they felt a sense of achievement when they performed the experiment first and then participated in a post-lab discussion. A small percentage (11%) felt there was no clear difference.

The study found that 72% of the participants (N=56) indicated that it was more difficult to perform an experiment before it was discussed, especially when it came to the methods and equipment, with which many were not familiar due to a lack of laboratory experience. Approximately 14% of the participants felt that at the beginning of the semester it was a challenge to perform an experiment prior to a discussion but eventually preferred to perform the experiment first and follow up with a post-lab discussion. A small percentage (2.0%) felt it was more difficult to perform an experiment after it was discussed, while 9.0% indicated there was no clear difference.

Sixty percent of the participants (N=56) indicated that they enjoyed the laboratory experience more if they participated in a pre-lab discussion prior to performing the experiment, while 21% indicated that they enjoyed lab more when they
performed the experiment first and then participated in a post-lab discussion. A small percentage (19%) felt there was no clear difference.

Lastly, 48% of the participants (N=56) indicated that they understood better if they participated in a pre-lab discussion prior to performing the experiment, while 33% indicated that they understood better when they performed the experiment first and then participated in a post-lab discussion. A small percentage (19%) felt there was no clear difference.

The conventional way of preparing the participants for the laboratory was through pre-laboratory activities. This included encouraging them to read over the potential methods. However, this can overload students with information, resulting in the learner becoming lost in the sequence of ideas. In addition, unless specific tasks are allocated (pre-laboratory), only a minority of students will read the material.

Epistemological Beliefs and Laboratory Pedagogy

RQ2a. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (development) during the semester general chemistry laboratory course?

Strong support was shown by the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs. For three out of the five EBAPS dimensions (nature of knowing and learning, real-life applicability, and evolving knowledge), laboratory work was ranked as most effective in influencing beliefs. Moderate participant support was shown for the post-laboratory, which was ranked second in influencing overall EBAPS beliefs. Overall, the pre-laboratory was ranked third, suggesting that it had a minimal influence on participants' beliefs.

Students arrive with existing personal epistemological beliefs that lead to interpretations of instruction, and as these beliefs change, so do the interpretations. The participants in this study may have come to class with preconceptions about science laboratory learning formed from their prior learning experiences. The participants' perceptions of the laboratory learning experience may have hindered changes in their beliefs. Some participants' preconceptions were expressed when they described the laboratory experience as a place to reinforce what they learned in lecture or during the pre-laboratory.

According to Hofer (2001), studies have investigated how the epistemological beliefs that learners hold about knowledge and knowing affect the learning and instructional process. For example, Ryan's (1984) study suggested that there is a relationship between learners' epistemological beliefs and their information-processing strategies as measured by Bloom's taxonomy. Dimensions of epistemological beliefs have been shown to relate to learning and instruction (Schommer, 1990). For instance, one study showed that participants who viewed knowledge as certain were likely to generate unquestionable conclusions (Schommer, 1990). In addition, some were likely to give oversimplified conclusions.

Garret-Ingram's (1997) findings were that epistemological beliefs affect learners' use and choice of instructional strategies. This suggests one may need to consider a conceptual framework that includes the role personal epistemology plays in self-regulated learning. Hofer and Pintrich (1997) suggest that learners' beliefs and theories about knowledge may influence their engagement in learning.

Epistemological beliefs have been linked to conceptual change in learning science. Studies of learners' epistemological beliefs about whether science is dynamic or static or a mix of the two predicted their ability to integrate their
understanding of a topic and the strategies they used (Davis, 1997; Songer & Linn, 1991). In the Davis study, eighth-grade students with a dynamic view were likely to try to understand science, while those with a static view were more concerned with the memorization of facts. According to Edmondson and Novak (1993), several studies link science epistemological beliefs with science learning and the basic assumption that students' beliefs about the origin and structure of knowing and scientific knowledge are intertwined with their learning of science.

According to Hofer (2001), educational experiences play a role in fostering belief change. The question lies in which instructional strategies can best be employed. Little research exists that clarifies the relation between types of instruction and personal epistemological beliefs.

NOS Beliefs and Laboratory Pedagogy

RQ2b. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

Strong support was shown by the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their NOS beliefs. Minimal participant support was shown for the pre-laboratory, which was ranked second in influencing overall NOS beliefs. Overall, the post-laboratory was ranked third, suggesting that it had a minimal influence on participants' NOS beliefs.

According to Sere et al. (1998), influences upon students' actions and learning during laboratory work include their images of science (NOS) and their images of learning. Laboratory work might develop learners' conceptual understanding, their skills in planning investigations, or their aptitude at using standard laboratory
procedures in carrying out investigations. However, most learners in educational teaching laboratories work with knowledge claims already agreed to be reliable within the scientific community. For example, in this study some of the participants during laboratory work used accepted theories or applied accepted theory in specific contexts. Their ideas about how that knowledge came to be viewed as reliable may have influenced their laboratory work. For all these reasons, participation in laboratory work involves students in drawing upon their epistemological and NOS understanding. For example, in this study, during laboratory work the participants had to make decisions about the amount of data that would be collected and the conclusions that could be drawn from given data sets. According to Leach et al. (1998), the decisions that learners make about data collection will be influenced by their NOS views of the nature of measurement (testable).

Summary

In summary, the overall findings of the study in answering research question two, sub-question a, and sub-question b were as follows:

RQ2. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning experience?

The majority (65%) of participants (N=56) indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material. The interview participants (N=20) ranked the three instructional features the same as all the participants (N=56), with laboratory work being the most essential, followed by post-laboratory, and lastly pre-laboratory.
RQ2a. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (development) during the semester general chemistry laboratory course?

Substantial support was shown by the participants (N=20), indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs. For three out of the five EBAPS dimensions (nature of knowing and learning, real-life applicability, and evolving knowledge), laboratory work was ranked as most effective in influencing beliefs.

RQ2b. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

Extremely strong support was shown by the participants (N=20), with 70% indicating that they found the laboratory work to be the most effective in influencing their NOS beliefs.

Chapter Eight presents an overview of the dissertation and a brief summary of the study's findings in relation to each research question. Following this is a general discussion of the limitations of the study and directions for future research.

Chapter Eight: Conclusions

Introduction

Personal epistemological and NOS beliefs research both have long histories spanning more than 30 years. Few studies, however, have examined college science students' beliefs in relation to instructional features. This study was exploratory in nature, intended to lay a foundation for focusing on more specific features of epistemological and NOS reasoning in light of specific instructional features (pre-lab, laboratory work, or post-lab) for future research. This study investigated students' epistemological and NOS beliefs and their perceptions of the instructional features as related to those beliefs.

This chapter provides an overview of the dissertation in the following section. The chapter includes a summary of the following: (1) chapter one – introduction; (2) chapter two – literature review related to personal epistemology, NOS, and science laboratory pedagogy; (3) chapter three – research methods; (4) chapter four – quantitative results for research question one and its sub-questions; (5) chapter five – development of epistemological beliefs – results for research question two and sub-question 2-a; (6) chapter six – development of NOS beliefs – results for research question two and sub-question 2-b; (7) chapter seven – laboratory instructional features – results for sub-questions 2-a and 2-b; (8) significance and implications of the study; (9) limitations; (10) suggestions for further research; and (11) concluding remarks.

Overview of the Dissertation

Chapter one presented an overview of personal epistemological and NOS beliefs. This was followed by a discussion of the problem statement and the nature of the
PAGE 403

study, as well as an introduction of concepts and issues central to the research, such as: the nature and development of personal epistemology, the role of student images of science, the nature of chemistry learning, the definitions of personal epistemology and NOS, the possible link between personal epistemological and NOS beliefs, the role of the laboratory instructional environment, and research methodology issues. In addition, the research questions were presented, followed by the study's significance for chemistry education research.

This study investigated students' epistemological and NOS beliefs, whether those beliefs changed, and students' perceptions of the instructional features as related to those beliefs. Overall, the study's purpose was to explore and lay a foundation for future research focused on more specific features of reasoning related to personal epistemological and NOS belief changes in light of specific science laboratory instructional features. In addition, the study explored and laid a foundation for future research focused on more specific features of reasoning related to learning and specific science laboratory instructional features.

This study encompassed two large and distinct research fields: personal epistemological and NOS beliefs. However, because the two research fields have not always been predominantly linked, the review of the qualitative results was divided into two separate chapters. Chapter 5 dealt with the development of personal epistemological beliefs, while chapter 6 dealt with the development of the nature of science. Research question one and its sub-questions looked at the initial and final personal epistemological and NOS beliefs of the participants involved in the study. Chapter 4 discussed the quantitative changes in the participants' NOS and personal epistemological beliefs.
The major construct of this study was personal epistemological beliefs, a psychologically driven concept borrowing from philosophical issues (Schommer, 1994). Hofer and Pintrich (1997) define epistemological beliefs as how learners come to know and the theories and beliefs they hold about knowing. The extent to which these beliefs affect a learner can be the difference between a naive, surface-level view of learning and a sophisticated stance that draws on experience and formal education to assimilate knowledge in a well-developed way (Schommer-Aikins, 2004).

The secondary construct for this study, NOS, involves personal scientific epistemological beliefs. NOS is an area of human enterprise that includes the beliefs and values inherent to scientific knowledge and its development. The consensus view of NOS from science education standards documents includes the following: (1) scientific knowledge has a tentative character; (2) scientific knowledge relies heavily on observation, experimental evidence, and rational arguments; (3) there is no one way to do science; (4) science attempts to explain natural phenomena; (5) laws and theories serve different roles in science; (6) individuals from all countries contribute to science; (7) new scientific knowledge must be reported clearly and openly; (8) science requires accurate record keeping, peer review, and replicability; (9) observations are theory-laden; (10) scientists are creative; (11) the history of science discloses both an evolutionary and a revolutionary character; (12) science is part of social and cultural traditions; (13) science and technology impact each other; and (14) scientific ideas are affected by their social and historical milieu (McComas, Almazroa, & Clough, 1998).

Chapter two was divided into six main sections and consisted of a review of relevant studies in the science education and educational psychology literature focusing on the research questions described in chapter 1. The research literature
included reviews of: (1) models of personal epistemological development; (2) multidimensional models of personal epistemological development; (3) the nature of science; (4) the applicability to college science education; and (5) the laboratory in chemistry education.

The first and second sections of the literature review discuss several models of personal epistemological development, beginning with five major uni-dimensional epistemological models of development followed by a description of two multidimensional models of epistemological beliefs. Uni-dimensional epistemological models and their related theories were described first, followed by the multi-dimensional models. The uni-dimensional epistemological models suggest that individuals move through a patterned sequence of development, while multi-dimensional models suggest that systems of beliefs do not develop through a patterned sequence and are composed of separate dimensions. The models discussed include Perry's scheme of intellectual and ethical development, Belenky's women's ways of knowing model, Baxter Magolda's epistemological reflection model, King and Kitchener's reflective judgment model, Kuhn's model of reasoning skills, Schommer-Aikins' system of independent beliefs model, and Hofer and Pintrich's epistemological theories model. In addition, these two sections provided information on assumptions as well as validity and reliability issues related to the theories.

The third section presented a review of the research literature related to students' images of science (NOS). The section begins with a consensus research-based definition of NOS, followed by a discussion of classifying students' images of science into one of three categories: (1) a data-focused view; (2) a radical relativist view; or (3) a theory-data linked view. This is followed by a review of the necessity of cognitive dissonance for improving student understanding of NOS, as well as an overview of measuring
students' NOS beliefs. The connections between NOS and personal epistemology are revisited and expanded from the initial discussion in chapter one. This section of the literature review ends with a discussion of three potential methods used to enhance learners' NOS beliefs: (1) historical, (2) implicit, and (3) explicit-reflective. None of the aforementioned methods were targeted in this study.

The fourth section of chapter two discussed research methodology issues related to the potential instruments used to assess students' NOS and personal epistemological beliefs. The discussion begins with a general overview of current assessment instruments, followed by two sections that review instruments currently used to assess the aforementioned beliefs in general and in the domain of science. This review included a basic review of the two assessment instruments used in this study, the EBAPS and the NSKS.

The fifth section relates to the applicability of promoting epistemological growth in the college science classroom through the use of certain pedagogical applications. The discussion begins with an overview of epistemological orientations in learning science, followed by a description of assessing epistemological levels in the classroom in order to promote epistemological growth. The remainder of this section discusses the six pedagogical applications identified in the literature that facilitate epistemological growth: (1) learning tasks; (2) expectations; (3) modeling and practice; (4) constructive feedback; (5) a learner-centered environment; and (6) respect for student development.

The final section consisted of a review of the literature on science laboratory instruction. This section of the literature review elaborates on the nature of laboratory instruction, epistemological development in laboratory instruction, and the history of laboratory instructional methods. The section begins with a description of the nature of laboratory instruction, how the developmental levels relate to laboratory instruction, and
concludes with a discussion of science laboratory pedagogy and instruction. The laboratory instructional methods reviewed included: (1) expository; (2) inquiry; (3) discovery; and (4) problem-based. The laboratory pedagogical approaches discussed were: (1) pre-laboratory; (2) personal response systems; (3) laboratory work; (4) microcomputer-based software; and (5) post-laboratory. The aforementioned pedagogical approaches were used in the study.

Chapter three described the quantitative and qualitative methods used in this study. Blending both methods into a single study is recommended by researchers. Qualitative and quantitative mixed-method data collection measures were employed in three phases during this study of fifty-six students in three sections of a first-semester general chemistry laboratory course.

Section one restates the purpose of the study, elaborates on the rationale behind the research questions, and presents an overview of the analysis, design, and methodology.

Section two describes the context and participants of the setting. A sample of 56 undergraduate students at a major university in Florida volunteered and participated in the study. All participants were enrolled in the first semester of a two-semester general chemistry laboratory course during the fall semester of 2006.

Section three discusses the research instruments, measures, and techniques, which included: (1) the Chemical Concepts Inventory (CCI); (2) the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS); (3) the Nature of Scientific Knowledge Scale (NSKS); (4) the Students' Reflective Assessment of Laboratory Methods; and (5) in-depth semi-structured interviews.

The EBAPS was used to generate quantitative data on participants' personal epistemological beliefs. The EBAPS assesses personal epistemological beliefs about science in the following five dimensions: (1) the structure of scientific knowledge; (2) the
nature of learning science; (3) the real-life applicability of science; (4) the evolving knowledge of science; and (5) the source of ability to learn science. The EBAPS includes 30 items that are a mix of Likert-type ratings of agreement or disagreement, as well as hypothetical scenario conversations to which students responded using multiple-choice answers to indicate how closely their own views matched those of the scenario conversation. This instrument was used to answer research question one and sub-question 1-b.

The NSKS was used to generate quantitative data on participants' NOS beliefs. The NSKS assesses participants' NOS beliefs in the following dimensions: (1) amoral; (2) creative; (3) developmental; (4) parsimonious; (5) testable; and (6) unified. The NSKS includes 48 items related to the aforementioned dimensions and has a Likert-scale forced-response format consisting of five choices from strongly agree to strongly disagree. This instrument was used to answer research question one and sub-question 1-a.
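For readers unfamiliar with how instruments of this kind are aggregated, the sketch below illustrates, under simplifying assumptions, how Likert-item responses could be rolled up into dimension sub-scores and an overall score. The item-to-dimension mapping, the item numbers, and the reverse-scoring choices shown here are hypothetical placeholders, not the actual EBAPS or NSKS scoring keys.

# Illustrative only: hypothetical items and dimension keys, not the real EBAPS/NSKS scoring rubrics.
from statistics import mean

# Hypothetical mapping of item numbers to an instrument dimension.
DIMENSION_KEY = {
    "creative":      [1, 7, 13],
    "testable":      [2, 8, 14],
    "developmental": [3, 9, 15],
}
# Hypothetical negatively worded items that must be reverse-scored on a 1-5 Likert scale.
REVERSED_ITEMS = {7, 14}

def dimension_scores(responses: dict[int, int]) -> dict[str, float]:
    """Average the 1-5 Likert responses belonging to each dimension."""
    scores = {}
    for dimension, items in DIMENSION_KEY.items():
        values = [6 - responses[i] if i in REVERSED_ITEMS else responses[i] for i in items]
        scores[dimension] = mean(values)
    return scores

# Example: one participant's (hypothetical) responses, keyed by item number.
responses = {1: 4, 2: 5, 3: 3, 7: 2, 8: 4, 9: 3, 13: 5, 14: 1, 15: 4}
subscores = dimension_scores(responses)
overall = mean(subscores.values())
print(subscores, round(overall, 2))

Gain (difference) scores of the kind reported in chapters 4 through 6 would then simply be the post-instruction sub-scores minus the corresponding pre-instruction sub-scores.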

The student laboratory questionnaire (Students' Reflective Assessment of Laboratory Methods) was used to assess the participants' reactions to the three instructional methods associated with each laboratory activity (e.g., pre-laboratory activities, laboratory work, and post-laboratory activities). Section one of the questionnaire probed the usefulness of each pedagogical feature of laboratory instruction with respect to understanding and the necessity of the laboratory learning experience. Section two of the questionnaire probed participants' perceptions regarding the following four aspects of laboratory work: (1) understanding the laboratory work; (2) enjoyment in performing the laboratory work; (3) achievement in conducting the laboratory work; and (4) difficulty in doing the laboratory work. Section three of the questionnaire asked the participants to describe the kind of learning they believed they gained in a particular laboratory activity using Bloom's Taxonomy categories.

Semi-structured pre- and post-study interviews with a subsample of participants (n=20) from the sample population (N=56) were performed by an outside interviewer. The interviews involved questions and/or statements related to the EBAPS and NSKS dimensions as well as the laboratory instructional features. The audio-taped interviews were transcribed and coded for themes. The coding themes included the following: (1) the EBAPS dimensions; (2) the NSKS dimensions; and (3) the laboratory instructional features (pre-laboratory, laboratory work, and post-laboratory).

Section four identifies the forms of treatment (pedagogy) involved in the laboratory instruction. This section offered an overview of the laboratory environment and pedagogy, including a discussion of the three general instructional features used during this study: pre-laboratory, laboratory work, and post-laboratory.

The three pedagogical laboratory instructional features used in this study were: (1) pre-laboratory; (2) laboratory work; and (3) post-laboratory. The pre-laboratory methods included out-of-class and in-class activities ranging from online quizzes to class discussions prior to performing laboratory work. The laboratory work allowed students to engage in real-time recording of observations in a laboratory notebook and to answer their own questions experimentally while engaging in teamwork. The post-laboratory methods engaged students in looking for trends, critically evaluating class data, working together to negotiate meaning, and discussing claims and writing about those claims with supporting evidence.

The last three sections of chapter 3 summarize data collection, describe how the data were analyzed, and describe the potential quantitative and qualitative analysis
methods implemented for the study, as well as the aspects used in monitoring the reliability and validity of the data collection and analysis. Included are a general overview of the phases of data collection and the researcher's role during the study. The data collection process in this study occurred in three phases. The first phase of data collection included the administration of the CCI, EBAPS, and NSKS to all participants. In addition, data related to participants' prior chemistry skills and knowledge were collected and analyzed by an outside researcher. Initial interviews were performed by an outside interviewer during phase two with the twenty volunteers drawn from the population sample (N=56), concerning their NOS and personal epistemological beliefs about science. In addition, student laboratory instruction reflections were collected (N=56). In the final phase the NSKS and EBAPS were re-administered (repeated measures), and final interviews (n=20) were conducted by an outside interviewer concerning what laboratory instructional strategies students believed influenced their understanding of the laboratory material, as well as their NOS and personal epistemological beliefs about chemistry.

Descriptive statistics (average dimension means, effect sizes) were used to investigate differences between participants' initial and final personal epistemological and NOS beliefs. T-tests for paired samples were used to indicate the statistical significance of any differences, and associations between the pre- and post-assessments were determined using simple correlations (a brief illustrative sketch of this analysis appears below).

The interview responses, initial and final, of the twenty volunteers were compared and contrasted with their pre-post assessment scores from the NSKS and EBAPS. The interviews were offered as additional support for the validity of the participants' assessment scores.
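The following minimal sketch illustrates the kind of pre-post analysis just described: a paired-samples t-test, an effect size, and a simple correlation between pre- and post-assessment scores. The array values and variable names are hypothetical stand-ins rather than the study's data, and the effect-size formula shown (Cohen's d for paired scores) is one common choice among several.

# Illustrative pre-post analysis with hypothetical scores (not the study's data).
import numpy as np
from scipy import stats

pre = np.array([2.4, 2.6, 2.1, 2.8, 2.5, 2.3, 2.7, 2.2])   # hypothetical pre-instruction dimension means
post = np.array([2.7, 2.8, 2.3, 2.9, 2.8, 2.5, 2.9, 2.4])  # hypothetical post-instruction dimension means

# Paired-samples t-test comparing post- with pre-instruction scores.
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired scores: mean gain divided by the standard deviation of the gains.
gains = post - pre
cohens_d = gains.mean() / gains.std(ddof=1)

# Simple (Pearson) correlation between pre- and post-assessment scores.
r, r_p = stats.pearsonr(pre, post)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}, r = {r:.2f}")

Significance at p ≤ .05, as reported for the NSKS and EBAPS gain scores, would correspond to p_value ≤ 0.05 in this sketch.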

Chapter four presented a description of the participant sample, followed by the presentation of the quantitative analyses of the study's first research question and sub-questions dealing with pre-post assessment changes in NOS and personal epistemological beliefs. The research questions were presented with the quantitative results of the CCI, EBAPS, and NSKS analyses for all the participants (N=56) and for the twenty who participated in the interviews. The results are discussed and related back to the key literature. Descriptive statistics (average dimension means, effect sizes) were used to investigate differences between participants' initial and final personal epistemological (EBAPS) and NOS (NSKS) beliefs. T-tests for paired samples were used to indicate the statistical significance of any differences, and associations between the pre- and post-assessments were determined using simple correlations.

Chapter five presents a description of the development of the participants' personal epistemological beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-b. The characterization of personal epistemological beliefs is presented with the results of the analyses of the participants' responses to interview probes. The combination of interviews and quantitative measures provided a glimpse into students' initial and final personal epistemological beliefs. The interviews allowed for further probing of beliefs and served as extended support for the participants' EBAPS scores. Clarification of any changes in beliefs during the course of the semester, and of what the participants believed influenced their beliefs, was considered. The five dimensions of the EBAPS were used as coding themes for this analysis. The results are discussed and related back to the key personal epistemological literature.

Chapter six presents a detailed description of the development of the participants' NOS beliefs through the presentation of qualitative analyses of the study's first research question and sub-question 1-a. The characterization of NOS beliefs, along with
the results of the analyses of the participants' responses to interview probes, is presented. The combination of interviews and quantitative measures provides a glimpse into participants' initial and final NOS beliefs. The interviews allowed for additional probing of beliefs and served as extended support for the participants' NOS scores. Clarification of any changes in beliefs during the course of the semester, and of what the participants believed influenced their NOS beliefs, was considered. The dimensions of the NSKS were used as coding themes for this analysis. The results are discussed and related back to the key NOS literature.

Chapter seven characterizes the findings on the instructional features of the second research question and sub-questions 2-a and 2-b. The three pedagogical laboratory instructional features used in this study were: (1) pre-laboratory; (2) laboratory work; and (3) post-laboratory. These three instructional features were used as coding themes for this analysis. The characterization of laboratory instruction was presented with the quantitative and qualitative results from the Student Evaluation of Laboratory Instruction Questionnaire as well as the results of the analyses of the participants' responses to interview probes. This provided a glimpse of the participants' overall beliefs concerning the laboratory aspects of the semester course.

Major Findings of the Study

This study was exploratory in nature, laying a foundation for future research focused on more specific features of epistemological and NOS reasoning in light of specific instructional features (pre-lab, laboratory work, or post-lab). The study investigated students' epistemological and NOS beliefs and their perceptions of the instructional features as related to those beliefs. The results are discussed and related back to the key laboratory education literature as well as the NOS and personal epistemological beliefs literature.
The purpose of this mixed-method study was to explore whether students' NOS and personal epistemological beliefs about science (chemistry) changed by the completion of a semester general chemistry laboratory course, as well as what laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) students believed influenced those belief changes and their understanding during the course. The participants consisted of 56 undergraduate students enrolled in the first semester of a general chemistry laboratory course at a major university in Florida.

The theoretical epistemological perspectives guiding this study were the uni-dimensional theories from models such as Perry's (1970) and Baxter Magolda's (1992), as well as multidimensional theories from models such as Schommer's (1990) and Hofer and Pintrich's (1997), discussed in chapters 1 and 2. Quantitative and qualitative methods were used to determine NOS and personal epistemological difference scores, followed by participant interviews and reflective instructional questionnaires. After determining the scores on the five dimensions of epistemology as measured by the EBAPS, epistemological difference scores were computed. The same procedure was repeated with the six dimensions of NOS as measured by the NSKS. Qualitative methods were used to expand and elaborate on the participants' epistemological and NOS beliefs in relation to their assessment scores and the three instructional methods (e.g., pre- and post-laboratory activities, laboratory work).

The main research questions that guided this study are restated below. The first research question and its sub-questions lent themselves to quantitative and qualitative data analysis. They are:
Question One

RQ1. What range of personal epistemological and NOS beliefs about science (chemistry) do undergraduate science students have at the beginning of a semester general chemistry laboratory course?

The findings discussed in detail in chapter 4 addressed the first research question concerning the range of students' personal epistemological and NOS beliefs and whether these beliefs changed by the end of a general semester chemistry laboratory course. The results are discussed and related back to the key personal epistemological and NOS literature.

The overall average score for the EBAPS at the beginning of the semester course for all participants (N=56) was 2.514, while that of the interview subsample of participants (n=20) was 2.537, indicating a low moderately sophisticated level of epistemological beliefs. Based on the uni-dimensional epistemological models discussed in chapter two, these initial averages placed the participants in the early multiplicity stage of Perry's model (1970), the transitional knowing level of Baxter Magolda's model (1986), and the quasi-reflective thinking level of King and Kitchener's model (1994). The multi-dimensional models of Schommer-Aikins (1990) and Hofer and Pintrich (1997) discussed in chapter two placed the participants at the lower end of the moderate level with their personal epistemological beliefs. This supports the personal epistemological studies discussed in chapter two, which found that students, depending on their year in college as well as other factors such as prior knowledge, age, and gender, begin with a low (dualist) to low-moderate (multiplicist) level of personal epistemological beliefs. The 2.514 and 2.537 averages support epistemological studies of science majors placing these students in the 2.5-3.5 sophistication range (Pavelich & Moore, 1996; Wise, Lee, Litzinger, Marra, & Palmer, 2004). Participants'
initial EBAPS scores suggested some of their epistemological beliefs were more sophisticated within the EBAPS dimensions of real-life applicability of science (2.7-2.8) and the source of ability to learn science (2.9-3.0). These higher initial dimension scores could be a reflection of their prior knowledge, life experiences, and/or their self-confidence in other science courses. The initial average scores for the remaining three dimensions, structure of scientific knowledge (2.1-2.2), nature of knowing and learning scientific knowledge (2.5-2.6), and evolving scientific knowledge (2.2-2.4), suggested low beliefs. This supports studies suggesting that students view the structure, nature, and evolution of scientific knowledge as static and as a repertoire of ideas rather than a cohesive view (Linn & Hsi, 2000; Songer & Linn, 1991).

The overall average score for the NSKS at the beginning of the semester course for all participants, including the interview subsample, was 142, placing the majority of the participants on the relativist end of the NSKS scale, holding non-NOS views. Participants' average sub-scores in each of the six NOS dimensions of the NSKS (23-24) suggested non-NOS views in every NOS aspect. These initial average scores support previous studies showing that students with years of formal science education hold misconceptions regarding NOS (Dagher et al., 2004; Lederman et al., 2002; Smith et al., 2000). Even after years of formal science education, students often view science as a set of unrelated facts, as unchanging knowledge, and as an absolute, objective endeavor that is separate from social influences and personal bias (Abd-El-Khalick & Lederman, 2000; Bell et al., 2003; Halloun & Hestenes, 1998).

On average, 80% of the participants' initial EBAPS overall scores and dimension sub-scores correlated with their interview responses, as discussed in chapter 5. In other words, the interview responses and EBAPS scores of the participants reflected the low
level of sophistication seen in other studies involving epistemological beliefs. The participants' initial NSKS overall scores and dimension sub-scores correlated with at least 70% of the interview responses, as discussed in chapter 6. Therefore the NSKS scores and interview responses in this study reflect the same general naive perspective of NOS as suggested in other NOS studies. However, a few of the participants' EBAPS and NSKS scores did not support their reflections. Some of the participants' scores reflected unsophisticated beliefs while their interview or questionnaire responses indicated more neutral NOS beliefs. Similar to the findings in prior studies, some of the participants in this study assumed scientific knowledge to be factual and certain, based their beliefs on authority rather than argument or evidence, and believed that there is one scientific method.

What one cannot infer at this point is whether these beliefs endure over a long period of time or whether some students' beliefs are simply more adaptable than others'. Additional and longer research studies are needed to investigate students' initial beliefs and the effects of instruction on changing those beliefs.

Sub-Question 1a

RQ1a. Do students' images of the nature of chemistry (NOS) change by the completion of a semester general chemistry laboratory course?

The findings discussed in detail in chapters 4 and 6 address this portion of the first research question, concerning whether the students' NOS beliefs changed by the end of a general semester chemistry laboratory course. The results are discussed and related back to the key NOS beliefs literature in chapters 2, 4 and 6.

Overall, the NSKS results for the total population sample (N=56) showed a significant increase in the following dimensions: creative, developmental, parsimonious, testable, and unified. The participants seemed to struggle with the amoral dimension.
In summary, based on the NSKS results: (1) the mean gain scores for the overall test and all dimensions, except for amoral, were found to be significant at p ≤ .05; and (2) the data suggest that instruction had effected a small change in the students' NOS beliefs.

Overall, the NSKS results for the interview participants (n=20) showed a significant increase in the following dimensions: creative, developmental, and unified. The participants seemed to struggle with the amoral, parsimonious, and testable dimensions. In summary, based on the NSKS results: (1) the mean gain scores for the overall test and all dimensions, except for amoral, parsimonious, and testable, were found to be significant at p ≤ .05; and (2) the data suggest that instruction had effected a small change in the students' NOS beliefs. For there to be a probability of a more substantial change in NOS beliefs, specific instructional methods related to NOS and a longer period of instruction would be warranted.

At the beginning of the study some participants held an idealized image of the nature of evidence, laws, and theories, as evident in the NSKS scores and initial interview statements. However, by the end of the semester course some of the participants had shifted their non-NOS views slightly toward a blend of views or toward supporting some NOS views. As discussed in chapters 4 and 6, several participants used the word proof in their interview statements to describe aspects of NOS. This supports other research studies in which students used the term proof to describe the fundamental nature of scientific evidence (Dagher et al., 2004; Lederman et al., 2002). Occasionally, students used the word proof to indicate an absolute answer and to describe directly observed evidence. Some participants described scientific knowledge as starting from a hypothesis, then becoming a theory, and after several experiments becoming a law. This supports Bell, Blair, Crawford, and Lederman's (2003) finding that secondary students
rank scientific knowledge in a hierarchy. The majority of the final interviews discussed in chapter 6 correlated with the overall small increase in the participants' NSKS scores. However, a small number of the participants' NSKS scores did not support their reflections. These participants' scores reflected unsophisticated NOS beliefs while their interview or questionnaire responses indicated more moderate beliefs.

What one cannot infer at this point is whether these NOS belief changes are enduring or whether some participants are simply more adaptable than others. Additional and longer research studies are needed to investigate the effects of instruction on NOS belief changes.

Sub-Question 1b

RQ1b. Do students' personal epistemological beliefs about science (chemistry) change by the completion of a semester general chemistry laboratory course?

The findings discussed in chapters 4 and 5 address this portion of the first research question, concerning whether the students' epistemological beliefs changed by the end of the general semester chemistry laboratory course. The results are discussed and related back to the key personal epistemological beliefs literature in chapters 2, 4 and 5.

Overall, the Epistemological Beliefs Assessment for the Physical Sciences (EBAPS) results for the total population sample (N=56) and the interview participants (n=20) showed a significant increase in the structure of scientific knowledge, nature of knowing and learning, real-life applicability of science, and evolving knowledge dimensions. The participants seemed to struggle with the source of ability to learn science. In summary, based on the EBAPS results: (1) the mean gain scores for the overall score and all dimensions, except for the source of ability to learn, were found to be significant at p ≤ .05; and (2) the data suggest that instruction had effected a change in the students' epistemological beliefs.
The majority of the final interview responses correlated with the participants' final EBAPS scores. Improvement in participants' epistemological beliefs was demonstrated by their more mature comments, as discussed in chapter 5. However, some of the participants' EBAPS scores did not support their reflections. Some of the participants' scores reflected unsophisticated beliefs while their interview or questionnaire responses indicated more moderate beliefs.

Earlier studies of learners' personal epistemological beliefs conducted with college students indicated that personal epistemological beliefs can change during the college years (Baxter Magolda, 1992; Perry, 1981). A minimal change in personal epistemological beliefs is indicated in this study, as discussed in chapters 4 and 5. A semester is hardly enough time to determine if the changes were valid or simply due to chance. Another investigation found that entering college freshmen believe knowledge is certain and provided by authority, while college seniors believe that knowledge is complex and tentative and is derived through reason (Perry, 1968). Schommer's (1997) study determined that high school students' epistemological beliefs changed over time. These findings add support to this study's results that epistemological beliefs develop over time. However, a student's beliefs about the structure of scientific knowledge may develop independently from his or her beliefs about the stability of scientific knowledge (i.e., evolving). Therefore, examining the dimensions of epistemological beliefs rather than epistemological beliefs as a coherent whole may allow a clearer picture of how beliefs change.

What one cannot infer at this point is whether these belief changes are enduring or whether some participants are simply more adaptable than others. More research studies are needed to investigate the effects of instruction on personal epistemological growth.
The second research question and sub-questions were:

Question Two

RQ2. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe were essential to their understanding during the semester general chemistry laboratory learning experience?

The findings discussed in chapter 7 addressed the second research question concerning which laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) students believed were essential to their understanding during the semester general chemistry laboratory experience. The results are discussed and related back to the key laboratory education literature in chapters 2, 3 and 7.

The majority (65%) of participants (N=56) indicated that they found the laboratory work to be either very or extremely essential to the laboratory experience and their understanding of the material. The interview participants (n=20) ranked the three instructional features the same as all the participants (N=56), with laboratory work being the most essential, followed by post-laboratory, and lastly pre-laboratory.

Laboratory investigations are viewed as ideal environments for meaningful learning when appropriate instructional techniques are implemented in the curriculum design. For this study, cooperative learning and active learning techniques, such as pre-laboratory preparation and post-laboratory small-group discussions, were implemented to promote higher-order thinking and positive attitudes. The aforementioned methods have been identified in studies as effective pedagogical tools (Cooper, 1995; NRC, 1996). The participants in this study identified the laboratory work feature as being essential to their learning and understanding. The participants found the real-life experiences, group discussions, and teamwork meaningful to their learning. This corresponds with research related to laboratory investigations that found discussions
played a meaningful role in developing students' understanding of scientific ideas (Driver et al., 1994; Millar, 2004).

Some of the participants in the study found the laboratory notebook to be quite useful, while others found it tedious. However, laboratory notebooks are often used as a formative assessment tool, and their use as a part of instruction is supported by many researchers who advocate writing in science to enhance learner understanding of scientific content and processes as well as general writing (Keys et al., 1999; Shepardson & Britsch, 2000; Bass et al., 2001).

The majority of the participants in this study identified the use of the microcomputer-based laboratories (MBLs) as worthwhile. They found them easy to use and related to real-life laboratory experiences. MBLs allowed the students to devote more time to observation, reflection, and discussion. Studies suggest that the use of MBLs can support and enhance meaningful learning in scientific inquiry. They assist in a learner's knowledge construction and help develop concepts and skills such as graphing, collaboration, and scientific reasoning (Pienta & Amend, 2004; Nachmias & Linn, 1990). The MBL learning environment can assist in increasing the student's ability to analyze and interpret data. Students can repeat experiments, thereby generating more data for analysis, manipulate the parameters of investigations, and study graphs by using MBL modeling tools (Pienta & Amend, 2004; Newton, 1997; Settlage, 1995; Lazarowitz & Tamir, 1994). However, according to Pienta and Amend (2004), students without an appropriate conceptual understanding of chemistry may fail to observe the phenomenon under investigation. Therefore, MBLs may not promote learning for all students (Atar, 2002).

Post-laboratory was identified by the participants as almost as essential as the laboratory work. However, without the laboratory work the participants felt there was no
point to the post-laboratory activities. Some viewed the laboratory report as pointless, particularly in view of their laboratory notebook. Others felt strongly the opposite, that the post-laboratory reports were extremely essential because they allowed one to tie together all the information and see the bigger picture. Much of the laboratory work discussion can be expanded into the post-laboratory discussion and analysis. The students had to look for patterns in results and relate data to the underlying chemical concepts. Keys's (2000) findings suggest that scientific writing promotes scientific thinking by helping learners explore relationships between evidence and knowledge claims. The results of this study show that written products such as laboratory notebooks and reports are valuable methods of instruction for the development of scientific reasoning skills and the construction of scientific understandings (Keys et al., 1999; Keys, 2000; Reid & Shah, 2007). Writing in science is a way to bridge prior knowledge with new learning, build explanations, and make sense of information.

Participants in this study identified the pre-laboratory as the least essential to their learning and understanding. However, pre-laboratory instruction is introduced as a way to reduce the information overload on students (Reid & Shah, 2007). The pre-laboratory can reduce the amount of time spent on laboratory procedures so that more time can be spent on other aspects of the laboratory environment, such as the laboratory work itself. The pre-laboratory activities encourage planning and support understanding by reducing information overload.

What one cannot infer at this point is whether these beliefs about laboratory instruction are enduring or whether some participants are simply more adaptable than others to the learning environment. More research studies are needed to investigate the effects of laboratory instruction on student understanding.
Sub-Question 2a

RQ2a. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their personal epistemological beliefs about science (chemistry) during the semester general chemistry laboratory course?

The findings discussed in chapters 5 and 7 addressed this research question concerning what laboratory pedagogical practices, if any, students believed influenced their personal epistemological beliefs about science during the general semester chemistry laboratory course. The results are discussed and related back to the key laboratory education literature as well as the personal epistemological beliefs literature.

Substantial support was shown by the participants (n=20) indicating that they found the laboratory work to be the most effective in influencing their epistemological beliefs. The post-laboratory was ranked a close second, with the pre-laboratory receiving minimal support as influencing the participants' beliefs. For three out of the five EBAPS dimensions (nature of knowing and learning, real-life applicability, and evolving knowledge), laboratory work was ranked as most effective in influencing beliefs.

As in much of the literature reviewed in preparation for this research study, aspects of the participants' learning beliefs incorporated views about epistemological issues. The participants provided unprompted belief comments about their views throughout their reflective and interview comments discussed in chapters 5 and 7. This epistemological nature of the participants' beliefs was reminiscent of the work of Perry (1970), Baxter Magolda (1992, 2002), Hofer and Pintrich (1997, 2002), and Schommer-Aikins (2002), which recognizes how an individual's epistemological beliefs are integral to his or her entire belief system.
Clearly, engaging in lab-based inquiry engages students with epistemological issues. In terms of the first research question and this question, there was some evidence that the participants' epistemological beliefs about science changed over time. Although the changes were not large, participants became more sophisticated in their beliefs about the structure of scientific knowledge, the nature of scientific knowledge, the real-life applicability of science, and how scientific knowledge evolves over the course of instruction. Whether the laboratory work itself or a specific component impacted the change, as the participants declared, would need further investigation. However, these results parallel the findings of Solomon et al. (1996), which showed that hands-on laboratory science instruction was related to epistemological awareness. In this case, the participants did become more sophisticated in their overall beliefs. Of course, this potential explanation for the change needs to be tested in additional studies that compare hands-on laboratory science with other, more traditional science instruction. Work in this area suggests that students in constructivist learning environments develop more sophisticated epistemological stances than do those in traditional learning environments (Smith et al., 2000).

The participants' epistemological beliefs also incorporated many views about self-knowledge, views the participants often voiced on their own. Such findings suggest that epistemological beliefs may extend into the area of self-reflection. The fact that the participants' beliefs were threaded with epistemological references may be due to the study's methodology, which allowed interlinked concepts to be discussed.

The results of this study suggest that laboratory instructional methods and educational experiences can have an effect on learners' epistemological development. Even the short training on critical thinking during the laboratory work and post-
laboratory activities appeared to affect participants' views of scientific knowledge and their approach to justifying scientific beliefs. What one cannot infer at this point is whether these belief changes are enduring or whether some students are simply more adaptable than others. More research studies are needed to investigate the effects of instruction on epistemological growth or change.

Sub-Question 2b

RQ2b. What laboratory pedagogical practices (e.g., pre- and post-laboratory activities, laboratory work) do students believe influenced their images of the nature of chemistry (NOS) during the semester general chemistry laboratory course?

The findings discussed in chapters 6 and 7 addressed this research question concerning what laboratory pedagogical practices, if any, students believed influenced their NOS beliefs about science during the general semester chemistry laboratory course. The results are discussed and related back to the key laboratory education literature as well as the NOS beliefs literature.

Strong support was shown by the participants (n=20) indicating that they found the laboratory work to be the most effective in influencing their NOS beliefs. Minimal participant support was shown for the pre-laboratory, which was ranked second in influencing overall NOS beliefs. Overall, the post-laboratory was ranked third, suggesting that it had a minimal influence on participants' NOS beliefs.

The participants provided unprompted comments about their views of the nature of scientific knowledge throughout their reflective and interview comments discussed in chapters 6 and 7. The data suggest that reflection is necessary to achieve an understanding of NOS, as the interview subjects did increase and improve their understanding, if only slightly (Johnston & Southerland, 2002; Lederman et al., 2003; Southerland et al., 2003). Due to the difficulty and abstractness of the issues of the
NOS, the students must be made to reflect on these issues, typically in reaction to laboratory activities, in order for understanding to take place (Akerson & Abd-El-Khalick, 2002). Lederman, Abd-El-Khalick, Bell, and Schwartz (2002) suggest that many college students have difficulty synthesizing their laboratory experiences into a coherent picture of NOS. The use of explicit NOS laboratory instruction may improve the participants' views of NOS. However, according to Lederman (2004), a one-size-fits-all approach to laboratory scientific inquiry is not typical of real scientific practice and is not likely suitable for advancing consistent and desired NOS views of science, even through explicit or reflective means.

What one cannot infer at this point is whether these laboratory instructional views and NOS belief changes are enduring or whether some participants are simply more adaptable than others. More research studies are needed to investigate the effects of instruction on NOS beliefs.

Limitations

This study has several limitations. One limitation is that the results cannot be generalized. The sample size was small (N=56), and chemistry students are generally not representative of the general student population. In addition, the study was not designed with a control group. The low sample size and lack of a control group may raise questions about power and type II error.

This study was exploratory in nature, laying a foundation for future research focused on more specific features of epistemological and NOS reasoning in light of specific instructional features (pre-lab, laboratory work, or post-lab). Therefore the use of the word "growth" in the title of the dissertation may be a misnomer. It is a bit too presumptuous to infer growth patterns from two data points. The design of the study
makes it difficult to explain the observed changes either as indicators of the general effects of instruction or of a particular form of instruction. In any event, there are not sufficient data to make definitive claims about "growth"; the word change may be a more suitable term.

In studies of this nature (involving repeated measures), completing the initial responses to an instrument could impact responses on the repeated measure of the instrument. A testing effect can occur when the pre-assessment itself influences the post-assessment. The reliability of the assessment instruments may also vary with changes in the human ability to measure differences (due to experience, fatigue, etc.). Therefore, initial and final interviews were implemented to assist in checking the validity of the participants' scores on the EBAPS and NSKS. The initial scores of the interview participants were compared to their initial interview responses, and this method was repeated with the final scores and interviews.

Another limitation is the influence that other learning experiences may have had on the participants' beliefs. Participation in college supports students' intellectual development. In addition to the academic curriculum, there are co-curricular experiences that influence students' development. These factors can be categorized as internal and external. The internal factors include students' gender, age, personal experience, and domain competency. External factors include curriculum, major field of study, and social context in college. It is important that students' developmental potential, including the external factors that influence their developmental growth, be taken into consideration. Although this area is worth researching, it was not the focus of this study.

The issue of whether and how the twenty volunteer interviewees were similar to or different from the remaining thirty-six of the fifty-six participants needed to be considered. Further formal statistical comparisons of the two subgroups on the EBAPS
and NSKS, to determine whether there are similarities in the patterns of responses, would strengthen the study's assumptions. However, this will be attended to at a later date.

Lastly, all of the participants in the study were enrolled in different sections of the same chemistry laboratory course with the same instructor, who was also the researcher. Effects thought to be from the study could instead be a result of her influence on the participants. To control for this, participants were interviewed by another researcher and all reflective responses were read after the conclusion of the study. Qualitative data were chosen based on responses to the quantitative parts of the study and included data from many participants in the same class. Interview participants were self-selected and participated in the study because they wanted to or wanted extra credit. Some participants dropped out of the study after the pre-test; others dropped out after the second week of the course.

Further Research

Researchers of personal epistemology note the need for further work in the area of students' NOS and personal epistemological beliefs and instructional experiences (Hofer, 2002; Schommer-Aikins, 2004). For some instructors of chemistry, the development of appropriate epistemological beliefs in their students is an important goal of instruction. For others, epistemology may not be as important. These instructors, however, would still be wise to encourage appropriate and thorough epistemological self-reflection, because it may facilitate conceptual learning.

Considering the goals of laboratory instruction, one should consider to what extent laboratory courses: (1) help reinforce concepts from the lecture course; (2) improve laboratory skills; (3) convey scientific processes; (4) promote positive attitudes towards science; and (5) help students learn facts about the nature of chemistry and chemicals.
Future research should aim to extract and explain differences in terms of sample characteristics, laboratory methodological differences, and possible variance in the EBAPS and NSKS themselves. The exploration of students' epistemological or NOS beliefs as related to science is rarely, if ever, a part of a student's classroom experience. None of the participants in the study reported having discussed their beliefs in any college class or having had their beliefs inventoried prior to this study.
References

Abd-El-Khalick, F., & Lederman, N. G. (2000). Improving science teachers' conceptions of nature of science: A critical review of the literature. International Journal of Science Education, 22, 665-701.

Abd-El-Khalick, F., & Akerson, V. L. (2004). Learning as conceptual change: Factors mediating the development of preservice elementary teachers' views of nature of science. Science Education, 88(5), 785-810.

Aikenhead, G. S., & Ryan, A. G. (1992). The development of a new instrument: "Views on Science-Technology-Society" (VOSTS). Science Education, 76, 477-491.

Aschbacher, P. R., & Alonzo, A. C. (2004). Using science notebooks to assess students' conceptual understanding. Paper presented at the Annual Meeting of the AERA.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Barnes, R., & Thornton, B. (1998). Preparing for laboratory work. In B. Black & N. Stanley (Eds.), Teaching and Learning in Changing Times, 28-32. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1998/barnes.html (April 10, 2005).

Barton, R. (1997). Computer-aided graphing: A comparative study. Journal of Information Technology for Teacher Education, 6(1), 59-72.

Barton, R. (2005). Supporting teachers in making innovative changes in the use of computer-aided practical work to support concept development in physics education. International Journal of Science Education, 27(3), 345-365.

Bass, K. M., Baxter, G. P., & Glaser, R. (2001). Using reflective writing exercises to promote writing-to-learn in science. Paper presented at the Annual Meeting of the AERA, Seattle.

Baxter Magolda, M. B. (1987). The affective dimension of learning: Faculty-student relationships that enhance intellectual development. College Student Journal, 21, 46-58.

Baxter Magolda, M. B. (1992). Knowing and reasoning in college: Gender-related patterns in students' intellectual development. San Francisco: Jossey-Bass.
Baxter Magolda, M. B. (2002). Epistemological reflection: The evolution of epistemological assumptions from age 18 to 30. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates.

Baxter Magolda, M. B. (2004). Evolution of a constructivist conceptualization of epistemological reflection. Educational Psychologist, 39, 31-42.

Belenky, M. F., Clinchy, B. M., Goldberger, N. R., & Tarule, J. M. (1986). Women's ways of knowing: The development of self, voice, and mind. New York: Basic Books.

Bell, P. (2004). The school science laboratory: Considerations of learning, technology, and scientific practice. Paper presented at the "High School Science Laboratories: Role and Vision" meeting, Board on Science Education, National Academy of Sciences, Washington, DC.

Bell, P., & Linn, M. (2002). Beliefs about science: How does science instruction contribute? In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology (pp. 321-346). Mahwah, NJ: Lawrence Erlbaum Associates.

Bell, R. L., Blair, L. M., Crawford, B. A., & Lederman, N. G. (2003). Just do it? Impact of a science apprenticeship program on high school students' understandings of the nature of science and scientific inquiry. Journal of Research in Science Teaching, 40, 487-509.

Bendixen, L. D. (2002). A process model of epistemic belief change. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates.

Bendixen, L. D., & Rule, D. C. (2004). An integrative approach to personal epistemology: A guiding model. Educational Psychologist, 39, 69-80.

Berg, C. A. R., Bergendahl, V. C. B., Lundberg, B. K. S., & Tibell, L. A. E. (2003). Benefiting from an open-ended experiment? A comparison of attitudes to, and outcomes of, an expository versus an open-inquiry version of the same experiment. International Journal of Science Education, 25, 351-372.

Berg, A. (2005). Factors related to observed attitude change toward learning chemistry among university students. Chemistry Education: Research and Practice, 6, 1-18.

Bianchini, J., Whitney, D., et al. (2001). Toward inclusive science education: University scientists' views of students, instructional practices, and the nature of science. Science Education, 86, 42-78.

Biggs, J. B., Kember, D., & Leung, D. Y. P. (2001). The revised two-factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133-149.
Bloom, B. S. (Ed.). (1956). Taxonomy of educational objectives. New York: David McKay Co.

Bock, M. (1999). Baxter Magolda's epistemological reflection model. New Directions for Student Services, 88, 29-40.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63, 873-878.

Boekaerts, M. (1997). Self-regulated learning: A new concept embraced by researchers, policy makers, educators, teachers, and students. Learning and Instruction, 7, 161-186.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Brickhouse, N. W., Dagher, Z. R., Letts, W. J., & Shipman, H. L. (2000). Diversity of students' views about evidence, theory, and the interface between science and religion in an astronomy course. Journal of Research in Science Teaching, 37, 340-362.

Brownlee, J. (2002). Students learning to teach: Conversing with students about their epistemological beliefs. HERDSA Conference Proceedings.

Bruning, R. H., Schraw, G. J., Norby, M. M., & Ronning, R. R. (Eds.). (2004). Cognitive psychology and instruction. New Jersey: Pearson Prentice Hall.

Buehl, M. M., & Alexander, P. A. (2004). Motivation and performance differences among domain-specific epistemological belief profiles. Paper presented at the Annual Meeting of the American Psychological Association, Honolulu, HI.

Bunce, D. M. (2001). Does Piaget still have anything to say to chemists? Journal of Chemical Education, 78, 1107.

Burnstein, R., & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39, 8-11.

Byers, W. (2002). Promoting active learning through small group laboratory classes. University Chemistry Education, 6, 28-34.

Carey, S., Evans, R., Honda, M., & Unger, C. (1989). "An experiment is when you try it and see if it works": A study of grade 7 students' understanding of the construction of scientific knowledge. International Journal of Science Education, 11, 514-529.

Carey, S., & Smith, C. (1993). On understanding the nature of scientific knowledge. Educational Psychologist, 28, 235-251.
Chin, C., & Brown, D. (2000). Learning in science: A comparison of deep and surface approaches. Journal of Research in Science Teaching, 37(2), 109-138.
Clough, M. P. (1997). Strategies and activities for initiating and maintaining pressure on students' naive views concerning the nature of science. International Journal of Science Education, 28, 191-204.
Clow, D. (1998). Teaching, learning, and computing. University Chemistry Education, 2, 21-28.
Cooley, W. W., & Klopfer, L. E. (1961). Test on understanding science. Princeton, NJ: Educational Testing Service.
Cooper, M. M. (1995). Cooperative learning: An approach for large enrollment courses. Journal of Chemical Education, 72, 162-164.
Coppola, B. P., & Jacobs, D. C. (2001). Is the scholarship of teaching and learning new to chemistry? In Disciplinary styles in the scholarship of teaching and learning. Carnegie Foundation for the Advancement of Teaching.
Coxhead, P., & Whitefield, R. (1975). Science understanding measure test manual. University of Aston, Birmingham.
Dagher, Z., Brickhouse, N., Shipman, H., & Letts, W. (2004). How some college students represent their understanding of scientific theories. International Journal of Science Education, 26, 735-755.
Dawe, S. (2003). Practical work: The universal panacea? Available at: http://www.bishops.ntc.nf.ca/rriche/ed6620/practical.html (Accessed: 2006).
DeMeo, S. (2001). Teaching chemical technique: A review of the literature. Journal of Chemical Education, 78, 373-379.
Denzin, N. K., & Lincoln, Y. S. (1994). Handbook of qualitative research. Thousand Oaks, CA: Sage Publications.
DiPasquale, D. M., Mason, C. L., & Kolkhorst, F. W. (2003). Exercise in inquiry: Critical thinking in an inquiry-based exercise physiology laboratory course. Journal of College Science Teaching, 32, 388-393.
Domin, D. (1999). A content analysis of general chemistry laboratory manuals for evidence of higher-order cognitive tasks. Journal of Chemical Education, 76, 109-111.
Drayton, B., & Falk, J. (2002). Inquiry-oriented science as a feature of your school system: What does it take? Science Educator, 11(1), 9-17.
Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23, 5-12.

Driver, R., Leach, J., Millar, R., & Scott, P. (1996). Young people's images of science. Buckingham: Open University Press.
Duell, O. K., & Schommer-Aikins, M. (2001). Measures of people's beliefs about knowledge and learning. Educational Psychology Review, 13, 419-446.
Duschl, R. A., Hamilton, R. J., & Grandy, R. E. (1992). Psychology and epistemology: Match or mismatch when applied to science education? In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice. Albany, NY: State University of New York Press.
Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95, 256-273.
Elby, A., & Hammer, D. (2001). On the substance of a sophisticated epistemology. Science Education, 85(5), 554-567.
Elby, A., Frederiksen, J., Schwarz, C., & White, B. (1999). Epistemological Beliefs Assessment for Physical Science. Retrieved May 9, 2004, from http://www2.physics.umd.edu/~elby/EBAPS/home.htm
Entwistle, N. J., & McCune, V. (2004). The conceptual bases of study strategy inventories. Educational Psychology Review, 16, 325-345.
Farmer, W. A., Farrell, M. A., & Lehman, J. R. (1991). Secondary science instruction: An integrated approach. Massachusetts: Addison-Wesley.
Felder, R., & Brent, R. (2004). The intellectual development of science and engineering students: Part 1. Models and challenges. Journal of Engineering Education, 93, 269-277.
Felder, R., & Brent, R. (2004). The intellectual development of science and engineering students: Part 2. Teaching to promote growth. Journal of Engineering Education, 93, 279-291.
Felder, R., & Brent, R. (2005). Understanding student differences. Journal of Engineering Education, 94, 57-72.
Finster, D. C. (1989). Developmental instruction: Part I. Perry's model of intellectual development. Journal of Chemical Education, 66, 659-661.
Finster, D. C. (1991). Developmental instruction: Part II. Application of the Perry model to general chemistry. Journal of Chemical Education, 68, 752-756.
Fitch, J. (2004). Student feedback in the college classroom: A technology solution. Educational Technology, Research and Development, 52, 71-81.
Friedler, Y., & Tamir, P. (1984). Teaching and learning in high school laboratory classes in Israel. Research in Science Education, 14, 89-96.

Friedler, Y., Nachmias, R., & Linn, M. C. (1990). Learning scientific reasoning skills in microcomputer-based laboratories. Journal of Research in Science Teaching, 27, 173-191.
Gabel, D. L. (1999). Improving teaching and learning through chemistry education research: A look to the future. Journal of Chemical Education, 76, 548-554.
Gall, M. D., Borg, W. R., & Gall, J. P. (2007). Educational research: An introduction (8th ed.). White Plains, NY: Longman.
Garratt, J. (1998). Inducing people to think. University Chemistry Education, 2, 29-33.
Garrett-Ingram, C. (1997). Something to believe in: The relationship between epistemological beliefs and study strategies. Paper presented at the Annual Meeting of the American Educational Research Association, Chicago.
Guba, E., & Lincoln, Y. (1989). Fourth generation evaluation. Beverly Hills, CA: Sage.
Gunstone, R., & Champagne, A. (1990). Promoting conceptual change in the laboratory. In E. Hegarty-Hazel (Ed.), The student laboratory and the science curriculum. London: Routledge.
Halloun, I., & Hestenes, D. (1998). Interpreting VASS dimensions and profiles. Science & Education, 7, 553-577.
Hammer, D. M. (1994). Epistemological beliefs in introductory physics. Cognition and Instruction, 12, 151-183.
Hammer, D., & Elby, A. (2002). On the form of a personal epistemology. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Erlbaum.
Hammer, D. M., & Elby, A. (2003). Tapping epistemological resources for learning physics. Journal of the Learning Sciences, 12, 53-90.
Herrington, A. J. (1997). Developing and responding to major writing projects. In M. D. Sorcinelli & P. Elbow (Eds.), Writing to learn: Strategies for assigning and responding to writing across the disciplines (Vol. 69, pp. 67-75). San Francisco: Jossey-Bass.
Herron, J. D., & Nurrenbern, S. C. (1999). Chemical education research: Improving chemistry learning. Journal of Chemical Education, 76, 1353-1361.
Hodson, D. (1996). Laboratory work as scientific method: Three decades of confusion and distortion. Journal of Curriculum Studies, 28, 115-135.
Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: Beliefs about knowledge and knowing and their relation to learning. Review of Educational Research, 67, 88-140.

Hofer, B. K. (2000). Dimensionality and disciplinary differences in personal epistemology. Contemporary Educational Psychology, 25, 378-405.
Hofer, B. K. (2001). Personal epistemology research: Implications for learning and teaching. Educational Psychology Review, 13, 353-383.
Hofer, B. K., & Pintrich, P. R. (2002). Personal epistemology as a psychological and educational construct: An introduction. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Hofer, B. K. (2002a). Epistemological world views of teachers: From beliefs to practice. Issues in Education, 8, 167-173.
Hofer, B. K. (2004). Epistemological understanding as a metacognitive process: Thinking aloud during online searching. Educational Psychologist, 39, 43-55.
Hofstein, A. (2004). The laboratory in chemistry education: Thirty years of experience with developments, implementation, and research. Chemistry Education: Research and Practice, 5, 247-264.
Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundation for the 21st century. Science Education, 88, 28-54.
Hofstein, A., Navon, O., Kipnis, M., & Mamlok-Naaman, R. (2005). Developing students' ability to ask more and better questions resulting from inquiry-type chemistry laboratories. Journal of Research in Science Teaching, 42, 791-806.
Hogan, K. (1999). Thinking aloud together: A test of an intervention to foster students' collaborative scientific reasoning. Journal of Research in Science Teaching, 36(10), 1085-1109.
Hogan, K. (2000). Exploring a process view of students' knowledge about the nature of science. Science Education, 84, 51-70.
Jehng, J., Johnson, S. D., & Anderson, R. (1993). Schooling and students' epistemological beliefs about learning. Contemporary Educational Psychology, 18, 23-35.
Johnstone, A. H. (1997). Chemistry teaching - science or alchemy? Journal of Chemical Education, 74, 262-268.
Johnstone, A. H., & Al-Shuaili, A. (2001). Learning in the laboratory: Some thoughts from the literature. University Chemistry Education, 5, 42-51.
Keys, C. W., Hand, B., Vaughn, P., & Collins, S. (1999). Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. Journal of Research in Science Teaching, 36, 1065-1084.

Keys, C. W. (2000). Investigating the thinking processes of eighth grade writers during the composition of a scientific laboratory report. Journal of Research in Science Teaching, 37, 676-690.
Kimball, M. E. (1967-1968). Understanding the nature of science: A comparison of scientists and science teachers. Journal of Research in Science Teaching, 5, 110-120.
King, P. M., & Kitchener, K. S. (1994). Developing reflective judgment: Understanding and promoting intellectual growth and critical thinking in adolescents and adults. San Francisco: Jossey-Bass.
King, P. M., & Kitchener, K. S. (2002). The reflective judgment model: Twenty years of epistemic cognition. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
King, P. M., & Kitchener, K. S. (2004). Reflective judgment: Theory and research on the development of epistemic assumptions through adulthood. Educational Psychologist, 39, 5-18.
Kirschner, P. A. (1992). Epistemology, practical work and academic skills in science education. Science & Education, 1, 273-299.
Koprowski, J. L. (1997). Sharpening the craft of scientific writing. Journal of College Science Teaching, 27, 133-135.
Kuhn, D., Amsel, E., & O'Laughlin, M. (1988). The development of scientific thinking skills. Orlando, FL: Academic Press.
Kuhn, D. (1991). The skills of argument. Cambridge, England: Cambridge University Press.
Kuhn, D., Cheney, R., & Weinstock, M. (2000). The development of epistemological understanding. Cognitive Development, 15, 309-328.
Kuhn, D., & Weinstock, M. (2002). What is epistemological thinking and why does it matter? In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Layman, J. W. (1996). Inquiry and learning. New York: The College Board.
Lazarowitz, R., & Tamir, P. (1994). Research on using laboratory instruction in science. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning: A project of the National Science Teachers Association (pp. 94-128). New York: MacMillan.

Leach, J., Ryder, J., & Driver, R. (1996). ULISP Working Paper 2: The research project study: Design and methodology. Centre for Studies in Science and Mathematics Education, University of Leeds, UK.
Leach, J., Ryder, J., & Driver, R. (1997). ULISP Working Paper 5: Undergraduate science research projects and students' images of the nature of science. Centre for Studies in Science and Mathematics Education, University of Leeds, UK.
Leach, J., Millar, R., Ryder, J., Séré, M.-G., Hammelev, D., Niedderer, H., & Tselfes, V. (1998). Survey 2: Students' images of science as they relate to labwork learning. Working Paper 4 of the European Commission Labwork in Science Education Project, PL 95-2005.
Lederman, N. G., & Zeidler, D. L. (1987). Science teachers' conceptions of the nature of science: Do they really influence teaching behavior? Science Education, 71, 721-734.
Lederman, N. G. (1992). Students' and teachers' conceptions of the nature of science: A review of the research. Journal of Research in Science Teaching, 29, 331-359.
Lederman, N. G., Bell, R. L., & Wade, P. D. (1997). Assessing knowledge and teaching of the nature of science: Problems, concerns, and suggestions. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Chicago, IL.
Lederman, N. G. (1998). The state of science education: Subject matter without context. Electronic Journal of Science Education, 3(2).
Lederman, N. G., Wade, P. D., & Bell, R. L. (1998). Assessing the nature of science: What is the nature of our assessments? Science & Education, 7, 595-615.
Lederman, N. G. (2004). Syntax of nature of science within inquiry and science instruction. In L. B. Flick & N. G. Lederman (Eds.), Scientific inquiry and nature of science. Dordrecht/Boston/London: Kluwer Academic Publishers.
Leonard, W. H. (2000). How do college students best learn science? Journal of College Science Teaching, 29, 385-388.
Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.
Linn, M. C. (1995). Designing computer learning environments for engineering and computer science: The scaffolded knowledge integration framework. Journal of Science Education and Technology, 4, 103-126.
Linn, M. C., & Hsi, S. (2000). Computers, teachers, peers: Science learning partners. Mahwah, NJ: Erlbaum.
Lord, T. (1994). Using constructivism to enhance student learning in college biology. Journal of College Science Teaching, 23, 346-348.

Louca, L., Elby, A., Hammer, D., & Kagey, T. (2004). Epistemological resources: Applying a new epistemological framework to science instruction. Educational Psychologist, 39, 57-68.
Lunetta, V. N. (1998). The school science laboratory. In B. Fraser & K. Tobin (Eds.), International handbook of science education (pp. 249-262). Dordrecht: Kluwer.
MacGregor, J., Cooper, J. L., Smith, K. A., & Robinson, P. (2000). Strategies for energizing large classes. New Directions for Teaching and Learning, 81, 1-4.
Malina, E. G., & Nakhleh, M. B. (2003). How students use scientific instruments to create understanding: CCD spectrophotometers. Journal of Chemical Education, 80, 691-698.
Markow, P. G., & Lonning, R. A. (1998). Usefulness of concept maps in college chemistry laboratories: Students' perceptions and effects on achievement. Journal of Research in Science Teaching, 35(9), 1015-1029.
McComas, W. F., Almazroa, H., & Clough, M. P. (1998). The nature of science in science education: An introduction. Science & Education, 7, 511-532.
McComas, W. F. (2004). Seeking NOS standards: What content consensus exists in popular books on the nature of science. Paper presented at the meeting of the National Association for Research in Science Teaching.
Meichtry, Y. J. (1993). The impact of science curricula on student views about the nature of science. Journal of Research in Science Teaching, 30, 429-443.
Millar, R., Le Maréchal, J.-F., & Tiberghien, A. (1998). A map of the variety of labwork. Working Paper 1 of the European Commission Labwork in Science Education Project, PL 95-2005.
Millar, R. (2002). Thinking about practical work. In S. Amos & R. Boohan (Eds.), Aspects of teaching secondary science: Perspectives on practice (pp. 53-59). London: RoutledgeFalmer.
Millar, R. (2004). The role of practical work in the teaching and learning of science. Paper presented at the meeting "High School Science Laboratories: Role and Vision," National Academy of Sciences, Washington, DC.
Milwood, K. A., & Sandoval, W. A. (2004). A comparison of students' beliefs about school science and professional science. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.
Moss, D. M., Abrams, E. D., & Robb, J. (2001). Examining student conceptions of the nature of science. International Journal of Science Education, 23(8), 771-790.

Nachmias, R., & Linn, M. C. (1987). Evaluations of science laboratory data: The role of the computer. Journal of Research in Science Teaching, 24, 491-506.
Nakhleh, M. B. (1994). A review of microcomputer-based labs: How have they affected science learning? Journal of Computers in Mathematics and Science Teaching, 13, 368-381.
National Research Council. (1997). Science teaching reconsidered: A handbook. Washington, DC: National Academy Press.
National Research Council. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
National Survey of Student Engagement. (2004). Student engagement: Pathways to collegiate success. Bloomington, IN: Indiana University Center for Postsecondary Research.
Newton, L. (1997). Graph talk: Some observations and reflections on students' data-logging. School Science Review, 79, 49-53.
Newton, L. (2000). Data-logging in practical science: Research and reality. International Journal of Science Education, 22, 1247-1259.
Nolen, S. B. (2003). Learning environment, motivation, and achievement in high school science. Journal of Research in Science Teaching, 40, 347-368.
Odom, A. L., & Barrow, L. H. (1995). Development and application of a two-tier diagnostic test measuring college biology students' understanding of diffusion and osmosis after a course of instruction. Journal of Research in Science Teaching, 32, 45-61.
Olmstead, J. A. (1999). The mid-lecture break: When less is more. Journal of Chemical Education, 76, 525-527.
O'Sullivan, D. W., & Copper, C. L. (2003). Evaluating active learning: A new initiative for a general chemistry curriculum. Journal of College Science Teaching, 32, 448-452.
Padgett, D. (1998). Qualitative methods in social work research. Thousand Oaks, CA: Sage Publications.
Palmer, B., & Marra, R. M. (2004). College student epistemological perspectives across knowledge domains: A proposed grounded theory. Higher Education, 47, 311-335.
Paulsen, M. B., & Wells, C. T. (1998). Domain differences in the epistemological beliefs of college students. Research in Higher Education, 39, 365-384.

Pavelich, M. J., & Moore, W. S. (1996). Measuring the effect of experiential education using the Perry model. Journal of Engineering Education, 85, 287-292.
Pavelich, M., Jenkins, B., Birk, J., Bauer, R., & Krause, S. (2004). Development of a chemistry concept inventory for use in chemistry, materials and other engineering courses. Paper presented at the Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, Savannah, GA.
Perry, W. G. (1970). Forms of intellectual and ethical development in the college years: A scheme. New York: Holt, Rinehart & Winston.
Pienta, N. J., & Amend, J. R. (2004). Electronic data collection to promote effective learning during laboratory activities. In N. J. Pienta, M. M. Cooper, & T. Greenbowe (Eds.), The chemists' guide to effective teaching. Prentice-Hall Publishing Co.
Pintrich, P. R. (2002). Future challenges and directions for theory and research on personal epistemology. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Posner, G., Strike, K., Hewson, P., & Gertzog, W. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66, 211-227.
Prain, V., & Hand, B. (1996). Writing for learning in secondary science: Rethinking practices. Teaching & Teacher Education, 12, 609-612.
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93, 223-231.
Redish, E. F., Saul, J. M., & Steinberg, R. N. (1998). Student expectations in introductory physics. American Journal of Physics, 66, 212-224.
Reid, N., & Shah, I. (2007). The role of laboratory work in university chemistry. Chemistry Education Research and Practice, 8, 172-185.
Rivard, L. P. (1994). A review of writing to learn in science: Implications for practice and research. Journal of Research in Science Teaching, 31, 969-983.
Rivard, L. P., & Straw, S. B. (2000). The effect of talk and writing on learning science: An exploratory study. Science Education, 84, 566-593.
Rogers, L. T. (1997). New data-logging tools - new investigations. School Science Review, 79, 61-68.

Roth, W. M., & Roychoudhury, A. (1994). Physics students' epistemologies and views about knowing and learning. Journal of Research in Science Teaching, 31, 5-30.
Rubba, P. A. (1977). Nature of scientific knowledge scale: Test and user's manual. East Lansing, MI: National Center for Research on Teacher Learning. (ERIC Document Reproduction Service No. ED 146 225)
Rubba, P. A., & Anderson, O. (1978). Development of an instrument to assess secondary school students' understanding of the nature of scientific knowledge. Science Education, 62, 449-458.
Russell, A. A., & Hill, J. C. (1989). California Chemistry Diagnostic Test (Examinations Institute of the American Chemical Society Division of Chemical Education No. CD89). USA: American Chemical Society Division of Educational Examination.
Ryder, J., Leach, J., & Driver, R. (1999). Undergraduate science students' images of science. Journal of Research in Science Teaching, 36(2), 201-219.
Sadler, T. D., Chambers, F. W., & Zeidler, D. L. (2002). Investigating the crossroads of the nature of science, socioscientific issues, and critical thinking. Paper presented at the annual meeting of the National Association for Research in Science Teaching, New Orleans, LA.
Sandoval, W. A., & Morrison, K. (2003). High school students' ideas about theories and theory change after a biological inquiry unit. Journal of Research in Science Teaching, 40, 369-392.
Schmid, S., & Yeung, A. (2005). The influence of a pre-laboratory work module on student performance in the first year chemistry laboratory. Research and Development in Higher Education, 28, 471-479.
Schoenfeld, A. (1983). Beyond the purely cognitive: Belief systems, social cognitions, and metacognitions as driving forces in intellectual performance. Cognitive Science, 7, 329-363.
Schoenfeld, A. (1988). When good teaching leads to bad results: The disasters of "well taught" mathematics classes. Educational Psychologist, 23, 145-166.
Schommer, M. (1990). Effects of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82, 498-504.
Schommer, M., Crouse, A., & Rhodes, N. (1992). Epistemological beliefs and mathematical text comprehension: Believing it is simple does not make it so. Journal of Educational Psychology, 84, 435-443.
Schommer, M. (1993). Epistemological development and academic performance among secondary students. Journal of Educational Psychology, 85, 406-411.

Schommer, M. (1994). Synthesizing epistemological belief research: Tentative understandings and provocative confusions. Educational Psychology Review, 6, 293-319.
Schommer, M., & Walker, K. (1997). Epistemological beliefs and valuing school: Considerations for college admissions and retention. Research in Higher Education, 38, 173-186.
Schommer-Aikins, M. (2002). An evolving theoretical framework for an epistemological belief system. In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Schraw, G. (2001). Current themes and future directions in epistemological research: A commentary. Educational Psychology Review, 13, 451-464.
Schraw, G., Bendixen, L. D., & Dunkle, M. E. (2002). Development and evaluation of the Epistemic Belief Inventory (EBI). In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Séré, M.-G., Leach, J., Niedderer, H., et al. (1998). Labwork in science education: Executive summary. European Commission Targeted Socio-Economic Research Programme.
Settlage, J. (1995). Children's conceptions of light in the context of a technology-based curriculum. Science Education, 79, 535-553.
Shepardson, D. P., & Britsch, S. J. (2000). Analyzing children's science journals. Science and Children, November/December, 13-17.
Shiland, T. W. (1999). Constructivism: The implications for laboratory work. Journal of Chemical Education, 76, 107-109.
Smith, C. L., Maclin, D., Houghton, C., & Hennessey, M. G. (2000). Sixth-grade students' epistemologies of science: The impact of school science experiences on epistemological development. Cognition and Instruction, 18, 349-422.
Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogical engagement: Classroom-based practices. Journal of Engineering Education, 94, 87-101.
Songer, N., & Linn, M. (1991). How do students' views of science influence knowledge integration? Journal of Research in Science Teaching, 28, 761-784.
Steiner, R., & Sullivan, J. (1984). Variables correlating with student success in organic chemistry. Journal of Chemical Education, 61, 1072-1074.

Strike, K. A., & Posner, G. J. (1992). A revisionist theory of conceptual change. In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp. 147-176). New York: State University of New York Press.
Taber, K. S. (2000). Chemistry lessons for universities: A review of constructivist ideas. University Chemistry Education, 4, 63-72.
Tapper, J. (1999). Topics and manner of talk in undergraduate practical laboratories. International Journal of Science Education, 21(4), 447-464.
Thoermer, C., & Sodian, B. (2002). Science undergraduates' and graduates' epistemologies of science: The notion of interpretive frameworks. New Ideas in Psychology, 20, 263-283.
Tiberghien, A., Veillard, L., Le Maréchal, J.-F., Buty, C., & Millar, R. (2001). An analysis of labwork tasks used in science teaching at upper secondary school and university levels in several European countries. Science Education, 85, 483-508.
Tobin, K. (1990). Research on science laboratory activities: In pursuit of better questions and answers to improve learning. School Science and Mathematics, 90, 403-418.
Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students' misconceptions in science. International Journal of Science Education, 10, 159-169.
Valanides, N., & Angeli, C. (2005). Effects of instruction on changes in epistemological beliefs. Contemporary Educational Psychology, 30, 314-330.
Valentine, J. C., DuBois, D. L., & Cooper, H. (2004). The relation between self-beliefs and academic achievement: A meta-analytic review. Educational Psychologist, 39, 111-133.
Vhurumuku, E., Holtman, L., Stein, D. K., & Mikalsen, O. (2004). An investigation of Zimbabwean A-level chemistry students' laboratory-work-based images of the nature of science. Paper presented at the Proceedings of the 12th Annual Conference of the Southern African Association for Research in Mathematics, Science, and Technology Education, Cape Town, South Africa.
Weinburgh, M. (1998). Gender, ethnicity, and grade level as predictors of middle school students' attitudes toward science. Georgia State University. (www.ed.psu.edu/CI/Journals/1998AETS/s5_1_weinburgh.rtf)
Wimpfheimer, T. (2004). Peer-evaluated poster sessions: An alternative to grading general chemistry laboratory work. Journal of Chemical Education, 81, 1775-1776.

Wise, J., Lee, S. H., Litzinger, T. A., Marra, R. M., & Palmer, B. (2004). A report on a four-year longitudinal study of intellectual development of engineering undergraduates. Journal of Adult Development, 11, 103-110.
Wolters, C. A., & Pintrich, P. R. (1998). Contextual differences in student motivation and self-regulated learning in mathematics, English and social studies classrooms. Instructional Science, 26, 27-47.
Wyatt, R. (2003). Campus technology: Online biology pre-labs. http://www.campus-technology.com/article.asp?id=7770 (April 14, 2005).
Zachos, P., Hick, T. L., Doane, W. E., & Sargent, C. (2000). Setting theoretical and empirical foundations for assessing scientific inquiry and discovery in educational programs. Journal of Research in Science Teaching, 37, 938-962.
Zeegers, P. (2001). Approaches to learning in science: A longitudinal study. British Journal of Educational Psychology, 71, 115-132.
Zeidler, D. L. (1984). Moral issues and social policy in science education: Closing the literacy gap. Science Education, 68, 411-419.
Zeidler, D. L., Walker, K. A., Ackett, W. A., & Simmons, M. L. (2002). Tangled up in views: Beliefs in the nature of science and responses to socioscientific dilemmas. Science Education, 86, 343-367.
Ziman, J. (1995). Reliable knowledge: An exploration of the grounds for belief in science. New York: Cambridge University Press.
Zimmerman, B. J. (1990). Self-regulated learning and academic achievement: An overview. Educational Psychologist, 25, 3-17.
Zimmerman, B. J., Bandura, A., & Martinez-Pons, M. (1992). Self-motivation for academic attainment: The role of self-efficacy beliefs and personal goal setting. American Educational Research Journal, 29, 663-676.
Zusho, A., Pintrich, P. R., & Coppola, B. (2003). Skill and will: The role of motivation and cognition in the learning of college chemistry. International Journal of Science Education, 25, 1081-1094.

Appendices

Appendix A: Chemical Concepts Inventory

We are asking you to complete this inventory to determine the prior conceptual knowledge and misconceptions in general chemistry that students bring to the classroom.

THIS INVENTORY CANNOT AFFECT YOUR GRADE IN ANY WAY.

Instructions:
1. Please write and bubble in your student identification number (U Number) on the scantron with pencil.
2. On the signature line write CCI and the date.
3. This inventory consists of 22 multiple-choice questions.
4. Several of the questions are paired. In these cases, the first question asks you about a chemical or physical effect. The second question then asks for the reason for the observed effect.
5. Please do not write on this inventory; bubble your answers on the scantron.
6. Turn in both the inventory and the scantron.
7. You may not remember some of the material from your high school chemistry course. Please take the time to think about the questions and answer to the best of your ability.

We appreciate your help with this project.

Chemistry Concepts Inventory

1. Which of the following must be the same before and after a chemical reaction?
(a) The sum of the masses of all substances involved.
(b) The number of molecules of all substances involved.
(c) The number of atoms of each type involved.
(d) Both (a) and (c) must be the same.
(e) Each of the answers (a), (b), and (c) must be the same.

2. Assume a beaker of pure water has been boiling for 30 minutes. What is in the bubbles in the boiling water?
(a) Air.
(b) Oxygen gas and hydrogen gas.
(c) Oxygen.
(d) Water vapor.
(e) Heat.

3. A glass of cold milk sometimes forms a coat of water on the outside of the glass (often referred to as "sweat"). How does most of the water get there?
(a) Water evaporates from the milk and condenses on the outside of the glass.
(b) The glass acts like a semi-permeable membrane and allows the water to pass, but not the milk.
(c) Water vapor condenses from the air.
(d) The coldness causes oxygen and hydrogen from the air to combine on the glass, forming water.

4. What is the mass of the solution when 1 pound of salt is dissolved in 20 pounds of water?
(a) 19 pounds.
(b) 20 pounds.
(c) Between 20 and 21 pounds.
(d) 21 pounds.
(e) More than 21 pounds.

5. The diagram represents a mixture of S atoms and O₂ molecules in a closed container. Which diagram shows the results after the mixture reacts as completely as possible according to the equation 2S + 3O₂ → 2SO₃?
[Answer choices A through E are particle diagrams in the original instrument.]

6. The circle on the left shows a magnified view of a very small portion of liquid water in a closed container. What would the magnified view show after the water evaporates?
[Answer choices A through E are particle diagrams in the original instrument.]

7. True or False? When a match burns, some matter is destroyed.
(a) True
(b) False

8. What is the reason for your answer in question 7?
(a) This chemical reaction destroys matter.
(b) Matter is consumed by the flame.
(c) The mass of ash is less than the match it came from.
(d) The atoms are not destroyed, they are only rearranged.
(e) The match weighs less after burning.

9. Heat is given off when hydrogen burns in air according to the equation 2H₂ + O₂ → 2H₂O. Which of the following is responsible for the heat?
(a) Breaking hydrogen bonds gives off energy.
(b) Breaking oxygen bonds gives off energy.
(c) Forming hydrogen-oxygen bonds gives off energy.
(d) Both (a) and (b) are responsible.
(e) (a), (b), and (c) are responsible.

10. Two ice cubes are floating in water (shown in a diagram in the original instrument). After the ice melts, will the water level be:
(a) higher?
(b) lower?
(c) the same?

11. What is the reason for your answer?
(a) The weight of water displaced is equal to the weight of the ice.
(b) Water is more dense in its solid form (ice).
(c) Water molecules displace more volume than ice molecules.
(d) The water from the ice melting changes the water level.
(e) When ice melts, its molecules expand.

12. A 1.0-gram sample of solid iodine is placed in a tube and the tube is sealed after all of the air is removed. The tube and the solid iodine together weigh 27.0 grams. The tube is then heated until all of the iodine evaporates and the tube is filled with iodine gas. Will the weight after heating be:
(a) less than 26.0 grams.
(b) 26.0 grams.
(c) 27.0 grams.
(d) 28.0 grams.
(e) more than 28.0 grams.

13. What is the reason for your answer?
(a) A gas weighs less than a solid.
(b) Mass is conserved.
(c) Iodine gas is less dense than solid iodine.
(d) Gases rise.
(e) Iodine gas is lighter than air.

14. What is the approximate number of carbon atoms it would take, placed next to each other, to make a line that would cross this dot:
(a) 4
(b) 200
(c) 30,000,000
(d) 6.02 × 10²³

15. Figure 1 represents a 1.0 L solution of sugar dissolved in water. The dots in the magnification circle represent the sugar molecules. In order to simplify the diagram, the water molecules have not been shown. Which response represents the view after 1.0 L of water was added (Figure 2)?
[Figures 1 and 2 and answer choices A through E are diagrams in the original instrument.]

16. 100 mL of water at 25°C and 100 mL of alcohol at 25°C are both heated at the same rate under identical conditions. After 3 minutes the temperature of the alcohol is 50°C. Two minutes later the temperature of the water is 50°C. Which liquid received more heat as it warmed to 50°C?
(a) The water.
(b) The alcohol.
(c) Both received the same amount of heat.
(d) It is impossible to tell from the information given.

17. What is the reason for your answer?
(a) Water has a higher boiling point than the alcohol.
(b) Water takes longer to change its temperature than the alcohol.
(c) Both increased their temperatures 25°C.
(d) Alcohol has a lower density and vapor pressure.
(e) Alcohol has a higher specific heat so it heats faster.

18. Iron combines with oxygen and water from the air to form rust. If an iron nail were allowed to rust completely, one should find that the rust weighs:
(a) less than the nail it came from.
(b) the same as the nail it came from.
(c) more than the nail it came from.
(d) It is impossible to predict.

19. What is the reason for your answer?
(a) Rusting makes the nail lighter.
(b) Rust contains iron and oxygen.
(c) The nail flakes away.
(d) The iron from the nail is destroyed.
(e) The flaky rust weighs less than iron.

20. Salt is added to water and the mixture is stirred until no more salt dissolves. The salt that does not dissolve is allowed to settle out. What happens to the concentration of salt in solution if water evaporates until the volume of the solution is half the original volume? (Assume temperature remains constant.) The concentration
(a) increases.
(b) decreases.
(c) stays the same.

21. What is the reason for your answer to question 20?
(a) There is the same amount of salt in less water.
(b) More solid salt forms.
(c) Salt does not evaporate and is left in solution.
(d) There is less water.

22. Following is a list of properties of a sample of solid sulfur:
i. Brittle, crystalline solid.
ii. Melting point of 113°C.
iii. Density of 2.1 g/cm³.
iv. Combines with oxygen to form sulfur dioxide.
Which, if any, of these properties would be the same for one single atom of sulfur obtained from the sample?
(a) i and ii only.
(b) iii and iv only.
(c) iv only.
(d) All of these properties would be the same.
(e) None of these properties would be the same.

Appendix B: Epistemological Beliefs Assessment for the Physical Sciences

Instructions:
• We are asking you to complete this inventory to assist us in probing the epistemological stances of students taking physics, chemistry, or physical science.
• For each of the items, please read the statement and indicate (on the scantron answer sheet) the answer that describes how strongly you agree or disagree, or fill in the answer that best fits your view, or indicate whether you agree with one student or the other.
• The data collected will be handled anonymously throughout.
• This inventory cannot affect your grade; it can only help improve it.
• The inventory consists of 30 statements.
• Calculators are not needed for these questions.
• Please do not write on this inventory.
• Bubble your choices on the scantron using pencil.
• Write in and bubble your study ID number (U #).
• Write EBAPS on the signature line with the date.
• Turn in both the inventory and the scantron.

EPISTEMOLOGICAL BELIEFS ASSESSMENT FOR THE PHYSICAL SCIENCES

Part 1

DIRECTIONS: For each of the following items, please read the statement and indicate (on the scantron answer sheet) the answer that describes how strongly you agree or disagree.

A: Strongly disagree   B: Somewhat disagree   C: Neutral   D: Somewhat agree   E: Strongly agree

1. Tamara just read something in her science textbook that seems to disagree with her own experiences. But to learn science well, Tamara shouldn't think about her own experiences; she should just focus on what the book says.
2. When it comes to understanding physics or chemistry, remembering facts isn't very important.
3. Obviously, computer simulations can predict the behavior of physical objects like comets. But simulations can also help scientists estimate things involving the behavior of people, such as how many people will buy new television sets next year.
4. When it comes to science, most students either learn things quickly, or not at all.
5. If someone is having trouble in physics or chemistry class, studying in a better way can make a big difference.
6. When it comes to controversial topics such as which foods cause cancer, there's no way for scientists to evaluate which scientific studies are the best. Everything's up in the air!
7. A teacher once said, "I don't really understand something until I teach it." But actually, teaching doesn't help a teacher understand the material better; it just reminds her of how much he or she already knows.
8. Scientists should spend almost all their time gathering information. Worrying about theories can't really help us understand anything.
9. Someone who doesn't have high natural ability can still learn the material well even in a hard chemistry or physics class.
10. Often, a scientific principle or theory just doesn't make sense. In those cases, you have to accept it and move on, because not everything in science is supposed to make sense.

11. When handing in a physics or chemistry test, you can generally have a sense of how well you did even before talking about it with other students.
12. When learning science, people can understand the material better if they relate it to their own ideas.
13. If physics and chemistry teachers gave really clear lectures, with plenty of real-life examples and sample problems, then most good students could learn those subjects without doing lots of sample questions and practice problems on their own.
14. Understanding science is really important for people who design rockets, but not important for politicians.
15. When solving problems, the key thing is knowing the methods for addressing each particular type of question. Understanding the "big ideas" might be helpful for specially-written problems, but not for most regular problems.
16. Given enough time, almost everybody could learn to think more scientifically, if they really wanted to.
17. To understand chemistry and physics, the formulas (equations) are really the main thing; the other material is mostly to help you decide which equations to use in which situations.

Part 2

DIRECTIONS: Multiple choice. On the answer sheet, fill in the answer that best fits your view.

18. If someone is trying to learn physics, is the following a good kind of question to think about?
Two students want to break a rope. Is it better for them to (1) grab opposite ends of the rope and pull (like in tug-of-war), or (2) tie one end of the rope to a wall and both pull on the other end together?
(a) Yes, definitely. It's one of the best kinds of questions to study.
(b) Yes, to some extent. But other kinds of questions are equally good.
(c) Yes, a little. This kind of question is helpful, but other kinds of questions are more helpful.
(d) Not really. This kind of question isn't that great for learning the main ideas.
(e) No, definitely not. This kind of question isn't helpful at all.

19. Scientists are having trouble predicting and explaining the behavior of thunderstorms. This could be because thunderstorms behave according to a very complicated or hard-to-apply set of rules. Or, that could be because some thunderstorms don't behave consistently according to any set of rules, no matter how complicated and complete that set of rules is.
In general, why do scientists sometimes have trouble explaining things? Please read all options before choosing one.
(a) Although things behave in accordance with rules, those rules are often complicated, hard to apply, or not fully known.
(b) Some things just don't behave according to a consistent set of rules.
(c) Usually it's because the rules are complicated, hard to apply, or unknown; but sometimes it's because the thing doesn't follow rules.
(d) About half the time, it's because the rules are complicated, hard to apply, or unknown; and half the time, it's because the thing doesn't follow rules.
(e) Usually it's because the thing doesn't follow rules; but sometimes it's because the rules are complicated, hard to apply, or unknown.

20. In chemistry, how do the most important formulas relate to the most important concepts? Please read all choices before picking one.
(a) The major formulas summarize the main concepts; they're not really separate from the concepts. In addition, those formulas are helpful for solving problems.
(b) The major formulas are kind of "separate" from the main concepts, since concepts are ideas, not equations. Formulas are better characterized as problem-solving tools, without much conceptual meaning.
(c) Mostly (a), but a little (b).
(d) About half (a) and half (b).
(e) Mostly (b), but a little (a).

21. To be successful at most things in life ...
(a) Hard work is much more important than inborn natural ability.
(b) Hard work is a little more important than natural ability.
(c) Natural ability and hard work are equally important.
(d) Natural ability is a little more important than hard work.
(e) Natural ability is much more important than hard work.

22. To be successful at science ...
(a) Hard work is much more important than inborn natural ability.
(b) Hard work is a little more important than natural ability.
(c) Natural ability and hard work are equally important.
(d) Natural ability is a little more important than hard work.
(e) Natural ability is much more important than hard work.

23. Of the following test formats, which is best for measuring how well students understand the material in chemistry? Please read each choice before picking one.
(a) A large collection of short-answer or multiple choice questions, each of which covers one specific fact or concept.
(b) A small number of longer questions and problems, each of which covers several facts and concepts.
(c) Compromise between (a) and (b), but leaning more towards (a).
(d) Compromise between (a) and (b), favoring both equally.
(e) Compromise between (a) and (b), but leaning more towards (b).

Part 3

DIRECTIONS: In each of the following items, you will read a short discussion between two students who disagree about some issue. Then you'll indicate whether you agree with one student or the other.

24. Brandon: A good science textbook should show how the material in one chapter relates to the material in other chapters. It shouldn't treat each topic as a separate "unit," because they're not really separate.
Jamal: But most of the time, each chapter is about a different topic, and those different topics don't always have much to do with each other. The textbook should keep everything separate, instead of blending it all together.
With whom do you agree? Read all the choices before circling one.
(a) I agree almost entirely with Brandon.
(b) Although I agree more with Brandon, I think Jamal makes some good points.
(c) I agree (or disagree) equally with Jamal and Brandon.
(d) Although I agree more with Jamal, I think Brandon makes some good points.
(e) I agree almost entirely with Jamal.

25. Anna: I just read about Kay Kinoshita, the physicist. She sounds naturally brilliant.
Emily: Maybe she is. But when it comes to being good at science, hard work is more important than "natural ability." I bet Dr. Kinoshita does well because she has worked really hard.
Anna: Well, maybe she did. But let's face it, some people are just smarter at science than other people. Without natural ability, hard work won't get you anywhere in science!
(a) I agree almost entirely with Anna.
(b) Although I agree more with Anna, I think Emily makes some good points.
(c) I agree (or disagree) equally with Anna and Emily.
(d) Although I agree more with Emily, I think Anna makes some good points.
(e) I agree almost entirely with Emily.

26. Justin: When I'm learning science concepts for a test, I like to put things in my own words, so that they make sense to me.
Dave: But putting things in your own words doesn't help you learn. The textbook was written by people who know science really well. You should learn things the way the textbook presents them.
(a) I agree almost entirely with Justin.
(b) Although I agree more with Justin, I think Dave makes some good points.
(c) I agree (or disagree) equally with Justin and Dave.
(d) Although I agree more with Dave, I think Justin makes some good points.
(e) I agree almost entirely with Dave.

27. Julia: I like the way science explains how things I see in the real world.
Carla: I know that's what we're "supposed" to think, and it's true for many things. But let's face it, the science that explains things we do in lab at school can't really explain earthquakes, for instance. Scientific laws work well in some situations but not in most situations.
Julia: I still think science applies to almost all real-world experiences. If we can't figure out how, it's because the stuff is very complicated, or because we don't know enough science yet.
(a) I agree almost entirely with Julia.
(b) I agree more with Julia, but I think Carla makes some good points.
(c) I agree (or disagree) equally with Carla and Julia.
(d) I agree more with Carla, but I think Julia makes some good points.
(e) I agree almost entirely with Carla.

28. Leticia: Some scientists think the dinosaurs died out because of volcanic eruptions, and others think they died out because an asteroid hit the Earth. Why can't the scientists agree?
Nisha: Maybe the evidence supports both theories. There's often more than one way to interpret the facts. So we have to figure out what the facts mean.
Leticia: I'm not so sure. In stuff like personal relationships or poetry, things can be ambiguous. But in science, the facts speak for themselves.
(a) I agree almost entirely with Leticia.
(b) I agree more with Leticia, but I think Nisha makes some good points.
(c) I agree (or disagree) equally with Nisha and Leticia.
(d) I agree more with Nisha, but I think Leticia makes some good points.
(e) I agree almost entirely with Nisha.

29. Jose: In my opinion, science is a little like fashion; something that's "in" one year can be "out" the next. Scientists regularly change their theories back and forth.
Miguel: I have a different opinion. Once experiments have been done and a theory has been made to explain those experiments, the matter is pretty much settled. There's little room for argument.
(a) I agree almost entirely with Jose.
(b) Although I agree more with Jose, I think Miguel makes some good points.
(c) I agree (or disagree) equally with Miguel and Jose.
(d) Although I agree more with Miguel, I think Jose makes some good points.
(e) I agree almost entirely with Miguel.

30. Jessica and Mia are working on a homework assignment together...
Jessica: O.K., we just got problem #1. I think we should go on to problem #2.
Mia: No, wait. I think we should try to figure out why the thing takes so long to reach the ground.
Jessica: Mia, we know it's the right answer from the back of the book, so what are you worried about? If we didn't understand it, we wouldn't have gotten the right answer.
Mia: No, I think it's possible to get the right answer without really understanding what it means.
(a) I agree almost entirely with Jessica.
(b) I agree more with Jessica, but I think Mia makes some good points.
(c) I agree (or disagree) equally with Mia and Jessica.
(d) I agree more with Mia, but I think Jessica makes some good points.
(e) I agree almost entirely with Mia.

Appendix C: Nature of Scientific Knowledge Scale

Instructions:
• We are asking you to complete this inventory to assist us in assessing student conceptions relating to the Nature of Science (NOS).
• The data collected will be handled anonymously throughout.
• This inventory cannot affect your grade; it can only help improve it.
• The inventory consists of 48 statements, with several paired statements.
• Calculators are not needed for these questions.
• Please do not write on this inventory.
• Bubble your choices on the scantron using pencil.
• Write in and bubble your study ID number (U #).
• Write NSKS inventory on the signature line with the date.
• Turn in both the inventory and the scantron.

Nature of Scientific Knowledge Scale (NSKS)
(Rubba, P. A., & Anderson, O., 1978)

Response scale: 1 = Strongly Agree, 2 = Agree, 3 = Neutral, 4 = Disagree, 5 = Strongly Disagree

1. Scientific laws, theories and concepts do not express creativity.
2. Scientific knowledge is stated as simply as possible.
3. The laws, theories and concepts of biology, chemistry and physics are related.
4. The applications of scientific knowledge can be judged good or bad, but the knowledge itself cannot.
5. It is incorrect to judge a piece of scientific knowledge as being good or bad.
6. If two scientific theories explain a scientist's observations equally well, the simpler theory is chosen.
7. Certain pieces of scientific knowledge are good and others are bad.
8. Even if the applications of a scientific theory are judged to be good, we should not judge the theory itself.
9. Scientific knowledge need not be capable of experimental test.
10. The laws, theories and concepts of biology, chemistry and physics are not linked.
11. Consistency among test results is not a requirement for the acceptance of scientific knowledge.
12. A piece of scientific knowledge will be accepted if the evidence can be obtained by other investigators working under similar conditions.
13. The evidence for scientific knowledge need not be open to public examination.
14. Scientific laws, theories and concepts are not stated as simply as possible.
15. There is an effort in science to build as great a number of laws, theories and concepts as possible.
16. We accept scientific knowledge even though it may contain error.
17. Scientific knowledge expresses the creativity of scientists.

18. Moral judgment can be passed on scientific knowledge.
19. The laws, theories and concepts of biology, chemistry and physics are not related.
20. Scientific laws, theories and concepts express creativity.
21. It is meaningful to pass moral judgment on both the application of scientific knowledge and the knowledge itself.
22. The evidence for scientific knowledge must be repeatable.
23. Scientific knowledge is not a product of human imagination.
24. Relationships among the laws, theories and concepts of science do not contribute to the explanatory and predictive power of science.
25. The truth of scientific knowledge is beyond doubt.
26. Today's scientific laws, theories and concepts may have to be changed in the face of new evidence.
27. We do not accept a piece of scientific knowledge unless it is free of error.
28. A scientific theory is similar to a work of art in that they both express creativity.
29. There is an effort in science to keep the number of laws, theories and concepts at a minimum.
30. The various sciences contribute to a single organized body of knowledge.
31. Scientific beliefs do not change over time.
32. Scientific knowledge is a product of human imagination.
33. The evidence for a piece of scientific knowledge does not have to be repeatable.
34. Scientific knowledge does not express the creativity of scientists.
35. Biology, chemistry and physics are similar kinds of knowledge.
36. If the applications of a piece of scientific knowledge are generally considered bad, then the piece of knowledge is also generally considered to be bad.

37. Scientific knowledge is subject to review and change.
38. Scientific laws, theories and concepts are tested against reliable observations.
39. If two scientific theories explain a scientist's observations equally well, the more complex theory is chosen.
40. Scientific knowledge is specific as opposed to comprehensive.
41. Scientific theories are discovered, not created by man.
42. Those scientific beliefs which were accepted in the past, and since have been discarded, should be judged in their historical context.
43. Scientific knowledge is unchanging.
44. Biology, chemistry and physics are different kinds of knowledge.
45. Consistency among test results is a requirement for the acceptance of scientific knowledge.
46. Scientific knowledge is comprehensive as opposed to specific.
47. The laws, theories and concepts of biology, chemistry and physics are interwoven.
48. A piece of scientific knowledge should not be judged good or bad.

449 Appendix D: Initial Laboratory Work Questionnaire Initial Questionnaire on Laboratory Work Study ID #_________________________ Part 1This section explores what you think about laborato ry work. (Please check the box that best describes your level of agreement wit h each statement). I think that laboratory work Agreement Level Strongly Agree Agree Neither Disagree Strongly disagree 1. is overdone in my studies 2. is an important part of my studies 3. has helped me to understand scientific theories 4. is more enjoyable if I work on an experiment in conjunction with others 5. is preferable if I work on an experiment by myself 6. is something I am confident about 7. is something I find difficult 8. should be included in program 9. should be optional in program


450 Appendix D (Continued)

Part 2 – This section concerns how confident you feel about the skills and knowledge you may possess at the start of your laboratory course. 'Very high' means you think you could teach someone else the skill; 'high', you could certainly do it yourself; 'neither high nor low', you are unsure whether you could do it yourself; 'low', you probably couldn't do it; and 'very low', you certainly couldn't do it. (Please check the box that best describes your level of confidence about each statement: Very High, High, Neither, Low, Very Low.)

I can
10. Follow laboratory instructions
11. Assemble apparatus-equipment
12. Take numerical readings accurately
13. Plan experiments
14. Plot graphs of numerical results
15. Analyze graphs of numerical results
16. Process data reliably
17. Estimate uncertainties in numerical results
18. Report observations accurately
19. Interpret observations reliably
20. Assess health and safety risks
21. Understand theories underlying experiments
22. Write good scientific reports

Part 3 – Laboratory Skills
Of the skills below, which three to five do you regard as most important? (Please check three to five below.)

Skill
1. Follow instructions
2. Assemble apparatus
3. Take numerical readings accurately
4. Plan experiments
5. Plot graphs of numerical results
6. Analyze graphs of numerical results
7. Process data reliably
8. Estimate uncertainties in numerical results
9. Report observations accurately
10. Interpret observations reliably
11. Assess health and safety risks
12. Understand theories underlying experiments
13. Write good scientific reports


451 Appendix E: Student Evaluation of Laboratory Instruction

Student Reflective Evaluations on Laboratory Instruction – Section 1

Lab Title ____________________________          Study ID # ____________

For the three features and their sub-features, please indicate, by checking the appropriate box, how helpful you found each of the following pedagogical features with respect to your understanding of, and the necessity of, the laboratory learning experience, if applicable. Use 1 to indicate the feature was not essential and 5 to indicate you considered the feature extremely essential to your understanding and to the laboratory learning experience.

Rating scale: 1 = Least Essential   2 = Somewhat Essential   3 = Essential   4 = Very Essential   5 = Extremely Essential

Feature
1. Pre-lab: a. Lab Manual   b. Quiz   c. Questions-FC   d. Discussion   e. Technology (e.g. BB, PRS)
2. Lab Work: a. Lab Manual   b. Group Discussions   c. Lab NB   d. Technology (e.g. MBL)   e. Bench Work
3. Post-lab: a. Lab Manual   b. Lab NB   c. Discussion   d. Technical Writing-Analysis   e. Technology (e.g. BB)


452 Appendix E (Continued)

For questions 7-8, please respond in the space provided with respect to your understanding and the necessity of the laboratory learning experience.

7. How do the instructional methods (e.g. pre-lab, post-lab, technology, and laboratory notebook) used in these chemistry laboratory activities compare with other science laboratory activities you have experienced? Explain.

8. What have you learned, if anything, concerning the nature of science (i.e. chemistry) with respect to the instructional methods? Concerning your epistemological beliefs with respect to the instructional methods? Explain.


453 Appendix E (Continued)

Student Reflections of Pre-Post Laboratory Experiences Questionnaire – Section 2

Choose one statement for each topic that best describes your perceptions regarding the pre- and post-laboratory methods. You may make comments on the back of the questionnaire.

Achievement
A. I feel that I achieve more in my learning if I do the experiment after participating in a pre-lab discussion.
B. I feel that I achieve more in my learning if I do the experiment then participate in a post-lab discussion.
C. No clear difference

Difficulty
A. It is more difficult to perform an experiment before it is discussed.
B. It is more difficult to perform an experiment after it is discussed.
C. Initially, it was more difficult to perform an experiment before it was discussed, but now I prefer to discuss the experiment after I have performed it.
D. Doing the experiment before or after I participated in the discussion made no clear difference.

Enjoyment
A. Overall, I enjoy the laboratory more if I do the experiment before a discussion.
B. Overall, I enjoy the laboratory more if I do the experiment after the discussion.
C. No clear difference

Understanding
A. I understand the connection between theory and practice well if I do the experiment first and then participate in a discussion.
B. I understand the connection between theory and practice well if I do the experiment after I participate in the discussion.
C. No clear difference


454 Appendix E (Continued)

Reflective Self-assessment of Laboratory Learning – Section 3

DIRECTIONS: For each of the following items, please read the statement and circle the answer that best describes the kind of learning you believe you gained by doing this laboratory activity. Then briefly reflect on your choices in the space provided below each statement by identifying situations in this particular activity that modeled each learning category.

1. Knowledge: (i.e., to recall, describe, identify facts, terms, or phenomena)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections

2. Comprehension: (i.e., to interpret, predict, explain so others understand)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections

3. Application: (i.e., to solve, apply, use concepts or learning in other situations)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections


455 Appendix E (Continued)

4. Analysis: (i.e., to analyze, troubleshoot, distinguish concepts through reasoning)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections

5. Synthesis: (i.e., to create, integrate, design patterns, create new meaning of concepts)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections

6. Evaluation: (i.e., to compare, contrast, justify solutions or value of concepts)
A: Nothing   B: A Little   C: Some   D: A Lot   E: Very Much
Reflections


456 Appendix F: Interview Formats/Scripts

Initial Interview Questions

Potential Prompt/Probe questions:
What do you mean by _______?
Can you expand on your answer for me?
Can you give me an example of what you mean?
Can you give me a view that you think is wrong?

Now I would like your beliefs/views on the following statements and/or questions. This is not about right or wrong responses; however, you need to respond with more than just yes or no, offering supporting statements and examples.

Personal Epistemological Beliefs in Science

Q-1 Structure of Scientific Knowledge
Chemistry knowledge is a bunch of weakly connected pieces without much structure, consisting mainly of facts and formulas.
Chemistry knowledge is coherent, conceptual, highly structured, and a unified whole.

Q-2 Nature of Knowing and Learning Science
Learning science (chemistry) consists mainly of absorbing information.
Learning science relies on constructing one's own understanding, working actively through the material, relating new material to prior experiences/intuitions/knowledge, and reflecting upon and monitoring one's understanding.

Q-3 Real-life Applicability
Scientific knowledge and scientific ways of thinking apply only to the classroom and laboratory settings, not to real life.

Q-4 Evolving Knowledge
All scientific knowledge is set in stone. There is no difference between scientific evidence-based reasoning and mere opinion.


457 Appendix F (Continued)

Sometimes different science instructors give different explanations for scientific events/concepts/phenomena. When 2 instructors explain the same thing differently, can one be more correct than the other? Explain.
When 2 explanations are given for the same situation, how would you go about deciding which explanation to believe? Please give details and examples.
Can one ever be sure of which explanation to believe? If so, how can you? If not, why not?

Q-5 Source of Ability to Learn
Being good at learning and doing science is mostly a matter of fixed natural ability, so most people cannot become better at learning and doing science.

Nature of Science
There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statements:

Q-6 Creative
Scientific theories and models are products of the human mind and may or may not accurately represent reality.

Q-7 Developmental
Scientific knowledge is a changing and evolving body of concepts and theories.

Q-8 Parsimonious
The ultimate goal of science is to gather all the complex facts about natural phenomena.

Q-9 Testable
The scientific method will eventually let people learn the real truth about the natural world and how it works.


458 Appendix F: (Continued)

Final Interview Question Format – Instructional

I would like your beliefs/views on the following statements and/or questions. This is not about right or wrong responses; however, you need to respond with more than just yes or no, offering supporting statements and examples.

Potential Prompt/Probe questions:
What do you mean by _______?
Can you expand on your answer for me?
Can you give me an example of what you mean?
Can you give me a view that you think is wrong?

1. What instructional feature (pre-lab, laboratory work, or post-lab) was the most effective in promoting your learning in this course?
2. What instructional feature (pre-lab, laboratory work, or post-lab) was the least effective in promoting your learning in this course?
3. What could you have done differently to promote your learning?
4. What are the most important skills you learned in chemistry laboratory?
5. Of the skills below, rank in order the five you now regard as the most important.

Skill
1. Follow instructions
2. Assemble apparatus
3. Take numerical readings accurately
4. Plan experiments
5. Plot graphs of numerical results
6. Analyze graphs of numerical results
7. Process data reliably
8. Estimate uncertainties in numerical results
9. Report observations accurately
10. Interpret observations reliably
11. Assess health and safety risks
12. Understand theories underlying experiments
13. Write good scientific reports


459 Appendix F: (Continued)

6. How would you rank the following aspects of pre-laboratory? (Use each category level only once: 1 = Least Essential, 2 = Somewhat Essential, 3 = Essential, 4 = Very Essential, 5 = Extremely Essential.)
Pre-lab: a. Lab Manual   b. Quiz   c. Questions   d. Discussion   e. Technology (e.g. BB, PRS)

7. How would you rank the following aspects of laboratory work? (Use each category level only once: 1 = Least Essential, 2 = Somewhat Essential, 3 = Essential, 4 = Very Essential, 5 = Extremely Essential.)
Lab Work: a. Lab Manual   b. Group Discussions   c. Lab NB   d. Technology (e.g. MBL)   e. Bench Work

8. How would you rank the following aspects of post-laboratory analysis? (Use each category level only once: 1 = Least Essential, 2 = Somewhat Essential, 3 = Essential, 4 = Very Essential, 5 = Extremely Essential.)
Post-lab: a. Lab Manual   b. Lab NB   c. Discussion   d. Technical Writing-Analysis   e. Technology (e.g. BB)


460 Appendix F: (Continued)

9. Describe the role and significance of the laboratory notebook in any scientific workplace (e.g. classroom, research laboratory, hospital, pharmacy).

10. Describe the role and significance of the scientific laboratory report/analysis in any scientific workplace (e.g. classroom, research laboratory, hospital, pharmacy).

11. What three of the six learning skill levels in Bloom's Taxonomy did you utilize most often in this course?

Epistemological Beliefs Final Interview

Epistemological beliefs are individuals' beliefs about the nature and structure of knowledge. Personal beliefs about what knowledge is and how we understand, integrate and apply knowledge (known as personal epistemologies) are entrenched in the process of learning science. In this case the interview is used specifically to probe the epistemological stances of students taking physics, chemistry, or physical science.

I would like your beliefs/views on the following statements and/or questions. This is not about right or wrong responses; however, you need to respond with more than just yes or no, offering supporting statements and examples.

1. Structure of Scientific Knowledge – weakly connected without much structure versus strongly connected and highly structured
What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Structure of Scientific Knowledge in this course?

2. Nature of Knowing and Learning in Science – consists mainly of absorbing/memorizing information and facts versus relies on constructing one's own understanding by relating new material to prior knowledge, prior experiences, and actively working through the material
What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Nature of Knowing and Learning in Science in this course?


461 Appendix F: (Continued)

3. Real Life Applicability of Science – scientific knowledge is restricted to the classroom and laboratory versus applies to everyday real-life situations such as one's home, automobile, diet, and health
What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Real Life Applicability of Science in this course?

4. Evolving Knowledge of Science – from the point of view that all scientific knowledge is set in stone to the belief that there is no distinction between evidence-based reasoning and mere opinion
What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Evolving Knowledge of Science in this course?

5. Source of Ability to Learn Science – that learning science is a matter of fixed natural ability versus that most individuals can learn science if they want to
What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Source of Ability to Learn Science in this course?

Nature of Science Final Interview

Typically, the Nature of Science (NOS) has been used to refer to the epistemology of science, science as a way of knowing, or the values and beliefs inherent to the development of scientific knowledge. The NOS refers to one's understanding of the social practices and organization of science and how scientists collect, interpret, and use data to guide further research (Ryder, Leach, & Driver, 1999).

I would like your beliefs/views on the following statements and/or questions. This is not about right or wrong responses; however, you need to respond with more than just yes or no, offering supporting statements and examples.

1. Scenario Problem
Some scientists believe that explanations of chemical phenomena, such as atomic theory, are accurate and true descriptions of atomic structure. Other scientists say that we cannot know whether or not these theories are accurate and true, but that scientists can only use such theories as working models to explain what is observed.
What do you think about this statement? How did you come to hold that point of view or answer? On what do you base that point of view or answer?


462 Appendix F: (Continued)

2. What instructional feature (pre-lab, laboratory work, or post-lab), if any, do you believe influenced your beliefs about the Nature of Science in this course?


463 Appendix F: (Continued)

Example 1: Nature of Science Interview (Carey et al., 1989; Sandoval & Morrison, 2003)

Goals of Science
1. What do you think science is all about?
2. What do you think the goal of science is?
3. What do you think scientists do?
3a. How do they achieve their goals?

Types of Questions
4. Do you think scientists ask questions?
4a. What sorts of questions do you think scientists ask? If No, go to question 6.
5. How do scientists answer their questions?
5a. Can you give an example of a scientist's question and what he or she would do to answer it?

Nature and Purpose of Experiments
6. What is an experiment?
7. Do scientists do experiments?
7a. If No, skip to question 10.
8. Why do scientists do experiments?
8a. If "to test ideas," then ask: How does the test tell the scientist something about the idea?

Roles of Ideas: Conceptions of Hypotheses and Theories
9. How does a scientist decide what experiment to do?
10. Have you ever heard the word "hypothesis"?
10a. If No, explain: A hypothesis is an idea scientists have, an idea about how an experiment would turn out.
10b. If Yes, ask: What is a hypothesis?
10c. If "educated guess" or "guess," then ask: Do you think a hypothesis is the same as a guess or do you think there is a difference? What is the difference?
11. Do you think a scientist's ideas influence the experiments he or she does?
11a. If Yes: How?
11b. If No: Do scientists ever test their ideas?
12. How do you think scientists come up with their ideas?
13. Have you ever heard the word theory?
13a. If Yes: What is a theory? Do you think scientists have theories?
In all cases, explain: "A theory is a general idea about how and why things happen the way they do. For example, biology is a theory about living things."


464 Appendix F (Continued)

14. Do you think a scientist's theory influences his or her ideas about specific experiments?
14a. How?

Unexpected Results and Disproving Ideas
15. If a scientist does an experiment and the results are not as he or she expected, would the scientist consider this a bad result?
15a. Why or why not?
15b. Can they learn anything from this?
15c. What?
16. Say a scientist is going to do an experiment to test his or her idea. Would a scientist do an experiment that might prove this idea is wrong?
16a. Why or why not?

Nature of Change Processes
17. What happens to a scientist's ideas once he has done a test?
18. Do scientists ever change their ideas?
18a. If Yes: When would they do that and why?
19. Do scientists ever change their whole theories?
19a. If Yes: When would they do that and why?

Achieving Goals and Making Mistakes
20. Do scientists always achieve their goals?
20a. If not, why not?
21. Can scientists make mistakes or be wrong?
21a. How?


465 Appendix F (Continued)

Example 2: Potential Interview Script – NOS

There are many differing views or images of the nature of science and scientific knowledge. I would like your views on the following statements:

1. Scientific knowledge is a changing and evolving body of concepts and theories.
Potential Prompts: Can you expand on your answer for me? Can you give me an example of what you mean? Can you give me a view of scientific knowledge that you think is wrong?

2. The scientific method will eventually let people learn the real truth about the natural world and how it works.
Potential Prompts: Can you expand on your answer for me? Can you give me an example of what you mean? Can you give me a view that you think is wrong?

3. Theories and models are products of the human mind and may or may not accurately represent reality.
Potential Prompts: Can you expand on your answer for me? Can you give me an example of what you mean? Can you give me a view that you think is wrong?

4. The ultimate goal of science is to gather all the facts about natural phenomena.
Potential Prompts: Can you expand on your answer for me? Can you give me an example of what you mean? Can you give me a view that you think is wrong?


466 Appendix F (Continued)

Example 3: Potential Final Interview Question Format

To assess perceived changes in student views of the nature of science and their personal epistemology as related to laboratory instruction and corresponding attributes, participants are asked to elaborate on and explain responses from other measures (i.e. CCI, NSKS, EBAPS, and laboratory questionnaire) and the first interview. Participants are asked:

1. Have your views or level of understanding of the nature of science changed in any way from your views at the start of the semester?
   • If so, how?
2. How, if at all, has the laboratory experience influenced your views on the nature of science?
3. If the response to #2 is negative, yet views have changed:
   • To what do you attribute the change in your views or level of understanding?
4. If the responses to #1 and #2 are negative:
   • Why do you think your views or level of understanding of the nature of science have been stable?
5. Consider the laboratory instructional experience, the laboratory notebooks, and other instructional sessions.
   • Do you think any of these components of the laboratory influenced your views of the nature of science? Your personal epistemology?
   • If so, what components? How? And why?
6. Can you recall examples or specific instances that you feel had an influence on your understanding? Explain.


467 Appendix G: Sample Laboratory Work

Example 1
Data Analysis: Accuracy, Precision, Uncertainty, Significant Figures, Error, and Data Collection

General Procedure
The following activities will allow you to apply the principles of accuracy, precision, error, significant figures, and uncertainty to a practical situation that will familiarize you with linear, volume, and mass measurements. The exercises will help you develop the dexterity required to accurately use measurement tools.

Methods
A. Visit the applicable web sites for this topic located on Blackboard under "Web Resources"; read over each and download as needed. Record all observations, measurements, calculations, etc. in the lab notebook.
B. Mass Measurements – Record the letter of the bars. Use an electronic balance to weigh three bars. Refer to the electronic balance web site. Weigh the three bars and record the mass of each to the nearest 0.01 g.
C. Length Measurements – Use a metric ruler to measure the length, width, and height of the three bars. Measure and record the value of each to the nearest 0.1 cm. Convert all values to inches.
D. Liquid Volume Measurements 1 – Fill a 10.0 mL graduated cylinder nearly full with water. Record the volume. Pour the water into a pre-weighed small beaker (±0.01 g). Mass the beaker and water and record. Repeat 3 more times with fresh water, recording the volume and re-massing the beaker each trial.
E. Liquid Volume Measurements 2 – Fill a 50.00 mL container about one-quarter full (~12.50 mL). Deliver the 12.50 mL of water into a pre-weighed small beaker (±0.01 g). Mass the beaker and water and record. Repeat 3 more times with fresh water and re-mass the beaker each trial.
F. Density of a Solid – Using your results from B and C, determine the density of each bar in metric units and English units. Collect the class density data for those lab groups that used the same metal as you. The accepted density will be posted on BB after all lab sections have performed the lab. A worked sketch of this calculation follows below.
G. Predicting unit divisions of metric rulers and determining instrument precision.
H. Graphing Analysis – Find the Relationship: An Exercise in Graphing Analysis. In several laboratory investigations you do this year, a primary purpose will be to find the mathematical relationship between two variables. For example, you might want to know the relationship between the pressure exerted by a gas and its temperature. In one experiment you do, you will be asked to determine the relationship between the volume of a confined gas and the pressure it exerts. A very important method for determining mathematical relationships in laboratory science makes use of graphical methods.
I. Physical Properties of Matter with Vernier – MBL
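For Part F, a small worked sketch may help show how the Part B mass and Part C dimensions combine into a density in both unit systems; the bar mass and dimensions below are made-up illustrative values, not course data, and the gram-to-pound and centimeter-to-inch factors are the standard conversion constants.

```python
# Illustrative density calculation for Part F (made-up measurements, not course data).
mass_g = 47.23                                  # hypothetical balance reading, g
length_cm, width_cm, height_cm = 6.2, 2.5, 1.1  # hypothetical ruler readings, cm

volume_cm3 = length_cm * width_cm * height_cm   # rectangular bar: V = L x W x H
density_metric = mass_g / volume_cm3            # g/cm^3

# English units: 1 lb = 453.59 g and 1 in = 2.54 cm
density_english = (mass_g / 453.59) / (volume_cm3 / 2.54**3)  # lb/in^3

print(f"Density = {density_metric:.2f} g/cm^3 ({density_english:.3f} lb/in^3)")
```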


468 Appendix G (Continued)

Example 2
Starting Vernier Logger Pro and Preparing to Collect Data
• Locate the Logger Pro icon on your computer and double-click on it, or use the Start menu (Windows 95/98/2000/NT).
• An important feature of LabPro is its ability to detect auto-ID sensors and automatically set up an experiment. The computer will attempt to communicate with LabPro.
• Select the correct port and click Scan.
• If you have connected a Stainless-Steel Temperature Probe and the computer has detected the LabPro interface, you will see the following screen, which shows a graph of Temperature vs. Time.
• Notice how the program automatically identified the temperature probe (an auto-ID sensor).
• The current temperature reading is displayed in the status bar at the bottom of the screen.
• The default data collection mode is time graph. In this example, you have a Temperature Probe, reading in Celsius, and collecting data as a function of time for 120 seconds.
• If you now disconnect the Temperature Probe, connect a different auto-ID sensor, and choose New from the File menu, Logger Pro will set up a new experiment for the new sensor.

Auto-ID Sensor Activity
• Plug the Stainless-Steel Temperature Probe into channel CH 1 on LabPro, and lay the temperature probe on the tabletop.
• Start the Logger Pro software. Logger Pro will detect the auto-ID sensor and set the data collection parameters and computer display.
• In this case, the collection parameters are 1.0 sample per second and 120 samples.
• The program displays a graph and data table on the computer.
• The vertical axis of the graph will have temperature scaled from 0 to 100 °C.
• The horizontal axis will have time scaled from 0 to 120 seconds; change to the appropriate scale as needed.
• You are ready to collect data; click Collect to begin data collection.
• Wait about 10 seconds and place the Temperature Probe into the solution.
• Allow Logger Pro to complete data collection.
• Notice that the sensor does not read the new temperature instantly; it takes a moment to respond.
• Now that the run is complete, pull down the Analyze menu and choose Examine.
• The cursor will become a vertical line. As you move the cursor across the screen, temperature and time values corresponding to the cursor position will be displayed. Move the cursor to the point when the probe was first placed in the solution.
• Record that time.
• Move the cursor to find the highest temperature, and record that time.
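As a rough illustration of the Examine step, an exported run can also be inspected outside Logger Pro. The sketch below assumes the run has been saved as a two-column text file named temp_run.csv (time in seconds, temperature in °C); that file name and the 0.5 °C response threshold are illustrative assumptions, not part of the Vernier instructions.

```python
import csv

# Inspect an exported time/temperature run (hypothetical file temp_run.csv:
# column 0 = time in s, column 1 = temperature in degrees C).
times, temps = [], []
with open("temp_run.csv") as f:
    for row in csv.reader(f):
        times.append(float(row[0]))
        temps.append(float(row[1]))

baseline = temps[0]
# First time the reading rises noticeably above the starting (tabletop) value;
# the 0.5 degree threshold is an arbitrary illustrative choice.
placed_at = next(t for t, T in zip(times, temps) if T > baseline + 0.5)

peak_temp = max(temps)
peak_at = times[temps.index(peak_temp)]

print(f"Probe response began near t = {placed_at} s")
print(f"Highest temperature {peak_temp} C at t = {peak_at} s")
```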


469 Appendix G (Continued)

Classification of Chemical Reactions & Mass-to-Mole Calculations
(Adapted from USF Tampa Campus Lab Manual – Lab Trek (1997))

Example 3
Part A. The Classification of Chemical Reactions
• Write and balance the chemical reaction for each reaction performed
• Note any temperature changes (exothermic versus endothermic)
• Record all observations in your lab notebook

1. Synthesis or Combination reaction: Obtain a short length (~1.5 cm) of magnesium metal ribbon. Note and record its physical properties. Holding the ribbon with tongs over a watch glass, bring it into contact with a lighted match or portable burner flame. Hold the end of the Mg ribbon in the flame until it ignites. What do you observe? Do not stare directly at the flame. Has a chemical reaction occurred? How do you know? What is the name of the product?

2. Decomposition reaction:
2a. Demonstrated by instructor – Volcano
Obtain a small vial of ammonium dichromate from your instructor. Place the compound on a watch glass or in a beaker so that it forms a small, cone-shaped pile. Ignite the apex of the cone using the Bunsen burner flame. Withdraw the flame as soon as the material begins to burn. What do you observe? How do the physical properties of the reactants and products compare? What was the hissing sound? CAUTION: do not touch the hot watch glass or beaker with your hands. Where did all this heat come from?

2b. Elephant's Toothpaste
Perform in the sink with the graduated cylinder sitting in the center. Add a few drops of food coloring and ~2.0 mL of dish soap to the graduated cylinder. Carefully add ~15 mL of 30% hydrogen peroxide to the graduated cylinder. Carefully and slowly, avoiding the sides of the graduated cylinder, add ~2.4 g of the catalyst (KI, NaI, or MnO2). What do you observe? How do the physical properties of the reactants and products compare? CAUTION: very carefully touch the graduated cylinder with your hands. Where did all this heat come from? Has a chemical reaction occurred? How do you know? What is the name of the product(s)?


470 Appendix H: Sample Pre-Laboratory Activities

Example 1
The research and development section of a liquid refreshment factory on the planet of Molborg received an unlabeled box with unlabeled containers of one of their new potential products. In order to determine the identity of the substances in the unlabeled box, the laboratory ran tests to determine the percent sugar concentration and density of the unlabeled unknown and compared the results to their known values for the new products.

a. Given the data below, determine the identity of the unlabeled potential product by comparing and contrasting the experimental data with the known data. Justify your choice mathematically by answering the questions and performing the necessary calculations on the following pages.

b. Consider the following: calculate the means of the experimental values for each variable; use the estimated uncertainty method to determine the range in the experimental values for both variables; carefully consider the entire data set and report the "best value" for the density and % sugar of the unknown substance. Identify the unknown substance from the list of known substances in the table. Comment on how/why you arrived at this choice. (A sketch of one way to organize the calculation follows the data.)

Experimental Data
Unknown Sample   % Sugar   Density (g/mL)
1                12.23     1.038
2                12.13     1.040
3                12.26     1.046
4                12.18     1.044
AVG              ______    ______

Known Data
Known Product    % Sugar   Density (g/mL)
Tropical OJ      12.18     1.044
Duck OJ          12.28     1.046
Hour OJ          12.21     1.042
Fresh OJ         12.03     1.038
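One way the part (b) calculation could be organized is sketched below: the mean and half-range ("estimated uncertainty") of the four trials are computed for each variable and then compared with the known products. The numbers are simply the values from the tables above; the closeness comparison at the end is an illustrative choice, not the required solution method.

```python
# Illustrative sketch of the Example 1 pre-lab calculation.
unknown = {"% sugar": [12.23, 12.13, 12.26, 12.18],
           "density": [1.038, 1.040, 1.046, 1.044]}
known = {"Tropical OJ": (12.18, 1.044), "Duck OJ": (12.28, 1.046),
         "Hour OJ": (12.21, 1.042), "Fresh OJ": (12.03, 1.038)}

best = {}
for variable, trials in unknown.items():
    mean = sum(trials) / len(trials)
    half_range = (max(trials) - min(trials)) / 2   # estimated uncertainty method
    best[variable] = mean
    print(f"{variable}: {mean:.3f} +/- {half_range:.3f}")

# One possible comparison: which known product lies closest to both means?
closest = min(known, key=lambda p: abs(known[p][0] - best["% sugar"])
                                   + abs(known[p][1] - best["density"]))
print("Closest known product:", closest)
```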


471 Appendix H (Continued)

Example 2
Differences in Values from Measurement 1 (Adapted from Leach et al., 1998)

Two groups of chemical nutritionists have been asked to measure the mass of 100.0 cm3 of nut oil. Each group takes nine samples of 100.0 cm3 of the oil from a large container and weighs each sample. These are their results, after having sorted them into ascending order:

Mass of 100.0 cm3 of Peanut Oil (g)
Trial     Group A   Group B
1         81.9      84.9
2         83.5      85.7
3         86.5      86.6
4         87.1      86.9
5         87.3      87.0
6         87.5      87.3
7         87.5      88.2
8         90.5      88.5
9         92.1      88.8
Average   87.1      87.1

1. What should Group A state as their result for the mass of 100.0 cm3 of nut oil? Please write your answer in the box below.

2. In the box below, briefly explain your reasoning.


472 Appendix H (Continued)

Example 3 – Pre-Laboratory Questions – Quiz Questions

1) Using the scientific literature sources-handbooks listed in the background reading in the lab handout (located in the library), answer the following using the sources listed in the reading. Do not use the Internet, and properly cite all sources.
a. What is the melting point of naphthalene? Source:
b. Identify synonym(s) for naphthalene. Source:
c. Describe the hazards-precautions (MSDS) of using naphthalene. Source:

2) Based on the Law of Conservation of Mass, calculate how many grams of oxygen are needed in the following reaction if 12.43 g of magnesium is consumed and 34.54 g of MgO is produced: Mg + O2 → MgO. Explain your results.

3) ___AlCl3(aq) + ___NH4OH(aq) → ___Al(OH)3(s) + ___NH4Cl(aq)
If 24.5 g of AlCl3 are treated with excess NH4OH, how many grams of NH4Cl are produced? Assume 100% of the reactant is converted to product. Show work, etc.

4) The density of olive oil is 0.79 g/mL. What is the volume of 300.0 g? (D = M/V) Show work and report this value to the correct number of significant figures with units.

5) Complete the following conversion: 520 kg of chocolate into lb, if 1 kg = 2.20 lb. Show all work and report this value to the correct number of significant figures with units.

6) Identify the reactants to be used in the elements, compounds and mixtures lab.

7) Predict the products for the following double replacement reaction: K3PO4 + BaCl2 → ? + ?

8) Tin(II) fluoride (SnF2), also known as stannous fluoride, is added to some dental products to help prevent cavities. How many grams of tin(II) fluoride can be made from 55.0 g of hydrogen fluoride, HF, if there is an excess of tin (Sn)? Sn(s) + 2HF(aq) → SnF2(aq) + H2

9) A blacksmith dropped a 2.00 kg piece of steel (iron, s(Fe) = 0.449 J/g·°C) into water, which was initially at 25.0 °C, and waited until the steel temperature was the same as the final temperature of the water (88.6 °C). Determine the mass of the water if the initial temperature of the steel was 800.0 K. (The heat capacity of water is 4.1814 J/g·°C.)


473 Appendix H (Continued)

10) Given the following incomplete redox reaction, balance this equation in ACID solution with the set of smallest whole-number coefficients:
MnO4-1(aq) + Br-1(aq) → Mn2+(aq) + Br2(g)

11) Two clear solutions are poured together. A pale blue, chalky material is formed which sinks to the bottom of the test tube. The test tube becomes cold. The substances in the blue material cannot be separated from each other by physical means. What type of change is described in the paragraph? Explain.

12) A blue crystalline material is heated strongly in a test tube. A clear liquid condenses around the mouth of the tube and the crystals gradually lose their blue color and become a white powder. Every gram of blue crystal produces 0.36 g of clear liquid and 0.64 g of colorless powder. The same weight-mass relationships are observed for samples of the crystals taken from many different sources. These observations would be consistent with a hypothesis that the blue crystals are? Explain.


474 Appendix I: Keeping a Laboratory Notebook

General Information: Why use a laboratory notebook?

"A laboratory notebook is one of a scientist's most valuable tools. It contains the permanent written record of the researcher's mental and physical activities from experiment and observation, to the ultimate understanding of physical phenomena. The act of writing in the notebook causes the scientist to stop and think about what is being done in the laboratory. It is in this way an essential part of doing good science."
from Writing the Laboratory Notebook by Howard M. Kanare; American Chemical Society, 1985

• Always write in the lab notebook in PEN with permanent blue or black ink.
• Do not write in pencil or erasable ink. Do not write with felt tip or colorful gel pens.
• Use a single line or X to cross out a mistake, and write the correct word or number next to it. Initial the cross-out. Example: misttake (mistake) FB
• Do not use white out or scribble out mistakes.
• You must practice real-time entry of data, observations, and steps in the lab. In other words, record data directly into your notebook.
• Ask the instructor to review and sign your data pages BEFORE you leave lab each day that you collect data.
• Do not write on scratch paper and copy into the notebook later. This could result in the loss or confusion of data and makes the validity of your data suspect. Lab reports will NOT be accepted and you will receive no credit for an experiment if you do not practice real-time entry.
• Organize data tables before you begin collecting data.
• Clearly label and organize each section of your report.
• Clearly label all data tables, calculations, and graphs.
• Keep the Table of Contents up-to-date.
• Remove only pages marked COPY from your notebook. Do not remove the original pages, even if you mess them up. Removing pages makes your data suspect.
• Write lab reports for an audience of upper-level college science majors whose education has included chemistry. Write for an audience that you assume has not read the lab handout but has a solid knowledge base in science.
• Neatness and legibility are important. We must be able to easily read what you write. Therefore, leave space between the components.

Subject to change(s) at the discretion of the instructor.


475 Appendix I (Continued)

Keeping a Laboratory Notebook

The laboratory notebook, along with proper dress, is the "ticket" into and out of lab. Without the laboratory notebook you will not be admitted to lab, and a grade of zero for that lab will be recorded. Have laboratory notebook pages signed prior to leaving lab.

Your carbon-copy notebook should include the following sections:
• Table of Contents – Using the inside front cover of the lab notebook, fill out after every lab activity.
• Lab Title Heading: Fill in all the heading boxes on the first page of the Lab Report Section. Subsequent pages should include your name and the title of the activity.
• Purpose with Predictions – Brief description of experimental goal(s) and any necessary predictions (hypothesis). Some predictions will be made prior to and some after collection of data.
• Procedure – Do not copy the procedure; instead, properly cite the lab manual, create a modified flow-chart of the procedure (unless told otherwise), and list any modifications/changes, waste disposal, and suggestions made to the procedure identified in class (on board/discussion).
• Notes taken in pre-lab discussion
• Raw Data / Observations – This section is a record of what you do and observe as you perform the experiment.
  1. Quantitative data (numerical measurements) must be recorded with units in appropriate tables.
  2. Qualitative data (observations) – colors, textures, evolution of gases, precipitations, etc. – should be recorded here as well.
  3. All data taken in lab must be recorded in pen directly in the lab notebook. Include titles, headings, units, etc., on your original tables and any reorganized tables.
• Data Analysis – Calculations / Results
  1. Calculations, tables, graphs, and qualitative verbal descriptions of outcomes.
  2. All calculations must be shown with original formulas and full solutions. Keep track of units at all steps. Label all calculations, tables and graphs. Calculate % error where appropriate (see the sketch below).
  3. Summarize results in a table(s).
• Conclusion: Include your overall scientific interpretations of the lab results and incorporate, in paragraph format, any analysis or integrated questions.

Notes: If you will be preparing a Basic Laboratory Report (BLR) or Formal Lab Report (FLR) for a particular lab, then you need to show only sample calculations of each type of calculation and eliminate the conclusion.
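Since the Data Analysis section calls for a percent-error calculation wherever an accepted value is available, a tiny worked sketch follows; the experimental and accepted densities are made-up numbers used only to show the formula % error = |experimental − accepted| / accepted × 100.

```python
# Illustrative percent-error calculation (made-up values, not course data).
experimental = 2.77   # e.g., a measured density, g/cm^3
accepted = 2.70       # e.g., the accepted density posted after the lab

percent_error = abs(experimental - accepted) / accepted * 100
print(f"% error = {percent_error:.1f}%")   # about 2.6% for these numbers
```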


476 Appendix J: Sample Pre-laboratory Discussion Activities

Tube Activity
Example 1: Tube Activity – Possible scenario
1. Present the tube in front of the students.
2. Ask students to carefully observe and record all patterns of the ropes on the tube.
3. Pull on the end of the rope and wait for a while.
4. Pull on rope ends clockwise at one time.
5. Pull on rope ends across the tube at another.
6. Repeat pulling the ropes until students say they understand or get enough data on the patterns of the rope.
7. Tell students they have to answer the questions "What does the inside of the tube look like?" and "What makes the ropes move like that?"
8. Ask students to make their own tubes, based on the observations that they made, which behave the same way as yours.
9. Ask students to present their tubes.
10. Conduct a debriefing for NOS.
   • After the presentations, ask students if they can see the inside of the tube that you showed them, to address the distinction between observation and inference.
   • You can explicitly explain how observation is different from inference. As examples of inferential entities, you can provide students with the structure of the earth, gravity, and the structure of the atom.
   • To address the importance of observations, you can ask students "Is any inference OK in science?", "How can we know which inference is better?", "To make a better inference, what would you do?"
   • When students have different models of the tube, you can discuss the notion that scientists can interpret the same data in different ways (associating with human subjectivity).
   • In addition, when students' different tubes behave in the same way as yours, it should be addressed that it is very difficult to determine which tube is better. In other words, we can hardly say one is right and the other is wrong. Make explicit to students that what they have done is very similar to what scientists do by providing students with real examples in science such as the structure of the atom.


477 Appendix J (Continued)

Example 2: Fruit (Density) Activity – Possible Scenario
1. Place an aquarium or large clear container filled with water.
2. Ask students to predict what will happen when a banana (fruit) is put into the aquarium. "Is it going to sink or float?"
3. Have students make their prediction and explain why.
4. Place the banana into the aquarium and ask students to make a careful observation.
5. Show students a different banana and ask them to predict what will happen when you try bananas that differ in size and in freshness.
6. Place the different bananas into the aquarium and ask students to make a careful observation.
7. Ask students "What's going to happen if I peel this banana? Is it going to sink or float?" "Do you think this banana will behave in the same way as before?"
8. Place the banana into the aquarium and ask students to make a careful observation.
9. Ask students why the bananas do or do not behave differently.
10. Ask students to come up with a question to investigate and how they can test their explanation for the bananas' different behaviors.
11. When sharing students' work, ask students "What data do you have to support your conclusion?" to discuss the consistency between data and a conclusion. It is also important to address the difference between data and evidence. Explain that data are the same as observations, but scientists can take observations as evidence in favor of their explanations. As a result, the same data can be taken as evidence for two incomparable explanations.


478 Appendix J (Continued)

Example 3: Activity Series of Metals PowerPoint

Slide 1
Another qualitative investigation

Slide 2
Qualitative – What
Quantitative – How much

Slide 3
Experimental Objective
Determine the relative reactivity of: Copper, Cu; Tin, Sn; Calcium, Ca; Magnesium, Mg; Zinc, Zn; Silver, Ag; Hydrogen gas, H2

Slide 4
Reactivity
Metals and hydrogen gas can be oxidized (lose electrons)
Something must be reduced (gain electrons): water, acid, metal cation

Slide 5
Reactivity
Metals and hydrogen gas can be oxidized (lose electrons)
Ca → Ca2+ + 2e-
H2 → 2H+ + 2e-
Oxidizing agents are reduced (gain electrons)
Water: 2H2O + 2e- → H2(g) + 2OH-
Acid: 2H+ + 2e- → H2(g)
Metal cation: Cu2+ + 2e- → Cu

Slide 6
Relative Reactivity
Cu, Sn, Ca, Mg, Zn, Ag, H2
What metals are oxidized by water? These are the most reactive.
What metals are oxidized by acid? These are more reactive than H2.
What metals are oxidized by what cations? A metal can be oxidized by the cation of a less reactive metal.


479 Appendix J (Continued)

Slide 7
Oxidation by water
Metals in large test tubes; deionized water
Observe, Record, Conclude
The most reactive metals are oxidized by water.
2Na + 2H2O → 2NaOH + H2

Slide 8
Oxidation by acids
Metals in small test tubes; 6 M HCl
Observe, Record, Conclude
Metals oxidized by an acid are more reactive than H2.
Ni + 2H+ → Ni2+ + H2(g)

Slide 9
Oxidation by metal cations
Metal cation solutions in small test tubes; stock bottle at the back of the lab
Silver nitrate dropper bottle (avoid staining skin)
6 x 4 wellplate for reactions
Observe, Record, Conclude
A metal is oxidized by the cation of a less reactive metal.
Ni + Cu2+ → Ni2+ + Cu

Slide 10
Data Analysis
Rank in order of reactivity (least to most): Ca, Cu, Mg, Sn, Zn, Ag, H2
Write net ionic equations to represent all reactions
Patterns in chemistry

Slide 11
Waste Handling
Transfer all reaction liquid into a large individual waste beaker using a wash bottle
Transfer waste to the designated liquid waste container
Wash reaction vessels with soap and water; rinse with deionized water


480 Appendix J (Continued)

Example 4: PRS PowerPoint Slides – Questions (LCM)

1. If you burn 100. g of wood and produce 15.0 g of ash, what is the mass of the other products produced?
   1. 115 g   2. 100. g   3. 85 g   4. 15 g   5. 1 g


481 Appendix K: General Overview of Laboratory Reports

Important Reminders:
Due no later than the posted date on the lab schedule or the report will be considered late
Title page: related title, date, student(s) name, class
Typed, 12 point font; Arial, Times New Roman, Times, Tahoma or Courier; 1 inch margins; 1.5 (double) spacing within paragraphs; blocked margins preferred
3-prong paper folder

Presentation Report:
Includes criteria expected to be reported in a technical paper or scientific journal
Assume that the reader of your report knows a little something about chemistry and your topic or wants to know for their research. You are the expert.
Correct format for graphs, tables, drawings and discussion of qualitative data
Report is written in scientific style: clear, to the point, past tense, and not written in first person
Report is grammatically correct: spelling, subject-verb agreement, complete sentences, and in past tense. Avoid first person (I, we…)
Avoid discussing how to do the calculation; just show the calculation and discuss its significance.
Use metric units


482 Appendix K (Continued)

Guidelines for a Basic Lab Report (*BLR)

General Information:
• Typed report with separate title page
• Body of report ranges from 3-7 pages
• 1.5 spacing and 1 inch margins

Components:
• Title page
• Brief introduction of the theory/concepts/basic equations behind the experiment (~1 pg)
• Purpose – Predictions: Brief description of experimental goal(s) and any necessary predictions (hypothesis) that you had to make in lab, with any class data
• Data Analysis / Observations
  1. Quantitative data (numerical measurements) must be recorded with units in appropriate tables.
  2. Qualitative data (observations) – colors, textures, evolution of gases, precipitations, etc. – should be recorded here as well. Chemical reactions must be shown if applicable.
  3. Create/copy needed graphs (properly labeled) or other visual representations of data using Microsoft Excel (graphs, diagrams, pictures…) or another software program. Include class data (as needed).
• Calculations / Results
  1. Sample(s) of all calculations must be shown with original formulas and full solutions. Keep track of units at all steps. Label all calculations, tables and graphs. Calculate % error where appropriate.
  2. Summarize all results in a table.
• Conclusion: Include your overall scientific interpretations of the lab results and incorporate answers to analysis questions within the body of the writing (paragraph).
• References – Properly cited
• Lab Notebook pages attached with original graphs (as required)


483 Appendix K (Continued)

FORMAL LAB REPORTS – OVERVIEW

A scientific paper-report at a minimum includes the following parts:

Title Page – should tell the reader what kind of work is being reported; the title should be creative and describe the lab content concisely, adequately and appropriately.

Abstract – summarizes 4 essential aspects of the report: the purpose of the experiment-research, key findings, significance and major conclusion. The reader should be able to determine the major conceptual-theoretical focus of the research/experiment. Should be one single-spaced paragraph of 150-200 words. Composed after the paper is written, but placed at the beginning.

Introduction – introductory/thesis paragraph – functions:
1. Place the experiment in the context of what is already known about the topic; in other words, discuss the concepts.
2. Explain the theory, reactions, etc. behind the experiment.
3. Present the question(s) being asked or studied; state the purpose, variables, etc.

Procedure – Briefly summarize the procedure in your own words. If the lab procedure was qualitative in nature, then include typed flow charts summarizing the procedure. Reference and list any changes made to the procedure. Cite the lab manual. Usually no more than one to two pages.

Results – Data Analysis – Components:
1. Presents original experimental data in an accurate and organized fashion.
2. Several well-organized paragraphs describing qualitative observations-data, presented clearly, without comment, bias or interpretation.
3. Generate new graphs (properly labeled) or other visual representations (flow charts) of data using Microsoft Excel (graphs, diagrams, pictures…). Do not post raw data here; place it in the appendix.
4. Create easy-to-read data tables including all of your qualitative and quantitative data.
5. Include labels and/or units for all data.
6. Show important sample math calculations.
7. Always calculate % error if dealing with quantitative data and accepted values.
8. Usually dominated by calculations, tables, figures, graphs, and observations.
9. Graphics need to be clear, easily read and properly labeled.


484 Appendix K (Continued)

Discussion-Conclusion – this is where you will analyze and interpret the results of your experiment and point out their chemical significance. Consider the following:
1. What do the results indicate clearly?
2. What have you determined?
3. Explain what you know with certainty based on your results and theory; draw conclusions.
4. What is the significance of the results?
5. What ambiguities exist?
6. What questions might one raise?
7. Find logical explanations for problems in the experimental data.
8. Open with an effective comparison of results and hypothesis.
9. Restate your question, purpose, variables, etc.
10. Discuss the specific data and chemical reactions, including math results, with theory values. Incorporate answers to any discussion questions if applicable.
11. State whether your results did or did not confirm your hypothesis, and support or negate your hypothesis from your results.
12. Remember to number figures, tables, and calculations throughout the paper. Refer to figures, tables, and calculations as you discuss your results.
13. Provide sufficient and logical explanation to support results and conclusion.
14. Directly address what has been learned in the lab.
15. Consider the chemistry (concepts) involved. How do your results fit in with what you know?
16. Sufficiently address other issues pertinent to the lab, including sources of error. Identify weaknesses in your experimental design. Describe how these imperfections may have affected your results.
17. List any problems that arose during the experiment itself. (Unforeseen difficulties with the procedure may affect the data and need to be described.)
18. Demonstrate clear and thoughtful scientific inquiry.
19. Draw a conclusion.

Appendix
1. Lab Notebook pages: Original raw data graphs, tables, etc. should be included in this section. Generate new flow charts, graphs, and tables for the results section of the paper.


485 Appendix L: Consent Form

INFORMED CONSENT FORM

The following information is being presented to help you decide whether or not you want to be a part of a minimal risk research study. Please read carefully. If you do not understand anything, please contact the principal researcher, Linda S. Keen-Rocha, who can be contacted at lrocha@mail.usf.edu or 727-USF-4785.

Title of Study: Personal Epistemological Growth in a College Chemistry Laboratory Environment
Principal Investigator: Linda S. Keen-Rocha
Study Location(s): USF College of Arts and Sciences – St. Petersburg and College of Education – Tampa

Purpose of the Study:
It remains to be determined whether certain effective instructional practices are linked to the development of specific epistemological and NOS (nature of science) beliefs. The major intent of this study is to develop an understanding of the relation between students' images of science, personal epistemological beliefs and laboratory classroom instructional practices.

Plan of Study-Procedures:
Participation in this semester study will require approximately 90-360 minutes of your time over the semester. Your involvement in the process will require you to do the following:
Participate in answering conceptual chemistry questions with a chemistry concept knowledge assessment instrument – Chemical Concepts Inventory CCI (15-20 minutes)
Participate in assessing your images of science with the Nature of Science Knowledge Scale (NSKS) assessment instrument (15-20 minutes)
Participate in the Epistemological Beliefs Assessment for Physical Sciences (EBAPS) instrument that requires you to reflect on your views about the nature of knowledge and learning in the physical sciences (e.g., chemistry, physics) (20-30 minutes)


486 Appendix L: (Continued)

Participate in an initial or final interview or both, which will be audio taped (30-90 minutes)
Participate in evaluating laboratory instructional techniques with an assessment instrument (15 minutes per laboratory activity)

Benefits of Being a Part of this Research Study
The direct benefits of your participation in this study will help us better understand the effectiveness of specific pedagogical laboratory techniques, improve student learning opportunities and help us to better understand how students' images of science and personal epistemological beliefs influence their learning of science. These learning experiences may help the student assess their own perceptions of themselves as learners. Students will receive extra credit points on their midterm and final exam for their level of participation. Students not participating may choose to write a scientific paper(s) to receive the extra credit.

Risks of Being a Part of this Research Study
No significant risks or discomforts are associated with your participation in this study. If you agree to participate in the assessments, survey-questionnaires, and possible interview(s), you will be asked to reflect on whether and how you learned. If you agree to the reflection/responses review, a researcher will comb through your writings to look for themes.

Confidentiality of Your Records
• Any information obtained during this study which could identify you will be kept strictly confidential. Your privacy and research records will be kept confidential to the extent of the law.
• However, certain people may need to see your study records. By law, anyone who looks at your records must keep them confidential. The only people who will be allowed to see these records are the study staff and people who make sure that we are doing the study in the right way. They also make sure that we protect your rights and safety:
  o The USF Institutional Review Board (IRB) and staff
  o The United States Department of Health and Human Services (DHHS)
• The data will be stored in a locked cabinet in the investigator's office and will only be seen by the investigator during the study and for three years after the study is complete. The information obtained in this study may be published in scientific journals or presented at scientific meetings, but the data will be reported as aggregated data. The audiotapes will be erased after transcription. Faculty from the College of Arts and Sciences and the College of Education who are involved in this research will compile these anonymous data.


487 Appendix L: (Continued)

Volunteering to Be Part of this Research Study
Participation in the evaluation study of the program is completely voluntary. You are free to participate in this research study or to withdraw at any time. If you choose not to participate, or if you withdraw, there will be no penalty or loss of benefits that you are entitled to through the date you exit the study, nor will your academic status be affected in any way.

Questions and Contacts
If you have any questions about this research study, contact Linda Keen-Rocha at lrocha@mail.usf.edu or 727-553-4785. If you have questions about your rights as a person who is taking part in a study, call USF Research Compliance at (813) 974-5638.

Investigator Statement
I have carefully explained to the subject the nature of the above protocol. I hereby certify that to the best of my knowledge the subject signing this consent form understands the nature, demands, risks and benefits involved in participating in this study.

Name and phone number of investigator:
Linda S. Keen-Rocha, MA, Doctoral Candidate, Principal Investigator
Office: (727) USF-4785


488 Appendix L: (Continued)

Consent, Right to Receive a Copy:
I agree that:
I have fully read this informed consent form describing a research project.
I have had the opportunity to question one of the persons in charge of this research and have received satisfactory answers.
I understand that I am being asked to participate in research. I understand the risks and benefits, and I freely give my consent to participate in the research project outlined in this form, under the conditions indicated in it.
You are voluntarily making a decision whether or not to participate in this research study. Your signature certifies that you have decided to participate having read and understood the information presented. You will be given a copy of this consent form to keep.

___________ Check and initial if you agree to be audio taped during the interview(s)

Signature of Participant:
_________________________________________
Signature of Research Participant          Month and Year
__________________________________________
Print Name

Demographics:
1) Course Section-Study ID # ____________________________________
2) Student U#: ____________________________
3) Sex: __________
4) College rank: no college rank / freshman / sophomore / junior / senior
5) Semesters of high school chemistry: 0 1 2 3 over 3
6) Semesters of college level chemistry completed: 0 1 2 3 4 5 6 7 8
7) College Major: _________________________________________

Name and phone number of investigator:
Linda S. Keen-Rocha, MA, Doctoral Candidate, Principal Investigator
Office: (727) USF-4785


Appendix M: Chemical Concepts Inventory Key
(American Chemical Society Division of Chemical Education, 2001)

1. d (Note: Some instructors who teach that a change in internal energy reflects a change in mass prefer c)
2. d
3. c
4. d
5. d
6. e
7. b
8. d
9. c
10. c
11. a
12. c
13. b
14. c
15. b
16. a
17. b
18. c
19. b
20. c
21. b
22. c
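To illustrate how this key can be applied, the sketch below grades a set of raw CCI responses against the answers above and reports a percent-correct score. The function name, variable names, and the sample responses are assumptions made for illustration; only the key itself comes from the appendix.

```python
# Minimal sketch: grade Chemical Concepts Inventory responses against the key above.
# The names and the example responses are illustrative, not part of the instrument.

cci_key = {
    1: "d", 2: "d", 3: "c", 4: "d", 5: "d", 6: "e", 7: "b", 8: "d",
    9: "c", 10: "c", 11: "a", 12: "c", 13: "b", 14: "c", 15: "b",
    16: "a", 17: "b", 18: "c", 19: "b", 20: "c", 21: "b", 22: "c",
}

def score_cci(responses: dict[int, str]) -> float:
    """Return the percent of the 22 items answered correctly; missing items count as wrong."""
    correct = sum(1 for item, answer in cci_key.items()
                  if responses.get(item, "").lower() == answer)
    return 100.0 * correct / len(cci_key)

# Example with a hypothetical (partial) set of answers.
print(round(score_cci({1: "d", 2: "d", 3: "a", 4: "d"}), 1))
```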


Appendix N: EBAPS Scoring Scheme

EBAPS Scoring with Excel Template
1) In the scoring template, the q01, q02, q03, ..., q30 columns are for students' raw answers to each of the 30 questions, with A = 1, B = 2, C = 3, D = 4, E = 5.
2) Get your data into a spreadsheet in that form, with each row corresponding to a different student, and the template will do the rest.
3) The q_01, q_02, ..., q_30 columns are the scaled scores, on a scale of 0 to 4, with 4 = most sophisticated.
4) The axis_1, axis_2, etc. columns are students' subscale scores (again on a scale of 0 to 4) for each of the 5 subscales, with
Axis 1 = Structure of knowledge
Axis 2 = Nature of learning
Axis 3 = Real-life applicability
Axis 4 = Evolving knowledge
Axis 5 = Source of ability to learn
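For readers without the Excel template, the following is a minimal sketch of the same scaling step. Only items 1 and 2 are mapped here, copied from the key printed below; the remaining items, and the subscale (axis) membership, including items 19 and 28, which count toward two axes, would be filled in from that key. All function and variable names are assumptions for illustration, not part of the EBAPS instrument.

```python
# Minimal sketch of the EBAPS scaling step described above (raw letters -> 0-4 scaled scores).
# Only items 1 and 2 are included; the full per-item maps follow the printed key.

ITEM_SCALE = {
    1: {"A": 4.0, "B": 3.0, "C": 1.0, "D": 0.5, "E": 0.0},
    2: {"A": 0.0, "B": 1.5, "C": 2.5, "D": 3.5, "E": 4.0},
    # ... items 3-30 follow the printed key
}

def scale_answers(raw: dict[int, str]) -> dict[int, float]:
    """Convert raw letter answers (A-E) to 0-4 scaled scores for the items mapped above."""
    return {item: ITEM_SCALE[item][letter] for item, letter in raw.items() if item in ITEM_SCALE}

def subscale_score(scaled: dict[int, float], axis_items: list[int]) -> float:
    """Average the scaled scores over the items belonging to one axis (subscale)."""
    present = [scaled[i] for i in axis_items if i in scaled]
    return sum(present) / len(present) if present else float("nan")

# A student who strongly disagrees with item 1 and strongly agrees with item 2.
scaled = scale_answers({1: "A", 2: "E"})
print(scaled)                          # {1: 4.0, 2: 4.0}
print(subscale_score(scaled, [1, 2]))  # 4.0 -- axis membership would come from the printed key
```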


Appendix N (Continued)

EBAPS Logistics and Scoring

Color Coding of Subscales
Red – Structure of knowledge
Orange – Nature of learning
Green – Real-life applicability
Blue – Evolving knowledge
Purple – Source of ability to learn
Note: Black indicates the item doesn't belong to a subscale. Items 19 and 28 belong to two subscales.

A: Strongly disagree   B: Somewhat disagree   C: Neutral   D: Somewhat agree   E: Strongly agree

Part 1
1. Tamara just read something in her science textbook that seems to disagree with her own experiences. But to learn science well, Tamara shouldn't think about her own experiences; she should just focus on what the book says.
A = 4, B = 3, C = 1, D = 0.5, E = 0
2. When it comes to understanding physics or chemistry, remembering facts isn't very important.
A = 0, B = 1.5, C = 2.5, D = 3.5, E = 4
3. Obviously, computer simulations can predict the behavior of physical objects like comets. But simulations can also help scientists estimate things involving the behavior of people, such as how many people will buy new television sets next year.
A = 0, B = 1, C = 2, D = 3.5, E = 4
4. When it comes to science, most students either learn things quickly, or not at all.
A = 4, B = 3, C = 2, D = 1, E = 0
5. If someone is having trouble in physics or chemistry class, studying in a better way can make a big difference.
A = 0, B = 1, C = 2, D = 3, E = 4
6. When it comes to controversial topics such as which foods cause cancer, there's no way for scientists to evaluate which scientific studies are likely to be valid. Everything's up in the air!
A = 4, B = 4, C = 2, D = 1, E = 0
7. A teacher once said, "I don't really understand something until I teach it." But actually, teaching doesn't help a teacher understand the material better; it just reminds her of how much she already knows.
A = 4, B = 4, C = 2, D = 1, E = 0


Appendix N (Continued)

8. Scientists should spend almost all their time gathering information. Worrying about theories can't really help us understand anything.
A = 4, B = 2.5, C = 1.5, D = 0.5, E = 0
9. Someone who doesn't have high natural ability can still learn the material well, even in a hard chemistry or physics class.
A = 0, B = 1, C = 2, D = 3, E = 4
10. Often, a scientific principle or theory just doesn't make sense. In those cases, you have to accept it and move on, because not everything in science is supposed to make sense.
A = 4, B = 3, C = 2, D = 1, E = 0
11. When handing in a physics or chemistry test, you can generally have a sense of how well you did even before talking about it with other students.
A = 0, B = 1, C = 2, D = 3, E = 4
12. When learning science, people can understand the material better if they relate it to their own ideas.
A = 0, B = 0.5, C = 1, D = 3, E = 4
13. If physics and chemistry teachers gave really clear lectures, with plenty of real-life examples and sample problems, then most good students could learn those subjects without doing lots of sample questions and practice problems on their own.
A = 4, B = 3, C = 1, D = 0.5, E = 0
14. Understanding science is really important for people who design rockets, but not important for politicians.
A = 4, B = 3, C = 2, D = 1, E = 0
15. When solving problems, the key thing is knowing the methods for addressing each particular type of question. Understanding the "big ideas" might be helpful for specially-written problems, but not for most regular problems.
A = 4, B = 3, C = 2, D = 1, E = 0
16. Given enough time, almost everybody could learn to think more scientifically, if they really wanted to.
A = 0, B = 1, C = 2, D = 3, E = 4
17. To understand chemistry and physics, the formulas (equations) are really the main thing; the other material is mostly to help you decide which equations to use in which situations.
A = 4, B = 3, C = 1.5, D = 0.5, E = 0


Appendix N (Continued)

Part 2
DIRECTIONS: Multiple choice. On the answer sheet, fill in the answer that best fits your view.

18. If someone is trying to learn physics, is the following a good kind of question to think about?
"Two students want to break a rope. Is it better for them to (1) grab opposite ends of the rope and pull (like in tug-of-war), or (2) tie one end of the rope to a wall and both pull on the other end together?"
(a) Yes, definitely. It's one of the best kinds of questions to study.
(b) Yes, to some extent. But other kinds of questions are equally good.
(c) Yes, a little. This kind of question is helpful, but other kinds of questions are more helpful.
(d) Not really. This kind of question isn't that great for learning the main ideas.
(e) No, definitely not. This kind of question isn't helpful at all.
A = 4, B = 3.5, C = 1.5, D = 0.5, E = 0

19. Scientists are having trouble predicting and explaining the behavior of thunderstorms. This could be because thunderstorms behave according to a very complicated or hard-to-apply set of rules. Or, that could be because some thunderstorms don't behave consistently according to any set of rules, no matter how complicated and complete that set of rules is.
In general, why do scientists sometimes have trouble explaining things? Please read all options before choosing one.
(a) Although things behave in accordance with rules, those rules are often complicated, hard to apply, or not fully known.
(b) Some things just don't behave according to a consistent set of rules.
(c) Usually it's because the rules are complicated, hard to apply, or unknown; but sometimes it's because the thing doesn't follow rules.
(d) About half the time, it's because the rules are complicated, hard to apply, or unknown; and half the time, it's because the thing doesn't follow rules.
(e) Usually it's because the thing doesn't follow rules; but sometimes it's because the rules are complicated, hard to apply, or unknown.
A = 4, B = 0, C = 3, D = 2, E = 1

20. In physics and chemistry, how do the most important formulas relate to the most important concepts? Please read all choices before picking one.
(a) The major formulas summarize the main concepts; they're not really separate from the concepts. In addition, those formulas are helpful for solving problems.
(b) The major formulas are kind of "separate" from the main concepts, since concepts are ideas, not equations. Formulas are better characterized as problem-solving tools, without much conceptual meaning.
(c) Mostly (a), but a little (b).
(d) About half (a) and half (b).
(e) Mostly (b), but a little (a).
A = 4, B = 0, C = 3, D = 2, E = 1


Appendix N (Continued)

21. To be successful at most things in life...
(a) Hard work is much more important than inborn natural ability.
(b) Hard work is a little more important than natural ability.
(c) Natural ability and hard work are equally important.
(d) Natural ability is a little more important than hard work.
(e) Natural ability is much more important than hard work.
A = 4, B = 3, C = 2, D = 1, E = 0

22. To be successful at science...
(a) Hard work is much more important than inborn natural ability.
(b) Hard work is a little more important than natural ability.
(c) Natural ability and hard work are equally important.
(d) Natural ability is a little more important than hard work.
(e) Natural ability is much more important than hard work.
A = 4, B = 4, C = 2, D = 1, E = 0

23. Of the following test formats, which is best for measuring how well students understand the material in physics and chemistry? Please read each choice before picking one.
(a) A large collection of short-answer or multiple choice questions, each of which covers one specific fact or concept.
(b) A small number of longer questions and problems, each of which covers several facts and concepts.
(c) Compromise between (a) and (b), but leaning more towards (a).
(d) Compromise between (a) and (b), favoring both equally.
(e) Compromise between (a) and (b), but leaning more towards (b).
A = 0, B = 4, C = 1, D = 2, E = 3

Part 3
DIRECTIONS: In each of the following items, you will read a short discussion between two students who disagree about some issue. Then you'll indicate whether you agree with one student or the other.

24. Brandon: A good science textbook should show how the material in one chapter relates to the material in other chapters. It shouldn't treat each topic as a separate "unit," because they're not really separate.
Jamal: But most of the time, each chapter is about a different topic, and those different topics don't always have much to do with each other. The textbook should keep everything separate, instead of blending it all together.
With whom do you agree? Read all the choices before circling one.
(a) I agree almost entirely with Brandon.
(b) Although I agree more with Brandon, I think Jamal makes some good points.
(c) I agree (or disagree) equally with Jamal and Brandon.
(d) Although I agree more with Jamal, I think Brandon makes some good points.
(e) I agree almost entirely with Jamal.
A = 4, B = 4, C = 2, D = 1, E = 0


Appendix N (Continued)

25. Anna: I just read about Kay Kinoshita, the physicist. She sounds naturally brilliant.
Emily: Maybe she is. But when it comes to being good at science, hard work is more important than "natural ability." I bet Dr. Kinoshita does well because she has worked really hard.
Anna: Well, maybe she did. But let's face it, some people are just smarter at science than other people. Without natural ability, hard work won't get you anywhere in science!
(a) I agree almost entirely with Anna.
(b) Although I agree more with Anna, I think Emily makes some good points.
(c) I agree (or disagree) equally with Anna and Emily.
(d) Although I agree more with Emily, I think Anna makes some good points.
(e) I agree almost entirely with Emily.
A = 0, B = 1, C = 2, D = 4, E = 4

26. Justin: When I'm learning science concepts for a test, I like to put things in my own words, so that they make sense to me.
Dave: But putting things in your own words doesn't help you learn. The textbook was written by people who know science really well. You should learn things the way the textbook presents them.
(a) I agree almost entirely with Justin.
(b) Although I agree more with Justin, I think Dave makes some good points.
(c) I agree (or disagree) equally with Justin and Dave.
(d) Although I agree more with Dave, I think Justin makes some good points.
(e) I agree almost entirely with Dave.
A = 4, B = 4, C = 2, D = 1, E = 0

27. Julia: I like the way science explains how things I see in the real world.
Carla: I know that's what we're "supposed" to think, and it's true for many things. But let's face it, the science that explains things we do in lab at school can't really explain earthquakes, for instance. Scientific laws work well in some situations but not in most situations.
Julia: I still think science applies to almost all real-world experiences. If we can't figure out how, it's because the stuff is very complicated, or because we don't know enough science yet.
(a) I agree almost entirely with Julia.
(b) I agree more with Julia, but I think Carla makes some good points.
(c) I agree (or disagree) equally with Carla and Julia.
(d) I agree more with Carla, but I think Julia makes some good points.
(e) I agree almost entirely with Carla.
A = 4, B = 4, C = 2, D = 1, E = 0


Appendix N (Continued)

28. Leticia: Some scientists think the dinosaurs died out because of volcanic eruptions, and others think they died out because an asteroid hit the Earth. Why can't the scientists agree?
Maria: Maybe the evidence supports both theories. There's often more than one way to interpret the facts. So we have to figure out what the facts mean.
Leticia: I'm not so sure. In stuff like personal relationships or poetry, things can be ambiguous. But in science, the facts speak for themselves.
(a) I agree almost entirely with Leticia.
(b) I agree more with Leticia, but I think Maria makes some good points.
(c) I agree (or disagree) equally with Maria and Leticia.
(d) I agree more with Maria, but I think Leticia makes some good points.
(e) I agree almost entirely with Maria.
A = 0, B = 1, C = 2, D = 3, E = 4

29. Jose: In my opinion, science is a little like fashion; something that's "in" one year can be "out" the next. Scientists regularly change their theories back and forth.
Miguel: I have a different opinion. Once experiments have been done and a theory has been made to explain those experiments, the matter is pretty much settled. There's little room for argument.
(a) I agree almost entirely with Jose.
(b) Although I agree more with Jose, I think Miguel makes some good points.
(c) I agree (or disagree) equally with Miguel and Jose.
(d) Although I agree more with Miguel, I think Jose makes some good points.
(e) I agree almost entirely with Miguel.
A = 0, B = 2, C = 4, D = 2, E = 0

30. Jessica and Mia are working on a homework assignment together...
Jessica: O.K., we just got problem #1. I think we should go on to problem #2.
Mia: No, wait. I think we should try to figure out why the thing takes so long to reach the ground.
Jessica: Mia, we know it's the right answer from the back of the book, so what are you worried about? If we didn't understand it, we wouldn't have gotten the right answer.
Mia: No, I think it's possible to get the right answer without really understanding what it means.
(a) I agree almost entirely with Jessica.
(b) I agree more with Jessica, but I think Mia makes some good points.
(c) I agree (or disagree) equally with Mia and Jessica.
(d) I agree more with Mia, but I think Jessica makes some good points.
(e) I agree almost entirely with Mia.
A = 0, B = 1, C = 2, D = 4, E = 4


Appendix O: NSKS Scoring Procedures

Positive Items
Scale points: (1) Strongly Agree = 5, (2) Agree = 4, (3) Neutral = 3, (4) Disagree = 2, (5) Strongly Disagree = 1
Positive items: 2, 3, 4, 5, 6, 8, 12, 16, 17, 20, 22, 26, 28, 29, 30, 32, 35, 37, 38, 42, 45, 46, 47, 48

Negative Items
Scale points: (1) Strongly Agree = 1, (2) Agree = 2, (3) Neutral = 3, (4) Disagree = 4, (5) Strongly Disagree = 5
Negative items: 1, 7, 9, 10, 11, 13, 14, 15, 18, 19, 21, 23, 24, 25, 27, 31, 33, 34, 36, 39, 40, 41, 43, 44

NSKS Subscales and Items
Amoral: 4, 5, 7, 8, 18, 21, 36, 48
Creative: 1, 17, 20, 23, 28, 32, 34, 41
Development: 16, 25, 26, 27, 31, 37, 42, 43
Parsimonious: 2, 6, 14, 15, 29, 39, 40, 46
Testable: 9, 11, 12, 13, 22, 33, 38, 45
Unified: 3, 10, 19, 24, 30, 35, 44, 47

Subscale score: 8 – 40 points
Overall score: 48 – 240 points

NSKS Representative Placement Scale
Realist (48) ------------------------------ neutral (144) ------------------------------ Instrumentalist (240)
(unaccepted NOS view)                                                                     (accepted NOS view)
Realist – absolute; theories are either true or false
Instrumentalist – subjective; theories are tools
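To make the procedure concrete, the following is a minimal sketch of NSKS scoring under the rules above. The response encoding (1 = Strongly Agree through 5 = Strongly Disagree, matching the scale numbering) and all function and variable names are assumptions made for illustration; only the item lists and point values come from this appendix.

```python
# Minimal sketch of NSKS scoring as described above.
# Responses are assumed to be coded 1-5 (1 = Strongly Agree ... 5 = Strongly Disagree).

POSITIVE = {2, 3, 4, 5, 6, 8, 12, 16, 17, 20, 22, 26, 28, 29, 30,
            32, 35, 37, 38, 42, 45, 46, 47, 48}  # all remaining items (1-48) are negative

SUBSCALES = {
    "Amoral":       [4, 5, 7, 8, 18, 21, 36, 48],
    "Creative":     [1, 17, 20, 23, 28, 32, 34, 41],
    "Development":  [16, 25, 26, 27, 31, 37, 42, 43],
    "Parsimonious": [2, 6, 14, 15, 29, 39, 40, 46],
    "Testable":     [9, 11, 12, 13, 22, 33, 38, 45],
    "Unified":      [3, 10, 19, 24, 30, 35, 44, 47],
}

def item_points(item: int, response: int) -> int:
    """Positive items: Strongly Agree (1) earns 5 points; negative items are scored as coded."""
    return 6 - response if item in POSITIVE else response

def nsks_scores(responses: dict[int, int]) -> dict[str, int]:
    """Return each subscale score (8-40) plus the overall score (48-240)."""
    scores = {name: sum(item_points(i, responses[i]) for i in items)
              for name, items in SUBSCALES.items()}
    scores["Overall"] = sum(scores.values())
    return scores

# Example: a hypothetical respondent who answers "Agree" (2) on every item.
print(nsks_scores({i: 2 for i in range(1, 49)}))
```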


Appendix P: CCI-EBAPS-NSKS Interview Participant Scores

Descriptive Statistics of Interviewed Participants (N = 20)

ID   CCI   EBAPS Pre   EBAPS Post   NSKS Pre   NSKS Post
1    72    2.70        3.13         143        155
2    76    2.35        2.55         144        153
3    81    2.38        2.97         138        148
4    67    2.70        2.62         138        149
5    86    1.88        2.08         144        151
6    63    2.37        3.12         149        151
7    63    2.32        2.77         143        152
8    72    2.83        3.22         147        145
9    45    2.53        2.60         147        155
10   72    2.05        3.45         141        153
11   58    2.80        2.98         143        149
12   63    2.63        2.78         138        150
13   49    2.63        2.48         146        144
14   65    2.48        3.02         132        142
15   76    2.98        3.12         140        145
16   77    2.85        3.55         143        148
17   65    2.50        2.45         136        142
18   76    2.63        2.77         143        148
19   67    2.52        2.87         140        152
20   58    2.65        2.80         138        146


About the Author

Linda S. Keen-Rocha received a bachelor's degree in Biological Science from the University of Maryland in 1985. She received a Master of Education degree in Biological Sciences from the University of South Florida in 1997. She has conducted research in optic nerve regeneration, population ecology, bridge corrosion, and other areas of science over the past 20 years. She is entering her twenty-fifth year of teaching, with 12 of those years in the high school setting teaching anatomy and physiology, chemistry, and other related courses. She has taught biology, anatomy, and chemistry courses at the college level for 13 years in Maryland and Florida. She has made presentations at regional and national meetings and has authored several publications related to science education laboratory instructional issues. Her primary research interests include personal epistemological beliefs in science, nature of science, science pedagogy, self-regulated learning, learning styles, and technology in science education.

