USF Libraries
USF Digital Collections

Preservice elementary teachers' pedagogical content knowledge related to area and perimeter : a teacher development experiment investigating anchored instruction with web-based microworlds


Material Information

Title:
Preservice elementary teachers' pedagogical content knowledge related to area and perimeter : a teacher development experiment investigating anchored instruction with web-based microworlds
Physical Description:
Book
Language:
English
Creator:
Kellogg, Matthew
Publisher:
University of South Florida
Place of Publication:
Tampa, Fla
Publication Date:
2010

Subjects

Subjects / Keywords:
Mathematics
Technology
Knowledge of student thinking
Misconceptions
Dissertations, Academic -- Secondary Education -- Doctoral -- USF   ( lcsh )
Genre:
non-fiction   ( marcgt )

Notes

Abstract:
ABSTRACT: Practical concepts, such as area and perimeter, have an important part in today's school mathematics curricula. Research indicates that students and preservice teachers (PSTs) struggle with and harbor misconceptions regarding these topics. Researchers suggest that alternative instructional methods be investigated that enhance PSTs' conceptual understanding and encourage deeper student thinking. To address this need, this study examined and described what and how PSTs learn as they engage in anchored instruction involving web-based microworlds designed for exploring area and perimeter. Its focus was to examine the influences of a modified teacher development experiment (TDE) upon 12 elementary PSTs' content knowledge (CK) and knowledge of student thinking (KoST) regarding principles, relationships, and misconceptions involving area and perimeter as they develop simultaneously in a problem-solving environment. The learning of meaningful mathematics is a personal and independent activity, as one struggles to create and reason through one's own mathematical realities and misconceptions. This study describes PSTs' reasonings, misconceptions, and difficulties as they grappled with new knowledge or reconciled new knowledge with prior understandings. Quantitative and qualitative research methods, including case-subject analysis, were used. Instructional sessions similar to Steffe's (1983) teaching episodes comprised this study's intervention. Results indicate that prior to intervention most of the PSTs possessed a procedural knowledge of area and perimeter and were bound by a dependency on formulas; their KoST pertaining to area and perimeter was relatively underdeveloped. They seemed unaware of prevalent misconceptions students acquire while working with these concepts (specifically, units of measure and perceived relationships). The PSTs displayed an ineffective use of drawings to support their responses. Their preoccupation with finding what they judged as "the answer" to various problem-solving situations hindered their ability to properly diagnose and address student thinking and limited their meaningful interaction with the microworlds (MWs). A majority of PSTs felt the MWs were a valuable learning tool for themselves but not for their future students. The planned intervention played a role in the PSTs becoming more perceptive of the difficult mathematics involved with area and perimeter and better equipped to anticipate and address those difficulties with future students.
Thesis:
Dissertation (Ph.D.)--University of South Florida, 2010.
Bibliography:
Includes bibliographical references.
System Details:
Mode of access: World Wide Web.
System Details:
System requirements: World Wide Web browser and PDF reader.
Statement of Responsibility:
by Matthew Kellogg.
General Note:
Title from PDF of title page.
General Note:
Document formatted into pages; contains X pages.
General Note:
Includes vita.

Record Information

Source Institution:
University of South Florida Library
Holding Location:
University of South Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
usfldc doi - E14-SFE0003306
usfldc handle - e14.3306
System ID:
SFS0027622:00001




Full Text


Preservice Elementary Teachers' Pedagogical Content Knowledge Related to Area and Perimeter: A Teacher Development Experiment Investigating Anchored Instruction With Web-Based Microworlds

by

Matthew S. Kellogg

A dissertation in partial fulfillment of the requirements for the degree of Doctor of Philosophy
Department of Secondary Education
College of Education
University of South Florida

Major Professor: Gladis Kersaint, Ph.D.
Robert F. Dedrick, Ph.D.
Denisse R. Thompson, Ph.D.
James A. White, Ph.D.

Date of Approval: May 7, 2010

Keywords: mathematics, technology, knowledge of student thinking, misconceptions

Copyright 2010, Matthew S. Kellogg

DEDICATION

This work is dedicated to my family. To my loving wife Karen, without whose faithful and steadfast support this dissertation could not have been completed. Thank you for putting aspects of your life on hold during this journey. I love you. To my daughter Madison, this endeavor began before you were born. Now, Madison, Daddy can play whenever you want!

ACKNOWLEDGEMENTS

I would like to sincerely thank my committee for being my advocates through this long process. The expertise of each of you has been vital to my completion. To Dr. Kersaint, thank you for all you have taught me; for taking the time to nurture my interests, for guiding my design and research, and for your countless revisions. Thank you for your time, your patience, and for understanding what I needed to hear and how to say it. To Dr. Thompson, thank you for graciously sharing your expertise regarding rubrics and assessment, for all the suggested readings, which were so helpful, and for your amazing editing skills. To Dr. White, your input and guidance regarding anchored instruction and instructional technologies were so valuable, as was the gracious way in which they were presented. To Dr. Dedrick, my measurement guru, thank you for patiently helping me structure and carry out the statistical aspects of this study. You helped me understand what was necessary and appropriate. Each of you has left an indelible mark on my life, for which I can only express my sincere gratitude. May you continue to inspire and challenge many future students.

TABLE OF CONTENTS

LIST OF TABLES  vii
LIST OF FIGURES  ix
ABSTRACT  xii

CHAPTER 1. INTRODUCTION  1
Statement of the Problem  5
Purpose of the Study  7
Conceptual Framework  8
Anchored Instruction  12
Format for Instructional Sequence  13
Technology Integration  14
Research Questions  16
Definitions  17

CHAPTER 2. REVIEW OF THE LITERATURE  19
Knowledge Domains and the Craft of Teaching  20
Content Knowledge and Pedagogical Content Knowledge  21
Characterizing PCK  22
Novice PCK  23
Expert-Novice PCK Differences  25
Reforming Pedagogical Content Knowledge  28
Innovative Interventions for Pre- and Inservice Teachers  30
Developing Meaningful Content Knowledge  30
Constructing Pedagogical Content Knowledge  33
Promoting an Awareness of Student Cognition  34
Measuring Pedagogical Content Knowledge  41
Knowledge of and Learning about Area and Perimeter  43
Difficulties with Area and Perimeter  44
Prevalent Misconceptions Regarding Area and Perimeter  49
Confusing Area and Perimeter  50
Linear Versus Square Units  52
Perceived Relationships Between Area and Perimeter  56
Student ...  61
Likely Causes of Area and Perimeter Misconceptions  63
Unfocused Curriculum  63
Ineffective Instruction  65
Over-Emphasis on Procedural Knowledge  67
Innovative Instructional Strategies  69
Refine the Focus  69
Integrating Innovative Learning Tools  71
Enhancing Mathematics Teacher Education with Technology  72
The Need for Technology Infusion Within Teacher Education  73
Recommendations and Guidelines for Effective Technology Integration  75
The Concept and Possibilities of Anchored Instruction  82
Goals and Uses of Anchored Instruction  83
Highlighted Research on Anchored Instruction  85
Microworlds  89
Microworlds: Defined and Described  90
Characteristics of a Microworld  91
Static Geometric Software  94
Dynamic Geometric Software  98
Teaching and Learning Mathematics with Microworlds  99
Computer Microworlds in the K-12 Setting  100
Microworlds and Teacher Education  103
Summary of the Literature Reviewed and How They Informed this Proposed Study  110

CHAPTER 3. METHODS  114
Introduction  114
Research Questions  115
Setting  116
Descriptions of the Methods Course  117
The Microworlds  119
The Intervention  126
Anchored Instruction  128
The Teaching Episodes  129
Modifications to Teaching Episodes  133
Revisions to CK and KoST Writing Prompts  134
Revisions to Cooperative Work  136
Instrumentation  138
Pre-Study Survey Questionnaire  138
Area and Perimeter Tests  139
Validity of Testing Instruments  144
Procedures  147
Data Collection  147
Whole Group Data  149
Pre-Study Questionnaire  149
Microworld Orientation Session  149
Administering Area and Perimeter Tests  150
Data from Teaching Episodes  151
...  152
Case Subjects: Selection and Data Collection Process  153
Data Analysis  156
Scoring Rubrics for Area and Perimeter Tests  158
Reliability of the Data  158
Internal Consistency Reliability  159
Inter-Rater Reliability: Training and Scoring  161
Rubric Scoring and Coding Training  162
Expert/Novice Coding: Development, Training, and Usage  165
Validation of Anchored Instruction Intervention  174
Cross-Case Analysis  176
Pre-Intervention CK and KoST  176
Analysis of Pretest Written Responses  177
Analysis of the First Interview  180
... Intervention CK and KoST  183
Emergent Knowledge: The Teaching Episodes  183
Post-Intervention Knowledge  184
Regression Analysis of Test Scores  184
Relationships Between CK and KoST  186
Limitations of this Study  190

CHAPTER 4. FINDINGS  192
Selection of Case Subjects  193
Case Subject Jackie  193
Case Subject Brianna  194
Case Subject Larry  196
Case Subject Grace  197
... Intervention CK and KoST  198
Pretest Level of CK and KoST  198
Descriptive Statistics for Rubric Scorings of Pretest Items  198
Descriptive Statistics for Expert/Novice Codings for Pretest  200
Describing PSTs' Pre-Intervention CK and KoST  204
Distinguishing Between Area and Perimeter  204
Procedural Versus Conceptual CK  205
Perceived Student Difficulties  207
Distinguishing the Correct Unit of Measure  209
Confusing the Measure with its Unit  209
Knowledge Regarding Irregular Shapes  212
Creative in Problem Solving  216
Ability to Explain and Illustrate Units of Measure  218
Utilizing Drawings  220
... Units of Measure  223
The Importance of Units in Explanations  223
Focused on Solving, or Diagnosing & Responding  226
Perceived Relationships Between Area and Perimeter  235
Knowledge of the Direct Relationship Misconception  236
... KoST  240
Knowledge Regarding the Fixed Relationship Misconception  244
... Intervention CK and KoST  248
A Teacher Development Experiment  249
Emergent Levels of CK and KoST  250
Comparisons of Pre-, Post-, and Follow-Up Levels of CK and KoST  254
Changes in Rubric Score Frequencies  258
Changes in Expert/Novice Frequency Totals  258
Changes in the Frequency of Specific Expert/Novice Codes Assigned  262
Linear Regression Involving CK and KoST and Total Test Scores  265
...  266
Changes in CK Regarding Units of Measure  275
Confusing the Measure with its Unit  276
Procedural Versus Conceptual CK  283
Knowledge Regarding Irregular Shapes  290
Creative in Problem Solving  291
Ability to Explain and Illustrate Units of Measure  295
Utilizing Drawings  299
... Regarding Units of Measure  303
Focused on Solving, or Diagnosing & Responding: Emergent CK & KoST  303
...  310
Realizing the Importance of Units in Explanations  317
Knowledge Regarding Perceived Relationships  328
Emergent CK of the Fixed Relationship Misconception  328
Post-Intervention CK of the Fixed Relationship Misconception  333
Emergent CK of the Direct Relationship Misconception  336
Post-Intervention CK of the Direct Relationship Misconception  342
Emergent KoST of the Fixed Relationship Misconception  350
Post-Intervention KoST of the Fixed Relationship Misconception  360
Emergent KoST of the Direct Relationship Misconception  364
Post-Intervention KoST of the Direct Relationship Misconception  373
Research Question 5: Identifying and Describing CK-KoST Relationships  379
Identifying CK-KoST Relationships  381
Describing CK-KoST Relationships  383
The Increased CK-Increased KoST Relationship  383
... Relationship Prior to Intervention  384
... Relationship: Emergent Findings  388
... Relationship, Post-Intervention  393

CHAPTER 5. SUMMARY, IMPLICATIONS, AND RECOMMENDATIONS  400
Summary of Findings  401
... Intervention CK and KoST: Research Questions 1 & 2  402
General CK Regarding Area and Perimeter  402
Distinguishing Between Area and Perimeter  403
CK Regarding Units of Measure  403
Inattention to Units  404
Ability to Explain and Illustrate Units of Measure  404
Utilizing Drawings  405
CK Regarding Perceived Relationships Between Area and Perimeter  406
Pre-Intervention KoST  407
Summary of Emergent Findings: Impact of Intervention  408
The Teaching Episodes  408
TE 1: Units of Measure  408
TE 2: The Fixed Relationship Misconception  409
TE 3: The Direct Relationship Misconception  410
Impact of Microworld Usage  410
... Intervention CK and KoST  412
Descriptive Findings  413
Changes in P...  413
Procedural Versus Conceptual Knowledge  414
Ability to Explain  414
Utilizing Drawings  415
CK Regarding Perceived Relationships  416
...  417
Case Subject Summaries  419
...  419
...  421
...  423
...  424
Conclusions  426
Regarding Pre-Intervention CK and KoST  427
Expert/Novice Differences  427
Basic CK: Units of Measure  428
Ability to Diagnose and Respond to Student Thinking  430
Perceived Relationships Between Area and Perimeter  431
Regarding Relationship Between CK and KoST  433
Regarding Anchored Instruction with Web-Based Microworlds  434
Implications for Practice  437
Implications for Teachers  438
Implications for Teacher Educators  439
Implications for Future Research  440

REFERENCES  444

APPENDICES  480
Appendix A: Piloting of Instruments  481
Appendix B: Syllabus for Methods Course  493
Appendix C: Pre-Study Survey Questionnaire  495
Appendix D: Area and Perimeter Pretest  501
Appendix E: Area and Perimeter Posttest  508
Appendix F: Area and Perimeter Follow-Up Test  513
Appendix G: Preliminary Rubrics for Scoring Area and Perimeter Tests  520
Appendix H: Amended Rubrics for Scoring Area and Perimeter Tests  522
Appendix I: Supplemental Grading Sheets  524
Appendix J: Samples of Test Items from Piloting to Illustrate Scoring  527
Appendix K: Learning Packets for Teaching Episodes  531
Appendix L: Second Observer Protocol  552
Appendix M: Microworlds Orientation Session  553
Appendix N: Purposely Selected Tasks for Final Interview  555
Appendix O: Anchored Instruction Assessment Survey  556

ABOUT THE AUTHOR  End Page

LIST OF TABLES

Table 1  Description of Test Questions Selected for this Study  141
Table 2  ... Post- and Follow-Up Tests  160
Table 3  Coding Sheets to Help in Categorizing Novice Versus Expert Behavior Within the Context of this Study  166
Table 4  Results from Assessment Survey of Anchored Instruction  175
Table 5  Corresponding Test Items for Comparative Analysis for Answering Research Question Five  188
Table 6  Case Subject Data  195
Table 7  Descriptive Statistics for Pretest  199
Table 8  ...  201
Table 9  Expert/Novice Coding Frequencies for Pretest  202
Table 10  Expert/Novice Specific Code Frequencies from Case Subjects' Pretest  203
Table 11  Pre-Intervention Use of Drawings  222
Table 12  Investigating an Erroneous Student Claim (Pre-Intervention)  241
Table 13  Expert/Novice Coding Totals for Teaching Episodes  251
Table 14  Descriptive Statistics for Pre-, Post-, and Follow-Up Tests  256
Table 15  Pre-, Post-, and Follow-Up Test Rubric Score Frequencies  259
Table 16  Expert/Novice Coding Totals for Pre-, Post-, and Follow-Up Tests  261
Table 17  Expert/Novice Coding Frequencies for Case Subjects from Pre-, Post-, and Follow-Up Tests  263
Table 18  Regression ...  265
Table 19  Use of Drawings Throughout the Study  300
Table 20  Findings Related to Microworld Usage & Benefits  311
Table 21  Instructional Recommendations for Microworlds  317
Table 22  Investigating an Erroneous Student Claim  339
Table 23  Investigating an Erroneous Student Claim Throughout the Study  343
Table 24  ...  350
Table 25  Sample of Expert/Novice Codings Relevant to Units of Measure Analysis Strand (CK)  389
Table 26  Sample of Expert/Novice Codings Relevant to Units of Measure Analysis Strand (KoST)  390

LIST OF FIGURES

Figure 1  Measurement Exercise Very Similar to One Asked in the 1972-73 NAEP  45
Figure 2  Percentage of Students in Grades 3 and 7 Responding to a 2003 NAEP Item  47
Figure 3  Item from the Fourth NAEP  48
Figure 4  Diagram Shown to Preservice Teachers  51
Figure 5  ... Figure Having a Perimeter of 24 Units  54
Figure 6  ... That Increasing Perimeter Also Increases Area  59
Figure 7  Screenshot of Perimeter and Area Microworld with Several Options Selected  120
Figure 8  Screenshot from Shape Explorer Microworld Website  121
Figure 9  Screenshot from the Revised Shape Builder Microworld Website  122
Figure 10  Shape Builder Screenshot of a Rectangular Shape Automatically Generated by the Microworld While in ... Mode  124
Figure 11  Shape Builder Screenshot of Shape Automatically Generated While ...  124
Figure 12  Screenshot from Shape Builder Showing Error Message when an Invalid Shape is Created  125
Figure 13  Screenshot from the Shape Builder Microworld ... with the Shape Shown in Figure 9  125
Figure 14  Focus Problem Appearing at Beginning of Teaching Episode 1  130
Figure 15  Focus Problem for Teaching Episode 2  133
Figure 16  Focus Problem for the Third Teaching Episode  136
Figure 17  Piloted Item Used in Follow-Up Interview for Pattern Matching  182
Figure 18  Figure Introduced During First Interview  205
Figure 19  Grid Included as Part of Question 1 on the Pretest  209
Figure 20  Samples of Student Responses to Question 1 on the Pretest  210
Figure 21  Problem 3 from the Pretest  212
Figure 22  ...  214
Figure 23  Question 6 from the Pretest  224
Figure 24  Question 7 on the Pretest  227
Figure 25  Question 9 from the Pretest  230
Figure 26  ...  233
Figure 27  Regression Lines and Equations for Change in Case Subjects' CK and KoST  267
Figure 28  ...  268
Figure 29  ...  269
Figure 30  Regression Lines and Equations for Each Case Subject's Total Score  270
Figure 31  ...  271
Figure 32  ...  272
Figure 33  Problem 1 from the Posttest  284
Figure 34  ...  293
Figure 35  Question 9 from the Posttest  318
Figure 36  Question 9 from the Follow-Up Test  321
Figure 37  ... and KoST  394
Figure 38  ...  395
Figure 39  ... Constructed Response for a Figure with a Perimeter of 24 Units  396

PRESERVICE ELEMENTARY TEACHERS' PEDAGOGICAL CONTENT KNOWLEDGE RELATED TO AREA AND PERIMETER: A TEACHER DEVELOPMENT EXPERIMENT INVESTIGATING ANCHORED INSTRUCTION WITH WEB-BASED MICROWORLDS

Matthew S. Kellogg

ABSTRACT

Practical concepts, such as area and perimeter, have an important part in today's school mathematics curricula. Research indicates that students and preservice teachers (PSTs) struggle with and harbor misconceptions regarding these topics. Researchers suggest that alternative instructional methods be investigated that enhance PSTs' conceptual understanding and encourage deeper student thinking. To address this need, this study examined and described what and how PSTs learn as they engage in anchored instruction involving web-based microworlds designed for exploring area and perimeter. Its focus was to examine the influences of a modified teacher development experiment (TDE) upon 12 elementary PSTs' content knowledge (CK) and knowledge of student thinking (KoST) regarding principles, relationships, and misconceptions involving area and perimeter as they develop simultaneously in a problem-solving environment. The learning of meaningful mathematics is a personal and independent activity, as one struggles to create and reason through one's own mathematical realities and misconceptions. This study describes PSTs' reasonings, misconceptions, and difficulties as they grappled with new knowledge or reconciled new knowledge with prior understandings. Quantitative and qualitative research methods, including case-subject analysis, were used. Instructional sessions similar to Steffe's (1983) teaching episodes comprised this study's intervention.

Results indicate that prior to intervention most of the PSTs possessed a procedural knowledge of area and perimeter and were bound by a dependency on formulas; their KoST pertaining to area and perimeter was relatively underdeveloped. They seemed unaware of prevalent misconceptions students acquire while working with these concepts (specifically, units of measure and perceived relationships). The PSTs displayed an ineffective use of drawings to support their responses. Their preoccupation with finding what they judged as "the answer" to various problem-solving situations hindered their ability to properly diagnose and address student thinking and limited their meaningful interaction with the microworlds (MWs). A majority of PSTs felt the MWs were a valuable learning tool for themselves but not for their future students. The planned intervention played a role in the PSTs becoming more perceptive of the difficult mathematics involved with area and perimeter and better equipped to anticipate and address those difficulties with future students.

CHAPTER 1. INTRODUCTION

The notion that many students in elementary through high school struggle with understanding mathematical concepts has been sufficiently documented, as evidenced by performance on national and international assessments (Beaton et al., 1996; Kenney & Kouba, 1997; Rutledge, Kloosterman, & Kenney, 2009). A recent focus in mathematics education, however, has been on the difficulties that elementary inservice and preservice teachers have with the content they are expected to teach. Surveys of elementary preservice teachers report their feelings of apprehension and inadequacy about the mathematical content they will have to teach, as well as their inability to meet current expectations regarding the appropriate use of technology to aid and enhance that instruction (Abdal-Haqq, 1995; Ball, Lubienski, & Mewborn, 2001; Sanders & Morris, 2000; Swafford, Jones, & Thornton, 1997). In response to these and other concerns regarding the state of mathematics education in America, several leading organizations, including the National Council of Teachers of Mathematics (NCTM), the Mathematical Association of America (MAA), the National Research Council (NRC), and state and national governmental agencies, have issued reports and documents echoing the challenges, laying the framework, and outlining standards to improve mathematics education and the preparation of mathematics teachers (International Society for Technology in Education [ISTE], 1993; NCTM, 1989, 1991, 2000; NRC, 2000; U.S. Department of Education [USDOE], 2000). A common thread within the recommendations of these organizations is the importance placed on teachers of mathematics conceptualizing their content knowledge and being able to incorporate multiple approaches with which to apply that knowledge when teaching.

What follows describes a mixed methods study conducted within an intact methods of teaching elementary mathematics course, taught by the researcher. The study focuses on preservice teachers as they experience innovative, technology-based anchored instruction. The study emerges from a noticeable lack of research detailing instructional approaches for addressing the inadequate content knowledge of teachers, specifically on the topics of area and perimeter, as well as their limited perceptions of how and what students think regarding these concepts. This study suggests that such detail is needed if educators are to better understand how to intervene effectively in the mathematics training of teachers, to facilitate their knowledge growth, and ultimately to influence student learning.

Shulman (1986) outlines three categories of subject matter knowledge that a teacher of mathematics should possess: content knowledge (CK), pedagogical content knowledge (PCK), and curriculum knowledge. What a teacher knows and how that knowledge is used are critical elements of effective instruction. For this study, content knowledge was thought of as more than simply a collection of isolated facts and algorithms designed to produce correct answers; instead, it also included a repertoire of interconnected and meaningful concepts and procedures (Ball, 1990). Although preservice teachers' content knowledge may be addressed in the mathematics courses they take, pedagogical content knowledge is left relatively underdeveloped (Brown & Borko, 1992) and therefore needs to be a primary focus of methods courses. A research method called the teacher development experiment (TDE) (Simon, 2000) provided a means for studying the development of preservice teachers' pedagogical content knowledge (from both a psychological and social perspective) within a methods course. Domain-specific knowledge with respect to the pedagogical development of teachers of mathematics is currently lacking within the TDE research paradigm (Simon, 2000). This research study examined the specific concepts of area and perimeter and how preservice teachers' knowledge and understanding develop with respect to these concepts.

Dewey (1964) espoused that content and methods are inseparable in teacher education, warning that subject matter is sometimes regarded as if it were something quite irrelevant to method, and that when this attitude is even unconsciously assumed, method becomes an external attachment to knowledge of subject matter. This study therefore attends to both CK and knowledge of student thinking (KoST). Increased KoST, a critical facet of pedagogical content knowledge (Brophy, 1991; Fennema & Franke, 1992; Shulman, 1986) and a focal point of this study, has been shown to change significantly how teachers interact with students both mathematically and cognitively (Carpenter, Fennema, Franke, Levi, & Empson, 1999). Equally important is the role played by students within a mathematical learning environment. The NCTM Curriculum and Evaluation Standards (1989), Professional Standards for Teaching Mathematics (1991), and Principles and Standards for School Mathematics (2000) all share a vision in which students are actively involved in learning meaningful mathematics. Before elementary students can learn the mathematics necessary for a successful future, classroom teachers need to be prepared to deliver that content effectively. For this vision to become a reality, teachers need many opportunities to attain, enhance, and explore their mathematical content knowledge in new and challenging ways (ISTE, 1999-2008; NRC, 2001).

Integrating technology into the learning of mathematics has been shown to have positive effects on achievement, stimulate and enhance spatial visualization skills, and promote a more conceptual understanding of mathematics for students and teachers (Boers-van Oosterum, 1990; Dunham & Thomas, 1994; Groves, 1994; Rojano, 1996; Sheets, 1993). Research has shown that technology can be a valuable tool in promoting conceptual understanding of mathematics within preservice teachers (Keller & Hart, 2002; Wetherill, Midgett, & McCall, 2002), which lends support to a conceptual framework for appropriate uses of technology-supported mathematics activities (Garofalo, Drier, Harper, Timmerman, & Shockey, 2000; Samatha, Peressini, & Meymaris, 2004). It would seem appropriate, then, that technology play a vital role in helping achieve the desired and necessary reform recommendations. As recently as 2000, the NCTM stated in its Principles and Standards for School Mathematics that technology "is essential in teaching and learning mathematics; it influences the mathematics that is taught and enhances students' learning." Despite such endorsements, as well as affirming research, many topics in mathematics which lend themselves to the visually stimulating qualities of technology are continually learned and taught through memorization and algorithmic processes. In order to address the alleged deficiencies and bring about the recommendations for mathematics reform, new strategies for the delivery and learning of mathematical content need to be investigated. It would also seem reasonable and advantageous to expose preservice teachers to the same types of delivery methods that they are being challenged and encouraged to implement in their future classrooms.


Statement of the Problem

Teaching middle and high school mathematics for 12 years, combined with serving the last 10 years as a teacher educator, has revealed much to me regarding the mathematical understandings of both students and preservice teachers. An interesting, and somewhat troubling, realization has been that many of the preservice teachers I have worked with possess many of the same mathematical weaknesses and misconceptions (especially relating to measurement) as the classroom students discussed in the literature. To help combat such weaknesses, organizations such as the National Council of Teachers of Mathematics (NCTM, 1989, 2000) have advocated an increased emphasis on the teaching and learning of geometry at all levels; not just a traditional, procedural, and static view of geometry, but a dynamic and visually stimulating discovery of the practical, problem-solving world of geometry (NCTM, 2000). Schmidt (2008) reported that measurement topics such as area and perimeter were part of the mathematics curriculum for all the top-achieving countries based on the TIMSS mathematics assessment for seventh and eighth graders. These topics are part of a curriculum structure which appears to provide stability and a form of continuity across grades 1-8.

Geometry is a natural place for the development of visualization and spatial reasoning, which are valuable for many life skills (e.g., using maps, planning trip routes, approximating measurements, and designing landscapes). Geometric ideas are helpful in representing and solving many real-world situations. For example, when painting a house, various area formulas must be applied correctly when deciding how much paint to buy. The abilities to visualize, interpret, and properly represent measurement concepts are valuable skills for success in mathematics and in life (Clements & Battista, 1992). Despite the practical value of and emphasis placed upon measurement topics such as area and perimeter, there is considerable research indicating that school students have an inadequate understanding of them (Beaton et al., 1996; Clements & Ellerton, 1996; Hart, 1987, 1993; Kenney & Kouba, 1997; Kouba, Brown, Carpenter, Lindquist, Silver, & Swafford, 1988). Research also reveals that preservice and classroom teachers possess various degrees of misunderstandings regarding concepts surrounding area and perimeter (Menon, 1998; Reinke, 1997; Simon & Blume, 1994a; Tierney, Boyd, & Davis, 1990; Woodward & Byrd, 1983). In several of these studies, participants' understanding of student thinking regarding area and perimeter was severely lacking. What and how teachers teach is greatly influenced by their comfort levels regarding the mathematics they teach (Ball & McDiarmid, 1989). What is also troubling is the lack of research exploring interventions designed to challenge and address area and perimeter shortcomings among preservice teachers. The opportunity for preservice teachers to reexamine and learn about familiar mathematics topics within new environments has the potential to turn the tide on the downward spiral described above.

Meeting the ongoing challenge of finding ways to effectively integrate content and methods within mathematics methods courses for elementary preservice teachers (PSTs) is also a priority of this research. Microworlds are a technology-based learning environment that facilitates exploring alternatives, testing hypotheses, and discovering facts regarding a specially designed context. An instructional strategy well suited to utilizing such an environment is anchored instruction. The major goal of anchored instruction is to develop useful and meaningful knowledge by designing learning around an anchor, a situated context that centers on solving problems that are of interest to the students (Cognition & Technology Group at Vanderbilt [CTGV], 1991). The latter provided the setting for this study. Anchored instruction may be a dynamic delivery method for geometric content, and the use of such instructional approaches in the classroom has been strongly encouraged (NCTM, 2000). The impact of anchored instruction upon preservice teachers' mathematical knowledge and their ability to apply that knowledge requires greater exploration. PSTs need many experiences with these new delivery methods to help them develop conceptual understandings of the content being delivered, to see and experience appropriate uses of technology in the teaching and learning of mathematics, and to help instill greater confidence for their future use (Chinnappan, 2000; Connors, 1997). However, there is scant research examining the different influences of anchored instruction upon preservice teachers' mathematical content knowledge or their knowledge of student thinking.
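To make concrete the kind of area and perimeter misconception at issue in this problem statement, consider a simple worked example (added here for illustration; it is not drawn from the studies cited above). A 4-unit-by-4-unit square and a 1-unit-by-7-unit rectangle have the same perimeter but very different areas:

    4 by 4 square:    perimeter = 2(4 + 4) = 16 units;  area = 4 x 4 = 16 square units
    1 by 7 rectangle: perimeter = 2(1 + 7) = 16 units;  area = 1 x 7 = 7 square units

A learner, or a teacher, who believes that a fixed perimeter determines a fixed area, or that area and perimeter must change together, cannot reconcile that belief with even this small pair of figures; such perceived relationships are among the documented misconceptions this study addresses.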


Purpose of the Study

The purpose of this study was to examine the influence of anchored instruction with geometry microworlds upon preservice teachers' levels of content knowledge and KoST related to area and perimeter. In particular, it focused on their understandings, misconceptions, written and verbal explanations of that knowledge, and achievement on written area and perimeter tests within the context of a mathematics methods course for PSTs. Previous research has shown that preservice elementary teachers have contextual and conceptual shortcomings regarding area and perimeter, and because the majority of this research has focused on revealing and measuring such misconceptions, little is known about the underlying causes of these misconceptions, how they may interfere with PSTs' ability to diagnose and respond to student thinking, or whether alternative instructional methods may help alleviate the area and perimeter misconceptions that PSTs have. In short, this study served three purposes: (a) to further understand PSTs' cognitions of area and perimeter and how they change and develop through planned intervention, (b) to examine the interplay between PSTs' CK and their KoST, and (c) to develop and describe the use of anchored instruction that integrates web-based microworlds designed for exploring perimeter and area as a potential learning environment for influencing PSTs' CK and KoST.

Conceptual Framework

There is considerable research indicating that students have an inadequate and procedurally based understanding of the concepts of area and perimeter (Beaton et al., 1996; Clements & Ellerton, 1996; Hart, 1987, 1993; Kenney & Kouba, 1997; Kouba, Brown, Carpenter, Lindquist, Silver, & Swafford, 1988; Rutledge, Kloosterman, & Kenney, 2009). Research also reveals that preservice and classroom teachers possess varying degrees of misunderstandings regarding these same concepts (Menon, 1998; Reinke, 1997; Simon & Blume, 1994a; Tierney, Boyd, & Davis, 1990). The methods coursework and teaching practicum provide preservice teachers with much needed theoretical and practical experiences; however, opportunities for preservice teachers to carefully investigate the mathematical content that students find difficult, reflect upon why they find it difficult, and then plan appropriate intervention and follow-up appear to be lacking.

An emerging methodology for studying the development of teachers is the teacher development experiment (TDE) (Simon, 2000). This methodology builds on the central principle of the constructivist teaching experiment (Cobb & Steffe, 1983; Steffe & Thompson, 2000), that is, that knowledgeable and skillful researchers can study teacher development by fostering development as part of a continuous cycle of analysis and intervention. Simon (2000) presents the TDE methodology as an adaptation and extension of two groundbreaking research approaches: the constructivist teaching experiment (Cobb & Steffe, 1983; Steffe & Thompson, 2000) and, later, the whole-class teaching experiment (Cobb, Yackel, & Wood, 1993; Cobb, 2000). The constructivist teaching experiment is used to collect and coordinate accounts of individual students' learning in particular areas of mathematics (Simon, 2000). The teaching experiment is primarily an exploratory tool directed towards understanding the progress students make while learning particular mathematical concepts over an extended time (Steffe & Thompson, 2000), and it has typically been conducted with students rather than with teachers. The TDE begins with an instructional issue that the teacher/researcher is striving to resolve (Simon & Tzur, 1999). In this study, the issue was that of finding mediums to effectively blend the presentation of content and methods.

The contributions of the whole-class teaching experiment reside in attempting to understand mathematical learning as it occurs in the social context of the classroom (Cobb, 2000). It is common practice for the whole-class teaching experiment to expand the teaching experiment to include analysis of classroom social norms, sociomathematical norms, and students' mathematical beliefs and values (Cobb, 2000). This expansion of a teaching experiment to include these social aspects, however, may result in sacrificing some details of individual (psychological) understandings and development (Simon & Blume, 1994b). The goals of this study could not allow for such potential sacrifices, and thus a conscious effort was made to minimize the methodological influences of the whole-class teaching experiment. Admittedly, the social interactions occurring within a classroom can play a role in learning, but they were not a focus of analysis in this study. Although the teaching experiment and whole-class teaching experiment focus primarily on mathematical development within classroom communities consisting of students and a teacher, the TDE is concerned with an additional academic community: the teacher educator and a group of teachers or preservice teachers. For Simon (2000), the aim of the TDE is to generate increasingly powerful understandings of teacher development. In this study, the preservice teachers endeavor to develop content knowledge (CK) and knowledge of student thinking (KoST), as related to area and perimeter, that is beyond what they already possess, and the TDE makes it possible to examine how each preservice teacher goes about resolving conflicts in current knowledge and incorporating new knowledge (i.e., both of content and of student thinking about that content), thus addressing the instructional issue presented earlier.

The development of the TDE employed in this study is based on the interplay of four main constructs. First, and foremost, it is built around the major tenets of anchored instruction, which, to summarize briefly, involves facilitating the learning of new knowledge anchored in a context of meaningful activities that are supported collaboratively (CTGV, 1990, 1991, 1992, 1993). Second, it is guided by Shulman's (1987) cycle of pedagogical reasoning and action. Third, Wales and Stager's program for problem solving called Guided Design provides a model for the social interaction between myself (the researcher) and the participants (preservice teachers), and among the participants themselves. Finally, the framework is supported by current thinking about the benefits that technology, particularly web-based microworlds, suggests for student learning of mathematics. This notion is firmly supported and guided by a meta-analysis examining effective instructional techniques (Marzano, 1998). Specifically, this study examined the influence of anchored instruction that incorporates geometry microworlds on enhancing and deepening particular facets of PSTs' pedagogical content knowledge, namely content knowledge and knowledge of student thinking. The assumption is that enhanced content knowledge, combined with appropriate intervention, will result in a more conceptually developed knowledge of student thinking. Although other pertinent dimensions of PCK exist, this study specifically examined two of them: content knowledge and knowledge of student thinking. Below, I describe each component of the framework that guided the development and execution of this study.


Anchored Instruction

Cognitive psychologists claim that knowledge is formed when small chunks of information are woven together within a contextual framework (Klock, 2000). Anchored instruction can scaffold an environment in which knowledge can be formed in that manner. Cobb, Yackel, and Wood (1992) state that there is a disconnect between how mathematics is typically taught and how students actually learn it, and that a constructivist instructional approach can help address this dilemma. Although they were talking about students in the classroom, their statement is very relevant to the typical mathematical instruction received by elementary preservice teachers (Ball, 1988; Ball & Bass, 2000). Anchored instruction is grounded in and derived from constructivist theories of knowledge and is a specific application of situated cognition. It is a research-based paradigm for learning through technology-assisted problem solving developed by the Cognition & Technology Group at Vanderbilt (CTGV) under the leadership of John Bransford, whose members derived their insights from the work of Dewey (1933) and Hanson (1970). Anchored instruction is a model that emphasizes the creation of an anchor, or focus, typically technology-based, around which problem-solving instruction is situated (Ellefsen & Hall, 1994, p. 131). Videodiscs have often been used to provide an environment to anchor instruction and problem solving to a meaningful context, as is the case with the Vanderbilt Group; however, research has shown that the appropriate choice of the anchor while implementing anchored instruction is more important than media attributes in the teaching of problem solving (Shyu, 1999). This study involved actively engaging preservice teachers in thinking about and planning for misconceptions in mathematics (a realistic and
relevant activity). To help facilitate this activity, the context (or anchor) was situated within a learning environment whose instructional sequence explored documented student misconceptions regarding area and perimeter (the authentic content). Geometry microworlds, specifically designed for the mathematical content in this study, provided the dynamic environment to help participants focus on the relevant features of the problem-solving activities.

Format for Instructional Sequence

An instructional goal of developing the participants' content knowledge before addressing their knowledge of student thinking is supported by the literature. Bransford, Vye, Kinzer, and Risko (1990b) acknowledge the critical role that content knowledge plays in thinking and problem solving. According to Shulman (1987), developing pedagogical reasoning and action for effective teaching involves a cycle which begins with Comprehension and Transformation. Shulman proposes that understanding must occur before teaching can take place. Comprehension includes understanding critically a set of ideas to be taught, when possible, in more than one way. Once ideas are comprehended, they must be transformed in some manner before they can be taught and learned by students. An important aspect of this study is the planned development and transformation of content knowledge into knowledge of student thinking, a necessary pedagogical tool. Other research suggests that PCK needs to be built upon other forms of professional knowledge (e.g., content knowledge) (Rowan, Schilling, Ball, & Miller, 2001). In addition, features were implemented to provide a model through which I observed, discussed with, and interviewed participating preservice teachers as they explored and
wrestled with concepts individually and cooperatively with peers. The model includes: (a) introducing (verbally) an interesting problem and a general framework (which included a microworld) for solving the problem, (b) providing time for participants to individually generate and test their own strategies, (c) providing participants time to work cooperatively with peers, and (d) comparing their solutions to the strategies used and conclusions attained by an expert (the researcher and supporting research literature). The above processes are not meant to imply that transforming content knowledge into pedagogical content knowledge occurs within a set of fixed stages, phases, or steps. Instead, teacher education can only attempt to provide preservice teachers with the understanding, performance abilities, and a setting in which to develop the tools they will need to teach effectively.

Technology Integration

Other aspects of the intervention used in this study were supported by a meta-analysis of research on instruction performed by Marzano (1998). Based on the findings of over 100 research studies, Marzano identified instructional techniques that had a positive, significant impact on mathematical achievement. Specifically, four of those instructional techniques were shown to have an effect size greater than one and are especially pertinent to research involving instruction that incorporates the use of microworlds. The instructional techniques involve (a) having students represent new knowledge in image-based representations, (b) using computer-based manipulatives to explore new knowledge and practice applying it, (c) generating and testing hypotheses about new knowledge, and (d) modeling new concepts to students in a direct fashion followed by having the students apply the concepts to different situations.
All four of these practices were utilized as part of the teaching experiment. Web-based microworlds provided the environment in which these instructional techniques could be utilized. Appropriate uses of technology have been shown to stimulate and promote a conceptual understanding of mathematics within preservice teachers (Keller & Hart, 2002; Wetherill, Midgett, & McCall, 2002), which also lends support to a theoretical framework for appropriate uses of technology-supported mathematical activities (Garofalo, Drier, Harper, Timmerman, & Shockey, 2000; Samatha, Peressini, & Meymaris, 2004). Microworlds provide such an environment. The epistemology underlying microworlds is derived from constructivism (Jonassen, 1991b); however, microworlds can also support goal-oriented environments in which learning occurs through discovery and exploration (Rieber, 1992). Rieber explains that one way to reach this compromise is by incorporating aspects of guided discovery into the learning activity, which would naturally be constrained by the boundaries imposed by a particular microworld. Microworlds, functioning as cognitive tools (i.e., open-ended learning environments), have been shown to assist in the learning of powerful and fundamentally different mathematics (Jonassen & Reeves, 1996; Pea, 1986), enhance student thinking (Lederman & Niess, 2000), support cognitive processes such as logical reasoning and hypothesis testing (Lajoie, 1993), provide specific feedback appropriate to guide the learning of new material (Roblyer & Edwards, 2000), and encourage the exploration of mathematical ideas (Jensen & Williams, 1993). In summary, research provides a strong basis for the belief that anchored instruction that integrates web-based microworlds and provides opportunity for students to be immersed in a community of learners has the
potential to enhance content knowledge and move it along the continuum of transformation into a useful knowledge of student thinking.

Research Questions

This study described and presented findings regarding an instructional approach that incorporates a form of anchored instruction (The Cognition and Technology Group at Vanderbilt [CTGV], 1992) in which area and perimeter microworlds assisted in providing a rich and dynamic learning environment for both an individual and cooperative approach to situated problem solving. The primary research question examined by this study was: In what ways do preservice elementary teachers' content knowledge and pedagogical content knowledge related to area and perimeter change as a result of experiencing anchored instruction integrated with web-based microworlds designed for exploring those concepts? In particular:

1. What is the preservice teachers' content knowledge regarding area and perimeter prior to involvement in the teaching episodes?
2. What is the preservice teachers' knowledge of student thinking regarding area and perimeter prior to involvement in the teaching episodes?
3. How does content knowledge regarding area and perimeter change, if at all, during the course of this study?
4. How does the knowledge of student thinking regarding area and perimeter change, if at all, during the course of this study?
5. In what ways, if at all, is the preservice teachers' knowledge of student thinking regarding area and perimeter related to their content knowledge of those same concepts?
Definitions

The following is a list of the terms that will be used throughout this study:

Pedagogical content knowledge: A kind of content knowledge that is useful for teaching. It includes the ways of representing and formulating the subject that make it comprehensible to others; an understanding of what makes the learning of topics easy or difficult; and the concepts and preconceptions that students of different ages and backgrounds bring with them to the learning of those topics (Shulman, 1986).

Content knowledge: A facet of PCK that refers to the amount and organization of facts and concepts, including an explanatory framework, about a subject in the mind of a teacher, as well as why those facts and concepts are true (Shulman, 1986).

Knowledge of student thinking: A facet of PCK that involves organizing content knowledge in ways that allow a teacher to recognize how students think within specific content areas and to appropriately address any shortcomings or misconceptions (Swafford, Jones, & Thornton, 1997).

Procedural knowledge: Many theories of learning and development indicate that procedural and conceptual knowledge lie on a continuum. For this study, they will be separated into the two ends of the continuum. Procedural knowledge will be defined as the ability to execute sequential actions in performing mathematical rules, algorithms, or procedures; typically it involves knowing HOW but not usually WHY.

Conceptual knowledge: A generalizable knowledge that goes beyond isolated facts, procedures, and the words themselves. Someone possessing conceptual understanding has knowledge that is organized, connected, and capable of being communicated in a meaningful way. (A brief illustration of this procedural-conceptual distinction follows this list of definitions.)
Inert knowledge: Knowledge that can usually be recalled when someone is specifically asked to do so but is not available to use spontaneously in a problem-solving situation.

Manipulative: A concrete or symbolic artifact that students interact with to facilitate a deeper understanding of an abstract concept.

Applet: A small stand-alone version of a computer program or application designed to run on the Internet within a Web browser (e.g., Internet Explorer) and commonly used to add interactivity to websites.

Microworld: A term coined at the MIT Media Lab Learning and Common Sense Group. It means, literally, a tiny world inside which a student can explore alternatives, test hypotheses, and discover facts that are true about that world (e.g., relationships between mathematical concepts such as area and perimeter) (Retrieved July 26, 2006, from: http://www.umcs.maine.edu/~larry/microworlds/microworld.html).

Anchored instruction: A model of instruction that emphasizes the creation of an anchor, or focus, typically technology-based, around which problem-solving instruction is situated (Ellefsen & Hall, 1994, p. 131).

Situated cognition: The notion that cognition is not confined to the individual but is connected to social activity and the environment that best reflects the way in which the knowledge will be used (Collins, 1991).
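As a brief illustration of the procedural-conceptual distinction defined above (a constructed example, not an item drawn from this study's instruments), consider a 3 cm by 5 cm rectangle:

Area = length × width = 3 cm × 5 cm = 15 square centimeters
Perimeter = 2 × (length + width) = 2 × (3 cm + 5 cm) = 16 centimeters

Procedural knowledge is the ability to carry out these two computations correctly. Conceptual knowledge additionally includes recognizing why the area formula works (the rectangle can be covered by 15 one-centimeter squares arranged in 3 rows of 5) and that area and perimeter measure different attributes: space covered, in square units, versus distance around, in linear units.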

CHAPTER 2

REVIEW OF THE LITERATURE

The purpose of this study was to examine the changes, if any, in preservice teachers' content knowledge and knowledge of student thinking related to concepts and misconceptions regarding area and perimeter, their written and verbal explanations of that knowledge, and their achievement on written area and perimeter tests after experiencing anchored instruction with geometry microworlds. This chapter is organized into three main sections of research. The first section provides an overview of knowledge domains useful for teaching while focusing on two specific domains (i.e., content knowledge and knowledge of student thinking). The second section examines student and teacher knowledge and understanding of area and perimeter. The third section contains a brief summary of the role of technology in preservice teacher education and its effect on learning, followed by a discussion about anchored instruction and microworlds.

Writing about PSTs also involves writing about students and teachers. To avoid confusion, in this study I use the term preservice teacher (PST) to refer to a college student studying mathematics either as one of several subjects that will be taught (as with an elementary teacher) or as the only subject to be taught (typically future secondary teachers). Unless otherwise specified, the term student will refer to students from Kindergarten to the end of secondary school. The term teacher will refer to someone who has graduated from
college and teaches mathematics at the elementary, middle, or secondary level.

Knowledge Domains and the Craft of Teaching

There is little doubt that what a teacher knows impacts what is done in the classroom and ultimately what students learn (Fennema & Franke, 1992; Hill, Rowan, & Ball, 2005). It would seem reasonable, then, for those involved with teacher education to attend to the knowledge needed to teach, as well as to the ability to conceptualize and communicate that knowledge. However, there is very little consensus when it comes to defining what critical knowledge is needed to ensure that students learn mathematics. Many types of knowledge useful for teaching have been identified. For example, there is general pedagogical knowledge, content knowledge (also referred to as subject matter knowledge), pedagogical content knowledge (which encompasses knowledge of student cognitions and knowledge of curriculum and school contexts), and knowledge of learners and their characteristics, beliefs, and attitudes (Manouchehri, 1997; Shulman, 1986). This study focused on two of these knowledge types: content knowledge and knowledge of student cognitions, which will be referred to here as knowledge of student thinking. Researchers such as Brophy (1991), Fennema and Franke (1992), and Shulman (1986) have identified these two components of teacher knowledge as critical in the teaching and learning process. Research has clearly documented that many novice teachers, especially elementary, struggle to varying degrees with the content they must teach, including: multiplication and place value (Ball, 1988; Ma, 1999; Steinberg, Haymore, & Marks, 1985), division (Ball, 1990; Post, Harel, Behr, & Lesh, 1991; Simon, 1993), fractions (Khoury & Zazkis, 1994;
Lehrer & Franke, 1992; Leinhardt & Smith, 1985), functions and graphing (Even, 1993; Wilson, 1994; Stein, Baxter, & Leinhardt, 1990), geometry and measurement (Baturo & Nason, 1996; Heaton, 1992; Simon & Blume, 1994a), and proof (Ball & Wilson, 1990; Ma, 1999; Martin & Harel, 1989). Each of these areas represents subject matter that needs increased attention as part of teacher education. Rather than focusing on the results of those individual studies, this portion of the literature review examines the difficulties teachers experience when they teach without a conceptual content knowledge, the cognitive issues that surround these difficulties, and approaches used to address these difficulties.

Content Knowledge and Pedagogical Content Knowledge

Before the literature is reviewed, it is important to delineate clearly the knowledge domains that will be discussed. Content knowledge (a facet of pedagogical content knowledge) consists of the amount and organization of facts, concepts, and principles, including an explanatory framework, about a subject in the mind of a teacher, as well as why those facts and concepts are true (Shulman, 1986). Different subject matter areas all have content structures that must not only be learned by teachers but also be made clear, represented well, and categorized in useful ways. Teachers need to be able to explain why certain truths are accepted and even how those truths relate to subject matter outside the domain being discussed. Content knowledge valuable for teaching should ideally span the full range of understanding a teacher must draw upon when interacting within the classroom environment (Ball, 2003). Content knowledge is an integral part of teaching, and a lack of it will very likely affect the quality of instruction (Grossman, Wilson, & Shulman, 1989) and ultimately student learning (Fennema & Franke, 1992). There is
considerable research pertaining to various aspects of teachers' content knowledge (or subject matter knowledge), but this review will primarily focus on efforts to enhance preservice teachers' content knowledge. The term pedagogical content knowledge was originally used by Lee Shulman to describe a domain of teacher knowledge that goes beyond a mere knowledge of subject matter (mathematics, for example) to a dimension of content knowledge that is usable for teaching and learning. Pedagogical content knowledge (which includes knowledge of student thinking) facilitates the effective teaching of subject matter. It involves the most useful forms of representations of ideas, analogies, illustrations, examples, and explanations (Shulman, 1986). PCK can be defined as an understanding of how to represent specific topics in ways appropriate to the diverse abilities and interests of the learners (Grouws & Schultz, 1996). It has been described as the seamless interweaving of subject matter and pedagogy useful for teaching and learning (Ball & Bass, 2000).

Characterizing PCK

What makes a teacher an expert? Expertise in mathematics instruction develops over many years and takes on many different forms. Two critical areas that must be under ongoing construction while on the road to becoming an expert are knowledge about content and knowledge about students (Ball, Lubienski, & Mewborn, 2001; Fennema & Franke, 1992). Both these categories of knowledge are specific dimensions of pedagogical content knowledge (Shulman, 1987). When discussing mathematical
content knowledge, researchers often use the terms procedural and conceptual to denote a distinction between two forms of content knowledge (Eisenhart et al., 1993). As commonly used, procedural knowledge refers to mastery of symbolic representations, computational skills, and knowledge of procedures for identifying and working with various mathematical components, algorithms, and definitions. For example, a student with procedural knowledge of division of fractions will know the steps for writing down the problem and performing the division algorithm (first, invert the divisor and then multiply the two fractions). Teaching a procedural knowledge for the division of fractions is exemplified by presenting a step-by-step procedure for producing an answer, often accompanied by strategies for remembering the steps of the algorithm. Such approaches, which emphasize memorized steps over meaning, are troubling on many levels. Any teacher who relies on such instructional strategies, whatever their level of experience, is exhibiting an essentially procedural knowledge of mathematical content and pedagogy.

Novice PCK

Preservice elementary teachers (including student teachers) are obviously considered novices. As mentioned earlier, there have been many studies documenting the ways in which novice teachers struggle with the mathematical content they must teach. However, less has been said about how novice teachers' knowledge influences their thinking about student thinking and subsequently their instructional decisions. Borko et al. (1992) studied eight senior preservice elementary teachers who had selected mathematics as a concentration and were intending to teach
middle school. They reported extensively about one specific preservice teacher called Ms. Daniels. Even though Ms. Daniels had the strongest mathematics background of any participant, that knowledge did not apparently serve her well when she was forced to make instructional decisions in front of students. Teaching situations revealed a limited repertoire of instructional representations. She was unable to generate meaningful examples in response to students' questions; for whatever reason, she apparently had not acquired the words, mental pictures, or the conceptual knowledge needed to produce an adequate explanation during whole-class instruction. Mapolelo (1999) had similar results while studying the PCK of three prospective middle school mathematics teachers. Their strong mathematics background did not apparently transfer directly into a classroom-ready pedagogical content knowledge. When given the opportunity to teach, all of the student teachers in the study resorted exclusively to a lecture method that was procedural and explanation oriented. In most cases their explanations, although accurate, focused on procedures and did not encourage the students to connect mathematical concepts. The student teachers expressed confidence regarding the mathematical content they would be teaching; however, their content knowledge did not appear sufficiently supported by PCK to facilitate flexible, responsive teaching. They had difficulties responding to student questions and seemed ill equipped to design meaningful activities that would enhance conceptual understanding. It does not appear that increased mathematics training (i.e., content knowledge)
alone will develop or enhance pedagogical content knowledge. Meredith (1993) found that even for preservice elementary teachers with substantial subject matter knowledge, that knowledge did not seem to translate into understanding how students think about and learn mathematics or into predicting common difficulties. Mapolelo (1993) reported that some middle grades student teachers, even though possessing extensive mathematics backgrounds, also lacked the ability to anticipate misconceptions that students might have regarding learning the concepts at hand. It seems apparent that research is needed to explore avenues to better equip preservice teachers with knowledge regarding the common misconceptions children have about elementary mathematics and how best to address them.

Expert-Novice PCK Differences

Borko et al. (1992) reported that novice teachers are very concerned about their limited pedagogical content knowledge and the impact such a shortcoming may have on teaching and learning. Research also indicates that the PCK acquired by novice teachers is primarily procedural in content and application (Ball & Wilson, 1990; Fuller, 1996). Teachers possessing conceptual understanding of mathematics interact with both content and students in fundamentally different ways. Conceptual understanding involves knowledge of the underlying structure of mathematics, how various concepts connect, and a realization of the various relationships between ideas that facilitate meaningful explanations of mathematical procedures (Eisenhart et al., 1993). In the case of division of fractions, conceptual knowledge would include discussing the nature of fractions in general as well as specifics regarding the fractions to be divided. The meaning of division would be investigated, often exemplified by using concrete and semi-concrete models
(e.g., Cuisenaire rods, Hershey bars, paper folding, or drawings). The expert teacher exhibits a greater propensity towards incorporating such learning tools into instruction. Fuller's (1996) qualitative research suggested that experienced teachers seem to possess a greater conceptual understanding of certain mathematical topics than their preservice counterparts. An example of such knowledge was the fact that the classroom teachers were much more likely to suggest using manipulative materials to help students understand mathematical concepts, as opposed to the procedure-laden responses of preservice teachers. One shortcoming of Fuller's (1996) study is the vagueness with which some of the findings are reported. It appears a lack of substantive follow-up (possibly interviews) to the instrument used, the Survey on Teaching Mathematics (Rich, Lubinski, & Otto, 1994), lent itself to this vagueness. For example, one of the expert teachers participating in the study indicated that she would use pictures or manipulatives (p. 25) in response to a survey question involving a student who had a mathematical misconception. Although that response does seem to indicate a tendency toward conceptually based instructional strategies, the reader is left to wonder exactly what pictures or manipulatives would have been used and why. Other researchers have reported the conceptual approaches of expert teachers. Mitchell and Williams (1993) observed expert teachers incorporating technology to promote a focus on understanding content and process more than twice as often as their novice counterparts. Expert teachers not only present content differently than novices, but their more developed PCK enables them to more thoroughly synthesize mathematical material for the purpose of review. Livingston and Borko (1990) investigated how
secondary mathematics student teachers prepared for and conducted review lessons as compared with their expert cooperating teachers. Review lessons provide a unique opportunity for a teacher to blend content knowledge and knowledge of student thinking in a setting that often includes improvisation. The main difference between the novice and the expert appears to be one of focus. Livingston and Borko (1990) reported that the novices tended to focus narrowly on the content and task at hand. The expert teacher has more extensively developed schemata for PCK that include more inclusive planning, a greater repertoire of explanations and representations, and knowledge of common errors and misconceptions. Novice teachers, on the other hand, seem to have a limited PCK about students: how they learn the subject matter, the common errors they make, and the misconceptions they harbor. Although some instructional settings (e.g., reviewing for an exam) can produce clear distinctions between the expert and novice teacher, certain content areas appear to be troublesome to both. Fractions seem to elicit procedural approaches to teaching and learning by both novice and experienced teachers (Fuller, 1996). In such cases performance and getting right answers take priority over understanding. Instructional strategies involving certain mathematical topics (e.g., knowledge of fractions) also reveal varying levels of conceptual understanding among expert teachers (Leinhardt & Smith, 1985). Perhaps teachers need to revisit difficult concepts and reflect upon their teaching practices in the hopes of transforming procedural approaches into conceptual ones. Procedures are a necessary part of mathematics; however, conceptual teaching would present a web of connected ideas encompassing fractions with the intent to help students understand how and why
mathematical procedures produce right answers. Brown and Borko (1992) argue that without a conceptual understanding of mathematical ideas, teaching mathematics from a conceptual perspective is inconceivable. To be considered complete, a mathematics education should include aspects of both procedural and conceptual knowledge; there is no serious conflict in their development or implementation (Ma, 1999). Thus, if the goal is to teach for mathematical understanding, then the teacher must incorporate instruction that facilitates the development of mathematical procedures within a framework of conceptual understanding (Wearne & Hiebert, 1988). The expert teacher understands that procedures in mathematics should always be accompanied by conceptual representations (Hiebert & Carpenter, 1992). The importance of equipping pre- and inservice teachers with PCK useful for teaching cannot be overstated. Grossman (1991) articulates the importance of this domain of knowledge for the teaching and learning of mathematics:

If teachers are to guide students in their journey into unfamiliar territories, they will need to know the terrain well. Both knowledge of the content and knowledge of the best way to teach that content to students help teachers construct meaningful representations, representations that reflect both the nature of the knowledge and skills. (p. 203)

Reforming Pedagogical Content Knowledge

The knowledge needed to teach is uniquely different in both content and purpose from the knowledge possessed by non-teaching peers. According to Shulman (1987):

The key to distinguishing the knowledge base of teaching lies at the intersection of content and pedagogy, in the capacity of a teacher to transform the content
knowledge he or she possesses into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by students. (p. 15)

It would be hard to question the importance of developing expert teachers who possess a powerful and flexible pedagogical content knowledge; however, there are many opinions regarding what activities can develop such knowledge. Feiman-Nemser and Buchmann (1986) argue that novice teachers do not acquire pedagogical content knowledge until they are faced with the challenges of actual classroom teaching. In lieu of personal experiences, which are not always possible or expedient, there are several recommendations. Ball and Bass (2000) encourage using opportunities to learn content that either simulate or are situated in the contexts in which subject matter is used. For example, some teacher educators use activities that prompt pre- and inservice teachers to revisit the content themselves (Barnett, 1998; Schifter, 1998). Other researchers and teacher educators promote the use of video clips depicting exceptional classroom lessons or cases of classroom episodes as a means of fostering the development of PCK (Kellogg & Kersaint, 2004; Lampert & Ball, 1998). Reflecting upon previously learned content knowledge and the context in which it was learned has been suggested as a valuable platform from which to attempt the transformation of PCK (Meredith, 1993). There seems to be a building consensus that developing PCK should occur simultaneously with the development of CK (Good & Grouws, 1987; Stacey et al., 2001) and that without adequate CK the acquiring of PCK is severely hampered (Hutchison, 1997). Zeichner and Tabachnick (1981) state that unless teacher education seeks to reform the content knowledge of its preservice teachers along with their pedagogy, the lasting
effects of methods classes will be weak. Brown and Borko (1992) would seem to agree when they argue that:

Unless novice teachers experience good mathematics as students, see it modeled by teachers they respect, and are situated in a culture of teaching that accepts and practices good teaching, it will be difficult for them to implement and maintain good teaching in their classrooms. (p. 227)

Innovative Interventions for Pre- and Inservice Teachers

Developing Meaningful Content Knowledge

As stated earlier, there is no shortage of research documenting that preservice teachers, especially elementary, struggle with the mathematical content they must teach. Sadly, many preservice teachers are not willing to take personal responsibility for their mathematical shortcomings. Sanders and Morris (2000) reported that the majority of the preservice elementary teachers in their study offered excuses, ranging from technical terminology to non-coverage at their school, for their knowledge deficits regarding the elementary mathematics they must teach. Some preservice teachers were embarrassed by poor test results and felt inadequate to tackle their lack of content knowledge. Fortunately, other evidence suggests that improvements in areas of content deficiency can be made. Preservice teachers' content knowledge has previously been thought to be developed adequately in university mathematics courses (Brown & Borko, 1992), but researchers are now recommending that it should be addressed in methods courses from a different perspective (Manouchehri, 1997). Ball (1990) contends that mathematics methods courses can change not only the pedagogy of preservice teachers but also their mathematical knowledge if the course is constructed with that as a goal.
Mathematics methods courses have been the setting for several studies aimed at developing preservice teachers' content knowledge into a form better suited for the classroom. Constructivist approaches to learning are often preferred by teacher educators. They can be useful in encouraging preservice teachers to investigate and, more importantly, challenge their prior learning and then promote the reconstruction of incorrect or weak mathematical ideas (Cobb, 1987). Stoddart, Connell, Stofflett, and Peck (1993), following constructivist principles, developed a five-week conceptual change content unit on rational numbers to investigate ways of improving elementary preservice teachers' understanding of that content. Qualitative methods (i.e., interviews) were used to evaluate change in content understanding as a consequence of the conceptual change instruction. Although the findings indicated a substantial improvement in the content knowledge of the preservice teachers (n = 18) who received conceptual change instruction, a few limitations should be reported. The study offered no description of the posttest (i.e., were the items the same or parallel?), and no interview samples (or vignettes) were provided to show that changes in responses were a consequence of the conceptual change instruction; the method of analysis was also not described. Although Stoddart et al.'s findings were promising, the short duration of the study (5 weeks) and small sample size suggest a need for further work with larger samples investigating the influences of longer interventions integrating mathematical content into methods courses. Quinn's (1997) research extended aspects of Stoddart et al. (1993) by integrating the study of mathematical content throughout a semester-long methods
course. Quinn did not use a conceptual change model but did design his elementary mathematics course around constructivist-based recommendations. An open classroom atmosphere was established where student questions were encouraged and valued, and learning activities were designed for participants to engage in hands-on, cooperative work. The course stressed the importance of instilling in children a conceptual understanding of mathematics. A test devised to measure conceptual and intuitive understanding of mathematics was used for both the pre- and posttests. A correlated groups t test comparing the preservice teachers' pretest and posttest scores was significant, t(26) = 4.1, p < .001, indicating that the participants' meaningful knowledge of mathematical content increased significantly during the course, albeit with a small sample size. Of the many content areas addressed in the course, geometry was one of the most troubling for the preservice elementary teachers, even after the semester-long intervention. Quinn's results would seem to suggest that changes in the mathematical content courses taken by preservice teachers could further enhance their conceptual understanding of the mathematics they must teach. McGowen and Davis (2002) partially addressed this issue by conducting a case study of one of the forty-six participants enrolled in a specially designed mathematics content course for preservice elementary teachers. A preservice teacher named Holly was selected for study because of her unique combination of very poor computational skills and outstanding higher order thinking skills. Analysis of her scores on three administrations, spread out over the course of a semester, of a 30-question paper-and-pencil competency exam of basic arithmetic computation (her scores were 20%, 50%, and 87%), along with interview data, revealed noticeable growth of her mathematical understanding. McGowen
and Davis argue that preservice elementary teachers are in need of a mathematics foundation to build upon before they will be able to think about how to use their mathematical knowledge in the classroom. In other words, a strong foundation in content knowledge is essential to constructing pedagogical content knowledge truly useful for classroom instruction.

Constructing Pedagogical Content Knowledge

Relatively speaking, research examining the development of pedagogical content knowledge is still in its infancy. Although the line separating CK from PCK is blurry, with PCK containing elements of subject matter knowledge and general pedagogical knowledge (Marks, 1990; Shulman, 1986), it is the view of Shulman, and others, that PCK builds on other forms of professional knowledge (e.g., content knowledge) and therefore is a critical element in the knowledge base of teaching (Rowan, Schilling, Ball, & Miller, 2001). Hutchison (1997) acknowledged the documented CK limitations among preservice and inservice teachers; however, in her study she explored the tie that such limitations have to the development of PCK. Her case study subject, Jeannie, was a preservice elementary teacher who entered her methods course with a procedural-only knowledge of elementary mathematics. Qualitative analysis revealed that although Jeannie strongly desired to be a good teacher, her limited CK resulted in a sporadic and unconnected PCK. Further research is needed to determine effective ways to bridge the gap between preservice teachers' existing content knowledge and the pedagogical content knowledge needed for teaching. In certain instances, preservice teachers' PCK has shown at least limited development in spite of limited CK. Simon and Blume (1996) conducted a whole class
constructivist teaching experiment examining how mathematical justification, a facet of PCK, could develop within a methods course for prospective elementary teachers. It was reported that participants possessing limited conceptual understandings were hindered in their sense making of various arguments presented as well as in their ability to accept valid justifications; however, classroom norms regarding presenting, listening to, and evaluating mathematical justifications were established by all participants. Being able to justify mathematical responses helps promote and reinforce meaningful understanding within students and builds schemas of student understanding. Rhine (1998) goes as far as to suggest that increased achievement may be attained if teachers orient their instructional decisions around an assessment of students' thinking.

Promoting an Awareness of Student Cognition

Knowledge of student thinking is but one component of pedagogical content knowledge (Shulman, 1986), and it includes a knowledge of common conceptions, misconceptions, and difficulties that students encounter when learning particular concepts. Shulman notes that the study of student misconceptions and their influences on subsequent learning has been among the most fertile topics for cognitive research (p. 10). Based on their limited teaching experiences, it would not be surprising that preservice teachers lack an understanding of how students think regarding the mathematics they learn. Research confirms this. Even and Tirosh (1995) studied 162 prospective secondary mathematics teachers in the last stage of their formal preservice training. The study investigated how the preservice teachers responded to students' answers involving functions and undefined mathematical operations. Through questionnaires and follow-up
interviews, Even and Tirosh found that although most of the subjects were able to find the correct answers themselves, they were sadly lacking in the ability to provide coherent reasons as to why a student gave the answer he or she did and to explain the concept(s) to the student other than by providing a rule or definition. Results such as these should strengthen the resolve of teacher educators about the importance of addressing student thinking with their preservice teachers (Ball, Lubinski, & Mewborn, 2001). Graeber (1999) further strengthens that point by stating that preservice teachers need to understand that instructional decisions can be guided by what is known about how students think. Because knowledge of student thinking does not appear to be sufficiently gained by preservice teachers during their coursework, one would be left to assume that such knowledge is attained through interacting with students in the classroom setting. Research does not back up such a claim (Ball et al., 2001; Ma, 1999). The realization of the need for teachers to understand how and why students think the way they do has been slow to develop. Research pertaining to knowledge of student thinking is still in its infancy. In mathematics education, it gained prominence through the work of two extensive, research-informed professional development projects that investigated how informing teachers about the ways children think about specific mathematical concepts would influence instruction and student achievement: Cognitively Guided Instruction (CGI) at the University of Wisconsin-Madison, and Integrating Mathematics Assessment (IMA) at the University of California,
Los Angeles (Rhine, 1995). Each project designed professional development models based on educational research. In a precursor to these projects, Carpenter, Fennema, Peterson, and Carey (1988) examined teachers' knowledge of their students' mathematical thinking as it related to student achievement. They used questionnaires and an interview with 40 first grade teachers and found that the teachers had an informal knowledge about the mathematical thinking of their students, but it was not organized in such a way as to inform classroom instruction. Follow-up research brought the beginnings of the CGI project under the initial guidance of Carpenter, Fennema, Peterson, Chiang, and Loef (1989). CGI sought to investigate how incorporating research-based materials into a professional development program would assist teachers in organizing their knowledge of student thinking and in turn influence student achievement. Initial CGI studies focused on addition and subtraction word problems, with multiplication and division being included within later studies (e.g., Fennema, 1996). In the Carpenter et al. (1989) study, 40 first grade teachers participated. Half (n = 20) were randomly assigned to the treatment group and participated in a 4-week summer workshop designed to familiarize the teachers with research findings on how young children think about and develop solution strategies for addition and subtraction and to give them an opportunity to plan instruction based on that knowledge. Subsequent classroom observations of teachers receiving the CGI training revealed that they spent significantly more time on word problems than on number facts, a focus of the control teachers. The CGI teachers posed more problems to their students, focused more on the thought processes of their students than on their answers, and knew more about how individual
students in their classes thought about and solved problems. This increased awareness of student thinking resulted in higher levels of achievement in problem solving as compared to the students of teachers without this knowledge (extensive tables provided means, standard deviations, and between-groups t tests). Follow-up CGI studies by Carpenter and Fennema (1992) and Fennema, Franke, Carpenter, and Carey (1993) reported similar results. Fennema et al. (1996) performed a subsequent 4-year longitudinal study examining the changes of 21 primary grade teachers who participated in CGI professional development. By the end of the mixed methods study (observations, interviews, paper-and-pencil instruments, informal interactions, and supportive descriptive statistics), the instruction of 90% of the teachers had become more cognitively guided, with the focus of engaging students in authentic problem solving. The substantial gains in students' problem-solving achievement and understanding of concepts appeared to be related directly to changes in the teachers' use of research-informed instruction. What was striking was that this shift in emphasis from skills to concepts and problem solving did not result in a decline in performance on measures of computational skills. It should also be noted that it is hardly a trivial matter to be able to convince teachers to focus on concepts and problem solving rather than on computational skills. These results also have significance for the field of teacher education. The IMA program, guided by findings regarding effective professional development, identified four elements it believes to be critical in supporting effective instruction: (a) teachers need a deep understanding of the mathematics they teach, (b) teachers need a deep understanding of the ways that children learn mathematics, (c) they
need to engage in analytic reflection of their practice (Gearhart et al., 1999). The primary goal of the IMA professional development project was to bridge developmental research and practice by helping teachers interpret student cognitions as students made sense of challenging mathematics (specifically fractions). Initial IMA research compared two groups of teachers using the same activity-based, reform-minded curriculum (Rhine, 1998). One group received professional development emphasizing the understanding of student thinking. The second group met monthly to collaborate and provide support while preparing for and teaching the unit on fractions. Gearhart et al. (1996) found that the teachers receiving the IMA training provided their students with more opportunities to be engaged with substantive activities involving fractions than did the second group. Gearhart and Saxe (1999) continued the development and investigation of the Integrating Mathematical Assessment (IMA) professional development program by leading a second study examining students' opportunities to learn while studying fractions. Three groups of elementary teachers (n = 21) volunteered to participate in the study. Nine teachers received IMA professional development, seven teachers met to build a supportive community of like-minded colleagues, and five teachers committed to teaching with skills-based textbooks. The first two groups of teachers used a problem-solving curriculum. Data from videotapes of classroom instruction and field notes were coded and analyzed. Detailed rubric-like rating scales were used to measure integrated assessment, conceptual issues related to problem solving, and opportunity to gain understanding of concepts linked to uses of numeric representations. A hierarchical linear model (HLM) was fit to student pretest-posttest scores. The HLM along with qualitative
analysis revealed mixed results, but overall showed that the problem-solving curriculum (the IMA and Support groups) provided students greater opportunities to engage conceptually with the ideas related to fractions than the skills-based textbooks. Another key finding was that using a curriculum built around assessment of student thinking, as reform documents recommend, appears to require substantial professional development support for teachers. Saxe, Gearhart, and Nasir (2001) also researched the effectiveness of IMA. Their methods were very similar to those of Gearhart and Saxe (1999) in that they elicited volunteers (n = 23) who were placed in the same three groups (IMA, Support, and Traditional); however, the 2001 study was purely quantitative in nature. A paper-and-pencil test was used to achieve measures of both computational and conceptual performance. The ANCOVA on the conceptual scale revealed a main effect for GROUP, F(2,18) = 7.21, p < .005, and a follow-up Tukey HSD post hoc test found the IMA means were greater than those of the Supported and Traditional groups. The ANCOVA on the computational scale did not reveal an effect for GROUP at conventional levels of significance (p < .05); however, although the students in the IMA groups did outperform the other two groups on the computational items (not significantly, though), the students in the Traditional groups showed greater achievement on computational items than the students in the Supported groups. These findings indicate that to take full advantage of a reform curriculum, teachers may well need further support (e.g., IMA) beyond the collaborative help of colleagues. The findings from IMA research would appear to be encouraging to proponents of reform-based professional development; however, one limitation related to IMA research is the extensive use of volunteers. Admittedly more difficult, random assignment of
teachers to the treatment conditions would strengthen the claims made from such research. Rhine (1998) summarized the findings from CGI and IMA research by suggesting that knowledge of students' thinking can be the catalyst that reorients teachers towards the importance of integrating assessment of student thinking into their instruction (p. 30). After examining the research conducted by CGI and IMA, it was not apparent that either research program specifically addressed misconceptions regarding the mathematical content they investigated; however, because misconceptions are prevalent within mathematics and confound aspects of student (and teacher) thinking, it would seem advisable to include a discussion of them within any training program designed to improve knowledge of student thinking. Swafford, Jones, and Thornton (1997) appeared to build upon the CGI research by employing an intervention program for elementary teachers designed to enhance not only their knowledge of research-based findings regarding student cognition (specifically geometry and the van Hiele levels; see Swafford, Jones, & Thornton, 1997, p. 469, for more information regarding the van Hiele levels of geometric understanding) but also their content knowledge (in geometry). The researchers used multiple measures to analyze the changes in teacher content knowledge and instructional strategies brought about by the intervention of a 4-week summer session and six half-day seminars during the academic year. The emphasis during the sessions was about 85% geometry content and 15% research findings regarding student cognition and the van Hiele levels of geometric thought. The researchers found that teachers experienced a significant, t(49) = 5.5, p < .001, pretest-posttest gain in geometric CK; 72% of the teachers increased by at least one van Hiele level, with more than 50% of the teachers increasing by two levels. This newfound knowledge translated into several
important classroom behaviors. Lesson plan analysis and classroom observations following the intervention revealed that the teachers now spent more time, and more quality time, on geometry instruction and possessed the confidence to provoke and respond to higher levels of student thought. Gaining confidence in the teaching of mathematics was also reported in qualitative research conducted by Lowery (2002). She sought to understand how preservice elementary teachers construct CK and PCK while participating in a content-specific methods course that had immediate access to school-based experiences. The intervention provided a unique combination of methods instruction focusing on content knowledge with direct access to field experiences. This setting facilitated the blending and enhancement of CK and PCK in a situated learning context. Analysis of multiple data sources (e.g., various written assignments, reflection journals, portfolios, and interviews) found that the preservice teachers constructed CK while thoughtfully preparing lesson plans and during debriefings regarding classroom teaching experiences, and exhibited developing PCK by adapting planned activities in real time while teaching and by designing follow-up lessons in response to the needs of students. These results would seem to imply that interventions designed to enhance both CK and PCK have greater positive impacts than those addressing only one of those knowledge types. Even and Tirosh (1995) echo support for such an approach, noting that making decisions for helping and guiding students in their knowledge construction certainly requires knowledge of both content and students.

Measuring Pedagogical Content Knowledge

About thirty years ago the mathematics research community concluded that it could find no important relationship between teacher knowledge and student learning
(Eisenberg, 1977; General Accounting Office, 1984; School Mathematics Study Group, 1972, as cited in Fennema & Franke, 1992). An important distinction between then and now is how teachers' knowledge was defined. These studies defined it as the number of university-level mathematics courses successfully completed. Also, these studies did not attempt to measure what the teachers knew about the mathematics they were teaching or precisely what content was covered in the mathematics courses they took (Fennema & Franke, 1992). Much has changed in the past 20 years, especially in the area of research on teacher knowledge. Currently, researchers are not so concerned with what mathematics courses teachers took in college as with what mathematical knowledge is needed to teach, whether such knowledge can be empirically quantified, and what relationships exist between this mathematical knowledge for teaching (i.e., PCK) and student achievement. This research paradigm is in its infancy and is still being formulated, and as such very little research exists on measuring PCK and its effects on student achievement; however, the implications of such research are far reaching and thus merit some discussion. Piloting of an instrument to be used to measure PCK began in 2001 (see Rowan, B., Schilling, S. G., Ball, D. L., & Miller, R., 2001, an exploratory study of measuring content knowledge in surveys; unpublished manuscript, University of Michigan, Ann Arbor). Hill, Schilling, and Ball (2004) reported that although their findings are only preliminary, have not been replicated, and are based on exploratory (albeit extensive) factor analysis, there is reason to believe that teachers' knowledge is at least somewhat domain specific (e.g., number, operations, patterns, functions, and algebra). A conclusion worth noting was that from a measurement perspective, the results support constructing separate scales to represent and measure different knowledge types for teaching (e.g., CK and PCK). This research was followed up by Hill, Rowan, and Ball
(2005). Their study investigated whether teachers' mathematical knowledge for teaching contributed to increased student achievement in mathematics. A mixed model methodology was used, and key student- and teacher-level covariates were controlled for. The researchers acknowledged that the analyses performed involve clear limitations, including a small sample of students [1,190 first graders, 334 first grade teachers, 1,773 third graders, and 365 third grade teachers] (p. 399). With that being said, the strongest and most robust effect found was that of teachers' mathematical knowledge for teaching on student achievement. The results of this study, as well as others discussed, point to the ongoing need of analyzing the practice of knowledgeable teachers, as well as their content knowledge, in the hopes of improving student learning.

Knowledge of and Learning about Area & Perimeter

The previous portion of the review of literature looked at CK and PCK from a generally content-neutral perspective. Research involving young children (e.g., first or second grade) or focusing on how measurement concepts develop during the school years will not be a component of this research study and hence is not a focus of this review of literature. Instead, the next major section will present and discuss literature examining the ongoing struggles students have with concepts related to area and perimeter, common misconceptions regarding area and perimeter and how they relate to instruction and learning, why students (and teachers) struggle with understanding area and perimeter concepts,
and how traditional instructional strategies tend to confound learning, and then conclude with a look at innovative instructional strategies and why they have been successful. Perfecting the craft of teaching is a lifelong endeavor and can be furthered by examining misconceptions surrounding the subtleties of assumed mathematical concepts, the mathematics (or lack thereof) that underlies such struggles, and what can be done to intervene and break the cycle of misconception breeding misconception (Ball, Lubienski, & Mewborn, 2001; Ma, 1999; Stoddart et al., 1993).

Difficulties with Area and Perimeter

Measurement is an enterprise that spans both mathematics and science yet has its roots in everyday, practical activity. Measurement content, for example area and perimeter, has become an increasingly important component of many school mathematics curricula; however, neither the practical nature of such concepts nor the increased emphasis has translated into mastery of basic skills or deeper conceptual understanding regarding area and perimeter (Kenney & Kouba, 1997; Martin & Strutchens, 2000). One ongoing source of evidence regarding students' difficulties with area and perimeter has been the mathematics assessment of the National Assessment of Educational Progress (NAEP). First administered in 1972-73, the results of several NAEP exercises involving measurement revealed pervasive misunderstandings of basic concepts (Hiebert, 1981). For example, when responding to the question in Figure 1, only 28% of 9-year-olds answered it correctly. Hiebert stated that this, along with other similar results, indicates that many students do not understand the fundamental meaning of area. By the time the fourth NAEP assessment of mathematics was administered, 14 years later, one might assume that significant progress towards remedying such a shortcoming would have been reached.
Figure 1. Measurement exercise very similar to one asked in the 1972-73 NAEP: a rectangle with sides labeled 3 cm and 5 cm, accompanied by the question "What is the area of this rectangle?"

Sadly, that was not the case. A little over half of the seventh graders tested could correctly calculate the area of a rectangle labeled with both the length and width (Kouba et al., 1988). More disappointing, even shocking, was that only a little over 10% of the 7th grade students could find the area of a square when given the length of one side and the fact that the figure was a square. The 1992 NAEP mathematics assessment showed some progress in basic area computation, with 65% of the eighth graders tested responding correctly; however, the assessment of NAEP conducted in 1996 revealed a significant drop in eighth graders' performance on items involving basic area computation. Only 44% could identify the correct numerical expression for the area of a given geometrical figure (Martin & Strutchens, 2000). An item appearing on the 2003 NAEP asked eighth graders to determine which of four numerical expressions would represent the area of a rectangle whose side measures were given; less than half (48%) answered the question correctly
(National Center for Education Statistics [NCES], 2003). The 2005 NAEP mathematics assessment revealed that only 38% of high school seniors could use a centimeter ruler to measure the appropriate lengths of a pictured parallelogram and correctly compute its area (NCES, 2005). It is worth noting that when comparing similar area questions on the various NAEP assessments, students did notably better when asked to compute the area of a rectangle described with words as opposed to the area of a pictured rectangle. Possibly the visual cues are distracting and cause confusion among students. The 2007 administration of the NAEP mathematics assessment reveals that, while some progress has been made, 4th and 8th grade students are still struggling with concepts related to area. For example, one problem from the 4th grade exam gave the dimensions of a room (i.e., 12 feet wide by 15 feet long) and asked students how many square feet of carpet would be needed to cover the floor. Only 42% correctly answered the problem. An interesting side note was that the most common incorrect response was 27 (the correct area is 12 × 15 = 180 square feet, whereas 27 is simply the sum of the two given dimensions), which suggests confusion exists between the concepts involved in finding area and perimeter. The research conducted in this study examined aspects of these possible phenomena. NAEP assessments also reveal that students struggle with fundamental concepts regarding length and perimeter. For example, the results of an item in the 1985-86 NAEP revealed that only 14% of the third graders and 49% of the seventh graders who responded to the question in Figure 2 gave the correct answer of 5 cm (Lindquist & Kouba, 1989). These deficiencies have also been reported more recently. In the 1996 NAEP mathematics assessment, only 22% of 4th grade and 63% of 8th grade students who responded could correctly determine the length of an object pictured above a ruler when the end of the object and the ruler were not aligned (Martin & Strutchens, 2000). A
very similar question on the 2003 NAEP, pictured in Figure 2, produced equally troubling results, with only 20% of the fourth graders correctly answering the item (NCES, 2003).

Figure 2. Percentage of students in grades 3 and 7 responding to a NAEP item.

On the 2005 administration, eighth graders continued to struggle with perimeter concepts, with only 40% correctly determining the length of a rectangular playground whose perimeter and width were given (NCES, 2005). Even as recently as 2007, only 43% of 4th grade students could correctly find the perimeter of a stop sign when given that it has eight sides, the length of each side, and a reminder that perimeter is the distance around the figure. Performance of this kind points to ongoing difficulties with perimeter as an application of length. Lindquist and Kouba (1989) report that in the fourth NAEP mathematics
assessment, 17% of 3rd grade and 46% of 7th grade students who responded successfully found the perimeter of the rectangle in Figure 3. Poor performance by third graders on this item may not be that surprising because perimeter is still a relatively new concept at that age; however, the performance by seventh graders was also less than adequate.

Figure 3. Item from the fourth NAEP.

Some improvement in performance appears in 1996 on the sixth NAEP mathematics assessment, when 46% of the 4th grade students who responded could correctly calculate how many feet of fencing would be needed to go around a rectangular garden (Kenney & Kouba, 1997). The garden was pictured and labeled similarly to Figure 3. Eighth graders were not asked that perimeter problem. A different sort of perimeter problem was asked on the 1996 NAEP, when fourth graders were asked to use a ruler to draw a figure with a given perimeter (Martin & Strutchens, 2000). Interestingly enough, only 19% of those who responded could draw a correct figure. The nontraditional format of this problem seemed to cause significant difficulties for the fourth graders. It would appear the instruction students have been receiving regarding area,
perimeter, and length is developing an incomplete conceptual understanding of these concepts (Kamii & Clark, 1997; Martin & Strutchens, 2000). The high percentage of incorrect responses alone should be cause for alarm; however, even more troubling are the misconceptions students have regarding area and perimeter.

Prevalent Misconceptions Regarding Area and Perimeter

Perimeter is the length around the outside of a figure (for a rectangle, it would be the sum of the lengths of the sides of the figure), and area is a measure of how much two-dimensional space a figure occupies. Because the calculations of both measures involve the sides of the figures, someone lacking a conceptual understanding of area and perimeter could encounter many problems and difficulties (Ma, 1999). Such errors evolve into knowledge gaps which, if left unchallenged, manifest themselves as misconceptions exhibited by students while working problems involving area and perimeter (Hirstein et al., 1978; Wilson & Rowland, 1993) and by teachers while attempting to explain the concepts (Menon, 1998; Reinke, 1997; Simon & Blume, 1994a). The literature discusses many misconceptions regarding area and perimeter. Some are general in nature (e.g., confusing area and perimeter), and others are more focused (e.g., area and perimeter are directly related in that one determines the other). Some misconceptions, such as those involving transitivity (Hiebert, 1984) and conservation (Piaget, Inhelder, & Szeminka, 1981), are more common among young children, although others (e.g., confounding linear and square units) are held by both students and even teachers (Tierney, Boyd, & Davis, 1986). It is this last type of misconception (i.e., those reported to be held by both students and teachers) that will be the focus of this section of the literature review and the proposed research.
Confusing Area and Perimeter

The misconceptions which are held by students, as well as pre- and inservice teachers, are not always mutually exclusive. For example, students often confuse area and perimeter (Hirstein, Lamb, & Osborne, 1978; Kouba et al., 1988), but that confusion can take different forms. In some instances students perform the wrong algorithm by multiplying dimensions that should be added (Kenney & Kouba, 1997), while at other times they focus on the wrong unit of measure (i.e., linear versus square or vice versa) (Carpenter, Coburn, Reys, & Wilson, 1975; Chappell & Thompson, 1999). In regards to responses to NAEP items, it appears students commonly calculate area in response to a perimeter problem, and vice versa (Kouba et al., 1988; Kenney & Kouba, 1997; NCES, 2007). Kouba et al. (1988) conclude that the most plausible explanation is that students lack a conceptual understanding of these concepts. Kenney and Kouba (1997) speculate that the items themselves can provide visual cues that may initiate area and perimeter confusion. For example, if a grid is used with the figure, then students may be cued to focus on area even if the question deals with perimeter. Visual cues have been reported by other researchers as contributing to area and perimeter confusion. Wilson and Rowland (1993) discuss findings where students tend to focus on one dimension of a figure (typically the longest one), and Carpenter et al. (1975) explain the tendency for children to judge area strictly on the basis of physical appearance. For example, when attempting to compare different sized rectangular regions in order to find two with the same area, students will choose a shape because they say it looks the most similar to the other one, without any mention of counting or calculating units to do the comparison. Confusions between area and perimeter still persist, as evidenced by student performance
on measurement items in the 2007 NAEP (NCES, 2007). Researchers have found that preservice teachers are also prone to confusing area and perimeter. Reinke (1997) asked 76 preservice elementary teachers to explain in writing how they would find the perimeter and area of the shaded shape illustrated in Figure 4. When explaining how they would find the perimeter, approximately 22% of the subjects worked the problem exactly as they would if they were finding area. The preservice teachers performed better when explaining how they would find area; however, an interesting finding was that there were three instances of subjects using degrees for finding area and perimeter. Apparently, knowledge of a circle containing 360 degrees evoked references to the semicircle containing 180 degrees. The absence of richer data (e.g., follow-up interviews) leaves the reader to only speculate about the reasoning behind such responses. Research that has included such data (for example, work conducted within mathematics coursework that examined preservice teachers' written explanations and journal writings) has revealed many misconceptions regarding area and perimeter.

Figure 4. Diagram shown to preservice teachers.
Tierney et al. (1986) found that many preservice teachers equate finding area to finding a number, but too often any number will do. One participant was observed counting the pegs around a figure to find its area. When questioned, they replied that their method seemed to generate a reasonable number. Another wrote that area never seemed like a real concept to her because there was no tool for measuring it. A major difficulty for these preservice teachers was that they would often confuse what exactly they should count in order to find area, and they would have the same problem when attempting to count something to calculate perimeter. A plausible explanation for the confusion of area and perimeter is that conceptions regarding the use and meaning of appropriate units for finding area and perimeter are muddied at best.

Linear Versus Square Units

The unit of measure functions as a conceptual bridge connecting an object and the central, unifying idea of measurement. Too often, instruction does not recognize that the concept of a square unit presents difficulties for students. In addition, knowledge about the square unit (and the linear unit as well) is typically assumed to be ascertained from instruction on finding area (Simon & Blume, 1994). To understand concepts of measurement, the basic properties of units must first be explored and understood. To apply the appropriate unit of measure, students must decipher what attribute is being measured (Wilson & Rowland, 1993). For example, if measuring length, then a linear unit such as a centimeter or an inch is needed. If area is the desired measurement, then a two-dimensional unit such as a square would be appropriate. When these ideas are not understood, then errors are made and misconceptions develop.
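To make the distinction concrete, consider a simple illustration (the dimensions here are chosen for illustration and are not drawn from any particular study or NAEP item). For a rectangle measuring 3 cm by 5 cm:

Perimeter: P = 2(3 + 5) = 16, a length measured in linear units (sixteen 1-cm lengths laid end to end around the boundary).
Area: A = 3 × 5 = 15, a coverage measured in square units (fifteen 1-cm squares tiling the interior).

The two results are numerically close, yet they measure different attributes with different kinds of units, which helps explain why students and teachers so readily interchange them.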


Researchers have found that students often confuse linear and square units (Lappan, Fey, Fitzgerald, Friel, & Phillips, 1998). While interviewing students, Hirstein et al. (1978) found point counting in place of applying linear or square units to be a common misconception. The fact that 37% of seventh graders answered 6 to the question previously shown in Figure 2 (arrived at by counting the numbers as opposed to the linear units) reveals the confusion that can arise when fundamental ideas regarding units are not understood (Kamii, 2006). Sometimes it is hard to distinguish if students are confusing area and perimeter, linear and square units, or both. Chappell and Thompson (1999) asked sixth, seventh, and eighth graders to construct a figure with a perimeter of 24 units. Figure 5 is an example of what can occur when students have misunderstandings regarding units of measure.

Figure 5. A student's constructed response for a figure intended to have a perimeter of 24 units.

Another difficulty can arise if students believe that units must be single, discrete, and/or whole entities; therefore, fractions of units tend to get ignored or counted as whole (Hiebert, 1981; Lehrer, 2003). For example, when finding the area of an irregular figure (e.g., a footprint), not counting or compensating for partial units results in an incorrect area. It also appears that calculating the area for regular and semi-regular figures is problematic. The 1996 NAEP reported that only 12% of eighth grade students could correctly determine the number of square tiles needed to cover a region of given dimensions (Martin & Strutchens, 2000). Too often students understand square units simply as something to be counted rather than as a subdivision of a plane (Lehrer, 2003). Such difficulties are often the result of children not being able to conceptualize the construction of what Reynolds and Wheatley (1996) refer to as a "unity." A unity can be thought of in base-ten terms: it is a single unit comprised of smaller units.
For example, a rectangle that is 10 inches long by 4 inches wide has a unity (or area) of 40 square inches. The rectangle could also be partitioned into four 2 × 5 regions, each having a unity of 10 square inches. Although somewhat of an abstract concept, Reynolds and Wheatley used case studies involving four fourth grade students to report that developing an understanding of, and being able to use, a unity is a fundamental component of coming to understand area measurement. The process of partitioning an area into regions and iterating units has also been investigated (Battista, Clements, Arnoff, Battista, & Borrow, 1998). Battista et al. looked at how students structure and enumerate two-dimensional rectangular arrays (i.e., rows or columns of square units). They found that the array structure that is often taken for granted by teachers as somewhat obvious to students is not an intuitive notion. The second graders studied progressed through various levels of sophistication in their understanding of structuring arrays. The importance of each student personally constructing arrays in various settings was stressed. The process of constructing arrays and understanding how and why they can represent area is crucial for the formula A = L × W to be understood conceptually (Battista et al., 1998).
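The role of the array can be made concrete with a small worked example (the dimensions are illustrative and are not taken from Battista et al.). A rectangle that is 4 units wide and 6 units long can be structured as 4 rows, each containing 6 unit squares, so the count of square units is

A = 6 + 6 + 6 + 6 = 4 × 6 = 24 square units.

Viewed this way, A = L × W is not an arbitrary rule but a record of the row-by-row (or column-by-column) enumeration of square units, which is precisely the structuring that Battista et al. (1998) found students must personally construct before the formula carries meaning.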


A possible explanation for why teachers might give important concepts such as arrays only cursory attention is that they may possess only a shallow understanding of them. For example, a common misconception among teachers, especially elementary teachers, is that perimeter is a two-dimensional measure, and teachers have been found to offer statements justifying that belief. The Conference Board of the Mathematical Sciences ([CBMS], 2001) acknowledges such gaps. Regarding area and perimeter, the CBMS state that many teachers who know the formula A = L × W may have no grasp of how square units compose a rectangle's area or why multiplying linear dimensions yields the count of those units (p. 22). Baturo and Nason (1996) examined student teachers' subject matter and pedagogical content knowledge regarding the domain of area measurement. They conducted qualitative research involving clinical interviews and reported that their subjects had acquired skills for performing the basic algorithms for calculating area and perimeter. Although these skills would most likely allow them to function adequately in society, the subject matter knowledge of the student teachers would severely limit their ability to scaffold learners in developing meaningful understandings of these concepts. Although preservice teachers do think about the teaching and learning of area and perimeter, their thinking is constrained by this limited, largely procedural knowledge base.
Perceived Relationships Between Area and Perimeter

It is very common for students to think that all rectangles of a given area have the same perimeter or that all rectangles of a given perimeter have the same area (Carpenter et al., 1975; Hart, 1984; Lappan, 1998; Walter, 1970), as well as to exhibit difficulties justifying their reasoning regarding the misconception (Chappell & Thompson, 1999). Woodward and Byrd (1983) posed a question to 258 eighth grade students at two different schools in Tennessee (129 from each). The gist of the question involved a story problem where a farmer had 60 feet of fence and wanted to construct as large a rectangular garden as possible. The story continues by saying that the farmer drew out five possibilities for the garden. Pictures of an 8 × 22, a 10 × 20, a 15 × 15, a 5 × 25, and a 2 × 28 rectangle were provided for the students to view (these enclose 176, 200, 225, 125, and 56 square feet, respectively, even though each uses the same 60 feet of fence). The students were then asked to check which statement they believed to be true. The first five choices involved selecting one of the five rectangles as the biggest, and the last choice was that the gardens were all the same size. The researchers were somewhat concerned that only 55 of 258 (21%) answered the question correctly, while 157 (61%) said the gardens were the same size. The results spurred Woodward and Byrd to ask two sections of a mathematics course for prospective elementary teachers the same question. The preservice teachers were also asked to justify their responses. Almost two thirds of all the preservice teachers said the gardens were the same size, and the justifications they provided were little more than restatements of that answer. It would appear likely that these preservice teachers received insufficient instruction regarding area and perimeter. Fuller (1996) compared the pedagogical content knowledge of 26 preservice
elementary teachers and 28 experienced elementary teachers. One of the items in her researcher-designed survey involved asking a question very similar to the above garden question; however, the item concluded with a statement along the lines of: after considering the problem, the farmer concludes that it does not matter which pen he builds (i.e., building different sized pens makes no difference) because all the pens will have the same perimeter of 60 feet. The pre- and inservice teachers were then asked to (a) explain why the farmer made the concluding statement, (b) describe how they would respond to the farmer's solution, and (c) explain that response. Fuller reported that only one teacher, an experienced one, provided a response that was correct both procedurally and conceptually. Most of the other pre- and inservice teachers attempted conceptual responses, with the majority of preservice teachers arriving at answers that lacked specific mathematical content as well as appropriate supporting pedagogy. The vague qualitative reporting of this study left the reader guessing as to the specific nature of the responses given to the area and perimeter items.

A minor difficulty that is related to the aforementioned misconception is dealing with the area and perimeter of irregular shapes. In these shapes students appear to set aside their fundamental concepts of conservation of area and the unit of measure (Maher & Beattys, 1986). About 25% of the seventh graders who took the fourth NAEP indicated that the area of a rectangle could not be determined once the rectangle was separated and reformed into a different shape (Kouba et al., 1988). It could be argued that the students had difficulty with conservation of area, but the researchers felt it was more plausible that they lacked a conceptual understanding of area. In the 2004 administration of the Long Term Trend (LTT) NAEP, only 32% of
seventeen-year-old students could correctly find the area of an L-shaped region (Rutledge, Kloosterman, & Kenney, 2009). Finding areas of irregular shapes that are not made up of polygons (e.g., a figure resembling a fried egg) is also difficult for children (Lindquist & Kouba, 1989). Lehrer (2003) investigated the strategies used by younger children when asked to find the area of the figure resulting from tracing their hand on a piece of grid paper. He found that children tended to organize units in ways that would keep within the boundary of closed figures, and that would result in using units that resemble the space they were trying to fill (e.g., triangles for triangular gaps), even if that meant using different units for the same figure. Lehrer reported that less than 20% of the students studied believed that identical units of measure must be used while covering an irregular figure. Preservice teachers have also been found to have similar difficulties with irregular shapes (Maher & Beattys, 1986; Tierney et al., 1990). Tierney et al. found that preservice elementary teachers would often try to reconcile the application of the length × width formula with calculating the area of irregular shapes. The subjects did not seem to question the appropriateness of the formula but rather communicated a sense of familiarity with it and thus attempted to apply it.

The second major misconception involving a presumed relationship between area and perimeter is best illustrated with the following scenario: Imagine that one of your students comes to class very excited. He tells you that he has figured out a theory that you never told to the class. He explains that he has discovered that as the perimeter of a closed figure (e.g., a square or rectangle) increases, the area also increases. He shows you a picture (see Figure 6) as proof of his new theory. How would you respond to this student?
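Before examining Figure 6 and how teachers have actually responded, it is worth noting what a mathematically sound response to this scenario requires; the figures used here are illustrative and are not the ones in the student's drawing. A single counterexample is enough to refute the proposed theory:

A 4 × 4 square has perimeter P = 2(4 + 4) = 16 units and area A = 4 × 4 = 16 square units.
A 10 × 1 rectangle has a larger perimeter, P = 2(10 + 1) = 22 units, but a smaller area, A = 10 × 1 = 10 square units.

Since the perimeter increased while the area decreased, an increase in perimeter cannot by itself guarantee an increase in area.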


Figure 6. The student's picture offered as proof of his theory that increasing perimeter increases area.

The scenario just presented illustrates a very common misconception regarding area and perimeter: namely, that increasing the perimeter of a figure will always increase the area. This misconception is believed by both students and teachers (Lappan et al., 1998; Reinke, 1997; Ma, 1999). Ferrer et al. (2001) write that, of the many difficulties students have regarding area and perimeter, the nonconstant relationship between these concepts is one of the hardest to grasp. Lappan et al. (1998), in their instructional book for teachers on two-dimensional measurement, Covering and Surrounding, take a whole chapter to address the subtleties of the misconception that perimeter determines area. In spite of the awareness that students struggle with that specific relationship, the only research found that examines the misconception was conducted with preservice teachers. For a teacher, there are three aspects to the scenario presented above. The first concerns the specific content knowledge regarding perimeter and area and the proposed relationship between them (i.e., knowledge of whether the claim is actually true), the second concerns knowledge regarding justification (i.e., ideas of theory and proof), and the third is the pedagogical content knowledge involved in responding to the student's
proposed theory. Different researchers have posed similar versions of this scenario to preservice teachers. Ball (1988) interviewed 14 secondary mathematics majors and 26 preservice elementary teachers about their reactions to the theorem shown in Figure 6. More than a third of all the teachers (44% of the secondary majors, 35% of the elementary) expressed that they were impressed with the student's work and accepted the substance of the claims with little question or reflection. Only a minority recognized that the claim was incorrect. Many of the teacher candidates (43%) indicated they were unsure whether there was a direct relationship between area and perimeter. Ma (1999) presented the same question as Ball (1988) to a group of U.S. and Chinese teachers. The claim was unfamiliar to the teachers in both groups; most indicated that they had not heard of it before and could not respond to it immediately. All the teachers knew what area and perimeter meant and most could calculate them; however, their strategies for exploring the theory and their responses to the student diverged significantly. Only the findings regarding the U.S. teachers will be discussed here. Of the 23 U.S. teachers questioned, two simply accepted the claim as true without question, five indicated that they would need to consult a textbook before they could respond to the student, 13 proposed a strategy of calling for more examples from the student, and three actually investigated the problem mathematically. Only one U.S. teacher successfully arrived at the correct solution of presenting a counterexample. Even
when the U.S. teachers mentioned specific strategies for approaching the problem, the strategies were not based on careful mathematical thinking. They did not consider a systematic way to examine the various cases. Rather, the U.S. teachers proposed a strategy based on the idea that a mathematical claim should be explored and proved by working through a large number of examples. This misconception, as Ma puts it, was shared by many of the U.S. teachers and would likely mislead and confuse a student. Howe, in his review of Ma's Knowing and Teaching Elementary Mathematics, makes a compelling statement that summarizes his feelings on the U.S. teachers' handling of the relationship between area and perimeter:

For me, perhaps the most discouraging aspect of working on K-12 educational issues has been confronting the fact that most Americans see mathematics as an arbitrary set of rules with no relation to one another or to other parts of life. Many teachers share this view. A teacher who is blind to the coherence of mathematics cannot help students see it. (p. 885)

Justification of Responses

The ability to reason is an essential component of learning to do mathematics. Being able to justify one's responses is an important reasoning skill and is fundamental in developing a conceptual understanding of mathematics and facilitating its making sense (Ma, 1999; NCTM, 2000). It would be unrealistic to expect most students to develop reasoning skills without a proficient teacher who possesses such skills guiding the process. Research indicates that many teachers lack such skills. When Woodward and Byrd (1983) asked prospective elementary teachers to justify their answers to a problem involving area and perimeter, the responses given were shallow in content, were basically
restatement of their answer, and involved little or no meaningful mathematical investigations. An equally alarming finding, from Ball's (1988) interviews, concerned the prospective teachers' knowledge of justification and their pedagogical content knowledge. When asked how they would respond to a student who claimed he had discovered a new (albeit incorrect) theorem, the vast majority (92% of the elementary and 86% of the secondary prospective teachers) concentrated entirely on what they, the preservice teachers, knew about the relationship between area and perimeter, making that the focal point of their response. They provided no meaningful discussion of the student's claim; instead, they put all their effort into deciphering whether he was right or wrong. Expanding upon Ball's (1988) work, Ma (1999) reported that a lack of meaningful content knowledge regarding a proposed relationship between area and perimeter prohibited the vast majority of U.S. teachers involved in the study from engaging in any constructive conversation with potential students. Teachers' inadequate ability to effectively question mathematical claims, as well as to offer clear justifications for mathematical arguments, is predictably evident among students as well (Lappan et al., 1998; Martin & Strutchens, 2000). When students are asked to provide written explanations or justifications of answers to constructed-response questions, even a lower level task becomes more difficult and their performance decreases (Kenney & Kouba, 1997; Strutchens, Harris, & Martin, 2001). Being able to provide real-world applications of mathematical concepts is evidence that students are making sense of the mathematics and developing conceptual understanding (NCTM, 2000). Chappell and Thompson (1999) found that middle school students have
difficulties in generating practical application problems for even such common measurement concepts as area and perimeter. Apparently, preservice teachers' knowledge and reasoning regarding many concepts surrounding area and perimeter are extremely lacking. One can only assume that if preservice teachers have such misconceptions, then their future students will as well. A disadvantage of much of the current elementary mathematics curricula is that problems involving the misconceptions discussed in the previous sections are not part of the instructional discussion for the teacher or the students. Current instruction in area and perimeter does not appear to be reversing the poor performance trend nor aiding in revealing or resolving the previously discussed shortcomings and misconceptions. This second major section of the literature review concludes by first examining why there are pervasive misunderstandings regarding area and perimeter and then by presenting some innovative instructional strategies to improve the teaching and learning of these concepts.

Likely Causes of Area and Perimeter Misconceptions

Based on the literature addressing these misconceptions, it would appear that a conceptual understanding of fundamental concepts regarding area and perimeter by both students and teachers is severely lacking and restricted (Fuller, 1996; Menon, 1998; Reinke, 1997; Woodward & Byrd, 1983). Exploring some of the most probable causes of these difficulties would be a logical first step before offering recommendations for necessary interventions.

Unfocused Curriculum

The goal of elementary mathematics needs to be that of building a firm
64 foundation on which ongoing mathematical learning can be built and understood (NCTM, 2000). The curriculum should only be a part of that foundation, a nd teachers need to have the confidence and ability to circumvent and supplement when necessary (Ma, 1999). Information collected from the Third International Mathematics and Science Study (TIMMS ) revealed that fourth grade students in the U.S. encounter a mathematics curriculum that is unfocused contains many more topics and possesses little coherence as compared to those of other countries that significantly outperformed our students (V al verde & Schmidt, 1997). Data collected from a national random samp l e of teachers in TIMMS indicate that the majority of them are attempting the overwhelming task of covering all the material in the textbook. Consequently, the mathematics contained within our textbooks receives shallow and terse treatment (Valverde & Schm idt, 1997). For example, although an important purpose of measurement is to compare things that cannot be compared directly, the idea of comparison is either absent or casually mentioned within textbook instruction of measurement (Kamii & Clark, 1997). Som etimes a measurement topics can indirectly confuse students. A second grade mathematics textbook by Harcourt Inc. (2004) deals with congruent shapes by encouraging teachers are cong ruent A process of counting dots to determine side lengths of polygons would most likely cause confusion for students later when learning about perimeter and counting linear units. Effective instruction of a rea and perimeter needs to present two perspectives, the static and dynamic (Baturo & Nason, 1996). The static perspective equates area with a number representing the amount of space or surface that is enclosed by a boundary. The


65 dynamic perspective focuses on the relationship between the perimeter and area of a figure, that is, as the perimeter approaches that of a line segment, the area approaches zero. However, the dynamic perspective is rarely examined in the typical textbook ( Baturo & Nason, 1996); hence, misconceptions regarding relationships between area and perimeter can develop and go unchecked (Ball, 1988; Woodward & Byrd, 1983). It has been suggested that the learning of area and perimeter could be more coherent and conce ptual if the concepts were examined simultaneously ( Chappell and Thompson 1999 ; Hiebert & Lefevre, 1986 ; Simon & Blume, 1994 a ) Scope and sequence of mathematical topics is important to instruction; however, knowledge of how students learn and what they f ind difficult must also be considered while implementing any curriculum. Outhred and Mitchelmore (2000) found that c hildren learn and conceive about area differently and have been documented as progressing through developmental levels while grasping the co ncept. To facilitate this progression they recommend the curriculum introduce the concept of area early on by having the students think of area measurement as the act of covering a region with a fixed unit, and then investigate rectangular covering within that context of area measurement later discovering or deriving the area formula. Baturo and Nason (1996) conclude d, after studying preservice that if preservice teachers are to be expected to teach measurement concepts such as area and perimeter from a conceptual perspective then they need to experience as students a more focused and dynamic curriculum complete with many concrete measuring experiences such as covering regions with units of area Ineffective I nstruction T he curriculum alone can not be blamed for the ongoing struggles many students


66 have with mathematical achievement nor can it be expected to bring about necessary reform T here are many elements that merge together during the act of teaching a few of the pro minent ones are : the abilities and prior understandings of the students, the the curriculum, and instructional strategies To assume that all teachers are sufficiently prepared to teach elementary mathematics concepts such as area and perimeter would be a mistake. Tierney et al. (1986) found that when they asked prospective elementary teachers what they would teach a ten year old child about area, 80% of them drew a rectangle and wrote L W near it. Such a simplistic view reflects poorly on their prior training. Along with student performance data, the 1992 and the 1996 mathematics assessment of NAEP gathered data regarding L indquist (1997) reported the 1992 NAEP found that ten percent of fourth grade teachers indicated they have received little or no exposure to measurement concepts. Four years later that same catego ry had grown to 13% (Grouws & Smith, 2000). Such trends do not bode well for improving the teaching and learning of measurements concepts such as area and perimeter. Many of t he instructional practices traditionally employed when teaching measurement may regarding concepts such as area and perimeter. Typical instruction too often treats measurement as a mere empirical procedure requiring little or no logical reasoning (Kamii, 2006; Kam ii & Clark, 1997). For example, lining up paper clips along an object and counting them is an empirical procedure that can be done without giving much thought to the meaning of a linear unit of measurement.


67 in Figure 2 (see p. 47 ) are most likely the result of having learned only empirical procedures. In contrast, instruction should be rich in activities involving both transitive reasoning (the mental ability to compare two lengths using a third item) and unit iteration whi ch involves mentally constructing a part whole relationship between the total length of a figure and the length of a smaller object ( e.g., a linear or square unit) (Kamii & Clark, 1997 ; Van de Walle, 2007 ). O ver E mphasis on P rocedural Knowledge A common result of these forces, ineffective instruction and an inadequate curriculum, is the fostering of a counterproductive, procedural based knowledge (Kouba et al., 1988) rather than a well connected, conceptual understanding. It is important for those involved in education, especially teacher education, to be aware of the signs of procedural based knowledge as well as how to counteract it. T here is tendency for many teachers to focus their instruction on arriving at an answer rather than on the co nceptual development of measurement ideas (B aturo & Nason, 1996 ; Kamii, 2006 ). It is not likely that teachers plan their instruction to emphasize procedural knowledge of such concepts as area and perimeter. O ften they may not be aware that they lack either the knowledge or the analytical ability to teach conceptually (Hershkowitz & Vinner, 1984). Tierney et al. (1986) found that a high proportion of preservice elementary teachers lack the necessary understanding of area concepts to support their teaching of it even with the aid of a reasonable textbook. This lack of understanding is dangerous in that teachers who have poor conceptual understanding of mathematics will feel more comfortable teaching just for procedural knowledge, and so will be unable and/or u nwilling to engage students in problems requiring them to think deeply (Menon, 1998). Procedural knowledge can also


68 be reinforced indirectly. For example, a ctivities involving using wooden squares to cover figures and calculate their area may actually pred etermine the task by allowing students to construct recta ngular arrays and count the squares without relating the count to area or comprehending the squares as units of area (Outhred & Mitchelmore, 2000). The same researchers also found that representation through drawing was a better alternative in some settings to concrete manipulative s in promoting conceptual understanding of area measurement Other times the instruction can directly result in emphasizing the procedural side of mathematics to the neglect of the conceptual. Based on error patterns of responses to NAEP measurement items Kouba et al. (1988) stated it appears likely that students have been exposed to procedures ( e.g., area formulas) before developing a conceptual understanding. Too often a rea units are not applied to measure area; instead, the practice is to obtain two measures (typically length and width) and insert them into the often over used formula, A = L W (Nunes, Light, & Mason, 1993). However, the procedure of multiplying two lin ear measures is conceptually far removed from the notion of area (Outhred & Mitchelmore, 2000). Children have difficulty interpreting the results of the procedure (Kenney & Kouba, 1997), and many elementary students do not perceive the resulting product as a measurement (Lehrer, 2003). Many p rospective elementary teachers do not have a clear understanding of why multiplying the length and width of a rectangle is an appropriate method to determine its area (Simon & Blume, 1994 a ). A formula based approach to the teaching and learning of area and perimeter will not achieve the goal of conceptual understanding (Hiebert & Lefevre, 1986; Lehrer, 2003; Woodward & Byrd, 1983). Helping students conceptualize measurement ideas is not an easy undertaking


69 because most are operating at the holistic level (the lowest) of the van Hiele levels of geometric thought (Strutchens & Blume, 1997). Developing a fundamental understanding of both the array structure and unit iteration are central to conceptualizing area measure (Kam ii & Clark, 1997; Simon & Blume, 1994 a ). Wilson and Rowland (1993) developed a research based instructional sequence that would facilitate that. They propose that the following steps be used for learning to measure length, area, volume, or any other system of measurement: Identify the property to be m easured, (b) Make comparisons, (c) Establish an appropriate unit and process for measuring, (d) Move to a standard unit of There are onceptual understanding of the measurement process (viz., perception, representation, conservation, transitivity, and unit iteration ), but these very skills are also developed through measuring (Wilson & Rowland, 1993). This dilemma suggests the importance of be ing aware of and plan ning for student abilities and difficulties as they engage in innovative and meaningful activities. Students, as well as prospective teachers, need to be active participants i n the process of their mathematical growth and accept the intellectual challenge of learning conceptually (Baturo & Nason, 1996; NCMT, 2000). Innovative Instructional Strategies Refine the Focus The textbook should not have to be the focal point for every mathematics lesson. Following research based instruc tional strategies, such as outlined by Wilson & Rowland, (1993), teachers are free to incorporate unique and inviting learning activities; for example, finding the area of a figure resembling a fried egg (Casa, Spinelli, & Gavin,


70 2006), using broken ruler to resolve misconceptions about measuring length (Wilson & Rowland, 1993), or using a potato and a stamp pad to create and find the area of irregular figures (Johnson, 1986). The teacher can also supplement an existing curriculum with books and publication s specially designed for specific mathematical concepts. Moyer mathematical concepts of perimeter and area. Although confusing linear and square units is a common dif ficulty for students, the majority of students in this study had no difficulty with this distinction. Moyer also reported that many students demonstrated confidence while explaining before the class how they determined the perimeter and area for the figure s they had constructed. Other publications can actually replace sections or chapters of the required textbook. For example, the publication Covering and Surrounding (Lappan et al., 1998) is an extensive textbook unit specifically designed for 6 th 8 th grade rs to investigate numerous measurement concepts, specifically area and perimeter. Occasionally, important topics are neglected within a curriculum. If teachers are aware of such concepts, they can implement the curriculum accordingly. The concept of co nservation of area is considered by many to be fundamental to understanding area measurement (Beattys & Mahler, 1985; Piaget et al., 1981). Despite its importance, conservation of area is not emphasized in the school curriculum. Kordaki (2003) found that f ourteen year olds, interacting in a computer environment, were able to explore successfully and develop the conservation of area concept from three different perspectives. Refining the focus within teacher education has also been an area of ongoing


71 discu ssion (CBMS, 2001; NCTM, 1991, 2000). In teacher education, a topic receiving increased attention is knowledge of student thinking. So often the teaching and learning of mathematics focuses on the act of doing mathematics (Ma, 1999). Teachers need to be aw are of how their students think about various mathematics concepts (e.g., area and perimeter), what they find difficult and why, and the misconceptions that are prevalent within the subject matter (Ball & Bass, 2000; Lehrer, 2003; Simon & Blume, 1994a). Ga ining such knowledge as preservice teachers, so that student thinking becomes an conceptual understanding (Ball et al., 2001; Swafford et al., 1997). Integrate Inno vative Learning Tools There is little doubt that technology has impacted the teaching of mathematics. It is beyond the scope of this study to discuss all the technologies (e.g., graphing calculators) that can be used to enhance the learning of mathematic s. This section will provide a brief overview of some of the technologies being used while focusing on the teaching and learning of area and perimeter. Several of the ideas presented here will be delineated in greater detail in Chapter 3. Visual cues are c ritical in developing spatial sense and therefore in the study of geometry (Clements & Battista, 1992). Without appropriate feedback, visual cues have been found to contribute to student errors when solving area and perimeter problems (Kenney & Kouba, 1997 ; Kouba et al., 1988). Incorporating a computer based environment into the learning of measurement has been shown to improve student performance on these concepts (Clements & Sarama, 1997; Noss, 1987). Specifically, the teaching and learning of area and pe rimeter has been enhanced through several computer


72 Sketchpad (Stone, 1994), and a specially designed microworld (Kordaki, 2003). The previous two sections are by no means exhaustive, but do give insight in to the possibilities. With a little experience and creativity, the goals and objectives of mathematics textbooks can provide launching points for investigations that challenge students, confront misconceptions, and encourage the sharing and justifying of p roblem solving strategies and solutions (Bray, Dixon, & Martinez, 2006; Chappell & Thompson, 1999; Reinke, 1997). Enhancing Mathematics Teacher Education with Technology It is a common notion that teachers tend to teach as they were taught (Goodlad, 198 4; NCTM, 1989; Barron & Goldman, 1994), and it is apparent from decades of research and testing that traditional methods of instruction, both for students and for preservice teachers, regarding many mathematical topics (e.g., area and perimeter) are not pr oducing the desired results (CBMS, 2001; Mathematics Association of America, 1991). The research findings regarding pre regarding concepts such as area and perimeter are valuable in informing both teacher educators a nd professional developers; however, minimal research has been conducted to examine best practices to address these deficiencies. What is lacking from the research is specific recommendations for innovative interventions within teacher education, as well a s professional development, to better equip teachers to correct the previously mentioned misconceptions and stop the perpetual cycle of teachers passing on, both directly and wer A specific form of technology based instruction will be presented as a means to


73 empower teachers. The literature discussed in the next several sections will be somewha t focused in that many areas of technology will not be reviewed. For example, hand held technologies, information and communication technologies, computer literacy, or attitudes and beliefs about technology are not the focus of this study; hence, will not be mentioned in great detail, if at all, in the review of literature. What will be discussed is recommendations and guidelines pertaining to how and why to incorporate technology into the mathematics education of prospective teachers, anchored instruction and its connections to mathematics instruction, and research pertaining to microworlds. The Need for Technology Infusion within Teacher Education Our schools seem destined to position themselves to be able to incorporate more technology into classroom ac essential in teaching and learning mathematics; it influences the mathematics that is Education (ISTE) survey on te chnology use in teacher education reported that the typical K 12 classroom in the United States contains one computer for every five students. A 2005 Education Week report indicated the student to Internet connected computer ratio had improved to 4:1. That ratio is not ideal for a personal and interactive technology based learning environment, which implies teachers will need creative methods to effectively integrate various forms of technology into the teaching and learning of mathematics (NCTM, 2000). The envisioned benefits of technology, especially upon the teaching and learning of mathematics, have been slow to realize, but a growing number of research studies have found that integrating technology into the learning of mathematics can positively influen ce achievement, stimulate and enhance spatial


74 visualization skills, and promote a more conceptual understanding of mathematics for students and teachers ( Boers van Oosterum, 1990 ; Dunham & Thomas, 1994; Groves, 1994; Rojano, 1996; Sheets, 1993). Our consta ntly evolving and global marketplace demands cutting edge technology; therefore, our schools can expect to be called upon to contribute to preparing students to meet both the real and the perceived technological needs of such a society. A report by the Con gressional Office of Technology Assessment (OTA, 1995) found that only 3 percent of the teacher education graduates indicated they technological classrooms of tomorrow, prospectiv e teachers need course instruction in both content and pedagogy to function effectively in these newly forming instructional environments (Cooper & Bull, 1997; Glenn, 2000; Kersaint & Thompson, 2002; Timmerman, 2004); however, it has become apparent that m any prospective teachers do not possess the necessary knowledge or experience to meet these demands (Milken Exchange on Education Technology [MEET], 1999; OTA, 1995; Pellegrino & Altman, 1997; Thompson, 2000). After completing a comprehensive review of th e literature regarding information technology and teacher education, Willis and Mehlinger (1996) concluded: Most preservice teachers know very little about effective use of technology in education and leaders believe there is a pressing need to increase substantially the amount and quality of instruction teachers receive about technology. The idea may be expressed aggressively, assertively, or in more subtle forms, but the virtually universal conclusion is that teacher education, particularly preservic e, is not preparing educators to work in a technology enriched classroom. (p. 978)


75 In fact many observers and researchers are suggesting that integration and infusion are not strong enough words for the type of technology use that should be espoused by t eacher education (Thompson, 2000). Research indicates that too many teacher education programs have focused on the technology rather than the curriculum (Cooper & Bull, 1997). The prevalence of stand alone information technology (IT) courses bears out that fact. Stand alone courses are often needed to supplement a lack of basic skills, but such courses are not preparing preservice teachers to enhance teaching and learning through meaningful and contextual technology integration ( Strudler, Quinn, McKinney, & Jones, 1995 training tends to focus on the mechanics of operating new machinery, with little about nt to teach about technology; instead preservice teachers need to be learning how to teach effectively with technology (MAA, 1991; Pellegrino & Altman, 1997; Timmerman, 2004). Recommendations and Guidelines for Effective Technology Integration Teaching wi th technology requires instructional planning that contemplates technology as a tool rather than an add on, something many teacher education programs are not preparing preservice teachers to do (OTA, 1995). Recommendations have been put forth that would pr omote and guide the technology training of preservice teachers. The research proposed in this study makes every attempt to incorporate as many of the guidelines discussed as is appropriate. The fact that many preservice teachers have not personally experienced technology integration as school students, gives rise to the need for faculty to be encouraged to model effective use of technology within their courses (ISTE, 2000, 2008; MEET, 1999). Although modeling appropriate use of technology is a


76 step in the right direction, the OTA report makes it clear that preservice teachers need tools in cla ssrooms, and practice teaching with technologies themselves if they are to use Connors (1997) extends the recommendation by suggesting that teacher preparation and enhancement courses ne ed to model appropriate technology that prospective and experienced teachers can use to promote meaningful learning of the mathematical content that will be taught in the classroom. Such an integration of educational technology is anything but trivial (Tim merman, 2004). Effectively integrating technology into mathematics instruction requires acquiring new knowledge, as well as deepening current understandings, regarding both how and why to use technology in meaningful ways. Dexter, Anderson, and Becker (199 9) explain how the newly acquired knowledge must be carefully woven together with the content and demands of the curriculum, classroom management, and existing knowledge of subject matter and pedagogy. The key to successful learning with technology rests i n the teacher and not the technology. Although the educational technologies available today are flexible and powerful, they can never replace an effective teacher nor can they realize full potential without one. Schwab (2000) succinctly captures this tho the hands of a poor teacher it [technology] is a useless tool; in the hands of a good teacher based guidelines have been disseminated to facilitate the equipping of preservice teachers with the necessary knowledge to make good use of educational technologies. A synthesis of research conducted by Kathleen Heid (1997) offers four principles


77 to guide the use of technology in mathematics education. The first focuses on the value of student centere shown to help in transitioning the teacher into their new role as facilitator (Simonsen & Dick, 1997). This constructivist view is new to some and difficult for others. Indeed, many t (learner centered) and objectivism (content centered). Hannafin, Burruss, and Little propose th at teachers abandon active classroom management and allow students complete control of their learning (Clements, 1999; Hannafin, Burruss, & Little 2001), neither do they suggest that software should be the controlling force in the learning process (Jonasse n, Carr, & Yueh, 1998). There is little doubt that balancing control issues within a technology rich classroom is an ongoing and ever evolving challenge. The second principle involves giving students opportunities to function as a mathematician (e.g., to c onjecture, explore, conduct trial and error, and perform hypothesis testing). Technology is thought to provide just such opportunities. Microworlds, which will be discussed later, are a prime example. The third principle suggests that teachers need to prov ide for and This type of cognitive activity is not easy, but is a valuable part of a technology based learning experience (Heid, 1997). The last principle is the idea that in an interactive, technology environment the teacher must assume and provide for constant access to the technology. In this setting the teacher takes on an interesting and powerful role in accomplishing what no textbook or worksheet can; to facilita te the computer in the connection of multiple representations (Clements, 1999; Heid, 1997). As will be seen in


78 later sections, there are exciting Internet based learning environments that can greatly assist the teacher in that new role. Researchers from t he Curry Center for Technology and Teacher Education at the University of Virginia and the University of Wisconsin, have devised five guidelines that reflect what they believe to be appropriate uses of technology in mathematics education: (a) introduce tec hnology in context, (b) address worthwhile mathematics with appropriate pedagogy, (c) take advantage of technology, (d) connect mathematics topics, and (e) incorporate multiple representations (Garofalo, Drier, Harper, Timmerman, & Shockey, 2000). A brief discussion of the guidelines will help to clarify the role and purpose of each. The first guideline, introduce technology in context, suggests that the features of technology should be introduced and illustrated in the context of meaningful content based a ctivities. In other words, the purpose of technology integration should be to enhance the teaching and learning of mathematics as opposed to using mathematics to teach about technology. The second guideline, address worthwhile mathematics with appropriate pedagogy, encourages incorporating technology based activities that support sound curricular content and not the development of activities merely because the technology makes them possible. The technology used should support and facilitate conceptual devel opment, exploration, reasoning, and problem solving, as encouraged by the NCTM (1991, 2000). The third guideline recommends that activities take advantage of technology and explore topics well beyond what could be done by hand. The fourth guideline states that technology enhanced activities should facilitate mathematical connections between topics in the curriculum and to real world contexts whenever possible. The last guideline involves incorporating multiple representations. Mathematics

PAGE 95

79 educators should e ncourage technology integration that aids students in making connections (e.g., graphical, numerical, and pictorial) between multiple representations of mathematical concepts within problem solving situations (Jiang & McClintock, 2000). Near the turn o f the century, the NRC (2000) conducted a synthesis of research on cognition and learning and within that presented four components deemed essential for the development of effective learning environments: community, learner, knowledge, and assessment. The learner, knowledge, and assessment centered aspects of the learning environments described by the NRC are all essential, yet coexist, and are dependent upon, the facilitation of a community of learners; where learners and knowledge are honored and where pa rticipation, communication, and collaboration are fostered. learning environment. Shamath a, Peressini, and Meymaris (2004) strengthened and technology integration. Their work involving content based technology integration also provides specific examples demon strating how various technology supported mathematics activities exemplify all facets of an effective learning environment proposed by the NRC. The last set of guidelines that will be discussed emerge from a meta analysis conducted by Robert Marzano (199 8) in which instructional techniques were identified as having a statistically significant impact upon student achievement. Empirical evidence provides a model to guid e the technology based instructional strategies proposed in this


80 study. These ideas will be further developed in Chapter 3. Marzano (1998) revealed that t he following instructional techniques had an average effect size (ES) greater than one. The reader sho uld keep in mind that an effect size of one corresponds to an average percentile gain of 34% in student achievement. The first technique, representing new knowledge in graphic/nonlinguistic formats, finds its roots in cognitive psychology which states that our brains store knowledge using both words and images. An ability to visualize discriminately is a vital skill that needs to be developed for the successful learning of geometry (Clements & Batista, 1992). Unfortunately, research indicates that such visu alization is extremely difficult for students (Dede, 2000). Visual limitations exist in varying degrees across students and can lead to conflicts between visual evidence and information gained from other sources (Triadafillidis, 1995). Computer based techn ologies are an ideal medium for minimizing these limitations and conflicts and facilitate the visualization of mathematical concepts (Noss, 1987, 1988; Clements, Sarama, & Battista, 1998). A second instructional technique is using manipulatives to explore new knowledge and practice applying it Marzano (1998) found that overall; the use of manipulatives is associated with an average percentile gain of 31 points (ES .89); however, the use of computer simulations as manipulatives produced the highest effect size of 1.45, indicating a percentile gain of 43 points. When a computer simulation assumes the role of a cognitive tool, as opposed to simply modeling a phenomenon, it becomes a microworld which will be discussed in detail later. Generating and testing hypothesis about new knowledge is a third effective instructional technique identified by Marzano. The implication from the research is that the greatest benefits regarding this


81 technique ar e gained when the computer based explorations are guided by an expert teacher in a meaningful way (Clements & McMillen, 1996). sequence involving the demonstration of new concepts to students in a rather direct fashion and then having the students apply the concepts, generalizations, and principles to new situations. Technology is not a panacea, and guidelines to implement technology will only be successful to the extent to which they are implemented within a proven and meaningful learning environment. Indeed, it would seem prudent to integrate the four instructional strategies just discussed into any computer based learning environment in order to maximize student achieveme nt (Cholmsky, 2003). analysis is valuable to the field of education and very thorough in regards to classroom students and their learning, it includes no mention of effective instructional strategies for training preservice teachers This is an area ripe for investigation. technologies have been shown to stimulate and promote a conceptual understanding of mathematics within preservice teachers (Keller & H art, 2002; Wetherill, Midgett, & McCall, 2002). It is only through proper teacher mediation that technology can become a tool to enhance learning (Clements, Sarama, & Battista, 1998). If this is true, then maintaining the current status quo in regards to t eaching, and learning to teach, mathematical concepts such as area and perimeter will not bring about the much needed improvements Technology should not be just another means to disseminate information. With properly trained teachers, it can and needs to be used to develop critical and reflective thinking (Jonassen, Carr, & Yueh, 1998).
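A brief note may clarify the effect sizes and percentile gains cited above; the conversion described here is a standard statistical relationship and is offered as an illustration rather than as a restatement of Marzano's own derivation. An effect size expresses a gain in standard deviation units, and the corresponding percentile gain follows from the cumulative standard normal distribution: a student at the 50th percentile of the comparison group who improves by d standard deviations moves to the percentile given by that distribution.

d = 1.00: about the 84th percentile, a gain of roughly 34 percentile points.
d = 0.89: about the 81st percentile, a gain of roughly 31 percentile points.
d = 1.45: about the 93rd percentile, a gain of roughly 43 percentile points.

These values correspond to the percentile gains reported for the instructional techniques discussed in this subsection.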

PAGE 98

82 The Concept and Possibilities of Anchored Instruction The anchored instruction model of learning was developed and tested by a team of prolific researchers who derived the ir insights from the work of Dewey (1933) and Hanson (1970) They worked out of the Learning Technology Center (LTC) at Vanderbilt University and when they published as a team, the group referred to them selves as the Cognition and Technology Group at Vand erbilt (CTGV). The group had concerns with traditional instruction and sought ways to build upon and incorporate preferred constructivist approaches in hopes of developing a more useful knowledge among participants (Bransford, Sherwood, Hasselbring, Kinzer & Williams, 1990a). Cognitive psychologists claim that meaningful knowledge is formed when small chunks of information are woven together within a contextual framework (Klock, 2000). Anchored instruction seeks to scaffold just such a framework. Anchored instruction is grounded in and derived from constructivist theories of knowledge and is a specific application of situated cognition It is a research based paradigm for examining learning through technology assisted problem solving. Anchored instruction i s similar to case based learning, but not as open ended. Bauer, Ellefsen, and Ha ll (1994) describe [typically, technology Videodiscs, the anchor chosen by the Vanderbilt Group, have often been u sed to provide an environment to anchor instruction and problem solving to a meaningful context. Each videodisc contains a story organized around an authentic problem solving task that

PAGE 99

83 emphasizes in context learning that is constructivist or generative in nature (Bransford et al., 1990a; CTGV, 1992a) and emphasizes the importance for students to experience the advantages of apprenticeship learning (Brown, Collins, & Duguid, 1989). Goals and Uses of Anchored Instruction The CTGV asserts that traditional c urricula focused on memorizing and recalling facts and often introduced different ideas in different contexts even if those ideas could be meaningfully connected (Bransford et al., 1990b). To combat this weakness the CTGV, under the leadership of John Br ansford, established many challenging goals chief among them was finding a way to address the problem of inert knowledge (Baumbach, Brewer, & Bird, 1995; CTGV, 1990; 1992a; 1992b; 1993), which often results from the traditional instruction presented in s chool (Whitehead, 1929). According to Whitehead, inert knowledge is knowledge that can usually be recalled when explicitly asked to, but is not spontaneously recalled in problem solving situations even though it is The major goal of anchored instruction is to let students experience the changes in their perception and understanding of the anchor as Another goal of anchored instru ction is to allow students and teachers to experience cooperatively the kinds of problems and opportunities that experts in various areas encounter (CTGV, 1990, 1992b). The potential of technology to provide representations that can connect mathematical le arning to authentic human experience should not be overlooked (Kaput, 1994). Before attempting to meet the desired goals of anchored instruction, key decisions regarding the choice and use of the anchor must be made. The decision points that follow,

PAGE 100

84 resp ectively, are based on the research of McLarty et al. (1990), and have been instrumental in informing the design of the proposed study: (a) choosing an appropriate anchor, (b) developing shared expertise around the anchor, (c) expanding the anchor, (d) usi ng knowledge as a tool, (e) allowing student exploration, and (f) sharing what was learned from the anchored instruction. The CTGV (1993) maintained that computer simulations, films, videos, and printed materials all can serve as appropriate anchors. It is advantageous for the anchor to be interactive, dynamic, and to be stimulating both visually and spatially (CTGV, 1992a). Once the anchor has been selected, it is important for users to have multiple experiences with the anchor from varying perspectives. B aumbach, Brewer, and Bird (1995) suggest that such activities will encourage students to develop expertise on various aspects of the anchor. As their knowledge of the anchor develops, students can be encouraged to assume greater responsibility for their le arning. Once the teacher and the students have developed a shared expertise around the anchor, phase three can be initiated. Now the students can expand the anchor by using their expertise to solve problems requiring the use of the anchor (Bauer et al., 19 94). solving skills are essential to success during this phase. In phase four students are allowed greater freedom to plan and conduct their own solution strategies by exploring the anchor. Having the ability to exp lore the same domain from multiple perspectives is a primary goal of anchored instruction (CTGV, 1992a). Although there are some minor discrepancies regarding certain aspects of the first four phases, it is agreed that learning activities centered around a nchored instruction need to culminate with students sharing what they have learned (Bauer et al., 1994; Baumbach et al., 1995; McLarty et al., 1990). Students are encouraged to compare their

PAGE 101

85 work with each other and with the teacher or other experts who ar e present. The dynamic and interactive learning environments that result from attempting to meet the goals of anchored instruction have produced diverse research on the instructional model. Highlighted Research on Anchored Instruction The relatively slim body of research encompassing anchored instruction should not detract from its contribution to the study of teaching and learning. The research paradigm of anchored instruction is a relatively new phenomenon, dating back to the late 1980s. Early research conducted by the CTGV indicated that anchored instruction seemed to help students develop rich, organized knowledge structures plus promote long term retention and spontaneous use of vocabulary (Bransford et al., 1990b). The CTGV later found that fifth gra ders can become very good at complex problem formulation on tasks similar to those experienced during anchored instruction (CTGV, 1992a). The research group felt that situating the learning experience in meaningful contexts was the key for anchored instruc tion to facilitate students acquiring knowledge of problem solving strategies as well as knowledge of content that was non inert. Following the earlier research studies involving general education fifth graders, anchored instruction has been studied in various settings, including middle grade science (Goldman, et al., 1996); several studies involving students with disabilities, including: literacy and social studies (Kinzer, Gabella, & Rieth, 1994), effects of media attributes, (Shyu, 1999), social studi es (Glaser, Rieth, Kinzer, Coldburn, & Peter, 2000), general education (Bottge, Heinrichs, Mehta, & Hung, 2002), remedial math and pre algebra (Bottge, Heinrichs, Chan, & Serlin, 2001), mathematical problem solving and transfer (Serafina & Cicchelli, 2003) and procedural math skills (Bottge, Heinrichs, Chan,

PAGE 102

86 Mehta, & Watson, 2003). While results from these studies were mixed, there were many positive findings and subsequent helpful recommendations. It appears that the majority of school research on anchore d instruction conducted in the past ten years involved, in some way, students with learning disabilities. A plausible explanation for this involves one of the disadvantages of incorporating anchored instruction into the traditional, general education class room. Implementing anchored instruction is a time consuming proposition. The standardized curriculum found in most of the general education mathematics classes, along with applicable high stakes tests, produces apprehension among many teachers who feel pre ssure to cover an unreasonable amount of content and thus settle on lecturing as their primary means of dispensing information (Oliver, 1999). Ironically, one of the biggest detriments to higher order thinking, a goal of anchored instruction, seems to be a standardized curriculum. Fortunately, for most higher education, the curriculum is not so rigidly defined, and offers a fertile soil for research on anchored instruction, as is the case with my study which will investigate the influence of anchored instru student thinking regarding area and perimeter. Very little research has investigated the use of this instructional method with preservice teachers and even less has involved topics in mathe matics. Early research on anchored instruction explored possible applications within teacher education. One study compared whether anchored instruction could promote reflective thinking among preservice teachers about teaching practices. McIntyre and Pap e (1993) had one group of K 6 preservice teachers ( n = 16 ) view videodiscs of expert teaching practices as part of their instruction while the other group received typical

PAGE 103

87 methods instruction without any video based instruction. Pre and posttests (finding s limited by small sample size), student logs and progress reports, and student interviews revealed an overall positive attitude from a majority of students receiving anchored instruction. These students appeared to be more descriptive in their analysis of critical classroom events and were better able to support their claims. Student interviews indicated that the interactive videodiscs resulted in more and deeper reflection of classroom activities. The role of anchored instruction in improving preservice t learning about instructional practices has also been examined in the domain of educational technology. Bauer, Ellefsen, and Hall (1994) were interested in determining whether using anchored instruction would help preservice teachers learn how to u se a variety of technologies and also the extent to which students could envision applying the model in their future teaching. A variety of data sources were used, including videotaped observations and interviews, student produced projects, and information provided by instructors. Researchers found that students did learn to incorporate a variety of educational technologies while using the Oregon Trail software as an anchor. Student achievement on assigned projects was superior to previous semesters in thor oughness and overall quality. Most of the students interviewed indicated that they felt the anchored instruction approach was worthwhile to learn and that they anticipated using some form of the model in their future teaching; however, a longitudinal study would be needed to determine if exposure to the model would have any impact on the future teaching practices of the participants. Bauer (1998) replicated his previous research with a larger sample size ( n = 48) and reported similar results as before. Ka

PAGE 104

88 research when they conducted a semester long case study involving a cohort group of 22 preservice teachers. They used anchored instruction as a means to integrate a curriculum development course with an educatio nal computing class. Participants not only learned about technology applications for the classroom, but they applied their knowledge by developing instructional units to share with an eighth grade student from a local middle school with whom they were pair ed. Feedback from the participants was overwhelmingly positive. The findings showed that anchored instruction was an effective way to both learn about educational technology tools while at the same time integrating technology into instructional practices at least in a one on one setting. Only one study was found investigating the use of anchored instruction in a mathematics course for preservice teachers. Kurz and Baterelo (2004) used case study methods to investigate four female preservice teachers (tw o secondary and two elementary) who volunteered to participate in a mathematics based technology integration course. The study focused on whether the subjects could determine the significance of using anchored instruction with their future students and if they envisioned student learning and mathematical growth using anchored instruction. To different degrees, the participants expressed optimism about the utilization of anchored instruction and were able to describe salient features of the model that suppor t student learning and growth. Given the fact that previously discussed research indicates many preservice teachers possess similar mathematical shortcomings as their students, it would seem the hypothetical context investigated by Kurz and Batarelo (i.e., studying how preservice teachers envision student learning and mathematical growth using anchored instruction) could have been more meaningful if grounded in examining first hand how

PAGE 105

89 preservice teachers themselves learned and grew mathematically through e xperiencing anchored instruction. Developing knowledge within students and teachers that is conceptually anchored is strongly recommended (CBMS, 2001; NCTM, 1991, 2000). t knowledge and knowledge of student thinking has been virtually unexplored and is ripe for investigation. At the time the CTGV were doing their initial research and formulating their fundamental ideas regarding anchored instruction, it was determined tha t computer technology was not yet widespread enough, nor affordable, for it to be universally accessible to serve as the anchor for the model; thus, the videodisc was decided upon to fill that role. However, since that time the microcomputer, along with In ternet access, have become commonplace for both higher education and the school classroom. The continued advancements in computers, software, and programming languages and platforms (e.g., Java) have allowed other learning environments to develop that shar e theoretical underpinnings with anchored instruction. Logo and other more dynamic and interactive microworlds represent prime examples. Microworlds The purpose of this portion of the literature review is to acquaint the reader with microworlds, explain t heir distinguishing design features, discuss some popular computer based geometry microworlds, provide highlights from research involving computer microworlds and students, and then focus on research incorporating microworlds into preservice teacher educat ion. The literature reviewed regarding preservice teachers will focus primarily on microworlds designed to function as online

PAGE 106

90 1995]), simulations (e.g., SimCity), or games ( e.g., Math Blaster Mystery, [David & microworlds when they are designed to let a novice begin to understand the underlying derlying model are the topic of discussion in the next section. Microworlds: Defined and Described The power of a microworld lies not necessarily in what it can do but rather in its constructivist environment designed to motivate (and indirectly guide) the user to explore ideas and relationships, and resolve conflicts between prior knowledge and newly encountered information (Papert, 1980; Rieber, 2004). According to the Piagetian principle of equilibrium, this cognitive conflict (referred to as disequi librium), is necessary for meaningful learning to occur (Hogle, 1995). A well designed microworld will foster these learning conflicts. The epistemology underlying microworlds is known as constructivism (Jonassen, 1991). Seymour Papert (1980) coined the t erm microworld over twenty years ago. He defined it as: . a subset of reality or a constructed reality whose structure matches that of a given cognitive mechanism so as to provide an environment where the latter can operate effectively. The concept leads to the project of inventing microworlds so structured as to allow a human learner to exercise particular powerful ideas of intellectual skills. (p. 204) Microworlds do not have to be computer

PAGE 107

91 chemistry set can function as a microworld. Papert made it clear that the concept of a microworld was not new and was actually related to the longstanding notions and uses of mathematical manipulatives (e.g., Cuisenaire rods). David Jonassen (1996) describes a microwor interesting; therefore, encouraging the user to generate their own problems and test hypo theses for solving it. Many definitions have been posited over the years, but perhaps the most elegant In the next section we consider various defining characteristics of a microworld which Characteristics of a Microworld Clear distinctions between characteristics that define a microworld and the principles that guide their design are not always evident; however, because the microworlds used in this study were (for the most part) already conceived and designed prior to my implementation, the focus of this section will be on the salient features necessary for a microworld to be able to function as a meaningful learning environment. The characteristics that follow are presented as a confluence of valuable points of view. Although the guidelines are open to various interpretations (e.g., instructional designers, constructivists, or instructivists), they are meant to provide a sort of f ilter to help identify microworlds worthy of integrating into instruction. The focus will be on how the microworld functions (i.e., their use), as opposed to how it is structured (i.e., their design). L. P. Rieber has been researching and writing about mic roworlds for almost

PAGE 108

92 twenty years. Based on a synthesis of his own and that of others in the field, Rieber (2004) presented the following definition of a microworld: Therefore, a microworld must be defined as the interface between an individual user in a social context and a software tool possessing the following five functional attributes: (a) It is domain specific; (b) it provides a doorway to the domain for the user by offering a simple example of the domain that is immediately understandable by the user; (c) it leads to activity that can be intrinsically motivating to the user the user wants to participate and persist at the task for some time; (d) it leads to immersive activity best characterized by words such as play, inquiry, and invention; a nd (e) it is situated in a constructivist philosophy of learning. (p. 588) Rieber continues by stating that for a microworld to be domain specific implies an appropriate treatment of curricular content and careful attention to pedagogical recommendations for how the domain, such as mathematics, should be taught. Hoyles (1991) explains that in order for investigation within a microworld to be meaningful the should work. In o ther words, the microworld should be able to meet the user where they are. Connecting with pupil conceptions is complex. Learning within a microworld is a very personal experience and what is meaningful can be relative. Rieber (1992) interprets meaningfuln ess as the degree to which a student can link new ideas to prior knowledge. The success of a microworld in opening the doorway to exploring a new domain hinges connection is a lso considered among the most important determinants of learning

PAGE 109

93 (Ausubel, 1968). Once the door to a specific content domain has been opened, it is critical that the microworld continue to motivate the user to persist at his or her exploration. It was Be correction by providing graphic and quick feedback (Hogle, 1995) combined with linked, interactive increase the opportunity to learn for all. Although the inherent scaffolding features of the matics, a qualified and knowledgeable teacher functions as the virtual glue holding all the elements critical in supporting and challenging student learning while at t he same time modeling important and interrelated parts operating within a microworld learning environment (e.g., the curriculum, the microworld, the teacher, and the student), and in the works of more than sitting inside of a library does; however, a microworld situated within a carefully constructed environment can be a valuable cognitive tool to facilitate the learning of mathematics. The concepts within Geometry provide an excellent backdrop for the integration of a microworld tool. Since the 1980 s many other microworlds have become available; however, there are four computer based microworlds that specifically deal with geometry. They are

PAGE 110

94 Logo, Geometric Supposer (including superSupposer), Cabri Geometry (including Cabri pad. It is important to distinguish the different levels of interaction experienced by the user while exploring within these microworlds. It is outside the scope of this review of literature to discuss thoroughly all the distinguishing features, specific f unctionality, and instructional uses of those software titles. I will instead summarize the findings involving the influences and impacts of the software upon the teaching and learning of geometry in the school classroom. Static Geometry Software Papert participation, along with a team from the Massachusetts Institute of Technology, in the development of the programming ber, 2004). Appearing in the early 1980s Logo is one of the earliest static construction environments. The term static refers to the type of interaction that occurs between the user and the software. A static environment does not allow the user to manipul effects of that manipulation. This limitation is a prime distinguishing characteristic between static and dynamic software. Despite this limitation there is a considerable am ount of research on Logo and results have been very positive. Logo is a programming language and that fact has allowed for updated versions over the years. The primary focus of Logo geometry is properties of two dimensional shapes and measurement. Research on Logo goes back almost twenty years, and the findings are extensive. The primary focus of this study only warrants a summary of major themes. Early versions of Logo required students to write basic code to control the

PAGE 111

95 movement of a turtle shaped icon o n the screen. Although the code was straightforward, it proved problematic to some young children (Clements & Batista, 1989; Hoyles, Noss, & Adamson, 2002). Turtle Math, a successor of Logo, has greatly reduced the obstacle. For an example of how far the e volution of Logo has progressed, please visit [ http://nlvm.usu.edu/en/nav/frames_asid_178_g_3_t_1.html ] to experience an Internet version. In spite of some problems with children writing the code, programmers and researchers see great value in the coordina ted action of writing symbols (code) and seeing the resulting drawing (Clements & Sarama, 1997). Studies found that students who learned geometry with Logo outperformed the control students on concepts involving angle conservation and angle measure (Noss, 1987) as well as understanding shapes and their components, and describing paths through a map (Clements & Batista, 1989; of higher levels of geometric thought. Curre geometric thought regarding two dimensional shapes is the van Hiele theory. According to this theory, students move through several qualitatively different levels of geometric thinking (Clements & Batista, 1992). The five levels are: (a) level 0 pre recognition, (b) level 1 visual, (c) level 2 descriptive/analytic, (d) level 3 abstract/relational, and (e) level 4 formal axiomatic (this level is required for doing proof). Advancing from one level to the next does not occur naturally in children and requires systematic nurturing (Dix, 1999). Research has shown interactions with Logo can help children (Clements & Meredith, 1993; Glass & Deckert, 2001) and middle school students (Clements & Sarama, 1997) progres s into their next van Hiele level. A positive feature of

PAGE 112

96 (Clements & Batista, 1994). Such tailored instruction is very important when attempting to create a student centered learning environment. Lastly, Clements and Sarama (1997) reported on a very interesting study where the Logo students not only outperformed traditionally taught students but also another control group of students taught the same content but used concrete manipulatives. An apparent implication here is for teachers to be aware of the strengths and weaknesses of the various learning support media at their disposal. Besides the mathematical learning advantages of Logo, certain social benefits have be en reported. Students working cooperatively with Logo showed enhanced, specific problem solving skills such as conflict resolution (Clements & Nastasi, 1999), and displayed sustained enthusiasm for collaborative work resulting in improved communication ski lls (Yelland, 2002). Logo seems to foster a cooperative environment where both cognitive and social conflicts could be resolved. It is worth noting that the teacher played a crucial role in mediating this process through facilitating appropriate discussion of the activities. Logo activities were found to be most meaningful and beneficial when they were integrated into the existing curriculum and not used as an add on (Clements & Sarama, 1997). In conclusion, and on a different note, although the research re garding Logo with school children is extensive and well reported, there is relatively little (if any) that examines the influences of a Logo learning environment upon the mathematical understandings of preservice elementary teachers or their reflective con siderations of future instructional strategies in light of such interactions. Although dimensional shapes and is used mostly with younger students, the microworld discussed next is geared towards older students.

PAGE 113

97 Geometric Suppo ser (1993) is one of the best known geometry microworlds. It is a static modeling tool used for making and testing conjectures in geometry through manipulating geometric objects and exploring the relationships within and between these objects (Schwartz, 19 learning of geometry by enabling the students to inductively prove relationships among geometric concepts b y allowing constructions to develop in a direct way, students exhibit a positive attitude towards learning those concepts with Supposer Clements and Battista skills thr ough traditional approaches, almost all have been unsuccessful. At that time, they concluded that new learning environments were needed to encourage students to make conjectures and generalizations that would promote both inductiv e and deductive thinking. Supposer has made great strides in accomplishing just that. Hlzl (1981) explains that students struggle with the rigid nature in which diagrams are presented in single d iagram very quickly is one remedy to that problem (Yerushalmy & Houde, 1986). After working with Supposer, students reported a deeper understanding of the role and limitations of diagrams (Yerushalmy & Chazan, 1993). Spending time in the Supposer environme solving strategies for analyzing problems, conjectures, and proof. Such students have even reported coming to understand more deeply and personally the value of formal proof in mathematics (Wilson, 19 93). The Geometric Supposer has been shown to have the capacity to change how students think and feel about geometry, but these results are not guaranteed or

PAGE 114

98 automatic. The attitude of the teacher and how they implement the Supposer are crucial to its suc cess. Wilson (1993) continues by stating that although the Supposer can be used with traditional instruction as a sort of digital blackboard by a lecturing teacher, its design lends itself to a more open ended approach. That open ended approach offers the teacher the opportunity to integrate inductive reasoning back into the classroom. For this to be accomplished, the roles of teacher and student need to be altered. Yerushalmy and Houde (1986) liken the desirable learning environment to that of a typical sc ience class. The scientific process becomes the primary focus, and teacher and student collaborate on collecting data, making conjectures, and looking for counterexamples or generalizations. These changes are not easy and the process is slow, but as seen a bove the learning dividends outweigh the initial investment of time and effort. Dynamic Geometry Software Although pioneering software packages such as Logo and Geometric Supposer made great strides towards achieving the technology recommendations of t he NCTM and other interested parties, it was not until the development of software like Sketchpad and Cabri Geometry 5). Both of these software titles are relatively new to the classroo Sketchpad was released around 1991 and Cabri around 1992; therefore, the volume of research is much less than what exists for Logo or Supposer. There are many articles and conference proceedings for both software programs that primarily discu ssed suggestions for implementation and interesting activities, but most presented no research framework. This informal finding caused me to wonder if the research is just dragging behind the

PAGE 115

99 innovation or if implementation is being done despite an apparen t hollow research foundation. It was Kaput who helped put my reflections in perspective by pointing out that research did not bring about the invention of the automobile. It was the result of necessity and progress. Necessity and progress have served as ca talysts to facilitate a gradual integration of technology into the teaching and learning of mathematics. Organizations such as the NCTM (2000) suggest that interactive geometry software can be used to enhance student learning, and the results presented, al ong with those that directly follow, appear to bolster that claim. Teaching and Learning Mathematics with Microworlds Microworlds, functioning as cognitive tools (i.e., technologies that support thinking processes during problem solving and learning), hav e been shown to assist in the learning of powerful and fundamentally different mathematics (Jonassen & Reeves, 1996; Pea, 1986), enhance student thinking (Lederman & Niess, 2000), support cognitive processes such as logical reasoning and hypothesis testing (Lajoie, 1993), provide specific feedback appropriate to guide in the learning of new material (Roblyer & Edwards, 2000), and encourage the exploration of mathematical ideas (Jensen & Williams, 1993). It is important to realize that a true computer micr oworld is not meant to be a panacea functioning in isolation from social interactions with peers and teachers. Although microworlds are a constructivist invention, they can also be a tool for supporting goal orientated environments in which learning occurs through discovery and exploration (Rieber, 1992). Rieber explains that one way to reach this compromise is by incorporating aspects of guided discovery into the learning activity which would

PAGE 116

100 naturally be constrained by the boundaries imposed by a particul ar microworld. The research presented on microworlds will attempt to strike a balance between describing the salient features of the microworld(s) involved in the study along with an appropriate discussion of the instructional strategies implemented. The m ost common use of microworlds among successful research studies involves embedding microworlds within a carefully planned curriculum unit, as opposed to treating them as a curricular add on or as a medium to enhance traditional teacher lead instruction. C omputer Microworlds in the K 12 Setting There is limited research beyond the specific applications and domains of popular being the relatively recent affordability (desktop computers only fell under $1000 in late Initial studies seemed to focus on how students interacted with the microworld as well as the various solution strategies produced The majority of this research did not attempt to embed the microworld within instructional units based on the curricula found at the their cognitive play activity into independent mathematical activity while interacting within two different types of microworlds (discrete and continuous). Two case studies involving four third grade students found that although the microworlds captivated the ned as pathways to mathematical activity, independent mathematical activity was generally initiated by teacher intervention. Clements, Battista, Sarama, and Swaminathan (1997) investigated the application and development of spatial thinking in an instr uctional unit on geometric motions and

PAGE 117

101 area. This was some of the earliest research to embed the use of microworlds within a specifically designed instructional unit. Observational data and results from paper and pencil assessments (including the Wheatley Spatial Ability Test) found that the three third grade classes showed significant growth in spatial competence although the microworld based activities motivated and aided the students in building more sophisticated and systematic problem solving strategie s. It is worth noting that although classroom teaching, and mathematics education research the role of the teacher within the instructional unit of this research study was not delineated nor were any teacher interventions discussed in conjunction with student comments. The reader is left to Research involving microworlds and school ag e children conducted since the late 1990s seems to be following similar frameworks. Healy and Hoyles (1999) conducted case studies of 12 13 years olds using Logo based microworlds. They provided detailed accounts of how student interaction with microworlds resulted in their adopting different problem solving strategies incorporating visual and symbolic reasoning in varying during the tasks. This omission is curious beca use the researchers concluded that it is critical that computer use be carefully integrated into instruction and not be a supplemental add on. It is not apparent if the researchers are envisioning the microworld as a purely self directed discovery environm ent. Stohl and Tarr (2002) seemed to echo this sentiment of integrated instruction. They claim that the microworld, Probability Explorer

PAGE 118

102 appropriate statistical inferences, is not a panacea for probability instruction. What is reasoning about such topics; however, in their study the researchers designed the instructional program and functioned as classroom teacher. So the reader is left to wonder instructional unit integrating Probability Explorer Kordaki (2003) conducted qualitative research examining the effect of computer microworlds on 9 th regarding the concept of conservation of area. It focused on their learning processes and microworlds (i.e., electroni verbal interactions) along with field notes of the researcher showed students exhibiting a flexible and broad view of appropriate solution strategies; however, no information regarding the inter ventions of the teacher was provided. It would seem beneficial for a research study whose focus is on the learning processes of students to include some It would appear that a limi tation with much of the research presented in this last section is the absence of discussion related to the role, and impact of the classroom teacher within a microworld based instructional/exploratory unit. Although tasks and units of discovery that promo te independent learning are definitely valuable, one would certainly surmise that a qualified teacher would be able to add support, guidance, and depth to such learning environments. It would be helpful to know if certain qualifications (content or technol ogy related) are needed for a teacher to implement the various instructional units described in the previous research studies. The research I propose will

PAGE 119

103 be providing not only a detailed description of the instructional units and the microworlds integrate instructional setting. It must be noted the research dynamics will be different as the proposed study will be conducted in the context of a teacher education college methods course. I n the concluding section of this literature review, the role of the instructor will be one facet examined while reporting on the research that has investigated the use of microworlds within teacher education courses. Microworlds and Teacher Education A new technology discussed in this section, and incorporated into this research, is the Internet or web based microworld (also known as online or Java applets). This technology is very new and dynamic in the sense that it is evolving along with the Intern et. Because of the young age of the Internet (the first commercial web browser was only released in 1994), educational research based on its technologies is also in its early stages, with the vast majority of it surfacing after 1998. The amount of research within this domain is growing but currently very limited. The foci of research involving microworlds and teacher education fall along a continuum involving aspects of the affective domain (Timmerman, 1999) and knowledge types (Keller & Hart, 2002; Wetheri ll, Midgett, & McCall, 2002), with other research examining specific mathematical content (e.g., fractions Chinnappan, 2000; and the mathematics of change, Bowers & Doerr, 2001). Another important consideration while evaluating the research is the platfo rm on which the microworld will be running. For example, some of (Bowers & Doerr, 2001; Chinappan, 2000; Timmerman, 1999); however, others are

PAGE 120

104 online applets, reside on the Internet, and can be accessed on any computer through an Internet browser (Keller & Hart, 2002; Wetherill, Midgett, & McCall, 2002). Although the foci of the research and the type of microworld used varies, it is widely agreed upon that mathematics teache rs, not the tools of technology, are the catalysts to bring about a meaningful learning of mathematics with technology (Kaput, 1992; NCTM 1991, 2000; Willis & Mehlinger, 1996). Garofalo, Drier, Harper, and Timmerman (2000) provide five guidelines (discusse d earlier) for technology based activities designed to help reexamine and deepen understandings of mathematics. All the research found pertaining to web based microworlds and preservice teachers involved exploring mathematics that pre and inservice teache rs will be responsible for teaching. Browning and Klespis (2000) question this approach, at least in regards to secondary teachers, and instead suggest that in order for preservice teachers to experience and understand the impact of technology upon the lea rning of mathematics, the concepts must be new and on their level. Although this approach would appear a possible alternative for secondary mathematics majors, it does not fit as well for preservice elementary teachers, which is the focus of my study. Inte grating technology into instruction can take on many forms; however, there is consensus that the most effective learning within technology rich environments occurs within the specific content area which the technology will be used (Bull, 1997; National Gov to different degrees. The four studies discussed in this section involve software based microworlds and provide examples of the degrees to which technology can be integra ted within a methods course for teachers. Tzur and Timmerman (1997) conducted a teaching experiment with a

PAGE 121

105 case studies with three of the teachers. The Sticks microwor ld was incorporated within researchers were able to use research on stages o degree the researchers felt that knowledge of student thinking, the microworld, or the instructional sequence and materials contributed to the gains stated. representation of fractions in a microworld environment. The stud y was limited in scope. Eight volunteer preservice elementary teachers met individually with the instructor, who was the investigator, for approximately two hours. The interview sessions consisted of an orientation of the software ( JavaBars ) and solving tw o fraction problems, first without knowledge base suggests that they built up a minimum level of content knowledge of fractions. Analysis of their pedagogical content knowle dge growth revealed the participants were more concerned with solving problems than thinking about difficulties students might have solving the same problems. The preservice teachers did not exhibit skills at using the microworlds to provide different and pedagogically powerful solutions or representations to the given problems. One might conclude that the relative short contact time with the microworld combined with a lack of appropriate or motivating context could be a cause of the lack of pedagogical gro wth. Another explanation could be

PAGE 122

106 the inexperience of the participants. Livingston and Borko (1990) reported that novice teachers tend to focus on the content and the task at hand while the focus of an expert teacher is more often on the students. study by extending contact time with the microworld. This research had a similar methodology to Tzur and Timmerman (1997). Here Timmerman conducted a phenomenological study involving 12 elementary school teachers enrolled in a 16 week number concepts while using computer microworlds. Over the course of the semester, the conceptions of three teachers were studied, but this study focused on two of them. The subjects of the case studies had different motivations towards and backgrounds in mathematics. Field notes, audio tape interviews, a collection of reflective journals and final projects, classroom observ ations of the teachers, and pre and post course attitude surveys revealed that although the teachers enjoyed the control they had over their own learning with the applets, they could not shift their teaching style from teacher controlled to one allowing f or student independence and freedom to explore and learn about fractions while interacting with the microworlds ( Toys and Sticks ). In this study the teachers ended up not using the microworlds as part of instruction on fractions because of the lack of cont rol they had over the environment even though they acknowledged having difficulty generating conceptual explanations for some basic operations involving fractions (e.g., the division algorithm). It also became evident that personal learning preferences a nd styles influence the process of teachers learning in technology rich environments. Although the reporting was rich, details regarding the instructional

PAGE 123

107 sequencing were very limited. Bowers and Doerr (2001) seemed to strike an informative balance with t heir reporting. They acknowledge that students in a mathematics education course are simultaneously learners and teachers in transition. In their study they analyzed the lear ning of the mathematics of change and their developing understanding of how to teach effectively such concepts. The semester long study took place at two different universities with a total of 26 participants situated in similar courses designed around a m icroworld software environment called MathWorlds. The instructional sequence was perspective and then engage them as reflective teaching practitioners. Qualitative anal ysis of written work on problem solving assignments, reflective journals, and the perturbations as both student and teacher came to develop an appreciation for the v alue of conceptual explanations and explorations with technology. The value of viewing participants in the dual roles was confirmed as some of the participants developed mathematical insights as they created, taught, and reflected on mathematical lessons a role of mathematics students. Viewing preservice teachers in their dual roles as student and teacher and designing activities that stimulate both roles appear as a valuabl e way of integrating technology in such a way as to help address the demands of balancing content and pedagogy within a mathematics methods course. There is another emerging technology which after closer examination seems even better equipped to facilitate this

PAGE 124

108 balancing act. This review of the literature concludes with research pertaining to Internet based microworlds. Technologies residing within the Internet comprise an evolving world of knowledge and potential tool for education. Research on such a d ynamic domain must be on the cutting edge in both theory and application. In light of the emerging state of Internet based microworlds, it would seem appropriate to include a discussion of the prominent findings from the two studies found which have and co ntinue to investigate this technology, even though these findings are preliminary. Both studies utilize online applets and activities located at the Illuminations website developed in association with the NCTM and currently found at: http://illuminations.n ctm.org/ These studies investigated the influence of applet based instructional materials on both teacher knowledge (content and pedagogy) and student learning. Based on the success of the Illuminations based professional development, Wetherill, Midgett, and McCall (2002) designed a two part qualitative study on the impact of the NCTM Illuminations applets and support materials on teacher knowledge of mathematics content and pedagogy, p of thirty middle grade teachers who participated in a summer professional development project centered on the resources contained at the illuminations website, three teachers were identified to participate in this two part study. Data were collected from videotaped lessons, from phase one were encouraging. A paired t test from the 30 original participating teachers (including the three for this study) showed signifi to explain concepts. Other preliminary findings indicate that the fraction applet provided

PAGE 125

109 teachers opportunities to develop new insights into their own knowledge as well as their ons regarding the relationships of fractions. Data from phase one also showed that the fraction applet enabled both teachers and students to visualize mathematical relationships and hence deepen their understandings of fractions. The second phase will cont inue studying the subjects in the first phase to collect formative data on the design of the applet based resources. What was lacking in the reporting of phase one was specific information regarding the instructional materials used in the study. It is poss ible such information will be forthcoming in the formative research involved with phase two. Another study presenting preliminary findings regarding the use of applets found on the Illuminations website comes from Keller and Hart (2002). Their three phase study (two of which have been completed) evaluated curriculum embedded applets for A set of online instructional tasks were created that would engage the preservic e teachers in using the applet to develop their spatial visualization skills in the role of a student and then apply that knowledge by filling the role of a future teacher designing lessons involving isometric drawings. Paper and pencil tests and videotape d sessions from phase one suggest that the applet based instructional materials improved the preservice phase suggest that the instructional materials enhanced the pres pedagogical content knowledge as evidenced by their increased awareness of certain teaching and learning issues related to isometric drawings. As in the previous study, no specifics were provided regarding the content of the inst ructional materials or the role of

PAGE 126

110 the various instructors. Until formative findings are presented, one can only speculate as to the potential effects or influences of web based microworlds on the knowledge and skills of pre and inservice teachers and the resulting impact upon student learning. How they Informed this Study It becomes clear from reviewing the research that many preservice teachers, even those who possessed a strong mathematics backg round or at least expressed confidence about their content knowledge, exhibit a very limited pedagogical content knowledge as noted by an inability to provide conceptual explanations (Borko et al., 1992), being 3), and routinely being unable to anticipate expert teacher on the other hand has been shown to possess a more conceptually grounded understanding of many mathematica l topics (Fuller, 1993), displays an appropriate balance of procedural and conceptual knowledge (Hiebert & Carpenter, 1992), uses technology to promote conceptual understanding (Mitchell & Williams, 1993), and tends to focus on the student instead of the c ontent (Livingston & Borko, 1990). A novice teacher progressing along the continuum to becoming expert is clearly advantageous and every effort should be made to accelerate that progression. The progression is multi owledge will be an integral part of their teaching, and a lack thereof will very likely affect the quality of instruction (Grossman, Wilson, & Shulman, 1989) and ultimately student learning (Fennema & Franke, 1992). Research suggests that preservice teache rs can benefit from revisiting

PAGE 127

111 their mathematical knowledge in appropriate and meaningful contexts (Ball & Bass 2000), and that pedagogical content knowledge and content knowledge should be developed simultaneously (Good & Grouws, 1987; Stacey et al., 2001 ). One might assume that many aspects of PCK (e.g., a knowledge of student thinking) naturally develop while performing the act of teaching. Researchers have found too often this is not the case (Ball et al., 2001; Ma, 1999). Methods classes have shown to offer a very knowledge and pedagogical content knowledge (Ball, 1990; McGowen & Davis, 2002; Quinn, 1997; Simon & Blume, 1996; Stoddart, Connell, Stofflett, & Peck, 1993) ; however, links between how and to what extent CK and PCK, regarding specific mathematics topics, can develop within a methods course are lacking as well as are attempts to establish how dependent PCK may be upon CK. This research seeks to add to the body of knowledge about the relationships and potential dependencies between CK and PCK (specifically, knowledge of student thinking), and how these two can develop within a specially structured methods course. andings regarding measurement concepts such as area and perimeter, and the results have consistently shown that large percentages of students struggle with the most fundamental skills and concepts (Hiebert, 1981; Kenney & Kouba, 1997; Kouba et al., 1988; L indquist & Kouba, 1989; Martin & Strutchens, 2000). Not only are many students not learning the skills necessary to solve even the most basic problems involving area and perimeter, but it appears they are also at the same time developing misconceptions re garding these ideas (Hiebert, 1984; Hirstein et al., 1978; Piaget, Inhelder, & Szeminka, 1981; Wilson & Rowland, 1993). Repeated

PAGE 128

112 exposures to procedural oriented curricula materials and instructional strategies have not been able to address adequately the documented deficiencies regarding area and perimeter (Kamii & Clark, 1997; Martin & Strutchens, 2000). The fact that researchers have found teachers possess many of the same misconceptions regarding area and perimeter as students do is cause for alarm (Bal l, 1988; Ferrer et al., 2001; Fuller, 1996; Lappan et al., 1998; Maher & Beattys, 1986; Ma, 1999; Menon, 1998; Reinke, 1997; Simon & Blume, 1994a; Tierney et al., 1986). Although non traditional instructional strategies have been successful in remediation of student difficulties and developing a more conceptual understanding of area and perimeter (Casa, Spinelli, & Gavin, 2006; Johnson, 1986; Lappan et al., 1998; Moyer, 2001; Wilson & Rowland, 1993), very little research has been conducted to investigate wa ys to address the deficiencies preservice elementary teachers have shown towards these concepts. It would seem reasonable that if teachers possessed a more conceptual understanding of area and perimeter, they would be better able to compensate for a medioc re curriculum and more prepared to deal with student difficulties. Further research is needed to explore ways to intervene in and misconceptions identified by the liter ature. This research examined what preservice elementary teachers understand about area and perimeter (i.e., their content knowledge) and how they might approach student difficulties regarding these concepts (i.e., their knowledge of student thinking) bo th before and after innovative intervention. Integrating technology into the learning of mathematics has been shown to positively influence achievement, stimulate and enhance spatial visualization skills, and promote a more conceptual understanding of mat hematics for students and teachers

PAGE 129

113 ( Boers van Oosterum, 1990 ; Dunham & Thomas, 1994; Groves, 1994; Rojano, 1996; Sheets, 1993). To be ready to enter the technological classrooms of tomorrow, prospective teachers need content specific instruction with the a ppropriate pedagogical support needed for these newly forming instructional environments (Cooper & Bull, 1997; Glenn, 2000; Kersaint & Thompson, 2002; Timmerman, 2004); however, it has become apparent that many prospective teachers do not possess the neces sary knowledge or experience to meet these demands (MEET, 1999; OTA, 1995; Pellegrino & Altman, 1997; Thompson, 2000; Willis & Mehlinger, 1996). It is strongly recommended that appropriate technology integration be modeled for and experienced by prospectiv e teachers (Connors, 1997; ISTE, 2000, 2008; MEET, 1999; NCTM, 2000; OTA, 1995; Timmerman, 2004), preferably within contexts that help simulate future classroom experiences (Clements, 1999; Heid, 1997; Thompson, 2000). One such instructional strategy that can accommodate the technology, content, and pedagogy needs of preservice teachers is anchored instruction. Anchored instruction with preservice teachers has been shown to promote reflective thinking (McIntyre & Pape, 1993), help with incorporating appropr iate technology integration (Bauer, 1998), develop instructional units (Kariuki & Duran, 2004), and determine the significance of integrating technology into the teaching of mathematics (Kurz & Baterelo, 2004). The last study mentioned is the only one foun d examining the benefits of preservice teachers learning about and preparing to teach mathematics through anchored instruction. This is certainly an area ripe for further study. This research provided valuable insights into the possibilities of web based m icroworlds serving as a technology delivery medium for anchored instruction.

PAGE 130

114 CHAPTER 3 METHODS Prospective mathematics teachers learn about pedagogical content knowledge when their instructors model activities, introduce tools such as manipulatives an d technology, and discuss literature about how students learn certain mathematical concepts and about student misconceptions (MSEB, 1996, p.6) Introduction This study uses quantitative and qualitative methods in an attempt to accomplish three goals: (a) of area and perimeter and how they change and develop through intervention, (b) to thinking, and (c) to examine the use of anchored instruction that integrates the use of web based microworlds designed for exploring perimeter and area, as a potential learning thinking. These goals are motivated by the need deficiencies, specifically relating to area and perimeter. Although these goals are specific, they fall under an overarching purpose for preservice teachers, which is to develop

PAGE 131

1 15 contextual conte nt knowledge and pedagogical content knowledge side by side while simulating future classroom scenarios and teacher student exchanges. The teacher development experiment (TDE) provides a method for studying teacher development (Simon, 2000), and has shown to be a valuable approach for studying prospective Blume, 1994a). The theory (or models of learning) advanced by this study should not be viewed as static but rather such open to ongoing modification by the researcher as well as other scholars. In as unquestionable relationships within the data. Rather the goals of this TDE were to appropriately illuminate concepts (Goodman, 1984), develop and describe models of interventions that promote mathematical growth (Simon, 2000), blur the line between theory and practice ( Cobb, 2000), and provide a basis for further discussion and research. Research Questions The primary research question for this study is, content knowledge and pedagogical content knowledge, related to area and perimeter, change as a result of experiencing anchored instruction integrated with web based microworlds, In particular: 1 invol vement in the teaching episodes?

PAGE 132

116 2. W hat prior to involvement in the teaching episodes? if at all, during the course of this study? perimeter change if at all, during the course of this study? area and perimeter related to their content knowledge of those same concepts? Setting The context of this s tudy was a mathematics methods course for elementary education majors at a small, liberal arts college in the southeastern United States. The study involved the use of an intact group of PSTs ( n = 12). The PSTs were enrolled in a methods course that met tw ice a week for 75 minutes per class. To facilitate the technology component of this study, the class took place in a small computer lab. The lab was equipped with an instructor computer connected to a projector and to the Internet. Each student had their o wn computer, with Internet access, as well as ample desk space for working and note taking. The PSTs enrolled in this course were juniors and seniors who were working towards state certification as elementary school teachers of grades K 6. Typically, PSTs enrolled in this course will have completed their mathematics requirements (i.e., courses in College Algebra, Probability and Statistics, and Liberal Arts Mathematics). The small class size is in keeping with similar teaching experiments (Borasi, 1994; Lea vy, 2006; McClain, 2003; Simon & Blume, 1994a, 1994b, 1996). The


The study occurred at the college where the researcher is a full-time mathematics professor who had taught the elementary-level mathematics methods course more than nine times prior to conducting this study. According to Simon (2000), it is appropriate to conduct a TDE within the distinct learning community of the PSTs. Because the setting is a small liberal arts college (student enrollment is approximately 600), the researcher typically knows the students who enroll in the only section of the mathematics methods course for elementary education majors, as he is the primary instructor for other required courses they take (e.g., College Algebra, Liberal Arts Mathematics, and Technology in Education). By the time students appear in the elementary mathematics methods course, the researcher/instructor is aware of many of their mathematical strengths and weaknesses. Information obtained from the pre-study questionnaire and results from the pretest were factors in asking four preservice teachers to participate as case studies. Considerations included (a) their responses to items on the questionnaire, (b) the overall score and mathematical substance of their responses to similar items on the pretest, and (c) the potential of those responses to facilitate future interviews and interventions, data mining, case study construction, and subsequent model building of mathematical knowledge.

Description of the Methods Course

The methods course in which this study occurred is required for all elementary education majors. The course is conducted from a constructivist learning perspective. Students are actively involved in using manipulatives (both concrete and web-based) to assist in constructing understanding of mathematical concepts. They often work in small cooperative groups, which encourages the sharing and justifying of ideas.


The course syllabus (Appendix B) presents the purpose of the course as follows:

The purpose of this course is to provide opportunities for preservice teachers to examine and build upon their understandings of various mathematics topics, and to construct a vision of teaching and learning mathematics that considers the goals and the assumptions of the current reform movement in mathematics education. Content, methods, and materials for teaching elementary school mathematics will be examined cooperatively.

The preservice teachers are involved in a variety of activities. These include lectures, demonstrations, summarizing journal articles, preparing lesson plans, viewing, writing reflectively about, and discussing online videos of reform-based teaching episodes, analyzing the mathematical errors of elementary students, question-and-answer sessions, and numerous problem-solving situations, including discussion of applications for teaching. The textbook used in the course is Elementary and Middle School Mathematics: Teaching Developmentally, Sixth Edition, by John A. Van de Walle (2007). Typically, the textbook is used as a guide while the following mathematical objectives and pedagogy are addressed: (a) developing understanding in mathematics, (b) teaching through problem solving, (c) building assessment into instruction, (d) teaching mathematics equitably to all children, (e) integrating technology and school mathematics, (f) extending early number concepts and number sense, (g) developing meaning for the operations, (h) supporting understanding of basic facts, (i) developing whole-number place value and whole-number computation, (j) promoting estimation skills, (k) developing concepts and computation with fractions, and (l) developing concepts of measurement. Concepts involving area and perimeter (the focus of this study) do not appear in the Van de Walle text until chapter 20.


Although the author notes that these concepts are continually a source of difficulty for students, the text devotes a total of one page to addressing area and perimeter misconceptions. The treatment that area and perimeter receive in the course text lends further credence to the need for research on instructional methods that seamlessly and efficiently integrate elementary mathematics content with appropriate pedagogy, especially for methods courses already crowded with an abundance of topics to cover.

The Microworlds

Technology is one tool espoused by many to enhance the teaching and learning of mathematics (ISTE, 2000, 2008; Marzano, 1998; MEET, 1999; NCTM, 2000; NRC, 2001). As mentioned earlier, geometric microworlds, specifically designed for the exploration of area and perimeter concepts, were utilized within the teaching episodes to facilitate and motivate deep and extended exploration of the concept(s) and misconception at hand. After considerable Internet searching, comparing, and experimenting (both personally and with students in my methods classes), two well-designed microworlds were selected for this study: Shape Builder and an ExploreLearning Gizmo. The microworlds facilitated four specific instructional techniques identified in the analysis conducted by Marzano (1998). One of these interactive microworlds (see Figure 7) was conceptualized and designed by ExploreLearning and is located at http://www.explorelearning.com/ (2010).


Figure 7. Screenshot of perimeter and area microworld with several options selected. (Copyright 1999-2010 ExploreLearning. All rights reserved. Used by permission.)

The ExploreLearning Gizmo allows the user to select a shape such as a rectangle, stretch or shrink it by moving the mouse, and then observe the resulting effect upon its perimeter and area. This real-time feedback allows for the exploration of the misconception that increasing a figure's perimeter will always increase its area. The dimensions can also be controlled by directly entering numbers (decimals allowed) for the base and height. Various options can be turned on or off to allow for feedback or for discovery of the concepts of area and square units. The picture icon (upper left corner) allows the user to copy the display, as a picture, into any word processor.
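As a simple illustration of the relationships that the Gizmo's numeric entry makes visible (the dimensions below are illustrative and are not taken from the study's materials), a rectangle with base b and height h has P = 2(b + h) and A = b x h:

b = 2.5, h = 4:  P = 2(2.5 + 4) = 13 units,  A = 2.5 x 4 = 10 square units
b = 6,   h = 4:  P = 2(6 + 4) = 20 units,   A = 6 x 4 = 24 square units

Watching both measures update as the base is dragged from 2.5 to 6 is the kind of real-time comparison that invites a user to ask whether perimeter and area must always increase together.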


The other microworld used in this study was developed through a cooperative effort and with the support of the Shodor Education Foundation, Inc. The researcher worked with a programmer to design a microworld that supports the exploration and hypothesis testing of issues related to content knowledge and knowledge of student thinking. The original applet, called Shape Explorer, can be seen in Figure 8. Shodor incorporated many of the features from the microworld used for this study into their newest version, called Shape Builder.

Figure 8. Screenshot from Shape Explorer microworld website. (Reprinted with permission from: http://www.shodor.org/interactivate/activities/ShapeExplorer/ copyright 1997-2010, The Shodor Education Foundation, Inc.)


Shape Builder was released after this study was completed and is found at http://www.shodor.org/interactivate/activities/ShapeBuilder/ (2010). The redesigned microworld that was used in this study is shown in Figure 9. It is also called Shape Builder and can be found at http://www.shodor.org/~pjacobs/restored/shapebuilder/. However, because of major Internet platform upgrades at Shodor, that microworld is no longer supported. The microworld has two modes, Auto Draw Shape and Create Shape. When Auto Draw Shape is selected, the microworld will automatically create random shapes, both irregular (Figure 10) and rectangular (Figure 11).

Figure 9. Screenshot from the revised Shape Builder microworld website. (Reprinted with permission from: http://www.shodor.org/~pjacobs/restored/shapebuilder/ copyright 1997-2010, The Shodor Education Foundation, Inc.)


The complexity of the shape is determined by settings the user selects, and the user can choose to have the microworld ask for perimeter, area, or both. Being able to make calculations involving irregular shapes is an option that helps address a major area and perimeter weakness among school students and teachers alike, as presented in chapter 2. In Create Shape mode, the user can work on a grid, create shapes, and enter a guess for the perimeter and area; the user can also have the microworld compute the area and perimeter of the shape in real time. In either mode, the microworld will let the user know if they have entered the correct answer for perimeter and/or area, and after two wrong attempts the microworld will give the correct answer. The microworld tracks and can display the accuracy of correct and wrong responses, and it will give an error message if the user attempts to create a disconnected shape (Figure 12). Another option allows the user to create the outline of a shape (Figure 9), just as one could do with a manipulative such as color tiles, but then fill it in; filling the outline changes the calculation for the area but leaves the perimeter the same (compare Figure 9 with Figure 12). Such a feature could help students in addressing the misconception that figures with the same perimeter must have the same area and vice versa.
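To make this misconception concrete, consider a comparison of the following kind, which a user could reproduce in either microworld (the dimensions are illustrative and are not drawn from the study's instruments):

A 4 x 4 square:      P = 2(4 + 4) = 16 units,  A = 4 x 4 = 16 square units
A 1 x 7 rectangle:   P = 2(1 + 7) = 16 units,  A = 1 x 7 = 7 square units
A 2 x 8 rectangle:   P = 2(2 + 8) = 20 units,  A = 2 x 8 = 16 square units

The square and the 1 x 7 rectangle share a perimeter of 16 units yet enclose different areas, while the square and the 2 x 8 rectangle enclose the same 16 square units with different perimeters; equal perimeter therefore does not guarantee equal area, nor does equal area guarantee equal perimeter.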


Figure 10. Shape Builder screenshot of an irregular shape automatically generated by the microworld.

Figure 11. Shape Builder screenshot of a rectangular shape automatically generated by the microworld.


Figure 12. Screenshot from Shape Builder showing the error message when an invalid shape is created.

Figure 13. Screenshot from Shape Builder after the fill button was pressed with the shape shown in Figure 9.


Another way in which the Shape Builder microworld can facilitate the development of conceptual knowledge is by allowing the user to test whether increasing a figure's perimeter will always increase its area and vice versa; features in both of its modes help dispel this misconception. Both microworlds allow for dynamic interaction and real-time feedback, which are crucial to the implementation of anchored instruction and to the development and enhancement of conceptual understanding of concepts related to area and perimeter. These microworlds possess the necessary options to facilitate the building of a conceptually sound content knowledge of area and perimeter as well as specific tools to allow for hypothesis testing to help address the difficulties and misconceptions regarding area and perimeter discussed in the literature.

The Intervention

An important feature of a teaching experiment resides in the activities and situations used for the purpose of understanding the mathematical knowledge and growth of the participants. Both the PSTs (preservice teachers) and the instructor/researcher are involved in the active learning environment which is at the core of a teaching episode. In this study, the PSTs learned about elementary mathematics and how classroom students think about elementary mathematics, and the professor learned about the value of the planned intervention for developing the PSTs' content knowledge and knowledge of student thinking, and about the value of these experiences to an already crowded elementary mathematics methods course.


The teaching episodes align with and reinforce many of the objectives of the methods course, which include: interactive learning environments, cooperative group activities, round-table-like discussions, exploratory learning, the blending of content and pedagogy, technology integration, and examples of theory meeting practice.

In lieu of a formal and complete pilot study, the researcher engaged in piloting the various instruments and interventions that were used in this teacher development experiment. Steffe and Thompson (2000) strongly recommend that researchers who wish to conduct a teaching experiment engage in exploratory teaching first, so that they become acquainted with students' ways and means of operating in whatever domain of mathematical concepts and operations are of interest (p. 275). Towards that end, various aspects of the proposed study were piloted, beginning in the spring semester of 2004 and concluding in the fall 2006 semester, including: (a) the pre-study questionnaire, (b) the items and format of the area and perimeter pre-, post-, and follow-up tests, (c) the development and refinement of the scoring rubrics for the area and perimeter tests, (d) the framework and classroom testing of the teaching episodes, and (e) the interview protocols. All the pilot work done for this study was conducted in courses for elementary teachers. Details of the different piloting sessions are found in Appendix A. The major decisions resulting from piloting are presented within the appropriate sections. A similar version of the format used for the teaching episodes was piloted in the fall of 2006.


The pilot informed the actual teaching episodes in the following ways: (a) there needed to be a separate orientation session (a few weeks before the formal study would begin) to acquaint the PSTs with the two microworlds used in this study; (b) there needed to be a clear transition within the teaching episodes between the PSTs thinking as learners of mathematics and as future teachers of mathematics; and (c) each teaching episode needed to comprise two class sessions, one for individual problem solving and opportunities to reflect upon their written responses and another for cooperative work, whole-class discussion, and subsequent reflection writings. Analysis of the PSTs' work from the piloted teaching episode revealed that most of them were at a novice stage in their application of both content knowledge and knowledge of student thinking. They spent minimal time analyzing the mathematics of the problem; hence, they initially overlooked mathematical subtleties of the problem, attention to which is a valuable skill of experienced and effective teachers. For some PSTs the microworld did not seem to facilitate mathematical or pedagogical growth; however, others indicated signs of growth in both categories (see Appendix A).

Anchored Instruction

Anchored instruction was used to frame the teaching experiment and the subsequent teaching episodes. Anchored instruction is a research-based paradigm for learning through technology-assisted problem solving that involves the creation of an anchor or focus [typically, technology based] around which instruction can be situated (1994, p. 131). The instructional sequence actively involved preservice teachers in thinking about and planning for how best to address the concepts and misconceptions at hand within a motivating and authentic context.


Although videodiscs have often been used to provide an environment to facilitate anchored instruction and problem solving within a meaningful context, interactive geometry microworlds, specifically designed for the mathematical content in this study, were used to provide the dynamic environment. Within the anchored instruction framework, these features provide a model through which preservice teachers were observed, their work examined, and discussions and interviews conducted as they explored and wrestled with concepts individually and cooperatively with peers. The model includes: (a) introducing (verbally) an interesting problem and a general framework (which included a microworld) for solving the problem, (b) providing time for PSTs to generate and test their own strategies, and (c) providing PSTs time to work with one another, with input from the researcher and the research literature. The above processes are not meant to imply that transforming content knowledge into pedagogical content knowledge occurs in a set of fixed stages, phases, or steps. Instead, teacher education can only attempt to provide preservice teachers with the understanding, performance abilities, and a setting in which to develop the tools they will need to teach effectively.

The Teaching Episodes

The focus of the teaching episodes for this study was the common difficulties and misconceptions classroom students (and teachers alike) have regarding area and perimeter, and what effective intervention might involve. Too often the topics of area and perimeter are presented in isolation from each other (Chappell & Thompson, 1999; Hiebert & Lefevre, 1986; Simon & Blume, 1994a).


One aspect of this study investigated the anticipated merits of interweaving the exposure to both of these concepts throughout the teaching episodes. With this said, three teaching episodes were constructed. Each teaching episode began with a whole-class introduction to the focus of that teaching episode. Time was taken at the outset to explain the format of the episode (adapted from Guided Design, 1977) and to allow for questions to help clarify any directions. Because the concepts being explored (area and perimeter) are assumed to have been previously learned, there was not any lecture or content-based, teacher-led instruction prior to engaging the preservice teachers in individual problem solving. During the teaching episodes the preservice teachers first analyzed and attempted to solve the focus problem (see Figure 14) individually. After the individual work, the students were organized into groups of two or three and allowed time to share their thoughts about the problem and their problem-solving strategies, and then given time to reflect upon what they had heard and how it had influenced their understandings.

Figure 14. The focus problem appearing at the beginning of teaching episode 1. (In the problem, a hypothetical student's method is to shade the squares along the outside of the shape, as shown in Figure 2, and then to count those squares.)


Following the cooperative work time, the class came together and the instructor/researcher concluded the teaching episode with a whole-class discussion of the primary concepts and misconceptions addressed by the teaching episode and how the microworld could have been used to provide personal insight and enhance instruction. Each teaching episode (see Appendix K) was broken up over two class periods. The first class session involved all the individual problem solving and reflection (both with and without the applet), and the second class focused on cooperative work, whole-class discussion, and periods of reflection about both activities. For the first two teaching episodes the microworlds were not made available until after the PSTs had worked on the focus problem for several minutes. Then they were given the next section of the packet and instructed to access the microworld to reevaluate and possibly refine their earlier responses. For the third teaching episode, the preservice teachers had access to the microworlds from the beginning. This was done to determine whether the PSTs considered the microworld(s) a tool to aid them while problem solving and when hypothetically interacting with students, or viewed it as an add-on (i.e., something used after the majority of the problem solving was done). Each teaching episode was self-contained and presented to the PSTs in the form of a Learning Packet (see Appendix K). Each packet contained the following:

1. A problem addressing the primary concept(s) and misconception to be explored (see Figure 14),

2. Follow-up questions asking the PSTs about the correctness of the hypothetical student's thinking and how they would follow up with the student,


3. Interspersed opportunities for the PSTs to reflect on their current progress and thinking,

4. A time to get into groups of two or three and discuss their thoughts and findings,

5. A writing time to express shared knowledge in relation to their own previous knowledge prior to the sharing, and

6. A cooperative summary and whole-class discussion of the salient points of the activity, providing the PSTs another opportunity to reflect on and summarize how their mathematical understandings, knowledge of student thinking, and potential teaching strategies had changed as a result of the teaching episode.

Because the PSTs were asked to reflect about cognitive issues, as opposed to affective issues (e.g., beliefs), opportunities to reflect were incorporated directly into the context of the teaching episode, as opposed to being placed in a reflection journal and completed outside of class. The timing and placement appeared to help capture moments of thinking as they occurred. The problems used in the teaching episodes were a mixture of testing items selected for the study and problems specifically modified to elicit mathematical discussion and contextual pedagogical reflection. In order to facilitate ongoing and retrospective analysis, as required in a teacher development experiment, the three teaching episodes were videotaped. The videotape was used by the researcher for ongoing analysis of the format and carrying out of each teaching episode as well as for later analysis of instructor and PST involvement. Each teaching episode encompassed two 70-minute class periods, or one week of the semester.


Modifications to Teaching Episodes

Many of the modifications to the teaching episodes were changes to format. Retrospective analysis of TE 1 resulted in the addition and revision of certain writing prompts to elicit feedback that would better establish patterns of novice and/or expert behavior without overwhelming the PSTs with the specific mathematics inherent to a microworld. Because some of the focus problems (e.g., TE 2, shown in Figure 15) could be solved or approached in different ways, a writing prompt similar to the following was added about halfway through the episode: "Is Tommy's method a correct way to find the area of a footprint? Can you also think of yet another way to solve the problem?"

Figure 15. Focus problem for teaching episode 2.
The Setting: Your 5th grade class is studying area, and you challenge them to find the area of one of their footprints. You instruct your students to stand on a piece of paper and trace their shoe, and then individually brainstorm a strategy to find the area of the footprint.
The Situation: After several minutes one of your students, Tommy, comes up to you and explains his method. He says he would lay a piece of string around the outside of the paper footprint, cut the string to the precise length, form the piece of string into a rectangle, use a ruler to measure the length and width of the rectangle, then find the area of the rectangle. In other words, he believes that the area of the rectangle will be the same as the area of the footprint.
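A short calculation of the kind the microworlds support shows why Tommy's reasoning is problematic; the numbers below are illustrative and are not taken from the study's materials. A string of fixed length fixes the perimeter of whatever figure it is bent into, but not the area:

A 24 in. string bent into a 2 in. x 10 in. rectangle:  P = 2(2 + 10) = 24 in.,  A = 2 x 10 = 20 square in.
The same string bent into a 6 in. x 6 in. square:      P = 2(6 + 6) = 24 in.,  A = 6 x 6 = 36 square in.

Because a single perimeter of 24 in. can enclose areas as different as 20 and 36 square in., the area of the rectangle Tommy forms need not equal the area of the footprint the string originally surrounded.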


Analysis of teaching episode 1 also revealed that modifications were needed to certain writing prompts involving content knowledge (CK) and knowledge of student thinking (KoST).

Revisions to CK & KoST Writing Prompts

The CK prompts were revised to better elicit the PSTs' underlying knowledge and problem-solving ability. For the knowledge of student thinking prompt, analysis of teaching episode 1 revealed the PSTs were frequently giving cliché-type responses. The prompt was revised to provide an opportunity for the PSTs to reflect upon and consider the educational implications of such things as the curricula and the presentation of topics (ideas we had discussed in the whole-class discussion at the end of TE 1), and to encourage them to reflect on personal experiences, thus revealing more about their mathematical background or beliefs about how students might best learn area and perimeter. To facilitate such reflection, this KoST prompt was revised along the following lines: Have you ever had the same incomplete understanding as Jasmine [Figure 17]? If so, what do you think may be the cause?


While these responses provide opinions, they reveal that the PSTs were beginning to consider possible causes of students' misconceptions and instructional techniques to address them. Two important modifications to Day 1 for TE 3 were enacted after analyses of teaching episodes 1 and 2 revealed that when a PST did not fully comprehend the mathematics of a problem, their written responses provided little insight into their thinking. The modifications increased the amount of content knowledge that could be gleaned from PSTs' responses. One revised prompt asked what, specifically, the PSTs would say or do; previously, PSTs tended simply to state that they agreed or disagreed with the student in the problem and only occasionally elaborated beyond that. The revised phrasing seemed to allow for more freedom to reflect and hypothesize about how they would respond to the student. The focus problem for the third teaching episode, which featured such a student claim, is shown in Figure 16.

Figure 16. Focus problem for the third teaching episode.
The Setting: You have just completed the last scheduled unit on area and perimeter with your 5th grade class. You feel they understand the concepts pretty well. While the class is working, one of your students, Jasmine, comes up to you very excited.
The Situation: Jasmine then tells you that she has figured out something you never told the class about. She explains that she has discovered that whenever you compare two rectangles, the one with the greater perimeter will always have the greater area. She shows you this picture as proof of what she is saying: a 4 in. by 4 in. square (perimeter = 16 in., area = 16 square in.) next to a 4 in. by 8 in. rectangle (perimeter = 24 in., area = 32 square in.).
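Jasmine's two rectangles are consistent with her conjecture, but a single counterexample of the kind a PST could build in either microworld shows that the conjecture is false (the dimensions below are illustrative and are not part of the study's materials):

The 4 in. x 4 in. square:    P = 16 in.,               A = 16 square in.
A 1 in. x 11 in. rectangle:  P = 2(1 + 11) = 24 in.,   A = 1 x 11 = 11 square in.

The thin rectangle has the greater perimeter (24 in. versus 16 in.) yet the smaller area (11 square in. versus 16 square in.), so a greater perimeter does not guarantee a greater area.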


Revisions to Cooperative Work. The PSTs were asked to work in cooperative groups of three during the second day of each teaching episode. They were then supposed to succinctly share with their group members their thoughts and ideas about the questions presented in Day 1. There were three sharing prompts: one pertaining to the questions addressing CK, one for KoST and instructional implications, and another for what was learned, and how, by interacting with the microworlds.


While each PST took turns sharing, the other group members were to compare what they were hearing with their own understandings they had gained. This self-reflective exercise was meant to see if the PSTs could identify their own lack of knowledge and integrate the new knowledge in a meaningful way. Instead, analysis revealed that the PSTs were focused on generating lists of factoids as they were given by their group members. Because these sections were designed to organize new knowledge into preexisting schemas of personal knowledge, each of the first three prompts from teaching episode 1 was rewritten to better focus on a specific knowledge type and to emphasize the reflective nature of the exercise. For example, the writing prompt from teaching episode 1 asking what the PSTs gained from their group was revised to ask what they gained from their group regarding student thinking (see questions __ & __) and instructional practices. The revisions focused attention on the specific knowledge types in question; however, the aspect of personally incorporating what was being heard into their existing knowledge was still greatly lacking. The prompts were therefore revised again to remind the PSTs, more directly, that a personal self-reflection was expected. An examination of the responses to the revised prompts in teaching episode 3 revealed only a slight increase in the quality of responses. While there were a few more meaningful responses, most still stopped short of a PST explaining her own knowledge as well as processing and integrating new knowledge.


One specific prompt, which asked what the PSTs gained from sharing with their group regarding the use of the two microworlds (see questions 7, 8, & 11) and asked them to be specific, was revised after TE 1 and again after TE 2 but still did not produce insightful responses. The purpose of this prompt was, in part, to help evaluate the effectiveness of the microworlds as a tool within the TDE. Instead, the majority of the responses included lists of likes and dislikes or general comments about how the microworld could be used to show Tommy he was wrong. In hindsight, the prompt should have been reworded to get at the idea of how best to use the microworlds with future students to help them uncover and resolve potential misconceptions related to area and perimeter.

Instrumentation

Instruments used in this study are described in this section. For each instrument, a brief synopsis of its design, format, and implementation, as well as how the pilot study influenced its use, is provided.

Pre-Study Survey Questionnaire

The questionnaire (Appendix C) consisted of 23 questions: five multiple choice, thirteen multiple choice followed by a request for further details, and four short-answer constructed-response items. The purpose of the questionnaire was to gather background information regarding the PSTs' (a) exposure to concepts related to area and perimeter, (b) use of concrete manipulatives to learn about area and perimeter,


(c) knowledge or use of various forms of technology (specifically computer software or the Internet) to assist in the learning or teaching of area and perimeter, (d) confidence regarding their future teaching of area and perimeter, (e) confidence and willingness to use technology while teaching about area and perimeter, and (f) pedagogical choices regarding teaching the fundamental properties of area and perimeter. Results from piloting this instrument suggested that it was necessary to separate survey items referring to area and perimeter into two different questions and to add more survey items related to previous exposure to technology. In addition, the format needed to be standardized (e.g., inclusion of Yes/No boxes) to ensure accurate and uniform completion. The last two survey items were added to specifically address the kinds of information associated with pedagogical content knowledge of area and perimeter.

Area and Perimeter Tests

The tests used for the pre-, post-, and follow-up assessments each consisted of 10 constructed-response items (see Appendices D, E, and F respectively). Before the pretest was administered, each PST was assigned a number (1-12). Each test contained a cover page with a space for the PST's assigned number; for each five-item half of a test, this number was written on the cover page and at the top of the first page as part of the administration process. The content knowledge (CK) questions (i.e., the first five) were administered and collected before the five questions addressing knowledge of student thinking (KoST).


This process helped to minimize the content knowledge questions biasing the knowledge of student thinking questions. The sources for the potential testing items included a searchable database of released items from previous administrations of the National Assessment of Educational Progress (National Center for Educational Statistics [NAEP], 2003, 2005), teacher resources dealing with measurement, and an extensive evaluation of research articles. The items selected for this study, along with their respective source(s), appear in Table 1. The goal was to select problems appropriate for pre- or inservice elementary teachers that also addressed the prominent difficulties and misconceptions regarding area and perimeter revealed in the literature, namely:

1. Trouble distinguishing between area and perimeter (Carpenter et al., 1975; Chappell & Thompson, 1999; Hart, 1984; Hiebert, 1981; Kouba et al., 1988; Tierney et al., 1990; Woodward & Byrd, 1983),

2. Confusing linear units and square units (CBMS, 2001; Hart, 1984; Hiebert, 1981; Lappan et al., 1998; Moyer, 2001),

3. The idea that all rectangles of a given area must have the same perimeter and vice versa (Lappan, 1998; Woodward & Byrd, 1983),

4. Wrongly believing that area and perimeter are directly related in that one determines or influences the other (Ferrer et al., 2001; Kennedy et al., 1993; Lappan, 1998; Ma, 1999),

5. Trouble devising real-world contexts for area and perimeter problems (Chappell & Thompson, 1998),


Table 1

Description of Test Questions Selected for this Study

Pretest (see Note 1)

  Item        Category                        Concept(s) addressed        Source
  Item #1     Content knowledge               Perimeter & units           Kenney & Kouba, 1997; Chappell & Thompson, 1999
  Item #2 S   Content knowledge               Area                        Chappell & Thompson, 1999
  Item #3     Content knowledge               Area, perimeter, & units    Hart, 1984
  Item #4 S   Content knowledge               Linear & square units       Sonnabend, 2004
  Item #5     Content knowledge               Area & perimeter            Bassarear, 2005
  Item #6     Knowledge of student thinking   Area & units                Sonnabend, 2004; Bassarear, 2005
  Item #7     Knowledge of student thinking   Perimeter & units           Bush, 2000
  Item #8     Knowledge of student thinking   Perimeter                   Bassarear, 2005
  Item #9     Knowledge of student thinking   Perimeter & units           Beckmann, 2003
  Item #10    Knowledge of student thinking   Area, perimeter, & units    Woodward & Byrd, 1983

Note 1: Items for the Follow-up Test were structured exactly the same (other than changing the names in the problems) as the Pretest.
S: Item also appears on the Posttest.


Table 1 (Continued)

Description of Test Questions Selected for the Study

Posttest

  Item        Category                        Concept(s) addressed        Source
  Item #1     Content knowledge               Area                        Hart, 1984
  Item #2 S   Content knowledge               Area                        Chappell & Thompson, 1999
  Item #3     Content knowledge               Area, perimeter, & units    Sonnabend, 2004
  Item #4 S   Content knowledge               Linear & square units       Sonnabend, 2004
  Item #5     Content knowledge               Area, perimeter, & units    Sullivan & Lilburn, 2002
  Item #6     Knowledge of student thinking   Area & perimeter            Bassarear, 2005
  Item #7     Knowledge of student thinking   Perimeter & units           Chappell & Thompson, 1999
  Item #8     Knowledge of student thinking   Area & perimeter            Menon, 1998
  Item #9     Knowledge of student thinking   Area & units                Hart, 1984
  Item #10    Knowledge of student thinking   Area & perimeter            Bassarear, 2005

Note: S: Item also appears on the Pretest. The rest of the posttest items are parallel to the pretest statistically, in format, and in content.


6. Trouble calculating area and perimeter of irregular shapes (Booker et al., 1986; Bray et al., 2006; Carpenter et al., 1975; Cass et al., 2006; Kouba, 1988), and

7. Difficulties explaining and/or illustrating the methods for their solutions (Ball, 1988; Chappell & Thompson, 1999; Woodward & Byrd, 1983).

Analyses of the pilot data revealed that these seven difficulties could be condensed into three broad analysis strands that would serve as an organizing framework for test responses: (a) distinguishing between area and perimeter, (b) units of measure, and (c) perceived relationships between area and perimeter. All three of these strands address, to different degrees, aspects of content knowledge and knowledge of student thinking. To be considered for use in piloting sessions and for final inclusion within the assessment instruments, each question needed to meet the following criteria:

1. The problem was appropriate for pre- and inservice elementary teachers.

2. The problem addressed some form of the common difficulties or misconceptions regarding area and perimeter presented in the literature.

3. The problem was already formatted as a constructed-response item or could be easily modified to fit that format.

4. The problem was already written in the context of a teacher addressing a student or students experiencing difficulties with area and perimeter or could easily be modified to accommodate that perspective.

5. The problem lent itself to the PST explaining their solution process and/or the thinking of the hypothetical student presented in the item, and facilitated an opportunity for the PST to describe how they would follow up with the hypothetical student or students.


6. No manipulatives or technologies were required to solve the problem.

The area and perimeter assessment administered as part of the pilot study contained 15 problems. To provide more time for PSTs to respond and to encourage thoughtful reflection, the tests used in this study were shortened to 10 items. The items found on the pre-, post-, and follow-up tests for this study were chosen because they: (a) were interesting and challenging enough to produce rich and diverse written responses, (b) were deemed best suited by the researcher to meet the goals of this study, and (c) met necessary guidelines based on descriptive statistics (i.e., mean scores, standard deviations, corrected item-total correlations, and various Cronbach alpha values). The potential to elicit a range of thoughtful responses was very important in the item selection process because of the nature of the qualitative analysis that followed. The reader is referred to the last section of Appendix A for more details regarding the refinement of these testing instruments.

Validity of Testing Instruments

Test validity refers to the extent to which an instrument measures what it intends to measure (American Psychological Association, 1985, p. 8). This definition highlights the fact that test scores by themselves are neither inherently valid nor invalid. It is the inferences that are made from the test scores that must be established as either valid or invalid (Gall et al., 1996). Evidence then must be provided to support any inferences about scores resulting from administering a test. Three types of evidence are commonly examined to support the validity of an assessment instrument: (a) content related, (b) construct related, and (c) criterion related (American Educational Research Association, American Psychological Association, & National Council on Measurement in Education, 1999).


There are two main considerations for establishing content-related evidence for a test (Moskal & Leydens, 2000). The first concerns the clarity and wording of the items. To ensure that this criterion was met, the researcher, aided by a second scorer, revised the instruments to clarify and minimize confusions related to language and format. Content-related evidence is also concerned with the extent to which the items on a test represent the conceptual domain that it is designed to measure (Gall et al., 1996). Evidence for content validity is established because the questions used for the pre-, post-, and follow-up tests were all drawn from extant literature pertaining to the teaching and/or learning of area and perimeter (see Table 1). Criterion-related evidence supports the extent to which performance on a given task may be generalized to other, more relevant activities (Rafilson, 1991). The items used for the testing instruments in this study are based on research literature investigating various degrees and types of knowledge possessed by students, PSTs, and teachers. The two constructs selected for this study, content knowledge and knowledge of student thinking, are considered indispensable to the meaningful learning and effective teaching of mathematical concepts such as area and perimeter (Ball, 1991, 2003; Ball & Bass, 2000; Hill et al., 2004; Shulman, 1986).


The scoring rubrics used to assess the tests also exhibit criterion-related validity because the scoring criteria address the components of the assessment activity (the tests) that are directly related to future practices within the teaching profession (i.e., the need for content knowledge and knowledge of student thinking) (Moskal & Leydens, 2000). Construct-related evidence focuses on the extent to which a test can be shown to assess the particular hypothetical construct(s) that it claims to measure (Gall et al., 1996). Two constructs this study attempts to measure are content knowledge and knowledge of student thinking, as pertaining to area and perimeter. Such constructs are internal and not directly observable. It is important, therefore, that any assessment attempting to measure such a construct considers, requests, and then examines both the product (i.e., the answer) as well as the process (i.e., the explanation) (Moskal & Leydens, 2000). The tests used in this study did just that. Although the PSTs were asked to answer several closed-ended questions, such questions were followed up by asking for an explanation of their thinking or for what they felt the student in the question was thinking. The holistic scoring rubrics used to grade the tests contain criteria that address both the product and the process of the testing items. No single item of evidence is sufficient to establish construct validity (Gall et al., 1996); therefore, the quantitative and qualitative results from the testing instruments served as supporting evidence (along with other qualitative data) to help explain the degree and type (procedural vs. conceptual) of mathematical and pedagogical growth experienced by the PSTs.

Procedures


In order to answer the research questions, data were gathered to document the PSTs' developing understandings related to content knowledge and knowledge of student thinking regarding concepts of area and perimeter. Some data were collected from the entire class, while other information (e.g., semi-structured interviews) was unique to the case subjects. The focus on misconceptions regarding area and perimeter supported the TDE methodology for this study and, together with the (1998) instructional recommendations discussed earlier, provided sustained opportunities to gather data from the PSTs. When using an emergent methodology, as this teacher development experiment did, these sustained opportunities for contact with the PSTs are important for generating multiple data sources. When data sources are triangulated to reveal a pattern or theme, there is greater confidence and trustworthiness that the apparent theme is not the coincidental result of a particular form of data (Simon, 2000; Tobin, 2000).

Data Collection

The mixed-methods approach generated both quantitative data (e.g., the pre-study questionnaire and the area and perimeter tests) and qualitative data (e.g., interviews and Teaching Episode packets). All the data were collected in a Teaching Elementary Mathematics course occurring in the fall semester of 2007. The PSTs were the 12 preservice elementary teachers who signed up for the class. The course lasted for 15 weeks, and students were only allowed two absences during the course. The study lasted five weeks and involved approximately ten classroom contact hours, as described below:

Week 1: Dispensed and collected the pre-study questionnaire.


Week 2: Administered the pretest; based on the questionnaire and informal results of the pretest, four PSTs were purposely selected for in-depth study as particular cases.

Week 3: Results from the pretest were used to inform semi-structured interviews with the four PSTs selected for case study.

Prior to Week 7: Gave the PSTs time in class for directed use of the two microworlds that were integrated into the teaching episodes as part of the anchored instruction.

Week 7: Conducted the first teaching episode.

Week 8: Conducted the second teaching episode.

Week 10: Conducted the third teaching episode.

Week 11: Administered the posttest; results from the posttest were used to inform semi-structured interviews with the four case-study subjects.

Weeks 12 & 13: Conducted the second round of semi-structured interviews.

Week 15: Administered an unannounced follow-up test as part of the in-class final exam.

It is common for larger and more extensive teaching experiments to last an entire semester (Leavy, 2006; Simon & Blume, 1994, 1996); however, such studies often investigate broad constructs (e.g., statistical inquiry, Leavy; multiplicative relationships and justification, Simon & Blume). Although this study represents a brief intervention, it is in keeping with other similar teaching experiments which studied specific mathematical content (Borasi, 1994; Komerek & Duit, 2004; McClain, 2003).


Whole Group Data

Pre-Study Questionnaire

The pre-study questionnaire was administered during class time to all the PSTs. Students were instructed to answer each question to the best of their memory and to be as specific as possible (i.e., provide personal situations or supportive examples) when asked for opinions regarding technologies as well as when responding to hypothetical pedagogical questions. All students were present when the questionnaire was administered.

Before the study began, class time was used to orient the PSTs regarding the two microworlds that were used in this study. One problem was selected for each microworld that highlighted the important features of that microworld (see Appendix M). The researcher modeled the various features of each microworld without specifically discussing the pedagogical benefits of certain features. The PSTs were then given an opportunity to use each microworld while engaged in solving the two chosen problems. Neither of these problems was used in any part of the actual study, and they did not involve any of the misconceptions under scrutiny in this study. One student was absent for the orientation, and a time was scheduled the same week for her to work through the orientation in my office while I supervised. The orientation also provided an opportunity to watch for evidence of novice and/or expert teacher characteristics. The second observer was present during the orientation session, and the session was videotaped. Shortly after the orientation session, the researcher and the second observer met, discussed the session, compared notes, and agreed that nothing occurred during the orientation session that would bias any aspect of the study.


The second observer was the current Dean of Academic Affairs at the institution where the researcher was employed full time. She holds a Ph.D. in Instruction and Curriculum and has vast experience with the elementary curriculum and preservice teachers. The second observer and I met once over the summer, and had several email correspondences, to discuss various aspects of this study, especially methodology, as well as her role as second observer. The observer protocol (Appendix L) and the format of the teaching episodes were discussed.

Administering Area and Perimeter Tests

The pre-, post-, and follow-up tests were taken by all PSTs and were administered during class time. Only one test was not taken as scheduled (a follow-up test), and that was made up under supervision. Each test was comprised of five content knowledge (CK) questions and five questions addressing knowledge of student thinking (KoST). Before responding to any items, each PST was given the first half of the test (i.e., the content knowledge questions) and asked to complete its cover page. The PSTs were asked to turn to the first page of the test and the researcher read the instructions aloud. A brief description of the two categories of questions (i.e., CK and KoST) was presented, and the PSTs were informed that they would be functioning first as student/learners and then as prospective teachers and that they should think, analyze, and respond accordingly. The PSTs were encouraged to ask questions regarding the format of the test or what was being asked of them. No significant questions or discussion ensued. The instructor/researcher was available during the exams to address questions related to test or item format, but no mathematical assistance was given. The pilot study revealed that one hour would be sufficient to complete each testing session.


The PSTs were encouraged to complete the first half of the test (content knowledge) in approximately 25 minutes. When they finished the first half, it was collected and the second half of the test (knowledge of student thinking) was provided, for which 35 minutes was scheduled. The one hour proved sufficient for most; however, because the computer lab where we were conducting class was available for the period that directly followed our methods course, a few students needed and took an additional 5-10 minutes to finish their test. Testing times are provided in Chapter 4. PSTs were instructed to raise their hand when they completed each portion of the test so the researcher could document the stop time. The PSTs were instructed that after finishing the entire test they were to sit quietly; the time taken for each portion of each test was documented on a spreadsheet. This information was used during the analysis stage. The above process was completed for the pre-, post-, and follow-up tests.

Data from Teaching Episodes

Both the instructor/researcher and the second observer kept field notes during each teaching episode. The instructor/researcher documented pertinent observations of and conversations with PSTs (especially the case subjects, described later) that occurred during the teaching episodes. Special effort was made to document whether the behavior or conversation was focused on mathematical content (i.e., area and perimeter) or on aspects of pedagogical content knowledge (specifically, knowledge of student thinking). The second observer used the observer protocol (Appendix L) to organize her observation activity. Debriefing time was scheduled for the researcher and observer following each teaching episode.


While engaged in each teaching episode, every PST completed a Learning Packet (Appendix K). They were asked to provide written responses to questions and prompts pertaining to aspects of mathematical content knowledge related to area and perimeter and their knowledge of student thinking regarding contextual situations involving those same concepts, reflective activities throughout the episode focusing on current and evolving understanding, perceived and realized benefits of exploring concepts with the microworlds, and how the cooperative work influenced their mathematical and pedagogical understandings.

Roles

This study matches the multi-level focus encouraged by and provided for in the TDE. There were two levels of participants in this study: the researcher/teacher educator and the preservice teachers. There were also two levels of curricula being explored: the elementary mathematics curriculum of area and perimeter and the teacher education curriculum. The intervention implemented a unique instructional approach for learning about area and perimeter concepts. It addressed concerns and recommendations of the research literature for both teacher education and the teaching and learning of elementary mathematics. Specifics about the teaching episodes will be presented later in this chapter. Not only did the researcher function in a dual role during this study, but so did the PSTs. Preservice teachers enrolled in a mathematics education course are simultaneously learners and teachers in transition (Bowers & Doerr, 2001). As learners, they have opportunities to investigate and construct new thoughts about seemingly familiar mathematics and about ways that others might learn the same concepts. As teachers in transition, they are contemplating how their learning experiences and understandings in mathematics will relate to and prepare them for future experiences as teachers in their own classrooms.


This dual role served as a backdrop for rich and meaningful explorations into the teaching and learning of area and perimeter.

Case Subjects: Selection and Data Collection Process

Four PSTs, two scoring at or near the bottom on the pretest and two scoring at or near the top, were identified as case subjects for in-depth examination. The quality of their responses on the pretest, as opposed to some predetermined score, was of primary consideration, because responses rich enough to support in-depth study as particular cases assisted in the analyses. A teacher development experiment also acknowledges, analyzes, and reports upon social and affective components of the environment or setting being investigated. Although the researcher admits it is practically impossible to study mathematical learning in a vacuum apart from these variables, they were not a primary focus in the collection, analysis, or reporting stages of this study. Certain data collection procedures were unique to the case subjects: there were two semi-structured interviews, and their behavior was a primary focus of observation, intervention, and interaction (3) during the teaching episodes. The interview data served an important role in the pattern matching for test scoring as well as expert/novice coding. All interviews were videotaped and the audio was transcribed. The video camera was focused on the portion of the desk where the case subject was working. That allowed for capturing the case subject's problem-solving activity.

(3) Intervention/interaction refers to communication (usually two-way) between the researcher and the preservice teacher.


The video proved valuable during instances when the researcher pointed or made reference to a case subject's written work. Two of the four baseline interviews were double coded with the expert/novice coding sheets by the same secondary scorer mentioned earlier. This process of pattern matching is a useful validity tool (Gall et al., 1996; Yin, 1994), and helped ensure reliable coding of patterns and identification of possible themes. Before interview transcripts were finalized, the videotapes were watched in their entirety to allow for additional comments to be inserted providing any necessary context (e.g., at this time, the preservice teacher pointed to the 2 x 7 rectangle she had drawn). When necessary, the appropriate videotape was consulted during the coding process, thus providing an additional quality check to help validate the analysis. The first semi-structured interview with each case subject was conducted within ten days following the pretest and before the first teaching episode (which began approximately one month after the pretest). All four first interviews were completed within two weeks. So that memory failures would not impact the results of the interviews, the PSTs were shown their own work while answering interview prompts. For the first interview, responses from the questionnaire and pretest served as a basis for the interview protocols. Questions and probes were designed to clarify responses from those instruments and help gain an understanding of the PSTs' knowledge related to area and perimeter.


While most interview questions were planned, there were also times when unstructured (or unplanned) follow-up questions proved necessary. While piloting interview protocols with PSTs, the need for such a semi-structured approach was reinforced. On two different occasions an interviewee was asked to clarify a response about perimeter; it was determined that one respondent actually did understand perimeter but simply misspoke, whereas the other preservice teacher was truly confused and lacked a conceptual understanding of the measure. Purposeful questions were avoided during the first interview as they could result in a teaching situation and, as such, potentially bias the remainder of the study. During the three teaching episodes, the instructor/researcher observed, interacted with (in more of a clarifying manner), and took field notes of meaningful activities, taking special note of the investigative processes, hypotheses tested, and reasons offered for various insights and interpretations of the four case PSTs. The second interview involved the same four case subjects and occurred after the posttest and during weeks 12 and 13 of the semester. This interview included direct, follow-up contact with the case subjects. The initial protocol consisted of clarifying questions based on posttest responses, but also included some purposeful questions (e.g., What specifically would you say or do to help this student?). Purposely selected tasks (see Appendix N) were also integrated to further assist with the collection of data measuring growth, or lack thereof, of content and pedagogical content knowledge.


Observing the preservice teachers analyze, problem solve, and respond to real-time questioning provided an additional record. There were no significant clarifications needed for any interview episodes before the follow-up test was administered. Each of the first and second interviews was approximately 45 minutes to an hour in duration. Case-subject data were also collected during the teaching episodes. All teaching episodes were videotaped, and both the researcher and second observer kept field notes to document significant individual and group behaviors, responses to classmates, and responses to researcher interventions. The researcher looked for opportunities to interact with all PSTs, especially the case subjects. These opportunities were used as an attempt to document what might not have been captured in the learning packet or on videotape regarding the problems presented in the teaching episodes.

Data Analysis

The emergent and unpredictable nature of a teacher development experiment requires a flexible analysis scheme. The analysis method used in this TDE was adopted from a grounded theory approach and its constant comparative method of analysis (Glaser & Strauss, 1975). The TDE involves two important levels of data analysis: the ongoing analysis, which occurred during the teaching episodes with the preservice teachers and between the teaching episodes as a personal reflection activity, and the retrospective analysis, which focused on the entire TDE or a subset of those data considered to be a useful unit of analysis (Simon, 2000).


Simon explains how the ongoing analysis is the basis for spontaneous and planned interventions with the preservice teachers; these interactions helped gather additional information, test hypotheses, and promote further mathematical and pedagogical development. A key aspect of ongoing analysis is the iterative process of generating and modifying models of student development. For this study, that involved models of the PSTs' content knowledge and knowledge of student thinking, how they develop, and how they may interact. The retrospective analysis, according to Simon (2000), involves a reexamination of a larger body of data. This could be the entire TDE to date or a subset of those data (e.g., a baseline and follow-up interview with a case subject) that is considered to be a useful unit of analysis. This analysis involves a careful, structured review of all the relevant data of the TDE for the purpose of continuing to develop and refine explanatory models. Simon conveys that the development of explanatory models of preservice teachers' knowledge is iterative: descriptive and illuminating models begin to appear and take shape during the ongoing analysis; however, it is during the retrospective analysis that the models begin to stabilize and can be articulated more fully. The TDE methodology, supported by anchored instruction and the Guided Design model, directed and informed the ongoing interventions and interactions between the PSTs and the researcher, thus providing continued opportunity to collect data and refine hypotheses regarding individual and group development pertaining to content knowledge and knowledge of student thinking, and to permit finding answers to the five research questions of this study.


Scoring Rubrics for Area and Perimeter Tests

The overall scheme and initial criteria used for both the content knowledge and knowledge of student thinking holistic scoring rubrics were directly adopted from Cai, Lane, and Jakabcsin (1996), and were informed and influenced by Thompson and Senk (1998). This work supports the decision to use separate zero-to-four-point scale rubrics for measuring content knowledge and knowledge of student thinking (see Appendix H). The reader should keep in mind that the rubrics address the quality of a response. As a result of the scoring training process and many pilot sessions, tables were created to delineate responses associated with each relevant misconception (see Appendix I) and to help differentiate a response emphasizing procedures from one focusing on understanding, as well as responses teetering between scores. As reflected in the rubrics, a key distinguishing scoring factor is the presence and degree of conceptual understanding; the difference between unacceptable, acceptable, and model responses rests in that construct.

Reliability of the Data

The reliability of test scores refers to the consistency, stability, and precision of test scores (Gall, 1996). On a reliable test, a student would expect to receive the same score regardless of when the student completed the test, when it was scored, or who scored it (Moskal & Leydens, 2000). There are four general classes of reliability estimates: (a) internal consistency reliability, (b) test-retest reliability, (c) parallel forms reliability, and (d) inter-rater reliability (Gall et al., 1996). The following four sections present the extent to which this study addresses each of these reliability measures.

PAGE 175

159 reliability, and (d) inter rater reliability (Gall et al., 1996). The following four sections will present the extent to which this study addresses each of these reliability measures. Internal Consistency Reliability This form of test score reliability is used to judge the consistency of results across items on the same test. Essentially, you are comparing test items that measur e the same construct (e.g., area or perimeter) to determine if they yield similar results When a test taker answers si milar questions in similar ways, that is an indication t hat the test has internal consistency. used to measure internal consistency when items are not scored dichotomously (e.g., right or wrong) but rather given a range of scores. Because the items used for the tests in this study were scored on priate measure of reliability for this .63 respectively, meeting the criteria for internal consistent reliability (Nunnally, 1978). because four of the ten items on the test had negative corrected item total correlation. None of those problems appeared on any future tests in this study. For the actual study entire pre post and follow up test, but also for the CK and KoST subtests. Recall that each 10 question test was split into a five question CK subtest and a five question KoST exp lanation. There are two important factors that can negatively influence reliability: a limited number of items or limited variability in the scores of those items. In this alpha

PAGE 176

160 for certain parts of each test (see Table 2). The limited number of items in each subtest ( n = 5 ) is one potential culprit for the low a lpha coefficients; however, after careful alpha also contained a test item h aving limited variability in its scores. For example, item 10 on the pretest (same item was problematic on the follow up test) proved to be the easiest question of any item on any tests (mean of 2.75, SD of only 0 .45). Wha t made this item even more troubling to reliability was the fact that PSTs who scored low on various other test items scored equally well on question #10 as those who scored well on those same items. That same situation was present for the other subtests w explanation includes the limited number of items and a small number of problematic test re post and follow up tests were strong (.75, .75, and .76 respectively) indicating that the testing instruments produced a majority of scores that had an acceptable level of internal consistency (Nunnally, 1978). Caution however must be taken when dr awing conclusions with measures derived from Table 2 Post and Follow up Tests CK subtest KoST subtest Overall Pretest .75 37 .747 Posttest 48 .66 .752 Follow up .64 54 .761

PAGE 177

161 Inter Rater Reliability: Training and S coring Whenever human beings are involved in a measurement process, careful consideration must be made to establish the reliability and consistency in the scoring of the items on an assessment. In an effort to measure the extent to which the researc her consistently and reliably applies the scoring rubrics to the testing instruments, 27 (out of an available 81) area and perimeter tests (each containing 15 items) were double scored a nd used for training purposes. Before any scoring was done by the second scorer, a lengthy training session was conducted. The second scorer holds a Ph. D. in Curriculum and Instruction with a concentration in mathematics education and has considerable exp erience with elementary mathematics content and pedagogy. The second scorer double scored 5 of the 12 pretests (or roughly 30%) and 4 of the 12 posttests. As part of this effort, t he results from the inter rater reliability process resulted in clarifications made to the language of the holistic scoring rubrics, the addition of supplemental grading sheets (see Appendix I), and improvements in item format and wording including the e limination of several items. These revised rubrics were used to score all subsequent test papers, and high scoring reliability was achieved throughout. The training and scoring sessions for the first batch of 27 tests had an inter rater reliability of 94%. The second and third scoring sessions had a slight drop in inter rater reliability, 88% and 86%. These two subsequent scoring sessions involved only four 10 item tests, which may help to explain the slight drop in inter rater reliability. Also, the test u sed for the third pilot contained four problems which had negative corrected item total correlation. These problems were removed from consideration for this study.

PAGE 178

162 Rubric Scoring and Coding Training Before the pretests were scored, the researcher purp osely selected two pilot test papers, which reflected a wide range of responses, to be used for a training session. The primary purpose of this session was to reacquaint the scorers with both the scoring rubrics (Appendix H) and the supplemental grading sh eets (Appendix I). Discussion occurred after each test was independently scored. Among other things, this allowed the rubric. This training session took place approxima tely a week and a half before the training for pretest scoring was scheduled to occur. There were two training sessions that preceded the formal double scoring of 5, ten question pretests. After perusing all the pretests, the researcher purposely select ed two pretests (one that appeared strong and another that appeared weak) that appeared to provide a wide range of response patterns. The researcher and the second scorer independently scored the same training paper. There was agreement on nine out of ten items for the first training test. The one disagreement was on question #4, which appeared as the same numbered question for the pre post and follow up tests. It proved to be one of the most difficult problems both to answer and to score. The second tr aining pretest s thinking portrayed in the world situation (or story problem), appropriate for 4 th or 5 th graders, in which they would need to find the

PAGE 179

163 we are going to build for our pet turtles. Two sides of the fence will be 12 inches. The other two sides will be 8 inches. It will look like this: (a rectangle was drawn and all four sides were appropriately lab response a 2 and the other gave it a 1. During the discussion, each scorer could be convinced (based on the rubric) to change their score. After further examination, it was decided that the respon se was conceptually incomplete and very weak (e.g., her left us wondering if the PST was actually thinking about perimeter instead of area. The lack of conceptual understanding provided a meaningful dividing line between a score of 1 and a score of 2. It was agreed this item should be scored a 1. To help reduce similar solutio the scores awarded. This was an important clarification that helped in distinguishing whether an item deserved a score of 1 or a score of 2. There were no other significant changes to the scoring rubrics, the supplemental grading sheets, or the manner in which they were applied as a result of the training sessions. The 5 pretest papers that were formally double scored were purposely selected based on an informal examinati on of the quality and depth of responses (both strong and weak). The goal was to provide scoring opportunities that would span a potential range of scores across a diversity of knowledge and understanding. Before any scoring was done, the researcher and se cond scorer agreed to grade the same problem for each test before moving on to the next problem. Two tests were double scored and the results discussed before scoring the other three pretests. The final pretest that was double scored included

PAGE 180

164 scores rangin g from a 1 to a 4. In spite of that, there was 80% agreement on the scoring of the items. The double scoring of these five pretests produced no clarifications to the scoring rubrics or the scoring process. The end result was an inter rater reliability of 9 4%. There is an interesting side note regarding the scoring of the pretests. One of the pretests that the researcher scored received a very low score (in the bottom 25%). Since this PST was also one of the case subjects, extra measures were taken to establish reliable baseline knowledge; therefore, the second scorer was asked to double score the test. Although the test was scored well after the double scoring session had concluded (two months for the researcher and four months for the second score r), there was 100% initial agr eement on the scoring of the 10 items. The double scoring training of the posttest proceeded in similar fashion as the which was in t he middle of the distribution, and the fact that the responses appeared substantial enough to potentially elicit a range of scores. Because the researcher also served as the instructor for the course, there was a potential that my expectations as the instr uctor might influence how I scored the test items. To limit this bias, I made a conscience effort to focus on the scoring rubrics and the supplemental grading sheets during the scoring process and not take into account my experiences as the instructor. T he first training test was scored independently and the results were discussed. Initial agreement was only 50%, although disagreement never differed by more than one number. It was d iscovered that the second scorer was relying too heavily on the supplement al grading sheet, as opposed to focusing on the rubric and grading the responses holistically. After correcting that, two more tests were purposely selected based

PAGE 181

165 on pretests scores (one high and one low). The weaker tests had scores ranging from a 1 to an almost 4, and the better test had scores ranging from 2 up to 4. Initial agreement for each test was 80%, with no scores differing by more than one. Strong agreement on these varying responses provided evidence for the reliability of the scoring process. The posttests of the four case subjects were purposely selected for the formal round of double scoring. There were two reasons for this. First, the pretest results corroborated that the case subjects, as anticipated, comprised two weak and two strong stud ents relative to the rest of the class, therefore providing, theoretically, a wide range of responses to score. Secondly, since a significant portion of analysis would be based upon the posttest scores of the case subjects, an extra level of reliability of their scores was warranted. It was decided that all four tests would be independently scored and that the same item for each test would be scored consecutively and that the order of the tests would be changed after each item, to avoid a specific test se tting an unintentional standard against which the other tests might be measured. For the first three posttests scored there was an 80% initial agreement rate and a 90% agreement on the fourth. The inter rater reliability for the four posttests scored was 9 4%. The high level of agreement, and the consistency in scoring differences, gives the researcher confidence that the and perimeter. E xper t/Novice C oding : Development, Training, and Usage content knowledge and their knowledge of student thinking. They were used to identify evidence of expert and novice language. The coding sheets are based on exta nt literature

PAGE 182

166 Table 3 Coding Sheets to Help Categorize Novice versus Expert Preservice Teacher Behavior, within the Context of this Study Novice Expert Source Knowledge (1a ) Sparse, lacking, vague (1b) Substantial amounts; richly Dufresne, Leonard, & Grace, (nd) Structures and/or disconnected (fragile a ) interconnected and hierarchical (17a) b Contradict own response (refer to 1b) (emerged during the study) (written and/or verbal) (2a) Exhibit little knowledge of (2b) Possess es an awareness of common Livingston & Borko (1990) misconceptions or concepts student errors and misconceptions most difficult for students (14a) b Tendenc y to over generalize (14b ) b Realizes limitations to generalizing (emerged during the study) (15a) b Incorrect mathematical (15b) b Correct, precise & conceptually strong (emerged during the study) computations and/or procedures mathematical procedures & work Problem (3a) Typically consider only (3b) Often able to find more than Dufresne, Leonard, & Grace, (nd) Solving one way of solving a problem one way to solve a problem (4a) Tend to skip the analysis (4b) Carefully analyze a problem LaFrance (1989); Chi, Glaser, & stage when problem solving before and/or while solving it Farr (1988) (5a) Are slower and prone (5b) Perform faster than novices at domain Chi, Glaser, & Farr (1988) to making errors specific skills usually with less errors (6a) Respond to superficial (6b) Initially try categorizing a problem and LaFrance (1989); Niemi (199 7); & features of a problem apply appropriate mathematical principles Chi, Glaser, & Farr (1988) Note a Specifically refers to a changing/vacillating response. b Identified category that emerged during the study.

PAGE 183

167 Table 3 (Cont.) Coding Sheets to Help Categorize Novice versus Expert Preservice Teacher Behavior, within the Context of this Study Novice Expert Sou rce Representations (7a) Poorly formed and/or (7b) Able to generate contextual and Dufresne, Leonard, & Grace, (nd); unrelated representations even multiple representations Livingston & Borko, 1990 (7a ) b Neglect to use representations (emerged during the study) Justification (8a) Are often unable to explain (8b) Can explain why their answers Dufresne, Leonard, & Grace, (nd) why their answers are correct are correct Instructional (9a) Primarily procedural in (9b) Presents clear & complete Ball & Wilson, (1990); Leinhardt Strategies content and application conceptua l explanations & Smith (1985); Fuller, (1996) (10a) Tend to focus on the content (10b) Primary focus is the student Livingston & Borko, 1990 (11a) Primary concern is performan ce (11b) Focuses on developing conceptual Livingston & Borko, 1990 and getting right answers u nderstanding (12a) Fail to incorporate learning (12b) When appropriate, incorporates Eisenhart et al ., 1993 tools, such as manipulatives, learning tools, such as manipulatives where appropriate (13a) Fail to incorporate technology, (13b) When appropriate, incorporates Mitchell & Williams, (1993); when appropriate, to promote a technology to promote understanding Marzano, (1998) focus on understanding of content and processes (16a) b Present incorrect incomplete, or (refer to 9b) (emerged during the study) inadequate explanations Note b Identified category that emerged during the study.

PAGE 184

168 that addresses behaviors of pre and inservice teachers that had been categorized as either novice or ex post and follow up tests and the Teaching Episodes), and verbal interaction (e.g., interview transcripts or comments made during the Teaching Episodes). The coding sheets are by no mean s all inclusive. For example, several expert novice categories presented in the literature dealt with classroom teachers interacting with their students (e.g., Experts are more apt to correct student performance while novices tend to correct student behavi or (Mitchell & Williams, 1993), and would not be compatible with this study. The categories that were chosen were considered to be most appropriate for the context, instruments, and PSTs (i.e., preservice teachers) of this study. There is no significance a ssociated with the numbering of the codes. The coding sheets provide structure while analyzing various forms of data (e.g., pre post and follow knowledge levels as well as to determine any growth that might have occurred as a result of the various interventions (e.g., Teaching Episodes and semi structured interviews). The numbering sequence (e.g., 1 a and 1 b ) was used during the coding process and reference to these codes will occur while reporting findings. Certain codes aligned very well with aspects of both CK and KoST, and helped to quantify and qualify the amount and type of respective knowledge present at different times throughout the study. For example, codes involving knowledge structure (e.g., 1 a /1 b ) and explanatory framework (e.g., 8 a /8 b 15 a /15 b and 16 a /9 b a /2 b ) and their ability to address shortcomings and misconceptions (e.g., 7 a /7 a /7 b 12 a /12 b and 13 a /13 b

PAGE 185

169 of KoST. response patterns that emerged into new codes. New codes identified within the sub verbal), (b) tendency to over generalize, and (c) incorrect mathematical computations category To balance out the holistic nature of th e scoring rubrics and provide a broader in a more analytic nature When scoring the pre post responses were not often found. Something as minor as leaving off the appropriate unit was grounds for assigning a score of 3 (acceptable) as opposed to a 4 (model); thus, some very good response s were assigned a 3. The Expert/Novice coding was completed on a more part by part basis. Each test question contained multiple parts, and thus the opportunity to assign multiple codes to the same question existed. For example, within one question a PST mi ght perform one calculation correctly (thus earning a code of 15 b ) but another incorrectly (thus a 15 a ). In that same question, an explanation for one part might be completely procedural (thus earning a 9 a ) while a conceptual explanation might be provided in another part of the same question (thus a code of 9 b would be assigned). In addition, a single question might contain two incorrect computations or two separate procedurally based explanations. In such instances, the same code was applied multiple times (e.g., two 15 a a

PAGE 186

170 To establish reliability for the coding process, the researcher and second scorer (the same one who double scored the pre and posttest) completed an extensive training program similar to what was done for the rubric scori ng training. Two pretests were purposely selected to provide a range of responses to code. It should be noted that the pretest interviews. That provided a broad ran ge of responses as well as added reliability The training sessions helped the researcher to refine the coding instruments. The following changes were made. For example, 4 new codes were added to the coding sheets: (a) 14 a a novice tendency to over generalize solution strategies, (b) 14 b the expert understands and recognizes the limitations to generalizing, and (c) 15 a while the literature discussed procedurally based, vague, disconnected, an d conceptually weak mentioning that the novice often displays an incorrect understanding of mathematical content (although it does seem obvious), and 15 b the expert di splays a thorough conceptual understanding of mathematical content. Other codes were revised to support an item by item coding, rather than a generalized comment related to teaching tendencies. For example, code 12 a rate learning observations were made during t he first training session: (a) c ertain codes (especially 4 a 4 b 5 a 5 b 10 a & 10 b ) might not be applicable to both the written tests and interview transcripts, and (b) t here w h ere instances where a response contained both novice and

PAGE 187

171 essed several features of an expert knowledge structure; however, that same response also contained an obvious conceptual error. Since that PST was a case subject, the interview transcript was consulted and it was concluded that the PST actually did posses expert knowledge regarding the question; hence, that response did not receive a novice code of 1 a Interview transcripts were only available for case subjects; therefore, their responses allowed for member checking and hence greater reliability. During t he second training session, conversation between scorers established that another code needed to be added to the coding sheets; a novice code of 16 a was added to apply to incorrect instruction and/or explanation. It was decided that code 9 b could function a There also appeared strong relationship s between certain codes. For example, a code of 2 b, 3 b or 9 b was almost always accompanied by a code of 1 b The following problem (see Appendix D, problem 1) provides a helpful e xample of the type of responses that would elicit different codes. The PSTs were provided a 10 10 grid including the statement beneath it that each grid polygon that would you help a 5 th grader understand that the polygon you drew really does have a 6 square on the grid and provided the following response for a isconnected knowledge structure, a ( especially because the polygon was drawn on a grid ), a

PAGE 188

172 explanation that would not aid understanding. Contrast that with the following response given for the same question by a nother PST who drew a 5 7 rectangle on the grid, and followed u rectangle. Make sure they count the outside edge of the boxes, using linear units instead r b b b clearly delineating a conceptual explanation. The third training session invol ved coding 5 KoST questions. Nothing occurred that required any revisions to the coding sheets. discussion of the pretest and its interview transcript. It was noted while examining an interview transcript and comparing it to the pretest that one PST would quite readily change his/her mind and vacillate between responses after just a basic interview prompt, what you a which b which refers to a sound CK, function ed as the contrasting code to 17 a Of the total 103 codes applied to the three pretests, there was initial agreement on 79 (77%). We had very strong agreement (98%) on identifying whether a specific response was novice or expert in nature. The vast majority of disagreements were related to which specific novice or expert code should be awarded (e.g., I would code something 10 a and the second scorer would code the same response as 11 a ), as opposed to one of awarding a novice code and the other awarding an expert code to the same response. Ou t

PAGE 189

173 of a total of 103 codes applied during the training sessions, that sort of disagreement occurred only twice. Those were resolved after agreeing that any code applied must be done in light of the whole response to avoid attributing undue significance to a ny one part reached. In summary, following discussion consensus was reached on 101 out of 103 codings representing 98% agreement for the training sessions. Following the training sessions, two pretests were formally double coded and pattern matching was performed through examining their respective interview transcripts. For the two double coded pretests, there was initial agreement on 47 out of 64 codes (73%). Clarifying h ow certain codes (e.g., 9 a 10 a & 11 a ) were applied improved agreement to 96%. All but one of the disagreements were of the novice type (i.e., either a different novice code or an extra novice code was applied). The one novice/expert disagreement was reso lved when the second scorer consulted the interview transcript The agreement on the pattern matching was 97% (i.e., every code, except one, that was applied to the pr etest was confirmed by the transcript), and agreement on new codes applied while reviewing the transcripts were 11 out of 17 (67%). One explanation for the slightly lower agreement was that the researcher consistently applied a code of 4 a to a transcript e very time ( n = All other pattern matching disagreements involved different selections of novice codes. The hi gh levels of agreement provided the researcher confidence that the coding

PAGE 190

174 content knowledge and knowledge of student thinking related to area and perimeter. Validation of Anchored Instruction Intervention To ensure that the Anchored Instruction framework was used with fidelity, experts who were familiar with this approach were asked to provide an expert review of ur doctoral candidates, from the field of instructional technology, agreed to examine and evaluate four aspects of this instruction, (b) the degree to which the anchor of choice (situated within the Teaching Episodes) captured the essence and addressed the designers of Anchored Instruction, (c) the degree to which the design principles of Anchored Instruction were addressed by the m aterials of this study, and (d) the degree to which PSTs in this study experienced Anchored Instruction. Each expert reviewer received an email explaining the review process. There were several files attached to the email: (a) an overview of the study, ( b) a summary of the based summary of the qualities of Anchored Instruction, (d) information on, including hyperlinks to, the two microworlds integrated into the instructional sequence, (e ) all three teaching episodes, and (f) the Anchored Instruction Assessment Survey (Appendix O). The survey instrument contained four sections consisting of an explanation for each component of the conceptual framework that was to be reviewed followed by a Likert scale checklist. Each reviewer took about a month to work throug h the materials and return his/her completed survey instrument. The results are summarized in Table 4.

PAGE 191

175 Table 4 Results from Assessment Survey of Anchored Instruction (n=4) Construct being reviewed Strongly Agree Agree Disagree Strongly Disagree I. Definition of Anchored Instruction 3 1 II. Selection for the anchor 3 1 III. 8 Design Principles: 1. Choosing an appropriate anchor 3 1 2. Possess a generative learning environment 4 3. Developing shared expertise around the anchor 3 1 4. Expanding of the anchor 2 2 5. Using knowledge as a tool 1 3 6. Merging of the anchor 3 1 7. Allowing student exploration 4 8. Provide opportunity for PSTs to share new knowledge 3 1 IV. PSTs should experience anchored instruction 2 2

PAGE 192

176 Cross Case Analysis Answering each of the five research questions involved, to different degrees, cross case analysis. For the non case subjects, their responses to the problems on the area and perimeter tests, as well as items within the teaching episode packets served as a means to conduct cross case analysis and comparison Yin (1984) advocates a process that has been referred to as replication (Miles & Huberman, 1994). The analysis process typically involves studying in depth cases and then examining successive cases (less in depth) to see whether the patterns found match those in the case subjects. This cross case comparison helped present a wider view of the data and facilitate a more comprehensive examination of mathe matical and pedagogical change, when it occurred. Including data from all the PSTs within the constant comparison analysis helped to support the findings from the case subjects. Intervention CK and KoST In order to answer research questions one and two, it was necessary to establish intervention content knowledge (CK) and knowled ge of student thinking (KoST); T heir written responses to the pre study questionnaire, the 10 item area and perimeter pretest, and the case ine interviews were analyzed. The expert/novice coding sheets were applied to the pre study questionnaire, pretest, and the baseline interviews. How the assigned codes were used in analysis and in the reporting of findings is described later in this sectio n. The bulk of pre intervention findings were n responses to the 10 pretest items. Analysis of the pretest items was done from three perspectives.

PAGE 193

177 Analysis of Pretest Written R esponses he pretest items received a score from 0 to 4 based on the researcher created holistic scoring rubrics (see Appendix H) developed from overall score ranging from 0 4 0. As described in the instrumentation section, the criteria of the scoring rubrics incorporate distinguishing characteristics of both novice and expert mathematics teachers obtained from the literature (e.g., novice teachers focus on the content at hand w hile expert teachers continually consider the various needs of the students) so that each score actually represents a location on a theoretical continuum from novice to expert. For example, procedural versus conceptual responses were addressed, and procedu ral laden responses ended up with a score of two or lower. Although each item contained two, three, or four parts (see Appendix D), both the closed and open ended parts received one overall score. The pretest contained 10 total items five addressing con tent knowledge (CK) and five dealing with knowledge of student thinking (KoST). Each test generated an overall score, which range d from 0 40. The mean and standard deviation for the overall score were calculated and discussed. A test scoring in the range o f 0 scores ranging from 21 receiving a score of 40 would imply every response to be model. Piloting revealed that tests rece Pilot scoring of 65 tests resulted in a mean score of 17.9 with a low score of 8 and a high of 25. The total pretest score served as a baseline indicator that was later used in g rowth

PAGE 194

178 curve a nalysis mathematical knowledge. Total scores from the pretest also functioned as the first time point recording in the growth curve analysis. The total test score was also a factor in the purposeful selection of four PSTs for in depth study. Among the four who were selected, two scored at or near the bottom on the pretest and two scored at or near the top. The quality of their responses, as opposed to some predetermined scor e, was of primary importance in case subject selection. This criterion is discussed in greater detail in the sections addressing research questions three and four, where a more detailed explanation analyzed, displayed, and discussed. The second, more focused, perspective that was used to gain insight into the scores on the five CK questions and five KoST questions. They were analyzed and discussed as sub tests within each test The scores on these sub tests can range from 0 20. Descriptive statistics (mean and standard deviation) for each PST score were analyzed and reported. Examining descriptive statistics for sub tests scores within the 65 piloted tests revealed no consistent or statistically significant trends. Frequencies of rubric scores were presented and any score patterns for the CK and KoST ite ms were discussed. Frequencies from expert/novice codings of the pretest responses were presented Transcripts of the first interviews were used as a form of pattern ma tching with the analysis of pretest responses. Based on the actual definitions of both CK and KoST,

PAGE 195

179 responses receiving certain codings were more informative than others. For example, CK involves: (a) an organization of facts and concepts; thus, analysis s urrounding responses receiving codes 1 a /1 b and 15 a /15 b would be helpful, and (b) an explanatory framework; therefore, responses receiving 8 a /8 b and 16 a /9 b would be useful. For KoST, it involves: receiving codes of 2a and 2b were valuable, and (b) appropriately addressing any shortcoming or misconceptions; hence, responses receiving 7 a /7 a /7 b 12 a /12 b and/or 13 a /13 b were considered carefully. 10 pretest items, and the first interview with the case subjects comprises the third perspective used to d with the anchored instruction. As the data analysis of the questionnaire and pretest proceeded, three broad categories of responses were identified. They are: (a) distinguishing between area and perimeter, (b) units of measure, and (c) perceived relationships between area and perimeter. These broad categories were used to help organize themes within the responses containing findings needed to answer research questions one through four. The cr oss written responses to the pretest items and comparing them to the coding sheets of difference patterns between novice and expert preservice and classroom teachers (see Table 3, p. 166 ). Another component of the analysis of the pretest responses was the four levels of understanding that teachers can exhibit as they explore a new idea presented to them by a student. They are, in order: (a) D isproving the claim, (b ) I dentifying the possibilities, (c ) C larifying the

PAGE 196

180 conditions, and (d ) Explaining the conditions. A category of understanding (Justifying an invalid claim) was not designated as a level by Ma, s ince it was not deemed successful. Analysis of the F ir st I nterview The transcripts from the first interview with the case subjects were used to pattern match the codes assigned to the pretest responses and as a source to aid in triangulating data. All in terviews were videotaped and the audio w as transcribed. Two of the four baseline interviews w ere double coded, using the e xpert / novice coding sheets (see Table 3 ), by the same secondary sco rer mentioned earlier. This function ed as a sort of pattern matchin g (Gall et al., 1996) to help ensure reliable coding of patter ns and identification of expert /novice themes. Before interview transcripts were finalized, the videotapes were watched in entirety to allow for additional comments to be inserted that added necessary videotape s were available during the coding process which provided an additional quality check to help validate analysis. Transcripts from the first (bas eline) interview were analyzed in similar fashion as the pretest responses. Meaningful interview passages were compared to the coding sheet s of difference patterns between novice and expert teachers (Table 3), and to prior responses on the pretest, looking for previously identified themes or emerging ones. Each case subject interview contributed to ongoing collection of data regarding the ir CK and KoST, thus providing another means to triangulate the data, hence adding credibility and strengthening confiden ce in subsequent conclusions (Patton, 2002). While analyzing and

PAGE 197

181 transcripts provided a means to substantiate, or even refute claims and/or identified patterns. If a test re sponse was unclear and difficult to score or code, b eing able to address that response during a follow up interview proved valuable and lent credence to the final score or code awarded. A good example of this process occurred while evaluating the substance Initial evaluation concluded that the response contained questionable content drew in Figure 17) and a limited knowledge of student thinking. This question, along up, semi structured interview. After the interview was completed and transcribed, ongoing analysis revealed that a lack of appropriate scrutiny during the problem solving stage was the major reason for the deficient response and not a genuine lack of understanding as was first thought. Below is a portion of the transcribed interview: ( I = instructor; S = student) I : first rectangle mean. S: The way that she is thinking is about the outside. For instance, the 18 units I : ; could you point to and count off the 4 feet? S: Oh yes, each dot represents one of the (pause), although if you use the space connecting the dots and to her each dot represented a unit and the same for the 5 also. I : So how about the rectangle you drew? If you put that up on a board to show students, how would you explain the dimensions of what you drew? Is that rectangle 7 2? S: No, it would actually be 6 1. likely a result of inadequate analysis as opposed to limited content knowledge.

PAGE 198

182 Figure 17. Piloted item used in follow up interview for pattern matching. A similar situation occurred during analysis of the pretest responses for the full study. One of the case subjects (Grace) provided an incomplete and shallow response to two items near the responses were topics of discussion for her first interview. It was then that she shared how she ran out of time while answering those two items. Given the opportunity, she was able to comp lete her responses, without any help or prompting, and provide a more accurate picture of her true understanding regarding the concepts and misconceptions contained in the items. These processes provided a descriptive notion of the level of expertise rega rding content knowledge and knowledge of student thinking possessed by the preservice teachers prior to intervention. Claims regarding the four case subjects selected were

PAGE 199

183 analyzed and evaluated further in subsequent interviews as well as with cross case a nalysis of the non case subjects, as described earlier. Intervention CK and KoST To be able to answer research questions three and four, it was necessary to and knowledge of student thinking (KoST) changed, if at all, throughout the course of the study. Emergent Knowledge: The Teaching E pisodes The teaching episodes (TEs) comprise the primary means of intervention for this study; therefore, the findings from the TEs embody All three teaching episodes were videotaped. The videotapes were watched before any coding was performed and w ere used as a reference to inform and support ongoing and retrospective analysis. Repeated viewing a nd analysis of the whole class discussions proved helpful in providing context and supportive data for the non case subjects. Because research questions three and four specifically addressed CK or KoST, each writing prompt from the three teaching episodes was identified as focusing on CK, KoST, or the use of microworld(s) within the TE (an application of KoST). The subsequent post and follow up tests. The expert/novice coding sheets were applied to each response and, when necessary, pattern matching was performed for the 4 case subjects through analyzing interview transcripts. Interventions by the researcher d uring the teaching episodes also provided opportunities to pattern match dat a identified during reflective analysis

PAGE 200

184 The numerous data samples collected and the analysis conducted were valuable in and knowledge of student thinking changed throughout the study, hence answering research questions three and four from a qualitative perspective. Post Intervention Knowledge The data analysis for the post and follow up tests was conducted in similar fa shion as for questions one and two. Regarding the pre post and follow up tests, the following were calculated, analyzed, and discussed: (a) descriptive statistics of the total and sub test scores, (b) rubric score frequencies, (c) expert/novice coding totals and (d) individual expert/novice code frequencies. However, additional analysis was also conduct ed. Expert/novice coding totals for CK and KoST as well as regression equations and graphs for total score and CK and KoST sub test scores were presen ted and discussed. In ord knowledge, the three TEs (involving the anchored instruction intervention) were the focal point of the qualitative cross case en responses to the post and follow up tests, and the case final interview involving the four case subjects followed the last of three teaching episodes, and was analyzed in the same manner as the pretest (baseli ne) interview. Regression Analysis of Tests Scores The second way that potential mathematical change was investigated involved regression analysis of mean scor es from the pre post and follow Willett, 1988, p. 345 ); However, quant itative measurements of change have proven controversial, with some seeing its

PAGE 201

185 value (Rogosa, Brandt, & Zimowski, 1982; Wi llett, 1988; Zimmerman & Williams, 1982), and others who are suspect (Gall et al., 1996; Linn & Slinde, 1977; Lord, 1956). The approach taken in this study involved an adaptation of the difference score ( i.e., gain score). The PST s on the pr e post and follow up tests were used as the de pende nt variable and the corresponding points in time (i.e., pre post and follow up) functioned as the independent variable to construct individual growth curves. a regression line was fit to those points. Significance of any growth, or lack thereof, was area and perimeter tests and the written reflections, observations, and field notes during the teaching episodes, and the interviews of the case subjects). Change related to the specific components of CK and KoST (as described in their de finitions) were analyzed and reported in much the same fashion as was done in answering research questions one and two. The presentation of the provided a visual confirmat ion of any change. Although the teaching episodes provided a interviews were the primary data sources for documenting more immediate growth (or lack thereof). The follow u p test was more a measure of retention as well as a means of confirming and/or illustrating the growth (or lack thereof) delineated by the triangulation of the previously mentioned data sources. This simplified approach assisted in presenting a second pers pective on the mathematical growth of the PSTs and contributed to answering research questions three and four.

PAGE 202

186 Relationships B etween CK and KoST To answer research question 5, it was necessary to examine potential relationships that might exist between CK and KoST (e.g., Does KoST increase as CK increases?) as related to area and perimeter in general, and more specifically units of measure and perceived relationships. Two approaches were used to answer this question. The first involved an analysis of qua ntitative data. The three correlation coefficients for CK and KoST at the three time points (i.e., pre post and follow up) were calculated and discussed. CK and KoST sub test scores for the pre post and follow up tests (e.g., Table 14 p. 2 5 6 ) and summary tables of expert/novice codings (e. g., Table 16 p. 261 ) were analyzed and patterns were n oted and examined (e.g., 9 of the 12 PSTs showed increases in their CK or KoST, but only 6 showed increases in both), and appropriate regression graphs (creat ed to help answer research questions 3 and 4) were presented. One goal was to identify and describe CK KoST relationships that surfaced primarily due to the intervention (i.e., from pre to posttest), and since the follow up test is more a measure of reten tion, its results were not weighted as heav il y. During analysis it was test score (CK or KoST) to their posttest sub test score (CK or KoST) was a necessary criterion to assist i n identifying and deciphering CK KoST relationships (e.g., increased CK and KoST, and static CK with increased KoST). That number (3) represents a 15% change and helped to rule out trivial and inconsistent patterns or weak relationships. It should be kept in mind that the goal of answering research question 5 was not to look for or attempt to establish statistical significance within or among CK and KoST data, but rather to discover and then describe CK KoST relationships that could be collaborated through

PAGE 203

187 different sources (e.g., responses to tests and TEs, and interview transcripts). The second aspect to answering research question 5 involved two comprehensive analysis strands, devised around the area and perimeter concepts/misconceptions central to thi s study (see Table 5), whi ch helped to focus and guide further analysis necessary to illuminate and describe patterns identified during quantitative analysis. The two analysis strands are (a) units of measure (i.e., linear and square units), and (b) the pe rceived relationships between area and perimeter (i.e., that equal perimeters must result in equal areas and vice versa, and the belief that a direct relationship exists between area and perimeter in that increasing / decreasing one will have the effect of increasing / decreasing the other). These analysis strands formed the basis for the topics of inquiry across various time points (i.e., across teaching episodes and from pretest to posttest, and to a lesser degree the follow up test). Answering research question 5 followed similar paths as used to answer research questions 1 4: (a) Case subjects were the primary focus of the comparative analysis, because their responses received appropriate pattern matching through two semi structured interviews, and (b) Any discussion of CK KoST relationships focus ed on the pre and posttest findings, since the follow up test has implications more for retention. The comparative analysis was supported with appropriate findings from the non case subjects. An example of how the descriptive statistics and the analysis strands function ed together will be presented next. What follows is a theoretical example of the analysis processes just describe d. If a PST responses concerning issues of CK regarding area and perimeter were consistently scored and determined to be weak and of novice standing ( based on rubric scoring and

PAGE 204

188 Table 5 Corresponding Test Items for Comparative Analysis for Answering Research Question Five Source: Pretest/Follow up* TE 1 TE 2 TE 3 Posttest CK 1, 3, 4 2, 3, 4, 6 M 1, 3, 4 KoST 6, 7, 9 5, 7 M 8 M 10, 11 M 7, 9 Source: Pretest/Follow up* TE 1 TE 2 TE 3 Posttest CK 5 2, 3, 4, 5, 7 M 9 2, 3, 4, 6 M 5 KoST 8, 10 6, 8 M 10, 12, 13 M 5, 7, 8, 10 M 6, 8, 10 Note CK = content knowledge; KoST = knowledge of student thinking ; and TE = teaching e pisode. M The question encouraged the use of a microworld Follow up test contains same problems as pretest.

PAGE 205

189 Table 3, p. 166), and such a finding received substantiation by a second data source (e.g., the teaching episode) or better yet a third (e.g., an interview), then logical progression should questions addressing similar concepts For example, if a PST continually confused area and perimeter concepts (e.g., linear versus square units) while addressing questions related to CK of area and perimeter, and that same PST also exhibited a limited, or even inaccurate, knowledge of how to best deal with a hypothetical student struggling with similar concepts, then a respond to a student and their thinking. Also, i t should be mentioned that each KoST question is designed to focus on a common misconception regarding area and perimeter concepts (i.e., CK). In other words, it was hypothesized that if a PST was unable to perceive the misconception presented in the probl em (i.e., fallible CK), they would typically present inferior methods of dealing with students exhibiting the same misconception (i.e., inferior KoST). It was conjectured that a substantial CK of area and perimeter was necessary for preservice teachers to be able to meaningfully and conceptually address student misconceptions regarding those concepts (i.e., a well developed KoST is dependent upon robust CK). performance on KoST q uestions to their performance on CK questions addressing similar concepts (or misconceptions), as opposed to another shortcoming possibly not directly related to CK (e.g., carelessness or running out of time). Equally important was the investigation and de scription of various relationships between KoST and CK that were identified. For example: (a) an increase in CK from pre to posttest accompanied by a

PAGE 206

190 static or decreasing KoST, or (b) a static CK from pre to posttest while the KoST increased. Such relati onships, when discovered, were analyzed, patterns compared and categorized, and narrative written in an attempt to explain and clarify any counterintuitive results (e.g., an increased KoST with decreased CK). When multiple data sources substantiate d a rela tionship between CK and KoST, rich description w as used in an attempt to illuminate such relationships. Limitations of this Study As in all research possessing a qualitative element, the quality of a teacher development experiment will be directly depen dent upon the knowledge, skills, and interactive abilities of the researcher (or researchers). As such, the researcher functioned teaching. The overall goal of that teach ing was to promote mathematical development within the PSTs, which puts added importance upon the competencies of the researcher (beyond the usual involving observation, questioning, and data management). According to Simon (2000), preparing to conduct TDE research combines two difficult processes: learning to conduct research while simultaneously learning to teach in ways appropriate for the TDE. These challenges, along with certain inherent aspects of this study, contributed to the following limitations: post and follow up) 75), certain five question sub tests (e.g., pretest KoST, posttest CK, and follow up KoST) were less than satisfactory. They were accounted for and discussed previously in Chapter 3.

PAGE 207

191 2000). s role as instructor of the teaching episodes could bias the validity of certain qualitative elements of this study, but such bias was minimized by the presence and feedback of a second observer.

PAGE 208

192 CHAPTER 4 FINDINGS The purpose of this study was to examine levels of knowledge in the context of rea and perimeter. In particular, it focused on their understandings, misconceptions, written and verbal explanations of that knowledge, and achievement on written area and perimeter tests within the context of a mathematics methods course for PSTs. The primary research question examined by this study was, es content knowledge and pedagogical content knowledge, related to area and perimeter, change as a result of experiencing anchored instruction integrated with web based microworld In particular: 1 involvement in the teaching episodes? 2. W hat g regarding area and perimeter prior to involvement in the teaching episodes? if at all, during the course of the study?

PAGE 209

193 4. How doe perimeter change, if at all, during the course of the study? area and perime ter related to their content knowledge of those same concepts? This chapter consists of results that are presented in three distinct sections. The first major section answers research questions 1 and 2 by discussing the results pertaining to pre intervent ion content knowledge (CK) and knowledge of student thinking (KoST). Descriptive statistics and qualitative analysis of the pre study questionnaire, pretest, the pre intervention results. The second major section presents findings taken from the post and follow up tests, the three teaching episodes, and the second interview with the case intervent ion CK and KoST, research questions 3 and 4 are addressed. Chapter 4 concludes by discussing results pertaining to possible relationships between CK and KoST as deciphered within predetermined content strands taken from the pre post and follow up tests and the three teaching episodes. Selection of Case Subjects Using the selection process described in Chapter 3, the following four case subjects were identified. Case Subject Jackie Jackie is a very diligent student who earns good grades (see Table 6). The not come naturally to Jackie, and she would be the first to admit that. Jackie is an

PAGE 210

194 inquisit ive person and not ashamed to admit it when she is confused about a concept, nor was she afraid to ask a question in class or after class. In the Survey Questionnaire (Appendix C) Jackie indicated she had studied area and perimeter in high school as well a elementary any mathmatic [ sic have had great tudors [ sic The researcher also taught Jackie in a technology course designed for preservice teachers. She proved quite capable with concepts and applications surrounding technology integration. When asked in the questionnaire about her opinion on using it. Elementary Jackie is well liked and has many friends within her elementary education cohort and throughout the campus. She also holds leadership positions within the student body as well as her college Greek o rganization. She is socially confident and very eager to participate in class discussions. Jackie has an open mind to both content and pedagogical issues related to education and the study of teaching. She was not only willing to be a case subject but expr essed excitement at the opportunity. At times during interviews and class discussions Jackie could get verbose, and this would tend to dilute her responses. Case Subject Brianna Brianna is a very conscientious student who performs very well academically (see Table 6). Instead of taking Liberal Arts Mathematics for her third mathematics course,

PAGE 211

195 Table 6 Case Subject Data Note All data was current through their junior year. she took Pre Calculus, a course not typically taken by elementary education majors. The for both College Algebra and Pre Calculus. When asked why she signed up for Pre Calculus, she said that she has always enjoyed math. Brianna was quiet during class, did not ask many questions, and was uncomfortable when called on to respond. Brianna is a very careful thinker, who would often take 10 20 seconds to ponder a question before giving a response. Jackie Brianna Larry Grace Academic background GPA 3.07 3.33 2.21 4.0 College algebra B A C A Liberal arts math B C A Prob. & Stats. C A D (C 2 nd time) A Pre calculus B Exposure and confidence related to area & perimeter HS geometry X X X Other HS math courses X X X College math courses X X X Involved manipulatives X X Involved technology Confidence level to teach concepts

PAGE 212

196 age children. When Brianna wrote that she had never been exposed to any instructional technologies while le arning about area and perimeter. W hen asked her opinion on using technology to assist be beneficial to use technology when teaching about area and perimeter, to help students d take the place Case Subject Larry Larry was the only male student in the Methods of Teaching Elementary Mathematics course. He is an exceptional athlete and a very successful soccer player for the college. Academically, Larry struggles (see Table 6). He often appeared overwhelmed with his course work; he would forget about assignments, and the depth of his work was Arts Math. Larry is a fun loving guy enjoyable to talk to, and well liked. He does not enjoy mathematics and must work very hard to earn a passing grade. Larry would not seek assistance and rarely asked questions in class. Tests and in class projects would overwhelm him, and he frequently d id not perform well on them. teaching area and perimeter to elementary s not aware of any technology that could aid in the teaching and learning of area and perimeter, but seemed open to its

PAGE 213

197 possibilities. When asked his opinion on using technology to help elementary students in learning about area and perimeter, he responded students in learning. It can do many things that cannot be done in the classroom. It makes Case Subject Grace student. After raising a family and working as an administrative assistant at the college where this study took place, Grace decided it was time for a career change, and at the age of 52 she enrolled in the school of education. Grace was an amazing student She maintained a 4.0 GPA her entire college career (see Table 6). I was her instructor for College Algebra and Liberal Arts Math. Grace is quiet, humble, and unassuming but was not afraid to ask a question in class and was thoughtful when responding to q uestions during class. In the Survey Questionnaire Grace indicated she had studied area and perimeter in her high school Geometry class, and did not recall any other exposure to those concepts nsive, confident, or very er concrete manipulatives or educational technologies (i.e., software or the Internet) while learning about area and perimeter. When asked her opinion on using technology to help elementary students in learning about area and perimeter, s hink it would be helpful keep attention and provide different types of visuals. Web based technologies provide a vast array of tools for assisting teaching; much more varied than a teacher could supply

PAGE 214

19 8 otherwise. And students are comfortable with and ade Intervention CK and KoST The findings in this section address the following research questions: What is the ( KoST) regarding area and perimeter prior to involvement in the teaching episodes? The pre intervention data came from the pre study survey questionnaire (Appendix C), pretest (Appendix D), the first interviews with the case subjects, and microworlds orient ation questionnaire and the pretest, and from transcripts from the first interview. Descriptive statistics were performed on the resulting scores as well on the expert/novice cod ing followed by qualitative findings meant to support and illuminate the descriptive results. Pretest Levels of CK and KoST As described in Chapter 3, findings inv olving CK and KoST, will address key components of their definitions. For CK that involves: (a) the amount and organization of facts and concepts, and (b) the ability to explain that knowledge in meaningful ways, and KoST entails: (a) organizing CK in a wa y that would enable a teacher to understand misconceptions. This will be the case for the first four research questions. Descriptive Statistics for Rubric Scorings of Pretest Items R e sults and a standard deviation of 4.97 (see Table 7). The data appeared relatively normal ly

PAGE 215

199 distributed with skewness and kurtosis values of .08 and .65 respectively. Jackie scored a 13 on the pretest, which was the lowest overall score. Her score of 4 on the CK subtest was the lowest in this category and was almost two standard deviations below the mean of 10.58. Brianna received the highest score of 30, and Grace was second at 27. Th e results indicated that the two easiest questions on the pretest, each with a mean of 2.75, were three (SD = .75) and ten (SD = .45), and the hardest question was four (M = 1.58; SD = .9). Jackie scored a 1 on question 3, a 1 on four, and a 3 on ten (see Table 10). Table 7 Descriptive Statistics for Pretest Pretest CK (items 1 5) KoST (items 6 10) Mean SD Mean SD Mean SD 21 25 4 97 10 58 3 48 10 67 2 10 PST* Pretest Score CK Items (1 5) KoST Items (6 10) #1 25 12 13 Grace (#2) 27 16 (high) 11 ** #3 23 13 10 #4 18 8 10 #5 16 8 8 #6 24 14 10 Jackie (#7) 13 (low) 4 (low) 9 Brianna (#8) 30 (high) 14 16 (high) #9 18 8 10 #10 23 12 11 #11 21 10 11 Larry (#12) 17 8 9 Note Pretest scores range from 0 to 40. A score of 40 indicates a model response for all 10 items. *PST = preservice teacher (i.e., study participant) **PST ran out of time and did not finish.


Jackie's score frequencies on the 10 items indicate that the majority of her knowledge fell at the lower levels of the scoring rubrics (Appendix H). She scored four 1s and a 0 on the five content knowledge questions, items that probed an understanding of basic area and perimeter ideas (i.e., draw a polygon that has a perimeter of 24, find the perimeter of an irregular polygon, and explain, as a teacher would, the concepts of linear and square units). Jackie performed better on the five knowledge of student thinking questions (items six through ten), earning two 2s and a 3. She appeared able to relate to the students in the problems and correctly predict their struggles because, admittedly, she shares many of the same difficulties. Interview excerpts revealed that while Jackie may be aware of certain aspects of the misconceptions students possess, her ability to effectively intervene, and her overall pre-intervention KoST, is fragile at best. Contrast that with Brianna, who received only one score of 2 for her entire pretest; the rest of her scores were 3s, along with a single 4 (see Table 8). Of the 120 scores assigned on the pretest items, only seven scores of 4 were awarded, and only one of those came on the KoST subtest (Brianna). Grace was the only PST who received more than one score of 4 (both came on the CK subtest).

Descriptive Statistics for Expert/Novice Codings for Pretest

Identifying examples of expert/novice behavior (Table 3, page 167) within the PSTs' pretest responses provided a further view of their pre-intervention levels of CK and KoST.


Table 8
Item Rubric Scores (CK: items 1-5; KoST: items 6-10) and Score Frequencies

            Item                                          Score frequency
PST*        1   2   3   4   5   6   7   8   9   10        0   1   2   3   4
#1          1   4   3   2   2   3   3   2   2   3         0   1   4   4   1
Grace       4   4   3   2   3   3   2   2   2   2         0   0   5   3   2
#3          2   4   3   2   2   3   1   2   2   2         0   1   6   2   1
#4          1   1   3   2   1   2   1   3   1   3         0   5   2   3   0
#5          1   3   2   0   2   2   1   1   1   3         1   4   3   2   0
#6          2   4   3   2   3   1   3   1   2   3         0   2   3   4   1
Jackie      1   1   1   1   0   2   1   1   2   3         1   6   2   1   0
Brianna     3   3   3   3   2   3   3   3   4   3         0   0   1   8   1
#9          2   1   3   0   2   3   1   2   1   3         1   3   3   3   0
#10         3   1   4   2   2   3   3   1   1   3         0   3   2   4   1
#11         1   3   3   2   1   2   1   2   3   3         0   3   3   4   0
Larry       2   2   2   1   1   3   1   1   2   2         0   4   5   1   0
Totals                                                    3   32  39  39  7

Note. Rubric scores range from 0 to 4. A score of 4 indicates a model response, 1 is unacceptable, and 0 indicates no response.
*PST = preservice teacher (i.e., study participant)


Jackie's pretest produced the highest frequency of novice teacher codings (50) and the second lowest number (4) of expert teacher traits. Brianna received the highest number of expert codings (21), followed by Grace with 20. Because the pretest was given prior to the microworld orientation, and because all the PSTs indicated in their survey questionnaire that they had no prior exposure to learning mathematics with technology, codes 13a and 13b were not assigned to any of the pretest responses.

Table 9
Expert/Novice Coding Frequencies for Pretest

           Total score           CK (items 1-5)        KoST (items 6-10)
           a          b          a          b          a          b
Mean       34.6       10.3       15.8       4.1        18.8       6.2
SD         8.7        6          5          3.3        5.4        3.4

PST        a Sum      b Sum      a Sum      b Sum      a Sum      b Sum
#1         34         9          16         3          18         6
Grace      27         20         11         11         16         9
#3         45         10         14         4          31         6
#4         36         7          20         1          16         6
#5         43         3          21         1          22         2
#6         26         16         10         8          16         8
Jackie     50         4          26         0          24         4
Brianna    23         21         13         6          10         15
#9         27         9          12         4          15         5
#10        27         13         10         7          17         6
#11        37         7          18         2          19         5
Larry      40         5          18         2          22         3

Note. An a signifies a novice response and b signifies an expert response (see Table 2). For total score: Min. a Sum = 23, Max. a Sum = 50, Min. b Sum = 3, and Max. b Sum = 21.


Table 10 presents the frequencies of the individual codes identified on the pretest, allowing comparison among the case subjects and against the class average. A high frequency of code 1a indicates an amount and organization of CK that is sparse, lacking, and/or disconnected (i.e., fragile). A high frequency of code 2a signals that a PST exhibits little knowledge of the misconceptions or concepts most difficult for students and points to insufficient levels of KoST. Jackie and Larry had higher-than-average frequencies of 2a. The ability to explain one's knowledge is an important facet of CK, and codes 8a/8b, 9a/9b, and 16a reflect that. The low frequency of code 8b for Jackie and Larry is an indicator that they struggle when trying to explain their responses. Grace and Brianna had higher frequencies of code 9a, which would signify a tendency to be procedural when explaining how to do and think about mathematics. Codes 7a and 7b, which involve the effective use of representations (or the neglect of a representation, as in 7a), are important because they indicate whether a PST understands and appreciates appropriate means to address the shortcomings and misconceptions of students.

Table 10
Novice/Expert Specific Code Frequencies from Pretest

Code        1a   1b   2a   2b   3a   3b   4a   6a   7a   7a   7b   8a   8b   9a   9b   10a  10b  11a  11b  12a  12b  16a  14a
Grace       6    3    3    3    0    3    0    1    3    0    2    1    4    5    1    0    1    2    1    1    2    4    0
Jackie      10   0    8    1    0    0    1    1    3    1    1    3    1    3    0    0    0    0    1    3    4    11   2
Brianna     6    1    2    7    1    1    1    0    3    0    2    0    6    6    2    0    1    2    0    1    0    2    0
Larry       8    0    5    3    0    0    1    0    3    1    1    0    1    4    0    0    0    0    0    3    5    9    0
Class avg   7.8  1.2  4.6  3.5  .7   .6   .3   .5   2.8  1.3  1.2  1.1  2.1  3.5  .7   .2   1.6  .8   .2   1    2    7.1  .3
SD          3.1  2.1  5.1  3.6  1.9  .7   .7   1.3  4.2  2.2  1.9  1.4  2.9  4.9  1.2  .4   2.2  1.2  .4   1.7  2.5  6.2  .9

Note. There were no codes of 4b, 5a, 5b, 6b, 10b, 13a, 13b, 14b, or 15b assigned for the pretest.


Code 16a is assigned to a response that presents incorrect, incomplete, or inadequate explanations; a frequency above the class mean (as was the case for Jackie and Larry) is a further sign of a weak explanatory framework. The presentation of pre-intervention CK and KoST will now transition into findings that expound on these descriptive statistics.

Qualitative Findings for Pre-Intervention CK and KoST

The findings presented in the next several sections answer research questions one and two and are organized under three major categories: (a) distinguishing between area and perimeter, (b) units of measure, and (c) perceived relationships between area and perimeter. Prior to the intervention, the PSTs possessed an incomplete CK regarding these concepts. Because of the important role CK plays in the organization of KoST, greater emphasis was placed on the analysis of the pre-intervention KoST.

Distinguishing Between Area and Perimeter

Although area and perimeter are used for different applications, they do have similarities, and it is those similarities which make the concepts susceptible to confusion. Although each measure involves a calculation with sides, area and perimeter also require attention to their appropriate unit (i.e., linear or square). These concepts are intrinsically linked, and a PST with a profound CK and KoST realizes the importance and value of incorporating linear and square units within discussions involving area and perimeter.
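For reference, the distinction the PSTs were expected to articulate can be summarized for a rectangle with side lengths L and W (a sketch of the standard relationships, not drawn from any participant's response):

\[
P = 2(L + W) \quad \text{measured in linear units (e.g., cm)}
\]
\[
A = L \times W \quad \text{measured in square units (e.g., cm}^2\text{)}
\]

Perimeter is a one-dimensional measure of the length of a figure's boundary, whereas area is a two-dimensional measure of the region the figure encloses.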


205 Procedural versus conceptual CK. According to the survey questionnaire with describing a basic procedure for finding their measure. They focused on explaining differentiate between area and perimeter by discussing dimensions. Most PSTs addressed the concepts of area and perimeter without any discussion about their appropriate units. Peri meter was defined as the lengt h around the outside of a shape found by adding up he were asked about their formula based approach to finding area. They were asked how they would find the to break the shape into triangles or triangles and a rectangle and said that formulas could then be used. I also asked Brianna if she thought it were possible that t here existed a Figure 18 Figure introduced during first interview with case subjects


One case subject wrote that area is the space inside a 2- or 3-dimensional shape. I asked her to elaborate on her response, and she drew a square and a circle as representations of 2-dimensional shapes. When asked to clarify what she meant by the area of a 3-dimensional shape, she related it to volume. Her correct mention of area being 2-dimensional was significant, as she was the only PST to do so. No other PST wrote about perimeter being a one-dimensional concept.

Most PSTs were bound, even handicapped, by a dependency on formulas for both solving and explaining problems involving area and perimeter. Such a dependency might help explain the responses that 8 out of 12 PSTs gave on the survey questionnaire regarding the area formula (A = L x W), a topic that was revisited during the first interview with the case subjects. In the problem in question, a student (Pete) correctly calculates the perimeter of a 3 cm x 6 cm rectangle (included in the problem) but is confused about what exactly the 18 represents. Given the PSTs' less than adequate responses, it seemed appropriate to further investigate their CK regarding the area formula, and the first interview with the case subjects provided an opportunity to do that.


A model response to a question about the area formula (A = L x W) should involve a discussion of arrays (i.e., rows and columns of square units). The case subjects' responses revealed different levels of knowledge and understanding regarding common procedures used to find area.

Jackie's explanation went no further than to multiply the base times the height. When asked about the grid of 18 boxes she drew on the 3 x 6 rectangle in question 6, Jackie replied that she "just thought maybe I would try this. It did come out to 18, so I guess that could be a way." Larry explained that the dimensions of the rectangle (3 x 6) could be used to insert the correct number of boxes (he called them centimeters squared) along the length and width, and that the L x W formula was a shortcut for adding up the 18 boxes that could fit inside. Larry did not visualize or grasp the row-by-column structure of the rectangle; instead he saw the square units as simply something to be counted. Contrast that explanation with Brianna, who described the rectangle in terms of its rows and columns of square units. While this language needs some refining, a realization of the array structure is a significant part of a foundation upon which conceptual knowledge and instructional strategies can be built. On the contrary, it was apparent Jackie did not comprehend the row-by-column structure of the 3 x 6 array; it is therefore not surprising that she does not understand why the multiplication formula enumerates the units in the array.
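As a concrete illustration of the array structure at issue (a reconstruction offered for the reader, not a response produced by any PST): a rectangle measuring 3 cm by 6 cm can be partitioned into 3 rows of 6 unit squares, so

\[
A = 3 \times 6 = 18\ \text{cm}^2,
\]

and the formula A = L x W is simply a shortcut for enumerating the 3-by-6 array of square centimeters row by row, rather than counting the 18 squares one at a time.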


Perceived student difficulties. A first glimpse of the PSTs' pre-intervention KoST was revealed in their responses to the survey question asking what they think elementary students may find difficult regarding the learning of area and perimeter. At this point the PSTs had had few opportunities to interact with elementary students. The 12 responses were varied, but most amounted to little more than a surface-level comment (e.g., Jackie's). In contrast, Brianna and Grace touched on difficulties that went beyond a surface-level comment. Brianna thought that when students were presented with a rectangle with only the length and width given and asked to find the perimeter, they might get confused. Her responses reflected both novice and expert understanding. After providing a thoughtful response regarding perimeter for the next-to-last question, which focused on understanding the boundary of a figure, she did not identify what might be difficult for students. However, during the interview, when discussing what students might find difficult, she suggested that students may forget that they are working with a 2-dimensional or 3-dimensional shape, and their answer might reflect a squared unit when it is supposed to be a cubed unit, or vice versa.

Grace was the only one who specifically discussed square units along with area at any point in the questionnaire. She did not provide any drawings to support her response, and she indicated that the use of formulas would be the primary source of difficulty for students, as opposed to understanding the concepts.


Distinguishing the Correct Unit of Measure

Several problems on the pretest addressed various aspects of units of measure. Because of the fundamental importance of units of measure, a greater amount of reporting will be devoted to them.

Confusing the measure with its unit. The first question on the pretest asked the PSTs to use a provided grid (Figure 19) to draw a figure with a perimeter of 24 units and to explain their work as they would to a fourth grader.

Figure 19. Grid included as part of question 1 on the pretest (each cell = 1 square unit).

Brianna and Grace had no problem with this question and justified their solutions by similarly explaining that adding the lengths of all four sides of their rectangles would produce a


210 perimeter of 24 units. Compared with Brianna and Grace, Jackie and Larry were not as confident or successful. They were not alone, as this problem proved difficult for the majority of PSTs. Eight out of 12 PSTs provided a response that addressed, to differ ent degrees, concepts related to area. A common shape drawn was a 6 6 square, which does have a perimeter of 24; however, the justification provided by several for the second part included shading the inside of their shape. It was difficult to discern wh ether they were claiming the inside or the boundary as the perimeter. Others, including Jackie and Larry, were confusing area and perimeter along with linear and square units (Figure 20). Jackie drew a 3 8 rectangle. There was a dot inside each box of her rectangle indicating she apparently touched each box as she counted them. The explanation revealed her of 24. Bu t I guess I would show them that each box is 1 unit and in the box there is 24 Figure 2 0 Samples of student responses to question 1 on the pretest.


211 perimet er, linear, and square units is disconnected. It is common for someone lacking a conceptual understanding of these concepts to wrongly believe square units are simply som ething to be counted rather than a subdivision of a plane (i.e., an area) (Battista et al., 1998). Jackie displayed this thinking when she responded to a question about how she because my first approach was to count the boxes and then draw a line around the boxes Larry drew a 6 6 square and place d one dot in each box along the perimeter of teacher/researcher and L = Larry): T : Can you tell me why you divided 24 by 4? L : [Takes 10 seconds to reread problem and then 8 more seconds to think] I was thinking 24 because there are 6 squares on one side, so 6 times 4 is 24 err, I m sorry, uh yes. And then I took 24 and divided it by 4 to show that there are 6 sides. I think I may have been confused on this one Maybe what I was thinking was i t might help the student to count out each individual square to see if there are 6 squares on one side, six squares on this side, six squares, and six squares and adding those four together and it comes to 24. T : So if you count up all the squares alon g the outside you are going to get 24? L : [2 sec pause] Yea. T : W ould you please show me? L: [Larry touches and count s the squares along the outside of the square he drew] 6, then I counted these 6 along the side; I guess you count this corner one onfused here. T : Does this question involve perimeter or area? L : [5 sec pause] Perimeter T : You said it might be helpful to count out each square individually. What exactly do you mean by that? L : cause if each side measures 6 . [pause, then just stops]
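The arithmetic behind Larry's confusion can be made explicit (an editorial reconstruction, not part of his response). The perimeter of his 6 x 6 square is

\[
P = 2(6 + 6) = 24\ \text{linear units},
\]

and counting the 6 unit squares along each of the four sides also gives 4 x 6 = 24; on a unit grid this numerical agreement holds for any rectangle, since each side of length n contains n boundary squares. The count nevertheless treats the four corner squares twice, and only 4 x 6 - 4 = 20 distinct unit squares actually touch the boundary, so the apparent success of square counting can mask the underlying confusion between square units and linear units.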


212 It is very possible that the grid or the hint provided below the grid served as a visual cue prompting Jackie and Larry to think about area and/or square units. It is also possible Larry mad e a common error in conceptualizing perimeter as 2 dimensional. Either way, level measurement reasoning, where consistent unit iteration is performed howbeit the wrong unit (Battista, 2006). ules orient ed approach to area and perimeter, inability to consistently focus on the correct unit of measure, and tendency to respond to superficial features of a problem indicate a fragile and novice understanding of these concepts. Knowledge regarding irregular shapes. Question three from the pretest (Figure 21) provided insights about how the PSTs dealt with area and perimeter as well as units of measure of an irregular shape. The two PSTs who produced the shapes shown in Figur e 20 had no apparent trouble solving this problem. The only mistake was one of Figure 2 1 Problem 3 from the pretest. 3. (a) What is the area and perimeter of Figure A? (All corners are right angles.) (b) Explain, as you would to a fourth grader, how you arrived at both your answers. 1 cm


213 the units for area. Only Jackie wrongly calculated the area while three out of the 12 PSTs (including Jackie and Larry) wrongly calculated perimeter. Several of the PSTs responded that they could figure out the area of the irregular figure in problem 3, b ut explanations revealed they lacked a strong conceptual understanding of 2 The are a of a square is s 2 So 1 2 = 1 square; count up the squares to = 8 cm 2 identified the area as 8 cm 2 and although he had partitioned the figure into 8 squares (a conceptual approach), his explanation was confusing and would not produce his ans wer: revealed Larry had an impoverished understanding of a square unit: T : I think I follow how you got the area. I just want to make sure Would you recount what you d id, or how you came up with your answers? L : think I used an equation on this. I just boxed it off. You put those little so it would be 8 centimeters squared. T : Y ou said 8 centimeters squared. Is there a reason why the area is centimeters square, and the perimeter is centimeter s ? Is that meaningful? L : I was trying to think if it was something meaningful, or if it was just something I was always taught to do. the way I was told to do it. The Microworld Orientation Session (Appendix M), which occurred almost one month after the pretest and one month prior to the first teaching episode, provided encountered the first of two problems presented during the session (Figure 22), he just stared back and forth between his computer monitor and the four writing prompts related


214 Figure 22 F to the problem. Larry never created any figures in Shape Builder nor did he explore any of its features. Larry would often focus on only one way to solve a problem, a behavior of a novice teacher, and also had great difficulty imagining and testing hypotheses even with the microworld (MW) tools. work with irregular shapes (first on pretest #3 and then on the MW orientation session) exposed a noticeable lack of CK regarding area and perimeter. On the area. Apparently she ignored the concavity of the shape and simply applied the area formula to the length and the width. I asked Jackie about this during our interview: T: For area, I see that you multiplied 4 times 3 to find the area. Tell me more about those numbers. Why did you do that and where did they come from? J: I s had no idea. But what I kind of did again is that I broke this into shapes and you had the dots which made it kind of easier. So I kind of just broke it up like this in order to show you, so we knew that this [the labeled segment] was 1 centimeter, so I just kind of assumed because they all look like they have the same amount length of side, so I just said 1, 1, 1, 1 [pointing across the top


215 of the shape] and I added it up t o 4 and then I did the same thing for the right side; I went down 1, 2, 3, this is another one, [segment drawn in] so this is the main number for this side. It was the same for the bottom, and the other side [the left side]. I mean, I just brought every T: J: Exactly, now that I see it again for the second time I realiz e that I just added more area, probably to the shape. T: J: No Jackie gave an answer of 14 for the perimeter, which is the perimeter of the 3 x 4 rectangle she built around Figure A, but not th e perimeter of Figure A. the Shape Builder microworld (see Figures 8 13), the PSTs were asked to comment on any particular features of the microworld that they saw as pote ntially helpful for the shapes, so the Shape Builder allows me to see what is going I observed Jackie interacting with the microworld and verbalize some of her frustrations. She struggl ed with the perimeter of irregular shapes. She said she was not sure if counting the outside segments would give the perimet er. After replicating Figure 22 in Shape Builder and experimenting with it, Jackie indicated that she found it interesting that if she dragged a square onto the working grid (a feature in Create Mode ) and placed it in the hole on the right side of the shap e (Figure 21) that the number of countable, outside segments went from 3 to 1; hence, she concluded that counting the outside segments was the correct way to find the perimeter. It appeared that this was the first time she had decided, on her own, that cou nting dots (i.e., the endpoints of a linear unit) was


216 not the correct way to find perimeter. The knowledge now seemed to be personalized. Unlike Larry and Jackie, Brianna and Grace not only presented entirely correct responses to question 3 on the pretes t, they also provided clear explanations of their methods. Both used a procedural approach involving dividing the figure (see Figure 21) up into rectangles and squares and adding the smaller areas; however, Grace went a step further and provided a second way to s olve the problem. Grace displayed an understanding of conservation of area by explaining (and drawing) how the top 2 square forming a 2 x 4 rectangle. Creative in pro blem solving Being able to solve a problem in more than one way is a tra it of an expert teacher. Grace displayed this trait when solving question 3 on the pretest and was the only PST to do so. Her problem solving lead to a planned follow up with the ot her three case subjects during the first interview to see if they could also solve question #3 in a different way than they did on the pretest. Larry was unsuccessful. During the interview, it became increasingly evident that he could not intelligently tal k about area, perimeter, and units of measure. Larry was unable to consistently identify what attribute was being measured (i.e., one or two dimensional). In contrast, Jackie was able to find the area of Figure A in problem 3 using another approach: T : Can you think of another way that could be used to find the area of this shape, since you are kind of stuck without a formula? J : area, but these are the units within the shape [pointing to one of the boxes within the shape]. T : OK, and how many do you get when you count those up? [ Jackie uses a pencil to partition Fig. A into 8 squares] J : 8. So that could be a way. T : You got two different answers, right?


217 J : T : Is that kind of bothering you? J : Oh yeah [recounting the actual squares], that would make perfect sense. T : Does that make more sense? J : Yes T : [Jackie interrupts] J: Units, or something. Each box represents one unit. This was the first time Jackie was able to think beyond her initial response and problem solve in real t ime. However, her initial overgeneralization (i.e., the use of formulas), along with her inability to coherently explain how she arrived at her answers, are represents one T : Can you think of any other way to find the area of that shape besides using the length times width formula? B : [35 sec. pau se] Well, if you broke it up into little squares by drawing dotted lines (student partitions shape into 8 squares) and added up all the squares. of mumbling and trailing off]. T : For area or perimeter? Are you doing area? B : Area. Oh ok. OH! So that would be right. 1, 2, 3, 4, 5, 6, 7, 8 (student points to and counts the 8 partitioned squares on the inside). T : So each one of those squares r epresents what? B : One square centimeter. T : So for this problem could students figure out the area and perimeter without formulas? B : Well if you are given that that the one segment shown as 1 centimeter, then I guess you could figure it out. way to refer to a square unit, as opposed to cm 2 which has procedural undertones (i.e., cm cm = cm 2 ).


218 Ability to explain and illustrate units of measure. regarding area and perimeter (units in this case) was observed whenever they were asked to explain concepts, as was the case with question 4 from the pretest. Statistically the most difficult question on the pretest, it provided insight into why some PSTs were having trouble consistently finding correct areas and perimeters as well as coherently explaining various aspects of these concepts (e.g., linear and square units). The problem you explain the concepts of linear units and square units to a 5 th grader? Stress the differences in the concepts. Include a practical units that cannot be measured. Square of a square unit that was presented as part of question 1, that did not seem to inform her response to question 4 you cannot measure air Square this is the question I had the most trouble out of any in this survey, because I really have no idea what interview other than referring back to the square unit given in question 1 and mentioning how she thought maybe those could be used to measure a flat surface like a wall. illustrated his idea by circling the entire side of a rectangle, which classifies as very low measurement reasoning (Battista, 2006). Larry admitted he was very unsure about lin ear


219 in problem 3, Larry illustrated square units by drawing boxes inside a rectangle. He ap peared to think of square units as something to be counted, but his understanding of square units as a subset of a plane is unsettled. Instead of explaining the distinguishing characteristics of linear and square units and providing classroom useful practi cal examples, Larry and Jackie (and most PSTs), simply explained how they are used (i.e., linear units are used with perimeter and square units with area). Brianna was able to more coherently and accurately distinguish between linear and square units, but when asked to illustrate her ideas the results were less than complete. Her diagram of a linear unit was a 2 cm square. It was difficult to ascertain if she was implying that the 4 cm segment is made up of linear units and that square units would be used to measure the area of the 2 2 square or if she really thought of her diagrams as discrete units. Her previous work would indicate the later, but her understanding of these concepts is clouded at best. Of the two PSTs who described linear units as one dimensional and square units as two dimensional, only Grace provided enough information to establish her explanation as classroom useful. Her explanation on the pretest focused on telling how the units are used rather understanding of these concepts, but provided evidence that she might not be able to explain them to elementary students i n a meaningful way: T : Could you draw or show me what one linear unit might look like? This question is talking about linear unit s and square units. Would it be possible for you to illustrate those concepts ? G : Yes, [Grace draws a square to the left of her writing]. When you are measuring a side of a square or a rectangle, you are measuring a linear measurement [she darkens the top of the square and draws 4 evenly spaced tick marks] So, say these [she points to one of the segments] ar e the units,


220 you would s straight line in one dimension. T : Ok, h ow about a square unit? What might that look like? G : A square unit would be one that has 2 dimensions. It has a leng th and a width [Grace draws in tiny square in the upper right hand corner of the same square she used to discuss linear units] and that would be what you would find for the area, so this would be the area this unit right here would be squar ed, because it has 2 dimensions; t he area of that [the tiny square] right there. U tilizing drawings. explain concepts in meaningful ways (i.e., their explanatory framework), facilitated by effectiv e communication. Incorporating suitable drawings is one important aspect of were given the opportunity to hypothesize about future teaching. PSTs were asked to respon reference to drawing a picture or bringing in objects for display, only four provided drawings to represent their ideas. The i neffective use or lack of drawings to assist in problem solving or to clarify explanations is evidence of CK that lacks a well developed explanatory framework, which turned out to be an all too common theme found within ase subject to provide drawings to support his response; however, the drawings were sloppy and the response was incomplete, providing further evidence of insufficient CK. During the interview he was not able to elaborate upon his limited response regarding perimeter. When asked about his partially complete drawing of a rectangle with the square units drawn across the top row, his response revealed some recognition of the value of using grid paper when teaching about d count down and then you could multiply that


221 cut for finding area. Although they suggested drawing pic tures and bringing in objects to help students in understanding area and perimeter, their purpose for doing so was to assist in the explanation of how to use the formulas. Also, one of the objects that Jackie recommended was a cereal box, which could be us eful for surface area or volume but quite confusing for discussing area. Alternatively, Grace seemed more concerned with a conceptual approach and thought it would be meaningful for students to see shapes drawn on a grid she recommended using a grid and highlighting the inside. However, Grace made no reference to discussing units for either perimeter or area. Her concerns with helping students understand area and perime ter were evident during our first interview. When be traced and highlighted by going around the outside of a 2 lack of realization o f the profound importance of discussing units when teaching about throughout the study. up) were included with the expectation that PSTs would include appropriate drawings to clarify and support their explanation as


222 well as to assist in effectively addressing student difficulties and misconc eptions. Table 11 was written with the expectation that drawings should be used to effectively communicate a thorough response. Out of 48 potential opportunities (12 PSTs 4 problems), only five drawings were provided that accompanied a meaningful and correct response. Question #4, which appeared on the pre post and follow up test was statistically the most difficult (Mean of 1.58, 2.33, and 2.33 respectively; range 0 to 4). linear unit and a square unit to a 5 th and explaining linear and square units was very difficult for them; however only one of the 12 PSTs even attempted to draw a figure as a means to help visualize and/or explain these difficult concepts. Even when the PSTs were struggling to express themselves meaningfully, they would not provide a drawing to visualize the concept s or aid in the effective communication of their ideas. These traits reveal a novice level of problem solving. As the table 1 1 indicates, other times PSTs would suggest or refer to making a Table 11 Pre Intervention Use of Drawings Pretest Items PST #1 Grace #3 #4 #5 #6 Jackie Brianna #9 #10 #11 Larry #4 (U) #5 (R) X X x #6 (U) X * #8 (R) X X X x X x x x Note U = dealt with units, R = dealt with perceived relationships; = suggested a drawing but did not draw it; X = used appropriate drawing; x = used a drawing inappropriate for teaching/learning; X


223 drawing, but would not actually draw one. Higher performing PSTs, (e.g., Grace, Brianna, and #6) would often provide a thorough written response, complete with limited CK left them ill prepared to construct a meaningful drawing, as was the case with question 4, while other times the PSTs were car e less and drew rectangles that were not to scale and thus were not helpful in facilitating a correct response. Even though the PSTs would often write of how helpful visuals were for both themselves and students, supportive diagrams and meaningful repres entations were often absent from their explanations. nderstanding of, and how they indicated they would respond to, stude nt difficulties and misconceptions, specifically regarding units of measure. manifestations of the organization of their CK and how well it enables them to T he importance of units in explanations Mathematical procedures, w hile effective at producing answers, typically do not inherently convey conceptual understanding of a construct. The area formula for a rectangle is a prime example of this. it was indicated that 18 was the correct calculation for the area of the rectangle, and only


a few addressed the meaning of the 18. The explanations that were offered included: square units, units2, small squares, square footage, 1 cm x 1 cm boxes, little squares, and centimeters. PST #4 attempted to explain that cm2 needed to be included because he is using 2 representations.

6. Pete, a 5th grader, calculates the area of the rectangle below. He arrives at an answer of 18.
   (a) Is Pete correct?
   (b) Explain why or why not.
   (c) After performing the calculation, Pete comes up to you looking puzzled and asks what the 18 represents.
   [A rectangle labeled 3 cm by 6 cm accompanies the problem.]

Figure 23. Question 6 from the pretest.

There were two anticipated avenues by which to approach Part (c). One possibility was to realize that the 3 x 6 rectangle has both an area AND a perimeter of 18 and that Pete may have actually performed a perimeter algorithm. This realization should have evoked a response asking Pete to explain how he arrived at his answer, as well as delineating the differences between area and perimeter and their meaning even when they are represented by the same number.
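The special nature of this rectangle is worth making explicit (a worked note added for the reader; it is not part of the pretest item):

\[
A = 3\ \text{cm} \times 6\ \text{cm} = 18\ \text{cm}^2, \qquad P = 2(3\ \text{cm} + 6\ \text{cm}) = 18\ \text{cm},
\]

so the number 18 is correct for both measures, and only the unit reveals which attribute has been computed; a student reporting "18" with no unit could have carried out either calculation.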


This same problem appeared on the posttest and follow-up test, and no PST ever mentioned that the rectangle also had a perimeter of 18. On the pretest, PST #6 discussed the fact that the 3 x 6 rectangle had a perimeter of 18, but that was only because she misread the problem and thought the student was supposed to be calculating perimeter.

The second anticipated avenue was to recognize that applying the formula L x W does not directly help students conceptually understand what the answer represents; hence, a discussion about square units would be in order. Most of the PSTs stated they would show and/or explain what the 18 represented, but none, other than the case subjects, recommended gridding off the rectangle to expose the square units (i.e., square centimeters). Jackie wrote that she would tell Pete that the 18 represents how many cm are in the rectangle. She drew a 3 cm x 6 cm grid inside the rectangle but failed to mention the significance of the grid or how it could be helpful to student understanding. It is possible that Jackie simply confused cm with square cm. Larry, Brianna, and Grace realized the importance of a grid, but only Brianna, and to a greater degree Grace, placed an emphasis on understanding the structure of the rectangle and how 18 individual cm2 fit into the rectangle, completely covering it. While some of the case subjects either drew or discussed


226 the square units, yet again, none of the case subjects both drew AND adequately discussed an appropriate representation evidence of incomplete CK and an inadequate drawings was an all too common occurrence. Focused on solving, or diagnosing & responding. T he majority of PSTs in this study tended to focus on solving the problem (i.e., finding an answer), to the neglect of diagnosing student thinking. This was very evident in questions 7 and 9 from the pretest. ering these questions. in calculating perimeter, as well as their ability to diagnose a common student misconception regarding linear measure (i.e., point counting). Point counting is the process of counting points around the perimeter of a shape in order to determine the rrect and complete? thinking, as evident in her work, and (c) As a teacher, how would you respond to Kayla? What precisely would you say and do? Larry interpreted the problem as though Kayla d said his response to part


227 Figur e 24. Question 7 on the pretest. Larry never took any time to try to analyze why Kayla did not draw her dog pens correctly. His overall CK and KoST up to this point can be characterized by his unstable CK. Based on their pretest responses, Jackie and Grace interpreted the problem as involving area instead of perimeter. They both i ndicated the 4 x 5 rectangle would use more fence (20) than was allowed (18). During our interview, Jackie struggled with stead of area. Jackie also eventually figured that Kayla was counting dots, instead of linear units, to determine perimeter but doing the perimeter and she did it. She g is sparse and fragile (she often contradicts herself) and that appears to hinder her ability Kayla, a fifth grade student, was asked to draw all the four sided dog pen designs that she could make using 18 units of fence for each design. Below are the drawings, on dot paper, that she came up with.


228 to effectively diagnose student thinking and identify misconceptions (e.g., point counting). These are both traits of a novice teacher. While Grace began the problem with than perimeter. In contrast to Jackie and Larry, Grace would not become flustered after realizing her thinking was incor rect. Expert teachers are able to carefully analyze a problem before and/or while solving it. Grace displayed this often. She would pause, reread the problem, gather her thoughts, explain where she had gone wrong and why, and then continue on with her work or explanation. Grace responded to the first interview probe by reasoning refle t to Brianna was the only case subject who correctly interpreted the problem as and that Kayla was confusing dots with units. During our interview Brianna explained or


229 anna would often made good use of her strong mathematics background. At times she would quietly think for 30 or more seconds before making, what was usually, an insightful comment. I then asked her if Kayla had drawn all the possible pens that would use 18 units of fence. She had not offered any information about this on her pretest, but she thought for a second and said, without any written calculations or drawings, Kayla could have done an 8 by 1 and a 7 by 2. Of the other 8 PSTs, only three were able to decipher that question 7 referred to perimeter and not area. In regards to KoST, Brianna was one of only two (and the only case subject) to appropriately respond repr esent units, but the distance from one dot to Another finding regarding question 7 involved The phrase brought to light a c thinking regarding classroom mathematics and the real world. Several PSTs indicated was by design) in conjunc tion with fence was confusing; however, many of these same PSTs used the idea of enclosing something with fence to illustrate the concept of perimeter when they responded to other pretest questions. Thus, it can be assumed that many are unsure which attrib ute to measure, and which unit to use, when calculating area or perimeter. The last problem from the pretest that explores the PSTs pre intervention CK and KoST regarding units of measure is question 9 (Figure 25). Similar to question 7, this problem p


230 Fig. 1 Fig. 2 for the perimeter of Fig. 1, and if necessary, state what is the correct answer? (b) Explain why or why not. (c) As a teacher, how would you What specifically would you say and do? Figure 25. Question 9 from the pretest. responding to erroneous student thinking. Question 9 provided a useful variety of data as it was also the focus probl em for teaching episode 1. In it, PSTs are asked to verify an untraditional approach for finding the perimeter of irregular shape. There were two aspects to correctly addressing problem 9. First, PSTs had to decipher the legitimacy of secondly, prescribe an approach to address his thinking. Ten out of method was wrong was h e did not count the corner boxes twice. There were six PSTs, including Larry, who responded this way. These PSTs focused exclusively on the method is to shade the squares along the outside of the shape, as shown in Figure 2, and then t o count those squares.


231 discussing the mathematics unde rgirding his approach (i.e., using square units to determine perimeter). As is common with novice teachers, they tend to respond to faulty student thinking by simply reiterating what they know about the topic at hand, rather than and subsequent interview follow asked Larry how he might respond to Justin, his method, and his thinking. Larry said: I mean i f that helps him, I think shading and counting the boxes, might help him, coming up with t he right answer. I guess if you explain perimeter and how each side you know this is a s ide of 8 [counting along the bottom of Fig.2] and a side of 4 [counting the left side of Fig. 2], and add that up accordingly, and go through it and count everythin g out. Just show them both ways and how they both work. And help him work through it a little easier, so he knows he needs to count the corners twice for each side and he understand s why. Larry was able to correctly determine the perimeter of Figure 1; however, his recommendations would confuse classroom students. Perturbations can lead to a stronger understa nding and more flexible content knowledge, but only if the cause of the dissonance is actively investigated and the misconceptions identified and addressed. An important finding resulting from question 9 was that none of the six PSTs who focused solely on the rightness


232 content knowledge was limited in scope and his knowledge of student thinking was narrow i n focus. content knowledge. Initially, it appeared that Jackie seems to grasp the error in the thinking much past that. Apparently the insights gained during previous interview dialogues had not been incorporated into her evolving knowledge, or they were never actually learned at the time. During an interview Jackie seems to incorporate various elements of different problems, but without any systematic approach: T: answer be for t he perimeter for Figure 1? J: T: Tell me what you are counting, what you are thinking. J: Well, I was going back to what we were doing before with the problem back here [student refers back to problem #7]. The thing is this shape goes back where I was coming from though. I just remember doing that. So there is 5 on this side right here [referring to the left side of Fig. 1]. T: Five what? J: [Jackie draws 5 dots up left side of Fig. 1]. There are 6 [student draws in dots along the bottom of Fig. 1] down here, 6 up there [student draws in 6 dots along the first part of the top] and 5 for this side [student draws in a line down through Fig. 1 and labels i t 5]. And then you could do this one too (student points to what she labels as a 4 4 square within Fig. 1], but kinda where I get confused too, like figuring out, do I just do the perimeter of this one [student r this one [ together to get the full object? Or, do I do a different way of doing it? L ike do I, you know how before I had kind of added extra units, but that would be for


233 The idea of counting dots, instead of linear units, to find perimeter was contained in question 7. When we addressed that problem, Jackie indicated that such a method was wrong. Two problems later however Jackie implemented the exact same method (Figure 2 the perimeter of Fig. 1 was 22 + 16, although she was not sure it was right. Jackie actually contradicted herself two different times while explaining her thoughts on this disconnected which negatively affects her explanatory framework and her a bility to appropriately address the shortcomings of students (her KoST). Larry and Jackie, as well as others, struggled with conceptualizing perimeter and what it measures. This reflected poorly on their CK. Figure 26. nd the perimeter of Fig. 1 (part of problem 9)


234 aded are 2 dimensional; he should be even though you get the right answer this time; it may not work in all 21. She was not able to reconcile the discrepancy. In the end, Grace decided that even if Grace had a good amount of CK regarding units of measure (e.g., knew about dimensions), but struggled using it consistently to diagnose student thinking and therefore could not adequately address certain student misconceptions regarding theses concepts. Brianna earned a score of 4 (a model response) for her clear explanation of units. I would explain the difference between linear and square units, and that the shaded also explained how she would step through the problem with Justin and count the lengths o f each side to get the perimeter and then compare that to the number you would get confusion, it is always more meaningful when students are actively involved in th eir


235 education. A thorough KoST would have also included such an approach with Justin. especially those who indicated that that is how they were taught, as Brianna did in h er intervention CK regarding units of measure was sufficient to get correct answers, but it was very procedural in nature and application. Her CK was organized enough to allow her to diagnose many student difficulties; however, the focus of her explanatory framework was more about getting correct answers than it was about developing conceptual understanding, which is not the goal of a more expert KoST. At mathematical foundation translated into very teacher centered approaches. She was not alone in this tendency. Unfortunately, it was found that PSTs who indicated they would allow students the opportunity to personally work through the various mathematical concepts was uncommon, and encouraging students to investigate further with manipulatives or technology was almost nonexistent. Perceived R elationships Between Area and Perimeter The perimeter and area of a figure are two different measures. The perimeter is a measure of the length of the boundary of a figure, where as the area is a measure of how much space a figure occupies. In the case of a rectangle, the calculations of both measures are related to the sides of the figure. These similaritie s provide the setting for two classic misconceptions involving the area and perimeter of a rectangle: (1) That increasing the perimeter of a rectangle will always increase its area (i.e., the direct relationship misconception), and (2) Rectangles that have the same perimeter measurement will also have the same area, and vice versa (i.e., the fixed relationship


236 misconception). The first misconception appeared in question 8 of the pretest and took the form of a classroom scenario. The second misconception was contained in question 10 and involved a problem solving situation. Knowledge of the direct relationship misconception. Question 8 on the pretest presents the PSTs with a special case involving area and perimeter. The scenario is as hypothetical 5 th grade student] claims that whenever you compare two rectangles, the one with the greater perimeter will always PSTs are then asked: (a) Is she correct? (b) Explain why you agree or disagree with ving a very similar problem, aided in the analyses of s of their level s of understanding related to relationships involving area and perimeter. When this problem has been used by other researchers, it typica lly includes two rectangles (a 4 4 and a 4 8) complete with area and perimeter calculations provided by the There are two major aspects to this scenario involving the direct r elationship misconception: (a) t claim (related to CK), and (b) t response to the student (related to KoST). Because these findings are pre interventio n, The KoST findings regarding the direct relationship misconception were sparse and therefore will be interspersed within the CK findings. Four out of the 12 PSTs, includ ing


237 be built on the incorrect assumption that increasing the perimeter of a rectangle must the longer the provided no mathematical examples or pictures to support their response evidence of inferior CK and KoST. Their lack of understanding regarding the mat hematics equipped to engage the student in any meaningful discussion regarding that claim. The other two PSTs attempted to justify the invalid claim by providing sample rectangles, including diagrams and calcul ations, illustrating that an increased perimeter did in fact result in an increased area. They failed to notice that the perimeter of a rectangle can increase as two of the sides of the rectangle decrease in length. The 4 PSTs, who said the claim was true, thought an appropriate response to Jasmine should involve praise and an example or two illustrating her claim: I would take simple measured boxes (1 2 cm and 2 4 cm). I would calculate the perimeters of both (6 cm and 12 cm), then calculate the areas: 1 2 = 2 cm 2 and 2 4 = 8 cm 2 then show the relationship that the larger perimeter is also the larger area. interview I asked Larry if he could give me an example that would illustrate or support of a 2 4 rectangle is tripled, what is the relationship between the original and the


238 obvious that the rectangle is a lot bigger [than the 2 igating any other possibilities, Larry displayed a limited knowledge of this area/perimeter relationship (i.e., Level 0) according to Ma of the direct relationship mis conception or surrounding concepts, he neglected to use representations, and his primary concern was getting, what he thought to be, right answers. These are all characteristics of a novice teacher and an underdeveloped KoST. mathematical property or rule, wit h the thought that might make it easier for her to T : just so I know that you understand what she came up with? J : [Student rereads problem] The question says, Jasmine claims that whenever you compare two rectangles, the one with the greater perimeter will always this, so, my thinking, initially, kind of going back to the rectangle problem when you triple it and you get the greater area [Question5]. I said, yes, because the bigger the object is the more area it takes up. That was kind of my reasoning. And I said, sometimes the side of something is a large number, but the width is small. So, sometimes the ones that appear smaller have the bigger T J: No, not always, but say this is like 15 and then 2 [student draws a 2 15 rectangle, call it #3] or something like that. And then this one was 4 times 4 [student draws a 12 though the opposite can happen. A child will look at this [rectangle #3] a nd drawing really correct illustrations here.
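For reference, a minimal counterexample to Jasmine's claim (an editorial illustration, not taken from any particular PST's response):

\[
4 \times 4:\ \ P = 2(4+4) = 16,\ \ A = 16 \qquad\qquad 1 \times 9:\ \ P = 2(1+9) = 20,\ \ A = 9
\]

The 1 x 9 rectangle has the greater perimeter but the smaller area, so increasing the perimeter does not guarantee a greater area; this is precisely the case in which one dimension shrinks while the other grows, the possibility the PSTs who agreed with the claim failed to consider.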


239 king on this problem, she correctly and perimeter. It is common for students to think, when comparing rectangles, that the one with the longest side will usually have the greater area. We continued: T: So, do you think that is what Jasmine was claiming? That the rectangle with the longest side will have the greater area or is that just part of her claim? J: better. T: S: But I do agree with her. I think that when you increase the perimeter you do have greater area, because T: And that would always be the case? J: know where that [what the student originally wrote for Part (b)] came from. I then explored he r pedagogical content knowledge regarding her response to part (c). T: In Part (c) you mentioned that you would try to bring in actual rectangular objects. I like that idea. How would you go about determining the perimeter of objects you brought in? J : probably more along the lines I was thinking of, but see ing some of those manipulatives too those would be really helpful for figuring out if you had like the smaller recta ngle with the rubber bands, the geoboard I think it is, the if you put little, for the units, the square units in it, you could show that the bigger, the more perimeter, the big ger the sides the more area there is in it. So you want to be really old fashioned, you can use a ruler. Jackie actually gave the previous response without pausing. This is an example that, up to this point in the study, characterizes Jackie as a learner her tendency to ramble in her responses to the point where she digresses away from the original question. The conclusion of our interview related to question 8, reveals Jac previous and emerging thoughts organized while engaged in a learning situation:


240 T: So if you had the geoboard what would you be counting to find the perimeter? J: I was thinking about the . [students draws a 4 of her pretest while talking]. Look, like this, they have the little dots, the pegs, and they are kind of even and I was thinking back to those units again [student connects the dots to make rectangles inside her geoboard. See this i s your object. You can do this again, and then do this with a smaller one [another rectangle]. T: So, if you are going to try to calculate the perimeter of one of those shapes, what would you be counting to try to figure out their perimeters? J: The dots? Back to the dots. [student laughs out loud] was often the case with Jackie, even an initial correct mathematical response would be found to be built on a fra gile conceptual understanding of the concepts at hand. Jackie knowledge level established by Ma (1999) for measuring understanding related to this misconception. Jackie h ad difficulties explaining her ideas, which resulted in poorly structured interventions with potential students regarding their struggles in the pretest questions. Her CK w as insufficient and unorganized, which appeared to impede her KoST and hamper her ab ility to diagnose and appropriately respond to student thinking. I nvestigating a CK informing KoST The responses of four PSTs (including Jackie and Larry) regarding the direct relationship misconception, They stopped after explaining why the claim could work and did not investigate the cases in which it would not work. Providing a counterexample was the most straightforward was incorrect. Of those, five PSTs (see Table 12) said Ja their explanation and/or counterexamples did not directly disprove t he claim; f or


example, Grace said the claim was incorrect, but the explanation justifying her position did not make mathematical sense. Her response was uncharacteristically shallow, teacher-centered, and focused on getting the right answer. It was noticeable that Grace had done a lot of erasing while answering this question, and it also seemed uncharacteristic that she did not provide any diagrams to support her response.

Table 12
Investigating an Erroneous Student Claim (Pre-Intervention)

Number of PSTs (n = 12)          Agreed with      Provided appropriate     Investigated     Level attained
                                 the student      counterexample           the claim        (Ma, 1999)
4 (including Larry & Jackie)     Yes              No                       No               Level 0
1                                No               No                       No               Between Level 0 and 1
4 (including Grace)              No               No                       Yes              Closer to Level 1 than Level 0
3 (including Brianna)            No               Yes                      Yes*             Level 1

Note. *Such a counterexample would disprove the existence of a direct relationship between area and perimeter but would not directly address the claim, which revolves around increasing the perimeter.

The reasons behind these occurrences and her poor application of her KoST became


242 evident during the interview: Actually, I [Grace] said she was correct at first. Because I was thinking that if you were looking at a fence ea around the fence, then if you have a longer fence, you are going t o have more area. B ut then I got to the end of the test and saw the last question [#10] and all the perimeters were the same, but the areas were not the same. So I thought my thinking is wrong here, so I went back to this one and then ran out of time. But what I said is that she was incorrect because the greater the difference in the length of the dimensions, the smaller the area. Even if the perimeters are the same. Grace ran out of time, but her abbreviated response revealed she had begun to explore ference in length of dimensions, the smaller the area rec process of discovering that a square is the rectangle with the largest area, an idea she would develop more fully later in the study. That is a relatively high level o f based on the perimeter remaining the same, and when Grace was made aware of this she tempt Level 1 understanding of the relationships between area and perimeter (Ma, 1999). Evidently she was slightly embarrassed by her inability to sort through the elements of this problem. Her CK regarding perceived relationships was incomplete. Since her initial thoughts were wrong


243 about this question, she ran short of time on the pretest; however, as will be shared in later sections, Grace had only begun to fully investigate the possible conditi ons involving this problem. Grace was not generally satisfied with leaving mathematical conflicts this problem for the last couple days, but did not really have time to play with it, but it investigated the claim in an appropriate manner. Their explanations were similar to ectangle has a smaller perimeter than another involving irregular shapes es with equal perimeters having different areas the claim involved increasing the perimeter. The remaining three, including Brianna, circumstances; however, because they did not provide suitable examples or explanations, second level of understanding (1999). The three higher levels of understanding (Ma, 1999) went unexplored by PSTs. There are three possibilities to identify when the perimeter of a rectangle is increased: (a) the area can increase, (b) it can decrease, or (c) it may stay the same. The majority of the PSTs only discussed the first possibility. Three provided correct examples of the second


244 circumstances. None of the PSTs mentioned or discussed the third possibility. Besides identifying or displaying one of the three previously mentioned possibilities, none of the PSTs revealed the two higher levels of understanding: (a) clarifying the conditions under which these possibilities held (Level 3), an d (b) explaining why some conditions this study simply stopped exploring the problem after arriving at one of the three possibilities, assuming they had adequately answer ed the question. Although 8 of 12 PSTs provided diagrams to support their explanations, only two of them were suitable for claim was incomplete, she was one of three, and the only case subject, to fully reach a in that she was able to understand that misconception by Brianna, and revealed a less than thoroug h KoST regarding this misconception. No PST suggested engaging the student in exploring the truth of her providing specific examples. Knowledge regarding the fixed r el ationship misconception The last question of the pretest addressed the second and final prominent misconception related to perceived relationships between area and perimeter the notion that rectangles with the same perimeter measurement will also have t he same area (and vice versa). The question also


The question also assessed whether the PSTs would include a square as a type of rectangle. In the problem, a person was planning a garden; he wanted the garden to have a rectangular shape, and he also wanted to have the most space possible. Five rectangular gardens are pictured (an 8 × 22, a 10 × 20, a 15 × 15, a 5 × 25, and a 2 × 28). PSTs were asked whether one specific garden is the biggest, or if they are all the same size, and to explain their selection. Question 10 (see Appendix D) was the overall easiest question on the pretest; it had a mean of 2.75 (range 0 to 4) and a standard deviation of less than one, as noted in Chapter 3 and in the limitations section. The potential misconceptions for question 10 were: (a) assuming that because all the gardens had the same perimeter, they would have the same area, (b) predicting the greatest area based solely on appearance, and (c) not recognizing and/or acknowledging that squares are also, by definition, rectangles. Every PST calculated the area for each garden and chose Garden 3 (the 15 × 15 square) as the garden with the greatest area based on those calculations; however, because no PST justified their response by stating that squares are rectangles, no maximum score of 4 was awarded. Some responses also suggested uncertainty about whether the square could be counted among the rectangles. Grace and Brianna were more confident of their responses.
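The arithmetic behind the intended answer can be made explicit. All five pictured gardens share a perimeter of 60 units, yet their areas differ and the square encloses the most space:

```latex
% Perimeters and areas of the five gardens given in question 10.
\[
\begin{array}{lll}
2 \times 28:  & P = 2(2+28)  = 60, & A = 56\\
5 \times 25:  & P = 2(5+25)  = 60, & A = 125\\
8 \times 22:  & P = 2(8+22)  = 60, & A = 176\\
10 \times 20: & P = 2(10+20) = 60, & A = 200\\
15 \times 15: & P = 2(15+15) = 60, & A = 225
\end{array}
\]
```

Equal perimeters therefore do not force equal areas, which is exactly the fixed relationship misconception the item was written to expose.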


Grace mentioned that she was running out of time on this problem and was not able to carefully consider all of the options. She was not sure about her answer, but she was not willing to give up on her hunch. I asked the four case subjects for further justification for selecting Garden 3 as the rectangle with the greatest area, and also asked whether selecting Garden 3 would be a problem because it was a square. Part (c) of the question asked, Which garden do you think would most often be selected by 4th or 5th graders? Please explain your choice. This question helped reveal the PSTs' KoST. Four PSTs (no case subjects among them) identified the most common error that elementary students would make: selecting the choice that all the gardens are the same size. That choice would characterize a student who thought that a specific perimeter can have only one area, the primary misconception addressed by the problem. Those four explained their selection in those terms. Others, including Brianna, identified Garden 5 as the most probable to be selected by elementary students.


Predicting the greatest area based on appearance is a common misconception among elementary students (and PSTs as well), but it was not the primary misconception of this problem. Grace apparently ran out of time and left part (c) blank. During the first interview, she was given the chance to offer a response. Grace thought students would be unlikely to select choice 6; however, research has shown that the responses and explanations offered by many students (and even preservice teachers) indicate they do think choice 6 is viable. The fact that only four out of 12 expressed an awareness of this student tendency indicates the majority of the PSTs were not sensitive to the fixed relationship misconception. In sum, the CK and KoST for the four case subjects have been presented, discussed, analyzed, compared, and contrasted. The strengths of Grace (her ability to carefully process information coupled with the desire to help students understand) and Brianna (her strong mathematical background and sharp attention to detail) have been contrasted with the fragile understandings of Jackie and Larry. A reflective statement made by Jackie near the end of our first interview aptly summarized those struggles: she felt that she just kind of had pieces of knowledge here and there, and that some of them were mixed up.


Intervention CK and KoST

The findings in the next several sections address research questions 3 and 4: how, if at all, the PSTs' CK and KoST regarding area and perimeter changed during the course of the study. The emerging and post-intervention data came from (as they occurred in chronological order) the three teaching episodes (Appendix K), the posttest (Appendix E), the second interview with the four case subjects, and the follow-up test (Appendix F). Findings were drawn from written responses to the teaching episodes (TEs), the post and follow-up tests, and from transcripts of the second interview. Descriptive statistics will be presented first, followed by qualitative findings meant to support and illuminate the descriptive results. The first major category of findings deals with concepts surrounding units of measure (e.g., linear and square units). This category contains several sections of findings regarding the PSTs' CK as well as their ability to respond to hypothetical students who are struggling or have misconceptions. Changes in the PSTs' CK before, during, and after the intervention will be the focus of the first several sections of findings, and address research question 3. Findings for those sections were primarily taken from the pre-study questionnaire, the microworld orientation session, the post and follow-up tests, and the second interviews, with brief references to teaching episode 1 (TE 1). There will then be a transition to the next major category of sections, focusing on findings related to KoST. Emergent findings from TE 1 will be presented and supported by relevant findings from the post and follow-up


tests, and second interview. In each section, change was examined by looking back and comparing to the PSTs' pre-intervention CK and/or KoST that was presented while answering research questions 1 and 2. The second major category of findings relates to perceived relationships between area and perimeter and includes the fixed relationship and direct relationship misconceptions. Findings will be presented in similar fashion as they were for units of measure. One major difference is that this second major category will involve findings from two teaching episodes, TE 2 and TE 3. The findings related to emergent knowledge came from the PSTs' responses to the numerous writing prompts contained within these episodes. The next section briefly explains the intervention and sets the stage for the discussion of findings that will answer research questions 3 and 4.

A Teacher Development Experiment

The intervention for this study was couched within a teacher development experiment. A dynamic of the teacher development experiment (TDE) is the opportunity to perform the role of instructor and researcher simultaneously while attempting to promote development (teaching) within the preservice teachers as both students and future teachers, all taking place within a cycle of interaction and reflection (Simon & Tzur, 1999). Whole-class interaction for this study took the form of three individual teaching episodes (see Appendix K). The most prolonged individual interaction occurred during the second of two planned interviews with the four case subjects.


The goal of this TDE was to examine how the preservice teachers resolve conflicts and deficiencies in their current content knowledge (CK) and knowledge of student thinking (KoST), as related to area and perimeter, and how they endeavor to incorporate new knowledge (preferably conceptual rather than procedural). The major components of this TDE were the three teaching episodes. Anchored instruction (anchored on major misconceptions surrounding area and perimeter) provided the scaffold for each teaching episode, and two specifically designed microworlds were intended to offer support and motivation for the PSTs as they explored concepts and tested hypotheses. I conjectured that the microworlds would provide a fertile environment for exploring certain profound subtleties related to area and perimeter.

Emergent Levels of CK and KoST

As explained in Chapter 3, findings involving CK and KoST will involve addressing key components of their definitions. For CK that includes: (a) the amount and organization of facts and concepts, and (b) the ability to explain that knowledge in meaningful ways. For KoST that entails: (a) organizing CK in a way that would enable a teacher to anticipate student difficulties, and (b) the ability to diagnose and address student shortcomings or misconceptions. This was true for research questions 1 and 2 and will again apply to the answering of research questions 3 and 4. Identifying examples of expert/novice behavior (Table 3, p. 166) within the PSTs' responses provided a means of describing their emergent levels of CK and KoST. Table 13 (p. 251) presents the totals of novice and expert behaviors, as indicated by a and b respectively, as they were coded in the three teaching episodes (TEs).


Table 13
Expert/Novice Coding Totals from Teaching Episodes

Teaching Episode 1

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 3 | 1 | 2 | 1 | 3 | 0 | 3 | 0 | 2 | 2 | 0 | 0 | 1.4 | 1.2 |
| CK b Sum | 3 | 3 | 2 | 4 | 2 | 3 | 0 | 4 | 3 | 2 | 4 | 3 | 2.8 | 1.1 |
| KoST a Sum | 13 | 3 | 7 | 6 | 7 | 5 | 9 | 0 | 11 | 12 | 4 | 12 | 7.4 | 4.1 |
| KoST b Sum | 1 | 15 | 10 | 7 | 7 | 10 | 3 | 16 | 4 | 5 | 9 | 2 | 7.4 | 4.8 |
| a Sum | 16 | 4 | 9 | 7 | 10 | 5 | 12 | 0 | 13 | 14 | 4 | 12 | 8.8 | 4.9 |
| b Sum | 4 | 18 | 12 | 11 | 9 | 13 | 3 | 20 | 7 | 7 | 13 | 5 | 10.2 | 5.4 |

Teaching Episode 2

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 3 | 0 | 7 | 7 | 4 | 4 | 6 | 0 | 2 | 3 | 6 | 5 | 3.9 | 2.4 |
| CK b Sum | 1 | 6 | 1 | 0 | 1 | 4 | 3 | 8 | 6 | 4 | 2 | 1 | 3.1 | 2.5 |
| KoST a Sum | 9 | 5 | 9 | 12 | 14 | 5 | 18 | 3 | 9 | 10 | 18 | 14 | 10.5 | 4.9 |
| KoST b Sum | 2 | 9 | 8 | 9 | 1 | 11 | 4 | 24 | 8 | 7 | 1 | 2 | 7.2 | 6.4 |
| a Sum | 12 | 5 | 16 | 19 | 18 | 9 | 24 | 3 | 11 | 13 | 24 | 19 | 14.4 | 6.8 |
| b Sum | 3 | 15 | 9 | 9 | 2 | 15 | 7 | 32 | 14 | 11 | 3 | 3 | 10.3 | 8.4 |

Teaching Episode 3

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 1 | 2 | 0 | absent | 1 | 2 | 4 | 0 | 0 | 5 | 3 | 3 | 1.9 | 1.7 |
| CK b Sum | 0 | 1 | 4 | - | 1 | 2 | 0 | 5 | 3 | 4 | 2 | 2 | 2.2 | 1.7 |
| KoST a Sum | 21 | 4 | 9 | - | 9 | 4 | 11 | 7 | 13 | 5 | 12 | 19 | 10.4 | 5.7 |
| KoST b Sum | 2 | 16 | 13 | - | 5 | 22 | 4 | 18 | 7 | 5 | 11 | 2 | 9.5 | 6.9 |
| a Sum | 22 | 6 | 9 | - | 10 | 6 | 15 | 7 | 13 | 10 | 15 | 22 | 12.3 | 5.8 |
| b Sum | 2 | 17 | 17 | - | 6 | 24 | 4 | 23 | 10 | 9 | 13 | 4 | 11.7 | 7.7 |

Note. An a signifies a novice response and b signifies an expert response (see Table 3). PST #4 was absent for teaching episode 3.


As might be expected, the frequency patterns present in the TEs are very similar to those observed in the pre, post, and follow-up tests. Each TE contained 12 to 14 writing prompts, and each prompt was categorized as predominantly addressing content knowledge (CK) or knowledge of student thinking (KoST), thus accounting for the frequencies CK a, CK b, KoST a, and KoST b. The PSTs found TE 1 the easiest of the three to decipher. All of the PSTs, except Jackie, found its content questions relatively straightforward. Because of that, Jackie received no CK b codes and the other PSTs had relatively similar CK b frequencies. While the PSTs performed fairly well with the CK questions related to TE 1, their inability to explain that knowledge, along with a limited capacity to apply their CK and adequately address the struggling student in the TE, resulted in much higher novice frequencies related to KoST. Brianna and Grace had the highest KoST b (i.e., expert) frequencies by a relatively large margin, and this was reflected in the substance of their responses, as will be seen later. It is worth noting that Brianna was not assigned a single novice code for her TE 1 responses, and Grace received the second lowest total of four. Teaching episode 2 (Figure 15, p. 134) required the PSTs to grapple with two relatively difficult concepts. One was the misconception that a fixed perimeter (i.e., the piece of string) can have only one area (i.e., the desired area of the footprint). The second involved a correct method to find/estimate the area of a footprint (an irregular shape). Several became fixated with finding the area of the footprint rather than on the


misconception that was the focus of the TE. That fixation contributed to the number of novice codes received, especially those relating to KoST. TE 2 was probably the most difficult for two reasons: (a) the potential distraction of finding the area of the footprint, and (b) because of where it fell within the intervention; there was still considerable instruction and learning to take place. The mean number of novice codes assigned was highest for this TE. There was a lot of mathematics involved with TE 2, and Brianna excelled. She was able to effectively apply her strong mathematical CK, and because of that she earned the highest number of expert codes (32) and the lowest number of novice codes (3). Grace was second in both areas with 15 and 5 respectively. Jackie and Larry ranked first and second in receiving the most novice codes, and while Jackie improved slightly over TE 1 by receiving more expert codes, Larry continued to perform near the bottom of the class. Teaching episode 3 involved the PSTs investigating a very common, and elusive, misconception regarding a perceived relationship between area and perimeter. The class averages for novice and expert codes were relatively equal to the previous two TEs, with a slight increase in expert levels of KoST. Brianna and Grace ranked second and third in overall frequency of expert responses, and this was primarily accounted for by strong performance in the KoST category. During all the teaching episodes, and this one particularly, Larry was observed just staring at the work in front of him for several minutes. This lack of activity (e.g., exploring with the microworlds) accounted for the high frequency of novice codes, especially regarding his KoST. Some patterns, however, are not readily evident in Table 13. Jackie does not seem to respond well initially to new


material, as was the case with the unique nature of each teaching episode. In the qualitative section, it will be shown that when engaged in conversation about mathematical content, she was better able to clarify and present her understanding about the concepts being discussed.

Comparisons of Pre, Post, and Follow-up Levels of CK and KoST

Descriptive statistics are presented to provide an overall view of the changes in CK and KoST that were measured following the three teaching episodes. Because the posttest occurred one week after the third and final whole-class intervention (i.e., TE 3), the posttest is more of an immediate measure of the whole-class intervention. The only intervention that occurred after the posttest was the second interview with the four case subjects; that interview involved both planned and unplanned teaching opportunities. The follow-up test is better thought of as a measure of retention; however, since it was the same test as the pretest, there is value in comparing responses, especially for the case subjects, in light of their second interview. With that in mind, the posttest will receive a thorough and in-depth analysis, with responses from the follow-up test used to verify that knowledge (including that addressed during the second interview) was retained. The significance of scores and written responses on the posttest, with appropriate data from the follow-up test, will then be delineated by discussing results from the teaching episodes as well as vignettes from the second interviews with the case subjects. This triangulation will provide a rich description of the course of this study. The posttest (Appendix E) was given to all 12 PSTs on October 30, 2007, one full week after the completion of the third and final teaching episode.


The pre, post, and follow-up tests all consisted of 10 items. The first five were intended to focus on CK, and questions six through ten were designed to elicit KoST responses along with CK. The mean time to complete the posttest was 64.2 minutes, almost 10 minutes greater than the pretest. The mean completion times for the CK and KoST subtests were 25.2 and 39 minutes respectively; that is an increase of almost 8 minutes for the KoST subtest. The least amount of time spent on the posttest was 45 minutes, and the longest was 85 minutes by Jackie, who asked for an extra 10 minutes to complete the KoST section. Larry needed only 51 minutes to complete the posttest, Grace required 70, and Brianna took 80. Although Brianna methodically worked through the test, it appeared to the researcher that Larry was concerned with just getting done. PST #1 had the shortest completion time, and she also was the only PST to have a lower score on the posttest than on the pretest. The posttest total scores had a mean of 28.25 and a standard deviation of 4.0 (see Table 14). The scores appeared to have a relatively normal distribution, with skewness and kurtosis values of 0 and -1.2 respectively; the kurtosis value, while slightly platykurtic, is within acceptable ranges. The follow-up test was administered on December 11, 2007. The mean for the follow-up test was 27.83 (SD = 4.3), and skewness and kurtosis values were acceptable at 0.07 and 1.3 respectively. On the posttest, Larry had the lowest total score and was over two standard deviations below the mean; he had the lowest scores on the CK and KoST subtests as well. Grace shared the highest overall score of 33 and the highest KoST subtest score (17) with PST #6.


Table 14
Descriptive Statistics for Pre, Post, and Follow-up Tests

Total Score

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Pretest | 25 | 27 | 23 | 18 | 16 | 24 | 13 | 30 | 18 | 23 | 21 | 17 | 21.25 | 4.97 |
| Posttest | 23 | 33 | 25 | 31 | 27 | 33 | 28 | 31 | 29 | 31 | 28 | 20 | 28.25 | 4.00 |
| Follow-up | 25 | 33 | 27 | 29 | 23 | 32 | 25 | 34 | 31 | 30 | 24 | 21 | 27.83 | 4.26 |

Content Knowledge (CK)

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Pretest | 12 | 16 | 13 | 8 | 8 | 14 | 4 | 14 | 8 | 12 | 10 | 8 | 10.58 | 3.48 |
| Posttest | 12 | 16 | 13 | 17 | 12 | 16 | 13 | 16 | 16 | 15 | 14 | 11 | 14.25 | 2.00 |
| Follow-up | 11 | 17 | 15 | 13 | 12 | 17 | 11 | 18 | 16 | 17 | 12 | 10 | 14.08 | 2.87 |

Knowledge of Student Thinking (KoST)

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Pretest | 13 | 11 | 10 | 10 | 8 | 10 | 9 | 16 | 10 | 11 | 11 | 9 | 10.67 | 2.10 |
| Posttest | 11 | 17 | 12 | 14 | 15 | 17 | 15 | 15 | 13 | 16 | 14 | 9 | 14.00 | 2.41 |
| Follow-up | 14 | 16 | 12 | 16 | 11 | 15 | 14 | 16 | 15 | 13 | 12 | 11 | 13.75 | 1.91 |

Note. Total scores range from 0 to 40; a score of 40 indicates a model response for all 10 items. CK and KoST subtest scores range from 0 to 20. One PST ran out of time and did not completely finish two problems.
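As a quick check on these descriptive statistics, the short sketch below (illustrative only; it is not part of the study's analysis) recomputes the pretest and posttest means, sample standard deviations, and the roughly 33% mean gain from the total scores in Table 14:

```python
import numpy as np

# Total scores for the 12 PSTs, copied from Table 14.
pretest  = np.array([25, 27, 23, 18, 16, 24, 13, 30, 18, 23, 21, 17])
posttest = np.array([23, 33, 25, 31, 27, 33, 28, 31, 29, 31, 28, 20])

for name, scores in [("Pretest", pretest), ("Posttest", posttest)]:
    # ddof=1 gives the sample standard deviation reported in the table.
    print(f"{name}: M = {scores.mean():.2f}, SD = {scores.std(ddof=1):.2f}")

gain = (posttest.mean() - pretest.mean()) / pretest.mean()
print(f"Mean gain from pretest to posttest: {gain:.0%}")   # about 33%
```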


The results indicated that the easiest problem on the posttest was question 5 (M = 3.25, SD = 0.62). The misconception being tested is that a fixed perimeter will have only one area, the very same misconception investigated in teaching episode 2, which had proved difficult for most PSTs. The hardest item was once again question 4 (M = 2.33, SD = 0.78), which asked PSTs to explain and differentiate between linear and square units. The exact same question appeared on the pre, post, and follow-up tests, and was statistically the most difficult each time. On this question, both Jackie and Larry received a score of 2 (inferior), Grace scored a 3 (acceptable), and Brianna earned a model score of 4. Examining the change in total scores from the pretest to the posttest revealed positive growth for 11 out of 12 PSTs. The posttest mean of 28.25 represents an impressive 33% increase over the pretest average score. Grace showed an increase of 22% and Brianna 3%, while Jackie's posttest score of 28, though still below the mean, was an increase of 115% over her pretest score; this was largely due to an increase in her CK subtest score from 4 to 13. Every PST's CK subtest score either increased (n = 9) or remained unchanged (n = 3). The KoST subtest scores showed strong improvement as well; the range of increase was from 2 points (20%) to 7 points (70%). Two PSTs' KoST subtest scores (#1 and Brianna) decreased slightly, by 2 and 1 points respectively. The largest score difference between a CK subtest and a KoST subtest was three. The total score percent increase of 33% was well balanced between a CK score increase of 35% and a KoST increase of 31%. Results from the follow-up test lend credence to the statistical evidence that knowledge gained during the study and demonstrated on the posttest was retained. The group mean decreased from 28.25 to 27.83 (a decrease of 1.5%) from posttest to follow-up test. Means from the CK and KoST subtests were basically unchanged. The greatest


258 individual drop between postt est to follow up was 4 points ( 14%) by PST #11, while the greatest increase was 3 points (+10%) by Brianna. These changes can be depicted by the use of regression lines and will be presented at the end of the qualitative analysis section. Changes in Rubr ic Score Frequencies Examining posttest score frequencies of the PSTs (Table 15 ) revealed several noteworthy results. On the pretest Jackie received seven unacceptable scores (one 0 and six 1s); however, on the posttest she did not receive any such scores and while she only achieved one acceptable score of 3 on the pretest, her responses earned 8 such scores on the posttest. Larry did not experience the same success. On the pretest Larry received 9 unacceptable scores (four 1s and five 2s), and on the po sttest Larry received the highest number of unacceptable scores (8; two 1s and six 2s). The entire class decreased their total number of unacceptable scores (0s, 1s, and 2s) from 74 on the pretest to only 35 on the posttest. Grace was the only PST who rece ived all acceptable scores (seven 3s and three 4s). Model responses rose sharply for the posttest. There were only seven 4s assigned on the pretest but 19 on the posttest. There were only three PSTs who did not receive any scores of 4 on their posttest, tw o of whom were Jackie and Larry. Changes in Expert/Novice Frequency Totals Comparing frequencies of expert/novice behavior (see Table 3 page 167 ) as way to portray the frequency totals of novice and expert behaviors, as indicated by a and b respectively, as they were calculated from the pre post and follow up test. Each test consisted of 10


259 Table 15 Post & Follow up Test Rubric Score Frequencies Pretest Posttest Follow up Test PST 0 1 2 3 4 0 1 2 3 4 0 1 2 3 4 #1 1 4 4 1 2 3 5 10 Grace 5 3 2 7 3 1 5 4 #3 1 6 2 1 1 4 4 1 4 5 1 #4 5 2 3 1 7 2 1 1 6 2 #5 1 4 3 2 4 5 1 7 3 #6 2 3 4 1 1 5 4 1 6 3 Jackie 1 6 2 1 2 8 1 4 4 1 Brianna 1 8 1 2 5 3 6 4 #9 1 3 3 3 3 5 2 2 5 3 #10 3 2 4 1 1 7 2 2 6 2 #11 3 3 4 3 6 1 1 4 5 Larry 4 5 1 2 6 2 3 3 4 Total s 3 32 39 39 7 5 30 66 19 6 29 65 20 Note The questions for the pretest and follow up test were exactly the same (other than changing student names)


260 questions. The first five focused on CK and the second five questions added KoST parts. The pretest and follow up test were identical in content and presentation, and the posttest contained items that parallel the pre and follow up tests. Examining class means, we can see there were improvements from pretest to posttest. There were fewer novice codes assigned and the number of expert codes increased by over three fold (from 10.3 to 31.3). ositive change. On the pretest she received by far the most novice codes (50, which was almost 16 above the mean), while her CK did not earn any expert codings and her responses related to KoST received only 4. On the posttest Jackie was able to decrease t he frequency of novice CK responses (from 26 down to 14) and increase those earning expert codes (from 0 to 10). up from only four on the pretest. Apparently the v arious interventions helped Jackie to both increase and organize her CK in ways that enabled her to more appropriately respond to student difficulties and misconceptions (i.e., her KoST). There was a decrease in the frequency of their novice codes for bo th Grace and Brianna from pretest to posttest. This change remained stable through the follow up test. Brianna had the highest combined frequency of expert codes (le d by her strong CK) for the posttest, along with the lowest number of novice codes. For the posttest, Grace was second in each respective category. On the follow slightly fewer expert codes than did Grace (who had the most), due primarily to Brianna neglecting to include appropriate diagrams with her responses. L arry did increase the number of expert codes received from pretest to posttest (from 5 to 15); however, his


Table 16
Expert/Novice Coding Totals for Pre, Post, and Follow-up Tests

Pretest

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 16 | 11 | 14 | 20 | 21 | 10 | 26 | 13 | 12 | 10 | 18 | 18 | 15.8 | 5 |
| CK b Sum | 3 | 11 | 4 | 1 | 1 | 8 | 0 | 6 | 4 | 7 | 2 | 2 | 4.1 | 3.3 |
| KoST a Sum | 18 | 16 | 31 | 16 | 22 | 16 | 24 | 10 | 15 | 17 | 19 | 22 | 18.8 | 5.4 |
| KoST b Sum | 6 | 9 | 6 | 6 | 2 | 8 | 4 | 15 | 5 | 6 | 5 | 3 | 6.3 | 3.4 |
| a Sum | 34 | 27 | 45 | 36 | 43 | 26 | 50 | 23 | 27 | 27 | 37 | 40 | 34.6 | 8.7 |
| b Sum | 9 | 20 | 10 | 7 | 3 | 16 | 4 | 21 | 9 | 13 | 7 | 5 | 10.3 | 6 |

Posttest

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 14 | 7 | 13 | 11 | 20 | 11 | 13 | 6 | 14 | 10 | 12 | 17 | 12.3 | 3.9 |
| CK b Sum | 11 | 17 | 10 | 12 | 8 | 11 | 10 | 19 | 9 | 11 | 15 | 7 | 11.7 | 3.6 |
| KoST a Sum | 22 | 12 | 22 | 18 | 13 | 11 | 18 | 11 | 22 | 13 | 19 | 29 | 17.5 | 5.6 |
| KoST b Sum | 15 | 26 | 15 | 21 | 23 | 26 | 19 | 25 | 14 | 25 | 20 | 8 | 19.8 | 5.7 |
| a Sum | 36 | 19 | 35 | 29 | 33 | 22 | 31 | 17 | 36 | 23 | 31 | 46 | 29.8 | 8.4 |
| b Sum | 26 | 43 | 25 | 33 | 31 | 37 | 29 | 44 | 23 | 36 | 35 | 15 | 31.4 | 8.4 |

Follow-up Test

| PST | #1 | Grace | #3 | #4 | #5 | #6 | Jackie | Brianna | #9 | #10 | #11 | Larry | Mean | SD |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| CK a Sum | 15 | 8 | 12 | 19 | 14 | 12 | 18 | 12 | 15 | 10 | 18 | 17 | 14.2 | 3.5 |
| CK b Sum | 8 | 19 | 14 | 8 | 13 | 11 | 6 | 14 | 11 | 15 | 9 | 4 | 11 | 4.2 |
| KoST a Sum | 15 | 8 | 18 | 9 | 16 | 5 | 11 | 11 | 13 | 16 | 11 | 19 | 12.7 | 4.2 |
| KoST b Sum | 13 | 21 | 9 | 18 | 10 | 23 | 12 | 15 | 12 | 10 | 13 | 7 | 13.6 | 4.9 |
| a Sum | 30 | 16 | 30 | 28 | 30 | 17 | 29 | 23 | 28 | 26 | 29 | 36 | 26.8 | 5.7 |
| b Sum | 21 | 40 | 23 | 26 | 23 | 34 | 18 | 29 | 23 | 25 | 22 | 11 | 24.6 | 7.4 |


262 frequency of novice codes also increased (from 40 to 49). Larry received the most novice codes as well as the fewest expert codes for both the posttest and the follow up test. During our first interview, Larry often indicated that he did not have a firm un derstanding of the concepts at hand, and that relative quick completion time on each test, combined with his brief (often unclear) responses, indicated that Larry was more interested in completing the t ests than doing a thorough job. Changes in the Frequency of Specific Expert/ N ovice Codes Assigned Table 17 shows many of the specific codes that comprised the totals that were just discussed in Table 16 This table also reveals strengths and weaknesses of various PSTs. The case subjects were the focus of this table because their coded responses could be verified through the second interview. Certain codes, because they required a high level of expertise (e.g., 1 b and 9 b ), were not assigned very often. Specific codes aligned very well with aspects of CK and KoST, and were used to compare the amount and type of respective knowledge present at the pre post, and follow up test. For example, codes involving knowledge structure (1 a /1 b ) as well as explanatory framework (8 a /8 b 15 a /15 b and 16 a /9 b a /2 b ) as well as their ability to address shortcomings and misconceptions (e.g., 7 a /7 a /7 b 12 a /12 b and 13 a /13 b ) were useful in For example, posttest can be partially explained by the fact she received the novice codes of 1a 8a and 16 a a total of 10, 3, and 16 times r espectively, but the frequencies of those codes were reduced on the posttest to 5, 0, and 10 respectively. In addition to the reduction of


263 Table 17 Expert /Novice Coding Frequencies for Case Subjects from Pre Post and Follow up Tests Pretest Code 1 a 1 b 2 a 2 b 3 a 3 b 4 a 6 a 7 a 7 a 7 b 8 a 8 b 9 a 9 b 10 a 10 b 11 a 11 b 12 a 12 b 13 a 14 a 15 a 15 b 16 a 17 a Grace 6 3 3 3 0 3 0 1 3 0 2 1 4 5 1 0 0 1 2 0 1 0 1 2 0 4 0 Jackie 10 0 8 1 0 0 1 1 3 1 1 3 1 3 0 0 0 0 0 0 1 0 3 4 0 11 2 Brianna 6 1 2 7 1 1 1 0 3 0 2 0 6 6 2 0 0 1 2 0 0 0 1 0 0 2 0 Larry 8 0 5 3 1 0 1 0 4 2 1 1 1 4 0 0 0 2 0 0 0 0 3 5 0 9 0 Class Avg 7.8 1.2 4.6 3.5 0 .7 0 .6 0 .3 0 .5 2.8 1.3 1.2 1.1 2.1 3.5 0 .7 0 .2 0 1.6 0 .8 0 0 .2 0 1.0 2.0 0 7.1 .3 Class SD 3.1 2.1 5.1 3.6 1.9 0 .7 0 .7 1.3 4.2 2.2 1.9 1.4 2.9 4.9 1.2 0 .4 2.2 1.2 0 .4 1.7 2.5 6.2 0 .9 3.1 2.1 5.1 3.6 Posttest Code 1 a 1 b 2 a 2 b 3 a 3 b 4 a 6 a 7 a 7 a 7 b 8 a 8 b 9 a 9 b 10 a 10 b 11 a 11 b 12 a 12 b 13 a 14 a 15 a 15 b 16 a 17 a Grace 0 8 0 9 0 1 0 0 5 1 1 0 9 2 3 0 4 0 5 2 2 4 0 1 1 7 0 Jackie 5 3 0 9 0 0 0 0 1 2 5 0 6 1 0 1 3 4 2 4 0 4 0 2 0 10 0 Brianna 0 9 0 9 0 0 0 0 3 0 4 0 7 2 6 2 3 1 3 3 1 4 0 2 2 3 0 Larry 8 0 3 7 0 0 1 0 4 2 2 1 5 3 0 3 0 6 0 3 1 4 0 1 0 10 0 Class Avg 3.4 5 1.1 7.8 0 0 .4 0 .1 0 3.1 1.3 3.3 1 6.7 2.3 1.4 1.9 2.4 2.3 2.6 3.3 0 .8 3.8 0 .1 1.4 0 .5 7.7 0 Class SD 2.1 2.9 1.7 3.7 0 1 0 .3 0 2.5 1.3 2.4 1.2 3.5 3.2 1.3 2.9 3.1 4 2.6 5.1 2 5.6 0 .3 1.7 1.3 4.3 0 Follow up test Code 1 a 1 b 2 a 2 b 3 a 3 b 4 a 6 a 7 a 7 a 7 b 8 a 8 b 9 a 9 b 10 a 10 b 11 a 11 b 12 a 12 b 13 a 14 a 15 a 15 b 16 a 17 a Grace 1 7 1 7 0 3 0 1 3 1 3 0 8 0 5 2 2 0 4 2 1 3 1 1 0 4 0 Jackie 6 0 2 6 0 0 0 0 1 1 5 0 5 2 0 2 1 1 1 2 0 3 0 4 0 9 0 Brianna 1 7 1 7 0 0 0 0 2 1 2 1 9 5 1 2 1 3 2 2 0 3 1 1 0 4 0 Larry 6 2 6 2 0 1 0 0 4 1 1 1 4 1 0 3 0 4 0 2 1 3 1 0 0 8 0 Class Avg 4.1 3.5 2.2 5.8 0 0 .9 0 .1 0 .1 2.2 0 .8 3.3 0 .9 6.8 2.5 0 .9 1.5 0 .9 2.5 1.8 2.3 0 .4 2.9 0 .8 1.2 0 6.8 0 Class SD 3.2 2.5 3.8 4.9 0 1.3 0 .3 0 .3 2.8 1.3 3.4 1.9 2.3 5.3 1 3 1.7 2.4 1.9 4.8 1 5.6 2.2 1.3 0 3.9 0 Note There were no codes of 4 b 5 a 5 b 6 b or 14 b assigned for any test; 13 b was assigned only 5 times ( 4 on the post and 1 on the follow up )


264 categories. Take for example those reflecting her KoST. Jackie received 1 code of 2 b on the pretest and 9 such codes on the posttest; similarly, the frequency of 7 b increased from 1 to 5, from pre remained fairly constant on the follow up test. Brianna and Grace strengthened their CK as evident by the fact that they received no codes of 1 a or 8 a on the posttest and received relatively high numbers of codes 8 b and 9 b Their increases in KoST can be seen by the higher than average frequencies of codes 2 b 10 b and 11 b A significant change regarding Brianna can be seen by examining the codes 9 a and 9 b Brianna has a strong mathematics background and tended to be very procedural in her problem solving, explanations, and how she indicated she would interact with students, indicated by the high rate of code 9 a on the pretest. Throughout the teaching episode doing, and explaining mathematics. She consciously made efforts to think more conceptually, which was evidenced by the decrease in 9 a s assigned and the increase in 9 b s she received. Larry on the other hand continued to struggle with the mathematics contained in the study as well as explaining his ideas (see the high rates of codes 1 a 7 a and 16 a ). He also showed little, if any, improvement in how he contemplated and addressed student thinkin g (see codes 2 a 2 b and 11 b ). Tables of expert/novice codes revealed response patterns within individuals, as well as within the entire class. For example, the relatively low frequency of code 7 b (i.e., the ability to generate appropriate not realize the importance of diagrams presenting conceptual explanations of


mathematical concepts. This tendency was repeated by a low rate of code 12 b (i.e., the appropriate use of manipulatives) and the total absence of code 13 b (i.e., the appropriate incorporation of technology). The PSTs' limited inclination toward incorporating technology is somewhat troubling given the tremendous focus placed upon the two microworlds used in this study.

Linear Regression Involving CK, KoST, and Total Test Scores

The last quantitative measures of the change in knowledge that occurred during this study were regression lines fitted to each PST's pre, post, and follow-up CK, KoST, and total test scores (Table 18). R² values were included as an indication of how well the regression line fits the test scores.

Table 18
Regression Equations for PST CK, KoST, and Total Scores

| PST | CK regression eq. | R² | KoST regression eq. | R² | Total score regression eq. | R² |
|---|---|---|---|---|---|---|
| #1 | y = -0.5x + 12.2 | .75 | y = 0.5x + 12.2 | .11 | y = 24.3 | 0 |
| Grace | y = 0.5x + 15.8 | .75 | y = 2.5x + 12.2 | .60 | y = 3x + 25 | .75 |
| #3 | y = x + 12.7 | .75 | y = x + 10.3 | .75 | y = 2x + 21 | .99 |
| #4 | y = 2.5x + 10.2 | .31 | y = 3x + 10.3 | .96 | y = 5.5x + 15 | .62 |
| #5 | y = 2x + 8.7 | .75 | y = 1.5x + 9.8 | .18 | y = 3.5x + 15 | .40 |
| #6 | y = 1.5x + 14.2 | .75 | y = 2.5x + 11.5 | .48 | y = 4x + 21.7 | .66 |
| Jackie | y = 3.5x + 5.8 | .55 | y = 2.5x + 10.2 | .60 | y = 6.5x + 10 | .57 |
| Brianna | y = 2x + 14 | .99 | y = 15.7 | 0 | y = 2x + 27.7 | .92 |
| #9 | y = 4x + 9.3 | .75 | y = 2.5x + 10.2 | .99 | y = 6.5x + 13 | .86 |
| #10 | y = 2.5x + 12.2 | .99 | y = x + 12.3 | .16 | y = 3.5x + 21 | .65 |
| #11 | y = x + 11 | .25 | y = 0.5x + 11.8 | .11 | y = 1.5x + 21.3 | .18 |
| Larry | y = x + 8.7 | .43 | y = x + 8.7 | .75 | y = 2x + 15.3 | .92 |
| Class | y = 1.8x + 11.2 | .71 | y = 1.5x + 11.3 | .69 | y = 3.3x + 19.2 | .70 |


The closer an R² value is to 1, the better the regression line fits the data. The mean R² for CK, KoST, and total score was .67, .47, and .55 respectively, while the medians were .75, .54, and .66 respectively. Due to the small sample size (N = 12), the mean was more volatile to extreme R² values. An R² of 0 occurred twice (once for KoST and once for total score), in both cases because a score dipped on the posttest and then rebounded on the follow-up test. The several lower/weaker R² values for KoST scores can be partially explained by the six instances where a follow-up KoST score was lower than the posttest score (average decrease was 3.25); compare that with the four instances where CK had a lower follow-up score than posttest score (average decrease 2.25). The slope of the class CK equation indicates that, on average, the PSTs' CK regarding area and perimeter increased by 1.8 points (on the 0 to 20 subtest scale) with each successive test administration from pretest through follow-up test. Nine of the CK equations had R² values which explained more than 50% of the variance, whereas six of the KoST equations had R² values above 50%. The regression lines for the case subjects' total scores (Figure 30), along with those of the other eight PSTs (Figures 28, 29, 31, and 32), appear below to provide comparisons as well as to demonstrate each individual's change in CK, KoST, and total knowledge that occurred throughout the study.
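The equations in Table 18 can be recovered with an ordinary least-squares fit. The sketch below is illustrative only (it is not the software used in the study) and assumes the three test administrations were coded x = 1, 2, 3 for the total scores, a coding consistent with the intercepts of the total-score equations; it reproduces Grace's total-score line and R²:

```python
import numpy as np

# Grace's total scores on the pretest, posttest, and follow-up test (Table 14),
# with the administrations coded 1, 2, 3 (an assumption consistent with Table 18).
x = np.array([1, 2, 3])
y = np.array([27, 33, 33])

slope, intercept = np.polyfit(x, y, 1)          # least-squares line
y_hat = slope * x + intercept
r_squared = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()

print(f"y = {slope:.1f}x + {intercept:.1f}, R^2 = {r_squared:.2f}")
# Expected output: y = 3.0x + 25.0, R^2 = 0.75  (matching Table 18)
```

The intercepts of the CK and KoST equations, by contrast, appear to be consistent with coding the administrations as x = 0, 1, 2.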


267 Figure 27


268 Figure 28


269 Figure 29


Figure 30. Regression lines and equations for each case subject's total score.


Figure 31. Regression lines and equations for each PST's total score.


Figure 32. Regression lines and equations for each PST


273 discuss and explain their current notions and understandings regarding area and perimeter. The very act of working through the pretest, being interviewed (for the case su bjects), and then receiving content instruction prior to the first teaching episode seemed to resolve many of the glaring confusions regarding distinguishing whether a problem involved working with area or perimeter. For example, there were no responses si milar to were integrated within the two major categories of knowledge used to answer research questions 3 and 4: (a) Units of measure, and (b) Perceived relationships between area and perimeter. Findings from the three teaching episodes, interview vignettes, postt est, follow up test, and classroom observations will be presented in the next several sections. Because the teaching episodes (TEs) comprise the primary means of intervention for this study, findings from the TE s embody emergent knowledge. Findings from th e posttest represent post intervention knowledge, and are supported by findings from the follow up test, an indication of retention. The writing prompts contained within the TEs were written to provide a progressive learning experience. By design, the TEs allowed each PST to create their own personal learning trajectory. Because of this, findings presented in the emergent knowledge sections w ere not directly compared to findings from specific test items (i.e., in a pre post comparison method). The results f rom the TEs function as a bridge between the pretest and posttest, and indicate levels of change that w ere discussed as a continuum of change resulting from the intervention (i.e., from TE 1 through TE 3). Therefore whenever possible, discussions bega n wit h appropriate findings from a


274 teaching episode and then be expanded upon and/or supported with (i.e., triangulated) select problems from the post and follow up tests. The questions on the posttest were parallel to the pretest in difficulty, content (e.g., area, perimeter, linear, and/or square units), and misconception(s) addressed. The questions on the follow up test were identical to the pretest. The majority of the findings and subsequent discussion regarding the posttest focused on the questions that p arallel those presented while answering research questions one and two. In order to make the answering of research questions 3 and 4 more apparent, findings w ere presented as predomina ntly addressing either the CK (the focus of question 3) or the KoST (t he focus of 4) of the PSTs. By their very nature, CK and KoST interact with each other and are therefore not mutually exclusive. At times it w as both impossible and impractical to completely separate certain CK and KoST findings. Also, not every category o f findings (e.g., Knowledge regarding irregular shapes ) address ed both CK and KoST or contain ed pre intervention, emergent, and post intervention findings. Emergent findings were limited in scope by the content contained within the three TEs; however, ea ch appropriate category of findings contain ed some form of comparison (i.e., pre to post or pre to follow up, with emergent findings strategically inserted) in order to document change in CK and/or KoST. The findings regarding units of measure and perc eived relationships in entirety provide d the case intervention CK and KoST with their emergent and post intervention CK and KoST to assist in answering research questions 3 and 4. The fact that un its of measure (i.e., linear and square units) are fundamental to area and perimeter resulted in their findings being interspersed throughout the pre post and follow up


275 tests as well as the planned intervention (i.e., the TEs); therefore, it was not possible to parcel the categories of findings regarding units of measure in the same fashion as it was with the findings on perceived relationships. Research question 3 specifically deals with changes in n by examining findi ngs regarding concepts of area and perimeter surrounding linear and square units. Changes in CK Regarding Units of Measure When considering rectangles (the primary shape discussed in this study), determining area and perimeter involve s calculations with the lengths of sides. A conceptual understanding of area and perimeter needs to equip the student and teacher alike with the knowledge to more consistently perform the correct measurement. While each measure involves a calculation with s ides, area and perimeter also require attention to their appropriate unit s (i.e., linear or square). These concepts are intrinsically linked, and a profound CK and KoST should always include appropriate mention of linear and square units when discussing ar ea and perimeter. Because of the fundamental importance of units of measure, a considerable amount of reporting will be devoted to this category. came from TE 1, TE 2, the post and follow up tests, observations by the researcher/instructor and second observer, and the second interview with the case subjects. The first interview with each case subject was designed to only gather information to help establish a baseline of their CK and KoST; therefore, the first intentional intervention came on November 2, 2007 with the presentation of TE 1 ( see Figure 14, p. 130 ). Teaching episode 1 commenced with a 15 minute, instructor lead discussion regarding units of measure. Linear, square, and cubic units were taught along


276 with their appropriate measurement (perimeter, area, and volume). These central, unifying ideas undergird all measurement. Visual representations for each unit were presented to help develop a conceptual understanding of how shapes are comprised of the various units used to measure them. The instructor/researcher purposely used diagrams when teaching about units to model effective instruction; however, the instructor/researcher did not specifically tell the PSTs that they should follow suit in their personal responses. Confusing the measure with its uni t. Since teaching episode 1 (TE 1) was the first regarding the measures of area a nd perimeter and their understandings of the appropriate unit for each. TE 1 was designed to provide the PSTs an opportunity to investigate ideas surrounding area and perimeter and linear and square units. There were 3 primary concepts at work within TE 1: (a) perimeter involves linear not square units (CK), (b) finding the perimeter of an irregular shape (CK), and (c) comprehending, explaining, and investigated by asking th correct perimeter would be, and (3) Explain, mathematically speaking, what is correct or incorrect about Justi a perimeter of 20 square units although this TE fell into one of four groups. s method was correct. Out of


277 this situation he came up with the would produce the right answer she then went on to ex counting the square units that are shaded. He really only needs the linear units. He does appear this PST was careless in her analys since the corners get counted twice since the corners get counted twice, is tro ubling because it seems to put the focus on trying to make using the correct unit, linear in this case, for the appropriate measure (i.e., perimeter). treated the shape as though it were a 4 9 rectangle with a She went on to explain her of the shape. Here Jackie is performing an iteration to calculate perimeter; al beit, she iterated the wrong unit. Jackie, just as Justin did, incorrectly applied her CK within a pro blem solving situation. Later during the same session after r eflect ing on her ideas


278 ( with the aid of the Shape Builder MW) boxes around the shape instead of counting the sides around the shape. That would linear units) w as seen often within the findings as a dividing line between novice and expert responses. So after initial difficulties, it appeared Jackie had resolved her confusion to a greater degree than the other PST. Once Jackie and the other PST realized their initial thoughts about the focus problem were wrong, that meant all 12 PSTs were able to (al though at different times and to different degrees) decipher as incorrect. There were three PSTs in the next category of responses. These PSTs realized sed unproductively either what would have to be done in order to make his method work, or trying to over analyze it instead of simply explaining why it was wrong. For problem], which I not for other irregular shapes and is basically unproductive. method was incorrect and were also able to find the correct perimeter, used either unclear


279 squares not the perimeter outside the shape. It is the black line around the outside of the acceptable language for a teacher. During the second interview with Larry, after weeks o f intervention, we discussed his responses to the TE 1. He was given an opportunity to are that lacking or unorganized which affected his ability to use meaningful and appropriate vocabulary when discussing mathematics with elementary children. All ei ght of the previously mentioned PSTs avoided the important discussion involving terms, such as linear and square units, and how Justin was using square units to measure perimeter. The last category of responses more effectively communicated these ideas. T here were four PSTs (including Grace and Brianna) whose responses incorporated, to different degrees, the concepts of perimeter and linear and square units, I answered this problem because I am still struggling with the concepts of area and perimet


280 measuring square units instead of linear units. Perimeter is the outside boundary of the shape dimensional units, rather than the 1 dimensional linear units that make up the actual perimeter of the shape These findings were early on in the intervention process, and several of the PSTs who had incomplete, unorganized, or unproductive explanations in the first half of TE 1 were making positive strides near the end, as will be seen when discussing their KoST regarding the student presented in TE 1 Findings related to the category, Confusing the measure with its units were also em appearing on the follow up test (Note: the pretest and the follow up tests contained the same problems in the same order). As reported when discussing the pretest (see Figure 22 p. 214) the PSTs had considerable difficulty with drawing a polygon (on a grid provided) that had a perimeter of 24 units and then explaining how they knew they were correct the two parts of problem #1. Eight out of 12 PSTs provided diagrams and/or explanations that addressed, to different degrees, concepts related to area, a nd the scores reflected the confusion. There were five scores of 1 (range 0 to 4), four scores of 2, two who earned a score of 3, and one model response of 4 (Grace). Results from the same item appearing on the follow up test were much better. The mean s core for problem #1 increased from 1.92 on the pretest to 2.83 on the follow up. Overall, there were three scores of 2 awarded, eight scores of 3, and one


281 model response of 4 (Brianna). Not only did the scores improve, but so did the depth of the responses. Three PSTs correctly drew an irregular polygon that had a perimeter of 24. Six responses included justifications of their shape using language similar to even more precise by explaining that the perimeter of their shape could be found by counting the outside linear units. CK containing rich dialogue such as thi s was, for the of the three earning a score of 2 on item one of the follow up. Larry drew a 6 6 square, which does have a perimeter of 24, but his response to the sec ond part of the problem (How would you help a 4 th grader understand that the polygon you drew really does have well when he took the follow up test, but one would still hope for greater detail and and its appropriate unit of measure is that it is lacking. Grace made what appeared to be a careless mistake and drew a 4 6 rectangle which has an area of 24. The reason it appeared to be careless was because her explanation for part 2 implied she drew a rectangle that had a perimeter of 24. She wrote hat it intervention CK could be summarized as most often correct but possessing a limited ability to explain. This response, as well as more in the coming


282 There were eight PSTs who earned a score of 3 for their work on the first problem of the follow up test. Their responses revealed slight differences in their understandings related to units of measure, and in an ability to explain their ideas. All eight drew a correct shape but their subsequent justification was eit her not directly connected to their specifics relating her explanation to th e shape she drew; thus, her response would not be helpful to a 4 th grader. Jackie was also in this group and her response contained vague response were clearly labeled num bers on her shape correctly explaining and showing how to count the edges (linear units). Since the follow up test occurred after all the compared to what she wrote on he . I have no idea if the polygon I drew [a 3 knowledge of area a nd perimeter and linear and square units. possesses a strong mathematics background, but pre intervention explanations often lacked specifics (e.g., meaningful language) necessary for elementary children. Her pretest response to the same question earned a 3, because it was less than thorough and did not include any mention of linear units. On the follow up test she drew the same picture as on the pretest (a 5 7 rectangle), but now it was clearly evident that she saw


283 the outside all the way around the rectangle. Make sure you count the outside edge of the boxes, using linear units, instead of the box es. When you add up the sides, 7 + 7 + 5 + 5, explanatory framework, appeared to be reaching similar levels as her mathematical knowledge. Manifestations of ( i.e., procedural versus conceptual) were often displayed through their solution strategies and subsequent explanations to post and follow up test items. Problem 1 from the posttest ( Figure 35 ) illustrates this facet of the s to their understandings involving units of measure. Procedural versus conceptual CK. Problem 1 was meant to be relatively easy so the PSTs could ease into the posttest and gain some confidence. The primary concepts involved realizing that the wordi recognizing/remembering the area relationship between a triangle and a rectangle half. The expectation was that the PSTs would quickly calculate the area of the rectangle to be 12, or better yet visually recog nize the rectangle comprised a 3 4 array of squares (or square units), and then see the one half relationship (or better yet draw it) to calculate the answer of 24 triangles. While 11 out of 12 PSTs (including all 4 case subjects) got the correct answer, the different methods used, along with the responses given to part (b), revealed various degrees of CK. Five PSTs drew in a 3 4 array of squares inside the rectangle (Larry was the only case subject) and of those only two (no case subjects) showed the o ne half relationship by dividing the 12 squares into 24 triangles and thus arriving at their answer. Such a method typically produced conceptual responses similar


284 1 (a) How many triangles, like the one shown below, will it take to completely cover the rectangle shown? (b) As a teacher might, clearly explain how you arrived at your answer? Figure 3 3 Problem 1 from the posttest. the right answer to problem 1 and also drew an array of squares inside the rectangle (i.e., conceptual groundwork), his explanation does not connect the area of the rectangle with the area of the triangle or emphasize the one half relationship It reveals a limited ability underlying motivation to simply get right answers and tell students how to get right answers, as opposed to developing conceptual understanding. Two other PSTs used the triangle given and drew another triangle on top of it; thus, producing a square and illustrating the one half relationship. From there they provided a conceptual response focusing on the one was a blend of conceptual and procedural ideas. She indicated during the secon d


285 possessed a fragile confidence in her own CK; however, what she wrote next is evid ence saw the triangle s area was and the square of her response, albeit a little vague. But instead of continuing by fil ling in a grid of 24 rectangle is evidence that, at this point in the study, she was still unaware of the vignette from our second interview reveals Jackie possessed more CK than she was able to consistently apply and effectively communicate. I wanted to determine how much conceptual understanding was supporting her procedural knowledge. T: How did you know that the area of the rectangle was 12? J: I did 4 times 3 for the area of the rectangle. T: And why does that produce area, multiplying 4 time s 3? J: this is how I was thinking it [drawing horizontal lines in rectangle]. I was picturing one, two, three [counting] columns, and then [drawing vertical lines in rectang le] one, two, three. This is how I viewed it. I put it in terms of square units [she draws in a grid]. So I guess I could show my students that figured it out. T: So the formula is basically the short cut for summing up all the rows and columns? J: Yeah, for summing up all the rows and columns. And then for the area of the triangle I did 1 times 1 divided by I know that to find the area of a triangle you u se a formula. You go 1 times 1 divided by or and so I just did .5, divided by .5 and you can see I did some division work on the side, with the decimal I just brought it ov er and then I got 24. T: J: felt confident about this test and I thought like that it was a pretty good way, like T: you a little puzzled?


286 J: T: Ok, well, what does this little square represent (pointing to a square inside the grid]? This is one of the twelve, so what could you actually call this? This is one . ? J: One twelfth? T: Oh yes, very good. I was thinking simpler, like one square centimeter, an d the total area is twelve square centimeters. J: Ok. T: What if I drew a diagonal through one of the squares inside the rectangle? J: Ok, OH! Then I could just do that for all of them [laughing, and starts to draw in diagonals inside each square unit]. T: Each one of these shapes [pointing to one of the triangle drawn in] would be? J: Umm (5 second pause) T: Just like the triangle given in the problem, right? J: Right, yeah. T: J: Base times the height divided by two. T: Why do we divide by two? J: T: So, you could have actually just drawn out the rest of the square centimeters. J: T: Yes, you are still J: Oh! T: You did it mathematically procedurally. J: Yeah. T: conceptual way to get the answer. J: Ok. T: J: T: And that would be a good way to verify it for your students, and they could see the twenty four triangles. J: Yeah, and that would be a really good way, especially since I was thinking in my head about the rows and co lumns. T: such a visual person, and you went away from that. I saw that you started to draw something inside the rectangle. Do you remember that? J: Yeah, oh yeah. T: You remember that? That you started to draw something there? J: T: Does that seem mathematically ok in your head? J: Yeah, I love that. Yeah. It took considerable prodding to lead Jackie in to discovering the one half relationship and


287 a more conceptual solution strategy. It appears however that the above conversation was meaningful to Jackie. On the follow up test (6 weeks later), Jackie used an array structure in response to a hypothetical s tudent who calculated 18 for the area of a 3 cm 6 cm rectangle that was given, but who indicated that he did not understand what the 18 represented or meant (Figure 23 p 224 ). square) was used for which measure (perimeter or area). Her apparent growth during the second interview and her resp onse to the above question on the follow up test are a significant improvement from her CK displayed during the first interview. There she was th you, I just know that you multiply the base times the developed and became better organized there was a more stable foundation from which her explanatory framework coul d better support her KoST. procedural approach to solving problem 1. Brianna correctly answered part (a) through straight calculations involving formulas. Her response to part (b), which involved I found the area of the rectangle by multiplying 4 3, which gave me 12. I know


288 that the area of a triangle is base height. Since base and he ight are 1, the area would be . Then I divided the area of the rectangle by the area of the triangle, 12 , which is the same as 2 12 and will give me 24. So I know there are 24 triangles in the rectangle. Procedurally, Brianna gave a clear and precise ex planation, although such explanations fall short in developing conceptual understanding among students. Her lack of any mention of appropriate units is less than acceptable. There is evidence however that Brianna did not conclude the study with a strictly procedural based CK, which would characterize a novice teacher. During her second interview, Brianna and I discussed her that they did not understand or follo w all the mathematics in your explanation. Can you think of a way to help that student visualize and better understand the answer you came several continues to draw a 1 1 sq uare next to the 3 4 rectangle and then divides the square Brianna then went on to begin partitioning up the 3 4 rectangle into 1 1 squares and dividing each square into two triangles while she e x plained the relationship between the area of the rectangle (12) and the number of triangles inside the rectangle (24). as also evident in her work with irregular figures. Her method for and explanation of pro blem 3 on the pretest (Figure 21 p 212 ) was procedural and formula driven. To find the area of a relatively easy irregular figure, she divided it up into squares and rectangles and applied the appropriate formulas. When faced with the same problem on the follow up test, she


partitioned the figure into square units (using dotted lines), a conceptual approach, and counted them to get 8 cm². The follow-up responses, however, still at times blended conceptual ideas with a continuing struggle to make meaning of mathematical procedures. This is illustrated by her response to the student in problem 6 on the follow-up test (Figure 23, p. 224), who was struggling to make sense of what the answer (i.e., the number) for the area of the rectangle meant. She explained that the answer names the number of square units inside the rectangle [square centimeters would have been better] and that when we find area we use square units, adding that she would divide the rectangle up to show him that when we count up the squares inside the rectangle, that count is the area. The reference to that conceptual idea showed how to address a student's mathematical difficulty, and demonstrated her developing KoST. Throughout the study both Brianna and Grace performed relatively well. One somewhat noticeable difference was in their explanations. While Brianna was very mathematical and procedural, Grace more often than not made obvious attempts to conceptually explain her ideas and methods. For example, when solving problem one on the posttest (see Figure 33), even though Grace did not include any drawings, as was a consistent finding in her responses, she provided a very conceptual explanation that highlighted making use of a helpful representation (grid paper in this case): The rectangle contains 12 cm²; in other words, 12 one-centimeter squares will fit in the rectangle (put a cm² grid over the rectangle to illustrate). Then show each cm square can be divided in half to look like the triangle given. So there are 24 triangles, twice as many as the number of squares.
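Stated symbolically, and using the dimensions the PSTs describe in their explanations above (a 3 cm by 4 cm rectangle and a triangle with a base and height of 1 cm), the reasoning behind the answer of 24 is:

```latex
% Area of the rectangle, area of one triangle, and the number of triangles needed.
\[
A_{\text{rect}} = 3 \times 4 = 12\ \text{cm}^2, \qquad
A_{\text{tri}} = \tfrac{1}{2}(1)(1) = \tfrac{1}{2}\ \text{cm}^2, \qquad
\frac{A_{\text{rect}}}{A_{\text{tri}}} = \frac{12}{1/2} = 24\ \text{triangles}.
\]
```

Conceptually, each of the 12 unit squares in a centimeter grid splits into two such triangles, which is the one-half relationship Grace describes.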


The thoroughness, conciseness, and clarity of her response illustrate how her CK became well organized during the study. Knowledge regarding irregular shapes. Finding the area and perimeter of an irregular shape has been shown to pose various difficulties for students and teachers alike (Rutledge, Kloosterman, & Kenney, 2009; Tierney et al., 1990). Question 3 on the pretest (Figure 21, p. 212), and again on the follow-up test, asked the PSTs to find the area and perimeter of an irregular figure and then explain, as they would to a 4th grader, how they arrived at both answers. On the pretest Larry correctly found the area but not the perimeter; he got both correct on the follow-up test. His explanations (which were supposed to be meaningful to a 4th grader), however, reveal a minimal explanatory framework which does nothing to bolster his limited CK. On the pretest his explanations amounted to little more than brief directions for what to do to find the area and the perimeter, and his follow-up test explanations for area and perimeter were similarly terse. On the follow-up test, drawing an array inside the rectangle to help visualize the area involved making inferences about the shape and is a higher level of measurement reasoning than before the intervention (Battista, 2006). So, although Larry showed some progress regarding concepts related to area, his explanations (such as those described above) are still lacking and would be confusing in any classroom setting. Although his answers were now correct, the quality and depth of his explanations revealed an overall shallow CK ill equipped to

291 support a robust and classroom useful KoST. Even before the intervention, Grace had a relatively solid understanding of the major concepts being discussed in this study; however, her explanatory framework (especially regarding units of measure) at times was unorganized and she would struggle trying to clearly communicate her thoughts, as a teacher would need to do. This is pre and follow conceptual a pproach (i.e., he partitioned the irregular figure into square units), but his meager explanations nullified any benefit to that approach. Grace did not pursue a conceptual approach for finding the area in problem 3, either on the pretest or follow up; how ever, she not only solved it correctly both times, but also offered two different solution strategies for finding the area (one involving conservation). Solving a problem in more than one way is a trait of an expert teacher and was one quality of her CK th at explanations (part b of problem 3) increased in detail, organization, and clarity. Creative in problem solving Being able to solve a pr oblem in more than one way is an example of an application of an organized CK and is also a trait of an expert teacher. This category of findings originated after Grace, without prompting, solved question 3 on the pretest in more than one way. Because such problem solving characterizes ex pert teachers, the other three case subjects were given an opportunity,

292 during the first interview, to solve question 3 differently than they did on the pretest. Larry was not able to solve the problem another way. While talking with Larry it became eviden t that he could not intelligently talk about area, perimeter, and units of measure, because Larry was unable to consistently identify what attribute of the figure was being measured (i.e., one or two dimensional). Jackie, after a few exchanges, was able to see that partitioning the figure into square units would produce its area although she 35 seconds to consider the task and after momentarily calculating perimeter, g ot herself Teaching episode 2 ( Figure 15, p. 133 ) provided the next setting for a planned e second part of TE 2, which is relevant to this discussion, involved the PSTs finding a figure out how much area the footprint covers? Can you also describe a second purpose of the second question was to continue in ascertaining whose CK possessed expert tendencies, such as being able to solve a problem in more than one way, which is a trait of an expert teacher. One specific response to the second part of this question offers a humorous side note and a reminder of the importance of clear communication in a second the higher There were four PSTs (including Larry) who similarly indicated that they had no

293 idea how to solve this problem. Ironically, one of the fou r did some creative sketch work (see Figure 34 ) on the copy of the footprint provided and came up with a very good with most unsuccessful responses in this study, there were no sketches at all from the Figure 3 4

294 other three who indicated they did not know how to solve the problem. Up to this point, a solving. Three other PSTs (including Jackie) offered vague methods with no final answer. Jackie was the only one of these three to do any work on the paper footprint. She numbered the eight complete square units inside the footprint and then basically stopped. This brings up two related facts regarding area that caused confusion for many in the class: (a) a figure can contain partial/incomplete square units, and (b) a figure can have a decimal area. Several of those previously mentioned indicated that it was interacting with t he Gizmo microworld that opened their eyes to both of these possibilities. Of the five remaining PSTs, two offered very good methods for approximating the area of the footprint, but they did not actually apply their method and get an approximation. One su ggested cutting out all the square and parts of squares and forming a rectangle and then using the L W formula. Brianna, who was the other, actually offered two solution strategies. One involved adding up the whole and partial squares and the other was to estimate t he height and width and multiply solving abilities to offer two realistic strategies, but she did not apply either. She made no sketches and offered no estimations. Literally, the question only ask s fo r a strategy, but four of the five PSTs who came up with a strategy also continued the progression and arrived at an estimation. Two others recommended an approach similar to Brianna where they approximated the length and the width and multiplied them. The ir approximations were 21.25 and 22.5. Only Grace addressed all the CK components of this TE, and did so in expert fashion. Her two

295 ) the other involved moving partial square inches together to form wholes and then adding them up. Grace said her second method would produce an area of between 18.5 18.75 square inches. That work points out a unique difference between the structure of Bri students and both have performed relatively well throughout the study. The footprint problem involved more creative problem solving than detailed mathematics, and that appeared to be a strength for Grace. Grace was 54 years old when this study was conducted. Her high school geometry course was far in her past. The pre post and follow up tests, because of time constraints, tended to be more mathematical (as opposed to exploratory) which favored Brianna. Abilit y to explain and illustrate units of measure. Problem 4, which appeared on the pre post, and follow up tests, offered a good opportunity to further investigate any changes in knowledge regarding units of measure (and the ability to explain that knowledg e) that occurred from pretest through the follow up test. Problem 4 asked the linear unit and a square unit to a 5 th o n the pre post and follow up tests, and proved to be the most difficult problem for the PSTs. A model response would involve: (a) linking a linear unit to perimeter and a square unit to area, (b) illustrating a discrete linear and square unit, and (c) clearly explaining these concepts without confusing language s problem 4 was statistically the most difficult problem on the posttest, there was only one PST who received an unacceptable score of 1 for her posttest respon

296 diagrams was especially noticeable on this problem and contributed to an overall less than acceptable conveyance of these relatively elusive concepts. Only 4 out of 12 PSTs (Jackie, Larry, and Brianna were three of them) incorporated any diagrams as part of their explanation, and there were only 4 (Jackie was one) who did so on the follow up test. Both Larry and Jackie earned a 1 for her prete st response to this problem, but improved on that dramatically by providing an appropriate diagram for a linear and a square unit (i.e., a shaded square for a square unit and a line segment for a linear unit) on both her posttest and follow up test; howeve r, she did not connect linear units to perimeter or area up test Jackie correctly referred to li earlier only received a score of 2 for her post and follow up test responses. confused regarding li near units and only slightly more knowledgeable regarding square units. During the intervention Larry did however show some growth in his understanding of units of measure. In his response to posttest question 4 he described linear units as wou ld represent a linear unit. For square units, Larry drew a 2 3 array of square units with 4 of the 6 shaded in. He explained how each shaded square represented a square unit. Just like Jackie, Larry neglected to connect linear units to perimeter and squa re units to area. That combined with the initial unclear diagram for linear units earned Larry a 2

297 up test was a retreat classroom use. It should probably be noted that Larry reported not feeling well during the follow up test, which may help to explain his relatively quick completion time. Grace continued to improve on her ability to explain mathematical concepts and on problem 4 on the posttest, she did a good job of differentiating between linear and line segments), and combined with the fact that Grace never included any diagra ms to clarify or strengthen her responses resulted in her not receiving a score higher than 3. Brianna, on the other hand, earned a 4 for her posttest response (the only 4 given for this problem), because her diagrams were mathemat ically correct and pedagogically useful. This is an improvement over her pretest CK regarding units, as diagrams provided during our first interview illustrated she was unclear about the precise nature of a discrete linear and square unit. Her inconsistency with diagrams surfaced on the follow up test as she only received a 3 for this problem, because she forgot to include appropriate diagrams. One detractor from both her posttest and follow up test responses was her choice of incorrect. The mathematical vocabulary (or at least the choice of words) employed within e scoring implications. More

298 Given the relative difficulty of problem 4, one might expect the PSTs to make ev ery effort to thoroughly communicate their ideas. This however was not the case. Using appropriate vocabulary (e.g., saying square cm to describe area when cm are given) was definitely the exception throughout the study for most PSTs. When asked to the 10 PSTs (Brianna being one) who ear common square and 2 dimensio method is incorrect because he is counting the actual squares, not the perimeter outside hey are used (i.e., in finding perimeter and area) as opposed to describing their distinguishing properties, was common among PSTs possessing an incomplete CK about these concepts. In addition to using clear and precise language, integrating diagrams (and other representations) can help improve communication and foster conceptual understanding of mathematical concepts. The lack of PSTs providing diagrams to support and illustrate their explanations was troubling. That behavior contributed to poorly communi cated and diagrams was not at the forefront of importance when explaining. It will be se en how this

299 Utilizing drawings abilit y to explain concepts in meaningful ways (i.e., their explanatory framework). Such explanations involve effective communication. Incorporating suitable drawings is one s evident when they were given opportunities to provide diagrams to support or add precision to a mathematical response or to add necessary context or to clarify when asked sconception. Table 19 reveal s the opportunities (12 PSTs 4 problems) to use drawings on the pretest, 16 (33%) drawings were attempted, but there were only five (10%) that accompanied a meaningful and c orrect response. The rate of drawings provided increased for the posttest. There were 72 reasonable opportunities (12 PSTs 6 problems) to incorporate a drawing, 42 (58%) drawings were provided, and of those, 27 (38%) assisted in achieving a correct respo nse. That is an increase of 28% over the pretest. The follow up test, which contained the exact same questions as the pretest, showed an increased use of drawings over the pretest. Out of the same 48 opportunities, drawings were used 29 times (60%), and 1 9 of those (40%) were successful in facilitating an acceptable response. That is a 30% increase over the pretest rate and a negligible 2% increase over the posttest. A prime example of that fact is how the PSTs dealt with question #4, which appeared on the pre post and follow up tests, and was statistically the most difficult item in the study (mean of 1.58, 2.33, and 2.33 respectively; range 0 to 4). That question

300 Table 19 Use of Drawings Throughout the Study Pretest Items Posttest Items Follow up Items PST 4 ( U ) 5 (R) 6 (U) 8 (R) 1 (U) 4 (U) 6 (R) 8 (R) 9 (U) 10 (R) 4 (U) 5 (R) 6 (U) 8 (R) #1 X X X x X X Grace X * X X #3 X X X X X X X #4 X X X x x x X #5 x X X X #6 X X X X X X X X Jackie X x X X X X X X Brianna X X X X X X X #9 x x X X X #10 x X X X X X X #11 x X X X x X X Larry X x X x Note U = dealt with units, R = dealt with perceived relationships; = suggested a drawing but did not draw it; X = used appropr iate X t response.

301 linear unit and a square unit to a 5 th linear and square units was very difficult for them; however, on the pretest only one of the 12 PSTs attempted drawings, albeit inaccurate, as a means to help visualize and/or explain these difficult concepts. As evidenced in Table 19 the use of drawings increased for question #4 from the pretest lev els, but the occurrence of meani ngful and accurate drawings was very low 1 out of 6 for the posttest and 2 out of 5 for the follow up. It draw a line, as Brianna did on the follow up test. At other times, PSTs would draw things such as a 12 inch ribbon (not to scale) when describing linear units. The discrete nature of the concept of a unit was not consistently evident. An apparent pattern in Table 19 was that certa in PSTs tended to use drawings more consistently than others. For example, following the pretest both Jackie and Brianna began incorporating drawings in their responses on a more regular basis, whereas Grace and Larry did not. The use of drawings was not d irectly connected to performance. Grace was one of the top performers in the study, but barely ever used drawings to communicate her ideas, but PST #6, another top performer, effectively used drawings on the post and follow up tests. Some weaker PSTs incr eased in their successful use of drawings was inconsistent. On the entire pretest Jackie only provided one (rather vague) diagram to help support her explanations. For th e posttest however, Jackie included 19 appropriate diagrams. That awareness of the importance of including representations when explaining mathematical principles and relationships showed a significant increase

302 in her KoST. One possible explanation for the lack of drawings by certain PSTs might be that of necessity. For many of the pre post, and follow up questions, certain higher performing PSTs (e.g., Grace) did not seem to need sketches or diagrams in order to facilitate a successful answer; however, w hen faced with an elusive problem, Grace would use sketches. For example, TE 2 asked the PSTs for their thoughts on finding the area of a footprint traced on square inch grid paper, and while that task was not the primary focus of TE 2, it proved very moti vating and equally challenging. Only 4 out of 12 PSTs were even able to provide any meaningful sketches in an attempt to approximate the area of the footprint, and Grace was one of them. As a matter of fact, she was one of only two who arrived at a very ac curate approximation of between 18.5 and 18.75 square ch was very similar to Figure 34 but hers included a numbering of the full and partial square inches. So for some (e.g., Grace and Brianna), not consistently using diagrams did not appear to be due to a lack of CK. Another example of this arose during the posttest. There were two questions on the posttest (#s 9 and 10) in which drawings were expected and yet Brianna did not provide any. During her second interview, she was asked abou t her lack of drawings. Although Brianna did not provide a reason for not including drawings, whenever one was requested she rather easily provided useful and meaningful drawings. As was true on the pretest, there were times on the post and follow up test prepared to construct a meaningful drawing. That was the case with question 4. Other times the PSTs were car e less and drew rectangles that were not to scale and thus did not faci litate a correct response. Although an increase in the use of diagrams was noticeable for many PSTs, there were numerous missed opportunities, which in reality, translate into a lack of

303 realization of the importance of drawings in communicating and clarifying mathematical concepts. Both the increased usage and the missed opportunities reveal varying degrees plays into their ability to successfully respond to student shortcomings and/or misconceptions (a facet of their KoST). The findings in these next several sections primarily address research question 4, f thinking the diagnosing aspect, and (b) appropriately addressing student difficulties and misconceptions the intervention. Focused on solving or diagnosing & re sponding emergent CK & KoST The emergent findings presented in the next rather detailed section continues to examine the nderstanding s regarding units of measure (i.e., their CK), but now the focus will be on how they indicated they would respon d to student difficulties and misconceptions, specifically regarding units of measure (i.e., their KoST) KoST are manifestations of the organization of their CK. An expert KoST would enable a and then respond appropriately to difficulties by These findings came primarily from the three teaching episodes (TEs), and include a discussion on the impact of th will be shown how several PSTs had a misguided focus which lead them to work on secondary aspects of certain TEs, while not giving enough attention to diagnosing the

304 tely responding to that student. Certain emergent intervention findings to compare to; however, such findings still contribute to answering research questions 3 and 4 as they illumin depth look at emergent findings related to KoST will begin by revisiting TE 1. Teaching episode 1 (Figure 14, p. 130 ) involved a student (Justin) using square units in an attempt to devise an alternative method to find th e perimeter of an irregular eachers indicated they would respond to the student and his/her did not remain dormant during the teaching episodes. Her responses to questions 6 and 7 with the Shape Builder mic response to writing prompt 5 was teacher centered and focused on telling Justin how to get the correct answer. Whi strengthened her CK regarding appropriate units for perimeter, her mathematical explanatory framework, another facet of CK.

305 method and his thinking involved various responses with common themes, which helped to paint a picture of their current KoST. Generally their responses involved: (a) praising hi m for realizing perimeter was around him to explain his method, (d) teacher explaining what perimeter is, or (e) systematically walking Justin through his method and pointing out that it would not arrive at the right answer. him that he is doing a good job i n trying to make sense of it visually, but he needs to response (and those like his) falls short because instead of addressing the fundamental misconception surrounding modified, besides the fact that the modification was mathematically incorrect. Unlike Larry, Grace indicated she would respond to Justin through a teacher centered approach dimensional linear borders the not include a discussion of units with her explanation; however, after interacting with the Shape Builder MW, Grace amended her previous response to include diagrams and meaningful ly directed questions to help Justin conceptualize and clarify the differen ces between perimeter and area. Several other PSTs were very creative in offering alternative illustrations to help Justin better understand perimeter (e.g., fences, pieces of string), but only two PSTs (one

306 being Brianna) actually discussed the most li kely cau his confusion with linear and square units and why one measures perimeter and the other method (i.e., perimeter is the measure in his method, showing (with diagrams) the differences between linear and square units and why linear units should be used, and concluded by having Justin rework the problem to see if he understood. Brian na was able to apply her CK and customize her response to while promoting conceptual understanding, earned Brianna expert codes for her KoST. Near the end of th e individual work for TE 1, after the PSTs had opportunities to investigate the problem with the Shape Builder MW and reflect on their previous apparent confusion, how would you follow up with the entire class about the concepts with teacher centered suggestions; however this time many said they would incorporate the microworld into their microworld onto the screen and explain with a laser pointer how to come up with the ed several more incremental steps and tried to place a stronger emphasis on student understanding; however, because it lacked a thorough discussion of linear and square units it too digressed into a show and tell approach to finding the correct answer. Tea cher lead discussions emphasizing how to get the correct answer dominated these responses. Of the remaining three PSTs, one (#3) presented a

307 very clever use of the Shape Builder MW to help the students better understand why gain the focus was on finding the solution. Grace offered vague ideas involving discovery type activities for the students to do on the MW, but did not indicate she would summarize the concepts of linear and square units. Only Brianna used the MW and its f eatures to guide the students in discovering for themselves more evidence that Brianna was slowly moving away from purely procedurally based approaches to where she was applying her CK in ways that bols tered her KoST. Although a more thorough discussion regarding TE 2 will be presented in later PSTs can result in poor diagnosing of student misconceptions and missed opportunities to address those difficulties. Teaching episode 2 (Figure 15, p. 133 ) involved a situation in which a 5 th grade class is studying area, and they are challenged to find t he area of one of their footprints. Their teacher instructs them to stand on a piece of paper and trace their shoe, and then individually brainstorm a strategy to find the area of the footprint. After several minutes one of the students, Tommy, comes up an d explains his method. He says he would lay a piece of string around the outside of the paper footprint, cut the string to the precise length, form the piece of string into a rectangle, use a ruler to measure the length and width of the rectangle, then fin d the area of the rectangle. In other words, he believes that the area of the rectangle will be the same as the area of his footprint. TE 2 required the PSTs to grapple with two relatively difficult concepts. One was the misconception that a fixed perimete r (i.e., the piece of string) can have only one area (i.e.,

308 the desired area of the footprint). The second involved a correct method to find/estimate the area of a footprint (an irregular shape). Each PST was provided with two copies of a footprint drawn o n 1 inch grid paper as well as blank pieces of the 1 inch grid paper. Findings showed that the PSTs who struggled most throughout TE 2 were also the ones who excessively focused on trying to find the area of the footprint (i.e., what they thought involved ), and as a result paid too little attention to dissecting As was the case with Jackie, it appeared that several PSTs had difficulty in ould be verified or disproved. She problem has stumped me w to solve this problem, but I think would be needed to find the area of a footprint, she did not attempt any sketches and did not draw anything on the footprint copies incorrect generalization but still struggled in responding clearly and succinctly to way through day 1 that using his method with the string) how to find the area of the footprint, which he never did. That was certainly not the case since Brianna, and three other PSTs, were able to correctly diagnose the inconsistencies

309 footprint. During the first day of TE 2 ano ther PST, call her Stephanie, had apparently that two objects have the same perimeter does not automatically mean that they will have ocus of her writings turned to finding the area of the footprint. At some time during the TE that same PST produced the sketch in Figure 36, which is a very close estimate to the area of the footprint; however, five different times while completing the rem Overall, a preoccupation with f inding what the PSTs judged as t hinking, but it also limited their meaningful interaction with the Shape Builder MW, which incidentally could have been used to build a very close replica of the footprint and lly helped me with this problem [TE 2]. At this point, I am still confused on what the right way thinking, this PST was focused on determining the answer for hersel f. This is an example where the PSTs were over engaged in their role as a learner (i.e., problem solver) to the neglect of their role as a teacher (i.e., to diagnose and instruct). l continue by examining their use of and recommendations regarding the MWs integrated into this study. Such findings contribute to answering research questions 3 and 4 as they p roclaim and KoST, because the MWs are an effective

310 student or facilitate a meaningful whole class discussion. Each TE presented a classroom based scenario focused on a documented misconception regarding area and perimeter. Each began with questions related to CK, and then would transition into KoST. Interacting with the MWs came at different times during the TEs, and was accompanied with progression proved valuable to several PSTs in each of the teaching episodes. The two MWs utilized in this study possessed specia lly designed features that would allow for and indicate various forms of learning occurred while PSTs interacted with the MWs. The first teaching episode (see Figure 14, p. 130 ) focused on misconceptions involving area and perimeter and linear and square units. For this teaching episode, the students were only given access to the Shape Builder MW, as its features matched well the concepts related to the focus problem. A unique aspect of this MW is its presentation of area and perimeter as well as linear and square units simultaneously. This feature di d them Brianna a case subject) commented on this potential confusion and offered a pedagogically sound recommendation. They both thought it would be helpful if Shape Buil der had a feature that could be turned on and off and would darken the outside edges (i.e., linear units) of any shape on the grid, hence making the perimeter stand out from the

311 TE 1 was was wrong and they also were able to find the correct perimeter of the figure. This should have allowed for the PSTs to more freely explore with Shape Build er as well as to better solving the problem. The hope was that the PSTs would recognize that the primary confusion of Justin was that he used square units to calculate p erimeter; thus, the misconception centered on units of measure. The PST produced various learning paths and outcomes. Table 20 thinking. While case subj ects were the focus for Table 20 because their responses could Table 20 Findings Related to Microworld Usage & Be nefits Note *Based on written responses found in TEs. For TE 1 and TE 2 the MWs were not available until after the PSTs had already worked on the problem. Grace Jackie Brianna Larry TE 1 TE 2 TE 3 TE 1 TE 2 TE 3 TE 1 TE 2 TE 3 TE 1 TE 2 TE 3 Used mostly to confirm answers* X X Used also for exploration* X X X X X X X Saw value for personal learning* X X X X X X X Saw value for instruction* X X X X X X X X X X Facilit ated a more thorough CK* X X X X X X Facilitated a more thorough KoST* X X X X X X

312 be corroborated during the second interview, their positions were representative of various subsets of the PSTs. In part, the range of reactions is illustrated by the responses g with the microworld help you better understand the ideas surrounding d in scope and depth); whereas, a stronger performing student seemed to realize the intended purpose of why Justin shaded in the squares and counted them to find the perimet er. As I drew the figure in the microworld, I was beginning to think I was thin quotation teacher cannot help a struggling student until they can understand what they are thinking. The way in which the PSTs indicated they would address the entire class as a point in the study (i.e., pretest score and teaching episode c levels of KoST. About half the PSTs indicated they would use Shape Builder and project point out what is wrong with the method and what the right answer is. This tendency of teaching in order to enable students to get right answers, in contrast to focusing on conceptual understanding, is a trait of a novice teacher. Contrast that with the instruction suggested by several other PSTs. For these the focus was on identifying and distinguishing between linear and square units and how this would enable the students to

313 Two particular responses (one of them being Grace) tend to substantiate that interacting with the microworld help ed to stimulate creative and conceptual instruction strategies. The first involved using a feature of Shape Builder to help drive home a fundam ental difference between area and perimeter. checked. The square would look similar to Figure 9 (p. 122) number changes but the perimeter number does no t; thus, illuminating the concept for them that the area is the inside of a shape and comprises square units while the perimeter is represented by the outside boundary of a shape. A second PST suggested an instructional strategy that was straightforward an d illuminating. The recommendation an understanding of linear versus a square units. Grace recommended creating a 1 2 rectangle in Shape Builder (i.e., a rectangle mad e up of two squares); hence, there would and then it would be plain to see the shape had an area of two square units and a perimeter of six outside edges (i.e., line ar units). The fact that these two PSTs ventured away from simply creating the figures presented in the focus problem and came up with two totally different instructional strategies reveals the flexibility and subtle power of a microworld. The intended ins truction would not only help classroom students see the

314 fundamental concepts of area and perimeter (i.e., linear and square units). The relative difficulty of teachin g episode 2 ( Figure 15, p. 133 ) resulted in more extensive investigating with both of the microworlds available in this study (Shape Builder and Gizmo) as well as some meaningful learning outcomes. One PST commented feature of Shape Builder helped her realize area. I think the string distracted me from realizing sooner that perimeter does not d alre ady found several counter examples to p .120 ) when line segment). Although that is somewhat of an extreme counter example (and not a 2 only one area), it does show the facilitative nature of a well constructed microworld to stimulate growth in CK. Along these line s, several PSTs went to great lengths to list many rectangles (including ones with decimal dimensio ns) that had a perimeter of 18, but having different areas thus however wrote about how the Gizmo MW could b e a jumping point for a discussion that there are actually an infinite number of rectangles that have a perimeter PSTs t necessarily work, hence for TE 2 there were limited findings on the microworlds facilitating content learning or informing misconception specifically wrote about when their epiphany occurred. Of those, only two indicated the MWs were instrum

315 the string helped them the most. interactions with the MWs will conclude with an interesting finding related to their opinions concerning learning with, versus teaching with, the MWs. For TE 1 and TE 2, the MWs were not introduced into the session until half way through Day 1. For TE 3 ( Figure 16 ) the PSTs were instructed that they could access either mi croworld right from the outset. For the first TE (the easiest of the three) the vast majority of the PSTs (11 out of 12) indicated they found the microworld helpful to their understanding of the problem t they would use the microworld as an instructional tool in a whole majority (10 out of 12) indicated they believed classroom students would benefit from personally interacting with the MW in a structur ed context. However, an unexpected trend developed as the mathematical content of the teaching episodes got progressively Although the number of PSTs who indicated they learned with and/or saw benefits of personally interacting with the microworlds was a strong majority (8 for TE #2 and 11 for TE #3), fewer (five from TE #2 and six from TE #3) said they would incorporate the microworlds when instructing future students a bout the concepts presented in the TEs, even though the same PSTs admitted those future students would most likely possess similar misconceptions as the hypothetical students presented in the teaching episodes. These beliefs indicate an incomplete applicat The PSTs in TE 3 who did suggest incorporating MWs did so in very teacher centered

316 ply verify answers and/or display visual representations, and some even viewed the technology as a beneficial, but I do not believe they should take away from classroom instructi and analyzing and came to the conclusion she (Jasmine) is very mistaken and should be eracting with the however, neither of these PSTs recommended that students spend a ny time interacting with the microworlds as part of their instructional strategies. A similar contradiction appeared when only two PSTs from TE #2 and three from TE #3 (of the eight and 11 respectively who indicated they learned from the microworlds) wrote that they would allow time for the students to personally use the microworlds to explore the concepts surrounding the teaching episodes. Apparently, the majority of PSTs felt the microworlds were a valuable learning tool for themselves but not for their f uture students. There seems to be evidence that indicates that the low occurrence of suggested MW usage from the TEs was not due in entirety to the newness of the technology. Table 21 shows that of the questions whose design and content could have easily facilitated discussions involving the use of a MW, only a couple elicited such responses from the PSTs. Even questions 8 and 9 from the follow up test, which formed the basis for TEs 3 and 1 respectively (where MWs were used extensively), received very fe w references to using MWs to help instruct a struggling student. Apparently, it takes time

317 Table 21 Instructional Recommendation s for Microworlds Posttest Items Follow up Items PST 6 (U) 7 (U) 8 (R) 10 (R) 5 (R) 8 (R) 9 (U) #1 X X Grace A A #3 A A #4 A A A #5 X D X #6 A A A X D Jackie A A A Brianna A A #9 A #10 A A A #11 X A Larry A A A Note U = dealt with units, R = dealt with perceived relationships; D = written response included pictures that looked like images from a microworld (MW); X = recommended using a MW w/o being prompted; A = recommended using a MW in response to a writing prompt at the end of the posttest. and many experiences for a microworld to thinking. Once the personal integration of microworld thinking has begun to take root, then a vision for its integration into instruction can begin to take form. Realizing t he importance of units in explanations Results from question 9 on the posttest (Figure 37) help ed in describing the change in PSTs KoST as it relates to units of measure. Question 9 addresses similar concepts as question 6 from the pretest, and the ere compared f or signs of growth. Both questions present a problem centered on a figure and a scenario in which a discussion of units, by the PST, would be needed to clarify the difficulties of the hypothetical student. Question 6 presented a

student who correctly found the area of a rectangle (i.e., 18) but was confused about what the number 18 actually represented (i.e., the number of square units), while question 9 involved a student who calculated the area of a 3 cm × 7 cm rectangle to be 20 square cm. To be successful with question 9, the PSTs needed to do two things. First, realize that 20 is actually the perimeter of the rectangle, not its area; therefore, the student is apparently confusing area with perimeter. Second, an appropriate intervention would involve combinations of the following: (a) asking how the student arrived at their answer of 20 so that an appropriate follow-up could ensue, (b) constructing a 3 × 7 array within the rectangle to visualize the 21 square units that compose the area, (c) reviewing what is involved with finding area and perimeter, and (d) having the student then compute both the perimeter and area to compare.

9. A student calculates the area of the rectangle shown to be 20 square cm. (a) Is the student correct? If not, what is the correct answer? How did you figure your answer? (b) What do you think the student was thinking to arrive at their answer? (c) As a teacher, what specifically would you say or do to help clear up any possible confusions the student might have?

Figure 35. Question 9 from the posttest (the rectangle is shown with 1 cm tick marks along its sides).
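As a quick check of the numbers behind Figure 35 (the rectangle measures 3 cm by 7 cm):

Perimeter: 2 × (3 cm + 7 cm) = 20 cm
Area: 3 cm × 7 cm = 21 cm²

The student's answer of 20 matches the perimeter, not the area, which is why an accurate diagnosis centers on the confusion of area with perimeter rather than on a simple arithmetic slip.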

The scores on this problem indicated it was the 5th hardest problem on the posttest. There were five 2s, five 3s, and only two 4s. The primary cause for the lower scores was a wrong focus, which then led to an incomplete and, subsequently, ineffective intervention. Take for example Brianna, who only scored a 2 on this question. She focused on the belief that the student simply used the wrong formula, and her intervention centered on explaining that to find the area of a rectangle we must use the formula length × width; there was no discussion of linear and square units. Another aspect lacking from the weaker responses was the inclusion of a diagram to aid in a conceptual explanation. The 3 × 7 array, which represents the area, could have been drawn inside the rectangle, yet Brianna did not include that in her initial response. Larry actually did draw in the 3 × 7 array, but after that basically said the student got confused and did perimeter instead of area. During his interview I asked him about why he drew the grid of squares inside the rectangle. To my surprise, it appeared that approach was not seen as valuable to the struggling student. Jackie also

320 constructed the 3 7 array and according to our interview used that method to find the student used the word unit to calculate area, but did not include that dialogue or any continued to become better organized to assist in her diagnosing of student difficulties and misconception s but her further intervention. response, there was one more comment made by three PSTs that bears mentioning. Three different times it was brought up that a PST felt the was one of the three, so I aske d her during her interview if she saw any value in the apparent confusion caused by the tick marks on the rectangle. After a 15 second pause, conceive that the tick marks would most likely produce the perturbation that should have conclude with a brief answer. She used a formula (3 cm 7 cm = 21 sq cm) to calculate the area, and followed Her intervention incl uded three of the four recommendations listed earlier. She did include that sh e would ask the student how he/she came up with the answer only two PSTs did. As has been seen in other responses by Grace involving units, she did not draw in the array to ill ustrate the square units; however, she indicated that she would do just

321 reasoning is that of making inferences about numerical measurements of objects (e.g., as if the array has fallen into the background and is considered already complete). It cannot be said for certain that this applies to Grace, but it would help to explain why she has continually used arrays in her discussions while seldom including drawings of t hem. I will conclude intervention knowledge the case subjects) to question 9 on the follow up test (Figure 36 ) while at the same time (a) Is Jo for the perimeter of Fig. 1, and if necessary, state what is the correct answer? (b) Explain why or why not. What specifically would you say and do? Figure 3 6 Question 9 from the follow up test method is to shade the squares along the outside of the shape, as shown in Figure 2, and then to count those squares.

322 comparing them to the other two instances in which they faced the same proble m (in the pretest and TE 1). The findings for question 9 will focus on the case subjects, since their second interview, which was structured to be a learning experience, occurred after the posttest and a month before the follow up test. Since question 9 sp ans the timeline of the study, the findings surrounding it are a good representati on knowledge regarding units of measure. Question 9 is one of only 2 test questions that appeared on both the pretest and the follow up test, as well as being features in a TE (i.e., before, during, and after the intervention). The other one is #8, which will be discussed in The knowledge necessary to formulate methods to solve problems in mathematics dra 0 4) of q uestion 9 on the follow up test reveals some change in both CK and KoST. When this same problem was asked on the pretest the scores indicated that it was the second hardest item on the test ( M = 1.92; SD = 0 .9). The only scores above a 2 were one 3 and a 4 received by Brianna. On the follow up test the mean climbed to 3.17 ( SD = 0 .84), which was the second highest mean on the test. Although it might be expected that most PSTs simply on repeated exposure to the problem, there was marked improvement in how several indicated they would respond to the student and his confusion (KoST). Jackie is a pri me example of this. On the pretest, Jackie barely earned a 2 by providing the

323 student involved only clarifying the differences between area and perimeter no mention of names were changed from the pretest to the follow up test) method and how she would clarify area and perimeter for him. She had no clear idea of why Justin might come up with such a method, and her clarification of how to find perimeter digressed into an explanation involving point counting (instead of linear units). A fragile and unorganized CK left Jackie with no foundation from which to respond effectively to the studen misconception. When this question surfaced again as the focus problem for TE 1, Jackie cabulary had units (her CK) was still sparse and disconnected. The intervention contained in TE 1 (exploring with the microworld, small group sharing, whole class d however, her response to Justin and his thinking was primarily focused on helping Justin but c ount the sides of the boxes around the shape, which is a common trait among novice reveals a KoST that was still unprepared to address student shortcomings in meaningful ways. She made this comment before the small group sharing and whole class discussion which Jackie indicated she enjoyed and learned much from. During our second interview I asked Jackie about her choice of words in the preceding quotation and at this poin t she

324 while answering question 9 on the follow up test (the same question). S he also added that These responses earned Jackie a 4 (model response) on this que stion the only one she received throughout the study. Larry and Jackie entered the study with similar weaknesses in their CK and KoST regarding units of measure. While Jackie made marked improvements in both knowledge types, Larry appeared to make littl method as incorrect all three times, but his explanation for why it was wrong and his recommended intervention are representative of why Larry ranked in the bottom third in every statistical measure in this study. His focus started out, and remained on, getting the right answer to the neglect of developing understanding. On the pretest Larry explained corners twice intervention CK was limited in scope and his KoST was narrow in focus. He showed some growth during th e incorrect because he is counting the outside square, not the perimeter outside the sh

325 difficulties. We addressed this question in our second interview, and even with prompting Larry would not thoughtfully discuss what precisely Justin might be confusing and what as a teacher he should do as a teacher. After not getting a meaningful response I would refocus the discussion and offer Larry meaningful suggestions. I was troubled when up exam to this same question included nothing from our interview. Larry had even gone back to his pretest explanation for why Justi was wrong and his intervention strictly focused on trying to help Justin make his method work. Larry was able to correctly calculate the perimeter of the irregular shape on the pretest, in TE 1, and on the follow up test. Overall though, his und erstanding regarding the concepts surrounding units of measure (his CK) was both sparse and disconnected, which resulted in a lack of awareness and appreciation of what would constitute an effective intervention for a struggling student (his KoST). Brianna and Grace entered the study with somewhat similar levels of CK and KoST. Brianna possessed stronger mathematics than Grace, but Grace was prone to be more conceptual in her approaches than Brianna who opted for procedural. On the pretest however, their performances on question 9 were not similar. Grace only scored a 2. She dimensional instead of 1 dimensional units), but then somewhat contradicted herself by indi

326 confused when she could not reconcile her pretest response. She did seem to know wh at method worked in this instance. When she faced the problem again during TE 1 she again indicated that Justin xplained why 20 linear units. It is not known how Grace resolved her pretest difficulties with this problem, other than that she indicated several times during the st udy how she would spend time outside of class thinking about certain problems that had given her definit perimeter of a shape is a 1 dimensional linear measurement and that Justin should be still w as lacking in thoroughness. Grace did not specifically mention linear and square units or work in the concept of area in case Justin might be confusing those concepts as well. There is a chance that Justin could have been recently finding the area of shape s drawn on grid paper and was blending his ideas together. Realizing the benefit of and providing appropriate diagrams illustrating linear and square units would also have illustrated a more complete KoST. Overlooking the value of providing diagrams as par t responses to question 9 on the follow up test added no new information. The results

327 the only PST to earn a 4 on question 9 on both the pretest and the follow up test. Qualitatively speaking, her responses to this question actually improved. Based on criteria established for the rubric scoring, she received a 4 on the pretest; however, her responses were not entirely thorough or complete. For example, when explaining why he focused on why it does not work rather than pointing out that he used square units for a linear measurement (perimeter). That represented a weak explanatory framework for her CK. Her response to the part addressing KoST made it clear however that her CK was organized and enabled her explain to prescribe an appropriate response involving linear and square units and a nice explanation/definition of perimeter. Integrating area and some diagrams would have made for a model response. B rianna improved on her pretest response by including useful diagrams in her responses for TE 1. This was a positive change for her KoST. Her CK was equally substantial and interconnected throughout TE 1, and her model score of 4 on the follow up test reve a led she had retained her knowledge about units of measure. Throughout the study, a proper treatment of units was critical to forming a proper foundation to discuss other pertinent concepts related to area and perimeter (e.g., perceived relationships). Nea r the end of the study (e.g., post and follow up test), not including the appropriate units with responses was the primary reason more model responses of 4 were not assigned. It seems unlikely for teachers to build within students a conceptual understandi ng of area and perimeter without being able to coherently discuss linear and square units.

328 Knowledge Regarding Perceived Relationships The exhaustive reporting regarding units of measure was necessary given their fundamental and unifying properties. Thi s next major section deals with perceived relationships between area and perimeter and addresses a more self contained class of difficulties and misconceptions. There are primarily two relationships between area and perimeter that students and PSTs (and ev en teachers) are reported to mistakenly suppose as true. The first provides the setting for TE 2 and involves the belief that a fixed perimeter can have only one area (and vice versa). The second, and slightly more elusive, misconception forms the basis fo r TE 3. It involves the belief that there exists a direct relationship between perimeter and area, that is, as the perimeter of a shape increase s its area must also increase (and vice versa). This misconception can also be stated as, if the perimeter of a shape decreases, its area will always decrease. The next several sections relationships. Emergent CK of the fixed relationship misconception The next section will continue an regarding perceived relationships and how that knowledge changed as a result of the intervention of TE 2 and to a lesser degree the second interview, which only pertains to the case relationship misconceptions will be investigated through the findings extracted from TE 2, the post and follow up tests, and the second interview. There was no formal instructor lead introduction t o TE 2 (Figure 15, p. 133 ), other than to make sure the PSTs understood the elements of the scenario presented and

to motivate them with the many benefits of the upcoming classroom scenario. They were also reminded to, when appropriate, include in their responses the very same things a teacher might put on a chalkboard while teaching about these ideas. Teaching episode 2 required the PSTs to grapple with two relatively difficult concepts. First, and primarily, was the misconception that a fixed perimeter can have only one area. This misconception was embedded in Tommy's method for finding the area of his footprint, which involved taking a piece of string, measuring around the footprint he had traced on grid paper, precisely cutting the piece of string, and then forming the string into a rectangle and computing the area of the rectangle as the area for his footprint. The second involved a correct method to find/estimate the area of a footprint (an irregular shape). Contemplating and then discussing the mathematics behind Tommy's method proved challenging for many of the PSTs. There were three primary concepts at work within TE 2: (a) the string represents the perimeter of the footprint, (b) the string could be formed into many rectangles (or other shapes), each with a different area (although it is possible he could form a rectangle whose area was a good approximation of the footprint's area), and (c) the area of the footprint must be approximated (which includes the ideas of an irregular shape, partial square units, and a decimal area measure). A model response would successfully address all three. It initially appeared that all 12 PSTs correctly surmised that the 18 inch string represented a perimeter measure; however, a response by Jackie later in the TE (which will be shared) casts doubt on that conclusion. The belief among some PSTs that Tommy's method would produce the correct area
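To make the fixed-perimeter misconception concrete, here are a few of the rectangles an 18 inch loop of string can form (the specific dimensions are offered only as an illustration and are not taken from any PST's written work):

1 in × 8 in: perimeter 18 in, area 8 in²
2 in × 7 in: perimeter 18 in, area 14 in²
3 in × 6 in: perimeter 18 in, area 18 in²
4 in × 5 in: perimeter 18 in, area 20 in²
4.5 in × 4.5 in: perimeter 18 in, area 20.25 in²

Every rectangle uses the same 18 inches of string, yet the areas differ, so forming the string into a rectangle cannot, by itself, recover the area of the footprint.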

330 reveals a limited CK of irregular shapes. Two were leaning towards no, but their justifications were eit her unclear or faulty. Of these four, one eventually realized, through The thoroughnes s and insight of their explanations reveal ed varying levels of failed to provide meaningful explanations and/or diagrams as evidence. Jackie was one, tring can be used to make many different shapes that will have TEs TE 2 was the only one in which Jackie correctly intervention proce ss that she was able to correctly and clearly communicate the reasons this instance, Jackie seemed to possess the CK necessary to successfully diagnose thinking, and she was able to provide a reasonable justification; method. Instead of just building on and possibly clarifying what she said earlier about he [Tommy] assumed with that length of perimeter [i.e., the string], he would get the

331 same area no matter what major aspects of TE 2, but as was often the case, her explanations were initially confusing and would not be meaningful to certain facts and concepts related to relationships between area and perimeter, specifically the fixed relationship misconception, had increased, however her ability to clearly explain her knowledge had not develo ped to the same extent. As was the case with TE 1, Brianna did not appear to struggle with diagnosing the explain the mathematical mistakes the students had made. Rega to understand that not all shapes with the same perimeter will Although Brianna also provided four, properly scaled dia grams showing how a perimeter could if Up to this point in the intervention, Brianna h ad shown a tendency to view these classroom scenarios involving student thinking as something that must be always right or always wrong. That aspect of her CK was still limited in scope. She was hardly alone. There were five PSTs who acknowledged that Tom could produce a correct answer. The i not necessarily [emphasis added]. The perimeter of a shape is related to the area, but the total

332 g classroom and her lack of diagrams minimized its overall effectiveness. Overall, as was the case with TE 1, Grace strove to make her explanations thorough and appeared to focus on helping the student understand the concepts being discussed. Larry seemed to grasp the mathematical concepts intertwined in TE 2, as evidenced in his 2 rectangle and a 5 t was organized and included examples. These explanations represent a relative higher level of understanding, and for Larry that was significant. Although he eventually confus ing, and even mathematically incorrect at times; however, in TE 2 Larry showed signs of beginning to organize his CK in ways that produced coherent explanations. As was just described, research question 3 was addressed in part by presenting evidence of g rowt h in various aspects of the PST s CK from TE 1 to TE 2; however, not all the subtleties of TE 2 were addressed. No PST was able to suggest which rectangle, with a perimeter of 18, would most closely represent the area of the footprint. A correct answer would be a 3 6 rectangle. To be able to do that, they would need to be able to decipher a way to approximate the area of the footprint the last CK question for TE 2. Before presenting findings from the posttest, to help portray post intervention kno wledge, specifics related to the instructor lead, whole class discussions from TE 2 will be shared. This is done to add context for future evaluations of findings regarding

relationship misconception. At the conclusion of TE 2, a whole class discussion was held to provide PSTs the opportunity to share learning experiences and other personal reflections regarding the TE. The instructor/researcher facilitated the discussion and had prepared material to present and spark class discussion. The purpose of these summaries was to clarify the major misconception(s) presented and to frame each one as something that can be explored and tested. During our discussion, it was brought out that the Gizmo microworld allowed a couple of PSTs to realize that there were actually an infinite number of rectangles possible that could have a perimeter of 18. The dimensions would be decimal numbers, and this was quite eye opening for most of the PSTs. Post-intervention CK of the fixed relationship misconception. Problem 10 on the pretest addressed the misconception that a specific perimeter can have only one area. That problem had the highest mean score (M = 2.75, SD = 0.6) for the test; however, as discussed while answering an earlier research question, the PSTs' knowledge surrounding that misconception was incomplete. To recap: (a) only three PSTs (no case subjects) perceived that students would tend to believe the misconception that equal perimeter implies equal area, (b) during interviews Larry and Jackie changed their initial pretest answer by indicating that squares were not rectangles, (c) Grace was unsure but leaned towards the idea that squares are rectangles, and (d) Brianna was confident in the fact that a square was also a rectangle.
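For reference, the infinite-rectangles point raised in the class discussion above can be restated with the semi-perimeter idea that reappears below (the decimal examples are illustrative and are not drawn from the PSTs' lists):

If the perimeter is 18, then length + width = 18 ÷ 2 = 9,
so any width w with 0 < w < 9 can be paired with a length of 9 − w.
For example, 2.5 × 6.5, 3.1 × 5.9, and 4.25 × 4.75 all have a perimeter of 18,
with areas of 16.25, 18.29, and 20.1875 square units respectively.

Because w can be any value between 0 and 9, infinitely many rectangles share the perimeter of 18; only the whole-number cases form a finite list.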

334 Problem 5 on the posttest parallels the concepts contained in question 10 from the cm; (a) What might its area be? (b) Explain how you arrived at your answer, and ( c) Are surrounding this problem: (a) the misconception that there was only one possible area, (b) a 4 4 square is one of the possible rectangles, (c) there are actuall y an infinite number of rectangles with a perimeter of 16 cm, and to a lesser degree, (d) using the semi perimeter to assist in more quickly finding possible rectangles. Question 5 had the highest mean on the posttest ( M = 3.25; SD = 0 .6). There was only one score below a 3 on this question (PST #5), and it appeared to be due to the fact that she interpreted that question as looking for a rectangle whose perimeter and area were 16. Nine of the 12 PSTs included a 4 4 shape in their list of possible r ectangles sure that they included the 4 4 because they knew it was a recta ngle. Larry was the only case subject not to include a 4 4 shape in his list of possible rectangles; however, the fact that he included three rectangles seems to indicate he gained an understanding of the fixed relationship misconception. During our inte rview, it was obvious that his CK regarding the hierarchical nature of quadrilaterals was not organized enough for him to accommodate a square as a rectangle. After walking him through the classification process, it was still unclear if Larry grasped the h ierarchical nature of this classification: T: Does a square satisfy the properties of a rectangle? L: Yes, so a rectangle is a square. T: Are you sure? L:

PAGE 351

335 Jackie used a tr ia l and error approach i n finding different rectangles with a perimeter of 16, as did nine other PSTs. Her success in generating two (a 4 4 and a 3 5) indicated she did not hold to the fixed relationship misconception. A downside to her response was that neither of her rectangles was scaled appropriately. During our interview, she seemed confident that the 4 4 shape belonged as a possible rectangle ; however, she had considerable difficulty comprehending how the semi perimeter could hort rectangles with a perimeter of 16 (i.e., find two numbers whose sum was eight). Jackie often needed repeated exposure to concepts before she could assimilate them into her current CK. Her realization that a square can be included in a list of rectangles illustrates positive change from her pre intervention knowledge. There were three PSTs (Grace, Brianna, and #10) who successfully deduce d that there were an infinite number of rectangles (including the square) with a perimeter of 16 to recall prior class discussions and microworld experiences and incorp orate that knowledge into their explanatory framework evidence of a maturing CK. They both included squares in th eir list of possible rectangles, thus acknowledging the hierarchical relationship between squares and rectangles. Brianna used a semi perimet er method to find possible rectangles and listed all the whole number possibilities (i.e., 1 7, 2 6, 3 5, and 4 4). She also provided appropriately scaled rectangles as well. While number s that add up to 8 from her rectangles (i.e., cm 2 method involved starting with a width of 1 cm, then fou nd the necessary length (i.e., 7),

PAGE 352

336 and she continued this process up to the 4 4 square. She wrote that many other would have been a model response if Grace had explain ed why her method worked (i.e., she was employing the semi perimeter), and even more importantly if she had included useful pictures of her rectangles. The lack of incorporating diagrams into her explanations is a significant shortcoming in her CK. The con tinual absence of appropriate, supportive diagrams was an indicator that these PSTs did not truly comprehend what is typically involved in providing conceptual explanations that are meaningful to students. Emergent CK of the direct relationship misconcep tion The section that follows regarding the fixed relationship misconception a slightly more elusive misconception than contained within either TE 1 or TE 2. These findings were extracted from TE 3, the post and follow up tests, and the second interview. The gist of this misconception is that there exists a direct relationship between perimeter and area, that is, as the perimeter of a shape increases/decreases its area must also increase/decrease (and vice versa). The focus problem for teaching episode 3 (Figur e 16 p. 136 emerging CK of the direct relationship misconception. TE 3 began with four questions CK (their reaction to the claim), and then transitioned into examining their KoST (their reaction to the student). For this last TE, the PSTs were instructed they could interact with either microworld from the outset. Five of the 11 PSTs (one was absent) (Jasmine) claim, including Jackie, Larry, and Grace. Their reactions to the claim resulted

PAGE 353

337 in four categories. The first category contained two PSTs, and they accepted the stu unsure, Jackie indicated, in this study, they apparently believed that if enough examples are presented then the claim can be either proved or disproved. This belief can be seen in a comment made by Jackie during our second interview. I asked Jackie what her plan was when she used the her wrong mathematics led most of these PSTs to where they viewed the role of examples as a way to prove something, rather than just an illust ration of a numerical relationship. They did not, or possibly cannot, appreciate the need for a mathematical argument in such cases. uch as these are based on common sense, rather than mathematics. At this point in the TE, Jackie is functioning below a Level 0 since she did performed at when this miscon ception was presented on the pretest in Question 8. The second category involves two PSTs who initially accepted the claim but very soon after changed their minds. Both indicated that while exploring with a microworld they found a counterexample to Jasmi

PAGE 354

338 examples do not directly address Jas Gizmo incorrect You can have a . I oceeded to provide a 3 3 square, which he his understanding has progressed from where it was prior to any intervention. On the pretest and in the interview, Larry was only able to attain a Level 0 (i.e., he justified the providing a counterexamp le, had moved him to a Level 1 understanding (Ma, 1999). The third category of responses identified were those who thought the claim was incorrect from the onset and offered at least one appropriate counterexample. Their counterexamples were all very sim ilar in that the second rectangle provided had a much smaller width and a much longer length than the first (e.g., first rectangle would be a 4 4 and the second would be a 1 11), which would result in a larger perimeter but a smaller area. Of the four who applied this approach, there were two who also explained a that of over generalizing. This observation characterizes based on one of them stopped after simply disproving the claim; therefore, she only achieved a Level 1 understanding. Grace, however, continued to explore various relationships

PAGE 355

339 between area and perimeter and discovered two separate conditions for area perimeter relationships that elevated her understanding to a Level 3. That represented a marked (Table 22). Grace also provided evidence of expert like analysis and problem solving. Table 22 Investigating an Erroneous Student Claim Note *One PST (#4) was absent for TE 3. **These PSTs acknowledged the condition that could be true. ^ Clarified certain conditions of the area perimeter relationship. Pretest Results (Question 8) Number of PSTs (N = 12) Agreed with the student Provided appropriate counterexample Investigated the claim attained 4 (including Larry & Jackie) Yes No No Level 0 2 (including Grace) No No No In between Level 0 & 1 3 No No Yes In between Level 0 & 1 3 (including Brianna) No Yes Yes, but insufficiently Level 1 N = 11* Emergent Results (TE 3) 2 (including Jackie) Yes No No Level 0 2 (including Larry) Initially Yes, then No Yes No Level 1 3 No Yes No Level 1 3 (including Brianna) No Yes Yes** Level 2 Grace No Yes Yes** Level 3 ^

PAGE 356

340 first thing you did after Table 3, p. 1 66 ). They are able to effectively analyze mathematical problems, as well as student thinking, by recalling past knowledge, incorporating new knowledge, and organizing both in a way that writing prompt indicated the problem solving component of her CK was maturing in the way just There were three PSTs (including Brianna) who comprised the final category of understanding related to this misconception. As in the previous category, both PSTs pr of instances she would be correct but it does not hold true all the time response was very similar, and this acknowledgment would move these two PSTs into the second level of understanding (Ma, 1999). This transition marked growth for Brianna who had moved from a level 1 to a Level 2 (see Table 22). The supportive explanation behind her approach bears reporting: Although it does seem logical, it is incorrect. Jasmine is correct in understanding what perimeter a nd area are. She calculated them correctly in her example, but she is incorrect in thinking that area and perimeter are related like that. Also, in her theory she only gave one example. She fails to t ry other rectangles and see

PAGE 357

341 if it [her claim] works for every one. Brianna presented a balanced approach involving praise and corrective instruction. Her specific mention of Jasmine over generalizing is an example of expert CK. Brianna was one of the admittance, consulting either microworld. It is somewhat surprising that only two PSTs focus proble Teaching episode 3 concluded with an instructor lead, whole class discussion. (1999) a examples and counterexamples was discussed. The various numerical relationships between pe rimeter and area were investigated and specific examples were elaborated claim and why other conditions did not. The idea of a fixed perimeter having an infinite number of p ossible areas was reiterated during this whole class discussion. Another concept shared during the extensive summary of TE 3 was that a square could be included in any list of possible rectangles having a specific perimeter. Overall, the PSTs were provided with the information necessary to achieve a Level 4 response on future questions addressing the misconception that there exists a direct relationship between perimeter and area. It was conceded that PSTs would not have enough time on the post or follow u p test to fully develop the various levels of understanding related to this misconception (e.g., Grace reached Level 3 during TE 3 but fell back to Level 1 on

PAGE 358

342 posttest), but simply mentioning the various possibilities would be significant. Details from the TE summaries are shared to help the reader appreciate the depth of the intervention and also realize the extent of knowledge (both CK and KoST), including appropriate language, made available to the PSTs. The anticipation was that this knowledge would be apparent in their post and follow up test responses. Post intervention CK of the direct relationship misconception Question 6 on the posttest addressed the direct relationship misconception and also presented the first opportunity for the PSTs to shar e what they had gleaned from the in depth summary of TE 3. Statistically, this question had the second lowest mean on the test (2.58, SD = .9). Five responses that received scores of 3 would have received a 4 had the PSTs included appropriate units with th eir examples (including Jackie, Grace, and Brianna). The up questions relating to CK were: (a) Is she correct? If you are unsure, are you skeptical or do you tend to believe her? Why? and (b) thinking. One difference between this question and pretest #8 and TE 3 is that those questions used the word larger instead of smaller ; however, the direct relationship claim would be examined and discussed in much the same way. Another difference for question as was the ca se for TE 3. Interestingly, the responses aligned very similarly as they did in TE 3, both in what were said and by whom (see Table 23). Once again, four categories of responses were ev ident: (a) accepted the claim ( n = 2), (b) rejected claim without coun terexample

PAGE 359

343 Table 23 Investigating an Erroneous Student Claim : Throughout the Study Note *One PST (#4) was absent for TE 3. **These PSTs acknowledged the condition that could be true. ^ Clarified certain co nditions of the area perimeter relationship. Pretest Results (Question 8) Number of PSTs (N = 12) Agreed with the student Provided appropriate counterexample Investigated the claim attained 4 ( including Larry & Jackie) Yes No No Level 0 2 (including Grace) No No No In between Level 0 & 1 3 No No Yes In between Level 0 & 1 3 (including Brianna) No Yes Yes, but insufficiently Level 1 N = 11* Emergent Results (TE 3) 2 (including Jackie) Yes No No Level 0 2 (including Larry) Initially Yes, then No Yes No Level 1 3 No Yes No Level 1 3 (including Brianna) No Yes Yes** Level 2 Grace No Yes Yes** Level 3 ^ N = 12 Post Intervention Results 2 (including Larry) Yes, w/o ample justification No No below Level 0 1 No, but w/o counterexample No No In between Level 0 & 1 6 (including Jackie & Grace) No Yes No Level 1; Jackie close to Level 2 3 (including Brianna) No Yes Yes** Level 2 ^

PAGE 360

344 ( n = 1), (c) rejec ted claim with counterexample ( n = 7), and (d) both opposed and supp orted the claim with examples ( n = 2). There were four PSTs (including Larry and Jackie) who moved to a different level of understanding from where they were in TE 3. att ain the lowest level of zero. That represents a step backwards from the Level 1 he eventually reached in TE 3. For posttest question 6, en the measurement around the outside of the shape. If that is small then the area must be flop was brought up during his second interview, which followed the posttest. Our conversation follows: T : Do you recall anything about TE 3? L: why I was kind of, at fir st. I was kind of like, well, yeah. Then I thought about nswer. T : Well, what if you were going to disprove her claim, what would you try to do? L: Try to make two rectangles, one with a [4 second pause]. I always get this I can never, um [ 4 second pause], one with a greater perimeter and a smaller As we continued discussing this question, it became apparent that Larry still had trouble ld then be verified or disproved That, com like tendency to change his answers on

PAGE 361

345 apparent whims revealed a still fragile CK. relationship misconception in TE 3 was rated below a Level 0 claim without any justification. On the posttest however, her response to question 6 revealed positive change in several aspects of her CK. First, instead of just writing generalities about the various concepts involved, she investigated the problem mathe matically. That resulted in a classroom appropriate counterexample. She did not stop there. In her explanation, she mention of that possibility borders on a Level 2 response and represents a wider teacher trait of realizing the limitation of Stacey over Three other PSTs switched level; one moved from a Level 0 to a Level 1 and the other went from a Level 1 to a Level 0. Grace dropped to a Level 1, because she did not investigating the relationships between area and perimeter, as she did in TE 3. Examining the content of the other responses revealed n one had made any significant progress and had remained at the same level of understanding as in TE 3; Brianna again reached a Level 2, but no further. Given the thorough discussion following TE 3 that had occurred just a week before the posttest, it was so mewhat surprising that no PST, other than Jackie, was able to incorporate and organize that discussion into their CK in order to facilitate a move to a higher level of understanding on the posttest.

PAGE 362

346 The last item containing findings relevant to the PSTs intervention CK of the direct relationship misconception is question 8 from the follow up test. The significance of this question is its representative n regarding perceived relationships. Question 8 is representative, because it is one of only 2 test questions that appeared on both the pretest and the follow up test, as well as being featured in a TE (i.e., before, during, and after the intervention). The question read, angles, the one with the greater perimeter will always have the greater area. The two questions relating to CK were: (a) Is the case with posttest question 6 (unlike TE 3), question 8 did not provide any example as as a possible condition, would have to provide their own appropriate example. revealed the same four categories of responses as were found in TE 3 and posttest question 6, with a few varia tions: (a) accepted the claim ( n = 1), (b) rejected claim witho ut appropriate counterexample ( n = 2), (c) rejec ted claim with counterexample ( n = 5), and (d) both op posed and supported the claim ( n = 4). These similar findings would seem to suggest that once a PST arrived at a certain level of understanding regarding the di rect relationship misconception, they did not expand very much on that understanding or venture beyond their CK comfort zone if you will. Larry continued his posttest retreat from the Level 1 understanding he achieved explanation for his stance involved shallow mathematical thinking, only considered the

PAGE 363

347 more you have for a perimeter means that you will have more ar The absence of a mathematically meaningful justification means Larry again did not even reach a Level 0 understanding of these relationships. The third category of responses represents those PSTs who disproved the claim and provi ded an appropriate counterexample. The five PSTs in this category (including that was very close in dimensions to a square. Their second rectangle was always very long and narrow. This would produce a perimeter greater tha t the first with a smaller area, thus disproving the claim. The explanations supporting this counterexample were she dropped from a Level 2 understanding, which she had during TE 3 and had also displayed on the posttest, to a Level 1 on the follow up test. The testing situation seemed to promote t often very well) without considering or investigating other possibilities; however, in past situations when she was questioned about certain limited responses, as during an interview, she almost always was able to provide added depth and insight. Gra ce was also in category three and the fact that she only provided a counterexample, and no supportive example, resulted in her once again attaining a Level disagree, becau se although the perimeter & area have a relationship, it is not this one. The

PAGE 364

348 closer the dimensions are in length, the larger the area, even though the perimeter stays clarifyi ng the conditions (Ma, 1999). Grace argued that with the same perimeter there are many rectangles whose pairs of addends can make the same sum. She also implied that when these pairs of addends become factors, as in calculating the area of the rectangle, t hey will produce different products. Finally, Grace uses the fact that the closer in value the two factors are, then the larger the product; hence, for a given perimeter, the square is the rectangle with the largest area. Grace had informally brought this idea up in her first interview, but now it appeared she had refined and organized it and is able to present it relationships and in her ability to synthesize and explain inform ation. Another PST, call her Audrey (PST #1), showed strong positive growth regarding this misconception. Up to this point, Audrey had never attained higher than a Level 0 on any response related to the direct relationship misconception. On the follow up test, sh e attained a Level 2. She provided both an example that supported the claim and one in which the perimeter remained the same but the areas changed. While the second example refutes that a direct relationship exists between perimeter and area. The fourth category of responses involved those who both supported and refuted appropriate countere xample but failed to include a supportive example. Instead, each may be true in some cases, but area and perimeter are not directly related.

PAGE 365

349 So, you other PST (#6) maintained their Level 2 understanding from posttest to follow up test. As org anized. She was now able to clearly and concisely present and explain various concepts related to area and perimeter. Overall the class showed improvement from their first exposure to the direct relationship misconception (i.e., pretest, question 8); h owever, there were still two more levels of understanding that went basically unexplored (Ma, 1999). First, there are three possibilities to identify when the perimeter of a rectangle is increased: (a) the area can increase, (b) it can decrease, or (c) it may stay the same. The majority of the PSTs only discussed the first two possibilities. Beyond identifying or displaying one of the three previously mentioned possibilities, none of the PSTs reached the two higher levels of understanding: (a) clarifying th e conditions under which these possibilities held, and other conditions did not. Table 24 summarizes the approaches used by the PSTs as they responded to questions address ing the erroneous direct relationship misconception. For the most part, the PSTs in this study simply stopped exploring after discussing their initial reaction. Many of these PSTs did not appear self motivated to delve far beyond providing one possibility to the stated question, very often the same one they had given in the past similar situations. Instead of investigating the various possibilities surrounding this misconception, the majority would give the same, or a very similar, answer as they had previ ously and continued to operate within their CK comfort zone For example, once many realized that there was not a direct relationship between perimeter and area, which

PAGE 366

350 Table 24 s Claim of a Direct Relationship (N = 12) Pretest Pos ttest Follow up Reaction: N % N % N % Simply accepted the claim 4 33 2 17 1 8 Rejected claim without investigation 2 17 1 8.3 1 8 Rejected claim and investigated mathematically 6 50 9 75 10 84 could be acceptably shown with a single counterexample, they were satisfied with this degree of investigation even though they had been made aware that there were more possibilities that could be discussed, or at least mentioned. Due to time constraints, it would be unrealistic to expect any PST to expand upon, or even duplicate, responses provided in TEs, while working on a pre post or follow up test. Emergent KoST related to the fixed relationship misconception It was important to lay the foundation with research questions 1 and 2, an d then examine the change in the KoST related to perceived relationships represents the last major category of findings associated with answering research question 4. u nderstanding of and more importantly how they indicated they would respond to student difficulties and misconceptions (i.e., their KoST), specifically regarding the fixed relationship misconception. The findings will be presented and discussed in much the same way as it was in the previous CK section, with an emergent perspective gained from

PAGE 367

351 examining r esponses from the teaching episodes followed by pertinent questions from the post and follow up tests, and excerpts from the second interview with the case subjects. Each teaching episode began with questions related to CK, and then would transition int o KoST. Interacting with the microworlds came at different times during the TEs, but always allowed for CK and KoST to be reexamined and reflected upon. This progression proved valuable to several PSTs in each of the teaching episodes. To recap, TE 2 ( Figu re 15, p. 133 ) required the PSTs to grapple with two relatively difficult concepts. First, and primary, was the misconception that a fixed perimeter can have only method to find the area of his footprint, which involved taking a piece of string, measuring around the footprint he had traced on grid paper, precisely cutting the piece of string, and then forming the string into a rectangle and computing the area of the understanding related to the concepts and misconceptions surrounding TE 2 were positively influenced after interacting with a microworld. During the reflection opportunity f or the CK questions, two PSTs (Jackie and #4) indicated their initial and Gizmo microworld. Other than those accounts, each PST began the KoST questions for TE 2 with the sa me level of understanding as revealed in the CK section. strategy? What specifically would you say an might find difficult about finding the area of their footprint? What specifically could

PAGE 368

352 lack of complete understanding regarding the perceived direct relationship between perimeter and area, how would you follow up with the entire class about the concepts that that they involve a more pedagogical presented together. First though, the responses to question 10 will be examined. Question 10 of TE 2 required the PSTs to apply their CK to the realm of analyzing student thinking. They are not yet asked how th ey would intervene; rather, the question is concerned with their comprehension of what students might find difficult related to the area of a footprint. There were two PSTs whose response was similar to ank on to know for sure that their Larry did not discuss (a) why the absence of a formula would be problematic, or (b) what specific mathematical features of the footp rint would not accommodate the direct use of a formula to find its area. It was common for Larry, as with a novice teacher, to focus solely on content and applying a procedure to get the right answer, as opposed to, examining specific properties of the foo tprint that student could find difficult. Larry made a comment about the footprint problem during his interview that fairly summarized his The majority of response s focused on the irregular shape of the footprint. That explanation to why a formula could not be used, as well as a more appropriate

PAGE 369

353 mathemat ical reason as to why students might initially struggle with finding the area of a e simplistic nature of it does not reveal if she knows why the irregular shape would confuse students, either mathematically (because there is no formula for the shape ) or pedagogically (because textbooks do not typically present such shapes). The first part of inch g rid paper]. That is an interesting comment, especially since Jackie recommended earlier that counting the whole and boxes was one method to find the area. The grid is actually needed to help with approximating the area of the footprint. It also provides the context and representation for a discussion regarding partial units and approximating area, which an expert teacher would have realized. Such discussions would inherently place the focus on the students and their understanding, rather than on using pro cedures to find answers. There were four PSTs who offered more than one issue they felt students might struggle with. Three of the responses were of a more dependent relationship between the irregular shape causing the problem that no formula would direc tly give the area of the footprint. Brianna also suggested two issues students might struggle with. One was the irregular nature of the shape, but the second involved the unit measure of the footprint. they are not full squares, or they may count each part of a square as a whole one, and have too ma quotation since these precise student difficulties have been cited by other researchers

PAGE 370

354 (e.g., Hiebert, 1981; Lehrer, 2003). It ap surrounding TE 2 enabled her KoST to perform in powerful ways. questions 1 5. Questions 6 and 12 looked at how the PSTs would specifically ad dress ded: string, or a microworld, or both, and (c) some mention of at least one reasonable st rategy his strategy fell into three categories: (a) offered only explanation, (b) engaged in exploration, or (c) a combination of explanation and exploration. There we re three PSTs Their explanations were based on the fact that since they did not know how to find the area of the footprint they did not know what to say to Tommy or the class. It is interesting that, based on her responses to questions 1 5, one of these PSTs (#5) had a decent not view that as information worth sharing with Tommy or his clas s. There were 7 other PSTs who indicated their intervention with Tommy would involve either some sort of an explanation or exploring the situation with the student, but ive

PAGE 371

355 string ( n = 3), (b) use the MW to show examples ( n = 2), or (c) use hand drawn shapes ( n ring or a to do because at this point in the TE she could not articulate the misconception surrounding improvement from the vague and mathematically confusing response offered to the student was unorganized and incomplete, and that is reflected in her As was the case with TE 1, it was not until after interacting with a MW (the Gizmo in this instance) he recogni zes it [finding the area of the footprint] will be easier, but there are many that kno apparent lack of complete understanding, how would you follow up with the entire class about the concept

PAGE 372

356 much to show was able to experience some success (e.g., diagnosing student errors in both TE 1 and 2), she was not able to organize that new found CK in ways that would enable her to meaningful respond to individual students or an entire class. he had correctly ide that knowledge his suggested intervention did not thoroughly address that misconception perimeter can be e justification (i.e., further explanation and/or diagrams) of his strategy. During the interview, Larry was asked to provide some specifics regarding his initial response. After contemplat ing for a while, Larry was able to provide two rectangles (a 2 7 and a 3 6) as proof that two rectangles could have the same perimeter but different areas; hence, prima ry goal while working with Tommy would be to prove his string method wrong. Larry gave the impression that once Tommy saw him make a couple different rectangles with the piece of string, then he would almost immediately and completely understand why his me

PAGE 373

357 method in order to produce a correct answer. Larry left question 12 blank on TE 2, so during our s class. The focus of Larry to find the correct answer was once again evident as he described how he would use various methods (e.g., cut up the footprint into squares) to help the studen ts find the area of the footprint. There was no mention of addressing had experienced some growth throughout the study, this novice pattern of responding to fficulties by helping them find right answers reveals his KoST was still quite insufficient and had not changed much up to this point in the study. Two of the remaining five PSTs (#3 and #4) indicated they would use the microworlds with Tommy as well as with the class, but they were very unclear in what 6, 9, & Grace) stated that they would investigate with the piece of string to help Tommy see the error in his method. One PST (#9) suggested a teacher centered approach of showing Tommy two rectangles (one 2 7, another 3 6) that had a perimeter of 18 inches but different areas. There was no supportive explanation. Again, it appears the PST thought it was obvious what the examples would accomplish. She used a similar approach with the entire class, only this time she incorporated the MW. Two others (#6 and Grace) suggested guided exploration for Tommy (#6 also recommended using a geoboard ) with the expectation that he w ould discover the inconsistent nature of his method; however, PST #6 was not as confident when addressing the entire class. She perimeter does not equal the same area, bu t I would show them either of the applets. I

PAGE 374

358 did not quite grasp the role or value of counterexamples (which she had provided earlier in the TE) when addressing erroneou s student methods or claims. A more extensive uses was covered in an earlier section titled, Grace also suggested guided exploration rectangles he could make with his string and what would the areas be. He would discover centered approach could be effective; however, based on Gra misconception (considering the fact that she was one of the few able to articulate a very good strategy to approximate the area of the footprint), it would have seemed logical to include some mention of these while working with Tommy. Grace again focused on area when the perimeter stays the same, will give the students the experiences they need somewhat vague response left me wondering if she planned on including certain specifics she had discussed earlier in the TE (i.e., dimensio ns of appropriate rectangles) to help from her teacher centered intervention in TE 1, which did not involve any manipulatives. The final two PSTs (#10 & Brianna) responding to Tommy with a combination of explanation and investigation. PST #10 was more student centered. She had Tommy investigate predesigned rectangles while interjecting thoughtful questions throughout the

PAGE 375

359 process, but all that changed when respondin Apparently, the interaction with the MW that occurred between questions 6 and 12 had completely altered her focus away from the misconceptio n, which she addressed with Tommy individually, and toward an unwarranted emphasis with the class on finding the area of the footprint. It is possible that this PST (and maybe others) view the MWs as technological algorithms whose primary purpose is to co nfirm or help find answers. Brianna incorporated praise and a scaffolding explanation with Tommy that summarized the direct relationship misconception: I would explain that he was right when he used the string to measure the line drawn for the footprin string to make a rectangle and measure its area. Just because two things have the same perimeter does not mean they will also have the same area. I would then give examples of rectangles with the same perimeter but different areas. Brianna had provided several examples earlier in the TE, so it was clear she could involved appropriate diagrams and addressed and however, her approach in TE 1 had the student actively involved with solving problem s while in TE 2 she proposed a less effective teacher centered approach. On the other hand, a teacher centered approach did not domin class. Brianna gave by far the most th o rough response to question 12. Her whole class why it would not work, (b) having the students draw a rectangle with a perimeter of 18, find its

PAGE 376

360 area, and then call on them so everyone could see there are d ifferent possibilities, (c) having students go to computers and use the Gizmo to find as many rectangles as possible with a perimet er of 18, and notice the many different areas that are possible. For TE 1, Brianna also suggested a discovery learning approach with the class to help them understand why the student proposed method was wrong. The difference was that for TE 1 the learning about the important concepts at play (i.e., differences between linear and student centered and well thought response included multiple representations of the key concepts, It was apparent that those PSTs who were not able to explore the problem deeply on their own also had difficulty intervening with Tommy in meaningful ways; whereas, those with a b etter understanding of the mathematics surrounding the TE (e.g., Brianna) were more confident and adept at engaging both the student and the entire class in a discussion of the misconception as well as clarifying the major concepts surrounding it. Post i ntervention KoST of the fixed relationship misconception Pretest question 10 examined the PSTs pre intervention KoST regarding the fixed relationship misconception and those findings were previously discussed in detail. To recap the pertinent findings: (a ) When presented with the opportunity, no PST expressed an understanding of the fact that squares are a special classification of rectangles (Grace & Brianna did so during the second interview), (b) Only four PSTs (and no case subjects) expressed an awaren ess of the misconception commonly held by elementary students that equal perimeters must have equal areas, and vice versa. Question 8 on the posttest parallels the concepts presented in pretest question 10.

PAGE 377

361 he/she was able to draw several different rectangles that, according to the area formula, have an area of 36in 2 but the student was a up questions were: (a) Are the stu might, explain the reasons for your answer to Part (a), and (c) Why do think the student was surprised by their results? What specifically would you say and do in response to this g? The KoST component of this question was primarily Part (c), but before presenting those findings, certain relevant findings from Part (b) bear mentioning. A documented shortcoming of the PSTs throughout the study had been the lack of including appropria te drawings to support explanations; however, for question 8 of the posttest six out of 12 PSTs (including Jackie and Brianna) included useful drawings to enhance their explanations and another three PSTs (including Grace and to a lesser degree Larry) incl uded a table of the factors of 36 that would also help support their explanation. Included within those drawings and tables were eight individual instances where PSTs (including all four case subjects) included a 6x6 square as an appropriate rectangle with an area of 36. Both of these findings are marked increases over pre intervention findings. Responses to Part (c) of question 8 of the posttest provided two main categories of relationship misconceptio n; those who appeared to grasp the misconception and those who were able to effectively articulate the intricacies of the misconception. There was only one PST (#3) who showed he responses of three PSTs (represented by Jackie) resulted in uncertainty as to the extent to

PAGE 378

362 which they completely grasped the various elements of the misconception. Jackie and another PST (#1) included properly scaled and correctly labeled rectangles th at showed several different rectangles could all have an area of 36; however, their subsequent explanations would not have resolved the misconception among classroom students. For e amount of students do many different shapes w/ the same and different areas so that they can see ht be confused because it seems logical to expect rectangles of the same area to have the same perimeter even though that is not actually true. Though Jackie did address several issues related to the misconception (the unspoken question) it would have been wise to inquire of the student why s/he was surprised by their results. That way she could have customized her intervention KoST had progressed from her previous levels in that she n ow rather consistently diagnosed incorrect student thinking; however, she continued to struggle with providing lucid explanations of those diagnoses as well as with including appropriate mathematical language. The second category of findings involves eight PSTs who apparently grasped the misconception the student was struggling with, but specific wording and suggested (including Larry and Grace) failed to c ompletely articulate the misconception. Their 36, 2 18, 3 12, 4 9, 5 7.2, and 6 6), it can be seen that many rectangles can have the area

PAGE 379

363 of 36 in 2 but have different this group, failed to use the word perimeter while explaining the misconception. That is fairly significant since the misconception being discussed involves area and perimeter. e signified growth from her pre intervention KoST. While being posttest she stres sed conceptual methods and was very clear on how she would approach the student struggling. She was also very confident about why students might have such a when t intervention. The other four PSTs (including Brianna) used the word perimeter while repr have different perimeters. Many students corre late one area to one shape with one perimeter. We can have the same amount of space inside two objects yet they can have di and labeled correctly, which all had an area of 36 but different perimeters. Brianna did not express knowledge of the fixed relationship misconception before the interve ntion nor had she clearly explained how students might think about the fixed relationship. Both of these are evidence of a maturing KoST. Her ability to apply it will be seen next. xed relationship misconception was their suggested intervention for the confused student. It

PAGE 380

364 was common for PSTs to simply refer back to the rectangles they had previously drawn when suggesting an intervention for the confused student; however, there were five PSTs (#6, #11, Larry, Grace, & Brianna) whose suggested intervention would promote (to varying degrees) a conceptual understanding of the fixed relationship misconception. There were three different recommendations to help the student better understan d how different shaped rectangles could still have the same area. PSTs 6 and 11 similarly see that the count the square inches explanation about the misconception and why it was not correct including lan guage that would be meaningful to students. The activity she suggested to promote understanding them and see that they have the number of blocks inside them but the she had drawn earlier in question 8 which all had an area of 36 but different perimeters. These statements reveal a rather significant change in Br intervention KoST, which was very procedural and designed to help students overcome their weaknesses and get the right answer. Now Brianna incorporated activates that focused on the students understanding the mathematical concepts. That represe nts a rather robust KoST. Emergent KoST of the direct relationship misconception regarding the direct relationship was previously examined and it was shown that many experienced growth in their levels of understanding (Ma, 1999). The fin dings presented

PAGE 381

365 nderstanding of, and more importantly how they indicate they would respond to, student difficulties and misconceptions (i.e., their KoST). The now examine the second, and slightly more elusive, misconception. The gist of this misconception is that there exists a direct relationship between perimeter and area, that is, as the perimeter of a shape increases/decreases its area must also increase/de crease (and vice versa). The focus problem for TE 3 ( Figure 16, p. 136 ) provided the setting for findings related to the relationship misconception. TE 3 began with four the claim), and then transitioned into examining their KoST (their reaction to the student). For this last TE, the PSTs were instructed they could interact with either microworld from the outset. There are two questions from TE 3 that provided useful fi ndings to investigate the her proposed theory? What specifically would you say and do (even if you are unsure and apparent lack of complete understanding, how would you follow up with the entire class about the concepts that surround this classroom episode? Remember to share specific examples and representations (possibly from a microworl d) just as you would in Questions 5 and 10 looked at how the PSTs would specifically

PAGE 382

366 generalization), includ ing ap propriate counterexamples, hence disproving the direct relationship misconception, (b) method. For example, her theory does hold under certain circumstances (i.e., if both dimensions are increased), and (c) allowing for students to explore these relationships with either MW (the Gizmo would be preferable). By the time the PSTs reached these KoST questions, all but two of them (Jackie theory had already provided counterexamples to illustrate their position. Consequently, the two to offer any meaningful intervention to help Jasmine or her class. It is not that surprising that Jackie theory was correct since they were two of the three PSTs who thought the same way when this misconception appeared o n the pretest, and there had been no formal intervention up to TE 1. Even though Jackie was not able to applications of her KoST. First, she offered the student prais trying to discover more about math. The NCTM Standards encourage students to reason recognized the danger of over generalizing when making mathematical claims and that was significant as it is a characteristic of an expert teacher ( Table 3, p. 166 ). What is somewhat puzzling is why Jackie did not take her own advice out on one of the MWs. A possible answer to that question, which also exposes what was

PAGE 383

367 a common view of and approach to using the MWs, was made apparent during the second interview with Jackie. I asked her about her lack of progr know, no matter what resources or what materials, or games, or anything you give them, k now what they are looking for. I still think they admitted over exposure to show and tell teaching approaches seems to have affected her belief in what students are capable of doing on their own as well as how she herself approaches problem solving. repeated exposure to area and perimeter concepts, her KoST struggled adapting throughout the intervention. Jackie ious mathematical scenarios, which left her ill equipped to resp ond to student difficult ies. interventions often focused only on big ideas (e.g., clarifying area and perimeter), even when those ideas were not helpful in resolving the current misconception. Her choices of mathematical language often confused and muddied her attempts at explaining concepts even those concepts she seemed to understand. She did not appear to learn well on her own, but rather indicated several times how the small group and whole class sessions were very helpful. Jackie put forth a lot of effort throughout the intervention and was very engaged during both interviews. Her increased successful teacher also translated into mom ents of pedagogical clarity. For example, a

PAGE 384

368 comment made by Jackie during a teaching episode involved her belief that it might help students resolve area and perimeter conflicts if the concepts were studied simultaneously. Her view displayed relative exper t pedagogical KoST, shared by s everal researchers (Chappell & Thompson, 1999; Hiebert & Lefevre, 1986; Simon & Blume, 1994a). The focus will now turn to the recommendations of the other nine PSTs who did theory was incorrect. Their instructional strategies, both with Jasmine and her class, divided along lines of teacher centered versus student centered, with approaches involving hand drawn examples and/or the use of a MW. The first category involves those who suggested very teacher centered activities. There were four PSTs in this group (including Larry and Brianna), and generally, their explanations hey would make sure Jasmine realized her theory would not work all the time. Two PSTs (Larry and #5) indicated they would use the Gizmo MW with Jasmine, and the class, to help them see inconsistencies in her proposed theory. PST #5 included specific detail s about the types of examples she would use as well as the accompanying explanations she would use. Larry provided neither. He show the students that just b/c the equally vague, response as in TE 1. It is a little surprising that Larry did not consider it important to provide more information given the thorough summaries provided for TE 1 and TE 2 what appropriate student intervention should involve. Larry throughout the study. He often appeared

PAGE 385

3 69 confused or distant during discovery learning sessions, and did not seem interested in exploring concepts which he struggled with. Once he seemed to grasp a concept (i.e., TE 2), he rarely ventured beyond that knowledge. At times he appeared distracted by the use MWs in his responses, the goal was to accelerate the viewing of many examples to more efficiently arrive at an answer. He continually appeared content with simply getting itated a KoST that was satisfied with responding to student shortcomings in an attempt to guide them to get right teacher centered behavior. They frequently lacked meani ngful and classroom useful paper); however, it would often be the same ones and many times the reason for the aid was unclear. Overall, finishing problems and generating answers appeared to take precedent during the intervention over gaining personal insights and knowledge necessary to develop conceptual understanding within future students. Brianna and PST #10 were the other two who proposed teacher centered interventi ons. PST #10 incorporated thoughtful and directed questions with Jasmine response to Jasmine involved presenting counterexamples for her to calculate the area and perim eter of in hopes she would realize the error of her theory. Brianna was the only PST to go one step further with Jasmine and formally acknowledge that her theory could Briann a did not provide the specific examples she referred to in her explanation. Her

PAGE 386

370 intervention with the entire class was very similar in content, although she did suggest using the Shape Builder MW to present the various examples. As was common with Brianna, she directed and/or guided the instruction, whether working with one student or an entire class. In that aspect, her KoST was very narrow in focus and application. grasp every misc onception within the TEs and to be very thorough and accurate in her prescribed activities. Her ample CK initially interfered with her ability to see the need to include diagrams to help students understand her ideas; however, the frequency of quality diag rams increased From TE 2 right through the follow up test. That strong CK likely Brianna indicated that she would direct the learning during the interventions (both wit h individual students and with a class). She often had students investigating with MWs, but with predesigned problems. Her instructional strategies gradually evolved from teacher centered, with students receiving instruction, to teacher directed, w ith stud ents participating more in their learning. Absent however were frequent opportunities for students to interact with her (through assessment questions) or explore on their own. Only in TE 2 did Brianna indicate she would allow students to work independently with a MW, even then it was on a predetermined problem. Brianna was modest and relatively quiet. During her second interview I informed her that several PSTs wrote how they learned a lot when they were in her small group; that she always had clever ways t o look at and confidence in certain social/teaching situations may help to explain her teacher centered tendencies and her incomplete KoST.

PAGE 387

371 The final group of five PSTs (inc luding Grace) represents those who, to varying degrees, encouraged both Jasmine and her class to explore the concepts surrounding the direct relationship misconception. For three of these PSTs, it was interesting how two (#3 and #11) suggested more teacher directed approaches with Jasmine, but more discovery based with the entire class, and the other (#9) was more student centered working with Jasmine but teacher directed with the class. The discovery activities typically involved the student(s) finding sev eral rectangles that have the same area and then comparing their perimeters to see that the larger perimeter does not always have the larger area; hence, MWs, but they t hought hand drawn examples would be more meaningful with Jasmine while MWs would be more appropriate when working with the class. Grace and PST #6 were the only two to accomplish all three KoST objectives established at the beginning of this section: (a) they addressed the misconception, (b) they encouraged investigation to discover other relationships, and (c) they realized the value of the MWs in that investigation. PST #6, who was one of the top achievers in the study, promoted exploratory methods for both Jasmine and her class. She explained how class) if she/they could find two rectangles where the theory does not work. She e Gizmo to see if they can find any other Jasmine and for the class. PST #6 displayed a pedagogically powerful KoST. These misconceptions facilitate discovery learning a nd a responsive KoST would recognize that as appropriate intervention. Grace went one step further and shared two specific area

PAGE 388

372 perimeter relationships she would guide the students into discovering: (a) If you increase one dimension of a rectangle but decr ease the other, it will result in a smaller area, and (b) If you leave one dimension of a rectangle fixed and increase the other dimension, that during the second intervie Gizmo would allow her [Jasmine] to see that the greater the difference between the dimensions of the rectangles, the lesser the area up to a point. It does not always where the numbers get closer together, the area will increase up to a square which has the through the intervention, but it was her sharing of that CK with Jasmine and the class that revealed her KoST had equally matured. Grace wrote how, after giving the students an Gizmo MW) and expl ain with the students the various conditions that influence whether ame desire to und erstand mathematics, how students think about it, and how she can help them understand it better. Throughout the intervention, Grace would often call me over to see her computer and what she was working on. She would ask questions, because she had a gen uine desire to understand the concepts we were covering. She wanted to be prepared to teach them well. Grace is somewhat of a perfectionist, as her 4.0 GPA testifies. Early on in the study, Grace appeared to know more than she would write in her responses. That became

PAGE 389

373 evident during the first interview. Once Grace became aware of how thorough communication was necessary to promote understanding of mathematical principles, her responses changed to include greater specificity. Her desire to understand mathema tical relationship misconception came on the pretest (question 8). It was during our first interview that her internal drive to better understand the mathematics she would have to teach became evident. We were discussing her thoughts on question 8 and the proposed direct have the idea that, for quadrilaterals, a square maximizes area. Grace was not generally satisfied with leaving mathematical conflicts unresolved, and the fact that she was thinking about and working on a problem outside of class was evidence of that. It also helps to explain how she was able to make such noticeable improvements on the same have a conceptual understanding of mathematics has been shared numerous times. It was apparent in the application of both her CK and KoST, which strived to clearly communicate mathematical ideas so that students would understand them. Post intervention KoST of the direct relationship misconception The findings in this section concludes the discussion regarding perceived relationships (specifically the direct relationship), and finishes addressing research question 4, which was concerned with how the PST Pretest question 8 examined the PSTs pre intervention KoST (and CK) regarding

PAGE 390

374 the direct relationship misconception, and those findings were previously discussed in detail. To recap the pertinent fin dings: (a) 4 out of 12 PSTs (including Larry & Jackie) (including Brianna) provided ap propriate counterexamples in their response to the student. (c) Only one PST (#10) included any discovery type activities in her response to the struggling student. The suggested intervention by the other 11 PSTs was completely teacher centered. Questi on 6 on the posttest parallels the concepts presented in pretest question 8. up questions that touch on Ko As a teacher, how would you respond to Stacey? What specifically would you say and do (even if you are unsure about the mathematics involved)? One difference between this quest ion and pretest #8 and TE 3 is that those questions used the word larger instead of smaller ; however, the direct relationship claim would be examined and discussed in much the same way. Another difference for question 6 was that no example (i.e., student w could be true, they would have to supply their own example. A thorough and model respon followed by an explanation detailing why and including appropriate counterexamples;

PAGE 391

375 hence, disproving the direct rela over generalization (i.e., that she generated her claim after only a few, or even one, example), (c) investigating (or at least mentioning) the various area perimeter relationships surrounding Stace theory does hold under certain circumstances (i.e., if both dimensions are increased), and (d) recommending students to explore those relationships with either MW (preferably the Gizmo ). It should be mentioned that, due to time constraints, it would be unrealistic to expect any PST, while working on the post or follow up test, to expand upon or even duplicate the extent of the responses provided in the TEs. Descriptive statistics for posttest question 6 indicated it was the se cond hardest item on the test. That was evident by the fact that two PSTs (Larry and #1), agreed that initially agreed with the claim. Larry originally disagreed with Sta nic, because nowhere on his pape r did he attempt any diagrams or examples week after TE 3, where we had spent three class hours address ing the misconception. On the follow up test, Larry never wavered as he once again agreed with the student and their flawed claim regarding a direct relationship between perimeter and area. It was has trouble remembering ideas recently discussed. Obviously, Larry would be unable to engage a struggling

PAGE 392

Obviously, Larry would be unable to engage a struggling student in any meaningful dialogue regarding these concepts, as he himself is confused about them. His ability to understand and then respond to student thinking (i.e., his KoST) is continually derailed by his insufficient CK. Any progress Larry seemed to make during the planned intervention appeared to be short-lived.

The responses to question 6 from the other 10 PSTs formed three categories: (a) those who simply disagreed with the claim (n = 3), (b) those who disagreed with the claim but made some reference to the fact that it could work (n = 5), and (c) those who refuted the claim and also explained or illustrated when the claim would hold (n = 2). Their concern for student learning was positive, but the lack of specificity left their proposed interventions inconclusive. Overall, their suggested responses to Stacey were teacher-centered and similar in content, because each narrowly focused the discussion on counterexamples alone.

There were a total of seven PSTs who indicated that they both disagreed with the claim and recognized that it could hold under certain conditions. That was more than double the number (n = 3) who reached this level of understanding and effective student involvement during TE 3. Five of the PSTs (including Jackie and Grace), while alluding to the possibility that the claim could hold, did not provide any specific examples (i.e., diagrams or dimensions) that would be meaningful in helping Stacey understand more about the misconception.
They did claim that having Stacey discover a non-example would be helpful; Grace, however, did not explain how she would follow up with Stacey to assist in resolving the misconception, how she would explore the other possibilities she had mentioned earlier, or how she would help Stacey reconstruct her knowledge. The same misconception appeared on the follow-up test, and this time Grace only disproved the claim; however, she did a much better job explaining the condition that would make it false, and her response to the student was coherent and included drawings of her counterexample.

Two others of these same five (Jackie and #4) mentioned they would ask the student to prove her theory by providing more than one example. During her second interview, Jackie elaborated on her proposed intervention to include conditions beyond just proving the theory wrong:

I think I would have her come up in front of the class and present her theory so the class could see what she meant. Then I would have her ask the class what they thought about it; have them work on the problem, and have them raise their hand if they proved her theory wrong, or raise their hand if they proved it right.
On the basis of her CK regarding this misconception, Jackie realized that the student was wrong and also pinpointed the source of the erroneous thinking. The context and level of student involvement recommended by Jackie revealed that her KoST was beginning to incorporate the ideas and practices that had been discussed during the TEs. These were noticeable differences from her pre-intervention awareness and application of such pedagogy. Jackie showed she had retained much of her KoST when faced with the same misconception on the follow-up test: there she gave the same basic response as on the posttest, but was even clearer about how she would respond to the struggling student.

There were two PSTs (Brianna and #10) who went one step beyond simply providing a counterexample: they informally explained or illustrated one condition that would support the claim. Regarding the relationship between a smaller perimeter and a smaller area, PST #10 provided a diagram of a 3 × 7 rectangle and a second, 1-unit-wide rectangle to illustrate the idea. Brianna indicated that she would have the student provide examples supporting her claim. The parallel item on the follow-up test (question #8) revealed that some concepts regarding this misconception were not retained by these two PSTs; both of them neglected to even mention the condition under which the claim could hold.
It was somewhat unexpected that only three PSTs made reference to incorporating MWs while working with Stacey. The direct relationship misconception had just been personally investigated and corporately discussed the previous week, and it provides a prime opportunity to explore the various concepts with the Gizmo MW. The user can quickly and easily drag the corner of a rectangle to produce countless different rectangles, while watching the area and perimeter measurements change in real time. The instant feedback would be very valuable for a student and would support the various conditions surrounding this misconception. A well-developed KoST would have recognized the benefits of the MW for aiding a struggling student.
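To make that feedback loop concrete, the sketch below mimics, in a minimal and hypothetical way (it is not the Gizmo's actual implementation), how a microworld can recompute and display both measures the instant a rectangle's dimensions change:

```python
# Minimal sketch (not the Gizmo itself): recompute area and perimeter
# each time a rectangle's dimensions are "dragged" to new values.

def measures(width: float, height: float) -> tuple[float, float]:
    """Return (area, perimeter) of a width-by-height rectangle."""
    return width * height, 2 * (width + height)

# A student drags one corner through a few positions; the display updates live.
for width, height in [(3, 7), (3, 5), (4, 4), (6, 2)]:
    area, perimeter = measures(width, height)
    print(f"{width} x {height}: area = {area}, perimeter = {perimeter}")
```

Watching, for instance, a 4 × 4 and a 6 × 2 rectangle share a perimeter of 16 while their areas differ (16 versus 12) is exactly the kind of immediate evidence against which a struggling student can test a claim.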
Other findings from the follow-up test indicated signs of continued growth related to this misconception. Only one PST (Larry) agreed with the claim on the follow-up test, compared with four on the pretest and two on the posttest. PST #1, who was the other PST to agree on the posttest, showed positive changes in both her CK and KoST on the same question on the follow-up test.

Research Question 5: Identifying and Describing CK-KoST Relationships

This time I understood, so I felt I could do that. Now that I understand, I thought that would be a good way to go. (Jackie, following the posttest, discussing how she would address erroneous thinking)

The findings in this next section address the fifth and final research question: in what ways is the PSTs' knowledge of student thinking (KoST) regarding area and perimeter related to their content knowledge (CK) of those same concepts? This study operated under the somewhat logical assumption that CK and KoST are interrelated; furthermore, possessing a robust KoST would be dependent upon possessing at least an adequate CK. Answering research question 5 involved examining the various relationships that might exist between CK and KoST. The case subjects were the focus of this research question, because their interview findings were necessary to triangulate with other data (i.e., tests and teaching episodes). The relationships explored were associated with area and perimeter in general and, more specifically, with units of measure and perceived relationships.

Answering research question 5 involved two components. The first comprised quantitative findings involving: (a) the correlation coefficients for CK and KoST at the three time points (i.e., pre, post, and follow-up), (b) CK-KoST relationships as seen in both the rubric scoring of responses to pre, post, and follow-up test items (e.g., Table 14, p. 256) and the summary tables of expert/novice codings (e.g., Table 16, p. 261), and (c) appropriate regression graphs (previously used to answer research questions 3 and 4). The second component was more descriptive and entailed elaborating on the initial relationships identified by the quantitative analysis. Two comprehensive analysis strands, devised and organized around the area and perimeter concepts/misconceptions central to this study (Table 5, p. 188), helped guide the presentation of the qualitative findings and answer research question 5. These strands tracked parallel items (e.g., CK related to units of measure) from the pre, post, and follow-up tests and the three teaching episodes. The goal and challenge of answering research question 5 was to ascertain and then describe how, if at all, KoST and CK are related within the context of this study.
Identifying CK-KoST Relationships

As was shown while answering research questions 3 and 4, the majority of PSTs exhibited some sort of increase in their CK (75%), their KoST (also 75%), or both (58%) from pretest to posttest. A reexamination of the regression lines (Figures 27-29, pp. 267-269) revealed that a positive relationship (i.e., correlation) existed for those same PSTs, as seen in the positive slopes. Subsequent calculations of the correlation coefficients for KoST and CK at the three time points confirmed the existence of some relationship: (a) pretest, r = .53; (b) posttest, r = .64 (significant at the .05 level, two-tailed); and (c) follow-up, r = .57. Not surprisingly, these values are moderate to strong. The lower variability, the small number of sub-test items (n = 5), and the presence of one, possibly two, poorer-measuring questions help explain the lower correlations for the pre and follow-up tests.
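For reference, the coefficient reported at each time point is the standard Pearson correlation (the formula below is not reproduced from the dissertation's methodology chapter; it is supplied here only as a reminder of what was computed):

```latex
r \;=\; \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}
             {\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^{2}}\;\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^{2}}}
```

where x_i and y_i are a PST's CK and KoST sub-test scores at that time point and n = 12 PSTs.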
Before discussing the one viable relationship uncovered, there are other results worth mentioning, although none involved more than two PSTs. Grace and PST #6 showed an initial similarity involving a static CK and an increasing KoST. That result would seem like an illogical relationship; one would expect CK to be increasing as well. After closer examination, their CK was static because it was initially very high: Grace and #6 had the highest and second-highest scores, respectively, on the pretest CK sub-test. Their CK was more than adequate to support an increase in their KoST, which for them involved incorporating effective instructional methods into an already receptive framework. There were no other descriptive indicators warranting further investigation of this result.

A second observation involved the other two case subjects, Brianna and Larry. There were slight increases in their CK but no discernible increase in their KoST. It might seem that this result would warrant further discussion; however, when the scores of these PSTs were more closely examined, the need for further investigation was sufficiently diminished. Larry, the overall weakest-performing PST, concluded the pretest with the second-lowest score on both the CK and KoST sub-tests, and he made very little measurable change throughout the study. Brianna completed the posttest with the second-highest CK sub-test score and the highest KoST sub-test score; therefore, the fact that her scores did not increase substantially from pretest to posttest was not surprising. A result of these facts was the lack of numerical trails to investigate further.

Delving deeper into the KoST and CK sub-test scores from the pre and posttest, and applying the 3-point criterion established and described in Chapter 3, revealed several patterns that formed the basis for the findings that will assist in answering research question 5. There were six PSTs (Jackie, #4, #5, #9, #10, and #11) who experienced a discernible increase in both their CK and their KoST; this group will be the focus of the findings that follow. Every member of the group had both their KoST and their CK sub-test scores increase by at least 3 points from pretest to posttest (range of increase 3-9). All six of the PSTs in the group also saw increases from pretest to posttest in the frequency of expert codings assigned to both their CK and KoST. There are other common traits within the group, to be presented later, that help confirm Jackie as a fair representative of the group. At this point the identified relationship is mostly numerical; the goal now is to attempt to uncover and explain the character of those numbers.
Describing CK-KoST Relationships

Two comprehensive analysis strands were devised to organize the area and perimeter concepts/misconceptions central to this study (Table 5, p. 188). They helped to organize the findings regarding the PSTs' CK and their ability to understand, analyze, and respond to hypothetical students' thinking (i.e., their KoST). The two analysis strands are (a) units of measure (i.e., linear versus square) and (b) the presumed relationships between area and perimeter (i.e., that equal perimeters must result in equal areas and vice versa, and the belief that a direct relationship exists between area and perimeter, in that increasing, or decreasing, one will have the effect of increasing, or decreasing, the other). These analysis strands formed the basis for the topics of inquiry across various time points (i.e., across teaching episodes and from pretest to posttest to follow-up). A case subject was the primary focus of the comparative analysis, because her responses received appropriate pattern matching through the two semi-structured interviews. She was also representative of the prominent CK-KoST relationship patterns identified in the previous section (e.g., Jackie's increase in KoST [+6] with an increase in CK [+9]).

The Increased CK-Increased KoST Relationship

There were six PSTs (Jackie, #4, #5, #9, #10, and #11) who experienced a discernible increase in both their CK and their KoST. Every member of the group had both sub-test scores increase by at least 3 points from pretest to posttest (range of increase 3-9), and all six also saw increases from pretest to posttest in the frequency of expert codings assigned to both their CK and KoST.
All but one PST in the group scored below the mean on the CK sub-test, and all six scored very close to the KoST sub-test mean. As described earlier, various descriptive statistics placed Jackie in the best position to represent the group. The fact that Jackie was a case subject allows additional sources (e.g., the first and second interviews) to help document and explain possible relationships between her CK and KoST. The purpose of the following sections is not just to present examples of the increases in CK and KoST, as that was done while answering research questions 3 and 4, but rather to establish baseline relationships between CK and KoST, to describe how they changed through intervention, and to discern in what ways CK and KoST interact with each other.

The CK-KoST relationship prior to intervention. The comparative analysis began by examining how Jackie's pre-intervention CK informed her KoST regarding units of measure and perceived relationships. Problems 1, 3, and 4 from the pretest focused on basic CK regarding units of measure and problem 5 addressed perceived relationships, while the corresponding KoST problems were numbers 6, 7, and 9 for units of measure and 8 and 10 for perceived relationships. It was apparent from the CK problems that Jackie was lacking an understanding of fundamental concepts surrounding area and perimeter (i.e., which unit should be used to calculate each, and how area and perimeter relate to each other), and she knew it. Subsequent probing would reveal just how much Jackie did not know, had forgotten, or had confused. One pretest problem asked the PSTs to draw a polygon with a perimeter of 24 units; a follow-up question asked how she would explain her answer to a 5th grader. Jackie drew a 3 × 8 rectangle, which has an area of 24 square units (but a perimeter of only 22 units).
It is therefore obvious she had forgotten, or did not understand, what the concept of perimeter means, and she admitted it, writing that, to be honest, she had "no idea" how the polygon she drew represents a perimeter of 24. Jackie is admittedly confused, but the parallel trend from meager CK to an inappropriate explanation to a student (an aspect of her KoST) is telling. She not only initially confuses perimeter with area, but her explanation adds more contradictory information by invoking what is a 2-dimensional concept at best, or a 3-dimensional one at worst, while supposedly explaining to a student about perimeter (a 1-dimensional measurement).

Basic relationships between area and perimeter also involve dimensions. One pretest problem asked what happens to the area of a rectangle (one dimension of which was 4) when both of its dimensions are tripled. Jackie changed the problem to involve triangles and thus was unproductive in describing the relationships. Others in the group attempted the problem, but none realized that the 2-dimensional nature of area would cause the area to be increased by a factor of 9. Not appreciating the fact that area is a 2-dimensional concept would often lead to errors of this kind.
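A short worked example makes the factor of 9 visible (the 3 × 4 dimensions below are illustrative; the original item's full dimensions are not reproduced here):

```latex
\begin{align*}
3 \times 4 :&\quad A = 3 \cdot 4 = 12, \qquad P = 2(3+4) = 14,\\
9 \times 12 :&\quad A = 9 \cdot 12 = 108 = 9(12), \qquad P = 2(9+12) = 42 = 3(14).
\end{align*}
```

Each dimension contributes one factor of 3 to the area (3 · 3 = 9), while the perimeter, a linear measure, only triples.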
Prior to intervention, the majority of PSTs in the study were not able to coherently explain or illustrate the concepts of linear and square units. Her first interview confirmed that Jackie's shaky grasp of units routinely resulted in her confusing area and perimeter. This fragile CK would also cause her to wrongly apply procedural methods, followed by procedural explanations, even when inappropriate. For example, problem 3 on the pretest (Figure 21, p. 212) asked the PSTs to find the area and perimeter of an irregular shape. Jackie was not able to draw from a CK that included an understanding of linear and square units, and that caused her to apply erroneous methods to find the perimeter and area of the irregular shape. The other PSTs in this group were able to find the correct area and perimeter in problem 3, but their incorrect treatment of the appropriate unit for each measure led to nonconceptual explanations and to misapplying the b × h formula in situations where it was not needed or helpful.

As the pretest continued, the PSTs faced problems that required them to more directly apply their KoST. Jackie knew, and stated often, that multiplying base times height would give the area of a rectangle. Yet further probing revealed a lack of understanding about the common formula: when asked why multiplying base times height produces the area of a rectangle, Jackie could not offer an explanation. Her meager CK continued to leave its mark on how she responded to struggling students. Problem 6 on the pretest (Figure 23, p. 224) asked the PSTs to respond to a student who correctly found the area of a 3 × 6 rectangle to be 18 but indicated he did not understand what exactly the 18 represented. Jackie attempted a conceptual approach by drawing a 3 × 6 array, but she identified the unit as centimeters rather than square centimeters; that response not only is incorrect but would be very confusing, since the 3 × 6 rectangle also has a perimeter of 18 cm. Several PSTs in the group likewise failed to identify the appropriate unit for measuring area.
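The numerical coincidence at the heart of this item is easy to state:

```latex
A = 3 \cdot 6 = 18\ \text{cm}^{2}, \qquad P = 2(3 + 6) = 18\ \text{cm}.
```

A response that does not distinguish square centimeters from centimeters leaves the student with no way to tell which of these two 18s his answer represents.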
Conflicting ideas about area and perimeter, linear and square units, and perceived relationships also produced incomplete diagnoses of student misconceptions and ineffective instructional suggestions regarding these concepts. The last problems on the pretest that specifically addressed perceived relationships and units of measure were 8 and 9 (Figure 25, p. 230), respectively. Each problem centered on a student proposing either an erroneous claim (#8) or an erroneous solution method (#9). In problem 8, Jasmine, the fictitious student, claimed that a rectangle with a larger perimeter will also have a larger area. Sketching out various rectangles can often lead, at the very least, to a counterexample to such a claim; Jackie did not attempt any sketches and did not offer any evidence of fully comprehending the claim, and as a result she offered nothing but vague suggestions for how to respond to Jasmine.

Problem 9 involved a student proposing an erroneous method for finding the perimeter of an irregular shape (drawn on a grid): counting the number of square units. Because Jackie had an insufficient understanding regarding units of measure, her diagnosis missed the heart of the error. The student actually seemed to understand perimeter; his confusion involved using the wrong unit (i.e., the square unit) to measure it. Jackie's own confusion left her without a solid foundation from which to explore, diagnose, and then respond to the student and his thinking.
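To see why counting squares cannot stand in for perimeter, consider a rectangle placed on a grid (the 3 × 6 rectangle below is a hypothetical illustration; the pretest shape itself was irregular and is not reproduced here):

```latex
\underbrace{2(3+6)}_{\text{perimeter}} = 18\ \text{linear units}
\qquad\text{vs.}\qquad
\underbrace{3 \cdot 6 \;-\; 1 \cdot 4}_{\text{border squares}} = 14\ \text{squares}.
```

The counts disagree because each corner square contributes two unit edges to the boundary while every other border square contributes only one; the student is tallying 2-dimensional pieces when the measure he wants is a 1-dimensional length.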
Ironically, Jackie's drawing of a 3 × 6 array on problem 6 actually earned her a higher rubric score (for including a conceptual approach), even though the subsequent interview revealed she did not possess the mathematical understanding to make good pedagogical use of the array. Such instances also help explain how Jackie had a higher KoST score on the pretest (5 points higher) than she did for CK. Initially, at times, inferior CK was easier to identify, and to score or code, than was inferior KoST. A PST could provide what appeared on the surface to be evidence of expert KoST; for example, they might write that students often struggle with certain concepts regarding area and perimeter (e.g., linear and square units), and such an acknowledgment would earn various expert codes (see Table 3, p. 166). However, it could be possible (and many times was) that that same PST did not possess the necessary CK to adequately explain those concepts to the student or to address the student's shortcomings and misconceptions (i.e., an equally incomplete KoST). The next section presents findings that demonstrate how this relationship developed over the course of the planned intervention.

The CK-KoST relationship: Emergent findings. The primary means of strengthening the PSTs' CK and KoST regarding units of measure and perceived relationships were the three teaching episodes (TEs). Teaching episode 1 focused on units, and TEs 2 and 3 addressed perceived relationships involving area and perimeter. Tables 25 and 26 (two of 16 such tables consulted while organizing findings for research question 5) provide evidence of this growth. Note the low frequency of b (or expert) codings assigned during the TE, but how they increased on the posttest. The progression regarding perceived relationships was even slower to develop: there were many more novice (a) codes assigned to responses within TEs 2 and 3 than to TE 1, and also fewer expert (b) codes awarded.
Table 25

Sample of Expert/Novice Codings Relevant to Units of Measure Analysis Strand (CK)

         Pre-Intervention   Intervention             Post-Intervention
         (Pretest)          (Teaching Episode 1)     (Posttest)      (Follow-up)
         Q1   Q3   Q4       Q2   Q3   Q4   Q6        Q1   Q3   Q4    Q1   Q3   Q4
a Sum     6    9    5        1    1    1    0         4    3    4     3    4    4
b Sum     0    0    0        0    0    0    1         2    3    2     2    2    2

Note. An a signifies a novice response and b signifies an expert response (see Table 3); the a Sum and b Sum rows give, for each question, the total number of novice and expert codes (codes 1-17) assigned.
Table 26

Sample of Expert/Novice Codings Relevant to Units of Measure Analysis Strand (KoST)

         Pre-Intervention   Intervention                   Post-Intervention
         (Pretest)          (Teaching Episode 1)           (Posttest)   (Follow-up)
         Q6   Q7   Q9       Q5   Q7   Q8   Q10   Q11       Q7   Q9      Q6   Q7   Q9
a Sum     6    7    4        2    3    1    1     0         2    3       2    3    3
b Sum     1    0    0        0    0    0    2     0         5    3       2    3    2

Note. An a signifies a novice response and b signifies an expert response; the a Sum and b Sum rows give, for each question, the total number of novice and expert codes (codes 1-17) assigned.
Table 13 (p. 251) reveals similar trends in expert/novice frequencies during the TEs for Jackie and the other group members currently being discussed. The intervention was designed around discovery learning, so PSTs progressed at their own rate. By the time the posttest was given, the PSTs had experienced three TEs and had had multiple opportunities to refine both their CK and their KoST.

There are two emergent findings from TE 1 relevant to the CK-KoST relationship. The first involved Jackie's handling of what was essentially the same problem she had faced in question 9 on the pretest, a problem she had actually performed better on. During the TE she had little difficulty agreeing with the student's method of measuring perimeter with square units. Her wrong diagnosis was based on the fact that she had incorrectly calculated the perimeter of the irregular shape earlier in the problem. Her feeble CK about perimeter and units led her to accept the student's erroneous method, which resulted in a lost opportunity for a successful intervention. Later in the TE, when it became evident to her that the method was wrong, she had another opportunity to apply her KoST in responding to the class. Reading her lengthy response without being aware of her previous struggles, one might be impressed with her suggestions of bringing the problem before the class for discussion, using the Shape Builder microworld (MW) to display the irregular figure in front of the class, and then having students provide reasons why they agreed or disagreed. Jackie did seem more concerned with promoting understanding than with simply dismissing the student's method, and this earned her expert codes; however, her desire to promote understanding proceeded no further than
her good-sounding instructional strategies. There was no mention of the source of the student's confusion (i.e., units of measure) beyond a general suggestion of reviewing important concepts regarding area and perimeter. In other words, Jackie's well-intentioned response to the class fell short because her CK regarding these concepts was still unorganized and unable to properly inform her KoST.

The CK-KoST interactions were very similar for TEs 2 and 3. On her own, Jackie was not able to advance her CK very far during TEs 2 and 3. She increasingly explored the various misconceptions on her own, especially through the MWs, but she often became distracted with side issues (e.g., finding the area of the footprint in TE 2) that kept her from fully deciphering the misconception so she could properly respond to the student. Through her writings and interviews Jackie indicated that she had gained knowledge from the small-group and whole-class discussions embedded within the TEs. Several other PSTs took occasion to share all they felt they had learned throughout the TEs.

Concluding the pre-intervention and emergent findings, there are four results that illustrate how Jackie's weak CK affected her KoST: (a) she did not possess the necessary mathematical vocabulary to support explanations, (b) it (her weak CK) interfered with her ability to effectively diagnose student errors and misconceptions, (c) it limited her instruction and intervention to procedural methods and responses, and (d) it hindered her capacity to fully utilize the features and educational benefits of instructional technologies (e.g., microworlds). As the testimonies have indicated, the various activities contained within the TEs helped to improve the standing of Jackie and the others in the group, and they performed much better on the posttest.
The CK-KoST relationship, post-intervention. The preceding sections showed how a PST's impoverished CK can and does affect the usefulness of their KoST. What about when these knowledge types appear to mature together? How do the increases interact? The post-intervention increases have been presented (e.g., Tables 13, 14, 16, 25, and 26) and illustrated (Figure 37), but how and in what ways did they occur? Did they occur in conjunction with each other, or were there times of disconnect (i.e., CK improving with KoST lagging behind)?

There was evidence from the posttest that Jackie's CK had not only increased but had also changed. On the pretest, when Jackie and others ran into problems that were unfamiliar to them, they would either leave them blank (e.g., PST #5 and problem 4) or write something incorrect and/or unrelated. On the first problem of the posttest, Jackie no longer guessed blindly: she attempted to solve the problem through a conceptual approach, though in the end she resorted to a procedural, formula-based one, and she reported that this time she understood what was being asked. Although her actual understanding will be shown to be still incomplete, her self-professed confidence was due to a more stable CK of basic area and perimeter facts and concepts. For example, Jackie (and others) exhibited a new awareness of the discreteness and defining characteristics of linear and square units (see Figure 38). This aspect of her improved CK allowed for better clarity and mathematical vocabulary while unpacking and explaining her ideas. It also facilitated more conceptual solution methods.
Figure 37
Figure 38

Similar to the pretest, problem 3 on the posttest had the PSTs find the perimeter and area of an irregular figure. On the pretest, Jackie had tried to apply formulas to find the perimeter and area and was unsuccessful on both. On the posttest, she focused on the linear units around the outside of the figure and the square units on the inside, and her accompanying explanations included helpful diagrams with meaningful dialogue.

Jackie's more stable CK also contributed to more successful handling of area-perimeter relationships. Instead of trying to describe the various relationships presented within the problems (e.g., fixed and direct) with words alone, as she did on the pretest, Jackie now supported her responses with ample diagrams. On the pretest, she provided only one (rather vague) diagram while explaining her thoughts and ideas; for the posttest, Jackie included 19 appropriate diagrams. That awareness of the importance of including representations when explaining mathematical principles and relationships showed a significant increase in her KoST.
These improvements extended to the KoST items as well. Jackie successfully diagnosed all five of the erroneous student methods presented in the posttest problems, which in and of itself could result in higher rubric scores and greater frequencies of expert codes; but now her responses were also much more organized and addressed concepts central to each problem. For example, on the pretest, when a student suggested finding the perimeter of an irregular figure by counting the square units around the inside border of the shape (i.e., problem 9), Jackie said the method was wrong, but her subsequent response to the student was shallow: it involved a basic review of how to find area and perimeter and included no mention of the appropriate unit for each measure. That would have been meaningful, since the student was using square units (a 2-dimensional concept) to find perimeter (a 1-dimensional concept). A similar problem on the posttest (#7) involved a student (Jose) who was asked to draw a rectangle with a perimeter of 24 units; Figure 39 shows Jose's response. Every PST indicated that the student was incorrect, and most (including every member of the group) responded in ways the student could benefit from. Jackie was again representative of her group.

Figure 39
Her intervention with Jose, centered on how to correctly measure the perimeter of the shape, characterized a much more classroom-useful KoST, and based on comments made during her second interview, it appeared her KoST regarding student thinking had matured. Her diagnosis of Jose's thinking was telling: "He was thinking that way because those squares are on the outside of the shape . . ."

This new CK-KoST partnership was also evident when dealing with student thinking about perceived relationships. Problem 6 on the posttest will conclude the findings regarding the group. It addressed the direct relationship misconception, which proved to be relatively troubling to the PSTs. Responses to this misconception have been examined repeatedly in this chapter; on the pretest, the PSTs either agreed with the student (as Jackie did) or they disagreed without providing any counterexamples or meaningful follow-up with the student. Facing it again in TE 3, Jackie initially struggled with the relationship but by the end seemed to reconcile the competing ideas. The group as a whole had remedied its shortcomings regarding the direct relationship misconception to the point where, on the
posttest (see Appendix H), the growth was evident: many in the group, including Jackie, illustrated with a counterexample why a smaller perimeter will not always produce a smaller area. Jackie and others also displayed a deeper understanding of when this relationship could be true under the right conditions, in addition to identifying the common tendency of students to over-generalize after seeing only one example of a mathematical relationship. Jackie's student-centered approach was geared toward understanding, rather than simply disproving the student or getting the right answer. Jackie was asked during the second interview about her apparent new level of confidence displayed on the posttest regarding this problem:

Well, I hit the thing where she has to provide more than one example. You know how before we were saying that it would be a pretty absolute statement for the student to make their claim with only one example. Then I would propose her [idea to the class] and have them play [with it]. This time I understand, so I felt I could do that. Now that I understand, I thought that would be a good way to go.

Jackie's responses on the follow-up test revealed that these changes were not short-term.
One aspect of the PSTs' responses to students that continued to lag, on both the posttest and the follow-up, was the integration of MWs as an instructional tool. Problem 6 (as well as 7, 8, and 10) involved misconceptions or erroneous claims that could have been disproved and then explored effectively with the assistance of either of the MWs used in this study; however, the MWs received little reference in the posttest and follow-up test responses. That was most likely due to time constraints. The question at this point in the study was whether the PSTs possessed the CK and the KoST to appreciate and effectively use the MWs as an instructional tool. Part of that question was answered on the posttest: on the last problem, Jackie signified that she would use the MW in the part of the question where it would complement her response, which was more evidence of a maturing KoST.

The post-intervention changes in Jackie's CK and KoST have now been described, along with how they appeared to interact. All four of the earlier findings regarding the impact of her weak CK on her KoST need to be modified to reflect how a more robust CK had influenced her KoST: (a) it supplied the necessary vocabulary to enhance her explanations, (b) she was much more capable of consistently diagnosing student errors and misconceptions, (c) her explanations now included multiple entry points and tended to focus on conceptual approaches, and (d) it increased her awareness of the benefits of instructional technologies (e.g., microworlds) for helping struggling students. In conclusion, the proposed relationship did appear to behave in many of the ways anticipated by the researcher: there appears to be a mutually beneficial interaction between advances in CK and KoST.
CHAPTER 5

SUMMARY, CONCLUSIONS, AND IMPLICATIONS

This study examined the levels of content knowledge and knowledge of student thinking related to area and perimeter of an intact group of preservice elementary teachers (PSTs) as they engaged in anchored instruction involving web-based microworlds. In particular, it focused on their understandings, misconceptions, written and verbal explanations of that knowledge, and achievement on written area and perimeter tests within the context of a mathematics methods course. In short, this study sought to (a) describe the PSTs' conceptions of area and perimeter and how they change and develop through planned intervention, (b) describe the PSTs' knowledge of student thinking (KoST), and (c) develop a form of anchored instruction involving web-based microworlds designed for exploring area and perimeter. That framework focused on situated problem solving and provided a learning environment for both individuals and groups.

This chapter contains three sections.
The first section presents a summary of the study's findings, and the final section discusses the implications of the research findings for teachers, teacher educators, and future research.

Summary of Findings

This summary comprises two main sections. The first provides a comprehensive look at findings related to all PSTs, organized around the two main strands of inquiry used throughout the study (i.e., units of measure and perceived relationships). The second focuses on the four case subjects and provides individual learning trajectories involving: (a) their knowledge prior to any intervention, (b) their reactions during the intervention, and (c) the changes in their knowledge following the intervention. These findings are framed by the study's research questions. The primary research question examined by this study was: In what ways do preservice elementary teachers' content knowledge and pedagogical content knowledge, related to area and perimeter, change as a result of experiencing anchored instruction integrated with web-based microworlds designed for the investigation of area and perimeter? In particular:

1. What is the PSTs' content knowledge of area and perimeter prior to involvement in the teaching episodes?

2. What is the PSTs' knowledge of student thinking related to area and perimeter prior to involvement in the teaching episodes?

3. How does the PSTs' content knowledge of area and perimeter change, if at all, during the course of this study?
4. How does the PSTs' knowledge of student thinking related to area and perimeter change, if at all, during the course of this study?

5. In what ways is the PSTs' knowledge of student thinking related to area and perimeter related to their content knowledge of those same concepts?

Pre-Intervention CK and KoST: Research Questions 1 & 2

The first two research questions addressed the PSTs' knowledge related to area and perimeter prior to intervention. Nearly every PST possessed an incomplete CK regarding these concepts. Because of the important role CK plays in the organization of KoST, greater emphasis was placed on the analysis of the PSTs' pre-intervention CK than on their lack of pre-intervention KoST.

General CK Regarding Area and Perimeter

Although area and perimeter are used for different applications, they do have similarities, and it is these similarities that make the concepts of area and perimeter susceptible to confusion. If someone possesses an incomplete or strictly procedural knowledge of area and perimeter, then it is understandable why they could confuse the two. When considering rectangles (the primary shape discussed in this study), determining area and perimeter both involve calculations with the lengths of sides. A conceptual understanding of area and perimeter better equips both the student and the teacher to more consistently perform the correct measurement. Although each measure involves a calculation with lengths of sides, area and perimeter also require attention to their appropriate units (i.e., linear or square). These concepts are intrinsically linked, and a PST with a profound CK and KoST realizes the importance and value of incorporating linear and square units within discussions involving area and perimeter.
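Stated compactly for a rectangle with side lengths l and w (a standard formulation, included here only as a recap):

```latex
A = l \cdot w \ \ (\text{square units}), \qquad P = 2(l + w) \ \ (\text{linear units}).
```

The two calculations draw on the same two numbers, which is precisely why attention to the unit, square versus linear, is what keeps the two measures distinct.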
Distinguishing between area and perimeter. Early on in the study it was apparent that most of the PSTs possessed a procedural knowledge of area and perimeter. The majority of them could state the basic procedure for finding each measure, but most PSTs were bound, even handicapped, by a dependency on formulas, and the result was a limited ability to explain what each measure means. They seemed completely unaware of the various misconceptions students encounter when working with these concepts. When asked what would be difficult for students, the 12 responses were varied, but the vast majority of them (9 out of 12) were along the lines of students struggling to remember the formulas. The few responses that went beyond a surface-level answer included aspects of conceptual understanding, but the majority of PSTs indicated that remembering procedures would be the primary source of difficulty for students, in contrast to understanding the concepts.

CK Regarding Units of Measure

The importance of possessing a conceptual understanding of linear and square units cannot be overstated. The unit of measure functions as a conceptual bridge connecting an object and the number used to represent its size, a point emphasized by Hiebert (1981). As reported by research with school students (Chappell & Thompson, 1999; Kamii, 2006), it was difficult at times in this study to distinguish whether the PSTs were confusing area and perimeter, linear and square units, or both.
Inattention to units. Most of the PSTs addressed concepts of area and perimeter without any discussion of their appropriate units. This oversight contributed to PSTs' difficulties with a problem whose figure was drawn on a grid that was provided; eight out of 12 PSTs offered a response that addressed, to different degrees, concepts related to area. Likewise, insufficient attention to units resulted in several of the weaker-performing PSTs (e.g., Jackie and Larry) struggling with various aspects of irregular shapes (especially perimeter). These PSTs often attempted only procedural methods (typically involving a formula) to find the area and perimeter of irregular shapes. Even the higher-performing PSTs (e.g., Brianna and Grace), although more mathematically accurate in their responses, were also very procedural in their approaches to finding area and perimeter and in their attempts to coherently explain concepts related to area and perimeter.

Ability to explain and illustrate units of measure. Mathematical procedures, although effective at producing answers, typically do not inherently convey conceptual understanding of a construct. The area formula for a rectangle is a prime example. Instead of actually explaining the distinguishing characteristics of linear and square units and providing classroom-useful and practical examples, most PSTs (including Larry and Jackie) simply explained how the units are used (i.e., linear units are used with perimeter and square units with area). The PSTs' difficulty connecting area with its appropriate unit was revealed in question 6 of the pretest, when only four out of 12 PSTs identified the square centimeter (cm²) as the appropriate unit.
PSTs possessing a stronger mathematical knowledge (e.g., Brianna) seemed better able to coherently and accurately distinguish between linear and square units. Overall for those PSTs, their pre-intervention CK regarding these concepts was sufficient to get correct answers, but it was procedural in nature and application. A strong mathematical acuity was not sufficient to facilitate conceptual explanations or the illustration of ideas regarding units of measure.

Utilizing drawings. Teachers need to be able to explain concepts in meaningful ways (i.e., through their explanatory framework) using effective communication, and incorporating suitable drawings is one important aspect of a successful explanation that can help future students better understand. Although 9 out of 12 PSTs made reference to drawing a picture or bringing in objects for display, only four provided any type of drawing to represent their ideas. Drawings were overlooked while addressing basic as well as more obscure ideas regarding area and perimeter. Most PSTs indicated that conceptualizing and explaining linear and square units was difficult for them; however, only one of the 12 PSTs even attempted to draw a figure as a means of helping to visualize and/or explain these difficult concepts. Even when the PSTs were struggling to express meaningful thoughts and ideas, as evidenced by their scored responses, they frequently would not resort to a drawing, either to help themselves visualize the concept or to aid in the effective communication of their ideas. Out of 48 potential opportunities on the pretest (12 PSTs × 4 problems), only five drawings were provided that accompanied a meaningful and correct response. This pattern was also evident when the PSTs tried to explain their thinking regarding certain perceived relationships between area and perimeter: although 8 of 12 PSTs did provide diagrams to support their explanations,
only two of them were suitable for classroom use. On some problems it appeared the PSTs were not prepared to construct a meaningful drawing, while at other times the PSTs were careless and drew rectangles that were not to scale and thus were not helpful in facilitating a correct response. In all, the ineffective use, or outright lack, of drawings to assist in problem solving or to clarify explanations was evidence of CK that lacked a well-developed explanatory framework.

CK Regarding Perceived Relationships Between Area and Perimeter

The perimeter and area of a figure are two different measures: the perimeter is a measure of the length of the boundary of a figure, whereas the area is a measure of how much space the figure occupies. In the case of a rectangle, the calculations of both measures are related to the sides of the figure. These similarities provide the setting for two classic misconceptions involving the area and perimeter of a rectangle: (1) that increasing the perimeter of a rectangle will always increase its area (i.e., the direct relationship misconception), and (2) that rectangles with the same perimeter measurement will also have the same area, and vice versa (the constant, or fixed, relationship misconception).
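Both misconceptions collapse under small counterexamples; the rectangles below are illustrative choices and are not drawn from the study's instruments. For the constant relationship misconception, for instance:

```latex
6 \times 2:\ P = 16,\ A = 12; \qquad 4 \times 4:\ P = 16,\ A = 16; \qquad 8 \times 2:\ P = 20,\ A = 16.
```

The first two rectangles share a perimeter of 16 but have different areas, and the last two share an area of 16 but have different perimeters; a pair such as 6 × 1 (P = 14, A = 6) and 3 × 3 (P = 12, A = 9) similarly refutes the direct relationship claim, since the smaller perimeter accompanies the larger area.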
When presented with a problem on the pretest containing the direct relationship misconception, four out of 12 PSTs (including Larry and Jackie) indicated that the student's claim was correct. Their explanations tended to be based on the incorrect assumption that increasing the perimeter of a rectangle must increase both dimensions and thus the area. Another five PSTs, although they disagreed with the student in the problem, were unable to provide an appropriate counterexample. All nine of these PSTs displayed the lowest levels of thinking regarding this misconception (Ma, 1999). Only three PSTs (including Brianna) successfully examined other possible relationships beyond their initial reaction.

The pretest problem involving the fixed relationship misconception produced similar, mostly unsuccessful, results. Problem 10 on the pretest (p. 245) provided an opportunity for PSTs to share their understandings regarding different-sized rectangles (i.e., different areas) with the same perimeter. The fact that only four out of 12 expressed an awareness of a common student tendency to erroneously think that equal perimeters will result in equal areas indicates that the majority of the PSTs were not aware of the fixed relationship misconception. Another relationship contained within this problem was that squares are special rectangles; no PSTs acknowledged this hierarchical relationship or considered it relevant to the problem. In general, the PSTs' pre-intervention CK was not sufficiently organized to enable them to consistently understand and diagnose student thinking or to appropriately respond to student difficulties.

Pre-Intervention KoST

On the pretest, most PSTs anticipated that students' difficulty would lie in remembering the formulas for area and perimeter, in contrast to understanding the concepts; they were concerned that most elementary students would have difficulties with all the formulas. Before the intervention, many PSTs indicated a lack of confidence in mathematics and limited experience in diagnosing student thinking related to mathematics; therefore, when faced on the pretest with a problem-solving situation involving erroneous student thinking, they tended to focus on solving the problem (i.e., finding the answer), to the neglect of the student's thinking.
PSTs who focused solely on the rightness or wrongness of an answer did not adequately explore the mathematics surrounding the problem or the misconception (i.e., look for a counterexample). Few PSTs thought to allow students opportunities to personally work through the various mathematical concepts of a problem, and no PST displayed the wherewithal to encourage students to investigate further with manipulatives or technology.

Summary of Emergent Findings: Impact of Intervention

These findings came primarily from the three teaching episodes (TEs), and they include a discussion of the impact of microworld usage.

The Teaching Episodes

TE 1: Units of measure. The PSTs performed relatively well on the CK questions related to TE 1; out of 12 PSTs, only Jackie and one other PST did not initially conclude that the student's method was flawed. Incomplete knowledge, along with a limited capacity to apply their CK and adequately address the struggling student (Justin) in the TE, resulted in much higher novice frequencies related to the KoST questions. This gap was seen often within the findings as a dividing line between novice and more expert responses. The majority of PSTs avoided discussing important terms, such as linear and square units, and how Justin was incorrectly using square units to measure perimeter. Aspects of their KoST did begin to show improvement: several PSTs were creative in offering alternative illustrations to help Justin better understand perimeter (e.g., fences,
pieces of string), but only two PSTs (one being Brianna) actually discussed the most important issue behind Justin's method: his confusion of linear and square units, and how one measures perimeter while the other measures area.

TE 2: The fixed relationship misconception. For several PSTs (e.g., Jackie), their CK regarding certain facts and concepts related to relationships between area and perimeter, specifically the fixed relationship misconception, had increased; however, their ability to clearly explain their knowledge had not developed to the same extent. As was the pattern with most unsuccessful responses in this study, no sketches were provided by the three PSTs who indicated they did not know how to solve the problem. Many PSTs lost focus while diagnosing the student's method, and TE 2 offered a prime example of how a wrong focus by PSTs can result in poor diagnosis of student misconceptions and missed opportunities to address those difficulties. Findings showed that the PSTs who struggled most throughout TE 2 were also the ones who excessively focused on trying to find the area of the footprint, and as a result paid too little attention to the student's claim and the thinking behind it. It appeared that several PSTs (e.g., Jackie) had difficulty translating the student's method into a mathematical relationship that could be tested. It was apparent that those PSTs who were not able to explore the problem deeply on their own also had difficulty responding to the fictitious student in meaningful ways; whereas those with a better understanding of the mathematics surrounding TE 2 (e.g., Brianna) were more confident and adept at suggesting how best to engage both the student and the entire class in a discussion of the misconception. Overall, a preoccupation with finding
the answer hindered the PSTs' ability to properly diagnose and address the student's thinking and limited their meaningful interaction with the Shape Builder MW.

TE 3: The direct relationship misconception. It was common for many PSTs (especially the poorer performing) to believe, while examining the various student claims in this study, that if enough examples were presented then a claim could be either proved or disproved. A limited background in mathematics led most of these PSTs to view the role of examples as a way to prove something, rather than as an illustration of a numerical relationship. They did not, or possibly could not, appreciate the need for a mathematical argument in such cases. Overall, the PSTs attained higher levels of understanding (Ma, 1999) regarding the misconception that there exists a direct relationship between perimeter and area. Table 23 (p. 343) shows that while only one PST achieved a Level 1 understanding (out of 4) during the pretest, 10 out of 12 PSTs reached at least Level 1 during TE 3, including three Level 2s and one Level 3.

The discussions during TE 3 also produced gains in pedagogical clarity, even for those who initially struggled. A comment made by Jackie during a teaching episode involved her belief that it might help students resolve area and perimeter conflicts if the concepts were studied simultaneously. Her view displayed relatively expert pedagogical content knowledge, shared by several researchers (Chappell & Thompson, 1999; Hiebert & Lefevre, 1986; Simon & Blume, 1994a).

Impact of Microworld Usage

Realizations such as "I saw the error in the student['s thinking]" occurred while PSTs interacted with the MWs. Table 21 (p. 317)
summarizes how the PSTs' reported uses of MWs ranged from a means of confirming their own CK to a way of deciphering student thinking. When asked in TE 1 whether the MW was helpful in deciphering the student's method, one PST explained that she had at first lost focus and counted squares to find the perimeter; as she drew the figure in the microworld, she began to realize that she, like Tommy, was overthinking the problem, noting that "the string distracted me from realizing sooner that perimeter does not determine . . ." Reflections such as these show how the PSTs' CK grew as a result of interacting with the MW and also how they were gaining a vision for how to use the MW as a tool to help diagnose student thinking.

Findings related to how the PSTs proposed using the MWs with the students presented in the TEs, as well as with their classmates, revealed mixed results. For the first teaching episode (the easiest of the three), the vast majority of the PSTs (11 out of 12) indicated they found the microworld helpful to their own understanding and that they would use the microworld as an instructional tool in a whole-class discussion. A similar majority (10 out of 12) indicated they believed classroom students would benefit from personally interacting with the MW in a more controlled setting. However, an unexpected trend developed as the mathematical content of the teaching episodes became more elusive. Although the number of PSTs who indicated they learned with and/or saw benefits of personally interacting with the microworlds remained a strong majority (8 for TE
#2 and 11 for TE #3), fewer (five for TE #2 and six for TE #3) said they would incorporate the microworlds when instructing future students about the concepts presented in the TEs, even though the same PSTs admitted those future students would most likely possess misconceptions similar to those of the hypothetical students presented in the teaching episodes. Apparently, the majority of PSTs felt the microworlds were a valuable learning tool for themselves but not for their future students. This trend may be partially explained by the following quotation from one of the PSTs: "[Working] with microworlds still seems slightly foreign to me, since it was in this class that I received my first opportunity to use an applet. I have found the applets helpful in supporting or refuting theories proposed [in class]." Contrast that with a quotation from one of the higher-achieving PSTs: during her second interview, Grace described what she perceived as the value of examining the area and perimeter misconceptions:

Working through some examples of what kids were thinking when they figured out the problems, and just having all those examples, I think was very beneficial. Instead of just learning the concepts and how to do them, you need to be challenged. [You have to ask yourself: ho]w are you going to deal with it? T[hat, to me,] was how to deal with the way the kids might think, and how they might be thinking.

Post-Intervention CK and KoST

The findings presented in chapter 4 related to research questions three and four were quite extensive.
To facilitate cohesion, concise summaries highlighting post-intervention CK and KoST findings are presented here; readers interested in deeper discussion of any finding are encouraged to reference chapter 4.

Descriptive Findings

The posttest mean of 28.25 represents a 33% increase over the pretest average score of 21.25 (range = 0-40). The entire class decreased its total number of unacceptable scores (0s, 1s, and 2s) from 74 on the pretest to 35 on the posttest. There were seven 4s (model scores) assigned on the pretest, but 19 on the posttest. There were fewer novice codes assigned, and the number of expert codes increased by over three times, the greatest positive change. The relatively low frequency of code 7b (i.e., the ability to generate appropriate representations) assigned to the PSTs' responses revealed a notable gap in their KoST, because they apparently did not realize the importance of diagrams in presenting conceptual explanations of mathematical concepts. This tendency was repeated in a very low rate of code 12b (i.e., the appropriate use of manipulatives) and the total absence of code 13b (i.e., the appropriate integration of technology to promote learning), the latter especially troubling given the tremendous focus placed upon the two microworlds used in this study.

Changes in CK: Research Question 3

Positive change was seen quantitatively. Table 14 (p. 256) illustrates that the CK of 9 of the 12 PSTs increased from pretest to posttest. The features of the PSTs' CK also changed. Table 16 (p. 261) reveals how the CK of all 12 PSTs experienced increases from pretest to posttest in the number of expert-like characteristics assigned to their written responses.
The PSTs' understanding of units of measure became clarified throughout the study. PSTs showed a greater propensity to include and discuss the correct unit of measure when solving area and perimeter problems. This was especially evident when they worked with irregular shapes: on the pretest, confusion regarding what a linear unit is caused several PSTs to calculate the perimeter of an irregular figure incorrectly, but that difficulty was almost nonexistent on the post and follow-up tests. Conceptual approaches aided in gaining new knowledge about finding the area of irregular shapes and about how the focus should be on counting square units instead of on formulas.

Procedural versus conceptual knowledge. There was a noticeable shift in the type of CK being displayed, from a procedural, formula-based approach to a more conceptual one. Procedural CK dominated pre-intervention thinking; however, a slow transition to more conceptual approaches began to surface during the teaching episodes and was much more evident on the post and follow-up tests. Grace's strong mathematics background, for example, facilitated predominately procedural responses on the pretest, but during and after the intervention she was more prone to support her procedurally correct responses with conceptual elements (e.g., she would discuss and illustrate units when explaining her answers regarding area and perimeter).

Ability to explain. Promoting understanding became equally, or in some cases more, important to the PSTs than simply finding the right answer. This newfound appreciation of conceptual understanding helped PSTs solve non-traditional problems, such as finding the area of a footprint, and more importantly it facilitated more powerful explanatory frameworks. The explanations regarding relatively difficult concepts, such as linear and square units, grew in clarity and thoroughness as a result of the PSTs experiencing the interventions. A problem on the follow-up test required drawing a polygon that had a perimeter of 24 and then justifying that response. Six responses offered a suitable justification, and three PSTs were even more precise, explaining that the perimeter of their shape could be found by counting the outside linear units. CK containing rich dialogue such as this was, for the most part, absent before the intervention.
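A justification of the precise kind described might look like the following (the 6 × 6 square is a hypothetical example, not taken from a PST's paper):

```latex
P = 6 + 6 + 6 + 6 = 24\ \text{linear units}.
```

Each side contributes six unit lengths, so a student can verify the total by counting the 24 unit edges along the boundary rather than by invoking a formula.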
Utilizing drawings. One way the PSTs communicated their newfound CK was through an increased use of classroom-appropriate drawings on the post and follow-up tests; they drew on their CK more readily when explaining their ideas and solution strategies. Table 19 (p. 300) reveals an increased use of drawings following the intervention. Out of 48 potential opportunities (12 PSTs × 4 problems) to use drawings on the pretest, 16 (33%) drawings were attempted, but only five (10%) accompanied a meaningful and correct response. The rate of drawings provided increased for the posttest: there were 72 reasonable opportunities (12 PSTs × 6 problems) to incorporate a drawing, 42 (58%) drawings were provided, and of those, 27 (38%) assisted in achieving a correct response. Use of drawings on the follow-up test increased very slightly (+2%). An apparent pattern in Table 19 was that certain PSTs tended to use drawings more consistently than others. For example, following the pretest both Jackie and Brianna began incorporating drawings in their responses on a more regular basis, whereas Grace and Larry did not. The use of drawings was not directly connected to performance: Grace was one of the top performers in the study but hardly ever used drawings to communicate her ideas, whereas
PST #6, another top performer, effectively used drawings on the post and follow-up tests. Jackie provided only one (rather vague) diagram on the entire pretest to help support her explanations; for the posttest, she included 19 appropriate diagrams. These findings illustrate how the explanatory framework component of the PSTs' CK developed over the course of the study.

CK Regarding Perceived Relationships

The PSTs' understanding of the most elusive misconception (i.e., the direct relationship between area and perimeter) grew, as shown by their rising Levels of Understanding of that relationship. More PSTs became able to translate a student's method (or claim) into a mathematical relationship that could then be verified, disproved, or qualified. The highest level was reached by only three PSTs (namely Brianna, #6, and #10), who explored the various relationships in which the claim would or would not hold; most PSTs in this study simply stopped exploring after discussing their initial reaction. Many of these PSTs did not appear self-motivated to delve far beyond providing one possibility for the stated question, very often the same one they had given in past, similar situations. Instead of investigating the various possibilities surrounding this misconception, the majority would give the same, or a very similar, answer as they had previously and continued to operate within their CK comfort zone. Throughout the study, only one PST (#1) was not able to display some measurable increase in her understanding of the direct relationship misconception.

Problem 10 on the follow-up test provided an opportunity for PSTs to share their understandings regarding different-sized rectangles (i.e., different areas) with the same
perimeter. Four out of 12 PSTs on the pretest expressed an awareness of a common student tendency to erroneously think that equal perimeters will result in equal areas, and three PSTs made the same acknowledgement on the follow-up test. This finding indicates the majority of the PSTs were still not perceptive of the fixed relationship misconception even after intervention. Another relationship contained within this problem was that squares are special rectangles. No PSTs acknowledged this hierarchical relationship on the pretest, but on the posttest nine of the 12 PSTs included a 4 × 4 shape in their list of rectangles.

Changes in KoST: Research Question 4

The pedagogical component of KoST made it slightly more challenging than CK to isolate, quantify, and describe how it changed during the study. In spite of that, the PSTs' ability to understand and address the shortcomings and misconceptions of students (i.e., their KoST) changed in positive ways within the context of this study, in different ways and to varying degrees.

Positive change was seen quantitatively. Table 14 (p. 256) illustrates that the KoST sub-test scores of 9 of the 12 PSTs increased from pretest to posttest. The quality of the PSTs' responses also improved: Table 16 (p. 261) reveals how the KoST of all 12 PSTs increased in the number of expert-like characteristics assigned to their written responses from pretest to posttest.

Precisely how the KoST changed was discussed in great detail in chapter 4; the changes included, but were not limited to: (a) an increased awareness of common misconceptions students have
regarding area and perimeter, (b) a development and restructuring of their mathematical vocabulary (relative to the concepts in this study), (c) a realization of the value of discussing and illustrating individual units of measure when explaining area and perimeter concepts, (d) increased use of drawings when communicating ideas to students, (e) a movement away from procedural and teacher-centered interventions toward more conceptual explanations and student-centered activities (e.g., PSTs showed an increased understanding of how and why to integrate MWs to help build conceptual knowledge), and (f) an increased focus on diagnosing student difficulties and less of an emphasis on solving problems and finding answers.

The PSTs also shared their thoughts regarding the perturbations purposely placed within several test problems. Several noted that certain aspects of various test questions (e.g., Figure 35, p. 318) should be changed or removed, yet it was those very aspects of the problems that served as a catalyst to promote intellectual struggle, reflection, and a newfound understanding regarding a certain concept. Apparently, several PSTs viewed such conflicts as too troublesome for elementary students, unknowingly failing to acknowledge the motivating nature of true problem solving. Similar objections were not expressed while working on the scenarios presented in the TEs; possibly the timed element of the tests or the interviews contributed to that difference.

In summary, the planned intervention of this study appeared to play a role in the PSTs becoming more perceptive of subtly difficult mathematics involving area and perimeter (e.g., linear and square units and the fixed and direct relationship
misconceptions) and better equipped to anticipate and address those difficulties with future students. Their CK and KoST grew, in varying quantities and qualities, after their involvement with the anchored instruction.

Case Subject Summaries
Four case subjects were identified and examined in depth to gain insights about the range of knowledge of PSTs in the class. Grace and Brianna represented PSTs with above average cognitive and mathematical ability, and Jackie and Larry were representative of PSTs possessing average to below average ability in mathematics and cognitive processes. The case summaries involve: (a) their knowledge prior to any intervention, (b) their reactions during the intervention, and (c) the changes in their knowledge following the intervention.

Larry
Larry's CK regarding area and perimeter was sparse in amount and poorly organized at the beginning of the study. Initially, he displayed a rules-orientated approach to area and perimeter, an inability to consistently focus on the correct unit of measure, and a tendency to respond to superficial features of a problem. In addition, he struggled when asked to explain his responses; in fact, during interviews he would often contradict himself. His KoST was similarly underdeveloped. He was ill prepared to consistently construct meaningful and/or accurate drawings, which limited the degree to which he could respond to student difficulties. His overall CK and KoST prior to intervention can be characterized by a comment he made during his first interview.

Larry often appeared confused or distant during discovery learning sessions. For example, TE 3 allowed the PSTs to use either MW right from the start. I observed Larry open a MW, create the shapes presented in the focus problem (p. 136), and then stare at the computer screen for several minutes, occasionally glancing at the first page of the TE. The scenario presented in TE 3 resulted in most PSTs exploring and testing hypotheses in the MWs, but Larry appeared to disengage when there was a need to address concepts he found difficult. When he was able to grasp the mathematical underpinnings of a concept, he rarely ventured beyond that knowledge. At times he appeared distracted by the MWs; when he did incorporate the MWs in his responses, it was to permit him to view examples quickly so that he could efficiently arrive at an answer. He appeared to be content with getting what he thought was the answer and with responding to students by attempting to guide them to right answers.

Larry did not experience great success with the independent learning component of the TEs. To encourage success during the TEs, it was necessary to continually prod and prompt Larry to explore a concept beyond his initial, shallow understanding of it. His suggested interventions were often tied to formulas and procedures and involved teacher-centered explanations. He would incorporate instructional aids at times; however, he would often utilize the same ones (e.g., grid paper), and many times the reason for incorporating the aid was unclear. Overall, he placed greater precedence on completing the problems and generating answers than on gaining the personal insights and knowledge necessary to
develop conceptual understanding within future students. As is common with novice teachers like Larry, he tended to respond to faulty student thinking by simply reiterating what he knew about the topic rather than investigating the student's thinking and what led to the erroneous claim (Fuller, 1996; Livingston & Borko, 1990). Larry's knowledge of student misconceptions (i.e., his KoST) was limited by his insufficient CK. Progress made in relation to connecting mathematical concepts in meaningful ways tended to be short lived. Throughout the study he struggled with the mathematics as well as with explaining his ideas. In addition, Larry showed little to no improvement in how he contemplated and addressed student thinking.

Grace
At the onset of the study, Grace appeared to possess above average amounts of CK regarding various aspects of area and perimeter but struggled using it consistently to diagnose student thinking and therefore could not adequately address certain student misconceptions regarding these concepts. Her strengths included an ability to carefully process information coupled with a strong desire to help future students understand mathematics. In contrast to Jackie and Larry, Grace did not become flustered after realizing her thinking was incorrect. Like expert teachers (Chi, Glaser, & Farr, 1988), Grace was able to carefully analyze a problem before and while solving it. Grace displayed this often: she would pause, reread the problem, gather her thoughts, explain where she had gone wrong and why, and then continue on with her work or explanation. Throughout the intervention, Grace would often call me over to show me and/or inquire about her work with the MWs. She often explored beyond the basic ideas of a
TE's focus problem, as will be described later while discussing TE 3. It appeared Grace's CK and KoST grew, and became better organized, as a result of the intervention, even beyond the planned learning. She would ask clarifying questions that reflected a genuine desire to understand the concepts being addressed; she wanted to be prepared to teach students well. During the first interview Grace appeared to know more than she would write in her responses. Once Grace became aware of how thorough communication was necessary to promote understanding of mathematical principles, her responses changed to include greater specificity. As her CK regarding area and perimeter misconceptions became more coherent and organized, she was better equipped to respond to student difficulties in pedagogically powerful ways. Her desire to understand mathematical concepts did not end when class ended; for example, she continued to think about the TE 3 focus problem and the idea that, among quadrilaterals with a given perimeter, a square maximizes area. Grace was not generally satisfied with leaving mathematical conflicts unresolved. Her stated desire for her future students was for them to have a conceptual understanding of mathematics. That was apparent in the application of both her CK and KoST, the focus of which was to clearly communicate mathematical ideas so that students would understand them. The outcomes from the TEs provided empirical evidence that Grace was motivated by and benefited from exploring the student misconceptions presented in the TEs. She thrived within the discovery learning environment, and her classmates reported profiting from having her in their cooperative learning groups.

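The relationship Grace kept returning to can be checked with a few lines of code. The sketch below is only an illustration of that idea and is not part of the study's materials or microworlds; the perimeter value of 24 units is an arbitrary choice.

# Enumerate the whole-number rectangles that share a perimeter of 24 units
# and compare their areas; the 6 x 6 square yields the largest area.
perimeter = 24
half = perimeter // 2                    # length + width must equal half the perimeter
for length in range(1, half // 2 + 1):   # stop at the square to avoid duplicate pairs
    width = half - length
    print(f"{length} x {width}: perimeter = {perimeter}, area = {length * width}")

Run under these assumptions, the listing moves from the 1 × 11 rectangle (area 11 square units) up to the 6 × 6 square (area 36 square units), the kind of contrast the focus problems were designed to surface.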

PAGE 439

Brianna
Throughout the study, Brianna made good use of her strong mathematics background (e.g., she had successfully completed precalculus) and was careful and precise in her problem solving. It was common for Brianna to quietly think over a question for 30 seconds before making what was usually an insightful comment. As one of the three top-performing PSTs (Grace and #6 were the other two), Brianna often provided coherent and thorough written responses, complete with accurate mathematics; however, prior to intervention she struggled when asked to illustrate and explain her ideas conceptually. Her pre-intervention CK was sufficient to get correct answers, but it was very procedural in nature and application. Her CK allowed her to diagnose many of the student difficulties presented; however, her responses tended to focus on getting correct answers rather than on developing conceptual understanding, and prior to the intervention she tended to suggest insufficient interventions for students. This illustrated that her KoST was not at the same level as her CK.

During the intervention, Brianna worked to adopt a more conceptual approach to viewing, doing, and explaining mathematics; she consciously made efforts to think more conceptually. Brianna would become very engaged in the mathematical challenges of the TEs. Her strong mathematics background continued to power her CK and allowed her to grasp every misconception within the TEs and to be very thorough and accurate in her prescribed activities. Her ample CK appeared to initially interfere with her ability to see the need to include diagrams to help students understand her
explanations; however, the frequency of quality diagrams increased from TE 2 right through the follow-up test. Brianna displayed a desire to control the learning environment, which at times hindered her instructional recommendations from focusing on the students. In all three TEs, Brianna indicated that she would direct the learning during the interventions (both with individual students and with a class). She often recommended having students investigate with the MWs, but with predesigned problems. Brianna thoroughly explored within the MW environments, often commenting on interesting nuances; for example, she wrote how she discovered that there are an infinite number of rectangles with different dimensions that could have the same perimeter. Her instructional strategies gradually evolved from teacher centered, with students receiving instruction, to teacher directed, where students participated more in their learning. Brianna appeared to benefit from being required, throughout the study, to communicate her mathematical understandings on a level appropriate for elementary students. Near the end of the intervention Brianna was exhibiting the greatest levels of expert-teacher characteristics; her KoST appeared to reach levels similar to her mathematical knowledge.

Jackie
At the onset of the study, Jackie's CK regarding area and perimeter was fragile and disconnected. She was unable to consistently decipher whether problems were addressing area or perimeter, and she was unaware of the importance of delineating such distinctions. She relied on a rules-orientated approach, which left her unable to conceptually explain basic area and perimeter concepts
or provide practical examples of them, other than how they are used (e.g., linear units are used with perimeter and square units with area). Interview excerpts revealed that although Jackie was aware of certain aspects of the student misconceptions presented, her lack of CK impeded her ability to diagnose and appropriately respond to faulty student thinking. The fragile nature of her CK was evident as she would often change her initial answer when asked to clarify her thoughts. A reflective statement made by Jackie near the end of her first interview aptly summarized her CK: "I just kind of have this knowledge of formulas and a few concepts ... here and there, and I think that some of them are mixed up."

Although Jackie's CK regarding area and perimeter concepts grew, her KoST struggled to adapt throughout the intervention. Jackie had difficulty reasoning through various mathematical scenarios, and that left her ill equipped to effectively diagnose and respond to student thinking. Her suggested interventions often focused on general ideas (e.g., clarifying area and perimeter), even when those ideas were not helpful in resolving the misconception at hand. Her choices of mathematical language often confused and muddied her attempts at explaining concepts to students, even those concepts she seemed to understand. Jackie indicated, and displayed, how interacting with the MWs deepened her understanding of area and perimeter concepts as well as how students think about them; however, she was not able to consistently perceive their relevance to the learning process or provide viable classroom uses for the MWs. Jackie
would need repeated exposure and support to enable her to incorporate such tools into her future teaching. Jackie did not appear to learn best on her own; rather, she indicated several times how the small-group and whole-class sessions were very helpful. It was observed that when Jackie was engaged in conversation (e.g., interviews, cooperative work) about the mathematics, she was better able to clarify and present her understanding of the concepts being discussed. Her increased posttest score (a 115% increase over her pretest) was evidence of her effort throughout the study. Jackie made noticeable gains in her CK related to area and perimeter. These gains appeared to translate into moments of pedagogical clarity. For example, a comment made by Jackie during a teaching episode involved her belief that it might help students resolve area and perimeter conflicts if the concepts were studied simultaneously. Her view displayed relatively expert pedagogical KoST, a view shared by several researchers (Chappell & Thompson, 1999; Hiebert & Lefevre, 1986; Simon & Blume, 1994a). Following the intervention, Jackie described instruction, including its context and level of student involvement, that revealed her KoST was beginning to incorporate the ideas and practices that had been discussed during the TEs.

Conclusions
Previous research has shown that preservice elementary teachers (PSTs) have procedural and conceptual shortcomings regarding area and perimeter. The majority of that research focused on revealing and measuring such misconceptions; therefore, little is
known about the underlying causes of these misconceptions, how they may interfere with diagnosing and addressing student difficulties, or what alternative instructional methods may help alleviate the area and perimeter misconceptions that PSTs have. This study sought to measure and describe the content knowledge (CK) and knowledge of student thinking (KoST) of an intact group of PSTs both before and after a planned intervention, and then examine possible relationships between their CK and KoST.

Regarding Pre-Intervention CK and KoST
Expert/Novice Differences
Preservice elementary teachers (including student teachers) are obviously considered novices. It was not surprising, then, that prior to any intervention the 12 PSTs in this study displayed many of the same novice tendencies reported in the literature. Researchers have found that the CK acquired by novice teachers is primarily procedural in content and application (Ball & Wilson, 1990; Borko et al., 1992; Fuller, 1996; Simon & Blume, 1994a). Similarly, the majority of PSTs in this study seemed to equate knowing the formulas with understanding the measures. Most were bound, even handicapped, by a dependency on formulas, the result of which was a weak foundation for teaching the subject matter. Their procedural CK led many to view applying the formulas, rather than understanding the concepts, as the primary source of difficulty for students. Their tendency to focus on the mathematical content at hand rather than on the student's thinking confirms what other researchers have found to be true of novice teachers (Brown & Borko, 1992; Livingston & Borko, 1990; Meredith, 1993).

The PSTs in this study expressed concerns about teaching mathematics. Eight out of the 12 expressed apprehension about teaching mathematics to elementary-age children. Even Brianna, who entered the study with a strong mathematics background, was apprehensive about teaching. Similarly, Borko et al. (1992) reported that novice teachers are very concerned about their limited pedagogical content knowledge and the impact such a shortcoming may have on teaching and learning. The PSTs' own struggles with the mathematics often resulted in their getting bogged down or confused and therefore unable to appreciate or contemplate how students might interact with the same mathematics. These findings taken together suggest that the college mathematics courses taken by PSTs do not inherently promote a conceptual understanding of area and perimeter or instill sufficient confidence to teach elementary children about these concepts.

Basic CK: Units of Measure
As presented in chapter 2, many studies have documented the ways in which novice teachers struggle with the mathematical content they must teach. This was evident in the PSTs' responses to a pretest question involving perimeter: eight out of 12 provided a response that addressed, to different degrees, concepts related to area. Similar confusion has been documented with classroom students (Hirstein et al., 1978; Kouba et al., 1988) and preservice teachers (Reinke, 1997). Since area and perimeter concepts were not understood conceptually, it was rather easy for many PSTs to confuse area and perimeter along with linear and square units. Instead of these concepts being part of a web of ideas, they were isolated facts, which provided a
very fragile foundation on which to attempt to problem solve and diagnose faulty student thinking. Confounding linear and square units is a specific application of area and perimeter confusion and has been reported among classroom children (Chappell & Thompson, 1999; Lappan et al., 1998; Lehrer, 2003) and teachers as well (CBMS, 2001; Tierney et al., 1986). The unit of measure functions as a conceptual bridge connecting an object and the number used to represent its measure. Although the importance of a teacher possessing a conceptual understanding of linear and square units cannot be overstated, there is little research examining PSTs' understandings regarding these concepts or how to improve the teaching of them. This study found that, prior to intervention, PSTs often forgot to include or discuss units with their answers, and their ability to explain the concepts of linear and square units was sadly lacking. Instead of actually explaining the distinguishing characteristics of linear and square units and providing classroom-useful examples, Larry and Jackie (and most PSTs in this study) simply explained how they are used (i.e., linear units are used with perimeter and square units with area). Other studies have reported that PSTs struggle with explaining concepts related to area and perimeter (Even & Tirosh, 1995; Menon, 1998; Reinke, 1997; Simon & Blume, 1994a), but few have specifically described what was deficient with the PSTs' explanations.

The finding that a common thread to inferior responses by PSTs involved a lack of appropriate drawings to support explanations is important and had not been reported in previous studies. This finding was very evident with problems related to units
of measure. Drawings were also frequently neglected when the PSTs suggested instructional strategies to use with struggling students. This reflected an underdeveloped KoST. Hiebert and Carpenter (1992) acknowledge that the expert teacher realizes the importance of providing conceptual representations; however, the PSTs in this study, even though they often wrote about how important and helpful visuals are to students, neglected to include supportive diagrams and/or meaningful representations with their explanations. It was not just the poorer-performing PSTs who struggled explaining their ideas and justifying their answers. Brianna and Grace (two top-performing PSTs) were relatively successful at distinguishing between and appropriately using linear and square units; however, when asked to explain and illustrate these concepts, their responses were deficient. It would seem that possessing mathematical knowledge about area and perimeter does not automatically translate into knowing how best to represent those concepts to elementary children, or possibly even realizing the importance of doing so.

Ability to Diagnose and Respond to Student Thinking
Knowledge of student thinking (KoST) is a component of PCK, and research pertaining to it is still in its infancy (Shulman, 1986). There has been little research examining PSTs' knowledge of student misconceptions regarding area and perimeter, and none involving intervention to address shortcomings in these areas. When faced with problem-solving situations involving erroneous student thinking, the PSTs in this study tended to focus on solving the problem
themselves rather than on diagnosing the hypothetical student's thinking, and few were adept at diagnosing student errors. Such a finding runs counter to what Meredith (1993) reported for preservice elementary teachers specializing in mathematics. Moreover, the PSTs who were successful at diagnosing student errors were often unable to provide coherent explanations that included supportive diagrams. These findings are in keeping with Borko et al. (1992) and Even and Tirosh (1995), who found that PSTs with strong mathematics backgrounds displayed a limited repertoire of instructional representations and were often unable to respond adequately to student questions. It does not appear that increased mathematics training alone will develop or enhance pedagogical content knowledge. Most PSTs in this study did not possess the necessary knowledge, experience, or both to consistently diagnose student thinking or appreciate what is essential to help children understand the errors in their thinking.

Perceived Relationships between Area and Perimeter
The calculations of both area and perimeter involve the lengths of the sides of the figures, and thus someone lacking a conceptual understanding of area and perimeter could encounter many problems and difficulties (Ma, 1999). These similarities provide the setting for two common misconceptions involving the area and perimeter of a rectangle: (1) that increasing the perimeter of a rectangle will always increase its area (i.e., the direct relationship misconception), and (2) that rectangles that have the same perimeter measurement will also have the same area, and vice versa (i.e., the fixed relationship misconception).

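Both claims can be refuted with small whole-number cases. The following sketch is merely illustrative; the dimensions are arbitrary choices and are not examples taken from the study's instruments.

# Counterexamples to the two misconceptions, using arbitrarily chosen rectangles.
def perimeter(length, width):
    return 2 * (length + width)

def area(length, width):
    return length * width

# Direct relationship: a 1 x 9 rectangle has a larger perimeter than a 4 x 4 square
# (20 vs. 16 linear units) yet a smaller area (9 vs. 16 square units).
print(perimeter(4, 4), area(4, 4))    # 16 16
print(perimeter(1, 9), area(1, 9))    # 20 9

# Fixed relationship: a 2 x 8 rectangle and a 5 x 5 square share a perimeter of
# 20 linear units but enclose different areas (16 vs. 25 square units).
print(perimeter(2, 8), area(2, 8))    # 20 16
print(perimeter(5, 5), area(5, 5))    # 20 25

A single counterexample of either kind is sufficient to refute the corresponding claim.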


On a pretest problem involving garden plots with equal perimeters, most of the PSTs in this study correctly concluded that the square garden had the greatest area. This finding differs from previous research done by Woodward and Byrd (1983), who found that 76 out of the 129 PSTs (or 59%) who were asked the same question thought the gardens would be the same size. Similar percentages of PSTs in both studies (around 30%) expressed at least some awareness of the common student tendency to think that equal perimeters will result in equal areas. This represents a somewhat predictable finding: the PSTs would be successful on the mathematical component of a problem, yet they would not be able to apply that knowledge so as to anticipate what students might find difficult or confusing about the same problem. This mindset inhibited many PSTs from systematically investigating an erroneous student claim.

The direct relationship misconception (the belief that increasing/decreasing perimeter must increase/decrease area) offered the PSTs various learning trajectories to follow and explore. Question 8 on the pretest presented a student who claimed that increasing the perimeter of a rectangle would always increase its area, and most of the PSTs judged the student to be correct. Their explanations tended to be built on the incorrect assumption that increasing the perimeter of a rectangle must increase both dimensions and thus the area; they failed to notice that the perimeter of a rectangle can increase even as two of the sides of the rectangle decrease in length. Only three PSTs (25%) were able
to arrive at a correct solution by presenting an appropriate counterexample. Ball (1988) and Ma (1999) presented a problem very similar to Question 8 to elementary preservice teachers (26 and 23, respectively) and reported similar difficulties. The PSTs' insufficient CK limited their KoST in that it left them ill equipped to engage the student in any meaningful discussion regarding that claim. Most PSTs in this study put all their effort into deciphering whether the student was right or wrong. That hindered the extent to which they investigated the various area-perimeter relationships beyond what they initially found or concluded.

Regarding Relationships between CK and KoST
The PSTs in this study exhibited varying degrees of growth in their CK (75%), KoST (also 75%), or both (58%) from pretest to posttest. It was found in several different problem contexts that the PSTs' fragile CK (e.g., regarding units of measure) often left them ill equipped to explain and illustrate their own thoughts about those concepts and even more incapable of appropriately responding to student shortcomings and misconceptions. This was manifested by a lack of, or poor use of, drawings, precise mathematical language, and effective intervention strategies. Ma (1999) reported similar findings with the U.S. teachers she studied; however, she conducted no intervention to allow for further findings regarding potential relationships between the two knowledge types. The common trend observed in this study was that increased CK regarding area and perimeter concepts and misconceptions (following intervention) was typically accompanied by a growing use of appropriate drawings and coherent language when providing explanations. Also noted was an increased focus on diagnosing student thinking and suggesting more
student-centered interventions, all evidence of a maturing KoST. The apparent dependency of KoST (and, more broadly, PCK) upon CK has been written about by researchers such as Shulman (1986, 1987), Rowan et al. (2001), and Hutchison (1997), but drawing conclusions and making recommendations based on that dependency has proven elusive. More research is needed on teachers' CK, their cognitions about student thinking (i.e., their KoST), and the interplay of these upon subsequent instructional decisions.

Regarding Anchored Instruction with Web-Based Microworlds
This teacher development experiment (Cobb, 2000; Simon & Tzur, 1999; Simon, 2000) sought to implement and closely observe instructional strategies that aligned with the theoretical underpinnings of anchored instruction (CTGV, 1990, 1991, 1992, 1993). The web-based microworlds provided a research-based technology conduit (Marzano, 1998) to support and aid the learning of area and perimeter misconceptions through various learning settings: independent discovery, and group dynamics between myself (the researcher) and the participants (preservice teachers) and among the participants themselves. The focus problems for the instructional sequence, which were based on common area and perimeter misconceptions held by elementary students (and teachers), proved to be motivating and provided a range of entry points from which the PSTs could investigate concepts and misconceptions. The PSTs made several comments regarding how they enjoyed learning about what their future students could be expected to struggle with.

There were several interesting findings regarding the web-based microworlds (MWs), Shape Builder and Perimeter and Area Gizmo, specifically selected for this
study. The MWs did not consistently promote the type or level of involvement that was anticipated. Throughout the teaching episodes (TEs), the PSTs who struggled the most were also the ones who became preoccupied with some tangential aspect of the TE (e.g., finding the area of the footprint in TE 2) and as a result spent insufficient time analyzing the student thinking at the heart of it. For the most part, the PSTs in this study simply stopped exploring after arriving at and discussing their initial reaction. Many of these PSTs did not appear self-motivated to delve far beyond providing one possibility to the stated question. Instead of investigating the various possibilities surrounding a misconception (either with or without the MWs), the majority would give the same, or a very similar, answer as they had previously and continued to operate within their CK comfort zone. Similar to the PSTs in Chinnappan (2000), this study found that the PSTs' preoccupation with finding "the answer" not only hindered their ability to properly diagnose and address student thinking, but it also limited their meaningful interaction with the MWs. This finding may be explained in part by the fact that, once the PSTs had settled on an answer, they no longer had a mathematical conjecture to refute or justify, and they lacked the necessary mathematical details with which to explore with the MWs.

Throughout the intervention, the vast majority of the PSTs commented on how they found specific features of the microworlds helpful to their understanding of the mathematics surrounding the focus problems as well as facilitating insights regarding the student misconceptions. Several of the higher-achieving PSTs displayed evidence of technological pedagogical content knowledge (TPCK) by suggesting specific revisions to the Shape Builder MW that would improve feedback and heighten awareness of
distinguishing learning features of the MW. During the early stages of the intervention, the PSTs explained how they would use the microworld as an instructional tool in a whole-class setting, and a majority (10 out of 12) indicated they believed future classroom students would benefit from personally interacting with the MW in a structured context. However, an unexpected trend developed as the mathematical content of the teaching episodes got progressively more challenging. Although the number of PSTs who indicated they learned with and/or saw benefits of personally interacting with the microworlds was a strong majority, far fewer said they would incorporate the microworlds when instructing future students about the concepts presented in the TEs, even though the same PSTs admitted those future students would most likely possess similar misconceptions as the hypothetical students presented in the teaching episodes. A similar contradiction appeared when, of the several PSTs who indicated they learned from the microworlds, only a few wrote that they would allow time for students to personally use the microworlds to explore the concepts surrounding the teaching episodes. These findings concur with research done by Timmerman (1999); in both studies the PSTs did not use MWs as part of suggested instruction even though they acknowledged having difficulties generating conceptual explanations. Apparently, the majority of PSTs concluded the microworlds were a valuable learning tool for themselves but not necessarily for students. Every PST indicated that this study was their first exposure to web-based MWs, which helps to explain their frequent neglect to incorporate them within instructional recommendations. Collectively these results suggest that even though the content of the study was accessible (i.e., area and perimeter)
and the technology which was integrated was appropriate for elementary students, there are no guarantees that PSTs will automatically perceive how best to utilize the features of the MWs to promote exploration and a deeper understanding of area and perimeter concepts, nor necessarily comprehend the MW as a tool for future teaching.

The instructional sequence for this study was designed to encourage the PSTs to revisit their prior knowledge as learners and to consider those experiences as points for reflecting about teaching. The value of viewing the PSTs in these dual roles was confirmed, as most of them developed mathematical insights (i.e., a more heightened CK) while they attempted to solve problems involving area and perimeter misconceptions and to address erroneous student claims, functioning as students themselves. Their KoST was challenged and enhanced as they reconciled their personal mathematical understandings with what would be necessary to provide appropriate explanations and instruction to elementary students.

There was only one study found that investigated the use of anchored instruction in a mathematics course for preservice teachers. Kurz and Baterelo (2004) found that most PSTs who were exposed to anchored instruction expressed optimism that students could learn through such an instructional approach. This research extends their findings by describing how anchored instruction could be successfully integrated into a mathematics course for elementary preservice teachers and by documenting the positive outcomes for the participating PSTs.

Implications for Practice
The results of this study, coupled with the knowledge provided from existing
research, lead to some implications for teachers and teacher educators. As discussed in the review of literature and the results of this study, students and preservice teachers struggle with many aspects related to area and perimeter concepts and relationships.

Implications for Teachers
Confusions between area and perimeter, and between linear and square units, could be reduced if these topics were introduced and developed in conjunction with each other. Traditionally, in school mathematics area and perimeter are taught in isolation, thus making it difficult to uncover misconceptions until these concepts appear together, typically on a test. These misconceptions (especially those involving linear and square units) could function as springboards for engaging in the exploration of area and perimeter. Presenting scenarios involving student misconceptions and erroneous student work (or claims) could motivate students to delve deeper than the surface understanding presented in most textbooks. The very nature of such problem-solving scenarios would encourage reading, explaining, representing, and justifying of responses. These activities would more readily alert the teacher to existing and potential confusions as well as promote various forms of discourse and higher-order thinking. Studying misconceptions would most likely involve the use of manipulatives to help promote conceptual understanding and better visualization of the concepts being explored. Results from previous research along with findings from this study suggest that technology (e.g., web-based MWs) is an effective and dynamic alternative to hand-held manipulatives; the benefits of technology use include immediate feedback for students. If area and perimeter were taught in tandem, then fewer individual lessons would be
needed and time spent on reviewing these concepts would be decreased, because students would have a more connected and conceptual understanding of the subject matter.

Implications for Teacher Educators
Teacher educators must take a greater role in familiarizing teachers with common area and perimeter misconceptions and in providing instructional approaches to address those misconceptions. The 12 PSTs involved in this study were juniors and seniors and had completed all their mathematics requirements; that is, they had received all the subject matter instruction deemed necessary to teach elementary mathematics. However, as discussed earlier, PSTs (and classroom teachers) struggle to conceptualize many of the mathematical concepts (including area and perimeter) they have to teach, and hence have difficulties diagnosing misconceptions and effectively anticipating and addressing student errors without simply restating rules or procedures. The results from this study suggest that undergraduate teacher education programs must ensure that preservice teachers, elementary and secondary, are fully prepared to be teachers of mathematics, including addressing student misconceptions.

Research has documented numerous misconceptions and error patterns that students possess regarding the mathematics they learn. To increase levels of CK and KoST within PSTs, teacher educators must examine their programs to ensure that the misconceptions identified in this and other studies are addressed. It is important not only to examine the mathematical perspective of these misconceptions (e.g., possessing a profound understanding of linear and square units) but also to cultivate various knowledge types (PCK, CK, KoST, TPCK, etc.) simultaneously. For example, although it is important for PSTs to know that increasing the perimeter of a rectangle will not
ALWAYS result in a larger area, it is equally important for them to understand why students would think this and how then to address the misconception. PSTs must be aware of powerful and easily accessible technologies (e.g., web-based MWs) that can be used to facilitate the exploration and deeper understanding of the mathematics surrounding these misconceptions. These technologies are becoming readily available in most classrooms, and PSTs should learn best practices for incorporating them into instruction. Results from the research literature reveal that PSTs struggle when asked to explain and represent their ideas (Borko et al., 1992; Even & Tirosh, 1995; Menon, 1998; Reinke, 1997; Simon & Blume, 1994a). PSTs need many opportunities to present and refine their subject matter knowledge and instructional strategies. Promoting a community of learners within the methods course, one that encourages interactive cycles of sharing, testing, and refining ideas, could help PSTs integrate their newfound ideas to form a more coherent understanding of the mathematics they must teach (Bowers & Doerr, 2001; Simon, 2000; Wales & Stager, 1977).

Implications for Future Research
Because this study examined PSTs' CK and KoST regarding area and perimeter (prior to, during, and following a specially designed intervention), it leads to new questions. The results appear to show that the planned intervention enhanced the PSTs' understandings of common student misconceptions as well as their instructional strategies for responding to student difficulties and erroneous claims; however, only one other study (Kurz & Baterelo, 2004) was found that investigated the use of anchored instruction in a
mathematics course for PSTs, and it did not involve any specific content. Further research is needed to help establish the viability of such an instructional approach within a mathematics methods course, not just to instruct in area and perimeter but in other content as well.

Future research could also help evaluate various aspects of this study's design. Which specific aspect(s) of the study had the greatest influence: the three tests, the teaching episodes, the anchor (i.e., student misconceptions), the cooperative learning experiences, or the interactions with the MWs? Such questions have not been answered. Multivariate analysis might prove helpful in isolating the strength of the variables contributing to the entire anchored instructional sequence. For example, there were inconsistencies regarding what the PSTs wrote about the MWs and their personal learning versus their proposed instructional strategies involving MWs with future students. More research is needed to determine if, or to what degree, the MWs are a valuable component of anchored mathematics instruction with PSTs. Conducting research with interns, possibly a longitudinal study, where they experience anchored instruction similar to this study and then are observed teaching the same concepts within a school setting possessing the necessary technology, might help provide insight as to how well knowledge of content and instructional strategies gained during anchored instruction transfers to the actual classroom practices of PSTs.

Studying student misconceptions proved motivational to the PSTs in this study. There is a need to examine the extent to which classroom students might also find such learning settings interesting. Researchers could conduct an experimental study with classroom students examining the impact of learning
area and perimeter concepts through studying misconceptions. Results from such studies would provide a foundation to extend future research to other content areas. Other questions that need to be addressed include: In what ways would the anchored mathematics instruction need to be altered to be compatible with school students? To what extent could school students learn with web-based applets? Previous research has shown the benefits of MWs within school settings (Clements & Sarama, 1997; Kordaki, 2003; Lederman & Niess, 2000; Yelland, 2002). Given students' aptitude towards technology, it is important to examine such differences.

Another question raised by this study that needs further investigation involves the PSTs' use of drawings and representations in their explanations and when making instructional recommendations. It was not clear why the PSTs did not perceive the importance of diagrams when communicating mathematical concepts, especially more difficult ones. Representations, including those used to demonstrate understanding, have been described as a vital part of effective classroom communication (NCTM, 2000), and since the majority of PSTs in this study did not use them, research is needed to investigate PSTs' use of representations and the importance they attribute to them.

Finally, the measurement properties of one of the subtests were less than satisfactory. This was most likely due to a combination of one or more of the following: (a) small sample size (n = 12), (b) small number of items on the subtest (n = 5), and (c) a couple of poorly written test items (identified through analysis of descriptive statistics). A replication of this study with a much larger sample (including modified test
items) could help to mitigate these concerns and help to clarify the extent to which these findings hold. This study represents beginning steps in understanding how to develop anchored instruction useful for a mathematics methods course. There is much more to investigate and much more work to be done. Based on the results of this teaching experiment, I believe there is hope for further development and deeper understanding of the impacts of content knowledge and knowledge of student thinking.

REFERENCES

Abdal-Haqq, I. (1995). Infusing technology into preservice teacher education. ERIC Clearinghouse on Teaching and Teacher Education, Washington, DC. Retrieved March 14, 2004, from http://www.ericfacility.net/ericdigests/ed389699.html
American Psychological Association (1985). Standards for educational and psychological testing. Washington, DC: Author.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
Ausubel, D. (1968). The psychology of meaningful verbal learning. New York: Grune & Stratton.
Ball, D. L. (1988a). The subject matter preparation of prospective mathematics teachers: Challenging the myths. Review of Educational Research, 54, 65-86.
Ball, D. L. (1988b). Prospective teachers' understanding of mathematics. In M. J. Behr, C. B. LaCampagne, & M. M. Wheeler (Eds.), Proceedings of the Conference of the North American Group for the Psychology of Mathematics Education (pp. 268-274). DeKalb, IL.
Ball, D. L. (1990). Prospective elementary and secondary teachers' understanding of division. Journal for Research in Mathematics Education, 21, 132-144.
Ball, D. L. (1991). Research on teaching mathematics: Making subject matter knowledge part of the equation. In J. Brophy (Ed.), Advances in research on teaching (Vol. 2, pp. 1-48). Greenwich, CT: JAI.
Ball, D. L. (2003, February). What mathematical knowledge is needed for teaching? Paper presented at the U.S. Department of Education, Washington, DC.
Ball, D. L., & Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to teach: Knowing and using mathematics. In J. Boaler (Ed.), Multiple perspectives on the teaching and learning of mathematics (pp. 83-104). Westport, CT: Ablex.
Ball, D. L., Lubienski, S. T., & Mewborn, D. S. (2001). Research on teaching mathematics: The unsolved problem of teachers' mathematical knowledge. In V. Richardson (Ed.), Handbook of research on teaching (4th ed., pp. 433-456). Washington, DC: American Educational Research Association.
Ball, D. L., & Wilson, S. M. (1990). Knowing the subject matter and learning to teach it: Becoming a mathematics teacher (Research Report 90-7). East Lansing: Michigan State University, National Center for Research on Teacher Education.
Barnett, C. (1998). Mathematics case methods project. Journal of Mathematics Teacher Education, 1(3), 349-356.
Barron, L. C., & Goldman, E. S. (1994). Integrating technology with teacher preparation. In B. Means (Ed.), Technology and education reform (pp. 81-110). San Francisco: Jossey-Bass.
Bassarear, T. (2005). Mathematics for elementary school teachers (3rd ed.). Boston: Houghton Mifflin.
Battista, M. T., Clements, D. H., Arnoff, J., Battista, K., & Borrow, C. (1998). Students' spatial structuring of 2D arrays of squares. Journal for Research in Mathematics Education, 29, 503-532.
Baturo, A., & Nason, R. (1996). Student teachers' subject matter knowledge within the domain of area measurement. Educational Studies in Mathematics, 31, 235-268.
Bauer, J. W. (1998). Anchored instruction in preservice education technology classes: A research project. In Technology and Teacher Education Annual 1998 (pp. 241-245). Charlottesville, VA: Association for the Advancement of Computing in Education.
Bauer, J. W., Ellefsen, E. R., & Hall, A. M. (1994). A model for using anchored instruction in preservice educational technology classes. In J. Willis, B. Robin, & D. A. Willis (Eds.), Technology and Teacher Education Annual 1994 (pp. 131-134). Charlottesville, VA: Association for the Advancement of Computing in Education.
Baumback, D., Brewer, S., & Bird, M. (1995). Using anchored instruction in inservice teacher education. In Technology and Teacher Education Annual 1995 (pp. 809-813). Charlottesville, VA: Association for the Advancement of Computing in Education.
Beaton, A. E., Mullis, I. V., Martin, M. O., Gonzalez, E. J., Kelly, D. L., & Smith, T. A. (1996). Mathematics achievement in the middle school years: IEA's Third International Mathematics and Science Study (TIMSS). Chestnut Hill, MA: TIMSS International Study Center, Boston College.
Beattys, C. B., & Mahler, C. A. (1985). Approaches to learning area measurement and its relation to spatial skill. Proceedings of the 7th International Conference for the Psychology of Mathematics Education, USA, 307-315.
Beckmann, S. (2003). Mathematics for elementary teachers volume II: Geometry and other topics (Preliminary ed.). Boston: Addison Wesley.
Boers-van Oosterum, M. A. M. (1990). Understanding of variables and their uses acquired by students in traditional and computer intensive algebra (Doctoral dissertation, University of Maryland College Park, 1990). Digital Dissertations, 51, 1538.
experiment. Journal for Research in Mathematics Education, 25, 166-208.
Borko, H., Eisenhart, M., Brown, C. A., Underhill, R. G., Jones, D., & Agard, P. C. (1992). Learning to teach hard mathematics: Do novice teachers and their instructors give up too easily? Journal for Research in Mathematics Education, 23, 194-222.
Bottge, B. A., Heinrichs, M., Chan, S. Y., Mehta, Z. D., & Watson, E. (2003). Effects of video-based and applied problems on the procedural math skills of average- and low-achieving adolescents. Journal of Special Education Technology, 18(2), 5-22.
Bottge, B. A., Heinrichs, M., Chan, S. Y., & Serlin, R. C. (2001). Anchoring adolescents' understanding of math concepts in rich problem-solving environments. Remedial and Special Education, 22(5), 299-313.
Bottge, B. A., Heinrichs, M., Mehta, Z. D., & Hung, Y. H. (2002). Weighing the benefits of anchored instruction for students with disabilities in general education classes. The Journal of Special Education, 35(4), 186-200.
Bransford, J., Sherwood, R., Hasselbring, T., Kinzer, C., & Williams, S. (1990a). Anchored instruction: Why we need it and how technology can help. In D. Nix & R. Spiro (Eds.), Cognition, education and multimedia: Exploring ideas in high technology (pp. 115-141). Hillsdale, NJ: Erlbaum Associates.
Bransford, J. D., Vye, N., Kinzer, C., & Risko, V. (1990b). Teaching thinking and content knowledge: Towards an integrated approach. In B. F. Jones & L. Idol (Eds.), Dimensions of thinking and cognitive instruction (pp. 381-413). Hillsdale, NJ: Erlbaum.
Bray, W. S., Dixon, J. K., & Martinez, M. (2006). Fostering communication about measuring area in a transitional language class. Teaching Children Mathematics, 13, 132-135.
Brophy, J. E. (1991). Conclusion to advances in research on teaching: Teachers' knowledge of subject matter as it relates to their teaching practice. In J. E. Brophy (Ed.), Advances in research on teaching (Vol. 2, pp. 347-362). Greenwich, CT: JAI Press.
Brown, C. A., & Borko, H. (1992). Becoming a mathematics teacher. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 209-239). New York: Macmillan.
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Browning, C. A., & Klespis, M. (2000). A reaction to Garofalo, Drier, Harper, Timmerman, and Shockey. Contemporary Issues in Technology and Teacher Education [Online serial], 1(2), 226-228.
Bull, G. L. (1997). Technology and schools. Advances in Computers, 45, 321-354.
Bush, W. S. (2000). Mathematics assessment: Cases and discussion questions for grades 6-12. Reston, VA: National Council of Teachers of Mathematics.
Cai, J., Lane, S., & Jakabcsin, M. S. (1996). The role of open-ended tasks and holistic scoring rubrics. In P. C. Elliott & M. J. Kenney (Eds.), Communication in mathematics, K-12 and beyond (pp. 137-147). Reston, VA: The National Council of Teachers of Mathematics.
Carpenter, T. P., Coburn, T. G., Reys, R. E., & Wilson, J. W. (1975). Notes from national assessment: Basic concepts of area and volume. Arithmetic Teacher, 22, 501-507.
Carpenter, T. P., & Fennema, E. (1992). Cognitively guided instruction: Building on the knowledge of students and teachers. International Journal of Educational Research, 17, 457-470.
Carpenter, T. P., Fennema, E., Peterson, P. L., & Carey, D. (1988). Teachers' pedagogical content knowledge of students' problem solving in elementary arithmetic. Journal for Research in Mathematics Education, 19, 345-357.
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C., & Loef, M. (1989). Using knowledge of children's mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26, 499-531.
Casa, T. M., Spinelli, A., & Gavin, M. K. (2006). This about covers it! Strategies for finding area. Teaching Children Mathematics, 13, 168-173.
Chappell, M., & Thompson, D. R. (1999). Perimeter or area: Which measure is it? Mathematics Teaching in the Middle School, 5, 20-23.
Chi, M. T., Glaser, R., & Farr, M. J. (1988). The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum.
Cholmsky, P. (2003, December). Why gizmos work: Empirical evidence for the instructional effectiveness of ExploreLearning. Charlottesville, VA: ExploreLearning. Retrieved April 9, 2004, from http://www.explorelearning.com/index.cfm?method=cCorp.dspResearch
Chinnappan, M. (2000). Preservice teachers' understanding and representation of fractions in a JavaBars environment. Mathematics Education Research Journal, 12(3), 234-253.
Clements, D. H. (1989). Computers in elementary mathematics education. Englewood Cliffs, NJ: Prentice Hall.
Clements, D. H. (1999). Effective use of computers with young children. In J. V. Copley (Ed.), Mathematics in the early years (pp. 119-128). Reston, VA: National Council of Teachers of Mathematics.
Clements, D., & Battista, M. (1992). Geometry and spatial reasoning. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 420-464). New York: Macmillan.
Clements, D. H., Battista, M. T., Sarama, J., & Swaminathan, S. (1997). Development of students' spatial thinking in a unit on geometric motions and area. Elementary School Journal, 98(2), 171-186.
Clements, D. H., & McMillen, S. (1996). Rethinking concrete manipulatives. Teaching Children Mathematics, 2, 270-279.
Clements, D. H., & Sarama, J. (1997). Research on Logo: A decade of progress. Computers in the Schools, 14(1/2), 9-46.
Clements, D. H., Sarama, J., & Battista, M. T. (1998). Development of concepts of geometric figures in a specially designed Logo computer environment. Focus on Learning Problems in Mathematics, 47-64.
Cobb, P. (1987). Information processing psychology and mathematics education: A constructivist perspective. Journal of Mathematical Behavior, 6, 3-40.
Cobb, P. (2000). Conducting teaching experiments in collaboration with teachers. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 267-306). New Jersey: Lawrence Erlbaum.
Cobb, P., & Steffe, L. P. (1983). The constructivist researcher as teacher and model builder. Journal for Research in Mathematics Education, 14, 83-94.
Cobb, P., Yackel, E., & Wood, T. (1992). A constructivist alternative to the representational view of mind in mathematics education. Journal for Research in Mathematics Education, 23, 2-33.
Cobb, P., Yackel, E., & Wood, T. (1993). Learning mathematics: Multiple perspectives, theoretical orientation. In T. Wood, P. Cobb, E. Yackel, & D. Dillon (Eds.), Rethinking elementary school mathematics: Insights and issues (Journal for Research in Mathematics Education Monograph Series, Vol. 6, pp. 21-32). Reston, VA: National Council of Teachers of Mathematics.
Cognition and Technology Group at Vanderbilt (1990). Anchored instruction and its relationship to situated cognition. Educational Researcher, 19(6), 2-10.
Cognition and Technology Group at Vanderbilt (1991). Technology and the design of generative learning environments. Educational Technology, 31(5), 34-40.
Cognition and Technology Group at Vanderbilt (1992a). Anchored instruction in science and mathematics: Theoretical basis, developmental projects, and initial research findings. In R. A. Duschl & R. J. Hamilton (Eds.), Philosophy of science, cognitive psychology, and educational theory and practice (pp. 244-273). Albany, NY: State University of New York Press.
Cognition and Technology Group at Vanderbilt (1992b). The Jasper experiment: An exploration of issues in learning and instructional design. Educational Technology Research & Development, 40(1), 65-80.
Cognition and Technology Group at Vanderbilt (1993). Anchored instruction and situated cognition revisited. Educational Technology, 33(3), 52-70.
Collins, A. (1991). Cognitive apprenticeship and instructional technology. In L. Idol & B. F. Jones (Eds.), Educational values and cognitive instruction: Implications for reform (pp. 121-138). Hillsdale, NJ: Erlbaum.
Connors, M. A. (1997). Technology in mathematics education: A personal perspective. Journal of Computing in Higher Education, 8(2), 94-108.
Cooper, J. M., & Bull, G. L. (1997). Technology and teacher education: Past practice and recommended directions. Action in Teacher Education, 19(2), 97-106.
Davidson & Associates. (1994). Math Blaster Mystery (Windows version) [Computer software]. Torrance, CA: Davidson & Associates.
Dede, C. (2000). Emerging influences of information technology on school curriculum. Journal of Curriculum Studies, 32(2), 281-303.
Dewey, J. (1933). How we think (rev. ed.). Boston: Heath.
Dewey, J. (1964). The relation of theory to practice in education. In R. Archambault (Ed.), John Dewey on education (pp. 313-338). Chicago: University of Chicago Press. (Original work published 1904).
Dexter, S. L., Anderson, R. E., & Becker, H. J. (1999). Teachers' views of computers as catalysts for changes in their teaching practice. Journal of Research on Computing in Education, 31, 221-238.
Drier, H. S. (2001). Beliefs, experiences, and reflections that affect the development of techno-mathematical knowledge. Paper presented at the meeting of the Society for Information Technology and Teacher Education, Orlando, FL.
Dunham, P. H., & Thomas, P. D. (1994). Research on graphing calculators. Mathematics Teacher, 87, 440-445.
Dufresne, R. J., Leonard, W. J., & Gerace, W. J. (n.d.). A qualitative model for the storage of domain-specific knowledge and its implications for problem solving. Retrieved May 5, 2007, from University of Massachusetts, Scientific Reasoning Research Institute, Physics Education Research Group Web site: http://umperg.physics.umass.edu/topics/model
Edwards, L. D. (1995). Microworlds as representations. In A. diSessa, C. Hoyles, & R. Noss (Eds.), Computers and exploratory learning (pp. 127-154). NATO ASI Series, Subseries F, Vol. 146. Heidelberg: Springer-Verlag.
Eisenberg, T. A. (1977). Begle revisited: Teacher knowledge and student achievement in algebra. Journal for Research in Mathematics Education, 8, 216-222.
Eisenhart, M., Borko, H., Underhill, R., Brown, C., Jones, D., & Agard, P. (1993). Conceptual knowledge falls through the cracks: Complexities of learning to teach mathematics for understanding. Journal for Research in Mathematics Education, 21, 8-40.
Even, R. (1993). Subject matter knowledge and pedagogical content knowledge: Prospective secondary teachers and the function concept. Journal for Research in Mathematics Education, 24, 94-116.
Even, R., & Tirosh, D. (1995). Subject matter knowledge and knowledge about students as sources of teacher presentations of the subject matter. Educational Studies in Mathematics, 29, 1-20.
Fennema, E., & Franke, M. L. (1992). Teachers' knowledge and its impact. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 147-164). New York: Macmillan.
Fennema, E., Franke, M. L., Carpenter, T. P., & Carey, D. A. (1993). Using children's mathematical knowledge in instruction. American Educational Research Journal, 30(3), 555-583.
Ferrer, B. B., Hunter, B., Irwin, K. C., Sheldon, M. J., Thompson, C. S., & Vistro-Yu, C. P. (2001). By the unit or square unit? Mathematics Teaching in the Middle School, 7, 132-137.
Fuller, R. (1996). Elementary teachers' pedagogical content knowledge of mathematics. Paper presented at the Mid-Western Educational Research Association Conference, Chicago, IL.
Gall, M. D., Borg, W. R., & Gall, J. P. (1996). Educational research: An introduction (6th ed.). New York: Longman Publishers.
Garofalo, J., Drier, H., Harper, S., Timmerman, M. A., & Shockey, T. (2000). Promoting appropriate uses of technology in mathematics teacher preparation. Contemporary Issues in Technology and Teacher Education, 1(1), 66-88.
Gearhart, M., Saxe, G. B., Dawson, V., Carter Ching, C., Bennett, T., Rhine, S., & Sloan, T. (1996, April). When can educational reforms make a difference? Opportunities to learn fractions in elementary mathematics classrooms. Paper presented at the annual meeting of the American Educational Research Association, New York.
General Accounting Office. (1984). New directions for federal programs to aid math and science teaching (GAO/PEMO 85-5). Washington, DC: Author.
Glaser, B. G., & Strauss, A. L. (1975). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.
Glaser, C. W., Rieth, H. J., Kinzer, C. K., Colbrun, L. K., & Peter, J. (2000). A description of the impact of multimedia anchored instruction on classroom interactions. Journal of Special Education Technology, 14(2), 27-43.
Glenn, A. D. (2000). Connecting technology to content in learning. In American Association of Colleges for Teacher Education, Log on or lose out: Technology in 21st century teacher education (pp. 123-126). Washington, DC: Author.
456 Goldman, S. R., Petrosino, A. J., Sherwood, R. D., Garrison, S., Hickey, D. & Bran sford, J. D. et al. (1996). Anchoring science instruction in multimedia learning environments. In S. Vosniadou, E. D. Corte, R. Glaser, & H. Mandl (Eds.), International Perspectives on the Design of Technology Supported Learning Environments (pp. 257 284). New Jersey: Erlbaum. Goodlad, J. I. (1984). A place called school New York: McGraw Hill, Inc. Goodman, J. (1984). Reflection and teacher educati on: A case study and theoretical analysis. Interchange 15 (3), 9 26. Graeber, A. O. (1999). Forms of knowing mathematics: What preservice teachers should learn. Educational Studies in Mathematics, 38, 189 208. Groves, S. (1994, April). Calculators: A learning environment to promote number sense. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. Grossman, P. (1991). Mapping the terrain: Knowledge growth in teaching. In H. C. Waxman & H. J. Walberg (Eds.), Effective teaching: Current research (pp. 18 203). Berkley, CA: McCutchan Publishing. Grouws, D. A., & Shultz, K. A. (1996). Mathematics teacher education. In J. Sikula, T. J. Buttery, & E. Guyton (Eds.), Handbook of research on teach er education (pp. 442 458). New York: Macmillan.

PAGE 473

457 Grouws, D. A. & Smith, M. S. (2000). NAEP findings of the preparation and practices of mathematics teachers. In E. A. Silver & P. A. Kennedy (Eds.), Results from the Seventh Mathematics Assessment of th e National Assessment of Education Progress, (pp. 193 234). Reston, Va.: National Council of Teachers of Mathematics. Hannafin, R. D, Burruss, J. D., & Little, C. (2001). Learning with dynamic geometry programs: Perspectives of teachers and learners. The Journal of Educational Research, 94 (3), 132 144. Hanson, N. R. (1970). A picture theory of theory meaning. In R. G. Colodny (Ed.), The nature and function of scientific theories (pp. 233 274. Pittsburgh: University of Pittsburgh Press. Hart, K. (1984). W hich comes first length area, or volume? Arithmetic Teacher, 31, 16 18, 26 27. Healy, L. & Hoyles, C. (1999). Visual and symbolic reasoning in mathematics: Making connections with computers? Mathematical Thinking and Learning, 1(1), 59 84. Heid, K. M. ( 1997). The technological revolution and the reform of school mathematics. American Journal of Education, 106, 5 61. Proceedings of the 8t h International Conference for the Psychology of Mathematics Education Australia, 63 69. Hiebert, J. (1981). Units of measure: Results and implications from national assessment. Arithmetic Teacher, 28, 38 43.

PAGE 474

458 Hiebert, J. (1984). Why do some children have trouble learning measurement concepts? Arithmetic Teacher, 31 19 24. Hiebert, J., & Carpenter, T.P. (1992). Learning and teaching with understanding. In D. Grouws (Ed.). Handbook of Research on Mathematics Tea ching and Learning ( pp. 65 97). New York: MacMillan Publishing Company. Hiebert, J. & Lefevre, P. (1986). Conceptual and procedural knowledge of mathematics: An introductory analysis. In Hiebert, J. (Ed.), Conceptual and Procedural Knowledge: The Case of Mathematics (pp. 1 27). Hillsdale, NJ: Erlbaum. Hirstein, J. L., Lamb, C. E., & Osborne, A. (1978). Student misconceptions about area measure. Arithmetic Teacher, 25 10 16. knowledge for teaching on student achievement. American Educational Research Journal, 42 (2), 371 406. mathematics knowledge for teaching. The Elementary School Journal 105(1), 11 30. Hogle, J. G. (1995). Computer microworlds in education: Catching up with Danny Dunn. University of Georgia, Department of Instructional Technology. (ERIC Document Reproducation Service No. ED425738). Hovermil l, J. (2003). Technology supported inquiry learning with Fathom: A professional development project. Unpublished doctoral dissertation, University of Colorado, Boulder.

PAGE 475

459 Hoyles, C. (1991). Developing mathematical knowledge through microworlds. In A. Bishop S. Olsen, & J. Dormolen (Eds.), Mathematical knowledge: Its growth through teaching (pp. 147 172). Netherlands: Kluwer Academic Publishers. Hoyles, C., Noss, R., & Adamson, R. (2002). Rethinking the microworld idea. Journal of Educational Computing Resea rch, 27 (1/2), 29 53. Howe, R. (1999). Knowing and teaching elementary mathematics. [Review of the book Knowing and teaching elementary m athematics ]. Notices of the AMS, 46(8) 881 887. Hutchison, L. (1997). Learning for teaching: A case of constructing the bridge between subject matter knowledge and pedagogical content knowledge. Laramie: University of Wyoming. (ERIC Document Reproduction Service No. ED413332). International Society for Technology in Education. (2000). National educational technology standards for teachers Eugene, OR: Author. International Society for Technology in Education. (2008, June). Technology and student achievement An indelible link. Retrived October 4, 2009, from http://www.iste.org/content/navigationmenu/advocacy/policy/policy.htm Curriculum Press. Jensen, R., & Williams, B. (1993). Technology: Implications for middle grades mathe matics. In D. Owens & S. Wagner (Eds.), Research ideas in the classroom: Middle grades mathematics (pp. 225 243). New York: Macmillan Publishing. Johnson, H. C. (1986). Area is measure. International Journal of Mathematical Education in Science and Technol ogy, 17(4) 419 424.

PAGE 476

460 Jonassen, D. H. (1991). Objectivism versus constructivism: Do we need a new philosophical paradigm? Educational Technology Research & Development, 39(3), 5 14. Jonassen, D. H., Carr, C., & Yueh, H. P. (1998). Computers as mindtools fo r engaging learners in critical thinking. TechTrends, 43 (2), 24 32. Jonassen, D. H., & Reeves, T. C. (1996). Learning with technology: Using computers as cognitive tools. In D. H. Jonassen (Ed.), Handbook of research on educational communications and technology (pp. 693 719). New York: Macmillan. Kamii, C. (2006). Measurement of length: How can we teach it better? Teaching Children Mathematics 13, 154 158. Kamii, C. & Clark, F. B. (1997). Measurement of length: The need for a bette r approach to teaching. School Science and Mathematics 97(3), 116 121. Kaput, J. J. (1992). Technology and mathematics education. In D. A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 515 556). New York: Macmillian. Kaput, J J. (1994). The representational roles of technology in connecting mathematics with authentic experience. In R. Biehler, R. Scholz, R. Strasser, B. Winkelmann (Eds.), Didactic of Mathematics as a Scientific Discipline, (pp. 379 397). Dordrecht: Kluwer Aca demic Publishers. Kariuki, M. & Duran, M. (2004). Using anchored instruction to teach preservice teachers to integrate technology in the curriculum. Journal of Technology and Teacher Education, 12(3), 431 445.

PAGE 477

461 Keller, B. A., Hart, E. W., & Martin, W. G. (2001). standards for school mathematics School Science and Mathematics, 101(6), 292 304. Keller, B., Wasburn Moss, J., & Hart, E. (2002, June 21). Improving students' spatial visualization skills and teachers' pedagogical content knowledge by using on line curriculum embedded applets. Retrieved October 10, 200 4 from http://illuminations.nctm.org/downloads/IsoPaperV4.doc Kellogg, M. & Kersaint, G. (2004). Creating a v ision for t he standards using online videos in an elementary mathematics methods c ourse. C ontemporary Issues in Technology and Teacher Education [Online serial], 4 (1 ). Available at: http://www.citejournal.org/vol4/iss1/mathematics/article1.cfm Ken n edy, M. M., Ball, D. L., & McDiarmid, G. W. (1993). A study package for examining and tracking changes in teachers' knowledge (Technical Series 93 1). East Lansing, Michigan: The National Center for Research on Teacher Education. Kenney, P. A. & Kouba, V. L. (1997). What do students know about measurement? In P. A. Kennedy & E. A. Silver (Eds.), Results from the Sixth Mathematics Assessment of the National Assessment of Education Progress, (pp. 141 163). Reston, Va.: National Council of Teachers of Mathematics Kersaint, G. & Thompson, D. (2002). Editorial: Continuing the dialog on technology and mathematics teacher education. Contemporary Issues in Technology and Teacher Education [Online serial], 2 (2 ). Available at : http://www.citejournal.org/vol2/iss2/mathematics/article1.c fm

PAGE 478

462 Khoury, H.A., & Zazkis, R. (1994). On fractions and non standard representations: Pre Educational Studies in Mathematics 27 191 204. Kinzer, C. K., Gabella, M. S., Rieth, H. J. (194). An argument for using multimedia and a social studies. Technology and Disability, 3(2), 117 128. Klock, K. (2000). Technology and the transformation of learning In American Association of Colleges for Teacher Education, Log On or Lose Out: Technology in 21st Century Teacher Education (pp. 46 48 ). Washington, DC: Author. Komorek, M. & Duit, R. (2004). The teaching experiment as a powerful method to develop and evaluate teaching and learning sequences in the domain of non linear systems. International Journal of Science Education, 26 (5), 619 633. Kordaki, M. (2003). The effects of to regarding the concept of conservation of area. Educational Studies in Mathematics, 52, 117 209. Kouba, V. L., Brown, C. A., Carpenter, T. P., Lindquist, M. M., Silver, E. A., & Swafford, J. O. (1988). Re sults of the fourth NAEP assessment of mathematics: Measurement, geometry data interpretation, attitudes, and other topics. Arithmetic Teacher, 35 10 16. Kurz, T. L. & Bararelo, I. (2004). Using anchored instruction to evaluate mathematical growth and understanding. Journal of Educational Technology Systems, 33(4), 421 436. LaFrance. M. (1989 April ). The quality of expertise: Implications of expert novice differences fro m knowledge acquisition. SIGART Newsletter 108, 6 14.

PAGE 479

463 Lajoie, S. (1993). Computing environments as cognitive tools for enhancing learning. In S. Lajoie & S. Derry (Eds.), Computers as cognitive tools (Vol. 1, pp. 261 288). Hillsdale, NJ: Lawrence Erlbaum Associates. Lampert, M., & Ball, D. L. (1998). Teaching, multimedia, and m athematics: Investigations of real p ractice New York: Teachers College Press. Lappan, G., Fey, J. T., Fitzerald, W. M., Friel, S. N., & Phillips, E. D. (1998). Covering and surrounding: Two dimensional measurement. New York: Addison Wesley Longman, Inc. L eavy. A. (2006). Using data comparison to support a focus on distribution: Examining inquiry. Statistical Education Research Journal, 5 (2), 89 114. Lederman N., & Niess, M. improvement of teaching and learning? School Science and Mathematics, 100 (7), 346 8. Lehrer, R. (2003). Developing understanding of measurement. In J. Kilpatrick, W. G. Martin, & D. Schifter (Eds.), A Res earch Companion to Principles and Standards for School Mathematics (pp. 179 192). Reston, VA: National Council of Teachers of Mathematics. Lehrer, R. & Franke, M. L. (1992). Applying personal construct psychology to the study actions. Journal for Research in M athematics Education 2 3 223 241. Leinhardt, G., & Smith, D. A. (1985). Expertise in mathematics instruction: Subject matter knowledge. Journal of Educational Psychology, 77 (3), 247 271.

PAGE 480

464 Lindquist, M. M. (1997). NAEP find ings regarding the preparation and classroom practices of mathematics teachers. In P. A. Kennedy & E. A. Silver (Eds.), Results from the Sixth Mathematics Assessment of the National Assessment of Education Progress, (pp. 61 86). Reston, Va.: National Counc il of Teachers of Mathematics. Lindquist, M. M., & Kouba, V. L. (1989). Measurement. In M. M. Lindquist (Ed.), Results from the fourth mathematics assessment of the National Assessment of Educational Progress (pp. 35 43). Reston, VA: National Council of Teacher of Mathematics. Linn, R. L., & Slinde, J. A. (1977). The determination of the significance of change between pre and posttesting periods. Review of Educational Research, 47, 121 150. Livingston, C., & Borko, H. (1990). High school mathematics revi ew lessons: Expert novice distinctions. Journal for Research in M athematics Education 2 1 372 387 Lord, F. M. (1956). The measurement of growth. Educational and Psychological Measurement, 16, 42 1 437. Lowery, N. V. (2002). Construction of teacher knowle dge in context: Preparing elementary teachers to teach mathematics and science. School Science and Mathematics 102(2), 68 83. of fundamental mathematics in China and the Un ited States. Mahwah, NJ: Erlbaum.

PAGE 481

465 Maher, C. C., & Beattys, C. B. (1986). Examining the construction of area and its mesurement by te n to fourteen year old students. Proceedings of the 8 th International Conference for the Psychology of Mathematics Education Australia 163 168. Manouchehri, A. (1997). School mathematics reform: Implications for mathematics teacher preparation. Journal of Teacher Education, 48 (3), 197 209. Mapolelo, D. C. (1999). Do pre service primary teac hers who excel in mathematics become good mathematics teachers? Teaching and Teacher Education, 15 715 725. Marks, R. (1990). Pedagogical content knowledge: From a mathematical case to a m odified conception. Journal of Teacher Education, 41 (3), 3 11. Mart in, W. G., & Harel, G. (1989). Proof frames of preservice elementary teachers. Journal for Research in Mathematics Education, 20 41 51. Martin, W. G. & Strutchens, M. E. (2000). Geometry and measurement. In E. A. Silver & P. A. Kennedy (Eds.), Results fr om the Seventh Mathematics Assessment of the National Assessment of Education Progress, (pp. 193 234). Reston, Va.: National Council of Teachers of Mathematics. Marzano, R.J. (1998). A theory based meta analysis of research on instruction Aurora, CO: Mid continent Research for Education and Learning. Mathematics Association of America. (1991). A call for change: Recommendations for the preparation of teachers of mathematics. Washington, DC: Author.

PAGE 482

466 Mathematical Sciences Education Board. (1996 March ). The preparation of teachers of mathematics: Considerations and challenges (A letter report) Center for Science, Mathematics, and Engineering Education Retrieved July, 10, 2005 from http://books.nap.edu/html/teacher_preparation/TP_Text.htm McClain, K. (20 03). Supporting preservice t understanding of place v alue and multidigit a rithmetic M athematical Thinking and L earning, 5 (4), 281 306 McGowen, M. A. & Davis, G. E. (2002). Growth and development of pre service e owledge. Paper presented at the annual me e ting of the North American chapter of the International Group for the Psychology of Mathematics Education, Athens, GA. McIntyre, J. D. & Pape, S. (1993). Using video protocols to enhance teacher reflective thinkin g. Teacher Educator, 28(3) 2 10. McLarty, K., Goodman, J., Risko, V., Kinzer, C. K., Vye, N., Rowe, D., & Carson, J. (1990). Implementing anchored instruction: Guiding principles for curriculum development. In J. Zutell & S. McCormick (Eds.), Literacy the ory and research: Analyses from multiple paradigms (pp. 109 120). Chicago, IL: National Reading Conference. Means, B. (Ed.). (1994). Technology and education reform San Francisco: Jossey Bass Publishers. understanding of perimeter and area School Science and Mathematics, 98 (7) 361 368. Meredith, A. (1993). Knowledge for teaching ma thematics: Some student views. Journal of Education for Teaching, 19 325 338.

PAGE 483

467 Miles, M. B. & Huberman, A. M. (1994 ). Qualiative data analysis: An expanded sourcebook (2 nd ed.). Thousand Oaks, CA: Sage. Milken Exchange on Educational Technology. (1999). Will new teachers be prepared to teach in a digital age? Santa Monica, CA : Milken Family Foundation. Mitchell, J., & Williams, S. E. (1993). Expert/novice differences in teaching with technology. Paper presented at the Annual Meeting of the American Educational Research Association, Atlanta, GA. Moskal, B. M. & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, R esearch & Evaluation, 7 (10). Retrieved April 3, 2007 from http://PAREonline.net /getvn.asp?v=7&n=10. Moyer, P. S. (2001). Using representations to explore p erimeter and area. Teaching Children Mathematics, 8 52 59. National Center for Education Statistics (2003). NAEP Questions. Retrieved January 2, 2004 from http://www.nces.ed.gov/nationsreportcard/itmrls/searchresults.asp National Center fo r Education Sta tistics (2005). NAEP Questions. Retrieved May 6, 2007 from http://nces.ed.gov/nationsreportcard/itmrls/itemdisplay.asp National Center fo r Education Statistics (2007). NAEP Questions. Retrieved March 20 2007 from http://nces.ed.gov/nationsreportcard/itmrls/startsearch.asp National Council of Teachers of Mathematics. (1989). Curriculum and evaluation standards for school mathematics. Reston, VA: Author. National Council of Teachers of Mathematics. (1991). Profession al s tandards for teaching m athematics. Reston, VA: Author.

PAGE 484

468 National Council of Teachers of Mathematics. (2000). Prin ciples and standards for school mathematics Reston, VA: Author. education: Results on e ducation 1990. Washington DC: Author. National Research Council. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press. National Research Council. (2001). Adding it up: Helping children learn mathematics. Washington, DC: National Academy Press. Niemi, D. (1997). Cognitive science, expert novice research, and performance assessment. Theory into Practice, 36(4), 239 246. Logo. Journal for Research in Mathematics Education, 18, 343 362. Noss, R. (1988). The computer as a cultural influence on mathematical learning. Educational Studies in Mathematics, 19, 251 268. Nunes, T., Light, P. & Mason, J. (1993). Tools for thought: The measurement of length and area. Learning and Instruction, 3, 39 54. Nunnal l y J. C. (1978 ). Psychometric theory (2 nd ed.) New York: McGraw Hill Offic e of Technology Assessment (1995). Teachers and technology: Ma king the connection Washington, D.C.: U.S. G overnment Printing Office. Oliver, K. (1999). Anchored Instruction Retrived January 20, 2007, from http://www.edtech.vt.edu/edtech/id/models/powerpoint/anchored.pdf

PAGE 485

469 Outhred, L. N. & Mitchelmore, M. C. (2000) of rectangular area measurement. Journal for Research in Mathematics E ducation, 31 144 167. Paper t S. (1980). Computer based microworlds as incubators for powerful ideas. In R. Taylor (Ed.), The computer in the school: Tutor, tool, tutee (pp. 203 210). New Patton, M. Q. (2002). Qualitative research and evaluation methods (3 rd ed.) Thousand Oaks, CA: Sage. Pea, R. D. (1986). Cognitive technologies for mathematics education. In A. Schoenfeld(Ed.), Cognitive science and mathematics education (pp. 89 122). Hillsdale, NJ: Erlbaum. Pellegrino, J. W., & Altman, E. A., (1997). Information technology and teacher preparation: Some critical issues and illustrative solutions Peabody Journal of Education, 72 (1), 92 93. teaching and learning. Educational Researcher, 17 5 14. Piaget, J., Inhelder, B., & Szeminska, A. (1981). tion of Geometry. N.Y.: Norton and Company. Quinn, R. J. (1997). The effects of mathematics methods courses on the mathematical attitudes and content knowledge of preservice teachers. Journal of Educational Research, 91 (2), 108 113

PAGE 486

470 Rafilson, F. (1991). The case for validity generalization. Practical Ass essment, Research & Evaluation [Online serial], 2(14 ). Available at: http://paraonline.net/getvn.asp ?v=2&n= 13 How to evaluate progress in problem solving. Reston, VA: National council of Teachers of Mathematics. Reeves, T. C. (1999, April). A model to guide the integration of the WWW as a cognitive tool in K 12 education. Paper presented at the annual meeting of t he American Educational Research Association, Montreal, Quebec, Canada. Retrieved December 10, 2006, from http://it.coe.uga.edu/~treeves/AERA99Web.pdf School Science and Mathematics, 97, 75 77. coordination of units in an area setting. Journal for Research in Mathematics E ducation, 27 564 581. base in professional development. Educational Researcher, 27 (5), 27 31. Rich, B., Lubinski, C., & Otto, A. (1994). Pedagogical content knowledge, curricular knowledge and teacher c hange Paper presented at the annual me e ting of the North American chapter of the international Group for the Psychology of Mathematics Education, Baron Rouge. Rieber, L. P. (1992). Computer based microworlds: A bridge between constructivism and direct instruction. Educational Technology Research and Development, 40(1), 93 106.

PAGE 487

471 Rieber, L. P. (1994). Computers, graphics, and learning Madison, Wisconsin: Brown & Benchmark. Rieber, L. P. (2004). Microworlds. In D. Jonassen (Ed.), Handbook of research on educational communications and technologies (583 603). Mahwah, NJ: Erlbaum. Rogosa, D. R., Brandt, D., & Zimowski, M (1982). A growth curve approach to the measurement of change. Psychological Bulletin, 90, 726 748. Rojano, T. (1996). Developing algebraic aspects of problem solving within a spreadsheet environment. In N. Bednarz, C. Kieran, & L. Lee (Eds.), Approaches to algebra: Perspectives for research and teaching (137 145). Boston, MA: Kluwer Academic Publishers. Rowan, B., Schilling, S. G., Ball, D. L., & Miller, R. (2001). pedagogical content knowledge i n surveys: An exploratory study (Study of Instructional Improvement, Research Note S 2). Ann Arbor, Michigan: University of Michigan. Rutledge Z., Kloosterman, P., & Kenney, P. A. (2009). Mathematics skills and NAEP results over a generation. Mathematics Teacher, 102(6), 445 451. Sanders, S. E. empowerment or debilitation? Educational Studies in Mathematics 26, 397 408. Focus, rig or, coherence. American Educator, 32(1), 20 24. School Mathematics Study Group. (1972). Correlates of mathematics achievement: Teacher background and opinion variables. In J. W. Wilson and E. A. Begle (Eds.), NLSMA Reports (No. 23, Part A). Palo Alto, CA: Author.

PAGE 488

472 Serafina, K. & Cicchelli, T. (2003). Cognitive theories, prior knowledge, and anchored instruction on mathematical problem solving and transfer. Education and Urban Society, 36(1), 79 93. Shamatha, J. H., Peressini, D., & Meymaris, K. (2004). Tech nology supported mathematics activities situated within an effective learning environment theoretical framework. Contemporary Issues in Technology and Teacher Education 3 (4), 362 381. S hulman, L. S. (1986). Those who understand: Knowledge growth in teachi ng. Educational Researcher, 15 (2), 4 14. Shulman L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57, 1 22. Sheets, C. (1993). Effects of computer learning and problem solving tools on the development of sec functions (Doctoral dissertation, University of Maryland College Park, 1993). Digital Dissertations, 54, 1714. Shyu, H. (1999). Effects of media attributes in anchored instruction. Journal of Educationa l Computing Research, 21(2), 119 139. Journal for Research in Mathematics E ducation, 24 233 254. Simon, M. A. (2000). Research on the development of mathematics teachers: The teacher development experiment. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education, (pp. 335 359). New Jersey: Lawrence Erlbaum.

PAGE 489

473 Simon M., & Blume, G. (1994 a ). Mathematical modeling as a component of understanding ratio as measure: A study of prospective elementary teachers. Journal of Mathematical B ehavior 13 183 197. Simon, M., & Blume, G. (1994b). Building and understanding multipl icative relationships: A study of prospective elementary teachers. Journal for Research in Mathematics E ducation, 25 472 494. Simon, M., & Blume, G. (1996 ). Justification in the mathematics classroom: A study of prospective elementary teachers. Journal of Mathematical Behavior, 15 3 31. from perspectives: Generating accounts practice Journal for Research in Mathematics E ducation 30 252 264. Simonsen, L. M., & Dick, T. P. (1997). Te a calculator in the mathematics classroom. The Journal of Computers in Mathematics and Science Teaching, 16(2/3) 239 268. Sinclair, N. (2005). Mathematics on the internet. In S. Wilder & D. Pimm (Eds.), Teaching secondary mathematics with ICT, (pp. 203 216). New York: Open University Press. Smith, K. B. (1996). Guided discovery, visualization, and technology applied to the ne w curriculum for secondary mathematics. Journal of Computers in Mathematics and Science teaching, 15 (4), 383 399. Sonnabend, T. (2004). Mathematics for teachers: An interactive approach for grades K 8 (3 rd ed.). Australia: Broke s /Cole.

PAGE 490

474 Stacey, K., Helme, S., Steinle, V., Baturo, A., Irwin, K., & Bana, J. (2001). Preservice Journal of Mathematics Teacher Education, 4 205 225. Stager, R. A. & Wales, C. E. (1972). Guided design: A new concept in co urse design and operation. J ournal of Engineering Education, 62 (6), 539 541. S teffe L. P. (1983) The teaching experiment methodology in a constructivist research program. In M. Zwerg et al. (E ds ), Proceedings of the Fourth International Congress on Mathematical Education (Bosten: Birkhuser), 469 471. Using teaching experiments to enhance (eds.), Improving teaching and le arning in science and mathematics, (pp. 65 76). New York: Teachers College Press. Steffe, L. P. & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements In A. Kelly & R. Lesh (Eds.), Handbook of research desi gn in mathematics and science education, (pp. 267 306 ). New Jersey: Lawrence Erlbaum. Steffe, L. P. & Wiegel, H. G. (1994). Cognitive play and mathematical learning in computer microworlds. Educational Studies in Mathematics, 26, 111 134. Stein, M. K., Ba xter, J. A., & Leinhardt, G. (1990). Subject matter knowledge and elementary instruction: A case from functions and graphing. American Educational Research Journal, 27 (4), 639 663.

PAGE 491

475 Steinberg, R., Haymore, J., & Marks, R. (1985, April). e and structuring content in mathematics. Paper presented at the Annual Meeting of the American Educational Research Association, Chicago. Stoddart, M., Connell, M., Stofflett, R., & Peck, D. (1993). Reconstructing elementary ding of mathematics and science content. Teacher & Teacher Education 9 (3), 229 241. Stohl, H., & Tarr, J. E. (2002). Developing notions of inference using probability simulation tools. Journal of Mathematical Behavior, 21, 319 337. Stone, M. E. (1994). Tea ching relationships between area and perimeter with the The Mathematics Teacher, 87, 590 594. Strudler, N., Quinn, L.F., McKinney, M., & Jones, W.P. (1995). From coursework to the real world: First year teachers and technology. In D. A. Willis, B. Robin, & J. Willis (Eds.), Technology and teacher education annual, 1995 Charlottseville, VA: AACE. Strutchens, M. E. & Blume, G. W. (1997). W hat do students know about geometry? In P. A. Kennedy & E. A. Silver (Eds.), Results from the sixth mathematics assessment of the national assessment of education p rogress, (pp. 165 194). Reston, Va.: National Council of Teachers of Mathematics. Strutchen s, M. E., Harris, K. A., & Martin, W. G. (2001). Assessing geometric and measurement understanding using manipulatives. Mathematics Teaching in the Middle School, 6, 402 405. Sullivan, P. & Lilburn, P. (2002). Good questions for math teaching: Why ask them and what to ask, K 6. Sausalito, CA: Math Solutions Pub lications.

PAGE 492

476 Swafford, J. O., Jones, G. A., & Thor n ton, C. A. (1997). Increased knowledge in geometry and instructional practice. Journal for Research in Mathematics E ducation, 28 467 483. Thompson, A D (2000). The challenge of faculty prof essional development: New approaches and structures for teacher educators In American Association of Colleges for Teacher Education, Log On or Lose Out: Technology in 21st Century Teacher Education (pp. 162 166 ). Washington, DC: Author. Thompson, D. R. & Senk, S. L. (1998). Using rubrics in high school mathematics courses. The Mathematics Teacher, 91, 786 793. Tierney, C., Boyd, C., & Davis, G. (1986 ). Pros p e area. Proceedings of the 14 th International Conference for the Psychology of Mathematics Education, Mexico, 2, 307 315. Timmerman, M. A. (2004). Using the Internet: Are prospective elementary teachers prepared to teach with technology. Teaching Children Mathematics, 10, 41 0 415. Timmerman, M. A. (1999). Learning in the context of a mathematics teacher education mathematics teaching, and the teaching of mathematics with technology. Proceedings of t he 10 th International Conference of the Society for Information Technology & Teacher Education, USA, 998 1003. Tobin, K. (2000). Interpretive Research in Science Education. In Handbook of Research Design in Mathematics Science Education In A. Kelly & R. L esh (Eds.), Manwah, NJ: Lawrence Erlbaum Publishers.

PAGE 493

477 Triadafillidis, T. A. (1995). Circumventing visual limitations in teaching the geometry of shapes. Educational Studies in Mathematics, 225 235. Tzur, R. & Timmerman, M. (1997 ). Why do we invert and multiply? Elementary In J. A. Dossey, J. O. Swafford, M. Parmantie & A. E. Dossey (Eds.), Proceedings of the Annual Meeting of the International Group for the Psychology of Mathematics Education (pp. 553 559). Columbus, OH: ERIC Clearinghouse for Science, Mathematics, and Environmental Education. Valverde, G. A. & Schmidt, W. H. (Winter, 1997 98). Refocusing U.S. Math and Science Education, Issues in Science and Technology 1 6. Van de Walle, J. A. Elementary and Middl e School Mathematics: Teaching D evelopmentally (7 th ed.), Boston: Pearson Education, Inc. & R. Tischler (Eds.), English tr anslation of selected writings of Dina van Hiele Geldof and Pierre M. van Hiele (243 252). Brooklyn, NY: Brooklyn College, School of Education. Wales, C. E. and R. A. Stager (1977). Guided Design New York, National Centre for Guided Design: Author. Walter, M. (1970). A common misconception about area. Arithmetic Teacher, 17, 286 289. Wearne, D., & Hiebert, J. (1988). A cognitive approach to meaningful mathematics instruction: Testing a local theory using decimal numbers. Journal for Research in Mathe matics Education, 1 9 371 384.

PAGE 494

478 Wetherill, K., Midgett, C., & McCall M. (2002). Determining the impact of applet based instructional materials on teacher knowledge of content and pedagogy, instructional planning, and student learning of fractions. Retri e ved October, 12, 2004, from http://illuminations.nctm.org/downloads/UNCWrschReport.pdf Whitehead, A. N. (1929) The aims of education New York: Macmillan. Willett, J. B. (1988). Questions and answers in the measurement of change. Review in Research in Ed ucation 15 345 422. Willis, J. W. & Mehlinger, H. D. (1996). Information technology and teacher education. In J. Sikula, T. J. Buttery, & E. Guyton (eds.), Handbook of research on teacher education (2 nd ed.), pp. 978 1029. New York: Simon & Schuster Macmillan. Wil son, M. R. (1994). One preservic impact of a course integrating mathematical content and pedagogy. Journal for Research in Mathematics Education, 25 346 370. Wilson, P. S. & Rowland, R. (1 993). Teaching meansurement. In R. J. Jensen (Ed.). Research ideas for the classroom: Early childhood mathematics (pp. 171 194). Reston, VA: National Council of Teachers of Mathematics. Woodward, E., & Byrd, F. (1983). Area: Included topic, neglected conc ept. School Science and Mathematics, 83 (4), 342 47. Yelland, N. (2002). Creating microworlds for exploring mathematical understandings in the early years of school. Journal of Educational Computing Research, 27(1&2), 77 92. Yin, R. K. (199 4). Case study d esign: Design and methods (2 nd ed.) (Applied Social Research M ethods Series, Vol. 5). Thousand Oaks CA: Sage Publications, Inc.

PAGE 495

479 Zeichner, K & Tabachnick, B. R. (1981). Are the effects of university teacher education washed out by school experience? Journal of Teacher Education, 32(3), 7 11. Zimmerman, D. W., & Williams R. H. (1982) Gain scores in research can be highly r eliable Journal of Educational Measurement 19 (2), 149 154.

PAGE 496

480 APPENDICES

PAGE 497

481 Appendix A : P iloting of Instruments Timeline and Summary of Piloting Sessions Spring, 2004 A 16 question (14 open ended, two multiple choice) area and perimeter assessment was administered. The problems pertained to student difficulties with area and perimeter as presented in the literature. Before the assessments were collected, the preservi ce teachers were shown four web based microworlds that appeared appropriate for exploring area and perimeter concepts. Because we were conducting class in a computer lab, the students were then given the chance to review their answers to the assessment an d make appropriate changes. They were asked to provide feedback regarding which applets they liked and why. how I could see the relationship of doubling the perimeter, but quadru During this exploration time, I was able to observe the students interacting within the microworlds and question them on their choices and the features of the applets. Informal analysis revealed that in order to elicit more reflective fee dback future assessments would need to ask for greater justification of answers as well as specifically asking the preservice teachers to explain their responses as if they were talking to an elementary student. It was also found that certain questions wou ld have a tendency to bias others. Fall, 2004 First, near the beginning of the semester a version of the proposed questionnaire was administered to the students in my methods of teaching elementary mathematics course. All of the preservice teachers surve yed indicated

PAGE 498

482 Appendix A (Continued) that they were not aware of any specific technology that could be used to enhance the responses on the questionnaire, the format was changed to be more standardized, check boxes were added, and more opportunities for open ended responses were included. About a month later, a 13 question pretest was administered. Two subsequent whole class discussions addressed the area and perimeter misconcepti ons that were infused into the q uestions. The number of microworlds now being considered was down to three, from the previous four. Those three applets were used as part of instruction for the whole class discussion and their effectiveness was evaluated by I really like this (the Shodor) web site because it gave me a chance to practice area and perimeter and gave me immediate feedback. I was able to instantly see if I was right or wrong in my a nswer. Two weeks after the pretest a 14 question posttest (similar but not parallel) was administered. Wording of questions was again refined, and the time required to take the test was evaluated. It was concluded that statements such and posttests revealed students did mu ch better on the posttest and further work appeared promising. Spring, 2005 What had proved to be the six most challenging area and perimeter items from previous assessments were administered to the students in my methods of

PAGE 499

483 Appendix A (Continued) teaching secondary mathematics course for the purpose of formulating follow up interviews. Purposeful sampling of two students resulted in the opportunity to design semi structured interviews to further probe the understandings underlying their responses t o the area and per imeter questions. Fall, 2005 Versions of the pre and posttest were administered as well as a five question follow up test that was incorporated into their final exam at the end of the semester. As a result of this pilot work, more exp licit directions were written and the questions focusing on content knowledge and knowledge of student thinking were separated and identified within the test. Lists were written identifying questions that would bias each other as well as one indicating pai rs of parallel questions based on content and difficulty. Spring, 2006 Considerable time was spent in revising items for the area and perimeter assessments. This version included more formal and explicit directions and separate sections were created indi cating content knowledge (CK) and knowledge of student thinking (KoST) questions. The revised assessment was administered to a section of students in a course titled, Teaching Elementary School Mathematics at a nearby large southeastern university. These a ssessments were individually scored by both the researcher and a second scorer using specially designed rubrics (Appendix H). As a result of the difficulties with double scoring the 27 tests, considerable revisions to the scoring rubrics (Appendix G contains examples of I ) were created and incorporated into the rubrics to help distinguish between scores bordering

PAGE 500

484 Appendix A (Continued) between two scores on the rubric. Important results from this scoring session w ere the breaking down of content knowledge questions into a part (a) in which correctness would be considered and a part (b) which would decipher the quality of their explanation. The knowledge of student thinking quest ions were constructed to named in the item correctly solved the problem presented, a part (b) which asked for c) asking for how the test taker, thinking and functioning as a teacher, would respond to the student in the hypothetical problem or situation. Summer, 2006 Revisions from the spring pilot were applied and two different, but similar versions, of the are a and perimeter assessment were constructed and administered to two summer sessions of a teaching elementary school mathematics course at the same southeastern university. Fall, 2006 A five question, multiple part area and perimeter assessment was admin course. These questions were purposely chosen because they had produced the most divers e responses in previously administered assessments, and the researcher wanted to pilot the revise d versions of these questions in order to generate interview protocols as well as a follow up instructional session involving anchored instruction Near the end of the semester, an early version of the Teaching Episode format was piloted. The purposes wer e to: (a) observe how the preservice teachers (PTs) worked through the problem solving scenario presented, (b) observe how they

PAGE 501

485 Appendix A (Continued) interacted with the Explorelearning microworld and how the applet influenced their problem solving app roaches and their instructional suggestions, (c) observe the level and value of PT s cooperative format (d) accumulate written data to provide insight into and allow for analysis of the thinking regar ding content knowledge and knowledge of student thinking of area and perimeter, and (e) provide visual, audio, and written feedback regarding the current format of the Teaching Episode. Analysis revealed that most of the PT s were currently at a novice stage in both their treatment of content knowledge and knowledge of student thinking. They spent minimal time analyzing the mathematics of the problem; hence, they initially overlooked mathematical subtleties of the problem a valuable skill of experienced and effective teachers. For some students the applet did not seem to facilitate mathemat ical or pedagogical growth ; however, others indicated signs of growth in both categories. One of the PT, Kristen indicated a growth in co ntent knowledge by writing Pete (the fictitious student presented in th e focus problem of the Teaching Episode) because I forgot that an area of 18 meant that 18 square units could f it into the rectangle When I used the Gizmo I saw for myself that there Several PTs tended to focus on the content at hand (e.g., using the correct formula to find area) instead of the student and how to help him conceptually understan d the problem. Rebekah, on the other hand, showed some growth in her pedagogical content knowledge

PAGE 502

486 Appendix A (Continued) centimeter is. It might be helpful to show him a grid like the one on the Gizmo website. If he can picture what exactly he is measuring, he will be learning more that the ma jority of the PTs were thoughtfully involved with the problem solving scenario and being introspective about their current understandings regarding the problem at hand. Conclusion : Summary of Design, Training, and Major R esults Involving the Instruments as a Result of Piloting Area & Perimeter Tests The end result of applying the search criteria presented in chapter 2 was a collection of 28 questions that were then categorized as most appropriate (or easiest to modify) to address either content knowledge (CK) or knowledge of student thinking (KoST) regarding area and perimeter. A total of 35 items (some were various for ms of the same problem) were then piloted. Content knowledge problems were amended to ask the participant to perform a calculation or answer a constructed response question and then to explain how they arrived at their answer. The knowledge of student thin king problems typically have three parts: (a) decide if the thinking, solution, method, or claim presented regarding a hypothetical student is correct, (b) justify their response to part (a), and (c) as a teacher explain exactly how they would respond to t he mathematical thinking of the hypothetical student or students presented in the problem. A statement similar to,

PAGE 503

487 Appendix A (Continued) proved more effective at eliciting the desired level of reflection. Many of the problems required slight modifications including the addition of appropriate drawings and grids for knowledge of student thinking problems evolved through piloting and were finalized by the fourth and last pilot test. Training and scoring sessions (discussed later) conducted with the second scorer proved very helpful in strengthening certain test items to be used in future piloting while also eliminating other weaker items. The potential to illicit a range of thoughtful responses was very important in the item selection process. As the tests were created issues such as posttest sensitivity were considered and plann ed for. Posttest sensitization can occur when the posttest inadvertently acts as a learning experience in its own right (Gall, Borg, & Gall, 1996). To address this possibility, the posttest will consist primarily of parallel items to the pretest with two i tems the same as the pretest and one item modified slightly. Because the follow up test is interested more in retention than growth, it will contain the same items as the pretest. Rubric Scoring A zero to four point scale was utilized and the criteria fo r the different score levels was initially based on the sub et al., (1996, p. 143). A score of 0, 1, or 2 was conside red unacceptable and a score of 3 or 4 was considered acceptable. The researcher and a secondary scorer were involved in

PAGE 504

488 Appendix A (Continued) numerous training rounds of scoring and revising to both the rubrics and the format of the testing items (see Appendix A). For example, the first dual scoring session of five area and perimeter tests incorporated the use of holistic scoring rubrics (Append ices G and H) and an anchor paper (Thompson & Senk 1998). The fact that roughly 35 open ended questions were going to be piloted and the participants were frequently encouraged to explain and justify their responses produced too many response variations f or effective use of anchor papers. Instead the language of the rubrics was g radually refined (see Appendix H ) to reflect a conscious effort to separate a procedurally oriented response from a more conceptual ly based one. A score of two became the dividing line to separate a procedural only response and one demonstrating conceptual understanding of the concepts at hand. That is, the best score that a response lacking conceptual understanding could receive is a two. Later on in the training process tables wer e created to succinctly al misconception (see Appendix I ) and help differentiate a response emphasizing procedures from one focusing on understanding. Table 2 contains information on only the items proposed for use in the pre post and follow up tests. The item specific tables supplemented the scoring rubrics During the training process, the tables were clarified to im prove consistent application and separate procedural only from conceptual based responses. Training & Scoring The area and perimeter testing instrument was piloted three separate times. Each pilot used a test containing a majority of different questions w ith one or two problems

PAGE 505

489 Appendix A (Continued) revised from previous pilots. The first test pilots contained 15 questions which proved difficult to complete in the preferred one hour time constraint. The second and third tests were shortened to 10 items, but still were producing reliable measures. Copies of the 27 tests from the first piloting of the area and perimeter assessment instruments were mailed to the second scorer. Soon after, the first training session occurred and involved familiarizing the second scorer with the goals of the study, the nature and objectives of the area and perimeter tests themselves, and the scoring rubrics for content knowledge and knowledge of student thinkin g items on the tests (Appendix H ). During the first session, the wording of various sections of the rubrics was clarified and the session concluded with some important revisions regarding differentiating specific criteria for certain scores on the rubrics, including the importance of diagrams for responses. It was agree d upon that when scoring the tests we would grade by items (i.e. grade the first problem on all tests before grading the second problem). It was decided that we would completely score all 15 items for two randomly selected tests. We then worked through ea ch item, discussing how and why we arrived at the scores we did. We spent extra time discussing the responses we scored differently. We concluded with a general reminder to focus on conceptual understanding and use that construct in the process of separati ng acceptable from unacceptable responses within the range of the rubric criteria. The first session also resulted in making sure all test items clearly separated the types of responses (e.g., correctness, explain your thinking, explain the thinking of t he student in the problem). Before the second training session occurred, both the researcher and the second

PAGE 506

490 Appendix A (Continued) scorer independently scored all 15 items on three more tests. The purpose of the second training session was to use all d isagreements to help clarify the scoring rubrics to improve consistent application of the criteria and to strengthen the testing items through revision of confusing language. The format of items was modified to improve the potential of diverse and rich res ponses. For example, in addition to asking the participant s (preservice elementary teachers) to attempt to explain what they thought the student in a certain problem might have been thinking when making their (incorrect) response, when appropriate the part icipant was also asked to explain what and why elementary students might have difficulty with a particular question or concept. This change produced a greater range and depth of responses on future piloted tests. Three more training sessions were conduct ed. Because of the large number of items being piloted (28), there was a concerted effort to clarify the language of the rubric so as to avoid item specific rubrics. Each session would involve independently scoring all 15 items for five tests and then comp aring all scores and then discussing the modifying of items and rubric revisions. There were several important results of these sessions, including: (a) appropriate units must be included to receive a score of 4, (b) conceptually wrong responses cannot rec eive a score higher than 2, (c) rubric language was clarified to increase the consistency in distinguishing between a score of 3 and 4 especially for the Knowledge of Student thinking rubric, and (d) before any future scoring was conducted, the researche r should create tables specifying the concepts and misconceptions being addressed by each item (Appendix I ). This proved instrumental to future scoring sessions.

PAGE 507

491 were d ouble scored. There were two important results from subsequent discussions. First, misconceptions was addressed. Second, the rubrics were to be the primary scoring tools with the Concepts and Misconceptions table assisting with responses that were Throughout the scoring of the first 27 tests, repeated revisions and modifications were made to the scoring rubrics and their applic ation. An example of a clarification that arose during the training process involved the criteria for separating a score of 3 from a score of 4. For both rubrics a top score of 4 was reserved for what is termed a model response that demonstrates a thorough a completely correct response including precise terminology, notation, and execution of algorithms, and provides diagrams or pictures to support/explain the response. A score of 3 also represents a successf ul or acceptable response and differs from a 4 in that it an essentially correct response but may contain minor computational errors, and includes a picture or diagra m that may contain minor errors (e.g., not drawn to scale) but offers very little explanation, or provides a detailed explanation but no supporting picture or diagram. In an earlier version of the rubrics, the language describing a score of 3 and 4 simply made reference to the inclusion of diagram or picture to support the response. The need to clarify and specify the scoring criteria became evident in several items. An example illustrating the need to clarify the scoring involved the use, misuse, or omission of an appropriate diagram or picture along with the response is the following onship

PAGE 508

492 how would you present the explanation for how you arrived at your answer to a class of 4 th or 5 th participant to correctly explain how they arrived at their answer but their often lengthy responses were somewhat confusing, and would certainly be so to a 4 th or 5 th grader. Therefore, it was decided that a response mathematically and procedurally correc t but lacking a diagram that would help a student conceptualize the explanation (or a diagram with insufficient explanation) would receive at best a score of 3. It was important to establish a model response as one procedurally correct and conceptually rob ust, and including an appropriate diagram or picture to support an explanation geared toward elementary students was deemed necessary. The training and cooperative revising proved successful. The results from the inter rater reliability process include: c larifications made in the language of the holistic scoring rubrics, the addition of Concepts and Misconceptions tables, and improvements in item format and wording including the elimination of several items. These improvements were implemented in the sco ring of all subsequent test papers, and high scoring reliability was achieved throughout. The training and scoring sessions for the first batch of 27 tests had a robust inter rater reliability of 94%. The second and third scoring sessions had a slight drop in inter rater reliability, 88% and 86%. These two subsequent scoring sessions involved only 10 item tests, which helps to explain the drop in inter rater reliability. Also, the test used for the third pilot contained four problems which had negative corr ected item total correlation. These problems were removed from consideration for this study.

PAGE 509

493 A ppendix B: SYLLABUS FOR METHODS COURSE METHODS OF TEACHING ELEMENTARY MATHEMATICS COURSE GOALS This course focuses on discovering the reasons behind the actions in mathematics. This course is required in the undergraduate program in Elementary Education. It provides the development of knowledge and skills necessary to prepare students to assume roles as teachers of mathematics in elementary classes. The National Council of Teachers of Mathematics (NCTM) in its Guidelines for the Preparation of Teachers recommends such a course. The vision of mathematics learning espoused by the NCTM ass umes the following: Knowing mathematics means being able to use it in powerful ways. To learn mathematics, Students must be engaged in exploring, conjecturing, and thinking rather than only in rote learning of rules and procedures. Mathematics learning is not a spectator sport. When students construct personal knowledge from meaningful experiences, they are much more likely to retain and use what they have learned. This fact underlies of mathematics, to view and use it as a tool for reasoning and problem solving ( Curriculum and Evaluations Standards for School Mathematics: Executive Summary, NCTM, March 1989, p. 5 ). The purpose of this course is to provide opportunities for preservic e teachers to examine and build upon their understandings of various mathematics topics, and to construct a vision of teaching and learning mathematics that considers the goals and the assumptions of the current reform movement in mathematics education. Co ntent, methods, and materials for teaching elementary school mathematics will be examined cooperatively. As a perspective elementary teacher it is important to: Develop a conceptual understanding of the mathematics topics. Think about the kinds of mathem atics students can learn through the use of multiple representations (i.e., applets, manipulatives ) Evaluate mathematical activities from the standpoint of a teacher.

PAGE 510

494 Appendix B (Continued) III. COURSE OBJECTIVES Upon completion of this course, students will have demonstrated: 1. Knowledge of the major goals and characteristics, including scope and sequence, of elementary school mathematics programs, and aspects of theories of learning as applied to the planning and instruction for the teachin g of elementary school mathematics. 2. Knowledge of the current developments in education, including research that may affect the elementary school mathematics curriculum. 3. Knowledge of the properties of a number system and their application in the t eaching of elementary school mathematics. 4. Knowledge of pre number concepts and ideas and their application in the teaching of elementary school mathematics. 5. Knowledge of numeration concepts and principles and their application within the Hindu Arabic System. 6. Knowledge of the whole number concepts, principles and computational skills (algorithms) and their application in the teaching of elementary school mathematics. 7. Knowledge of number theory concepts and principles and their app lication in the teaching of elementary school mathematics. 8. Knowledge of rational number (fraction and decimal) concepts, principles and computational skills (algorithms) and their application in the teaching of elementary school mathematics. 9. Know ledge of problem solving process/strategies and their application in the teaching of elementary school mathematics. 10. Knowledge of and an ability to use the various tools available to the elementary teacher to aid in the effective teaching of elementary mathematics (e.g., traditional concrete manipulatives as well as technological advances, for example, the Internet including various web applets)

PAGE 511

495 A ppendix C : PRE STUDY SURVEY QUESTIONNAIRE Survey Questionnaire Name :______________________________ Classification : Junior or Senior Age: (check one) ____ 18 22 Gender: ____ Male ____ 23 27 ____ Female ____ 28 32 ____ 33 37 ____ 38 or older Please indicate or write in your response to each question below. PLEASE CHECK ONLY ONE BOX. Yes No 2. Did your Geometry class include doing proof (e.g., two column proofs)? PLEASE CHECK ONLY ONE BOX. Yes No 3 Did you take any other classes in high school, besides Geo m e try that included geometry topics? PLEASE CHECK ONLY ONE BOX. Yes No If yes, what was the course called?

PAGE 512

496 Appendix C (Continued) 4 did you learn about area or, perimeter or both in your geometry class(es)? PLEASE CHECK ONLY ONE BOX. Area Perimeter Both 5 Were there any other high school classes in which you remember studying area ? PLEASE CHECK ONLY ONE BOX. Yes No If yes, please give details of the class(es). 6 Were there any other high school classes in which you remember studying perimeter ? PLEASE CHECK ONLY ONE BOX. Yes No If yes, please give details of the class(es). 7 Have you taken MAT 145, Liberal Arts Mathematics ? PLEASE CHECK ONLY ONE BOX. Yes No When? 8 I f you answered yes to question 6, who was your instructor when you took MAT 145, Liberal Arts Mathematics ?

PAGE 513

497 Appendix C (Continued) 9 Did you study area or perimeter or both in MAT 145, Liberal Arts Mathematics ? PLEASE CHECK ONLY ON E BOX. Area Perimeter Both 10 Have you studied area in any other college mathematics courses? PLEASE CHECK ONLY ONE BOX. Yes No If yes, please explain. 11 Have you studied perimeter in any other college mathematics courses? PLEASE CHECK ONLY ONE BOX. Yes No If yes, please explain. 12 Are you currently taking a mathematics course this semes ter? PLEASE CHECK ONLY ONE BOX. Yes No If yes, please list it. Who is your instructor for that course?


498 Appendix C (Continued)

13. Do you remember ever using concrete manipulatives (i.e., square tiles, geoboards) when learning about area? PLEASE CHECK ONLY ONE BOX. Yes  No
If yes, what manipulatives did you use and what do you remember doing with them?
14. Do you remember ever using concrete manipulatives (i.e., square tiles, geoboards) when learning about perimeter? PLEASE CHECK ONLY ONE BOX. Yes  No
If yes, what manipulatives did you use and what do you remember doing with them?
15. Do you remember using any forms of technology (i.e., computer software or the Internet) when learning about area and perimeter? PLEASE CHECK ONLY ONE BOX. Yes  No
If yes, what form(s) of technology and what do you remember about the experience?
16. What is your opinion on using technology (e.g., computers and/or the Internet) to help elementary students learn about area and perimeter?


499 Appendix C (Continued)

17. How confident are you currently about teaching area and perimeter concepts to elementary-age children? PLEASE CHECK ONLY ONE BOX. apprehensive  confident  very confident
Please share the reasons behind your response.
18. Are you aware of any specific technology currently available to assist elementary teachers in gaining a better understanding of area and perimeter concepts? PLEASE CHECK ONLY ONE BOX. Yes  No
If yes, please explain.
19. Are you aware of any specific technology currently available to assist elementary teachers when instructing children regarding the concepts of area and perimeter? PLEASE CHECK ONLY ONE BOX. Yes  No
If yes, please explain.
20. If web-based technologies (i.e., Internet activities) were available to help you teach elementary children about area and perimeter, would you feel confident using them in your future classroom? PLEASE CHECK ONLY ONE BOX. Yes  No
Why or why not?


500 Appendix C (Continued)

21. If you answered no, what do you think it would take for you to feel more confident in using technology to teach mathematics to your future students?
22. Do you feel web-based technologies (i.e., Internet activities) are an appropriate tool to assist in teaching area and perimeter to elementary children? PLEASE CHECK ONLY ONE BOX. Yes  No
Please explain the reason(s) behind your choice.
23. If you had to teach area and perimeter to elementary children tomorrow:
(a) What specifically would you tell them about the concepts? Perimeter  Area
(b) What would you do to help them understand the concepts? Perimeter  Area
24. What do you think your future students may find difficult regarding the learning of: Perimeter  Area


501 APPENDIX D: AREA AND PERIMETER PRETEST 6
NAME _______________________________
Classification __________________________
Gender ______________________________
6 The actual tests (pre, post, and follow-up) had more room to show work than those appearing in the appendices. Typically, there were one or two questions per page.
DO NOT WRITE IN THIS AREA  Student Number: _________


502 Appendix D (Continued)

NAME __________________________________
Area and Perimeter Pretest
All explanations and diagrams should be appropriate for elementary-age students. Please do not use a calculator. Be sure your answers include proper units. Write with pencil and please write legibly. Feel free to use the back of any page for comments regarding any questions you found confusing and explain why it confused you.
PART I: Content Knowledge (CK)
For questions 1-5: (1) Answer each question the best that you can. (2) It is very important to use thorough and detailed explanations to fully represent your knowledge.
1. (a) On the grid below, draw a polygon that has a perimeter of 24 units.
(b) How would you help a 5th grader understand that the polygon you drew really does have a perimeter of 24?
= 1 square unit
2. Present a real-world situation (or story problem), appropriate for 4th or 5th graders, in which they would need to find the area of a specified region.


503 Appendix D (Continued)

3. What is the area and perimeter of Figure A? 1 cm (All corners are right angles.)
(a) area =   perimeter =
(b) Explain, as you would to a 4th grader, how you arrived at both your answers. Area  Perimeter
4. As a teacher, how would you explain the concepts of a linear unit and a square unit to a 5th grader? Stress the differences in the concepts. Include a practical example.
5. If each of the dimensions of a 2 x 4 rectangle is tripled, what is the relationship between the original and the enlarged figures?
(a) Your answer?
(b) As a teacher, how would you present the explanation for how you arrived at your answer to a class of 4th or 5th graders?
Fig. A


504 Appendix D (Continued)

NAME __________________________________
Pretest PART II: Knowledge of Student Thinking (KoST)
For problems 6-10 please address the following: Part (a) asks for a short answer, Part (b) asks for you to explain your thinking, and Part (c) asks you to respond from the perspective of a teacher. If there are more than three parts, please address each part thoroughly and separately.
6. Pete, a 5th grader, calculates the area of the rectangle below. He arrives at an answer of 18.
(a) Is Pete correct and complete?
(b) Explain why or why not:
(c) After performing the calculation, Pete comes up to you looking puzzled and asks you about his answer. As a teacher, how would you respond to Pete and his thinking? What specifically would you say and do?
3 cm  6 cm


505 Appendix D (Continued)

7. Kayla, a 5th grade student, was asked to draw all the four-sided dog pen designs that she could make using 18 units of fence for each design. Below are the drawings, on dot paper, that she came up with.
(a) Is Kayla correct? Explain your answer.
(b) Explain what is correct and incorrect regarding Kayla's thinking as evident in her work.
(c) As a teacher, how would you respond to Kayla? What precisely would you say and do?
8. Jasmine claims that whenever you compare two rectangles, the one with the greater perimeter will always have the greater area.
(a) Is she correct?
(b) Explain why you agree or disagree with Jasmine.
(c) As a teacher, how would you respond to Jasmine? What specifically would you say and do?


506 Appendix D (Continued)

9. Justin wants to calculate the perimeter of the shape shown in Figure 1. Justin's method is to shade the squares along the outside of the shape, as shown in Figure 2, and then to count those squares.
Fig. 1  Fig. 2
(a) Is Justin's method correct? What answer will his method produce for the perimeter of Fig. 1, and if necessary, state what the correct answer is.
(b) Explain why or why not.
(c) As a teacher, how would you respond to Justin? What specifically would you say and do?
(Fig. 1)


507 Appendix D (Continued)

10. Mr. Jones purchased 60 feet of fence to enclose his garden. He wanted the garden to have a rectangular shape. He also wanted to have the most space possible for his garden. He drew out several possibilities, which are shown below.
8 ft  Garden 1  Garden 2  10 ft  22 ft  20 ft  Garden 4  5 ft  15 ft  Garden 3  25 ft  Garden 5  2 ft  15 ft  28 ft
For Part (a), place a check next to the one statement below that you agree with; for Part (b), explain your selection for Part (a); and Part (c) is below.
_____ 1. Garden 1 is the biggest garden.   (b) Explanation for Part (a):
_____ 2. Garden 2 is the biggest garden.
_____ 3. Garden 3 is the biggest garden.
_____ 4. Garden 4 is the biggest garden.
_____ 5. Garden 5 is the biggest garden.
_____ 6. The gardens are all the same size.
Part (c): Which incorrect statement do you think would most often be selected by 4th or 5th graders? What might they be thinking? Please explain your choice.


508 APPENDIX E: AREA AND PERIMETER POSTTEST
NAME _______________________________
Classification __________________________
Gender ______________________________
DO NOT WRITE IN THIS AREA  Student Number: _________


509 Appendix E (Continued)

NAME __________________________________
Area and Perimeter Posttest
All explanations and diagrams should be appropriate for elementary-age students. Please do not use a calculator. Be sure your answers include proper units. Write with pencil and please write legibly. Feel free to use the back of any page for comments regarding any questions you found confusing and explain why it confused you.
PART I: Content Knowledge (CK)
For questions 1-5: (1) Answer each question the best that you can. (2) It is very important to use thorough and detailed explanations to fully represent your knowledge.
1. (a) How many triangles, like the one shown below, will it take to completely cover the rectangle?
(b) As a teacher might, clearly explain how you arrived at your answer.
2. Present a real-world story problem, appropriate for 4th or 5th graders, in which they would need to find the area of a specified region. Provide the solution to your problem.
4 cm  3 cm  1 cm  1 cm


510 Appendix E (Continued)

3. If each individual segment is equal to 1 cm, what is the area and perimeter of the shaded figure? 1 cm
(a) Area = ___________  Perimeter = ___________
(b) As a teacher, explain how you arrived at BOTH your answers, and the meaning of those numbers. Area:  Perimeter:
4. As a teacher, how would you explain the concepts of a linear unit and a square unit to a 5th grader? Stress the differences in the concepts. Include a practical example.
5. A certain rectangle has a perimeter of 16 cm.
(a) What might its area be?
(b) Explain how you arrived at your answer.
(c) Are there other correct responses? If so, explain what they are.


511 Appendix E (Continued)

NAME __________________________________
Posttest PART II: Knowledge of Student Thinking (KoST)
For problems 6-10 please address the following: Part (a) asks for a short answer, Part (b) asks for you to explain your thinking, and Part (c) asks you to respond from the perspective of a teacher. If there are more than three parts, please address each part thoroughly and separately.
6. Stacey claims that whenever you compare two rectangles, the one with the smaller perimeter will always have the smaller area.
(a) Is she correct? If you are unsure, are you skeptical or do you tend to believe her? Why?
(b) Explain why you agree or disagree with Stacey.
(c) As a teacher, how would you respond to Stacey? What specifically would you say and do (even if you are unsure about the mathematics involved)?
7. Jose, a fifth grader, was asked to draw a rectangle with a perimeter of 24. Below is his drawing.
(a) Is he correct? Why?
(b) What does Jose's drawing tell you about his knowledge of perimeter?
(c) As a teacher, how would you respond to Jose and his drawing?


512 Appendix E (Continued)

8. A student comes to you and says that he/she was able to draw several different rectangles that, according to the area formula, have an area of 36 in², but the student was a little surprised when the rectangles did not all look the same size.
(a) Is the student's claim mathematically reasonable?
(b) As a teacher might, explain the reasons for your answer to Part (a).
(c) Why do you think the student was surprised by their results? What specifically would you say and do?
9. A student calculates the area of the rectangle shown to be 20 square cm.
(a) Is the student correct? If not, what is the correct answer? How did you figure your answer?
(b) What do you think the student was thinking to arrive at their answer?
(c) As a teacher, what specifically would you say or do to help clear up any possible confusions the student might have?
10. Marcus claims that it is only logical that if two different rectangular figures have the same perimeter they must have the same area.
(a) Is Marcus correct? Why?
(b) What do you think Marcus might have been thinking about in order to make his claim?
(c) As a teacher, how would you respond to Marcus and his thinking?
1 cm


513 APPENDIX F : AREA AND PERIMETER FOLLOW UP TEST NAME _______________________________ Classification __________________________ Gender ______________________________ DO NOT WRITE IN THIS AREA Student Number: _________


514 Appendix F (Continued)

NAME __________________________________
Area and Perimeter Follow-Up Test
All explanations and diagrams should be appropriate for elementary-age students. Please do not use a calculator. Be sure your answers include proper units. Write with pencil and please write legibly. Feel free to use the back of any page for comments regarding any questions you found confusing and explain why it confused you.
PART I: Content Knowledge (CK)
For questions 1-5: (1) Answer each question the best that you can. (2) It is very important to use thorough and detailed explanations to fully represent your knowledge.
1. (a) On the grid below, draw a polygon that has a perimeter of 24 units.
(b) How would you help a 4th grader understand that the polygon you drew really does have a perimeter of 24?
= 1 square unit
2. Present a real-world story problem, appropriate for 4th or 5th graders, in which they would need to find the area of a specified region. Provide the solution to your problem.


515 Appendix F (Continued)

3. What is the area and perimeter of Figure A? 1 cm (All corners are right angles.)
(a) area =   perimeter =
(b) Explain, as you would to a 4th grader, how you arrived at both your answers. Area  Perimeter
4. As a teacher, how would you explain the concepts of a linear unit and a square unit to a 5th grader? Stress the differences in the concepts. Include a practical example.
5. If each of the dimensions of a 2 x 4 rectangle is tripled, what various relationships between the original and the enlarged figures should be discussed with a class of 4th or 5th graders?
(a) Your answer?
(b) As a teacher, how would you present the explanation for how you arrived at your answer to a class of 4th or 5th graders?
Fig. A


516 Appendix F (Continued)

NAME __________________________________
Follow-Up Test PART II: Knowledge of Student Thinking (KoST)
For problems 6-10 please address the following: Part (a) asks for a short answer, Part (b) asks for you to explain your thinking, and Part (c) asks you to respond from the perspective of a teacher. If there are more than three parts, please address each part thoroughly and separately.
6. John, a 4th grader, calculates the area of the rectangle below. He arrives at an answer of 18.
(a) Is John correct and complete?
(b) Explain why or why not:
(c) After performing the calculation, John comes up to you looking puzzled and asks you about his answer. As a teacher, how would you respond to John and his apparent confusion?
3 cm  6 cm


517 Appendix F (Continued)

7. Ariel, a 5th grade student, was asked to draw all the four-sided dog pen designs that she could make using 18 units of fence for each design. Below are the drawings, on dot paper, that she came up with.
(a) Is Ariel correct? Explain your answer.
(b) Explain what is correct and incorrect regarding Ariel's thinking as evident in her work.
(c) As a teacher, how would you respond to Ariel? What precisely would you say and do?
8. Madison claims that whenever you compare two rectangles, the one with the greater perimeter will always have the greater area.
(a) Is she correct? If you are unsure, are you skeptical or do you tend to believe her? Why?
(b) Explain why you agree or disagree with Madison.
(c) As a teacher, how would you respond to Madison? What specifically would you say and do?


518 Appendix F (Continued)

9. Jose wants to calculate the perimeter of the shape shown in Figure 1. Jose's method is to shade the squares along the outside of the shape, as shown in Figure 2, and then to count those squares.
Fig. 1  Fig. 2
(a) Is Jose's method correct? What answer will his method produce for the perimeter of Fig. 1, and if necessary, state what the correct answer is.
(b) Explain why or why not.
(c) As a teacher, how would you respond to Jose? What specifically would you say and do?
(Fig. 1)


519 Appendix F (Continued)

10. Mrs. Smith purchased 60 feet of fence to enclose her flower garden. She wanted the garden to have a rectangular shape. She also wanted to have the most space possible for her garden. She drew out several possibilities, which are shown below.
8 ft  Garden 1  Garden 2  10 ft  22 ft  20 ft  Garden 4  5 ft  15 ft  Garden 3  25 ft  Garden 5  2 ft  15 ft  28 ft
Examine each of Mrs. Smith's drawings of her possible garden designs. For Part (a), place a check next to the one statement below that you agree with; for Part (b), explain your selection for Part (a); and Part (c) is below.
_____ 1. Garden 1 is the biggest garden.   (b) Explanation for Part (a):
_____ 2. Garden 2 is the biggest garden.
_____ 3. Garden 3 is the biggest garden.
_____ 4. Garden 4 is the biggest garden.
_____ 5. Garden 5 is the biggest garden.
_____ 6. The gardens are all the same size.
Part (c): Which incorrect statement do you think would most often be selected by 4th or 5th graders? Please explain your choice. What might they be thinking and why?


520 APPENDIX G: PRELIMINARY RUBRICS FOR SCORING AREA AND PERIMETER TESTS

Scoring Rubric for Content Knowledge (CK) Questions
A score of 0, 1, or 2 / 3 or 4. 0 = no response.
1 = unacceptable: The response is incomplete or contains many errors. Although some of the conditions of the task may have been addressed, an inadequate conclusion and/or faulty reasoning are present. Shows a very limited understanding of the mathematical concepts and procedures embodied by the task.
2 = inferior/mediocre: Although a correct approach, or even a correct solution, is provided, an essential understanding of the mathematical concepts is lacking. Indicates partial understanding of the mathematical concepts and/or procedures embodied in the task. The response contains errors related to misunderstanding important aspects of the task, misuse of the mathematical procedures, or faulty interpretations of results, and may contain some major computation errors.
3 = acceptable: An essentially correct response. Response indicates an essential, nearly complete (but less than thorough) understanding of the mathematical concepts and principles. Uses nearly correct mathematical terminology and notations. Computations are generally correct but may contain minor errors.
4 = complete: A correct response. Response indicates a thorough and well-connected understanding of the mathematical concepts and principles. (The response may contain minor flaws which do not detract from the demonstration of a thorough understanding.) Uses appropriate mathematical terminology and notations. Executes algorithms completely and correctly.


521 Appendix G (Continued)

Scoring Rubric for Knowledge of Student Thinking (KoST) Questions
A score of 0, 1, or 2 / 3 or 4. Note. KoST includes using explanations focusing on building conceptual understanding. 0 = no response.
1 = unacceptable: Response provides no, or incorrect, insight into the student's thinking; provides no, or incorrect, diagnosis of student error(s). Offers no, or incorrect, examples, explanations, or representations that could serve as constructive feedback. Shows no understanding of the mathematical concepts and principles.
2 = inferior/mediocre: Response provides limited insight into the student's thinking, OR provides limited diagnosis of student error(s) when present. Offers incomplete or partially incorrect examples, explanations, OR representations that provide constructive feedback. Response contains errors related to misunderstandings of important mathematical concepts.
3 = acceptable: Response provides adequate insight into the student's thinking, OR provides adequate diagnosis of student error(s) when present. Offers appropriate examples, explanations, OR representations that provide constructive feedback. Response is mathematically sound. It may contain minor computation errors but no conceptual ones.
4 = complete: Response provides thorough insight into the student's thinking, AND provides complete diagnosis of student error(s) when present. Offers clear and complete examples, explanations, AND representations (when appropriate) that provide constructive feedback. Response is mathematically correct and contains no computational or conceptual errors.


522 APPENDIX H: AMENDED RUBRICS FOR SCORING AREA AND PERIMETER TESTS

Scoring Rubric for Content Knowledge (CK) Questions
A score of 0, 1, or 2 / 3 or 4. *See separate table for pertinent concepts and misconceptions. Assign a score of 0 when no meaningful response is provided.
1 = unacceptable: The response is incomplete and contains many errors. Although some of the conditions of the task have been addressed, an inadequate conclusion and/or faulty reasoning are present. Very little, if any, conceptual understanding is evident. No concepts* or misconceptions* are adequately addressed. Uses completely incorrect terminology and notations. Provides incorrect and misleading procedures and computations. Includes incorrect and misleading diagrams or pictures.
2 = inferior/mediocre: A partly correct approach or partly completed solution. Indicates a partial understanding of mathematical concepts and/or misconceptions. Mathematical terminology and notations reveal some misunderstandings. Misuses some procedures and contains some major computation errors. Fails to include, make reference to, or acknowledge the value of an appropriate picture or diagram, or it is only minimally helpful. A good explanation following a wrong response or a misunderstanding of the question, or a right response followed by a non-conceptual explanation.
3 = acceptable: Responds essentially correctly to each part. Indicates an essential, nearly complete (but less than thorough) conceptual understanding of the concept(s) and/or underlying misconception(s). May not contain conceptual errors or any incorrect extraneous information. Uses nearly correct mathematical terminology, notations, and explanations (may omit units). Computations are generally correct but may contain minor errors. Includes a helpful picture or diagram but it contains minor errors, OR includes a statement/picture where a picture/explanation would have been more helpful.
4 = model response: Correctly responds to each part of the question in a well-articulated manner. Comprehends the problem's inherent mathematical concept(s) and underlying misconception(s). A plausible response that may contain minor flaws which do not detract from the demonstration of a thorough and conceptual understanding. May not contain conceptual errors or any incorrect extraneous information. Uses precise and complete mathematical terminology, computations, and notations (MUST include appropriate units). States and executes algorithms completely and correctly. When necessary, includes diagrams or pictures that support and help to interpret, understand, and conceptualize the response.


523 Appendix H (Continued)

Scoring Rubric for Knowledge of Student Thinking (KoST) Questions
A score of 0, 1, or 2 / 3 or 4. Knowledge of Student Thinking includes an explanation focusing on building conceptual understanding of mathematical content. *See separate table for concepts and misconceptions. Assign a score of 0 when no meaningful response is provided.
1 = unacceptable: A partly correct response that provides no or incorrect insight into or diagnosis of the student's thinking, and/or fails to address a major concept* or misconception.* A correct yes/no response followed by no, or incorrect, examples, explanations, or representations that could serve as constructive feedback. Shows no clear understanding of the concepts or the appropriate notation. Includes incorrect and misleading diagrams, pictures, or explanation.
2 = inferior/mediocre: A partly correct, procedurally based response that provides limited insight into or diagnosis of student thinking, OR addresses a major concept or misconception. Offers incomplete, vague, partially correct, or confusing examples, explanations, OR representations in an attempt to provide constructive feedback. Contains errors related to misconceptions, notation, or the question itself. Fails to include, make reference to, or acknowledge either the value of an appropriate picture or diagram, or the major concept behind the question.
3 = acceptable: A mostly correct and conceptually based response that provides adequate insight and diagnosis of student error(s) when present, AND addresses a major concept or misconception. Appropriate examples, explanations, OR representations (1 of 3) that provide constructive or facilitative feedback. Is mathematically sound. It may contain minor computational or notational errors but no conceptual ones. Includes a helpful picture or diagram but it may contain minor errors, OR response is sufficient but a picture or diagram would have been more helpful.
4 = model response: A completely correct and well-articulated response that provides thorough insight into the student's thinking, complete diagnosis of student error(s) when present, AND addresses a major concept and a misconception when both are present. Offers clear, complete, and plausible examples, explanations, AND representations (2 of 3) that provide constructive or facilitative feedback. Is mathematically correct and contains no computational, conceptual, or notational errors or omissions. Includes diagrams or pictures that support and help conceptualize the response when necessary.


524 APPENDIX I: SUPPLEMENTAL GRADING SHEETS

Explanation of usage: As a result of piloting the scoring rubrics, supplemental grading sheets were created to assist in scoring items (especially the knowledge of student thinking questions) from the pre, post, and follow-up tests. The tables that follow summarize the major concepts and misconceptions that each item contains. They are not meant to be stand-alone scoring tools. When a scorer was unsure of which score to award to a certain item or teetering between scores, the supplemental grading sheets (SGS) proved very helpful in deciphering the most appropriate score.
The following criteria, which appear in parts of the Content Knowledge and Knowledge of Student Thinking rubrics, have been combined here for the sake of simplifying the explanation of how each is applied:
(a) If a response fails to address either a major concept or a misconception listed in the SGS for that item, then a 1 is the highest score that item can receive.
(b) If a response indicates partial or limited understanding of the mathematical content or the student's thinking, or addresses either a major concept or misconception from the SGS, then that item could at most receive a score of 2.
(c) If a response indicates adequate or nearly complete understanding of the mathematical content or the student's thinking, and addresses a major concept or misconception from the SGS, then that item could at most receive a score of 3.
(d) If a well-articulated response indicates complete understanding of the mathematical content and the thinking of the student presented in the problem, and addresses both a major concept and misconception from the SGS, then that item is considered a model response and can be awarded the highest score possible of 4.
These criteria represent part, albeit an important part, of the rubric used in the scoring process.


525 Appendix I (Continued)

Supplemental Grading Sheet for Pretest 1
1 = Follow-up test is exactly the same as the pretest.
Item 1. Major concept(s): 1. Polygon has a perimeter of 24. 2. Appropriate explanation. Potential major misconception(s): 1. Perimeter (P) versus area (A). 2. Linear units versus square units.
Item 2. Major concept(s): 1. Plausible context that requires finding area for the stated question. Potential major misconception(s): 1. Addresses perimeter or volume instead of area.
Item 3. Major concept(s): 1. Correct A & P with correct units. 2. Conceptual explanation. Potential major misconception(s): 1. Applying A & P formulas for a rectangle to an irregular polygon.
Item 4. Major concept(s): 1. Conceptual differences between linear and square units (not A & P). 2. Good practical examples of each. Potential major misconception(s): 1. Linear units are two-dimensional and square units are 3-D. 2. Confusing which unit is used for which.
Item 5. Major concept(s): 1. Tripling both dimensions. 2. Conceptually representing or explaining area increasing 9 times. Potential major misconception(s): 1. The area will only triple (does each side look 3 times larger?).
Item 6. Major concept(s): 1. Area requires square units (e.g., sq cm). 2. The 18 represents how many sq cm are needed to cover the rectangle. Potential major misconception(s): 1. The rectangle also has a perimeter of 18 (cm). 2. The meaning of the 18 square cm.
Item 7. Major concept(s): 1. Using fence to build pens implies a perimeter measure. 2. There are also a 2x7 and a 1x8 dog pen possible. Potential major misconception(s): 1. Understanding of a linear unit. 2. Counting dots = finding perimeter. 3. A 3x6 rectangle results in the number 18 for area AND perimeter.
Item 8. Major concept(s): 1. Perimeter can be increased by increasing one dimension and decreasing the other. 2. Provide an appropriate counterexample. Potential major misconception(s): 1. Increasing the perimeter of a rectangle will always increase the area (i.e., a direct relationship exists). 2. Not realizing that increasing perimeter CAN increase area.
Item 9. Major concept(s): 1. Discuss correct method for finding perimeter. 2. Distinguishing between linear and square units (using square units for perimeter is incorrect). Potential major misconception(s): 1. Counting squares to figure perimeter is a correct procedure. 2. Must have a formula to calculate perimeter.
Item 10. Major concept(s): 1. Squares ARE rectangles (units not needed for scratch work). Potential major misconception(s): 1. Same perimeters will have same areas. 2. Basing greatest area on appearance.
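A brief worked sketch of the mathematics targeted by pretest items 5 and 8 may help orient a scorer; the rectangles used in the item 8 counterexample are illustrative choices and are not figures taken from the instrument itself.

\[ A_{\text{original}} = 2 \times 4 = 8 \text{ square units}, \qquad A_{\text{tripled}} = 6 \times 12 = 72 = 9 \times 8 \]
\[ P_{\text{original}} = 2(2 + 4) = 12 \text{ units}, \qquad P_{\text{tripled}} = 2(6 + 12) = 36 = 3 \times 12 \]
\[ \text{Item 8 counterexample: a } 1 \times 20 \text{ rectangle has } P = 42,\ A = 20; \text{ a } 5 \times 6 \text{ rectangle has } P = 22,\ A = 30. \]

That is, tripling both dimensions multiplies the area by nine but the perimeter only by three, and a rectangle with the greater perimeter can still have the smaller area.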


526 Appendix I (Continued)

Supplemental Grading Sheet for Posttest
Item 1. Major concept(s): 1. Area relationship between figures. Potential major misconception(s): 1. Confusing area and perimeter. 2. Confusing linear and square units.
Item 2. Major concept(s): 1. Plausible context that requires finding area for the stated question. Potential major misconception(s): 1. Addresses perimeter or volume instead of area.
Item 3. Major concept(s): 1. Correct A & P with correct units. 2. Conceptual explanation. Potential major misconception(s): 1. Applying A & P formulas for rectangles and/or squares.
Item 4. Major concept(s): 1. Conceptual differences between linear and square units (not A & P). 2. Good practical examples of each. Potential major misconception(s): 1. Linear units are two-dimensional and square units are 3-D.
Item 5. Major concept(s): 1. Figuring area from perimeter. 2. Infinite possible answers (including a 4 x 4 square). Potential major misconception(s): 1. Fixed perimeter implies fixed area.
Item 6. Major concept(s): 1. Perimeter can be increased by increasing one dimension and decreasing the other. 2. Provide an appropriate counterexample. Potential major misconception(s): 1. Decreasing the perimeter of a rectangle will always decrease the area (i.e., a direct relationship exists). 2. Not realizing that decreasing perimeter CAN decrease area.
Item 7. Major concept(s): 1. Linear units for perimeter. 2. Rectangle shown has a perimeter of 28. Potential major misconception(s): 1. Using square units to represent perimeter. 2. Confusing area with perimeter.
Item 8. Major concept(s): 1. Several factors of 36 produce the same area. 2. Equal areas may have different perimeters. Potential major misconception(s): 1. Expecting all rectangles with the same area to have the same perimeter. 2. Figures with equal areas will all look the same (i.e., be the same size).
Item 9. Major concept(s): 1. Conceptually represent the area. Potential major misconception(s): 1. Confusing area with perimeter. 2. Confusing linear and square units.
Item 10. Major concept(s): 1. Comparing 2 different rectangles with the same perimeter. 2. Value of a counterexample. Potential major misconception(s): 1. Figures with the same area will have the same perimeter, and vice versa.
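A similar sketch, again with illustrative dimensions chosen only for this note, summarizes the mathematics behind posttest items 5 and 6: a fixed perimeter of 16 cm does not fix the area, and a smaller perimeter does not force a smaller area.

\[ 1 \times 7 \Rightarrow A = 7\ \text{cm}^2, \quad 2 \times 6 \Rightarrow A = 12\ \text{cm}^2, \quad 3 \times 5 \Rightarrow A = 15\ \text{cm}^2, \quad 4 \times 4 \Rightarrow A = 16\ \text{cm}^2 \]
\[ \text{Item 6 counterexample: a } 4 \times 5 \text{ rectangle has } P = 18,\ A = 20; \text{ a } 1 \times 12 \text{ rectangle has } P = 26,\ A = 12. \]

Each rectangle in the first line has a perimeter of 16 cm yet a different area; in the second line, the rectangle with the smaller perimeter has the larger area.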


527 APPENDIX J: SAMPLES OF TEST ITEMS FROM PILOTING TO ILLUSTRATE SCORING

All samples involve the same Knowledge of Student Thinking type question to illustrate what elements of a response result in different scores.
The following response earned a score of 1 based on the Knowledge of Student Thinking rubric. It appears the preservice teacher was using dots to possibly count square units, but because nothing was said regarding that, no credit could be awarded.


528 Appendix J (Continued)

The following response earned a score of 2 because although part (a) is correct, parts (b) and (c) are only partly correct. The response to part (b) failed to acknowledge that the student most likely came up with an answer of 20 because they were calculating perimeter rather than area. The response to part (c) is procedural in nature (i.e., focuses on using a formula), in contrast to a conceptual approach, which would encourage the counting of the square units to find area.


529 Appendix J (Continued)

The following response earned a score of 3. Part (a) is correct, and in part (b) the preservice teacher correctly identified that the student was calculating perimeter as opposed to area (even though they did not specifically write that). The answer to part (c), however, offers a procedural explanation (i.e., using the L x W formula), which is not best suited or the most meaningful for a student exhibiting misunderstandings. A conceptual approach would involve drawing in the 3x7 grid and revealing the 21 square centimeters.


530 Appendix J (Continued)

The following response earned a score of 4. Part (a) is correct, and part (b) correctly states that the student most likely confused area with perimeter. Part (c) was also well done. The response mentioned the importance of connecting the concept of square units with finding area and also drew in the square units in the rectangle. The response also made a point to differentiate what it means to find area from what it means to find perimeter.
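Assuming the piloted item used the 3 cm by 7 cm rectangle described in the score-3 discussion above, the arithmetic separating the two measures is:

\[ A = 3\ \text{cm} \times 7\ \text{cm} = 21\ \text{cm}^2, \qquad P = 2(3\ \text{cm} + 7\ \text{cm}) = 20\ \text{cm} \]

The student answer of 20 therefore matches the perimeter calculation, which is why the stronger responses diagnosed a perimeter-for-area confusion.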


531 APPENDIX K: LEARNING PACKETS FOR TEACHING EPISODES

Note: The spacing for teaching episode #1 will be very similar to the one used for the study; however, teaching episodes 2 and 3 will be condensed to save space.

Teaching Episode #1: Units of Measure
While involved with the teaching of elementary mathematics, you will also be continually learning about mathematics: about the subtle notions underlying the structure and concepts as well as what students find difficult about learning and doing mathematics. So in reality, a teacher is also a student. When, however, you do assume the role of classroom teacher, you will often be faced with situations in which students produce responses or ask questions that will stretch the limits of your knowledge and understanding of elementary mathematics and how to help students understand it. Today you will encounter one such situation.
Taking the time to reflect upon (i.e., ponder or think about) your knowledge and how it impacts your instructional decisions is a necessary and vital part of becoming an effective teacher; therefore, throughout this learning experience you will be asked to pause and reflect upon your current understanding of the problem, questions you are working through, possible misconceptions you have encountered and how you resolved them, and the resulting changes in your knowledge of the concepts at hand. Such activity is vital to developing and maturing into an insightful, responsive, and effective communicator of elementary mathematics.


532 Appendix K (Continued)

Name: ______________________________________
Teaching Episode #1
The Setting: You are a fifth grade teacher, and you have just begun a review of basic area and perimeter concepts that your students had explored in fourth grade. You present your students with what you believe will be a rather easy task: finding the perimeter of the shape shown in Figure 1.
Figure 1
The Situation: One of your students, Justin, shows you his method, which is to shade the squares along the outside of the shape, as shown in Figure 2, and then to count those shaded squares.
1. What was the first thing you did after reading through this situation? Why did you do that?
Figure 2
2.
3.


533 Appendix K (Continued)

(Students asked to find perimeter)
4. Please take a moment and write down your initial thoughts regarding the problem to this point.
5. As a teacher, how would you respond to Justin? What specifically would you say and do?
Time to Reflect:


534 Appendix K (Continued)

Name: ______________________________________ (Day 1 cont.)
Your Investigative Tool: (Researcher will read what appears in quotes.)
"Being aware of and willing to use various manipulative tools to enhance the teaching and learning of elementary mathematical concepts is a trait of successful teachers. Such tools can be instrumental in deepening your own personal understanding of the mathematical concepts you must teach. Some of these tools can be found on the Internet in the form of Java applets called microworlds. They are interactive and designed to help you visualize mathematical ideas. For this learning experience, you will have access to such an applet which has been specially designed for exploring area and perimeter."
Use the microworld to explore patterns, test your hypotheses, and generate helpful representations for your solutions and your explanations. Include appropriate sketches of your microworld designs to help illustrate and explain your thinking.
Please begin by following these directions:
1. Open the Internet, and enter the website for EDU 316.
3. Click on the link for the microworld.
I would like you to thoughtfully consider your previous responses to questions 1-5. As you do so, imagine you have the ability to use and display the Internet applet when personally thinking about this problem, when working individually with a student, and when addressing the entire class. After exploring and investigating with the microworld, answer the questions below:
6. What, if anything, would you add or revise from your responses to questions 1-4?


535 Appendix K (Continued)

7. What, if anything, would you add or revise to your response to question 5? Please document (in the Time to Reflect sections) specific questions and ideas (including false starts) you have thought about and explored with the microworld. Share details regarding how you decided what to say and show to Justin, including specific examples to represent how and what you would communicate. For example, you could include statements such as "After using the applet I came to realize that my understanding concerning..."; describe how the applet may have influenced your new understanding, and include specific drawings of applet designs (or discuss specific features of the applet) that helped you.
Please take a moment and address questions 8-12:
#8 What do you think students might find difficult about finding the perimeter of the shape shown in Figure 1? What could confuse them?
Time to Reflect:


536 Appendix K (Continued)

#9 In what ways, if any, did interacting with the microworld help you better understand the mathematics involved in this problem? What did you do and how did it help?
#10 Given Justin's misunderstanding regarding units of measure, how would you follow up with the entire class about the concepts that surround this classroom episode? Remember, share specific examples and representations (possibly from the microworld) just as you would in the classroom, as well as why you choose what to say and do.
#11 Do you think elementary students could benefit from personally interacting with the microworld? In what ways? (If you think no, please see #12.)
#12 If you answered no to #11, please share why, and then tell what instructional tool(s) and/or strategies you feel would be more appropriate for the concepts investigated today.


537 Appendix K (Continued)

Name: ______________________________________ (Day 2)
You will now be asked to get into cooperative groups.
13. a. b. Take the next several minutes and have each group member share how they arrived at their solutions for questions 1-5 (pp. 2-3) as well as the two questions on page 4 pertaining to Shape Builder. As each member shares, the other members should compare what they are hearing with their personal responses. Make notes under the "Shared Knowledge" header to include ideas, insights, and instructional strategies that were not part of, or are extensions of, your responses. Indicate from whom you gained the new ideas and how these ideas have influenced your thinking.
14. What new knowledge did you gain regarding questions 1-5 (pp. 2-3)?
Shared Knowledge
Time to Work Together:


538 Appendix K (Continued)

15. What new knowledge did you gain regarding the two Shape Builder questions (questions 6 & 7)?
16. What new knowledge did you gain from your group regarding questions 8 & 10?
Shared Knowledge cont.


539 Appendix K (Continued)

17. After the group sharing is done, your instructor will conclude with a brief summary. Again, in the space provided below write down anything presented that added to:
(a) Your understanding of the concepts,
(b) Your knowledge of student thinking and the specific difficulties they can have with area and perimeter, and
(c) Your knowledge of potential teaching strategies to help address student thinking related to these concepts.
Grand Discussion


540 Appendix K (Continued)

Teaching Episode #2: Fixed Area & Perimeter
INTRODUCTION: While involved with the teaching of elementary mathematics, you will also be continually learning about mathematics: about the subtle notions underlying the structure and concepts as well as what students find difficult about learning and doing mathematics. So in reality, a teacher is also a student. When, however, you do assume the role of classroom teacher, you will often be faced with situations in which students produce responses or ask questions that will stretch the limits of your knowledge and understanding of elementary mathematics and how to help students understand it. Today you will encounter one such situation.
Taking the time to reflect upon (i.e., ponder or think about) your knowledge and how it impacts your instructional decisions is a necessary and vital part of becoming an effective teacher; therefore, throughout this learning experience you will be asked to pause and reflect upon your current understanding of the problem, questions you are working through, possible misconceptions you have encountered and how you resolved them, and the resulting changes in your knowledge of the concepts at hand. Such activity is vital to developing and maturing into an insightful, responsive, and effective communicator of elementary mathematics.


541 Appendix K (Continued)

Name: ______________________________________ (Day 1)
Teaching Episode #2
The Setting: (adopted from Bassarear, 2005, p. 677) Your 5th grade class is studying area, and you challenge them to find the area of one of their footprints. You instruct your students to stand on a piece of paper and trace their shoe, and then individually brainstorm a strategy to find the area of the footprint.
The Situation: After several minutes one of your students, Tommy, comes up to you and explains his method. He says he would lay a piece of string around the outside of the paper footprint, cut the string to the precise length, form the piece of string into a rectangle, use a ruler to measure the length and width of the rectangle, then find the area of the rectangle. In other words, he believes that the area of the rectangle will be the same as the area of his footprint. [Each participant will be provided with a copy of a footprint drawn on square-grid paper and a piece of string.]
1. What was the first thing you did after reading through this situation? Why did you do that?
2. Do you think Tommy's method will work?
3. If no, why not?
4. Explain your thinking regarding Tommy's method.


542 Appendix K (Continued)

5. … covers? Try to describe a second way to find the area of the footprint.
(page break)
Please take a moment and write down your current thoughts regarding the mathematics surrounding this problem. Have your knowledge and/or understandings changed from when you began working on this problem? If so, please share these changes.
6. As a teacher, how would you respond to Tommy? What specifically would you say and do?
(page break)
Name: ______________________________________ (Day 1 cont.)
Your Investigative Tool: (Researcher will read what appears in quotes.)
"Being aware of and willing to use various manipulative tools to enhance the teaching and learning of elementary mathematical concepts is a trait of successful teachers. Such tools can be instrumental in deepening your own personal understanding of the mathematical concepts you must teach. Some of these tools can be found on the Internet in the form of Java applets called microworlds. They are interactive and designed to help you visualize mathematical ideas. For this learning experience, you will have access to such an applet which has been specially designed for exploring area and perimeter."
Use the microworld to explore patterns, test your hypotheses, and generate helpful representations for your solutions and your explanations. Include appropriate sketches of your microworld designs to help illustrate and explain your thinking.
Time to Reflect:


543 Appendix K (Continued)

Please begin by following these directions:
1. Open the Internet, and enter the website for EDU 316.
2. Under
3. Open both microworlds. You will be provided with login information.
Part of becoming a professional educator is becoming proficient at selecting the most appropriate instructional tool(s) for a specific learning outcome. With that in mind, please access either microworld and thoughtfully consider your previous responses to questions 1-6. As you do so, imagine you have the ability to use and display the microworlds while personally thinking about this problem, while working individually with a student, and when addressing the entire class. After exploring and investigating with the microworlds, answer the questions below.
7. What, if anything, would you add or revise from your responses to questions 1-5?
8. What, if anything, would you add or revise from your response to question 6?
(page break)
Now I would like you (functioning as both a learner of mathematics as well as a teacher) to thoughtfully answer questions 9-14.
#9 What mathematical concepts are involved with finding the area of a footprint?
#10 What do you think students might find difficult about finding the area of their footprint? What specifically could confuse them?
(page break)
#11 In what ways, if any, has interacting with the microworlds influenced your thoughts related to this problem? How has your thinking changed up to this point, both your personal understandings regarding the concepts in this problem and your thinking as a teacher? Remember, please be specific and provide examples; be sure and specify what microworld you are referring to.
(page break)
#12 Given Tommy's incomplete understanding regarding the perceived direct relationship between perimeter and area, how would you follow up with the entire class about the concepts that surround this classroom episode? Share specific examples and representations (possibly from a microworld) just as you would in the classroom. Be sure and tell why you choose what to say and do.


544 Appendix K (Continued)

#13 Do you think elementary students could benefit from personally interacting with the microworlds? If yes, in what ways? (If no, please see #14.)
#14 If you answered no to #13, please share why, and then tell what instructional tool(s) and/or strategies you feel would be more appropriate for the concepts investigated today.
(page break)
Name: ______________________________________ (Day 2)
You will now be asked to get into cooperative groups.
15. a. b. Take the next several minutes and have each group member share how they arrived at their solutions for the questions stated in problems 16-18. As each member shares, the other members should compare what they are hearing with their personal responses. Make notes under the "Shared Knowledge" header to include ideas, insights, and instructional strategies that were not part of, or are extensions of, your personal responses. Indicate from whom you gained the new ideas and how these ideas have influenced your thinking.
16. What new mathematical knowledge did you gain regarding questions 1-5 on pp. 1-3 and #9 on p. 6?
(page break)
17. What new knowledge did you gain regarding the use of the TWO microworlds (see questions 7, 8, & 11 on pp. 5 & 7)? Be sure and specify what microworld you are referring to (Shape Builder or the ExploreLearning Gizmo).
Shared Knowledge
Time to Work Together:


545 Appendix K (Continued)

18. What new knowledge did you gain from your group regarding student thinking (see questions 10 & 13) and instructional practices (see questions 6 & 12)?
(page break)
19. After the group sharing is done, your instructor will conclude with a brief summary. Again, in the space provided below write down anything presented that added to:
(a) Your understanding of the mathematical concepts involved in the teaching scenario,
(b) Your knowledge of student thinking and the specific difficulties they can have with area and perimeter, and
(page break)
(c) Your knowledge of potential teaching strategies to help address student thinking related to these concepts. Please be specific.
Concluding Question: Did you access either microworld outside of class? If yes, why?
Grand Discussion


546 Appendix K (Continued)

Teaching Episode #3: A Direct Relationship?
While involved with the teaching of elementary mathematics, you will also be continually learning about mathematics: about the subtle notions underlying the structure and concepts as well as what students find difficult about learning and doing mathematics. So in reality, a teacher is also a student. When, however, you do assume the role of classroom teacher, you will often be faced with situations in which students produce responses or ask questions that will stretch the limits of your knowledge and understanding of elementary mathematics and how to help students understand it. Today you will encounter one such situation.
Taking the time to reflect upon (i.e., ponder or think about) your knowledge and how it impacts your instructional decisions is a necessary and vital part of becoming an effective teacher; therefore, throughout this learning experience you will be asked to pause and reflect upon your current understanding of the problem, questions you are working through, possible misconceptions you have encountered and how you resolved them, and the resulting changes in your knowledge of the concepts at hand. Such activity is vital to developing and maturing into an insightful, responsive, and effective communicator of elementary mathematics.
Your Investigative Tools: Being aware of and willing to use various manipulative tools to enhance the teaching and learning of elementary mathematical concepts is a trait of successful teachers. Such tools can be instrumental in deepening your own personal understanding of the mathematical concepts you must teach. Some of these tools can be found on the Internet in the form of Java applets called microworlds. They are interactive and specially designed to help you visualize the concepts explored in this learning experience. For this teaching episode you may use either microworld from the outset to explore patterns, test your hypotheses, and generate helpful representations for your solutions and your explanations. Include appropriate sketches of your microworld designs to help illustrate and explain your thinking.
Please begin by following these directions:
1. Open the Internet, and enter the website for EDU 316.
3. Click on and open both microworlds. You may use them from the beginning.


547 Appendix K (Continued)

Name: __________________________________
Teaching Episode #3
The Setting: You have just completed the last scheduled unit on area and perimeter with your 5th grade class. You feel they understand the concepts pretty well. While the students are working, Jasmine comes up to you very excited.
The Situation: Jasmine then tells you that she has figured out something she wants to tell the class about. She explains that she has discovered that whenever you compare two rectangles, the one with the greater perimeter will always have the greater area. She shows you this picture as proof of what she is saying:
4 in.  8 in.
perimeter = 16 in.  perimeter = 24 in.
area = 16 square in.  area = 32 square in.
1. What was the first thing you did after reading through this situation? Why did you do that?
2.
4 in.  4 in.


548 Appendix K (Continued)

3. Do you think Jasmine is correct? If no, why not? If you are unsure, are you skeptical or do you tend to believe it? Why?
4. Explain your thinking regarding Jasmine's theory.
(page break)
Please take a moment and write down your current thoughts regarding the mathematics surrounding this problem. Have your knowledge and/or understandings changed from when you began working on this problem? If so, please share these changes.
5. As a teacher, how would you respond to Jasmine? What specifically would you say and do (even if you are unsure about the mathematics involved)?
(page break)
Now, if you have not already done so, please access either, or both, of the microworlds available to you. I would like you to thoughtfully consider your previous responses to questions 1-5. As you do so, imagine you have the ability to use and display the microworlds while personally thinking about this problem, while working individually with a student, and when addressing the entire class. Include appropriate sketches of your microworld designs to help illustrate and explain your thinking. After exploring and investigating with the microworlds, answer questions 6 & 7.
6. What, if anything, would you add or revise from your responses to questions 1-4?
7. What, if anything, would you add or revise from your response to question 5?
(page break)
Time to Reflect:


549 Appendix K (Continued)

Now I would like you to thoughtfully answer questions 8-12.
#8 Do you think many students may have the same incomplete understanding as Jasmine? If so, what do you think might be the cause? When answering, consider the examples Jasmine used.
#9 In what ways, if any, did interacting with the microworlds help you better understand the mathematics in this problem? What did you do and how did it help? Remember, please be specific and provide examples. Be sure and share which microworld (and what features) helped with what ideas or concepts.
(page break)
#10 Given Jasmine's misunderstanding regarding the perceived direct relationship between perimeter and area, how would you follow up with the entire class about the concepts that surround this classroom episode? Remember, share specific examples and representations (possibly from a microworld) just as you would in the classroom. Be sure and tell me why you choose what to say and do.
#11 Do you think elementary students could benefit from personally interacting with the microworlds? If yes, in what ways? (If no, please see #12.)
#12 If you answered no to #11, please share why, and then tell what instructional tool(s) and/or strategies you feel would be more appropriate for the concepts investigated today.
(page break)


550 Appendix K (Continued)

Name: ______________________________________ (Day 2)
Brief A & P Review:
1. How do you find the perimeter of a rectangle?
2. How do you find the area of a rectangle?
You will now be asked to get into cooperative groups.
13. a. b. Take the next several minutes and have each group member share how they arrived at their solutions for the questions stated in problems 14-18. As each member shares, the other members should compare what they are hearing with their personal responses. Make notes under the "Shared Knowledge" header to include ideas, insights, and instructional strategies that were not part of, or are extensions of, your personal responses. Indicate from whom you gained the new ideas and how these ideas have influenced your thinking.
14. What new mathematical knowledge did YOU gain regarding questions 1-4 on pp. 1 & 2?
(page break)
15. What new knowledge did YOU gain regarding the use of the TWO microworlds (see questions 6, 7, & 9 on pp. 4 & 5)? Be sure and specify what microworld you are referring to (Shape Builder or the ExploreLearning Gizmo).
(page break)
Shared Knowledge
Time to Work Together:


551 Appendix K (Continued)

16. What new knowledge did YOU gain from your group regarding: (a) student thinking (see questions 8 & 11) and (b) instructional practices (see questions 5 & 10)?
(page break)
17. As a result of hearing the ideas of your group members, what is YOUR current understanding of the relationship between the perimeter and the area of a rectangle?
(page break)
18. After the group sharing is done, your instructor will conclude with a brief summary. Again, in the space provided below write down anything presented that added to:
(a) Your understanding of the mathematical concepts involved in the teaching scenario,
(b) Your knowledge of student thinking and the specific difficulties they can have with area and perimeter, and
(c) Your knowledge of potential teaching strategies to help address student thinking related to these concepts. Please be specific.
Concluding Question: Did you access either applet outside of class? If yes, why?
Grand Discussion


552 APPENDIX L: SECOND OBSERVER PROTOCOL

During the Teaching Episode please record your observations of instructional activities as well as the activities of the preservice teachers. Please make special note of activity that reflects the preservice teachers' content knowledge of area and perimeter as well as their pedagogical content knowledge (specifically, knowledge of student thinking). Please pay careful attention as to: (a) how the preservice teachers go about making sense of the teaching scenario, (b) how they make use of the applet while problem solving, and (c) how they interact cognitively with their peers.
Indicate behavior as focusing on:
1. Content knowledge regarding area & perimeter (CK), or
2. Knowledge of student thinking (PCK)
Personal Insights & Interpretations


553 APPENDIX M: MICROWORLDS ORIENTATION SESSION

MICROWORLDS ORIENTATION SESSION
NAME: __________________________________________
Open the ShapeBuilder microworld and follow the instructor while you are guided on an exploration of its features. Use the microworld to help answer question #1. Please document what features you used and which ones helped you in solving the problem.
1. Add, by shading, at least one square to the grey figure below so that your new figure also has a perimeter of 14 units. (More than one answer is possible.)
= 1 square unit
Summary of microworld usage:


554 Appendix M (Continued)

Now open the area & perimeter microworld from ExploreLearning. Once again, please follow the instructor as you are guided through the many features of this applet. Use the applet to answer question #2. Please document what features you used and which ones helped you in solving the problem.
2. What is the area of the shaded region? (Each measure is in inches.) Explain how you arrived at your answer.
10  8  6  8
Summary of applet usage:


555 APPENDIX N: PURPOSELY SELECTED TASKS FOR FINAL INTERVIEW

Name ____________________________________
Task 1
Figure 1  Figure 2
Examine Figures 1 & 2. Assuming Figures 1 & 2 are congruent squares, what relationships do you notice between the rectangles 1, 2, & 3 and the triangles 1, 2, & 3?
Task 2
Given the fact that shape A and shape B have the same length and width, which shape will have the greater perimeter? Why?
1 2 3  2 1 3


556 APPENDIX O: Anchored Instruction Assessment Survey

What follows is a checklist designed to elicit from you (the expert reviewer) the degree to which you: (1) agree with the definition of anchored instruction (as operationally defined by me, the researcher), (2) are able to identify the elements (design principles) of anchored instruction in my materials, and (3) anticipate that my materials and procedures will cause anchored instruction to happen for my participants. Below each section of the survey you will have the opportunity to provide qualitative input regarding your selections. For example, answering questions such as: Why? Why not? How might it be improved? Your suggestions for improving my materials and procedures are welcomed and appreciated.

Section 1 ~ My Definition of Anchored Instruction
Text-based Teaching Episodes (i.e., a series of three, spanning 6 class periods) will present authentic, problem-solving scenarios anchored around common difficulties and misconceptions elementary students (and teachers alike) have regarding area and perimeter. The Teaching Episodes will be enhanced and supported by two geometry microworlds whose features should promote sustained exploration of each classroom-based scenario from multiple points of view.


Appendix O (Continued)

Please check the appropriate box that best describes the degree to which you agree with the above definition of Anchored Instruction.

                                                Strongly Agree    Agree    Disagree    Strongly Disagree
My definition of Anchored Instruction

If you Disagree or Strongly Disagree with my definition, please share why.


Appendix O (Continued)

According to its designers, the anchor for anchored instruction should:

1. be a macro-contextual, video-based anchor; capable of random accessibility, videodiscs were chosen by the CTGV (e.g., The Sherlock Project & the Jasper Projects) (Bransford et al., 1989; CTGV, 1990, 1992b, 1992c, 1993). This was not meant to imply that the anchors in anchored instruction must always be based on video; case-based approaches have also been used (Bransford, 1990b, p. 398), and such approaches have met with great success in business schools. The CTGV felt, however, that video would provide richer sources of information better suited for school students.

2. develop within a narrative format.

3. promote broad transfer (i.e., an explicit emphasis on analyzing similarities and differences among problem situations, and on bridging to new areas of application, facilitates the degree to which spontaneous transfer occurs) (CTGV, 1992).

4. help students notice the features of problem situations that make particular actions relevant. In order to appropriately conditionalize their knowledge, the anchors for instruction must help students focus on the relevant features of the problems they are trying to solve (Bransford et al., 1990a).

5. allow participants to experience the kinds of problems and opportunities that experts in various areas encounter (e.g., classroom teachers interacting with a student who has a misconception related to material being taught) (Goldman et al., 1996).

6. involve complex situations that require students to formulate and solve a set of interconnected subproblems (Bransford, Sherwood, & Hasselbring, 1988).

Section 2 ~ The Anchor


Appendix O (Continued)

In this study, the anchor for instruction will be the Teaching Episodes (i.e., a series of three, spanning 6 class periods), which address common difficulties and misconceptions elementary students (and teachers alike) have regarding area and perimeter. (Please refer to the documents included in your packet.)

Please check the appropriate box that best describes the degree to which you agree that my selection for the anchor fits within the criteria expressed by the designers of Anchored Instruction.

                                                Strongly Agree    Agree    Disagree    Strongly Disagree
My selection for the anchor

If you Disagree or Strongly Disagree with my choice for an anchor, please share why.


Appendix O (Continued)

(Principles presented by McLarty et al., 1989 & CTGV, 1997)

#1. Choosing an appropriate anchor. (Addressed separately above.)

#2. Possess a generative learning format. The anchored environment involves complex situations that create a meaningful context for problem solving.

#3. Developing shared expertise around the anchor. Students (or preservice teachers) need multiple opportunities to view the anchor and be engaged in problem solving. Discussion based upon the shared context of the anchor helps students comprehend and organize the information.

#4. Expanding the anchor. One anchor may not meet all the learning objectives that have been set forth. Students may need more than one experience with the anchor to enable acquiring more balanced information, which could facilitate comparisons or contrasts between anchored experiences.

#5. Using knowledge as a tool. The anchor provides students with a meaningful context from which they acquire new information; that knowledge can then serve as a tool in new settings (e.g., the actual classroom).

#6. Merging the anchor. The anchor will provide opportunities for using oral language, reading, writing, and participating in other literacy-related skills (e.g., cooperative work and classroom discussion).

#7. Allowing student exploration. Giving students/preservice teachers access to, and opportunities to explore, the elements and concepts surrounding the anchor helps them to develop a sense of expertise. (Examining the microworlds and their features would be encouraged.) Please realize that participants will experience a pre-study orientation session designed to acquaint them with the various features of both microworlds used in this study.

Section 3 ~ Design Principles


Appendix O (Continued)

#8. Provide opportunity for participants to share what was learned from the anchored instruction. My study addresses this principle (see the Conceptual Framework document included in your packet).

Please check the box that best describes the degree to which you feel the design principles of Anchored Instruction are addressed by the materials of my study.

                                                Strongly Agree    Agree    Disagree    Strongly Disagree
1. Choosing an appropriate anchor
2. Possess a generative learning format
3. Developing shared expertise around the anchor
4. Expanding the anchor
5. Using knowledge as a tool
6. Merging the anchor
7. Allowing student exploration
8. Provide opportunity for participants to share what was learned from the anchored instruction

If you Disagree or Strongly Disagree that my materials address a specific design principle, please share your rationale(s) below. (More room was left in the actual survey.)


Appendix O (Continued)

Please check the box that best describes the degree to which you anticipate that my materials and procedures will cause anchored instruction to occur for the participants of my study.

                                                Strongly Agree    Agree    Disagree    Strongly Disagree
My study's participants will experience anchored instruction

If you Disagree or Strongly Disagree that my study's participants will experience anchored instruction, please share why.


ABOUT THE AUTHOR

Matthew Kellogg earned his bachelor's degree from Bob Jones University in 1988. Immediately after graduation, he accepted a job teaching mathematics in northeast PA for grades 8-12. He returned to school at DeSales University in PA to enter a master's program and completed that degree in 1998. He continued teaching high school mathematics and, in 2000, accepted a position with Clearwater Christian College (CCC) teaching mathematics, IT, and undergraduate mathematics methods courses for preservice teachers. Also in 2000, he took an adjunct position with St. Petersburg College (SPC), where he continues to teach various mathematics courses. In 2003, while still employed with CCC and SPC, he entered the Ph.D. program in Mathematics Education at the University of South Florida. While in the Ph.D. program, Mr. Kellogg continued to teach mathematics, IT, and methods courses at CCC; he has also been published in a technology education journal, worked closely with elementary and secondary interns, and conducted an in-service session for elementary classroom teachers.

