Evaluation of HQT Online Courses: Growth of Participants' Technology, Pedagogy and Content Knowledge (TPACK)

A dissertation presented to the faculty of The Patton College of Education of Ohio University

In partial fulfillment of the requirements for the degree Doctor of Philosophy

Cheryle D. McGlothlin

December 2014

© 2014 Cheryle D. McGlothlin. All Rights Reserved.

This dissertation titled Evaluation of HQT Online Courses: Growth of Participants' Technology, Pedagogy and Content Knowledge (TPACK)

by CHERYLE D. MCGLOTHLIN

has been approved for the Department of Educational Studies and The Patton College of Education by

Teresa J. Franklin Professor of Educational Studies

Renée A. Middleton Dean, The Patton College of Education

ABSTRACT

MCGLOTHLIN, CHERYLE D., Ph.D., August 2014, Curriculum and Instruction, Instructional Technology

Evaluation of HQT Online Courses: Growth of Participants' Technology, Pedagogy and Content Knowledge (TPACK)

Director of Dissertation: Teresa J. Franklin

This research project examined pre- and post-survey scores on the seven constructs of TPACK to determine the impact of participation in an online course with respect to the constructs outlined in the TPACK instrument. Three online courses were used in the study, with a total of 36 participants (mathematics 6, science 17, and social studies 13). The intent of this concurrent mixed-methods study was to determine whether online courses taken by teachers can help address the lack of technology integration in the classroom through a learn-by-doing model. In the first, quantitative phase, the relationship among the constructs of TPACK for teachers who participated in an online course not specifically aligned to the TPACK model was examined using t-tests, ANOVA, and multiple regression analysis. In the second phase, the comment areas of the pre- and post-surveys and the discussion forums within the online courses were examined to give voice to the participants. The responses in the comment sections and the discussion forums were analyzed inductively using thematic content analysis, a common approach in grounded theory (Burnard et al., 2008). Finally, as part of the course requirements, participants created a lesson plan using a template; the Technology Integration Assessment Rubric (TIAR) was used to score the lesson plans.

The four overarching themes of the TIAR instrument are: 1) the ways in which technology was used with the specific curriculum; 2) whether the technology was used by the teacher and/or by the students; 3) whether technology was used in a way that supports the teaching strategies outlined; and 4) whether the technology use aligns with curriculum goals and strategies, and how well content, pedagogy, and technology fit together. The t-tests demonstrated statistical significance in each of the courses; however, the social studies course demonstrated the most consistent significance and the greatest effect sizes across the constructs. The ANOVA analysis did not reveal any clear patterns in the demographic information. The multiple regressions provided some information on the relationship of the demographic variables, specifically gender and age. The data from the lesson plans supported this analysis. Finally, the discussion forums clearly demonstrated an increase in content, pedagogy, and technology integration, especially for the social studies course, which warranted further analysis of the courses. This examination determined that the facilitator of the social studies course had a participation log approximately eight times greater than that of the mathematics facilitator and three times greater than that of the science facilitator. This variation in facilitator presence may account for the variance between surveys. Further research needs to include classroom observations, teacher perception analysis, and a comparative analysis in order to determine the improved use of pedagogy and technology integration.
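As a rough illustration of the pre/post comparison described above, the paired t statistic and its effect size (Cohen's d for paired samples) can be computed directly from matched survey scores. This is a minimal sketch using hypothetical Likert-scale responses for illustration only, not the study's data:

```python
import math

def paired_t_and_cohens_d(pre, post):
    """Paired t statistic and Cohen's d for pre/post scores.

    t = mean(diff) / (sd(diff) / sqrt(n)), with df = n - 1;
    Cohen's d (paired) = mean(diff) / sd(diff).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample standard deviation of the differences (ddof = 1)
    sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in diffs) / (n - 1))
    t = mean_d / (sd_d / math.sqrt(n))
    d = mean_d / sd_d
    return t, d, n - 1

# Hypothetical 5-point Likert responses, not data from this study
pre = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3]
post = [3, 4, 3, 4, 3, 3, 4, 4, 3, 4]
t, d, df = paired_t_and_cohens_d(pre, post)
# t ≈ 6.71, d ≈ 2.12, df = 9
```

Because each teacher answers both surveys, the denominator uses the standard deviation of the within-person differences rather than the pooled group variance, which is what distinguishes the paired test and its effect size from the independent-samples versions.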

DEDICATION

I would like to dedicate this body of work to Pam Paris and Ned Basinger; the gift of their friendship and support has helped me realize my dreams.

ACKNOWLEDGMENTS

I want to thank my advisor, Dr. Teresa Franklin. Without her guidance, patience, encouragement, and friendship, I would not have been able to reach the personal goal of completing this degree. I would also like to thank my committee members; each of you has brought your own expertise to the table and has expanded my view of the world. Dr. Don Flournoy, your work in telecommunications has helped me develop a more global perspective of education. Dr. David Moore, your knowledge of graphic software and your ability to bring great discussion and learning from Socratic seminars have given me the courage to venture forward in both of these areas as well. Dr. Mark Weston, you have taught me by example that integration is not about hardware; it is about the end goal of teaching and learning and the best device to get you there. I would be remiss if I didn't thank my children and their families, who have steadfastly been in my corner with their gentle words of encouragement. I want to give a special thank you to my co-workers: Tom Reed, my supervisor, whose knowledge of statistics saw me through and helped me think about my data in ways I had never imagined; and Teresa Dempsey, Sandy Denney, Jean Kugler, TJ Smith, and my husband at work, Will Kirby, thank you for listening to me whine and yet telling me to keep going and that you had faith in me. Finally, I could never have completed this degree without my belief in a higher power and the support of my church; they have carried me when giving up seemed like the best option.

TABLE OF CONTENTS

Abstract
Dedication
Acknowledgments
List of Tables
List of Figures
Chapter 1: Introduction
    Research Question
    Significance of the Study
    Participants
    Scope of Study
    Limitations of the Study
    Definitions of Terms
    Organization of the Study
Chapter 2: Review of Literature
    Impact
    Barriers to Integration
    Teacher Education
    Technology, Pedagogy, Content Knowledge (TPACK)
        TPACK Mixed Methodology Research
        TPACK Quantitative Research
        TPACK Qualitative Research
    Summary
Chapter 3: Methodology
    Purpose
    Research Question
    Design of Study
    Data Sources
    HQT Courses History and Description
        Highly Qualified Teacher -- Mathematics Course Description
        Highly Qualified Teacher -- Science Course Description
        Highly Qualified Teacher -- Social Studies Course Description
        Introduction to Social Studies Participants
    Procedures
    Data Analysis
    Researcher Bias
    Chapter Summary
Chapter 4: Findings
    Assumptions
    Participant Analysis
    TPACK Survey Results
    Cronbach's Alpha Analysis of TPACK Constructs
    T-test Analysis of TPACK Constructs
        Technology Knowledge Questions from TPACK Survey
        Content Knowledge Questions from TPACK Survey
        Pedagogy Knowledge Questions from TPACK Survey
        Pedagogical and Technological Content Knowledge Questions from TPACK Survey
        Technology Pedagogical Knowledge Questions from TPACK Survey
        Technology Pedagogical and Content Knowledge Questions from TPACK Survey
        Combined Construct Questions
        Summary Survey T-Test Results
        Effect Size of T-Test for TPACK Constructs
        Summary of T-Test Effect Size
    ANOVA Analysis of TPACK Constructs
        ANOVA Mathematics Pre and Post Survey
        ANOVA Science Pre and Post Survey
        ANOVA Social Studies Pre and Post Survey
        ANOVA Combined Course Pre and Post Survey
        Summary of Pre/Post Survey ANOVA Analysis
        ANOVA Effect Size
        Summary of ANOVA Effect Size
    Multiple Regression Analysis of TPACK Constructs
        Mathematics Pre-Post Survey Regression Analysis
        Science Pre-Post Survey Regression Analysis
        Social Studies Pre-Post Survey Regression Analysis
        All Datasets Pre-Post Survey Regression Analysis
        Effect Size of Multiple Regression
        Summary of Effect Size of Multiple Regression
    Pre and Post Survey Comments
        Mathematics Pre/Post Survey Open-ended Responses
        Science Pre/Post Survey Open-ended Responses
        Social Studies Pre/Post Survey Open-ended Responses
    Participant Lesson Plan Data Analysis
        Cronbach's Alpha for Lesson Plans
        One Sample T-Test Results for Lesson Plan Rubric
        ANOVA for Lesson Plan Rubrics
    Participant Discussion Forum Data and Analysis
        Math Discussion Forum
        Science Discussion Forum
        Social Studies Discussion Forum
    Chapter Summary
Chapter 5: Discussion
    Key Findings
    Limitations
    Eliminating Limitations
    Recommendations
    Future Research
    Conclusion
References
Appendix A: Survey of Pre-Service Teachers' Knowledge of Teaching & Technology
Appendix B: Science Pre/Post Survey
Appendix C: Mathematics Pre/Post Survey
Appendix D: Social Studies Pre/Post Survey
Appendix E: TPACK Interview Protocol
Appendix F: Technology Integration Assessment Rubric
Appendix G: Permission to Use Survey
Appendix H: Permission to Use TPACK Diagram
Appendix I: Creative Commons License Technology Integration Assessment Rubric
Appendix J: Permission to Use Interview Protocol
Appendix K: Completion Report
Appendix L: Internal Review Board Exemption
Appendix M: Ohio University Online Consent Form
Appendix N: Email Contact Aug. 26
Appendix O: Email Contact Sept. 6
Appendix P: Email Contact Sept. 16
Appendix Q: Phone Contact Chart
Appendix R: Mathematics Pearson Correlation
Appendix S: Science Pearson Correlation
Appendix T: Social Studies Pearson Correlation
Appendix U: Combined Subject Pearson Correlation

LIST OF TABLES

Table 1: Survey Internal Consistency of Each Construct
Table 2: Technology Integration Assessment Rubric
Table 3: Breakdown of Course Participant Demographics
Table 4: Reliability Statistics for Mathematics TPACK Survey
Table 5: Reliability Statistics for Science TPACK Survey
Table 6: Reliability Statistics for Social Studies TPACK Survey
Table 7: Reliability Statistics for Combined Subject TPACK Survey
Table 8: T-test Analysis Question 1
Table 9: T-test Analysis Question 2
Table 10: T-test Analysis Question 3
Table 11: T-test Analysis Question 4
Table 12: T-test Analysis Question 5
Table 13: T-test Analysis Question 6
Table 14: T-test Analysis Question 7
Table 15: T-test Analysis Question 8
Table 16: T-test Analysis Question 9
Table 17: T-test Analysis Question 10
Table 18: T-test Analysis Question 11
Table 19: T-test Analysis Question 12
Table 20: T-test Analysis Question 13
Table 21: T-test Analysis Question 14
Table 22: T-test Analysis Question 15
Table 23: T-test Analysis Question 16
Table 24: T-test Analysis Question 17
Table 25: T-test Analysis Question 18
Table 26: T-test Analysis Question 19
Table 27: T-test Analysis Question 20
Table 28: T-test Analysis Question 21
Table 29: T-test Analysis Question 22
Table 30: T-test Analysis Question 23
Table 31: T-test Analysis Question 24
Table 32: T-test Analysis Question 25
Table 33: T-test Analysis Question 26
Table 34: T-test Analysis Question 27
Table 35: T-test Analysis Question 28
Table 36: T-test Analysis Combined Questions TK
Table 37: T-test Analysis Combined Questions CK
Table 38: T-test Analysis Combined Questions PK
Table 39: T-test Analysis Combined Questions PCK
Table 40: T-test Analysis Combined Questions TCK
Table 41: T-test Analysis Combined Questions TPK
Table 42: T-test Analysis Combined Questions TPACK
Table 43: Mathematics Effect Size
Table 44: Science Effect Size
Table 45: Social Studies Effect Size
Table 46: Combined Subject Effect Size
Table 47: Combined Construct Effect Size
Table 48: ANOVA Math Survey Questions
Table 49: ANOVA Science Survey Constructs
Table 50: ANOVA Science Survey Questions
Table 51: ANOVA Social Studies Survey Constructs
Table 52: ANOVA Social Studies Survey Questions
Table 53: ANOVA Combined Course Survey Constructs
Table 54: ANOVA Combined Course Survey Questions
Table 55: Mathematics Multiple Regression Table
Table 56: Science Multiple Regression Table
Table 57: Social Studies Multiple Regression Table
Table 58: All Data Multiple Regression Table
Table 59: Inter-rater Reliability Mathematics Lesson Plan
Table 60: Inter-rater Reliability Science Lesson Plan
Table 61: Inter-rater Reliability Social Studies Lesson Plan
Table 62: Inter-rater Reliability Combined Dataset

LIST OF FIGURES

Figure 1: TPACK Diagram
Figure 2: Frayer Model
Figure 3: Screenshot of Learning Segments of HQT Online Mathematics Course
Figure 4: Modified Frayer Model
Figure 5: Screenshot of Introduction for HQT Online Science Course
Figure 6: Screenshot of HQT Online Science Course
Figure 7: Screenshot of Introduction for HQT Online Social Studies Course
Figure 8: Screenshot of HQT Online Social Studies Course
Figure 9: T-Test Statistical Significance by Subject Overlaid on TPACK Diagram
Figure 10: Statistical Significance Responses

CHAPTER 1: INTRODUCTION

As a teacher of teachers for the past twenty years, I have had the opportunity to help districts, buildings, and educators adopt and integrate technology into the classroom. In this work, I have found that technology integration has not gained a foothold in helping students increase achievement because teachers do not clearly understand how technology can support curriculum and instruction. Districts have provided professional development presented as a "how to" in the use of technology, leaving two missing components that prohibit integration. First, teachers have not experienced technology as an educational tool aiding their own education, so they are unclear about what the technology can do. Second, teachers have not been given a clear link between the use of technology and the curriculum they teach. The goal of this research was to determine the growth of technological pedagogical content knowledge through immersion in an online course that allows the teacher to become the student while experimenting with and exploring the affordances of the system and the resources on the World Wide Web that support their subject matter. If immersion in the online environment while participating in content-specific online courses demonstrates an increase in TPACK, this research can provide all stakeholders another tool to support the goal of widespread technology integration.

There have been many technology integration models and frameworks to measure and guide teachers in their use of technology. Some of the measurement frameworks include the Levels of Technology Integration (LoTI) scale, the Apple Classrooms of Tomorrow (ACOT), and the International Society for Technology in Education (ISTE)

National Education Technology Standards for Teachers (NETS-T) 2000 Standards (ISTE, 2008). These frameworks were presented from the perspective of the generalist and strictly address what teachers do, not what they know (Graham et al., 2009). The technological pedagogical content knowledge (TPACK) framework provides a holistic viewpoint of how knowledge correlates with successful integration of technology into learning environments (Polly & Brantley-Dias, 2009). With this approach, TPACK accounts for both what teachers know and what teachers do in classroom instruction with regard to technology integration (Polly & Brantley-Dias, 2009).

TPACK is an extension of Shulman's pivotal 1986 work, which examined the growth of teachers through three constructs of teacher knowledge: content knowledge, pedagogical knowledge, and pedagogical content knowledge. Further work by Margerum-Lays and Marx (2003) sought to embed technology within the framework, referring to PCK for the use of instructional technology. Slough and Connell (2006) used the term technological content knowledge; finally, Mishra and Koehler (2006) suggested the term technological pedagogical content knowledge (TPCK). TPCK, currently referred to as TPACK, has become the comprehensive term used in the literature for this framework (Angeli & Valanides, 2009; Mishra et al., 2009).

TPACK was first introduced as a conceptual framework by Mishra and Koehler (2006). It has since been used increasingly as a theoretical framework for understanding what teachers need in order to effectively integrate technology into classroom instruction (Cox, 2008; Mishra & Koehler, 2006; Polly & Brantley-Dias, 2009). By building on the

work of Shulman (1986) and others, Mishra and Koehler have fostered a clear linkage between technology, pedagogy, and content knowledge and how it might be used to support increased effectiveness of teaching and learning (Archambault & Crippen, 2009). Within the framework, "TPACK is broken into seven constructs: technology knowledge (TK), pedagogical knowledge (PK), content knowledge (CK), technological pedagogical knowledge (TPK), technological content knowledge (TCK), pedagogical content knowledge (PCK) and technological pedagogical content knowledge (TPACK)" (Schmidt et al., 2009, p. 125). Through these seven constructs, the framework provides a snapshot of how the various types of knowledge overlap and are interrelated. The framework and its constructs are used and referred to throughout this research project. As demonstrated in the diagram (Figure 1), the framework is a useful tool for examining what knowledge teachers might need to develop to successfully integrate technology into their teaching practice.


Figure 1: Mishra, P., & Koehler, M. (2011). TPACK - technology pedagogy content knowledge. In Got TPACK? Retrieved June 17, 2011, from http://tpack.org (Creative Commons License)

The Venn diagram of the seven constructs (Figure 1) shows how the constructs overlap and combine. Through this figure we are able to see the main independent constructs of Pedagogical Knowledge, Content Knowledge, and Technology Knowledge. The figure further demonstrates how overlapping and combining the independent constructs creates new constructs within the framework, specifically Technological Content Knowledge, Pedagogical Content Knowledge, Technological Pedagogical Knowledge, and Technological Pedagogical Content Knowledge. To understand how the constructs are intertwined, it is important to define and clarify each one.

Content knowledge (CK) refers to the knowledge of the subject matter required for teaching. Teachers should possess the subject matter's structure, purposes, and ideas so they are able to assist students in gaining subject matter literacy (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009; Shulman, 1986).

Pedagogical knowledge (PK) is the educator's knowledge of appropriate strategies to use when teaching content. Within this construct, teachers must have a clear understanding of how students construct knowledge and acquire skills. Teachers are able to apply cognitive, social, and developmental theories of learning when they possess pedagogical knowledge (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009; Shulman, 1986).

Technology knowledge (TK) involves knowledge of technologies ranging from common, simple ones such as pencil and paper to the internet and digital resources of all types that teachers and students can access through various devices such as computers, smart boards, or smartphones. Technology knowledge is fluid and changes rapidly because technology shifts exponentially (Koehler & Mishra, 2009; Schmidt et al., 2009).

Technological pedagogical knowledge (TPK) is the ability to know how classroom practice can be changed when technologies are used in a specific manner that addresses pedagogy and content requirements; in this way, the dynamics of learning are changed. In order to use technology efficiently and effectively, teachers need to know its constraints and affordances (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009).

Shulman (1986) points out that pedagogical content knowledge (PCK) has the power to transform the classroom. PCK represents the teacher's ability to present the content in multiple ways; it allows teachers to adapt the presentation and learning of content to the needs of the student. Within PCK, teachers are aware of common misconceptions and are able to address them. They promote learning that ties the curriculum together in a meaningful way, connecting to students' prior knowledge (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009; Shulman, 1986).

Technology content knowledge (TCK) is the ability of teachers to know what technology is appropriate for a given discipline. Whereas TPK concerns the constraints and affordances of technology for presenting or representing content, TCK requires an understanding of how technologies are best suited to addressing specific content within a domain. Through that understanding, teachers are able to decide what technology to use based on the constraints or affordances a specific technology allows (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009).

Technological pedagogical content knowledge (TPACK) demonstrates the teacher's ability to put all of the pieces together. Teachers who have reached this level are able to successfully integrate technology into their content area. They have an instinctive understanding of the multifaceted relationship among the basic constructs (CK, PK, TK), using appropriate technologies and demonstrating concrete pedagogical methods within the subject area (Koehler & Mishra, 2009; Mishra & Koehler, 2006; Schmidt et al., 2009).

TPACK is also the ability of teachers to use all of the tools at their disposal to elevate student learning. The framework affords the successful integration of technology into daily classroom practice, weaving together all three sources of knowledge: technology, pedagogy, and content. The relationship among the three types of knowledge remains fluid as new technologies emerge; because of this, technology can sometimes drive the decisions made about content and pedagogy (Mishra & Koehler, 2006). Schmidt et al. (2009) suggest that the framework could potentially shape the design of professional development as these constructs are measured. Cox and Graham's (2009) work has helped to provide working definitions of the constructs; however, the framework has not been applied to practicing teachers who are taking an online course to extend their content knowledge in specific subject areas. In this research, the TPACK framework was used with classroom teachers who were participating in online courses to extend and enhance the educational opportunities of their students by deepening their understanding of content knowledge, thereby affecting the teaching practice of the individual. As a result of this type of professional development, did the teachers' content knowledge, pedagogical knowledge, technological knowledge, technological pedagogical knowledge, technology content knowledge, or technology pedagogy content knowledge change?

Within the online learning environment, it is necessary for teachers to depart from familiar traditional instructional design models because the activities and structures of teaching and learning online are highly situational, influenced by the context of the online environment and knowledge of students (Harnett, St. George & Dron, 2011). Teachers

as participants needed to become familiar with the learning content management system from the learner's perspective in order to complete course work. As teachers developed the skill to navigate the learning management system and determined how to participate in the course, they became immersed in a learning management system dedicated to specific subject matter. Teachers were asked to apply newfound resources and skills to their classroom situations as appropriate. For these reasons, a mixed methods approach was applied to complete the research.

The focus of this study was the measurement of the mean differences of the seven constructs found within the TPACK framework, using a pre and post measurement, and the examination of lesson plans and discussion forums within a learning management system. Statistical analysis using t-tests, ANOVA, and multiple regression was used to determine the significance and effect of the data sets related to the growth of TPACK. In order to determine whether the experience of participating in an online class changed the ability of classroom teachers to integrate technology appropriately, lesson plans were scored using the Technology Integration Assessment Rubric found in Appendix F (Harris, Grandgenett & Hofer, 2012). Originally, the design of the study included the use of random semi-structured interviews; however, due to the unavailability of participants and time constraints, an examination of discussion forums within the course was completed using grounded theory methodology. By completing the analysis, recommendations were made concerning the revision and development of online courses that support increased integration of appropriate technologies into the classroom on a broader scale.
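As a rough illustration of the pre/post comparison described above, the paired t-test for a single TPACK construct can be sketched with the standard library. The Likert-scale scores below are invented for illustration only; they are not the study's data.

```python
# Illustrative sketch (hypothetical data): paired t-test on pre/post
# survey means for one TPACK construct, using only the standard library.
import math
import statistics

pre = [3.2, 2.8, 3.5, 3.0, 2.6, 3.1]    # hypothetical pre-course means
post = [3.9, 3.4, 3.6, 3.8, 3.2, 3.5]   # hypothetical post-course means

# Paired design: test whether the mean per-participant gain differs from zero.
diffs = [b - a for a, b in zip(pre, post)]
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)                      # sample standard deviation
t_stat = mean_d / (sd_d / math.sqrt(len(diffs)))    # t with n-1 df
print(f"mean gain = {mean_d:.2f}, t({len(diffs) - 1}) = {t_stat:.2f}")
```

The resulting t statistic would be compared against the t distribution with n-1 degrees of freedom; in practice a statistical package (e.g., SPSS or `scipy.stats.ttest_rel`) reports the p-value directly.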

Research Question

The following research question was the focus of this study:

1. Does the Highly Qualified Teacher Online Professional Development change classroom teacher practice as evaluated by the TPACK criteria?

Significance of the Study

By analyzing the self-reports of the pre and post surveys, the lesson plans, and the discussion forums, a determination was made of whether or not immersing teachers in an online course impacted the development of TPACK. As teachers develop their own understanding of the affordances of technology and how it can be embedded in the content, they are able to identify and use technology as a teaching and learning tool (Koehler & Mishra, 2009). Significant monies have been provided to school districts through federal government programs such as the Enhancing Education Through Technology (Ed Tech) Program (U.S. Department of Education, 2009). These funds, made available under the American Recovery and Reinvestment Act of 2009 (ARRA), were provided for the purchase of technology and related professional development (U.S. Department of Education, 2009). Even so, the use of technology continues to be hit and miss in the classroom (Angeli & Valanides, 2009; Bain & Weston, 2012; Moersch, 1999; Mishra, Koehler & Yahya, 2007; Ward & Parr, 2009). If immersion in the online environment while participating in content-specific online courses demonstrates an increase in TPACK as measured by pre and post survey means, lesson plans, and online discussion forums, the research could provide all stakeholders another tool to advance the goal of widespread integration of technology.
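The word-usage side of the discussion forum analysis can be sketched as a simple frequency tally, a common first pass before grouping words into themes. The posts and stop-word list below are hypothetical, invented only to show the mechanics; they are not drawn from the study's forums.

```python
# Minimal sketch (hypothetical forum posts): tally word frequencies as a
# first pass toward classifying discussions by word usage.
import re
from collections import Counter

posts = [
    "I used the simulation in my classroom and students were engaged.",
    "The classroom discussion improved once students tried the simulation.",
]

# Small illustrative stop-word list; a real analysis would use a fuller one.
stop_words = {"i", "the", "in", "my", "and", "were", "once", "tried"}

words = re.findall(r"[a-z]+", " ".join(posts).lower())
freq = Counter(w for w in words if w not in stop_words)
print(freq.most_common(3))
```

High-frequency content words surfaced this way (here, "simulation," "classroom," "students") can then be read back in context and coded into themes, as thematic content analysis requires.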

The courses in the study were online, requiring a certain amount of self-regulated learning (Usta, 2011). Self-regulated learning is not spontaneously acquired by learners (Anderton, 2006). When using hypermedia technologies, learners experience cognitive confusion (Bidarra & Dias, 2003; Kramarski & Michalsky, 2010). Often they do not know whether they are on the right track; they are unsure of what strategies to use to facilitate learning and of how and when to use strategies to reach their learning objectives. Kramarski and Michalsky (2010) suggest that through participation in programs that provide these opportunities, teachers are better able to regulate their learning and teaching experiences.

Technological Pedagogical Content Knowledge was used as a framework with self-regulated learners in an online environment in an attempt to determine growth in the TPACK constructs. Koehler (2007) explains that TPACK should be considered a framework for guiding and explaining the educator's thinking regarding technology integration. In this study, an attempt was made to determine teacher growth in each of the seven constructs of the TPACK framework, using the means of pre and post surveys, as a result of taking an online course meant to increase content knowledge. The TPACK framework was separated into the seven constructs to examine similarities and differences across the courses under consideration. Lesson plans and discussion forums were examined using proven tools to determine whether classroom practice had been impacted as a result of taking the online course. By doing so, potential changes may be outlined for the development of online professional development for in-service teachers that will support teachers in integrating technology more effectively into their specific content areas. It is also

hoped that this project will provide insight as to whether or not immersing the professional educator in an experiential learning situation within the online environment may serve as a model for teacher educators to conduct professional development in the online environment with the greatest impact.

Participants

The participants of the study were self-selected educators. Their demographic information was collected; it varied widely in terms of education level, years in the profession, and age. There were 6 consenting participants in the mathematics online course, 2 males and 4 females. Five of the participants had master's degrees and taught grades 7-12. In the science online course, there were 17 consenting participants, 10 male and 7 female. Eleven of the science participants had bachelor's degrees, leaving 6 participants who had earned master's degrees. This group of participants taught grades 2-12. The social studies online course had 2 male and 11 female consenting participants, for a total of 13. Nine of the participants had master's degrees, while four had earned bachelor's degrees. This group of participants also taught grades 2-12. The participants were special education educators (for whom extended content subject matter courses are required), intervention specialists, general content educators, and educational consultants. While the courses were targeted at practicing classroom teachers, it was found that some participants took the class to maintain their teaching licensure and to deepen their personal learning of content.

Scope of Study

The study first considered the seven constructs of TPACK present in the participants enrolled in the mathematics, science, and social studies online courses through a self-report survey instrument administered at the beginning of each course. A re-examination and comparison to determine growth occurred at the end of the course, as the participants retook the self-report survey. The questions of the instrument addressed the seven constructs: technology knowledge (TK), pedagogical knowledge (PK), content knowledge (CK), technology content knowledge (TCK), technology pedagogical knowledge (TPK), pedagogical content knowledge (PCK), and technology pedagogical content knowledge (TPACK). A lesson plan from each participant was downloaded and scored using the Technology Integration Assessment Rubric. Discussion forums were viewed, and discussions were classified through the use of themes and word usage. Since this was an exploratory research project and the first to use the TPACK constructs with in-service teachers in relation to participation in an online course focused on the acquisition of content knowledge rather than structured around the TPACK framework, this work will be expanded upon by others.

Limitations of the Study

The study was limited in terms of the number of participants due to time and financial constraints. While there was a demonstrated linear correlation between each of the constructs, this was the first attempt to triangulate the statistical analysis with qualitative data sources in relation to activities completed by teachers in an online course. The participants of the study were self-selected because they registered for the online course

in order to secure new licensing, further their degree level, meet a state requirement, or extend their knowledge in a specific content area. As a result, use of the tools and methodology will need to be tested in a broader population. The study was further limited to participants who had Internet access.

Definitions of Terms

Access: In this study, access refers to the ability of teachers and students to use technology within the educational environment; this may range from the ability to use software to the ability to use the Internet for educational purposes (Van Roekel, 2008).

Asynchronous Communication: Communication between two or more individuals that does not occur in real time but does require participants to be using the same medium, for example email, discussion forums, wikis, or blogs (Hughes & Canul, 2006).

Blended Learning: Incorporates face-to-face learning with online learning components created by the teacher, using features to personalize instruction and allowing for reflection and differentiation of instruction. This type of online learning exists on a continuum depending upon the grade level, the expertise of the teachers, and the curriculum being presented (Bonk & Graham, 2004).

Content Knowledge: "knowledge about actual subject matter that is to be learned or taught" (Mishra & Koehler, 2006, p. 1026).

Content Management System: A web-based application that allows for the creation of curricular materials. It may include websites used to store and download curricular content. In this case, the CMS used was Moodle 2.1 (Paulsen, 2003).

Distance Education: This type of learning began as correspondence courses and has now progressed to include learning through the use of the Internet. This type of education focuses on an instructional system that delivers education to a student who is not physically on site (Hughes & Canul, 2006).

Flexible Learning: Learning that allows learners to go at their own pace; it is free from constraints and logistical concerns (Goodyear, 2008).

Hybrid Learning: Incorporates face-to-face learning with a content management system that cannot be changed or influenced by the instructor. The online component and the face-to-face instruction support the student in the process of learning (Riffell & Sibley, 2003; Jackson & Helms, 2008).

Hypermedia: A technology environment which incorporates a variety of embedded learning objects such as video clips, audio, hypertext, and animations (Singer, 1995; Niederhauser, 2008).

Independent Learning: Refers to the concept that learners are involved in their own learning process; through this involvement, learners begin to make connections to the larger community (Race, 2005).

Learning: Viewed as a process of inputs, which become short-term memory and are then encoded for long-term memory recall (Siemens, 2004).

Learning Content Management System: A type of web-based application that incorporates all of the attributes of a content management system and a learning management system (Hughes & Canul, 2006).

Learning Management System: A system, accessed from the Internet, that provides access to course content; it allows for enrollment, a course catalog, skills management in terms of activities, asynchronous and synchronous communication tools, and assessment of work completed (Paulsen, 2003).

Online Learning: Learning that takes place in a digital environment; it can range from activities embedded in a website to a full class placed online within a content/learning management system. Components of online learning include the use of intranet/Internet tools and resources (Hughes & Canul, 2006).

Pedagogical Knowledge: Refers to the methods, strategies, and processes of teaching. This type of knowledge includes classroom management, lesson plan development, assessment, and student learning (Shulman, 1986).

Pedagogical Content Knowledge: Knowledge in which strategies and methods of teaching the content are intertwined. Educators must be aware of appropriate strategies to use for specific content, taking into account both the process of teaching and an in-depth knowledge of the content within their specific areas of study (Shulman, 1986).

Self-Regulated Learning: An active, goal-setting learning process in which learners attempt to control their cognition, motivation, and behavior through monitoring and regulating (U.S. Department of Education, 2011).

Synchronous: In the online environment, refers to communication happening among a group simultaneously, as in a live chat, in real time (Paulsen, 2003).

Technology Integration: The pervasive and productive use of technologies for the purposes of curriculum-based learning and instruction (Harris, 2008).

Technological Knowledge: Refers to knowledge of various types of technology, ranging from low-tech such as sticky notes, pencil, and paper to high-tech in the form of digital technologies using the Internet, video, whiteboards, social networking, and cloud technology (Mishra & Koehler, 2006).

Technological Pedagogical Content Knowledge: Refers to the knowledge that is required for the integration of technology into any given content area. It can be an intuitive understanding of the interconnectedness of the components of technological, pedagogical, and content knowledge (Mishra & Koehler, 2006, p. 1026).

TPACK: Mishra and Koehler (2006) highlighted the connections and interactions among content, pedagogy, and technology and referred to them as TPCK. Niess et al. (2009), along with Thompson and Mishra (2008), renamed the construct TPACK to reflect the complex relationship of the constructs that results in successful integration of technology into the classroom.

TPACK Teacher: One who develops a "thoughtful interweaving of all three key sources of knowledge: technology, pedagogy and content" (Mishra & Koehler, 2006).

Traditional Education: A setting where people go to a physical location and attend a face-to-face, instructor-led course, referred to as classroom instruction. Traditional instruction can vary from teacher centered to student centered (Relan & Gillani, 1997).

Organization of the Study

The study is arranged into five chapters. The first chapter includes the statement of the problem, research question, significance of the study, description of participants, scope of the study, limitations of the study, definitions of terms, and the

organization of the study. Chapter two includes a brief literature review of barriers to technology use, changes in teacher education, and technological pedagogical content knowledge (TPACK). The literature review is then divided into the types of research in the TPACK field, including TPACK mixed methodology research, TPACK quantitative research, and TPACK qualitative research, and closes with a chapter summary. Chapter three includes an outline of the methodology used in the study: purpose, research question, design of the study, data sources, procedures, data analysis, research bias, and chapter summary. Chapter four contains the findings, opening with a discussion of participant demographic information, followed by statistical analysis of the pre and post surveys, participants' lesson plan data and analysis, and discussion forum analysis, along with a chapter summary. Chapter five includes a discussion of the research, key findings, the limitations of the study, eliminating limitations, recommendations, future research, and the conclusion.

CHAPTER 2: REVIEW OF LITERATURE

The literature review of chapter two presents background information on the impact of technology in the classroom. It addresses research on barriers to integration and on teacher education. Further research is presented on technology, pedagogy, and content knowledge (TPACK). These topics are presented using research that demonstrates mixed methods, quantitative, and qualitative investigative methodologies.

Impact

Past research regarding the impact of technology on the educational process, including educational experiences, the achievement gap, and the ability to engage students in their own education, has demonstrated a positive impact (Bowerman, 2005; Burchett, Cradler, McNabb & Freeman, 2002; Chapman & DeBell, 2006; U.S. Department of Education, 2010). A commonly held belief has been that educational technology may improve instruction and ultimately improve the education of students. In general, technology may be part of the solution for the problems and shortcomings of education (Ascione, 2006; McGillivray, 2000a, 2000b; Bonk, 2009). Other research demonstrates the exact opposite, stating that technology in education has had a minimal impact on how teachers teach and on how students learn (Herrington & Kervin, 2007; Jaillet, 2004; Weston & Bain, 2010). While pockets of success exist on a small scale, wide-scale educational reform through the use of technology has not been seen (Bain & Weston, 2012; Weston & Bain, 2014). Achievement by students has not mirrored the expectations

or predictions (Bain & Weston, 2012; Becker, 2000b; Cuban, 2001; Cuban, Kirkpatrick & Peck, 2001; Herrington & Kervin, 2007; Jaillet, 2004).

Due to high-stakes testing and accountability practices, an emphasis has been placed on the quality of teaching practices in order to improve student outcomes (Darling-Hammond & Rustique-Forrester, 2005). Researchers have agreed that skillful teachers are the single most important factor in the classroom (Alton-Lee, 2003; Darling-Hammond, 2000; Darling-Hammond & Rustique-Forrester, 2005). It is important to engage skillful teachers in the integration of technology if dollars are to be spent wisely (Apple Education, 2009; Sun, Heath, Byrom, Phlegar & Dimock, 2000). In alignment with teacher quality is the continued questioning of the ongoing practice of purchasing technology with dollars from an ever-shrinking education budget and of the extent to which that technology impacts learning (Ward & Parr, 2009).

Ward and Parr (2009) conducted a study to examine the extent to which, and the purposes for which, teachers utilize Information and Communication Technology (ICT). The study was conducted in four schools located in New Zealand that were considered advanced in the area of technology due to their infrastructure and commitment to professional development. Secondary teachers were surveyed regarding their use of technology for administrative tasks, preparation for teaching, student work, and personal use. Some of the tasks included research using the Internet, working independently, creating presentations, collaborating with classmates, mastering skills, practice drills specific to the subject, and simulations or games. The overall mean level of use for all areas was 2.2 (rarely); on the

whole, this suggested the participants were not high users of technology despite the infrastructure and professional development provided (Ward & Parr, 2009).

The survey highlighted three findings. First, pedagogical use was significantly impacted by the teachers' readiness to use ICT. Second, student use within the classroom was not impacted by the participant's skill level, indicating that skill building alone will not precipitate a change in classroom practice. Lastly, there were relational inconsistencies between the factors, pointing to the complex nature of determining what impacts a paradigm shift in classroom practice (Ward & Parr, 2009).

Barriers to Integration

While researchers continue to monitor the extent and purpose of the educational usage of ICT (Bain & Weston, 2012; Ward & Parr, 2009), others examine the areas of access, support, training, educational pressures, and school conditions (Becker, 2001; Berge & Muilenburg, 2006; Drayton, Falk, Stroud, Hobbs & Hammerman, 2010; Hernandez-Ramos, 2005; Imhof, Vollmeyer & Beierlein, 2006; IESD, 2011; Kadijevich, 2006; Li, 2007; Moersch, 1995; Palak & Walls, 2009).

Access has been defined in the research by how many computers and other types of hardware are in a school or district and how many computers are connected to the Internet. Access includes questions regarding the working order of the computers or hardware and the type of software installed on the computers (Hernandez-Ramos, 2005; Smerdon et al., 2000). The number of computers in the building or district may include those that are located in the classroom or those that are located in computer labs and media centers in the building (Smerdon et al., 2000). According to the U.S. Department of

Education, in the fall of 2008, 100% of all public schools had at least one computer used for instruction with the capability to access the Internet. In 2008, the ratio of students to computers with Internet access was 3.1:1 (U.S. Department of Education, 2010); however, the ratio for computers within the classroom was found to be 5.3:1 (Gray, Lewis & Lewis, 2010). Approximately 97% of classrooms had an educational computer for easy access (U.S. Department of Education, 2010; Gray, Lewis & Lewis, 2010). Even though this percentage did not include laptop carts, it was found that 58% of schools were equipped with laptop carts (U.S. Department of Education, 2010; Gray, Lewis & Lewis, 2010).

Statistical data have been presented regarding the number of machines available; however, access goes beyond physical access or the number of machines. Cuban et al. (2001) explain that the instability of technology has left educators ambivalent regarding technology's ability to change learning outcomes. Unreliable or broken machines have become part of the access problem. For the most part, technology support teams were unable to address problems immediately, leaving teachers and students alike to mistrust the entire system. Often teachers heard administrators and the support team respond to an unanswered need by citing problems with inadequate or faulty wiring, crashing servers, and the inability of the system to work fast enough to save student work. The high cost of upgrading software and the time-consuming effort of installing it eroded the stability of the hardware and software, leaving teachers to doubt that technology could assist with teaching and learning in any sustainable way (Butler & Sellbom, 2002; Cuban, Kirkpatrick & Peck, 2001).

Along with access and lack of technical support, another barrier cited in the literature is a lack of training. Golon (2008) found that technology was being used to automate the common functions of running a classroom, such as taking attendance, reporting grades, and managing course documents; computers were not being used for instruction or by students to advance learning (Golon, 2008). Naicker (2011) states that two assumptions are made when training and support are offered to educators. First, it is assumed that educators are able to take what they have learned in a workshop, connect it back to their curriculum, and use it to facilitate deeper understanding for students. Researchers postulate that unless targeted professional development is provided, technology will not be integrated into the curriculum by educational professionals, due to a lack of skills and knowledge (Baylor & Ritchie, 2002; Naicker, 2011; Wright, 2010). The second assumption pointed out by Naicker (2011) is that once staff development has occurred, teachers are ready and willing to make a paradigm shift in their teaching practices; in reality, not all are. Providing a general workshop model for in-service teachers is inadequate for helping teachers learn new technology: a general workshop does not provide the opportunity for educators to experiment and to develop pedagogical changes in their practice (Vrasidas & Glass, 2004; Wright, 2010). When pedagogical changes are not fostered through professional development, new technologies cannot be integrated into the curriculum in ways that address student outcomes and produce artifacts that demonstrate content acquisition (Baylor & Ritchie, 2002; Darling-Hammond & Sykes, 1999; Cuban, 2001a; Vrasidas & Glass, 2004).

Educators are keenly aware of positive and negative responses from both the administration and the community in relation to the use of technology in the educational process. In order to assist innovative teachers in dealing with pressures from peers and the community, administrators need to recognize how the areas of leadership, teacher responsiveness, and school results are interwoven (Baylor & Ritchie, 2002). To promote the integration of technologies, it is necessary for stakeholders to provide resources and to establish long-term commitment and focus (Culp, Honey & Mandinach, 2003). Administrators who use technology in the course of fulfilling their daily responsibilities act as a catalyst for change within their building (Baylor & Ritchie, 2002). Baylor and Ritchie (2002) suggest that in buildings where the administrator actively models the use of technology, there is increased use of technology by teachers. Schools that reward teachers who infuse technology into their curriculum experience a higher return on the time and money spent on professional development and on upgrading software and hardware. Along with these higher returns, teachers are able to build their own capacity, affecting student learning and higher-order thinking skills (Baylor & Ritchie, 2002).

Teacher Education

As far back as Edwin J. Tapp (1943), and supported today (Glaeser, Ponzetto & Shleifer, 2006), researchers have stated that education is meant to safeguard democracy. It is through education that ideals and values can be extended to foster democracy and provide internal stability for governments (Glaeser, Ponzetto & Shleifer, 2006). Shulman (1987) explains that as a society, we engage in teaching, manifested in academic

student achievement, demonstrated through student literacy that supports freedom, personal responsibility, respect for others, and engagement in continued learning: the understandings, skills, and values necessary for a free and just society. Changes in the structure, theory, and delivery of education reflected the change from an agrarian to an industrial society. As a result, new theories and strategies for conducting education began to emerge; as shown throughout history, each new theory and strategy was introduced to support and extend student learning (Cubberley, 1919).

Sirin (2005) conducted a meta-analysis and found that early research on student achievement between 1990 and 2000 examined socioeconomic and intake factors, in contrast to current studies that have concentrated on teaching pedagogy and teacher quality (Sirin, 2005). Bloom (1984a) demonstrated through research that a teacher working one on one with a student could improve learning by two standard deviations, whereas a teacher working with an entire classroom of students produced much lower learning gains. Research has demonstrated that student achievement is enhanced when teachers possess four abilities: "1) subject matter expert, 2) pedagogical knowledge 3) positive relationships with students and 4) a positive role model for students" (Aaronson, 2007; Rowe, 2004, pp. 4-14). In order for a teacher to become a master teacher, the teacher needs to be able to determine what students know and what should happen next to scaffold learning (Hattie, 2003). Teacher quality is the overriding factor in the success of students (Goodwin, 2011; Hattie, 2009).

As we look at the world today, according to Puente (2012), a skilled teacher who implements technology in the classroom advances collaboration, the building of

partnerships, and creativity for all. This is further supported by the research of Bain and Weston (2012), who state that ICT can be used to address and streamline the challenge classroom teachers face in meeting the needs of all of their students. In order for technology to become a tool for teachers to improve student achievement, it must become more than an add-on at the end of the class (Wright, 2010).

Professional development programs in the area of technology have varied from the basic 'how-to' workshop to more advanced professional learning communities. It is a mistake to assume that just because an educator knows how to use ICT, the skills automatically translate into use in the classroom and a transformation in teaching practice (Wright, 2010). Good professional development requires more than a two-hour workshop; it requires sustained, ongoing professional interaction that assists the teacher in changing his or her practice (Brinkerhoff, 2006). This is especially true in the area of ever-changing technology (Brinkerhoff, 2006; Cuban, 2001; Laffey, 2004; Wright, 2010). Professional development programs are designed to assist educators in implementing new practices; this requires support and funding. With shrinking educational budgets, when the support and funding are gone, most of the changes in practice disappear (Solomon & Schrum, 2007).

In order to fully integrate technology into their curriculum, pre-service and in-service teachers need to experience the technology (Brush & Saye, 2009). For pre-service teachers, various models have been explored; researchers recommend the inclusion of technology integration in all courses and experiences of pre-service teachers to assist in building TPACK. It is important to provide pre-service teachers with the opportunity to

think about and utilize appropriate technology. Within the curriculum and instruction course work, pre-service teachers need to prepare lessons that embed technology. The embedding of technology needs to occur while incorporating pedagogical reasoning that includes knowledge about students, knowledge of content, knowledge of instructional strategies, knowledge of classroom management and knowledge of how to assess student learning. Concurrently, teachers need to consider how technology impacts student interaction with the subject matter (Niess, 2008). For in-service teachers, the integration of technology is not about the technology, but rather the content and the effective instructional practices and learning that are enhanced as a result of using technology (Harris, 2008). Professional development for experienced teachers should use an activity structure or activity approach (Brinkerhoff, 2006; Harris, 2008). There is a growing awareness of the role of professional development in the successful integration of technology into the classroom experience for students. Common characteristics of effective professional development programs include programs that:
• emphasize linkage to student achievement,
• train teams of teachers together,
• increase mechanisms to support the teachers' work,
• demonstrate administrative support,
• provide incentives,
• allow time to learn and access to equipment,
• build a cadre of trainers,
• designate pre-requisite skills,
• focus integration of technology as part of instruction,

• model the use of technology in the school environment, and
• provide targeted instruction (Oates, 2002, p. 13).
The characteristics of effective professional development are central, since research shows that the development of technology skills does not correlate with the ability to integrate technology into teaching; good teaching requires teachers to have a clear understanding of how technological knowledge, pedagogical knowledge and content knowledge are mutually reinforcing (Koehler, Mishra & Yahya, 2007). These tenets are used to create appropriate contexts, strategies and representations of the concepts to be taught (Koehler, Mishra & Yahya, 2007).
Technology, Pedagogy, Content Knowledge (TPACK)
Preparing teachers for the use of technology in the classroom permeates almost every plan for improvement and educational reform as a key component (Davis & Falba, 2002; International Society for Technology in Education, 2002; Kozma, 2003; National Council for Accreditation of Teacher Education, 1997; Pringle, Dawson & Adams, 2003; Weston & Bain, 2014). According to a 1993 report from the United States Department of Education entitled Using Technology to Support Education Reform, nearly all teachers acknowledged that, in the early stages of technology implementation, their jobs were made more difficult. As the affordances of the technology are taken into account, educators should be able to first consider their curriculum and then create materials that include appropriate pedagogy and technology use to promote the learning of the various learners in their classroom (Naicker, 2011). The ability to understand the larger picture of daily efforts and the ability to give direction and meaning

would be lost due to the lack of theory and a conceptual framework to guide and inform the integration of technology (Angeli & Valanides, 2009). For roughly five years there has been a systematic impetus in research circles to develop scientific principles that help explain teacher thinking about the integration of technology into the classroom (Angeli & Valanides, 2009; Mishra, Koehler & Yahya, 2007). Researchers have been working to extend Shulman's (1986, 1987) framework of pedagogical content knowledge (PCK). Angeli and Valanides (2009) believed that an extended PCK model would provide a framework for the study of teacher knowledge and for collecting and organizing data around teacher understanding about the integration of technology, referred to as technological pedagogical content knowledge (TPACK). The framework of TPACK consists of seven constructs. Briefly, the first three constructs of the TPACK framework are the basic knowledge required to build the more complex constructs. Specifically, technology knowledge (TK) refers to possessing knowledge of technology tools; pedagogy knowledge (PK) is the knowledge of different teaching strategies; and content knowledge (CK) is the fundamental knowledge of the subject matter. The more complex constructs include technological pedagogical knowledge (TPK), which requires the ability to use the appropriate technology combined with teaching strategies. Technological content knowledge (TCK) refers to the ability of teachers to know what technologies to use to represent the subject matter. Pedagogical content knowledge (PCK) is the ability to combine subject matter with the appropriate strategy for teaching. Finally, technological pedagogical content knowledge (TPACK) refers to the knowledge of using pedagogy with technology as it applies to the specific content being

taught (Mishra & Koehler, 2006; Schmidt et al., 2009; Koehler & Mishra, 2009; Hwee, Koh & Chai, 2014).
TPACK Mixed Methodology Research
The framework of TPACK came about out of necessity as colleges and universities examined ways to educate faculty in the use of technology. In research conducted by Koehler et al. (2007), faculty members were given the authentic task of creating an online course. A learning management system was introduced to six faculty members and 18 students, who used it to create online courses in their content areas. By using the system to create a course, faculty and students were able to discover the affordances of the technology. The course was designed like most graduate-level courses, with one exception: the faculty and graduate students worked collaboratively to create the course. Collaborative groups made pedagogical decisions, including the development of online communities, discussion forums and problem-based learning in the online environment. As these decisions were made, different technological tools had to be tested in order to choose the appropriate technology that would best meet the objectives in the content area. All of the participants learned the principles of effective web design, how the course would be presented to future students, and copyright issues (Koehler, Mishra & Yahya, 2007). Because prior experience with technology could affect the learning experience of the group, each team had a more experienced faculty member and a less experienced faculty member. Each faculty team was joined with a team of graduate students of similar abilities. Data were gathered from transcripts and then coded according to the categories

of technology, pedagogy, content, technological pedagogy, technological content, pedagogical content and technological pedagogical content knowledge. The categories were chosen to determine the nature and evolution of the interactions of the collaborative team as the members engaged in the design of the online course (Koehler, Mishra & Yahya, 2007). Data collected during the early, middle and late weeks of the course suggested significant differences. The early discussions of the first team centered on specific uses of technology. Discussions of technology and its relationship to pedagogy began to surface about the middle of the course. By the end of the course, the interrelationship of pedagogy and content became the conversational focus of the team. Conversely, the conversations of the second team were dominated throughout the course by the technology; content was noticeably absent from the conversation. That group worked on learning the technology by developing the website. The similarity between the two groups lay in the increase in the length of conversational threads. For the qualitative analysis of the data, a diagrammatic model was developed to represent how the design discussion developed around technology and content thinking. The data showed that faculty and students grew in their knowledge of, and sensitivity to, the complex way in which content, pedagogy and technology are intertwined, given their involvement in the instructional design of an online course. The goal of this research was to further develop a theoretical model that could be used for analyzing changes in the classroom practices of teachers (Koehler, Mishra & Yahya, 2007).
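As a rough illustration of the coding analysis described above, tallying coded utterances by course phase is one way a team's shifting conversational focus can be tracked. The codes and counts below are invented for the sketch; they are not the study's data:

```python
from collections import Counter

# Invented coded utterances from early, middle and late course weeks.
# Codes abbreviate the seven categories: T, P, C, TP, TC, PC, TPC.
coded_transcripts = {
    "early":  ["T", "T", "T", "TP", "T", "C"],
    "middle": ["T", "TP", "TP", "P", "TP", "PC"],
    "late":   ["PC", "PC", "P", "C", "TPC", "PC"],
}

for phase, codes in coded_transcripts.items():
    tally = Counter(codes)
    dominant = tally.most_common(1)[0][0]
    print(phase, dict(tally), "dominant:", dominant)
```

In this invented example the dominant code drifts from pure technology talk ("T") toward pedagogy-content talk ("PC"), mirroring the trajectory the first team showed in the study.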

Doering et al. (2009) conducted research on the professional development of in-service social studies educators incorporating the tenets of TPACK. The goal of the research was to understand how teachers' consciousness of TPACK changed as a result of participating in an online learning environment. Twenty in-service social studies teachers participated in training on the GeoThentic learning environment; eight were chosen to participate in the project. The GeoThentic learning environment encompassed numerous modules built around inquiry-based geographic problems. At the end of a one-day workshop, educators were required to return to their classrooms and teach two 50-minute periods using the GeoThentic learning environment. Qualitative and quantitative data were gathered before and after the program. The authors used a software package to analyze the data, looking for emerging codes, data patterns and themes. In order to ensure data reliability, the researchers met six times to compare, contrast and discuss the data, and to reanalyze and triangulate the data (Doering, Veletsianos, Scharber & Miller, 2009). The authors concluded that teachers perceived a positive change in their TPACK. The most improved construct was technology knowledge, with five out of eight reporting an increase. Five out of the eight indicated that their knowledge increased in at least one of the knowledge constructs. The pedagogy construct had mixed results: three indicated improvement, three reported a decrease, and two indicated no change (Doering, Veletsianos, Scharber & Miller, 2009). As educators reflected upon their TPACK through self-assessment throughout the interview process, they indicated that the professional development had a positive impact on the ability to

use the technology presented. The authors suggest that the implication of their research is the creation of professional development that combines the areas of technological, pedagogical and content knowledge as one knowledge base, because they are so intertwined (Doering, Veletsianos, Scharber & Miller, 2009). In a more recent longitudinal study spanning three semesters, eight pre-service teachers who already held a bachelor's degree in a specific content area enrolled in a master's program as a cohort. Over the course of the three semesters, the participants were enrolled in courses that addressed the social and historical foundations of education, trends and current issues, educational psychology and research, methods courses specific to their areas of study, an educational technology course and, finally, a practicum during the last semester. Data were collected over the three semesters at multiple data points. Data sources included the TPACK survey, reflection, and snapshot assignments completed at each of the four data points. The reflection and snapshot assignments asked students to discuss when it is appropriate to use technology and when it is not. As a last data point, the pre-service teachers were asked to complete two lesson plans: one at the end of the educational technology course and another during their practicum. After writing the last lesson plan, participants were interviewed regarding how they envisioned technology supporting teaching and learning in the classroom (Hofer & Grandgenett, 2012). The study found strong growth in TPACK. Pedagogical content knowledge and technological content knowledge data showed a large mean surge during the fall quarter when participants were enrolled in the educational technology course.

Lesson plan coding demonstrated a degree of technology infusion. The authors surmised that this was because the first lesson plan was a requirement of the educational technology course, which was taught as a methods class rather than an isolated course; when left to their own devices without individual guidance, participants' technology infusion was less apparent (Hofer & Grandgenett, 2012).
TPACK Quantitative Research
The research of Mishra et al. (2009) noted the need to develop a reliable instrument for measuring TPACK and its individual components. Such an instrument makes it possible to confirm whether a professional development approach does or does not change teachers' knowledge. According to Mishra et al. (2009), there have been numerous attempts to measure the individual and compounded constructs of TPACK. With self-assessments, teachers report their personal perception of their understanding of content, pedagogy and technology knowledge. These surveys have been established and adapted to generalize to other contexts, content areas and models of professional development. The research conducted by Mishra et al. (2009) sought to generate an instrument to measure, through self-assessment, pre-service teachers' standing on the seven knowledge domains included within TPACK. The survey was created online and presented to pre-service teachers through the WebCT learning management system (Mishra et al., 2009). The survey took approximately 15 to 20 minutes to complete. One hundred twenty-four pre-service teacher participants completed the survey. Seventy-nine percent of the students were elementary education majors; 14.5% of the participants were early

childhood majors, and 6.5% of those surveyed were in a different major. One hundred sixteen (93.5%) of the group were female, while 8 (6.5%) were male. Approximately half (50.8%) of the survey group were freshmen, 29.8% were in their sophomore year of study, 16.1% were juniors, and seniors made up 3.2% of the population (Mishra et al., 2009). After running a factor analysis, 28 items proved to be problematic and were removed from the survey. Each construct within TPACK was addressed in the survey; however, it is important to note that the content knowledge area included mathematics, social studies, science and literacy, because teachers at this level of education will be teaching all of these content areas. As seen in Table 1, Cronbach's alpha showed high internal consistency among the items within the same construct (Mishra et al., 2009).

Table 1
Survey Internal Consistency of Each Construct

Construct                                      Internal Consistency (alpha)
Technology Knowledge                           .82
Content Knowledge                              Math = .85, Social Studies = .84,
                                               Science = .82, Literacy = .75
Pedagogical Knowledge                          .84
Pedagogical Content Knowledge                  .85
Technological Content Knowledge                .80
Technological Pedagogical Knowledge            .86
Technological Pedagogical Content Knowledge    .92

(Schmidt et al., 2009, p. 2)
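For readers unfamiliar with the statistic, the internal-consistency values in Table 1 are Cronbach's alpha coefficients. A minimal sketch of the computation, using invented Likert responses rather than the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for rows of respondent scores (one score per item)."""
    k = len(items[0])  # number of items in the construct
    def var(xs):       # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented 5-point Likert responses: 6 respondents x 4 items of one construct
scores = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 4, 3, 3],
]
print(round(cronbach_alpha(scores), 2))  # 0.91 for these invented responses
```

An alpha near the .82 to .92 range reported in Table 1 indicates that the items within a construct vary together, which is why each set of items can be treated as measuring one underlying knowledge domain.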

In order to determine the degree of linear relationship between the subscales of the TPACK constructs, the researchers conducted an analysis using Pearson's product-moment correlation (Table 1). The subscale correlations varied between .02 (math and social studies content areas) and .71 (TPACK and TPK). TPACK and TPK, with an r of .71, had the highest coefficient of the complementary relationships, while the relationships between TPACK and TCK and TPK (r = .49) were identical. While the sample is relatively small, these results do show promise for the instrument. The ongoing work of these researchers includes a longitudinal study to determine the level of TPACK demonstrated by induction-year teachers (Mishra et al., 2009). While growth in one construct does not necessarily mean reflective growth in another construct, these statistical conclusions are further supported by the work of Angeli and Valanides (2009), who state that TPACK is a distinctive body of knowledge constructed from the interconnectedness of each construct making up the model. There is a growing belief that the instructive uses of technology are connected to specific domains within the content areas (Graham et al., 2009). Using four of the seven constructs of TPACK, Graham et al. (2009) investigated science education. Technology knowledge was used as a measurement of basic computer skills such as the use of word processing, spreadsheet and presentation software. Technological pedagogical knowledge represented how technology was used to engage students with technology-oriented activities, including presentations and assessments. Technological content knowledge was represented by the use of technology tools and representations that are used by instructors within the science content area. Technological pedagogical content knowledge represents

the use of technology to support science content within the classroom. The goal of the research was to contribute to the epistemology of how to identify and measure TPACK in relation to science, and to assess changes in the TPACK confidence of teachers who participated in the SciencePlus professional development initiative. These four constructs were used because the researchers viewed them as extensions of PK, TCK, TK and PCK (Graham et al., 2009). The study involved fifteen teachers who participated in a science inquiry professional development initiative called SciencePlus at Brigham Young University in 2008. The professional development provided a digital microscope with instruction on its use, as well as an introduction to other technologies including Google Earth, GPS devices and digital cameras. Participants used various technologies to document, analyze and present data from the inquiry projects (Graham et al., 2009). Of the fifteen participants, eleven were elementary teachers (10 female, 1 male); the other four were secondary science teachers (1 female, 3 male). Teaching experience of the group ranged from 1 to 26 years. Access to the appropriate technology within each classroom was verified; each had at least one computer with access to the Internet and a data projector. Only two of the participants had access to probeware in their classrooms. A pre- and post-questionnaire was developed to measure confidence levels related to the four constructs. In the pre-questionnaire, participants expressed high confidence in their ability to use productivity software such as Word and PowerPoint. They expressed less confidence in their ability to produce media-rich materials or use content-specific technologies (Graham et al., 2009).
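One common way to evaluate pre/post confidence gains of the kind this questionnaire produces is a paired-samples t statistic. The sketch below uses invented ratings, not the study's data, and does not claim to reproduce the authors' exact analysis:

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)  # mean gain over its standard error

# Invented TK confidence ratings (1-6 scale) for eight hypothetical teachers
pre  = [2, 3, 2, 4, 3, 2, 3, 3]
post = [4, 5, 3, 5, 4, 4, 4, 5]
print(round(paired_t(pre, post), 2))
```

A large positive t, as in this invented example, corresponds to a consistent confidence gain across participants; the significance of such a value would then be judged against a t distribution with n - 1 degrees of freedom.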

The items were developed from McCrory's work in Chapter 9 of the Handbook of Technological Pedagogical Content Knowledge (2008), in terms of the characteristics and affordances of technology specific to the work of scientists. The measurement scale for the questions was: 6 = Complete confidence, 5 = Quite confident, 4 = Fairly confident, 3 = Somewhat confident, 2 = Slightly confident, 1 = Not confident at all. In addition, the scale for the TCK items included 0 = I don't know about this type of technology. Upon statistical analysis, it was found that all confidence levels increased; TK was found to have the largest increase. This adds support to the idea that increased confidence in TK is foundational to increasing the other aspects of TPACK. The TCK results proved to be significantly lower than all the others; the authors speculate that this result arose because TCK measures the technology tools for doing science and eleven of the fifteen participants were elementary teachers. They state that teachers at the elementary level may be more comfortable with technologies designed for teaching science, since historically elementary teachers have little experience doing science (Graham et al., 2009). While this eight-month study was based in the pedagogy of inquiry-based instruction, many of the participants were the main users of the technology. Technology was not put in the hands of their students during the implementation of the lessons in the classroom (Graham et al., 2009).
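A practical wrinkle in this scale is the extra 0 code on the TCK items: averaging it in as if it were a low confidence rating would bias the subscale downward. One reasonable treatment, sketched here as an assumption rather than as the authors' documented procedure, is to exclude 0 responses when computing means:

```python
# Numeric codes for the confidence scale described above.
SCALE = {
    "Not confident at all": 1,
    "Slightly confident": 2,
    "Somewhat confident": 3,
    "Fairly confident": 4,
    "Quite confident": 5,
    "Complete confidence": 6,
}

def mean_confidence(responses):
    """Average numeric ratings, treating 0 ('I don't know') as missing."""
    valid = [r for r in responses if r != 0]
    return sum(valid) / len(valid) if valid else None

# Invented TCK item responses from six hypothetical teachers
tck = [0, 3, 4, 0, 5, 4]
print(mean_confidence(tck))  # averages only the four valid ratings: 4.0
```

Treating "I don't know" as missing rather than as minimum confidence keeps the subscale mean comparable to the other constructs, which lack the 0 option.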

As the study of Graham et al. (2009) examined the TPACK of in-service teachers before and after a workshop designed to help build specific areas of the framework, Archambault and Crippen (2009) present research related to virtual schools and online teachers in the K-12 environment. This work is essential since 38 states have virtual schools or online initiatives (Wicks, 2010). The work focused on TPACK in the world of online learning through the lens of high-quality online teaching. The research sought to examine the perceived level of knowledge held by teachers who teach online and what these ratings say about the TPACK framework. Educators targeted for the criterion sampling currently taught at least one class in a K-12 virtual school authorized by the state. The survey was sent out to 1,795 online teachers across the United States. Five hundred ninety-six teachers (33%) from 25 states responded to the survey, a response rate that was considered acceptable (Archambault & Crippen, 2009). The instrument was made up of twenty-four questions, taking into account each of the constructs to be addressed, using a 5-point Likert scale. In order to determine construct validity, numerous discussions with experts were conducted, leading to needed modifications. Once the items were complete, relevant and arranged, the researchers turned their attention to content validity. In order to determine content validity, a pilot of the survey was conducted. There were two phases to the pilot. The first included a think-aloud strategy, requiring participants to explain their thinking as they went through each item, to ensure the correct interpretation of each item. The initial pilot resulted in rewording of items to ensure clarity. The second phase of the pilot required participants to again participate in a think-aloud, but this time they were asked to place the survey items in the constructs of TPACK. The researchers discovered that participants demonstrated difficulty in separating content and pedagogy. Internal consistency through Cronbach's

alpha coefficient was computed for each subscale, with values ranging from TCK = .699 to TK = .888 (Archambault & Crippen, 2009). According to the descriptive analysis, respondents did not answer every question on the survey; however, the overall mean of the respondents' perceptions was 3.81, with a standard deviation of .939. The respondents rated their knowledge at the highest levels in pedagogy and pedagogical content. Teachers rated their ability to use a variety of teaching strategies as high in terms of creating materials that map to district standards and charting the scope and sequence within their topic areas. Teachers rated themselves high in the area of recognizing misconceptions held by students in a particular topic and the ability to distinguish correct and incorrect problem-solving methods, which fall into the area of pedagogical content knowledge. The highest-ranked individual item, the ability to produce lesson plans, fell into this domain. Ratings of technology knowledge dropped severely, as teachers did not feel as comfortable with troubleshooting computer problems experienced by students on their home computers. The results suggest that the teachers are most comfortable with traditional face-to-face teaching and the skill set needed for that environment. Open-ended questions were used to elicit further thoughts and information from the respondents; it became apparent that teachers struggle with learning emerging technologies (Archambault & Crippen, 2009). Despite these difficulties, online teachers within this group continue to make changes to their courses, striving to make them better. The authors suggest the respondents may have perceived more strength in their abilities in terms of content, pedagogy and pedagogical content because of their educational experiences at the

university level, as well as their professional experience in the traditional classroom. Because this study reflects a sample of K-12 online learning professionals, the results cannot be generalized to the overall population. And while the self-reporting nature of the survey has been found to carry a certain amount of bias, which may exist here, steps were taken to ensure neutral wording of each question. The findings of this study have implications for the preparation of teachers who may find themselves in a setting other than the traditional classroom (Archambault & Crippen, 2009). A study completed by Chai, Koh and Tsai (2010) used the framework posited by Mishra and Koehler (2006) to investigate the use of an ICT course and its effect on the practice of pre-service educators. The study, conducted with Singapore pre-service teachers, had two goals: first, to examine the effectiveness of an ICT program that was designed to increase the TPACK of the pre-service teachers, and second, to predict the contribution of TK, PK and CK to the increase using stepwise regression. A cohort of 889 secondary pre-service teachers was selected for the study. The course was made up of 12 two-hour sessions. The first five sessions were designed to build the students' theoretical understanding of didactic approaches (PK) that involved meaningful learning with technology. The next six sessions were developed to build their technological knowledge (TK). Technology tools were presented and the students were given the opportunity to explore each tool in relation to their content area. Content knowledge (CK) was not a part of the instruction in this course because the participating teachers recruited for this study were secondary education majors, who were considered subject matter experts. As a final project that would demonstrate TPACK, students were to design

a lesson that was specific to their content. Students were to provide a lesson plan with teaching materials and a written paper explaining their use of the technology and the pedagogy that supported its use. The projects were evaluated against a rubric that measured their application of TK, PK, CK and TPACK (Chai, Koh & Tsai, 2010). The instrument used to evaluate the course was an adaptation of the survey created by Mishra et al. (2009). The changes included assessing CK in the content areas of the teachers involved and covering two different areas of curriculum study, because students are assigned to two related areas of study, therefore requiring the use of curriculum study 1 (CS1) and curriculum study 2 (CS2). In order to increase the reliability of the measurement, the 5-point Likert scale was changed to a 7-point scale. The final instrument was made up of 18 questions (Chai, Koh & Tsai, 2010). The teachers were provided with an email containing a link to the pre- and post-evaluation that explained the purpose of the study. In the pre-evaluation, 439 of the 889 students participated in the study. The average age of the pre-evaluation participants was 26.77 years (SD = 5.27); more females than males participated in the pre-evaluation (248 females, 208 males). In the post-evaluation, a total of 365 of the 889 members of the cohort participated; the mean age was 27.08 (SD = 5.45). As in the pre-evaluation, more females than males participated in the post-evaluation (192 female, 173 male). A chi-square analysis found no significant association between gender and response. T-tests found no significant difference between the pre- and post-evaluation groups in regard to age. By completing these

analyses, it can be surmised that the respondents were demographically similar (Chai, Koh & Tsai, 2010). For the first research question, using a t-test, the study found significant gains in each construct that was tested (TK = 8.90, CK = 9.21, PK = 8.62, TPACK = 9.83; all p
