If PDF output is desired, TinyTeX can be installed once via `tinytex::install_tinytex()`.
This is an R Markdown (Rmd) template for protocols and the reporting of systematic reviews and meta-analyses. It synthesizes three sources of standards:
The template is aimed at
We are aware that MARS targets aspects of reporting after the systematic review/meta-analysis is completed, rather than decisions and reasoning in the planning phase, as PRISMA-P and PROSPERO do. MARS nevertheless provides a good framework for determining crucial points of systematic reviews/meta-analyses that should be addressed as early as the planning phase.
Standards have been partially adapted. Click ‘show changes’ to see the changes and the reasons for them.
Standard | Implemented change | Reason |
---|---|---|
MARS | Left out the paper section “Abstract” | An abstract is important for reporting, but not for planning and registering. |
MARS | Left out the paper section “Results” and parts of “Discussion” | Specifications on how to report results are important for reporting, but not for planning and registering. Prospective information on how results will be computed/synthesized is preserved. |
MARS | Left out “Protocol: List where the full protocol can be found” | This form practically is the protocol. |
PROSPERO | Left out non-mandatory fields or integrated them with mandatory fields | Avoids overly detailed specifications. All relevant information will be integrated. |
PROSPERO | Left out some options in “Type and method of review” | The omitted options are purely health-/medicine-related. |
PROSPERO | Left out “Health area of the review” | This field is purely health-/medicine-related. |
Cleaning up the Mess : A Systematic Review on the Diverse Conceptualizations of the Technological, Pedagogical and Content Knowledge (TPACK) Framework
Systematic Review
This form is used as the registration.
Start: October 2020
Anticipated Completion Date: July 2021
The review has not yet started [yes/no]: no
Review stage | Started | Completed |
---|---|---|
Preliminary searches | Yes | Yes |
Piloting of the study selection process | No | No |
Formal screening of search results against eligibility criteria | No | No |
Data extraction | No | No |
Risk of bias (quality) assessment | No | No |
Data analysis | No | No |
Corresponding author
Amendments will be published as a new version of the document under the same DOI (or will point to the previous version).
BMBF - This project is part of the “Qualitätsoffensive Lehrerbildung”, a joint initiative of the Federal Government and the Länder which aims to improve the quality of teacher training. The programme is funded by the Federal Ministry of Education and Research. The authors are responsible for the content of this publication.
No conflict of interest.
When it comes to school and education, the discussion about the use of digital media has become ubiquitous. However, recent studies indicate that teachers still rarely use digital technologies for educational purposes, and if they do, they fail to integrate them into teaching in a didactically meaningful manner (Farjon, Smits & Voogt, 2019). One of the main boundary conditions of successful technology integration that researchers have identified is the professional knowledge of teachers. Accordingly, to use technologies in classrooms purposefully, teachers need specific knowledge tailored to the use of digital technologies. One of the most cited and widely adopted models used to describe such knowledge is the TPACK (technological, pedagogical and content knowledge) model by Mishra and Koehler (2006). The TPACK model captures the idea of bringing together and connecting basic knowledge components (i.e., knowledge about technology, pedagogy and content) to form a new central form of knowledge: TPACK. In the literature, TPACK has become the central focus of researchers when it comes to knowledge regarding technology integration (Kim & Lee, 2018).
Ever since the introduction of the TPACK model in 2006, numerous researchers have worked on the model, trying to clarify its underlying structure (Angeli & Valanides, 2009; Cox & Graham, 2009), or with the model, using it as the theoretical background for data-driven studies (Angeli et al., 2016; Cavanagh & Koehler, 2013). Yet, to date, the question of what constitutes TPACK remains a source of scholarly debate (Petko, 2020). This debate seems to be driven mainly by the vast, diverse and often seemingly contradictory conceptualizations of TPACK that exist in the TPACK literature. To provide a comprehensive picture of TPACK, it is therefore necessary to understand and organize the different conceptualizations that researchers have introduced when working on or with the TPACK model.
Against this background, we conduct a systematic review that attempts to clarify and systematize existing conceptualizations within the broad corpus of TPACK research. More precisely, we are interested in examining whether TPACK researchers, in their endeavours to conceptualize TPACK, have put emphasis on specific TPACK components (i.e., the subdomains of TPACK: technological knowledge, content knowledge, pedagogical knowledge, pedagogical content knowledge, technological content knowledge, technological pedagogical knowledge, and technological pedagogical and content knowledge) while possibly neglecting others. In other words, we will examine which focal lenses researchers have used when dealing with the TPACK model. A particular focus will be on how and to what extent content-specific features of TPACK were accounted for in existing conceptualizations of the TPACK model. Moreover, we will systematically look at existing TPACK measurement methods used in data-driven studies to gain a deeper understanding of the empirical applicability of existing TPACK conceptualizations.
This systematic review will help to organize and understand the different existing TPACK conceptualizations in research, and thereby paves the way for fruitful applications of this highly complex framework in the future.
References
Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT-TPCK: Advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52(1), 154–168. https://doi.org/10.1016/j.compedu.2008.07.006
Angeli, C., Voogt, J., Fluck, A., Webb, M., Cox, M., Malyn-Smith, J., & Zagami, J. (2016). A K-6 Computational Thinking Curriculum Framework: Implications for Teacher Knowledge. Educational Technology & Society, 19(3), 47–57.
Cavanagh, R. F., & Koehler, M. J. (2013). A Turn toward Specifying Validity Criteria in the Measurement of Technological Pedagogical Content Knowledge (TPACK). Journal of Research on Technology in Education, 46(2), 129–148. https://doi.org/10.1080/15391523.2013.10782616
Cox, S., & Graham, C. R. (2009). Diagramming TPACK in practice: Using an elaborated model of the TPACK framework to analyze and depict teacher knowledge. TechTrends: Linking Research & Practice to Improve Learning, 53(5), 60–69. https://doi.org/10.1007/s11528-009-0327-1
Farjon, D., Smits, A., & Voogt, J. (2019). Technology integration of student teachers explained by attitudes and beliefs, competency, access, and experience. Computers & Education, 130, 81–93. https://doi.org/10.1016/j.compedu.2018.11.010
Kim, S.-W., & Lee, Y. J. (2018). The Effects of the TPACK-P Educational Program on Teachers’ TPACK: Programming as a Technological Tool. International Journal of Engineering & Technology, 7(3.34), 636–643. https://doi.org/10.14419/ijet.v7i3.34.19405
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
Petko, D. (2020). Quo vadis TPACK? Scouting the road ahead. In Proceedings of EdMedia + Innovate Learning (pp. 1349–1358). Online, The Netherlands: Association for the Advancement of Computing in Education (AACE). Retrieved October 10, 2020, from https://www.learntechlib.org/primary/p/217445/
RQ1) How is teachers’ technological pedagogical content knowledge (TPACK) conceptualized within research? Which different components are defined, and on which component does the focus lie (T, P, C, TP, TC, PC, TPC)?
RQ2a) What kinds of test instruments have been used to measure TPACK in empirical studies, and in which ways do they differ in their application?
RQ2b) How do the several existing TPACK measurement instruments mirror the distinct conceptualizations of TPACK?
Papers that investigate teachers’ professional knowledge for technology-enhanced teaching based on the TPACK framework by Mishra and Koehler (2006), or adaptations and extensions thereof
Theoretical contributions that conceptualize TPACK (or adaptations and extensions thereof)
Empirical studies that assess the TPACK of instructors such as (pre-service/in-service) teachers across all levels, university professors, tutors, etc.
Peer-reviewed journal articles, conference proceedings and dissertations
No full text available
Papers that are not written in English
Papers in which TPACK (or adaptations and extensions thereof) is not explicitly mentioned in the title or abstract
Papers from which no clear conceptualization of TPACK can be discerned
Search String
((TPCK OR TPACK OR “technological pedagogical content knowledge” OR “technological-pedagogical-content-knowledge” OR “technological pedagogical and content knowledge”) AND teacher*)
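As a purely illustrative sketch (not part of the protocol itself), the Boolean logic of the search string can be approximated as a regular-expression pre-screening filter for titles and abstracts; the function name and the sample records below are hypothetical:

```python
import re

# Any TPACK/TPCK term (approximation of the quoted phrases in the search string) ...
TPACK_TERMS = re.compile(
    r"\bTPA?CK\b|\btechnological[ -]pedagogical([ -]and)?[ -]content[ -]knowledge\b",
    re.IGNORECASE,
)
# ... AND a "teacher*" term (the * wildcard becomes \w*).
TEACHER_TERM = re.compile(r"\bteacher\w*\b", re.IGNORECASE)

def matches_search_string(text: str) -> bool:
    """Return True if a title/abstract satisfies the approximated search string."""
    return bool(TPACK_TERMS.search(text)) and bool(TEACHER_TERM.search(text))

records = [
    "Measuring TPACK among pre-service teachers",
    "Technological pedagogical content knowledge of teachers",
    "Content knowledge in mathematics classrooms",  # no TPACK term -> filtered out
]
hits = [r for r in records if matches_search_string(r)]
```

Note that databases apply their own tokenization and wildcard rules, so such a filter can only approximate, not replicate, the database search.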
Additional Specifications Used
Note: The acronym TPACK was first used by Mishra and Koehler in 2005. Thus, to offer a complete picture of existing TPACK conceptualizations, all TPACK contributions published since 2005 are taken into consideration.
Each of the following steps will be conducted by two independent reviewers:
At each of these three steps, articles will be included if the inclusion criteria apply and none of the exclusion criteria apply; otherwise, articles will be excluded. If it is not clear whether an article should be included or excluded, it will be labelled as “maybe” and then discussed among the raters until consensus is reached. Likewise, if the raters disagree about the inclusion or exclusion of a certain publication, it will be discussed together in more detail until consensus is reached.
Studies will be coded in Rayyan by two independent raters using the “inclusion”, “exclusion” and “maybe” labelling function.
To extract detailed information on the included studies, a standardized Excel sheet or a self-programmed dashboard that produces a relational database will be established and applied by two independent raters.
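A minimal sketch of how such a relational extraction database could be structured (all table and column names are hypothetical, not the final coding scheme):

```python
import sqlite3

# In-memory sketch: one row per publication, one coding row per rater per publication.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE publication (
    pub_id   INTEGER PRIMARY KEY,
    citation TEXT NOT NULL,
    year     INTEGER
);
CREATE TABLE coding (
    pub_id             INTEGER REFERENCES publication(pub_id),
    rater              TEXT NOT NULL,   -- one row per independent rater
    focused_component  TEXT,            -- e.g. 'TPACK', 'TPK', 'TCK'
    measurement_method TEXT             -- NULL for purely theoretical contributions
);
""")
con.execute("INSERT INTO publication VALUES (1, 'Mishra & Koehler (2006)', 2006)")
con.execute("INSERT INTO coding VALUES (1, 'rater_a', 'TPACK', NULL)")
con.execute("INSERT INTO coding VALUES (1, 'rater_b', 'TPACK', NULL)")

# Join codings back to publications, e.g. to compare the two raters per paper.
rows = con.execute(
    "SELECT p.citation, c.rater, c.focused_component "
    "FROM publication p JOIN coding c USING (pub_id)"
).fetchall()
```

Keeping one coding row per rater makes later agreement checks a simple self-join per `pub_id`.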
All extracted data and information will be analyzed for interrater agreement, and discrepancies will be discussed (see also 3.5).
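As an illustrative sketch of how such interrater agreement could be quantified (the label sequences below are hypothetical), Cohen's kappa for two raters can be computed directly from the coded labels:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels (e.g. include/exclude/maybe)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["include", "include", "exclude", "maybe", "exclude", "include"]
b = ["include", "exclude", "exclude", "maybe", "exclude", "include"]
kappa = cohens_kappa(a, b)  # agreement on 5 of 6 items, corrected for chance
```

Kappa corrects raw percentage agreement for the agreement expected by chance, which matters when one label (e.g. “exclude”) dominates the screening decisions.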
Note: This list might be subject to extensions and/or adaptations as we begin the review process.
Not relevant, as this is a review, not a meta-analysis.
Through the systematic approach used to obtain our final sample, the general quality of included publications should be high. To account for individual difficulties and possible bias within publications, a qualitative content analysis approach will be conducted for each publication individually. By doing so, we can make sure to detect and properly reflect upon (empirical) difficulties that we come across in individual publications.
After the final selection of the sample from the review process, a content analysis approach will be applied (see Mayring, 2014). This means that, for each of the selected publications, relevant information will be clustered and organized into units of meaning. These units of meaning carry information that will help answer the research questions.
Our analytical approach can be considered both deductive and inductive. It is deductive in the sense that our starting point for developing the coding scheme will be the TPACK model and the complex interplay of its subdomains. On the other hand, our approach includes inductive characteristics, as our coding scheme might be supplemented by further labels extracted from individual contributions as we work through our sample.
This twofold approach will contribute to a comprehensive understanding of TPACK, helping future researchers to theoretically and empirically apply the complex TPACK framework.