If we want to understand what works in teacher education programs, we also need to understand what does not work. In this article, we discuss why a study evaluating the effects of a teacher education program on implementation practices yielded unexpected results. Interviews with a sample of teachers who graduated from the program revealed that the program did have effects on implementation practices that were not evident in the original study. These effects took the form of increased student participation, more teamwork, and a conception of error as an opportunity for learning. The instrument and procedures of the original study were unable to capture these effects.
The impact sheet for this article can be accessed at https://doi.org/10.6084/m9.figshare.22339567.