Learning Sciences

Development of Assessment Tool of Learning Progressions: Taking Primary School Students’ Statistical Thinking Test as an Example

  • Li Huaxia,
  • Song Naiqing,
  • Yang Tao,
  • Xin Tao
  • 1. People’s Education Press, Curriculum and Teaching Materials Research Institute, National Research Institute for Teaching Materials of Morality and Rule of Law for Primary School, Secondary School and Vocational School, Beijing 100081, China;
    2. Southwest University, National Innovation Center for Assessment of Basic Education Quality, Southwest University Branch Center, Chongqing 400715, China;
    3. Beijing Normal University, Collaborative Innovation Center of Assessment toward Basic Education Quality, Beijing 100875, China

Online published: 2020-04-13

Funding

Major Achievement Cultivation Project of the Collaborative Innovation Center of Assessment toward Basic Education Quality, Beijing Normal University, “Research on the Paradigm of Constructing Assessment Models for Basic Education in China” (2018-06-002-BZPK01); General Program of the China Postdoctoral Science Foundation, “Construction of a Learning Interest Assessment Model and Its Application in the Compilation of Primary School Mathematics Textbooks” (2018M641404)


Abstract

Learning progressions describe the trajectory of students’ thinking development and reveal the patterns of learning processes and thinking development. The development of assessment tools is an essential component of learning progressions work, and insufficient tool development constrains both research on and application of learning progressions. The purpose of this study is to develop an assessment tool for primary school students’ statistical thinking based on learning progressions theory, providing a reference for strengthening empirical and applied research on learning progressions. The method consists of four steps: building a theoretical hypothesis of the learning progression of primary school students’ statistical thinking, assembling test items, analyzing item quality, and verifying the hypothesized progression against students’ actual performance. The results indicate that the hypothesized learning progression of primary school students’ statistical thinking is basically consistent with students’ actual performance, and that an assessment tool developed on the basis of learning progressions theory can provide richer information for understanding students’ thinking development and improving teaching, as well as a new perspective for discovering learning patterns.
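The item-quality and verification steps mentioned in the abstract are commonly carried out with Rasch-family item response models. The following Python snippet is a minimal, purely illustrative sketch, not the authors’ actual procedure, software, or data: it simulates a dichotomous response matrix (the sample size, item difficulties, and the use of numpy/scipy are assumptions made only for the example) and estimates Rasch item difficulties by joint maximum likelihood.

```python
# A minimal, hypothetical sketch (not the authors' code or data): checking item
# quality for a learning-progression assessment with a simple Rasch analysis.
# Assumes only numpy and scipy; dedicated IRT software is normally used in practice.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic function

rng = np.random.default_rng(0)

# Simulated data: 200 hypothetical students x 10 hypothetical items generated
# from a Rasch model, P(correct) = expit(theta_i - b_j).
n_persons, n_items = 200, 10
true_theta = rng.normal(0.0, 1.0, n_persons)            # person abilities
true_b = np.linspace(-1.5, 1.5, n_items)                # items ordered easy -> hard
responses = (rng.random((n_persons, n_items))
             < expit(true_theta[:, None] - true_b[None, :])).astype(int)

def neg_log_likelihood(params):
    """Joint negative log-likelihood over person abilities and item difficulties."""
    theta = params[:n_persons]
    b = params[n_persons:] - params[n_persons:].mean()  # centre difficulties (identification)
    p = expit(theta[:, None] - b[None, :])
    eps = 1e-9                                          # guard against log(0)
    return -np.sum(responses * np.log(p + eps) + (1 - responses) * np.log(1 - p + eps))

start = np.zeros(n_persons + n_items)
bounds = [(-6.0, 6.0)] * (n_persons + n_items)          # keep extreme scores finite
result = minimize(neg_log_likelihood, start, method="L-BFGS-B", bounds=bounds)
est_b = result.x[n_persons:] - result.x[n_persons:].mean()

# If a hypothesised easy-to-hard ordering is sound, the estimated difficulties
# should largely reproduce that ordering.
for j, (bt, bh) in enumerate(zip(true_b, est_b), start=1):
    print(f"item {j:2d}: true difficulty {bt:+.2f}, estimated {bh:+.2f}")
```

Replacing the simulated matrix with real scored responses would, under the same assumptions, allow estimated item difficulties to be compared against the hypothesized levels of a learning progression.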

Cite this article

Li Huaxia, Song Naiqing, Yang Tao, & Xin Tao. (2020). Development of assessment tool of learning progressions: Taking primary school students’ statistical thinking test as an example. Journal of East China Normal University (Educational Sciences), 38(4), 72-82. DOI: 10.16382/j.cnki.1000-5560.2020.04.006

