Educational Evaluation

What Is the Practical Experience of Impact Evaluation of PK-12 Educational Programs in the United States? A Thematic Text Analysis Based on 25 High-Quality Evaluation Studies

  • Chenchen Shi
  • School of Education, Renmin University of China, Beijing 100872, China

Online published: 2021-12-21

Funding

Bill & Melinda Gates Foundation project "Center for Research and Reform in Education: Enhancing Evidence Synthesis for School Leaders" (OPP1198720); China Scholarship Council, National Program for Government-Sponsored Graduate Study Abroad for Building High-Level Universities (Liu Jin Fa [2018] No. 3101)


Abstract

The impact evaluation of PK-12 educational programs in the United States began early in practice and rests on a relatively solid foundation; it has since developed into an important type of educational scientific research and a source of evidence for evidence-based educational reform, accumulating relatively mature practical experience. A thematic text analysis of 25 high-quality evaluation studies shows that this experience is mainly reflected in four areas. First, adequate funding is the primary prerequisite for evaluation: funding comes from government and many other sectors, fundraising is a concerted collective effort, and spending runs through the entire evaluation process. Second, professional evaluators are the core input of evaluation: evaluators come from universities and other professional organizations, their selection typically weighs multiple factors comprehensively, and three role configurations are currently in use, namely internal evaluators, external evaluators, and internal-external collaboration. Third, scientific conduct of the evaluation is the key process: planning determines the evaluation type and questions; design clarifies the theoretical basis, evaluation methods, and outcome measures; and implementation covers sample recruitment, intervention delivery, and data collection. Fourth, practical evaluation products are the important output: findings are properly analyzed and reasonably interpreted, results are written up to reporting standards and presented in varied forms, and evaluation products are further disseminated widely and used effectively. In addition, while striving to improve quality, the impact evaluation of PK-12 educational programs in the United States has also begun to show new practical trends.

Cite this article

Shi, C. (2022). What is the practical experience of impact evaluation of PK-12 educational programs in the United States? A thematic text analysis based on 25 high-quality evaluation studies. Journal of East China Normal University (Educational Sciences), 40(1), 43-59. DOI: 10.16382/j.cnki.1000-5560.2022.01.004


References

Shi, C. (2020). 实验与实践的融合: 美国循证教育改革研究 [The integration of experiment and practice: Research on evidence-based educational reform in the United States] (Doctoral dissertation). Beijing: Beijing Normal University.
Wu, K. (2012). 教育改革成功的基础 [The foundation of successful educational reform]. 教育研究 [Educational Research], 33(1), 24-31.
Angrist, J. D. (2004). American education research changes tack. Oxford Review of Economic Policy, 20(2), 198-212.
Augustine, C. H., Engberg, J., Grimm, G. E., et al. (2018). Can restorative practices improve school climate and curb suspensions? An evaluation of the impact of restorative practices in a mid-sized urban school district. Santa Monica, CA: RAND Corporation.
Balu, R., Porter, K., & Gunton, B. (2016). Can informing parents help high school students show up for school? New York: Manpower Demonstration Research Corporation.
Bavarian, N., Lewis, K. M., DuBois, D. L., et al. (2013). Using social-emotional and character development to improve academic outcomes: A matched-pair, cluster-randomized controlled trial in low-income, urban schools. Journal of School Health, 83(11), 771-779.
Berg, T. (2018). Can we increase attendance and decrease chronic absenteeism with a universal prevention program? A randomized control study of Attendance and Truancy Universal Procedures and Interventions (Dissertation). Eugene, OR: University of Oregon.
Bernstein, L. S., McLaughlin, J. E., Crepinsek, M. K., et al. (2004). Evaluation of the School Breakfast Program Pilot Project: Final report. Alexandria, VA: U.S. Department of Agriculture.
Borman, G. D. (2002). Experiments for educational evaluation and improvement. Peabody Journal of Education, 77(4), 7-27.
Bowen, D. H., & Kisida, B. (2019). Investigating causal effects of arts education experiences: Experimental evidence from Houston’s Arts Access Initiative. Houston, TX: Kinder Institute for Urban Research, Rice University.
Earle, J., Maynard, R., Neild, R. C., et al. (2013). Common guidelines for education research and development. Washington, DC: Institute of Education Sciences and National Science Foundation.
Faria, A. M., Sorensen, N., Heppen, J., et al. (2017). Getting students on track for graduation: Impacts of the Early Warning Intervention and Monitoring System after one year (REL 2017-272). Washington, DC: Regional Educational Laboratory Midwest.
Figlio, D. (2015). Experimental evidence of the effects of the Communities In Schools of Chicago partnership program on student achievement. Evanston, IL: Northwestern University.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2011). Program evaluation: Alternative approaches and practical guidelines (4th ed.). Boston, MA: Pearson.
Flay, B. R., Biglan, A., Boruch, R. F., et al. (2005). Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prevention Science, 6(3), 151-175.
Giancola, S. P. (2014). Evaluation matters: Getting the information you need from your evaluation. Kennett Square, PA: Giancola Research Associates, Inc.
Gottfredson, D. C., Cook, T. D., Gardner, F. E., et al. (2015). Standards of evidence for efficacy, effectiveness, and scale-up research in prevention science: Next generation. Prevention Science, 16(7), 893-926.
Guryan, J., Christenson, S., Claessens, A., et al. (2017). The effect of mentoring on school attendance and academic outcomes: A randomized evaluation of the Check & Connect program. Evanston, IL: Northwestern University.
Hatry, H. P., Winnie, R. E., & Fisk, D. M. (1973). Practical program evaluation for state and local government officials. Washington, DC: The Urban Institute.
Heller, S., Pollack, H. A., Ander, R., et al. (2013). Preventing youth violence and dropout: A randomized field experiment. Cambridge, MA: National Bureau of Economic Research.
Heller, S., Shah, A. K., Guryan, J., et al. (2017). Thinking, fast and slow? Some field experiments to reduce crime and dropout in Chicago. The Quarterly Journal of Economics, 132(1), 1-54.
Herrera, C., Grossman, J. B., Kauh, T. J., et al. (2007). Making a difference in schools: The Big Brothers Big Sisters school-based mentoring impact study. Philadelphia, PA: Public/Private Ventures.
Hirsch, B. J., Hedges, L. V., Stawicki, J., et al. (2011). After-school programs for high school students: An evaluation of After School Matters. Evanston, IL: Northwestern University.
ICF International. (2010a). Communities In Schools national evaluation volume 4: Randomized controlled trial study (Jacksonville, Florida). Fairfax, VA: ICF International.
ICF International. (2010b). Communities In Schools national evaluation volume 5: Randomized controlled trial study (Austin, Texas). Fairfax, VA: ICF International.
ICF International. (2010c). Communities In Schools national evaluation volume 6: Randomized controlled trial study (Wichita, Kansas). Fairfax, VA: ICF International.
Jones, C. J., Christian, M., & Rice, A. (2016). The results of a randomized control trial evaluation of the SPARK literacy program. Washington, DC: Society for Research on Educational Effectiveness.
Jones, S. M., Brown, J. L., & Aber, J. L. (2011). Two-year impacts of a universal school-based social-emotional and literacy intervention: An experiment in translational developmental research. Child Development, 82(2), 533-554.
Kuckartz, U. (2014). Qualitative text analysis: A guide to methods, practice and using software. Thousand Oaks, CA: Sage Publications.
Mertens, D. M., & Wilson, A. T. (2019). Program evaluation theory and practice: A comprehensive guide (2nd ed.). New York: The Guilford Press.
Neace, W. P., & Muñoz, M. A. (2012). Pushing the boundaries of education: Evaluating the impact of Second Step®: A violence prevention curriculum with psychosocial and non-cognitive measures. Child & Youth Services, 33(1), 46-69.
Parise, L. M., Corrin, W., Granito, K., et al. (2017). Two years of case management: Final findings from the Communities In Schools random assignment evaluation. New York: Manpower Demonstration Research Corporation.
Philp, J. D. (2015). FLIGHT: Final evaluation report. Columbia, SC: The Evaluation Group.
Robinson, C. D., Lee, M. G., Dearing, E., et al. (2018). Reducing student absenteeism in the early grades by targeting parental beliefs. American Educational Research Journal, 55(6), 1163-1192.
Rogers, T., & Feller, A. (2018). Reducing student absences at scale by targeting parents’ misbeliefs. Nature Human Behaviour, 2(5), 335-342.
Savage, J. D. (2000). Funding science in America: Congress, universities, and the politics of the academic pork barrel. Cambridge, UK: Cambridge University Press.
Shi, C., Inns, A., Lake, C., & Slavin, R. E. (2019). Effective school-based programs for K-12 students’ attendance: A best-evidence synthesis. Baltimore, MD: Center for Research and Reform in Education, Johns Hopkins University.
Slavin, R. E. (2018, November 29). A warm welcome from Babe Ruth’s home town to the Registry of Efficacy and Effectiveness Studies (REES) [Blog post]. Retrieved from https://robertslavinsblog.wordpress.com/2018/11/29/a-warm-welcome-from-babe-ruths-home-town-to-the-registry-of-efficacy-and-effectiveness-studies-rees/
Slavin, R. E. (2019, October 24). Developer- and researcher-made measures [Blog post]. Retrieved from https://robertslavinsblog.wordpress.com/2019/10/24/developer-and-researcher-made-measures/
Slavin, R. E. (2020). How evidence-based reform will transform research and practice in education. Educational Psychologist, 55(1), 21-31.
Slavin, R. E., & Cheung, A. (2017). Lessons learned from large-scale randomized experiments. Journal of Education for Students Placed at Risk, 22(4), 253-259.
Smolkowski, K., Seeley, J. R., Gau, J. M., et al. (2017). Effectiveness evaluation of the Positive Family Support intervention: A three-tiered public health delivery model for middle schools. Journal of School Psychology, 62(6), 103-125.
Snyder, F., Flay, B., Vuchinich, S., et al. (2010). Impact of a social-emotional and character development program on school-level indicators of academic achievement, absenteeism, and disciplinary outcomes: A matched-pair, cluster-randomized, controlled trial. Journal of Research on Educational Effectiveness, 3(1), 26-55.
U.S. Institute of Education Sciences. (2020). Education research grants request for applications beginning in fiscal year 2021. Washington, DC: Institute of Education Sciences.
Weiss, C. H. (1998). Evaluation: Methods for studying programs and policies (2nd ed.). Upper Saddle River, NJ: Prentice Hall.