Journal of East China Normal University (Educational Sciences), 2022, Vol. 40, Issue 9: 90-104. doi: 10.16382/j.cnki.1000-5560.2022.09.009
Shujun Liu, Yan Li, Yuewei He, Jingjing Wang
Accepted: 2022-05-30
Online: 2022-09-01
Published: 2022-08-24
Shujun Liu, Yan Li, Yuewei He, Jingjing Wang. Could Intelligent Essay Feedback Improve the Effect of Writing Teaching in Middle School? [J]. Journal of East China Normal University (Educational Sciences), 2022, 40(9): 90-104.
"
初稿样例 | 终稿样例 | 修改 层次 | 修改行为 | 修改 动因 | 修改 效果 |
就比如,《西游记》中的齐天大圣孙悟空。 | 就比如,《红楼梦》中的林黛玉和贾宝玉;就比如,《水浒传》中被逼上梁山的好汉;就比如,《西游记》中的齐天大圣孙悟空。 | 高阶 修改 | 增加 | 自主修改 | 成功 |
手上岂不是沾染了更多的鲜血,更多的罪恶? | 手上岂不是沾染了更多的鲜血,沾染了更多的罪恶? | 低阶 修改 | 增加 | 自主修改 | 不 成功 |
那唐僧也及及可危了。 | 那唐僧也岌岌可危了。 | 低阶 修改 | 替换 | 回应反馈 | 成功 |
对于“紧箍咒”在取经路上的地位,在学术界中也是争议不断。 | 对于“紧箍咒”在取经路上的地位,学术界也是争议不断。 | 低阶 修改 | 删除 | 回应反馈 | 成功 |
"
Draft | Feedback content | Final draft | Effect |
金箍并非是悟空自愿带上的。 | 文中疑似存在重复累赘现象。并非的意思是“并不是”,与“是”部分语意重复。 | 金箍并非悟空自愿带上的。 | Successful |
在诸多情节中,都可以见到悟空被金箍束缚而不得不屈服唐僧的身影。 | 疑似主语缺失。主语缺失,往往会影响文章的表意效果。 | 我们常可以见到悟空被金箍束缚而不得不屈服于唐僧的情节。 | Successful |
(No corresponding text in the draft) | 可通过运用新鲜的论据来增添文章新意。 | 孟子说得好:“富贵不能淫,贫贱不能移,威武不能屈。此之谓大丈夫。”而这几点,猪八戒很显然都与之背道而驰。 | Successful |
这紧箍对孙悟空来说是一种束缚,让他失去自由。 | 语言还值得细细雕琢、打磨。 | 紧箍咒限制了孙悟空的自由。 | Unsuccessful |
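The two tables above implicitly define a coding scheme: each revision is classified by level (higher-order vs. lower-order), behavior (addition, substitution, or deletion), motivation (self-initiated vs. in response to feedback), and effect (successful vs. unsuccessful). The Python sketch below is purely illustrative and is not taken from the study's instruments; the class and field names are our own assumptions. It shows one way such coded revision records could be represented and tallied.

```python
from dataclasses import dataclass
from enum import Enum
from collections import Counter


class Level(Enum):
    HIGHER_ORDER = "higher-order"   # meaning- or content-level change
    LOWER_ORDER = "lower-order"     # surface- or wording-level change


class Behavior(Enum):
    ADDITION = "addition"
    SUBSTITUTION = "substitution"
    DELETION = "deletion"


class Motivation(Enum):
    SELF_INITIATED = "self-initiated"
    FEEDBACK_RESPONSE = "in response to feedback"


@dataclass
class Revision:
    """One coded revision, mirroring a row of the sample tables."""
    draft: str        # sentence as it appears in the first draft
    final: str        # corresponding sentence in the final draft
    level: Level
    behavior: Behavior
    motivation: Motivation
    successful: bool


# Hypothetical record built from the typo-correction row above.
revisions = [
    Revision(
        draft="那唐僧也及及可危了。",
        final="那唐僧也岌岌可危了。",
        level=Level.LOWER_ORDER,
        behavior=Behavior.SUBSTITUTION,
        motivation=Motivation.FEEDBACK_RESPONSE,
        successful=True,
    ),
]

# Tally successful feedback-driven revisions by revision level.
counts = Counter(
    r.level for r in revisions
    if r.motivation is Motivation.FEEDBACK_RESPONSE and r.successful
)
print(counts)
```

Enums keep the category labels consistent with the tables; in practice such records would come from human coders working through draft-final sentence pairs rather than from any automatic extraction.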