Could Intelligent Essay Feedback Improve the Effect of Writing Teaching in Middle School?

  • Shujun Liu,
  • Yan Li,
  • Yuewei He,
  • Jingjing Wang
  1. College of Humanities, Huzhou Normal University, Huzhou Zhejiang 313000, China
  2. College of Education, Zhejiang University, Hangzhou 310058, China
  3. Hangzhou Baochuta Shenhua Experimental School, Hangzhou 310058, China

Accepted date: 2022-05-30

Online published: 2022-08-24

Abstract

The development of Chinese Intelligent Essay Evaluation Systems is expected to change the practice and research of traditional writing instruction. However, how such systems can be integrated into everyday writing instruction, and what effect they have on teaching, are of great concern to Chinese language teachers and writing researchers. This study took 28 middle school students enrolled in a writing extension course at School B as its sample and carried out a 10-week quasi-experimental study with a single-group pre-test/post-test design, in order to verify, across multiple dimensions, the effect of integrating Intelligent Essay Feedback into middle school writing instruction. After receiving guidance on argumentative writing, students took part in three writing activities; each time, they submitted their first draft for intelligent evaluation and then revised the essay according to the intelligent feedback. The research focused on analyzing the characteristics of students' revisions and the improvement in writing quality, exploring the development of students' writing motivation and revision beliefs, and investigating students' perceptions of Intelligent Essay Feedback. The findings are as follows. First, the revision behaviors students used most often were adding and replacing, followed by deleting and rearranging; the proportion of lower-order revisions was higher than that of higher-order revisions; and although students attached great importance to self-directed revision, its success rate was lower than that of revisions based on intelligent feedback. Second, students' writing performance improved significantly, essay length increased significantly, and students made significant progress in their use of stylistic elements such as arguments, explanations, and conclusions. Third, students' writing motivation improved significantly on the dimensions of persistence and passion, and their revision beliefs improved significantly on both the lower-order and higher-order dimensions. Fourth, most students believed that Intelligent Essay Feedback promotes writing practice, and feedback quality was the key factor shaping students' perceptions. Therefore, Intelligent Essay Feedback can effectively support students' revision process and improve the quality of their revisions. Continued exploration of multiple ways to integrate Intelligent Essay Feedback with teacher feedback, peer feedback, and curriculum structure would help advance human-computer collaborative writing instruction.

Cite this article

Shujun Liu, Yan Li, Yuewei He, Jingjing Wang. Could Intelligent Essay Feedback Improve the Effect of Writing Teaching in Middle School?[J]. Journal of East China Normal University (Educational Sciences), 2022, 40(9): 90-104. DOI: 10.16382/j.cnki.1000-5560.2022.09.009
