EFL students’ perception on the use of Grammarly and teacher feedback

(1)  Mohammad Amiqul Fahmi   
Universitas Negeri Malang
Indonesia

(2) * Bambang Yudi Cahyono   
Universitas Negeri Malang
Indonesia

(*) Corresponding Author

Abstract

Many studies on automated writing evaluation (AWE) programs have predominantly focused on the outcomes of students' writing and on comparisons between AWE programs. However, studies investigating students' perceptions of combining an AWE program with teacher feedback are still scarce. This study examined the students' perceptions of the use of Grammarly and teacher feedback on their writing. It also sought to determine whether the students' English proficiency level influenced their perceptions. The participants were 26 undergraduate students of the Faculty of Law who were taking an English for Specific Purposes (ESP) writing course when the data were collected. The data consisted of the students' responses to a questionnaire and their TOEFL scores. The results of the analysis showed that the students perceived the use of Grammarly and teacher feedback positively. Furthermore, the students' perceptions of the use of Grammarly and teacher feedback were not influenced by their English proficiency level: students of both high and low proficiency responded positively to the use of Grammarly and teacher feedback.


HIGHLIGHTS:


  • Combining two kinds of feedback providers (an AWE program and the teacher) produces feedback that is truly helpful for both the teacher and the students.

  • The role of teacher feedback cannot be neglected, as it can compensate for the shortcomings of the AWE program.

  • English teachers should choose an appropriate approach when using Grammarly with students at a given English proficiency level, because students at different proficiency levels need different approaches.

Article Details

How to Cite
Fahmi, M. A., & Cahyono, B. Y. (2021). EFL students’ perception on the use of Grammarly and teacher feedback. JEES (Journal of English Educators Society), 6(1), 18-25. https://doi.org/10.21070/jees.v6i1.849
Section
Articles
Author Biographies

Mohammad Amiqul Fahmi, Universitas Negeri Malang

Mohammad Amiqul Fahmi is an English teacher at SMA Al-Izzah, Batu. He is currently enrolled in the Graduate Program in ELT at Universitas Negeri Malang, East Java. He completed his undergraduate program in ELT at Universitas Jember. He can be contacted through email at mafahmi26@gmail.com.

Bambang Yudi Cahyono, Universitas Negeri Malang

Bambang Yudi Cahyono is a Professor of Applied Linguistics at the English Department, Faculty of Letters, Universitas Negeri Malang, East Java, Indonesia. He earned his MA in Applied Linguistics from Concordia University, Montreal, Canada, and his PhD in Linguistics and Applied Linguistics from the University of Melbourne, Australia.
