Thompson and Lee (2012) provide the following quotation by a freshman composition student at the very beginning of their study:
I can’t tell you how many times I’ve gotten a paper back with underlines and marks that I can’t figure out the meaning of.
What this student says reminds us that any feedback provided to students should be comprehensible enough for them to revise their work in light of the comments and suggestions. In most cases, however, as the authors note, students’ work is returned with marks, comments, and suggestions that make little sense to them.
To engage and motivate students more deeply, the authors used screencasting in five sections of college-level writing courses taught by two instructors. Instead of providing written comments, the instructors used Jing to record videos in which they delivered their comments and suggestions, employing a color-coding system. In two sections, students were formally surveyed about the screencast feedback provided on their essay drafts and PowerPoint presentations. In the other three online sections, an informal survey was administered to gauge students’ attitudes. The surveys included questions asking students to reflect on whether the course objectives had been achieved and how they perceived recorded video feedback compared with written comments and suggestions.
The results of the study revealed that participants responded positively to Jing as a tool for digital feedback and found it beneficial. Some of their responses were as follows:
“Video feedback helped me to better improve my work because it was almost like a classroom setting that allowed the teacher to fill in the interaction gaps without actually having an in-class setting. Not only that, the information could be replayed repetitively, allowing me to review them and reflect on them once I need help with my work.”
“It’s great to be able to get the feedback while watching it being addressed on the essay itself.”
“It’s one thing to just read your instructors feedback but to be able to see it and understand what you are talking about really helps!”
“I like it better than normal comments because I can hear your thought process when you are making a comment so it is easier to understand what you’re trying to say.”
“I can see and follow the instructor as she reads through my writing with the audio commentaries. It helps me to pin-point exactly what areas need to be corrected, what is hard to understand, which areas I did well on, and which areas could be improved.”
However, some participants encountered technical issues or disliked this mode of feedback:
“Jing feedback videos and [Dropbox] comments still do not work on my end. I have talked with tech guys and they can’t figure it out. I can’t find out how I did and ways to improve my writing.”
“I like the videos but they were really hard to get them to work.”
“Sometimes it’s hard to open the videos.”
“Personally I don’t like the jing videos. I’d rather have the comments written down so that I can quickly access the notes and not have to keep track of just where in the video a certain comment is.”
The authors offered several explanations for the above comments. The first was that these participants, coming from a ‘print-based culture’, were not used to receiving feedback through videos and did not know how to work with this kind of feedback.
The authors discussed in detail several advantages of digital feedback over the written mode. However, as they themselves acknowledge, one weakness of the study, and a challenge for further research, is that it focused on participants’ attitudes toward digital feedback without relating those attitudes to the participants’ work. That is, we have no information on whether digital feedback resulted in ‘better’ written essays. The study could not assess the impact of this mode of feedback on participants’ learning, and thus leaves a pivotal question unanswered: Does digital feedback improve students’ learning more than the traditional written mode does?
Thompson, R., & Lee, M. J. (2012). Talking with students through screencasting: Experiments with video feedback to improve student learning. The Journal of Interactive Technology and Pedagogy, 1. Retrieved from http://jitp.commons.gc.cuny.edu/2012/table-of-contents-issue-one/
A list of references on technology and providing feedback
Chiu, C. Y., & Savignon, S. (2006). Writing to mean: Computer-mediated feedback in online tutoring of multidraft compositions. CALICO Journal, 24(1), 97-114.
Denton, P., Madden, J., Roberts, M., & Rowe, P. (2008). Students’ response to traditional and computer-assisted formative feedback: A comparative case study. British Journal of Educational Technology, 39(3), 486-500.
Dickinson, M., Eom, S., Kang, Y., Lee, C. M., & Sachs, R. (2008). A balancing act: How can intelligent computer-generated feedback be provided in learner-to-learner interactions? Computer Assisted Language Learning, 21(4), 369-382.
Dippold, D. (2009). Peer feedback through blogs: Student and teacher perceptions in an advanced German class. ReCALL, 21(1), 18-36.
Heift, T. (2001). Error-specific and individualized feedback in a web-based language tutoring system: Do they read it? ReCALL, 13(1), 99-109.
Heift, T. (2004). Corrective feedback and learner uptake in CALL. ReCALL, 16(2), 416-431.
Heift, T., & Rimrott, A. (2008). Learner responses to corrective feedback for spelling errors in CALL. System, 36(2), 196-213.
Hoppe, D., Sadakata, M., & Desain, P. (2006). Development of real-time visual feedback assistance in singing training: A review. Journal of Computer Assisted Learning, 22(4), 308-316.
Issac, F., & Hu, O. (2002). Formalism for evaluation: Feedback on learner knowledge representation. Computer Assisted Language Learning, 15(2), 183-199.
Loewen, S., & Erlam, R. (2006). Corrective feedback in the chatroom: An experimental study. Computer Assisted Language Learning, 19(1), 1-14.
Nagata, N. (1993). Intelligent computer feedback for second language instruction. The Modern Language Journal, 77(3), 330-339.
Özdener, N., & Satar, H. M. (2009). Effectiveness of various oral feedback techniques in CALL vocabulary learning materials. Egitim Arastirmalari - Eurasian Journal of Educational Research, 34, 75-96.
Pramela, K. (2006). The power of feedback in an online learning environment. Journal of Language Teaching, Linguistics & Literature, XII, 95-106.
Pujola, J. T. (2001). Did CALL feedback feed back? Researching learners’ use of feedback. ReCALL, 13(1), 79-98.
Towell, R. (1991). Innovation and feedback in a self-access learning project in modern languages. British Journal of Educational Technology, 22(2), 119-128.
Ros i Solé, C., & Truman, M. (2005). Feedback in distance language learning: Current practices and new directions. In B. Holmberg, M. A. Shelley, & C. J. White (Eds.), Distance education and languages: Evolution and change (pp. 72-91). Clevedon: Multilingual Matters.
Sanz, C., & Morgan-Short, K. (2004). Positive evidence versus explicit rule presentation and explicit negative feedback: A computer assisted study. Language Learning, 54(1), 35-78.
Sotillo, S. M. (2005). Corrective feedback via Instant Messenger learning activities in NS-NNS and NNS-NNS dyads. CALICO Journal, 22(3), 467-496.
Tokuda, N., & Liang, C. (2004). A new KE-free online ICALL system featuring error contingent feedback. Computer Assisted Language Learning, 17(2), 177-201.
Tsutsui, M. (2004). Multimedia as a means to enhance feedback. Computer Assisted Language Learning, 17(3/4), 377-402.