New ways of giving feedback

One approach to enhancing feedback is to experiment with novel ways of giving it — ‘novel’ in the sense of trying out methods of providing feedback that weren’t technologically feasible a quarter-century ago, or at least have become much more common because new technologies can now be used to communicate them very effectively.

One good example of the latter is ‘generic’ or whole class feedback, which has recently come to the fore as an invaluable form of post-exam feedback to students who may no longer meet for timetabled classes and/or may have begun their vacation. Email or website postings offer a rapid and economical form of communication between examiners and students. A second example is the use of screencasts to give students in large first-year courses speedy access to more detailed guidance on commonly occurring problems.

Other novel means of giving feedback are even more closely interconnected with new technologies. Using clickers is a fast and systematic electronic means of checking, during a lecture or other large class, how securely the students have grasped a difficult concept or issue. Automated feedback enables students to self-test with online multiple-choice questions in a form that gives them feedback on incorrect answers. Software has also been developed which can make recycling written comments possible (so that it’s not necessary to draft every feedback comment from scratch) or which can enable a tutor to shift from writing comments to providing audio and video feedback. Either possibility can offer savings in time, and so the prospect either of a more manageable feedback workload or the opportunity to make fuller comments than would otherwise have been feasible.

FURTHER READING

Hounsell, D. (2008) The Trouble with Feedback: new challenges, emerging strategies. TLA Interchange Issue 2.  http://www.tla.ed.ac.uk/interchange/spring2008/hounsell2.htm

JISC (2007) Effective Practice with e-Assessment. An overview of technologies, policies and practice in further and higher education. Bristol & London: JISC.
http://www.jisc.ac.uk/publications/documents/pub_eassesspracticeguide.aspx

Audio and video feedback

The MP3 player is one widely used technology that has been exploited in recent years to provide a new method of giving feedback to students – the podcast. Often used in combination with other types of feedback, podcasts offer staff an informal way of providing a good deal of feedback quite quickly, rather as they would in a face-to-face meeting with a student. Several surveys have been conducted of students’ responses to receiving their feedback via podcast, and most students appear to find it a positive experience: it gives them detailed feedback they can listen to more than once and in their own time, while seeming more personal than written comments. A further development has been to use screen capture to provide video feedback, with the added advantage that students can see the part of their assignment which the staff member is referring to in their audio comments.

CASE EXAMPLES

France, D. and Wheeler, A. (2007) Reflections on using podcasting for student feedback. Planet, 18. Higher Education Academy Subject Centre for Geography, Earth and Environmental Sciences.
Geoscience students were given feedback on their assignments using podcasting. The students completed pre- and post-podcasting questionnaires. http://www.gees.ac.uk/pubs/planet/index.htm#P18

Hill, D. (2008) The use of podcasts in the delivery of feedback to dissertation students. Higher Education Academy Subject Centre for Hospitality, Leisure, Sport and Tourism Case Study.
Sports students were given podcasts of feedback on subsequent chapters of their dissertations. They were asked about the balance of podcast and face-to-face feedback.
http://www.heacademy.ac.uk/hlst/resources/casestudies/assessment#at

JISC (2010) Enhancing the experience of feedback. Case study 6, University of Leicester. JISC: Effective assessment in a digital age.
An example of using podcasts on a distance learning MSc in Occupational Psychology, with ‘lessons learned’ and ‘advantages gained’.
http://www.jisc.ac.uk/digiassess

Jordan, J. (2004) The use of orally recorded exam feedback as a supplement to written comments. Journal of Statistics Education 12(1).
In this example, taken from a course in Statistics, the lecturer uses orally recorded feedback on the exam, together with traditional grading and written comments.  http://www.amstat.org/publications/jse/jse_archive.htm

King, D., McGugan, S. and Bunyan, N. (2008) Does it make a difference? Replacing text with audio feedback. Practice and Evidence of Scholarship of Teaching and Learning in Higher Education 3(2), 145-163.
This paper reports the findings of focus group interviews with staff and students following the use of audio feedback on assignments. Improvements could be seen in the quantity and quality of feedback given. However, anticipated savings in staff time were not realised.  http://www.pestlhe.org.uk/index.php/pestlhe/article/view/52

Lunt, T. and Curran, J. (2009) ‘Are you listening please?’ The advantages of electronic audio feedback compared to written feedback. Assessment & Evaluation in Higher Education iFirst.
Presents evidence from tutors and students on a Business Studies course to support the use of audio feedback provided via a VLE. The method is seen as efficient and valued by the students.
http://www.informaworld.com/smpp/title~content=g779269628~db=all

McLaughlin, P. (2009) eFeedback gets personal. Centre for Bioscience Bulletin, No. 28, p.3.
Screen capture was used to create video feedback to Bioscience students on their assignments, as a way of helping students not to misinterpret written comments.  http://www.bioscience.heacademy.ac.uk/resources/bulletin.aspx

Merry, S. and Orsmond, P. (2007) Feedback via MP3 audio files. Centre for Bioscience Bulletin, No. 22, p.5.
This short article reports on Bioscience students’ responses to receiving feedback on their assignments as an audio file rather than by written comments.  http://www.bioscience.heacademy.ac.uk/resources/bulletin.aspx

Merry, S. and Orsmond, P. (2008) Students’ Attitudes to and Usage of Academic Feedback Provided Via Audio Files. Bioscience Education 11.
A longer article on the project outlined above, on Bioscience students’ responses to audio feedback.
http://www.bioscience.heacademy.ac.uk/journal/vol11/beej-11-3.aspx

Micklewright, D. Podcasting as an alternative mode of assessment feedback. Higher Education Academy Subject Centre for Hospitality, Leisure, Sport and Tourism Case Study.
Sports Science undergraduates were given feedback on their assignment through a podcast rather than receiving written feedback. They were asked which method they preferred.
http://www.heacademy.ac.uk/hlst/resources/casestudies/assessment#at

Nortcliffe, A. and Middleton, A. (2008) A three year case study of using audio to blend the engineer’s learning environment. Engineering Education 3(2), 45-57.
Lecturers teaching Software Engineering recorded conversations between themselves and students during lab classes. These recordings were then made available to the student to use as feedback. The findings of an evaluation of this project are discussed.  http://www.engsc.ac.uk/journal/index.php/ee/issue/view/29

Rodway-Dyer, S., Dunne, E. and Newcombe, M. (2009) Audio and screen visual feedback to support student learning. Paper given at ALT-C Conference, September 2009, Manchester.
This paper contains case studies of evaluating the use of audio feedback in Geography and of feedback in Biosciences labs. It also gives an example of videoing verbal feedback during labs to use for training demonstrators.
http://repository.alt.ac.uk/641/

Stannard, R. (2007) Using screen capture software in student feedback. Higher Education Academy English Subject Centre Case Study.
A case study of using screen capture software to show students how to improve their language skills. The web page includes some short videos of examples of the software being used.
http://www.english.heacademy.ac.uk/explore/publications/casestudies/technology/camtasia.php

Case study – providing audio comments. Massey University: Innovations in Assignment Marking.
An example from communications studies of providing audio feedback.  http://etools.massey.ac.nz/casestudypc.htm

A Word in Your Ear, Sheffield Hallam University, 18 December 2009
Conference papers, podcasts and posters available to download from this conference on the theme of audio feedback.
http://research.shu.ac.uk/lti/awordinyourear2009/papers.html

FURTHER READING

Savin-Baden, M. (2010) The sound of feedback in higher education. Learning, Media and Technology 35(1), 53-64.
This article explores recent research on, and practices used for, podcasting assignment feedback (PAF). It argues that PAF should be based on the principles of dialogic learning.
http://www.tandf.co.uk/journals/titles/17439884.asp

Screencasts

Screencasting is a technology that allows academics to demonstrate to students how things are done, in the way a master might show an apprentice. A screencast records the actions on a computer screen, so it is particularly useful for demonstrating, for example, how to write or use software, or stages in a calculation, as it shows the process by which something is done. It can also provide a model answer or an exemplar of a particular kind of problem. Since multiple students can access a screencast, it can be used to provide useful feedback on common problems which students encounter in an assignment.

CASE EXAMPLES

Cassidy, S. (2007) Screencasting, Blogs and Feedback. Macquarie University Learning and Teaching Centre Podcast Series on Engaging Students.
A podcast giving examples of screencasting, using blogs and providing students with instructive feedback, including the use of screencasts to give feedback to students on processes.
http://www.mq.edu.au/ltc/resources/podcasts/cassidy.htm

Recycling written comments

Individualised written feedback can be very important in helping students to learn. However, it is time-consuming, and increased student numbers have put more pressure on the staff time available to produce these comments. The papers in this section describe methods of “recycling” comments that lecturers find themselves frequently making on common issues in student work. In some cases comments are recycled using specialised software, and in others standard word-processing packages. The past comments are then redeployed on new students’ work, edited as appropriate and often blended with other tailor-made feedback. The benefits of this approach, it is argued, include greater consistency in providing feedback and more effective use of staff time.
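
The mechanics of a comment bank are simple enough to sketch in a few lines of code. The sketch below is illustrative only – the comment codes and wording are invented, and real systems (or a word processor’s building blocks) would hold far larger banks – but it shows the basic idea of storing frequent comments once and blending them with tailor-made remarks.

```python
# A minimal comment-bank sketch: frequently used feedback comments are
# stored once under short codes, then expanded and blended with a
# tailor-made remark for each student. All codes and wording here are
# invented for illustration.

COMMENT_BANK = {
    "REF1": "Please cite sources using the department's referencing style.",
    "STR2": "The argument would be clearer with a short introductory roadmap.",
    "LAB3": "Report measurements with units and an estimate of uncertainty.",
}

def build_feedback(codes, tailored=""):
    """Expand bank codes into full comments and append any tailored remark."""
    comments = [COMMENT_BANK[c] for c in codes if c in COMMENT_BANK]
    if tailored:
        comments.append(tailored)
    return "\n".join(f"- {c}" for c in comments)

print(build_feedback(["REF1", "LAB3"], tailored="Good use of a control group."))
```

Marking a new assignment then amounts to selecting the relevant codes and adding one or two comments specific to that piece of work, which is where the claimed time savings come from.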

CASE EXAMPLES

Balfour, J. (2007) Some light at the end of the feedback tunnel? CEBE Transactions 4(2), 54-66.
This article describes software used to provide feedback comments to students on their lab reports, against specified criteria from a bank of suitable comments.  http://www.cebe.heacademy.ac.uk/transactions/volumes_index.php?edition=4.2

Brown, J. Annotating electronic assignment copies with comments. Massey, Victoria, Otago universities and UCOL: Innovations in Assignment Marking project.
In this short case study, standard features in Microsoft Word are used to provide detailed feedback on Education students’ work, allowing for flexibility in editing comments and the opportunity to re-use comments on common mistakes.  http://etools.massey.ac.nz/casestudyac.htm

Juwah, C. et al. (2004) Enhancing effectiveness and efficiency in student feedback. Case Study 4 in: Enhancing Student Learning through Effective Formative Feedback. Higher Education Academy: Student Enhanced Learning through Effective Feedback project.
Staff teaching final-year Accounting and Finance used grade-related criteria and a bank of feedback statements to provide quick and detailed feedback.

Pezdek, K. (2009) Grading student papers: reducing faculty workload while improving feedback to students. Association for Psychological Science Observer 22(9)
This article suggests coding and reusing common comments used in feedback. It also describes a process for having students proofread each other’s assignments.
http://www.psychologicalscience.org/observer/getArticle.cfm?id=2578

Online and e-feedback

Many of the innovations cited in this section are ways of providing students with feedback on online tests which they log on to in their own time. One of the advantages of this type of feedback is that it is immediate, and can be accessed by students at a time of their choosing. And while the main type of question used tends to be multiple choice, it is also possible to design short-answer questions (Jordan and Mitchell, 2009). The feedback can be more or less sophisticated, with software able to go beyond yes-and-no answers to feedback which provides constructive suggestions for improvement.

While presenting a number of technical difficulties, online feedback does have the advantage of flexibility, and the possibility of links to other online resources. However, most of its proponents suggest that it should not be the only source of feedback that a student receives.
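
The step from yes-and-no marking to constructive feedback can be made concrete with a small sketch. The question, options and feedback wording below are invented for illustration; the point is simply that each wrong option carries its own diagnostic suggestion rather than a bare “incorrect”.

```python
# A sketch of automated self-test feedback that goes beyond yes/no
# marking: each common wrong answer carries its own constructive
# suggestion. The question content is invented for illustration.

QUESTION = {
    "prompt": "Which measure of central tendency is most robust to outliers?",
    "options": {"a": "mean", "b": "median", "c": "mode"},
    "answer": "b",
    "feedback": {
        "a": "Not quite: the mean shifts with every extreme value. Revisit the notes on robustness.",
        "c": "The mode ignores outliers but can be unstable; compare its behaviour with the median's.",
    },
}

def mark(choice):
    """Return (correct?, feedback message) for a student's chosen option."""
    if choice == QUESTION["answer"]:
        return True, "Correct - the median depends only on the middle of the ordered data."
    # Fall back to a generic message for unanticipated answers.
    return False, QUESTION["feedback"].get(choice, "Incorrect; see the course notes.")

ok, message = mark("a")
print(ok, message)
```

Writing the per-distractor feedback is where the design effort lies: the software only delivers what the question author has anticipated about likely misconceptions.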

A rather different approach uses video cameras linked to a computer system to enable staff in a medical school to provide feedback without being in the room (Hughes et al. 2008).

CASE EXAMPLES

Balfour, J. (2007) Some light at the end of the feedback tunnel? CEBE Transactions 4(2), 54-66.
This article describes software used to provide feedback comments to students on their lab reports, against specified criteria from a bank of suitable comments.  http://www.cebe.heacademy.ac.uk/transactions/volumes_index.php?edition=4.2

Esendal, T. and Dean, M. (2009) An online tool to give first-year programming students pre-assessment feedback. Italics 8(2), 36-44. Higher Education Academy Subject Centre for Information and Computer Sciences e-journal.
This paper describes ‘Doctor Code’, an online analysis and feedback tool which evaluates the code first-year Computing students have written and informs students of the outcome.
http://www.ics.heacademy.ac.uk/italics/vol8iss2.htm

Golden, K., Stripp, C. and Lee, S. (2007) Encouraging student use of feedback, reflection and engagement through web-based learning support. MSOR Connections 7(2) 7-10. Higher Education Academy Maths, Stats & OR Network newsletter.
Feedback was provided automatically to students studying Engineering Mathematics on online computer-based tests. Students could then follow up the feedback by using web-based materials, a textbook or by seeing a tutor.
http://www.mathstore.ac.uk/index.php?pid=37&vol=7&num=2

Hepplestone, S. et al. (2009) Technology, Feedback, Action! The impact of learning technology upon students’ engagement with their feedback. Higher Education Academy: Enhancing Learning Through Technology Research Project Report 08/09.
This project evaluated how a range of technical interventions might encourage students to engage with feedback, and identified a series of recommendations around the use of technology in giving feedback.
http://www.heacademy.ac.uk/resources/detail/ourwork/evidencenet/Technology_Feedback_Action

Hepplestone, S. et al. (2010) Using technology to help students engage with their feedback. A best practice guide for academic staff. Sheffield Hallam University.

Hepplestone, S. et al. (2010) Using technology to help students engage with their feedback. A ten minute guide for senior managers. Sheffield Hallam University.

Hughes, C., Toohey, S. and Velan, G. (2008) eMed-Teamwork: a self-moderating system to gather peer feedback for developing and assessing teamwork skills. Medical Teacher 30(1), 5-9
Medical students gave feedback to each other on team skills, using a computer-based system which also allowed for students commenting on the feedback they received and tutors giving feedback.
http://www.informaworld.com/smpp/title~content=t713438241

Jordan, S. and Mitchell, T. (2009) e-Assessment for learning? The potential of short-answer free-text questions with tailored feedback. British Journal of Educational Technology 40(2), 371-385
This article describes a computer-based system to provide feedback to students on short-answer questions. The students’ use of the feedback is discussed.  http://www.wiley.com/bw/journal.asp?ref=0007-1013

Khan, K., Davies, D. and Gupta, J. (2001) Formative self-assessment using multiple true-false questions on the internet: feedback according to confidence about correct knowledge. Medical Teacher 23(2), 158-163
Discusses the development of web-based feedback for Medical students across different teaching sites.
http://www.informaworld.com/smpp/title~content=t713438241

Montague, B. Building up an electronic collection of marked assignments. Massey University: Innovations in Assignment Marking Case Study.
An example of using Blackboard to provide feedback to IT students and to provide an overview of the submitted assignments and feedback given.  http://etools.massey.ac.nz/casestudybc.htm

Murray, S. Feedback and engagement. Macquarie University Learning and Teaching Centre: Engaging Students – A Podcast Series.
A podcast discussing the use of WebCT to provide feedback on assignments.
http://www.mq.edu.au/ltc/resources/podcasts/index.htm

Nix, I. and Wyllie, A. (2009) Exploring design features to enhance computer-based assessment: learners’ views on using a confidence-indicator tool and computer-based feedback. British Journal of Educational Technology Early View.
A small-scale study of Social Work and Health Science students’ views on using a confidence indicator to submit online answers, and on the feedback provided via the computer.
http://www3.interscience.wiley.com/journal/122649797/abstract

Price, G. (2006) Computer aided assessment and feedback – can we enhance students’ early experience at University? New Directions 2. Higher Education Academy Subject Centre for Physical Sciences.
A case study describing how students in first-year Chemistry were given online quizzes. Feedback was instantly available, with constructive suggestions as to how to improve performance if necessary.
http://www.heacademy.ac.uk/physsci/publications/newdirections

Tong, R. and Beynon, C. (2008) Can formative computer aided assessment assist student learning? Higher Education Academy Hospitality, Leisure, Sport and Tourism Network Case Study
In this case study, students studying Sport Physiology completed MCQs throughout the course and were given feedback electronically.
http://www.heacademy.ac.uk/hlst/resources/casestudies/assessment

Whole-class feedback

It’s tempting to cast ‘generic’ or whole-class feedback in the role of the permanent poor relation to individualised feedback. Yet as the table below suggests, it needn’t be seen as an option of last resort. It has real advantages as a speedy means of emailing feedback on end-of-course exam scripts to students who no longer meet for timetabled classes. And as the table also indicates, it has the edge over one-to-one feedback in the greater elbow-room it offers the feedback-giver. The enlarged space for comment can be used in various constructive ways: to offer fuller explanations of aspects of the subject-matter that many students had not adequately grasped; to review the alternative approaches that could be taken to tackling a particular question or problem; or to pick out for praise especially good features of students’ answers. In other words, whole-class feedback needn’t be a glum post-mortem, but can widen students’ grasp of what counts as good work in the subject. In this respect, it merits more general use as a complement to individualised feedback.

Whole-class feedback
PROS
  • relatively quick, economical and readily communicable
  • greater scope to explain and expand on comments, compare alternative answers and approaches to the question, focus on common misunderstandings
CONS
  • may not connect with the feedback concerns of every student, or lack relevance to their particular answer to the question set

Individualised feedback
PROS
  • tailored to the particular answer or work submitted
  • facilitates comment on very specific sections or parts of a student’s answer or submission
CONS
  • time-consuming to generate

CASE EXAMPLES

Harland, J. (2007) Feedback to large practical classes. Centre for Bioscience Bulletin No.22, p. 7.
In this case study, a comment sheet was developed for providing feedback to Bioscience students on lab reports, providing answers and common errors. This was circulated alongside shorter individual comments.
http://www.bioscience.heacademy.ac.uk/resources/bulletin.aspx

Using clickers (PRS)

The use of personal response systems (PRS) or electronic voting systems (EVS) – often simply known as ‘clickers’ – can enhance students’ learning experience in a variety of ways. Once known largely from TV quiz programmes like ‘Who Wants To Be A Millionaire?’, handheld clickers with a choice of buttons for responding to questions are becoming more widespread in teaching. Clickers can be used in lectures to encourage engagement with the lecture content, and how well this is achieved depends partly on the design of the questions, which should test understanding as well as knowledge. Bates et al. (2006) argue that ‘A good question is one where a spread of answers might be expected or where it is known that common misconceptions lurk.’

The system provides immediate feedback to students on how well they have understood the question asked, as well as giving the lecturer feedback on how many of their students have understood a particular concept, which can be used to address any problems or to start group discussions. The system has also been used in peer feedback (see Barwell and Walker, 2009), where the advantages of anonymity in responding can be beneficial. While clickers are sometimes used simply to break up a lecture, they are most effective when the use is underpinned by sound pedagogy rather than treated as a novelty whose appeal will soon wear off.
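
The feedback loop behind a clicker question is essentially a tally shown back to the class. The sketch below illustrates that loop in a few lines; the option labels and sample responses are invented, and a real PRS would gather the responses from handsets rather than a list.

```python
# A sketch of the feedback loop behind a clicker question: responses
# are tallied and the distribution is shown back to the class, so both
# students and lecturer can see at once how well a concept has been
# grasped. The sample responses below are invented for illustration.
from collections import Counter

def response_summary(responses, options=("A", "B", "C", "D")):
    """Tally clicker responses and return each option's percentage share."""
    counts = Counter(responses)
    total = len(responses) or 1  # avoid division by zero for an empty class
    return {opt: round(100 * counts[opt] / total) for opt in options}

# e.g. a class of ten students answering a concept question
print(response_summary(["A", "B", "B", "B", "C", "B", "D", "B", "B", "A"]))
```

A distribution like this is what makes the question ‘catalytic’: a visible spread of answers, rather than near-unanimity, is the cue for the lecturer to re-explain or to start peer discussion.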

A less high-tech version is based on IF-AT (immediate feedback assessment technique) forms which work like scratchcards.

CASE EXAMPLES

Barwell, G. and Walker, R. (2009) Peer assessment of oral presentations using clickers: the student experience. Proceedings of the 3rd HERDSA Annual Conference.
This paper reports on the findings of focus groups which asked for students’ views on using clickers to give peer feedback on presentations. http://www.herdsa.org.au/?page_id=520

Bates, S., Howie, K. and Murphy, A. (2006) The use of electronic voting systems in large group lectures: challenges and opportunities. New Directions Issue 2, 1-8. Higher Education Academy Subject Centre for Physical Sciences journal.
A case study, discussing pedagogical, technical and operational issues associated with the introduction of PRS into first-year lectures in Physics and Biological Sciences. http://www.heacademy.ac.uk/physsci/publications/newdirections

Beekes, W. (2008) ‘Ask the audience’ in lectures. BMAF Magazine 4, pp.3-4.
Feedback was given to students in a university Management School, using PRS. http://www.heacademy.ac.uk/business/publications/bmag

Cotner, S., Fall, B., Wick, S., Walker, J. and Baepler, P. (2008) Rapid feedback assessment methods: can we improve engagement and preparation for exams in large-enrollment courses? Journal of Science Education and Technology 17(5), 437-443.
This article argues for the use of IF-AT forms (scratchcards) for providing fast feedback and encouraging engagement by students. http://www.springerlink.com/content/102587/

de Jong, T., Lane, J., Sharp, S. and Kershaw, P. (2009) Optimising personal audience response systems technology to enhance student learning in teacher education lectures. Proceedings of the 3rd HERDSA Annual Conference.
This paper reports on the findings of a survey of Education students’ views of PARS and makes suggestions for its use.
http://www.herdsa.org.au/?page_id=520

Draper, S. (2009) Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology 40(2) 285-293
This article discusses the design of MCQ questions and their use with ‘clickers’ to promote deep learning by focusing on learning relationships between items and encouraging peer interaction.
http://www.wiley.com/bw/journal.asp?ref=0007-1013

Premkumar, K. and Coupal, C. (2008) Rules of engagement – 12 tips for successful use of ‘clickers’ in the classroom. Medical Teacher 30(2), 146-149 http://www.informaworld.com/smpp/title~content=t713438241

Robinson, C. and King, S. (2009) Introducing electronic voting systems into the teaching of Mathematics. MSOR Connections 9(1) 29-33. Higher Education Academy Maths, Stats and OR Network newsletter.
A case study, describing the introduction of EVS into an Engineering Mathematics course and staff and student views of its use. http://www.mathstore.ac.uk/index.php?pid=37&vol=9&num=1

Russell, M. (2008) Using an electronic voting system to enhance learning and teaching. Engineering Education 3(2) 58-65. Higher Education Academy Engineering Subject Centre.
Students on an Engineering Science module used EVS to complete a formative test at the start of lectures, giving staff and students feedback on how well they were understanding the topics.
http://www.engsc.ac.uk/journal/index.php/ee/issue/view/29

Wit, E. (2003) Who wants to be … the use of a personal response system in statistics teaching. MSOR Connections 3(2) 14-20. Higher Education Academy Maths, Stats and OR Network newsletter.
This article describes the introduction of PRS into a Statistics for Psychologists course, and discusses educational principles and the development of suitable questions for use.
http://www.mathstore.ac.uk/index.php?pid=37&vol=3&num=2
