Refining traditional feedback

A good many of the strategies surveyed on this website aim to augment, complement or substitute for traditional approaches to feedback. But there is also value in working with the grain, by exploring what might be done to strengthen traditional feedback itself.

We’ve identified three strategies for pursuing that goal: speeding up the process with faster feedback; aiming for greater clarity and consistency through pro forma feedback; and refocusing written comments to make feedback more helpful to students and their learning.

Faster feedback

There are three options for speeding up the provision of feedback and increasing its immediacy for students, with the broader aim of strengthening its impact on their progress and performance. The high-tech option is most commonly found in larger courses where multiple-choice or similar types of questions are a significant component in the overall assessment mix. In such instances, it typically takes the form of an online computerised resource that enables students at various points in a course to test out their understanding, and to get instant constructive feedback on those items which they answer incorrectly (for example, Plastow, 2007).

A lower-tech form of speedier feedback links rapid, whole-class feedback to tutorial activities, as exemplified in initiatives in the fields of Law (Glofcheski, 2006) and Politics (Macmillan and Mclean, 2005). The whole-class approach is also a speedy way of providing feedback on exams, typically via a medium such as email, to reach students for whom timetabled classes have come to an end.

A third option is to rethink how as well as when feedback is provided, by shifting from feedback to feedforward or cumulative assignments.

CASE EXAMPLES

Glofcheski, R. (2006). Same-day feedback and analysis of assessed coursework. In Carless, D. et al. (eds.) How Assessment Supports Learning: Learning-Oriented Assessment in Action. Section 3.6. Hong Kong: Hong Kong UP.

Students studying Law submit a solution to a hypothetical problem. This problem is discussed later that day in a tutor-led discussion. The tutor then places an ‘ideal’ solution on the course website.

Macmillan, J. and Mclean, M.J. (2005). Making first-year tutorials count. Active Learning in Higher Education, 6(2), 94-105.

In a first-year module on International Relations, students were asked to submit briefing papers five days before tutorials. These were then discussed with other students during the tutorial and individual feedback was given at the same time by the tutor. Three days after the tutorial, students submitted evaluation papers, taking into account feedback from the tutor and other students. http://alh.sagepub.com/

Plastow, K. (2007) Online assessment feedback as an instrument of reflective practice. ALTC, Universities of Melbourne and Sydney: Enhancing Assessment in the Biological Sciences website.

A teaching team in the Biosciences used a set of guiding principles to write feedback comments for both correct and incorrect MCQ responses, delivered immediately after submission of the online test. The benefits to students included the immediate correction of errors in thinking.
http://bioassess.edu.au/examples/plastow-online-assessment-feedback-instrument-reflective-practice

Sellers, D. Improving feedback in a level 5 Pathology module. FAST Case Study (Formative Assessment in Science Teaching).
Staff teaching a Pathology module tried a range of measures to increase the speed with which students received feedback, including placing MCQs on a VLE, separating feedback from marks, giving more guidance on coursework, and providing feedback via a model answer. http://www.open.ac.uk/fast/

Pro forma feedback

Assessment and feedback pro formas (also called ‘cover sheets’, ‘assignment attachments’ and, in the US, ‘rubrics’) are widely used as a way of framing and focusing comments to students in a standard format. They can take various forms, as illustrated in the downloadable examples of pro formas attached.

In their most common forms, pro formas provide a set of standard headings under which to group feedback comments [as in example 1], and/or a set of rating scales along which to evaluate a student’s achievement with a tick or cross [as in example 2 and example 3]. In either case, all or most of the headings or scales used mirror the criteria being adopted to assess the assignment or script concerned. However, some pro formas add a box for general comments [as in example 4] or – with an eye to encouraging more feedforward comments – ‘Suggestions for improvement’. And some [again, as in example 4] may separate out criteria which are individually rated from other observations which need only be noted.

Where rating scales are used, they can be points along a single dimension, contrasting poles on a continuum, or multidimensional, and each point may carry a one-word label or an explanatory phrase [see, for instance, example 2, example 3, example 5 and example 6]. And as these examples also show, points on a scale may indicate a broad level of attainment (e.g. on a range from ‘low’ to ‘high’) or be individually graded (A, B, C, etc.). A recent refinement is to add information about where the student concerned stands relative to the rest of the class on each criterion [example 7].

Examples of pro formas can be found not only for use with traditional essays and reports, but also, for instance, to convey feedback on placements and field experience [example 6, example 8], laboratory work [example 9], posters [example 10], projects [example 11] and presentations [example 12].

Using a pro forma to communicate feedback can have several advantages:

  • linking feedback directly to assessment criteria can be helpful to both marker (to maintain focus and aid consistency of coverage across a set of assignments or scripts) and student (seeing how comments or ratings relate to the criteria)
  • use of a pro forma by all members of a course team can help to achieve consistency of marking and feedback across the group of markers
  • a combination of ratings and comments provides a relatively economical means of balancing breadth and depth of feedback
  • students and tutors can track progress across a series of assignments, and identify where efforts to improve performance would best be focused.

Where students become familiar with a pro forma over time, it can also serve as a bridge to self-generated or peer feedback. Example 13, for instance, is an Essay Feedback Checklist which students completed and submitted alongside their essays; their tutors, who used the same checklist, were then able to target feedback on mismatches between their own and the students’ ratings (Norton et al., 2002).

Similarly, inviting students to design a pro forma (or adapt an existing one for use with a different kind of assignment) can be a powerful way of engaging with students to enhance their grasp of assessment expectations and standards.

Feedback pro formas can have disadvantages too. Finer-grained comments can lose ground to rating scales if pro forma use is not regularly monitored across a course team. And where a pro forma is imposed upon a team rather than jointly devised, it can be seen by some markers as more hindrance than help.

CASE EXAMPLES

Allen, D. and Tanner, K. (2006) Rubrics: tools for making learning goals and evaluation criteria explicit for both teachers and learners. CBE Life Sciences Education 5(3), 197-203.
A paper describing different types of pro forma, how to design them and why they are useful, based on examples from Biology. http://www.lifescied.org/cgi/content/full/5/3/197

Cooper, D. (2005) Assessing what we have taught: the challenges faced with the assessment of oral presentation skills. In: Higher Education in a Changing World: proceedings of the 28th HERDSA Annual Conference, Sydney, 3-6 July 2005.  http://www.herdsa.org.au/?page_id=167

Freake, S. Reformatting feedback on assignments to enhance effectiveness. FAST Case Study (Formative Assessment in Science Teaching).
Tutors assessing Physics assignments were given a pro forma designed to encourage them both to comment positively and to identify areas for improvement in future work. http://www.open.ac.uk/fast/

Mulder, R. (2009) Use of a scoring matrix to provide detailed feedback on performance. ALTC, Universities of Melbourne and Sydney: Enhancing Assessment in the Biological Sciences website.
http://bioassess.edu.au/examples/mulder-use-scoring-matrix-provide-detailed-feedback-performance

Norton, L., Clifford, R., Hopkins, L., Toner, I. and Norton, J.C.W. (2002) Helping psychology students write better essays. Psychology Learning and Teaching 2(2), 116-126.  http://www.psychology.heacademy.ac.uk/s.php?p=250

Tang, S.Y.F. and Chow, A.W.K. (2007) Communicating feedback in teaching practice supervision in a learning-oriented field experience assessment framework. Teaching and Teacher Education 23(7), 1066-1085  http://www.elsevier.com/wps/find/journaldescription.cws_home/224/description#description

Refocusing written comments

For a great many students and their teachers, feedback is indelibly associated with the written comments that accompany a grade or mark on assigned and assessed work. And it seems likely that this form of feedback will continue to be extensively used in the future.

It does, however, have widely documented shortcomings (see e.g. Black & Wiliam, 1998; Chanock, 2000; Higgins et al. 2002; Hounsell, 2007; Weaver, 2006). The evidence suggests, for instance, that however well-intentioned or painstakingly crafted, written comments can lack transparency, because they allude to a set of standards and conventions (e.g. about ‘good structure’, or the use of evidence) with which students aren’t yet familiar. The comments may also be so focused on negatives (what a student hasn’t done well) rather than positives that they undermine rather than enhance students’ self-confidence or commitment to the subject. Or they may dwell too intently on what the student might have done in a piece of work that is now in the past, at the expense of advice on how to do better in future.

The most recent evidence comes from a study by Walker (2009) that combined an analysis of Engineering and Computing tutors’ written comments with follow-up interviews with the students concerned. Using Brown and Glover’s typology (2006), Walker found that two-fifths of the tutors’ comments were on content, one-fifth on skills development (e.g. on structure, focus on the question set, communicative grasp), and about one-third were motivating in their use of praise or encouragement.

Walker was also concerned with how usable the students had found the comments, whether retrospectively – for the assignment submitted – or prospectively, in improving their future work in the subject. What emerged from the interviews was that the students found the skills development comments the most usable in future work. In addressing gaps in an assignment just submitted, however, the students prized comments which included an element of explanation, and which therefore helped them to bridge the gap between their current knowledge, understanding and skills and those expected of them. It was also apparent that a relatively high proportion of comments made on assignments were unlikely to be usable.

A follow-up study of Languages tutors’ comments (Fernandez-Toro and Truman, 2009) highlights the subject dimension to commenting. They found that, relative to the Computing and Engineering tutors, the Languages tutors commented more on skills than on content, and made more comments that simply indicated errors (rather than correcting them) or that provided explanations.

What implications might be drawn from findings such as these about focusing comments effectively?

Comments are more likely to be effective if:

  • they focus on the task undertaken or the work produced, rather than being directed at the student personally
  • they are directly linked to the assessment criteria being used, and/or the learning outcomes for the course unit concerned
  • observations and judgments are particularised or illustrated with a specific example or reference to the text
  • suggestions are made about how as well as what to improve, in forthcoming work in the subject.
  • suggested improvements give priority to two or three points that the student could feasibly make progress on.

The impact of feedback comments is also likely to be maximised when efforts to draft more constructive comments are accompanied by one or more of the following:

  • where feasible, inviting feedback on feedback, with the aim of getting a fuller grasp of what kinds of comments, in what forms, students find most helpful
  • using a pro forma to group or prioritise comments and to highlight the links between comments and assessment criteria
  • introducing feed-forward assignments, so that suggestions for improvements can be directly followed up by students
  • making use of exemplars or other opportunities for students to engage with criteria and standards, so that feedback comments are more firmly anchored in a developing grasp of what counts as high-quality work in the subject at that level
  • experimenting with elective feedback to encourage students to indicate in advance what comments they would find most helpful

CASE EXAMPLES

Bright, K. Providing individual written feedback on formative and summative assessments. Higher Education Academy UK Centre for Legal Education resource.
Some guidelines for providing written feedback to Law students.  http://www.ukcle.ac.uk/resources/assessment/effectivefeedback.html

Fernandez-Toro, M. and Truman, M. (2009) Improved learning through improved feedback on Languages TMAs. (Interim report). Milton Keynes: Open CETL, Open University.
Follows up the Walker (2009) study by analysing 4000+ written comments made by Languages tutors on 72 assignments in two Open University Spanish modules. Findings suggest that Languages tutors comment more on skills than on content, and make more comments that simply indicate errors (rather than correcting them) or that provide explanations. http://www.open.ac.uk/opencetl/activities/details/detail.php?itemId=492fcd766c828

Glover, C. and Brown, E. (2006) Written feedback for students: too much, too detailed or too incomprehensible to be effective? Bioscience Education 7.
This article discusses the findings from a research project on the effectiveness of written feedback in Biosciences and Physical Sciences. It identifies some key qualities and some examples of inappropriate feedback.
http://www.bioscience.heacademy.ac.uk/journal/vol7/beej-7-3.aspx

Malouff, J., Rooke, S. and Schutte, N. (2008) Helping students improve their writing. Association for Psychological Science Observer 21(8). http://www.psychologicalscience.org/observer/getArticle.cfm?id=2399
Suggestions for helping students improve their writing, based on examples from Psychology.

Mutch, A. (2003) Exploring the practice of feedback to students. Active Learning in Higher Education, 4(1), 24-38.
This article explores feedback practice on written coursework within a Business programme and suggests how practices might be enhanced. http://alh.sagepub.com/

Pezdek, K. (2009) Grading student papers: reducing faculty workload while improving feedback to students. Association for Psychological Science Observer 22(9).
http://www.psychologicalscience.org/observer/getArticle.cfm?id=2578

Pitts, S. (2005) ‘Testing, testing…’ How do students use written feedback? Active Learning in Higher Education, 6(3), 218-229.
This study asked Music students what they found helpful in the written feedback they received. http://alh.sagepub.com/

Rae, A.M. and Cochrane, D.K. (2008) Listening to students: How to make written assessment feedback useful. Active Learning in Higher Education, 9(3), 217-230.
In this study, Nursing students were asked how written feedback had helped them learn, about the process of receiving feedback, and how they made sense of the feedback they were given. http://alh.sagepub.com/

Walker, M. (2009) An investigation into written comments on assignments: do students find them usable? Assessment & Evaluation in Higher Education, 34(1), 67-78.
The article explores students’ responses to the written feedback they receive on written assignments. It presents an analysis of over 3000 written comments in Engineering and in Information and Communication Technologies courses, together with the results of telephone interviews in which students commented on how usable they had found the different comments they received. http://www.tandf.co.uk/journals/carfax/02602938.html

Weaver, M. (2006) Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379-394.
The article reports Business and Art and Design students’ perceptions of what they found helpful and unhelpful in feedback on their assignments. The students valued feedback, but needed advice on understanding and using feedback before they could engage with it. Feedback could be improved by focusing on messages conveyed by their writing, providing feedback set in the context of assessment criteria and learning outcomes, and ensuring that feedback is timely. http://www.tandf.co.uk/journals/carfax/02602938.html

Young, P. (2000) ‘I might as well give up’: self-esteem and mature students’ feelings about feedback on assignments. Journal of Further and Higher Education 24(3), 409-418.
This article looked at Access students’ responses to feedback on assignments. All students found the first assignment problematic, but thereafter great variations were found, which related to varying levels of self-esteem. Students varied in their attitudes to receiving feedback, their perceptions of the messages they were receiving and in the extent of the impact of the feedback on their sense of self. http://www.tandf.co.uk/journals/titles/0309877X.asp

FURTHER READING

Black, P. & Wiliam, D. (1998) Assessment and Classroom Learning. Assessment in Education 5(1), 7-74.
http://www.tandf.co.uk/journals/titles/0969594x.asp

Brown, E. & Glover, C. (2006) Evaluating written feedback. In: Bryan, C. and Clegg, K. (eds.) Innovative Assessment in Higher Education. London/New York: Routledge, pp. 81-91.
This chapter focuses on the written feedback that students receive on assignments and shows how tutors can evaluate its strengths and weaknesses empirically.

Carless, D. (2006) Differing perceptions in the feedback process. Studies in Higher Education, 31(2), 219-233.
This article explores differences in student and staff perceptions of written feedback, and how students interpret and use this feedback. http://www.tandf.co.uk/journals/carfax/03075079.html

Chanock, K. (2000) Comments on essays: do students understand what tutors write? Teaching in Higher Education 5(1), 95-105. http://www.tandf.co.uk/journals/titles/13562517.asp

Higgins, R. et al. (2002) The conscientious consumer: reconsidering the role of assessment feedback in student learning. Studies in Higher Education, 27(1), 53-64.
Reports the initial findings of a research project investigating the meaning and impact of assessment feedback for students in higher education. While there were a number of barriers to the utility of feedback outside of students’ control, including the quality, quantity and language of comments, the students nonetheless seemed to read and value their tutors’ comments. They were motivated intrinsically and sought feedback which would help them to engage with their subject in a ‘deep’ way. http://www.tandf.co.uk/journals/carfax/03075079.html

Hounsell, D. (2007). Towards more sustainable feedback to students. In: Boud, D. and Falchikov, N., eds. Rethinking Assessment in Higher Education: Learning for the Longer Term. London: Routledge, pp. 101-113.

Price, M. (2007) Should we be giving less written feedback? Centre for Bioscience Bulletin No. 22, p.9.
In this short article, the limitations of written feedback are discussed, and suggestions for using it less but better are given. http://www.bioscience.heacademy.ac.uk/resources/bulletin.aspx

Swithenby, S. Feedback can be a waste of time. Open University: Challenging Perspectives on Assessment.
A short video which gives suggestions on how to give feedback which students will be able to use in future assessment.
http://stadium.open.ac.uk/perspectives/assessment/
