
Does the old school report have a future?


Students’ school reports were once the cornerstone of communication with parents: a much anticipated (sometimes dreaded) document that offered a final reckoning of a child’s achievement across a range of subjects each semester.

Over recent years, efforts have been made to improve student reports and make the information they provide more relevant and meaningful.

In this first of a series of articles on how schools communicate student learning progress, we examine some of the recent history of reporting in Australian schools and highlight some of the competing forces that have influenced current practices in student reporting.

Reporting in ‘plain language’

There is a surprising lack of research into reporting on student learning, but what little there is suggests that reporting formats and conventions in the past have not been found to be particularly helpful.

Dissatisfaction with student reports can be traced back at least to the early 1980s. In a 1982 transcript analysis of a conversation between an interviewer and the parents of a Year 8 student in a Victorian school, the lack of useful information in the child’s report became a central theme of the interview (Meiers, 1982). The boy’s report was entirely descriptive, with no numerical marks or letter grades; the article’s author described this approach to reporting as ‘both desirable and constructive’, partly because it avoided making comparisons between students. However, the parents expressed frustration that they could not compare their son’s achievements against the requirements of the course, his classmates, his age-group peers or some other expected standard. A lack of specificity about the quality of the boy’s work – as well as ‘quite a lot of comments like pleasant and co-operative and so on’ – meant the parents were left feeling that the report was vague at best.

By the mid-2000s, the perception of the quality of reports appeared not to have greatly improved. Don Watson (author of Death Sentence: The Decay of Public Language) bemoaned that a fashion for ‘management speak’ had entered public discourse, including the language of Australian schools. He wryly observed that for teachers reporting on student performance, 'the temptation is to create a technocratic or "professional" language containing a set of stock phrases that give an impression of objectivity or even measurement', but which in reality 'leaves readers only half conscious or befuddled' (2006).

In response to a growing perception that school reports had become opaque, and teacher comments effectively meaningless, in 2006 the Federal Government instituted a shake-up to the way schools reported to parents on their children’s learning. It proposed that funding to the states would be tied to a requirement that schools write 'plain language' reports, and that a child’s performance in her subjects, relative to the state’s year-level standard, must be graded using an A to E (or similar five-point) scale. In addition, schools were to place students within a quartile performance ranking in each of their classes. The ‘back to basics’ changes proposed by the Federal Government were intended to provide clearer messaging to parents about how their child was progressing at school.

A number of consultations with Australian parents seemed to support the view that student reports were not communicating useful information. For example, in Tasmania in 2006, a Reporting to Parents Taskforce confirmed that ‘many parents are dissatisfied with the standard of school reporting’. It recommended that parents should not be ‘addressed in the often turgid, sometimes impenetrable language of the curriculum’.

Similar findings also emerged from a New South Wales (NSW) parent consultation in the same year. That consultation took place after the introduction of the new reporting requirements, and it showed that support from parents for the Federal Government’s ‘plain language’ reports was emphatic. Parents involved in the consultation particularly thought that the re-introduction of A to E grades gave them a clearer picture of their child’s performance and achievement. This was because A to E grades were familiar to them and provided some sort of comparative metric.

Problems with an A to E scale

While the move from the government was welcomed by many parents, it was also met with some discontent – particularly among education professionals. Few argued with the need to improve the language used in reports, which was variously seen as jargonistic, obscurantist, and uninformative. However, the reintroduction of the A to E scale proved – and remains – a sticking point.

Several educators questioned the impact on student motivation because of the potential for reports to define students by their performance on the A to E scale. This was particularly a concern for students demonstrating average to low performance who might still make significant learning progress (albeit at a below-year-level standard), but whose sense of their own progress might be masked by constantly receiving low grades.

Others have since suggested that the use of the five-point scale to grade student work depends too much on assumed ‘common sense’ notions of achievement that – in reality – are based on arbitrary teacher judgments with very little consistency between teachers, classes and schools, offering parents little in the way of useful information (London, 2012).

Perhaps in a move to reduce the potential for inconsistent application of A to E grades, the Victorian State Government negotiated to apply the A to E scale not to the relative quality of a student’s performance against one year-level standard, but to a student’s demonstrated achievement against curriculum standards across year levels. Though perhaps a step in the right direction, aligning a child’s A to E grade with how far advanced they are in the curriculum standards was still seen in Victoria as problematic.

The Victorian Curriculum and Assessment Authority’s own Revised Curriculum Planning and Reporting Guidelines (2015) acknowledges the criticism of this reporting scale: ‘the use of the A–E scale, where C represents a “satisfactory” or “expected” level of achievement, does not sufficiently recognise student work at a high level’. The guidelines note that it is often ‘interpreted to mean students cannot achieve a B or an A result unless they undertake an accelerated learning program.’

Reporting progress vs achievement

When considering the utility and purpose of student reports, it is important to distinguish what it is exactly that teachers are asked to report. The words ‘achievement’ and ‘progress’ are often used interchangeably in student reports, as though they mean the same thing. They are indeed highly related concepts; it is often through tracking one’s achievements that a sense of one’s progress can be gauged.

However, if achievement is taken only to mean the grades, scores or marks received on summative assessment tasks, then progress often appears only to mean whether the child’s standard of achievement (their grades) is improving, maintaining or declining. Where progress is understood differently – to mean ‘increasing “proficiency” reflected in more extensive knowledge, deeper understandings and higher-level skills within a domain of learning’ (Masters, 2017) – an emphasis only on reporting achievement on summative assessments would give very little sense of a child’s progress from where they began.

Parents appear to value understanding how their child is progressing in a subject, as well as their performance in assessment tasks. In a landmark report to the Australian Federal Government (2000), based on data from interviews with over 500 parents, the authors reported that ‘Parents place a higher priority on receiving information about their children’s progress than any other type of information they receive from schools.’ It is important to note that the authors did not themselves make explicit what ‘progress’ means in this context, or how, in the minds of parents, receiving information on a child’s progress is distinguished – if at all – from merely receiving updated marks and grades. However, they did note that a principal concern for parents was that ‘there is a lack of objective standards [in student reports] that parents can use to determine their children’s attainment and rate of progress’, suggesting a desire for more than grades, scores and marks to be able to monitor their child’s growth in learning over time.

This concern was at the heart of much of the critique and commentary around the Federal Government’s imposition of the A to E scale. While letter grades satisfied parent demand to know how their child’s performance in a subject was judged against the familiar five-point scale, they also worked against parents’ own desire to know whether their child was making sufficient progress in learning. As Professor Geoff Masters, Chief Executive Officer of the Australian Council for Educational Research, later commented, ‘letter grades do not provide useful long-term pictures of student progress because they relate only to short-term success on defined bodies of taught content’ (2013).

The need for a method of reporting to parents that serves the purpose of ‘tracking a student’s development in an area of learning over time’ (Forster, 2005) has been under discussion for more than a decade and has gained much traction among educators in recent years. The NSW Education Minister has recently agreed to review the use of A to E grades in primary school reports, under a recommendation by the NSW Primary Principals’ Association.

The Association’s president, Phil Seymour, said that primary school principals are advocating for a reporting system based upon ‘an individual growth model that focuses on (a student’s) cumulative progress’ (McDougall, 2018). As Forster asserted back in 2005, a reporting system such as this, that would be effective in monitoring learning progress over time, would require significant technological infrastructure and teacher training. Not only would it require gathering evidence of student growth in skill acquisition or conceptual understanding, and the ability to track this evidence longitudinally; it would also require that teachers have a vivid, developmental understanding of ‘what it means to make progress in an area of learning’ (Forster, 2005).

Are we there yet?

Despite enduring interest in the form and usefulness of student reports, little research about them has been conducted over the last 15 years. How informative are today’s student reports as a result of the Federal Government’s imposed changes? Are student reports still considered important? What impact has technology had on teachers’ assessments of student learning and on school-parent communication? Have available technologies changed the nature of reports? And how well does student reporting – in whatever form it takes – really communicate student learning progress?

These questions form the basis of a current research project at the Australian Council for Educational Research focused on communicating student learning progress. Early analysis is uncovering what we suspect to be significant confusion about the role and purpose of student reporting. There appears to be enormous variation nationally in the format, content, foci and intended audiences for student reports.

Findings of the research will be reported in Teacher articles across 2018 and readers are invited to contribute to the project – see below for details.

Research staff involved in ACER's project focused on communicating student learning progress are seeking copies of student reports and any other forms of information that schools have related to communicating student progress. The research team is collecting such materials in order to understand the different forms these take, and determine ways that communicating student progress might be improved. All student reports and other documents received will be de-identified. No school names or student names will be used in the project. Readers are asked to submit examples of recent school reports or other relevant documents to assist in this research. Examples can be emailed as attachments to hilary.hollingsworth@acer.org.

References

Cuttance, P., & Stokes, S. A. (2000). Reporting on student and school achievement. Department of Education, Training and Youth Affairs.

Forster, M. (2005). A new role for school reports. EQ Australia. Retrieved from http://web.archive.org/web/20140326033928/http://eqa.edu.au/site/anewroleforschool.html

London, H. (2012). To grade or not to grade? Leadership in Focus, (26), 51–53.

Masters, G. N. (2013). Testing times: making the case for new school assessment. The Conversation. Retrieved from https://theconversation.com/testing-times-making-the-case-for-new-school-assessment-13076

Masters, G. N. (2017). Monitoring learning. In Educating Australia: Challenges for the decade ahead (p. 105). Carlton, Vic: Melbourne University Publishing.

McDougall, B. (2018, March 17). Better reports easy as ditching the ABCs. The Daily Telegraph, p. 11.

Meiers, M. (1982). School reports: The parent's perspective. English in Australia, (59), 20–25.

Ridgway, B., et al. (2006). Parents have their say on new student reports. Sydney: Department of Education and Training. Retrieved from https://www.det.nsw.edu.au/media/downloads/research/studentreport.pdf

Reporting to Parents Taskforce. (2006). Report to the Minister for Education Hon David Bartlett MHA. Hobart: Department of Education Tasmania. Retrieved from https://documentcentre.education.tas.gov.au/Documents/Reporting-To-Parents-Taskforce-Report.PDF

Victorian Curriculum and Assessment Authority. (2015, December). Victorian Curriculum F–10: Revised curriculum planning and reporting guidelines. Retrieved from http://www.vcaa.vic.edu.au/Documents/viccurric/RevisedF-10CurriculumPlanningReportingGuidelines.pdf

Watson, D. (2006). School reports. In Reporting to Parents Taskforce, Report to the Minister for Education Hon David Bartlett MHA. Hobart: Department of Education Tasmania.

Think about the way your school delivers reports to parents and carers. Does technology impact the way you communicate results with them? Has technology changed the nature of reporting?

How well does student reporting in your school – in whatever form it takes – really communicate student learning progress?

Dr Hilary Hollingsworth and Jonathan Heard will be presenting at the 2018 Research Conference. Their session is titled Communicating student learning progress: What does that mean, and can it make a difference?

Tracey 03 April 2018

Part of the problem is that we’re conflating assessment, feedback, achievement, progress, reporting, growth, all into one hodgepodge “report”. All have an important and discrete place within the educational system.

Chris 05 April 2018

I find it intriguing that we have not fully realised the affordances that technology offers in relation to real-time (just-in-time) formative assessment practices that research tells us make a significant impact on student learning (Wiliam, Black, Hattie).
I have a pre-school age child whose school uses a “reporting/communication” tool where daily updates are captured by the educators, including work samples, linked outcomes, photos of my child engaged in learning tasks etc. The question remains: is traditional (semester) reporting an artefact of a bygone era? If we adopt this type of system, I would say yes.

Gabee Leone 15 April 2018

Ongoing reporting with samples to support it, along with indications for future learning, is what our parents want! The issue is there is not a reporting package available to suit the school’s needs! Effective teaching and learning depends on clarity, ownership & transparency! Rubrics have assisted and supported learning. How can rubrics assist reporting?
I agree – really understanding progression of learning is integral to reporting!
