Schools May Be Gaming NAPLAN to Inflate Reported Academic Performance

Lower-performing students told to sit out tests.
Aug 16, 2024
Image: The NAPLAN test remains controversial.

Schools might be strategically manipulating participation in NAPLAN to improve their publicly reported results.

The research paper, Unintended consequences of school accountability reforms: Public versus private schools, authored by researchers from UNSW Business School and the University of Melbourne and published in the Economics of Education Review, found that some schools are more likely to exclude lower-performing students from testing - a move that inflates a school’s overall test results.

Private schools were found to be more likely to manipulate their testing pool. In addition, schools with lower initial test scores, relative to other schools teaching similar student bodies, showed higher rates of non-participation in NAPLAN tests following the public release of performance data on the My School website.

The study said this increase in non-participation was largely driven by formal parental withdrawal, a process that parents can initiate by applying directly to the school principal.

Students may be withdrawn from the testing program by their parent/carer for reasons such as “religious beliefs or philosophical objections to testing”, according to the NAPLAN website, which states that this is decided by parents/carers in consultation with their child’s school.

“We found that the fraction of students withdrawn from testing in years after the 2010 launch of My School went up, while the fraction who were absent or exempt from testing remained roughly steady - with poorly-performing students far more likely than other students to be withdrawn from testing,” said Gigi Foster, a Professor in the School of Economics at UNSW Business School, who co-authored the research paper together with University of Melbourne Associate Professor Mick Coelli.

“We further find that this increase in withdrawal rates occurred in schools that were initially reported on My School to be poor performers relative to peer schools,” she said.

This behaviour, commonly referred to as “gaming the system,” undermines the objective of NAPLAN to provide a fair assessment of student performance across Australia, Prof Foster asserted.

Private Schools Are More Likely to Game the System
The research identified a sector-specific response, with private schools more likely to adjust their testing pools compared with public or Catholic schools. This indicates a higher tendency among private schools to manipulate participation rates to maintain their reputations.

More specifically, the research found that students at private schools are more than twice as likely as their peers at state schools to be pulled out of NAPLAN tests in subsequent years if they received low scores in previous years.

“We provide some suggestive evidence that these higher rates of withdrawal in lower-performing schools were more prominent in independent private schools - which are famous for charging parents a pretty penny for the privilege of enrolling their kids - than in public schools,” said Prof Foster.

“These findings are consistent with a situation in which the increased withdrawal is used as a tactic to manipulate the image of a school’s quality: excluding more poor performers from testing makes the school look better than it otherwise would look on My School.”
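
To make the arithmetic behind this tactic concrete, here is a minimal sketch using made-up scores for a hypothetical ten-student cohort (the numbers are illustrative, not from the study): withdrawing the two weakest performers lifts the reported mean even though no student’s actual skills have changed.

```python
# Illustrative only: made-up NAPLAN-style scores for a hypothetical
# ten-student cohort, not data from the study.
scores = [350, 380, 400, 420, 450, 470, 490, 510, 530, 560]

full_mean = sum(scores) / len(scores)

# Withdraw the two lowest performers from the testing pool.
tested = sorted(scores)[2:]
trimmed_mean = sum(tested) / len(tested)

print(f"Mean with all students tested:     {full_mean:.0f}")     # 456
print(f"Mean after withdrawing bottom two: {trimmed_mean:.0f}")  # 479
```

The school’s reported average rises by roughly 23 points in this toy example purely through selective participation, which is the distortion the researchers describe.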

How Parents Play a Role in NAPLAN Testing Participation
Parents have reported pressure from schools to withdraw children from NAPLAN testing in surveys conducted by state education authorities. The popular press has also reported claims by parents that schools instructed children not to sit the NAPLAN tests "in order to boost their chances of obtaining higher overall scores", the research paper stated.

NAPLAN testing protocols indicate that withdrawals are intended to address issues such as religious beliefs and philosophical objections to testing, but the research paper observed that providing a considered explanation for withdrawal appears to be unnecessary.

Furthermore, the process of applying for withdrawal was not actively promoted by governments or the testing authority. “We believe that knowledge of the process was provided to parents directly by schools,” said the researchers, who noted that parents apply for withdrawal directly to the school, not to the testing authority or government.

“The volume of such reports led the NSW Minister for Education to warn that principals or teachers found encouraging children not to sit the tests may face disciplinary action.”

Unintended Consequences of Manipulating NAPLAN
Prof Foster explained that she and co-author A/Prof Coelli were familiar with overseas research evaluating government programs designed to make school performance more transparent to parents, and thereby to raise the accountability of schools.

“The idea of such programs is to help improve educational outcomes, since with more information, parents would be expected to select higher-performing schools for their children to attend, thereby exerting competitive pressure on low-performing schools to either lift their game or close,” she said.

However, she said, these types of programs could have unintended consequences if not carefully designed. Because low-performing pupils at poorly performing schools were more likely not to sit the NAPLAN tests, Prof Foster said the My School program may have had the unintended consequence of hiding from public view the weak English and numeracy skills of some of Australia’s weakest school students.

“This is somewhat ironic since the whole point of a school accountability program is generally to improve the education that students - and particularly disadvantaged students, whose parents may have few sources of information about school quality apart from the program - receive,” she said.

Data Analysis and Policy Implications
In conducting their analysis, the researchers used data from the My School website spanning 2008 to 2015. This data set included standardised test scores and participation rates from a balanced panel of 6,981 schools over eight years.
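
As an illustration of the kind of before-and-after comparison such a panel supports, the following sketch computes average withdrawal rates by school sector before and after the 2010 launch of My School. The column names ("year", "sector", "withdrawn_pct") and the input file are hypothetical, and this is not the paper’s actual econometric specification.

```python
import pandas as pd

# Hypothetical input: one row per school-year, with a "sector" label
# (e.g. public / Catholic / independent) and the percentage of
# students withdrawn from testing that year.
panel = pd.read_csv("my_school_panel.csv")

# Flag years after the public release of performance data (2010 launch).
panel["post_my_school"] = panel["year"] >= 2010

# Average withdrawal rate by sector, before vs. after the launch.
summary = (
    panel.groupby(["sector", "post_my_school"])["withdrawn_pct"]
         .mean()
         .unstack("post_my_school")
)
summary["change"] = summary[True] - summary[False]
print(summary)
```

Under the study’s findings, one would expect the "change" column to be largest for independent private schools that were initially reported as poor performers.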

While the intention behind publicising school performance data was to drive improvements and transparency, Prof Foster said, the results indicated that it could also incentivise undesirable behaviours.

“The government could fix a maximum percentage of students who may be excluded from testing on any given testing day, although the monitoring costs of this may be prohibitive and it may result in further unintended consequences, such as schools reducing the number of allowed exclusions amongst average performers - perhaps then creating pressure on sick or injured students to sit the tests - in order to create more ‘slots’ for weak performers to sit out the tests,” she said.

“Another option would be for the My School site to report the percentages of students excluded from testing at each school, while adding a note informing parents that a comparatively high fraction of students excluded from testing may signal that a school’s true average performance is lower than what it appears to be on the My School tables.”

Image by Aedrian Salazar