Eslake gets it wrong

Recently, economist Saul Eslake has been telling local media that the ACT spends too much on public education, with no better outcomes to show for it. AEU members have been rightly baffled by these claims.

We’ve worked through the evidence presented by Eslake and identified where he’s got it wrong. We’ve written to the committee that commissioned Eslake’s work to ask them to fact-check his report.

The AEU will always stand up to attacks on public education.

The background

The Select Committee on the Fiscal Sustainability of the ACT commissioned Saul Eslake to independently review the Territory’s finances. Eslake produced an interim report to inform the committee’s investigations.

There has been a fair bit of media attention about this report, and Mr Eslake has been providing comment to news outlets about his claims.

The interim report claims that “the ACT spends more per student on school education than any other state or territory – but doesn’t get better outcomes”.1 Here’s what we found when we dug into the data.

Looking at the facts

Mr Eslake’s report gets both halves of the story wrong. It claims ACT schools produce poor outcomes. This is incorrect: ACT students are among the highest performing in the country on every major national and international assessment. It claims ACT schools spend too much. This is also incorrect: the ACT is the only jurisdiction funding its public schools to the national minimum standard, and even that is not enough to meet the needs our schools are expected to address.

But there is a deeper problem with the Eslake analysis. The strong results that ACT public schools achieve are not the product of a generously funded system. They are the product of a workforce that goes above and beyond, every day, to deliver quality education despite chronic under-resourcing. In the AEU’s 2025 survey of 1,268 members, 92% said schools lack adequate resources to implement the Education Directorate’s strategic initiatives.2 Eighty per cent cannot complete their work during work hours.3 Teachers spend an estimated $2.58 million a year of their own money on classroom resources.4 More than half have their scheduled release time interrupted by behaviour incidents or give it up to cover for absent colleagues.5

The Canberra community can clearly see that our schools are hanging by a thread. Five schools could not operate a full five-day timetable in 2025 due to acute under-staffing, and we have received confirmation that two remain in that situation this year. For those that do open all week, collapsed and split classes are rife. Just last month, two special schools were closed for seven days because the system lacks the flexibility to deal with infrastructure issues.

ACT public education is not a system with money to spare. It is a system being held together by the professionalism and goodwill of the people who work in it. The Eslake report’s suggestion that education spending should be cut would not trim fat. It would cut into the bone of a system that is already under strain and put at risk the very outcomes that make the ACT’s public schools among the best in Australia.

Our educators deserve better. ACT students and their families deserve better.

1. The claim that the ACT “doesn’t get better outcomes” is wrong

The headline on slide 29 of the Eslake report says the ACT “spends more per student on school education than any other state or territory – but doesn’t get better outcomes.” This is the central claim of the education section of the report. It relies on two measures: Year 12 certification rates and Year 9 NAPLAN scores by parental education. Both are problematic.

1.1 What the evidence actually shows: ACT students are among the highest performing in the country

The right place to start is with what the data actually shows about student outcomes in the ACT using measures that directly assess what students in ACT schools can do.

NAPLAN: The 2025 NAPLAN results confirm that the ACT remains one of Australia’s top-performing jurisdictions. In the most recent results, ACT students achieved the highest mean scores in the nation for reading across Year 5, Year 7, and Year 9.9 In terms of student proficiency, the proportion of ACT students performing at ‘Strong’ or ‘Exceeding’ was higher than or similar to the national average in all 20 NAPLAN domains, and statistically significantly above the national average in seven of them.10 In 2024, the ACT was the joint-leader for the most top-performing domains in the country, recording the highest mean scores in seven different categories.11

PISA: In the 2022 Programme for International Student Assessment, ACT students performed at a higher level in reading literacy than students in any other Australian jurisdiction, and a higher proportion met the national proficient standard than in any other jurisdiction.12 The same was true for mathematical literacy.13 PISA also found that the relationship between socioeconomic status and performance was similar across all jurisdictions, and that among the most socioeconomically advantaged students, scores for the ACT, NSW, Victoria and Western Australia were almost identical – undermining the suggestion that the ACT underperforms relative to its demographic profile.

PIRLS: In the 2021 Progress in International Reading Literacy Study, the performance of ACT Year 4 students was statistically significantly higher than that of students in every other Australian jurisdiction.14 Close to 90% of ACT students met the proficient standard, compared with just over 80% in Victoria and Queensland, and over 70% in other jurisdictions.15 The ACT had the highest proportion of students at the ‘Advanced’ benchmark (19%) and the equal lowest proportion who did not reach the ‘Low’ benchmark.16 Between 2016 and 2021, the proportion of ACT students not reaching the ‘Low’ benchmark halved – from 7% to 3% – a significant improvement not seen in any other jurisdiction.17 If the ACT were assessed as a country, it would rank alongside top-performing nations such as England, Finland, and Poland.18

Retention and completion: Within the ACT public school system specifically, the Education Directorate’s Annual Report 2024–25 shows that 89% of Year 12 students received a Year 12 Certificate, consistent with previous years.19 The ACT public school system has the highest apparent retention rate (Year 10 to Year 12) of any jurisdiction: 95.5% for full-time students in 2024, compared with a national average of 74.3%.20 When part-time students are included, the ACT’s rate has been above 100% in every year from 2015 to 2024, meaning more students are completing Year 12 than started Year 10 as students transfer into the system.21

By every measure that directly assesses what ACT students can do, whether national or international, the ACT is at or near the top of the country. This is the context that is entirely absent from the Eslake report.

The broader demographic picture is consistent with this. The ABS Survey of Education and Work shows that 92% of ACT 20–24 year olds have attained at least Year 12 or equivalent (compared with 86.2% nationally), and 94.1% have attained Year 12 or AQF Certificate II or above (compared with 90.5% nationally).22 The Better and Fairer Schools Agreement (BFSA) adopts Key Performance Measure (KPM) 7(a) – the proportion of 20–24-year-olds with at least Year 12 or AQF Certificate II or above – and sets a national target of 96% by 2031. The ACT, at 94.1%, is already closest to reaching it.23

We note that these demographic figures are influenced by the characteristics of people who move to Canberra (for university and public service employment, for instance) and do not directly measure the performance of the ACT school system. We include them here because the BFSA adopts them as the agreed national benchmark, and because they are broadly consistent with the system-level evidence above.

1.2 Why the measure Mr Eslake used does not reliably reflect ACT performance

Mr Eslake did not use any of these measures. He used the Year 12 certification rate from the Productivity Commission’s Report on Government Services 2026 (Table 4A.56), which shows the ACT at 73.7% compared with a national average of 76.3% in 2022.24 This tells the opposite story. It is worth understanding why.

This certification rate divides the number of Senior Secondary Certificates issued by the “potential Year 12 population.” That denominator is not the number of students actually in Year 12. It is the ABS estimated resident population aged 15–19, divided by five.25 This crude demographic figure creates specific problems for the ACT:

  • Canberra’s 15–19 year old resident population is inflated by large numbers of young people who move to the ACT for tertiary study (at ANU, UC, CIT, ADFA and other institutions) at ages 18–19. They swell the denominator without having been in the ACT school system.

  • Cross-border dynamics add further distortion. Students living in surrounding NSW postcodes may attend ACT schools or vice versa, and the numerator and denominator come from different data sources that do not necessarily align.

Crucially, the certificates counted in the numerator come from all school sectors – government, Catholic, and independent. This means the certification rate is an all-sector measure. Yet on the very same slide, Mr Eslake compares it against government school expenditure only. He is measuring what all schools produce but costing only what public schools spend. This is not a like-for-like comparison.
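The distortion described above can be illustrated with a back-of-the-envelope sketch. The figures below are hypothetical, chosen only to show the mechanics – the RoGS rate divides certificates issued by the estimated resident population aged 15–19 divided by five, so in-migration of 18–19 year olds who were never in the school system drags the measured rate down:

```python
# Illustrative only: all figures below are hypothetical, not ACT data.
# The RoGS-style certification rate divides certificates issued by a
# demographic estimate: resident population aged 15-19, divided by 5.

def certification_rate(certificates: int, pop_15_19: int) -> float:
    """Certificates issued / 'potential Year 12 population'."""
    potential_year_12 = pop_15_19 / 5
    return certificates / potential_year_12

# A hypothetical cohort: 4,000 certificates, 25,000 residents aged 15-19.
base = certification_rate(4_000, 25_000)           # 0.80

# Add 3,000 hypothetical 18-19 year olds who moved in for tertiary study.
# They swell the denominator, so the measured rate falls even though
# nothing about school outcomes has changed.
with_migration = certification_rate(4_000, 28_000)

print(f"{base:.1%} -> {with_migration:.1%}")       # 80.0% -> 71.4%
```

The same certificate count produces a lower “rate” purely because the population estimate grew – which is exactly the pattern a jurisdiction with large tertiary in-migration should expect.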

The Productivity Commission’s own Indicator Framework flags the certification rate data as “not comparable and/or not complete” across jurisdictions. The RoGS states that “this indicator should be interpreted with caution” because “assessment, reporting and criteria for obtaining a Year 12 or equivalent certificate varies across jurisdictions” and because “students completing their secondary education in [TAFE] institutes are included in reporting for some jurisdictions and not in others.”26 The Department of Education is currently reviewing the methodology. In other words, the very source Mr Eslake relies on warns against the cross-jurisdictional comparison he makes.

It is important to distinguish between two measures that are sometimes confused. The Productivity Commission’s certification rate divides the number of Senior Secondary Certificates issued in the ACT by a demographic estimate of the potential Year 12 population. The BFSA’s attainment target (KPM 7(a)) is an entirely different measure: the proportion of 20–24-year-olds who have attained at least Year 12 or equivalent or AQF Certificate II or above, drawn from the ABS Survey of Education and Work. The BFSA does not use the certification rate as a performance measure. The National Report on Schooling in Australia 2024 is explicit that even the simpler Year 12 completion proportion “is not, by itself, a KPM for schooling”.27 On the measure the BFSA does use, the ACT performs well above the national average: 94.1% compared with 90.5% nationally in 2024.28

In short: the measure Mr Eslake chose is the one measure that makes the ACT look bad, and it does so because of well-documented statistical distortions, not because ACT students are failing.

1.3 The SES-based comparisons are unreliable because the national model doesn’t work for the ACT

Slide 29 also shows Year 12 certification rates for students from “high SES households,” with the ACT appearing to perform dramatically worse than the national average (72.2% vs 82.9% in 2022).29 This is a surprising figure for a jurisdiction where 92% of 20–24 year olds have Year 12 or equivalent.

The explanation is that the measure of socio-economic status used in these comparisons has recognised problems in the ACT context. ACARA’s own Index of Community Socio-Educational Advantage (ICSEA) and Socio-Educational Advantage (SEA) calculations have been under formal review since 2023. A representative of ACARA confirmed in correspondence to the AEU that:30

  • Canberra sits at such an extreme of socio-educational advantage that the national model struggles to accurately predict student performance, with the result that too much is expected of ACT students.

  • The way SEA scores are dispersed across Canberra’s suburbs inflates error rates for ACT schools, particularly government schools, making it much harder for ACT schools to show performance significantly exceeding the predicted result.

  • The problem is being addressed through a proposed revision to the national model, but this requires agreement from all states, territories and sectors.

In plain terms, the measure that Mr Eslake uses to argue ACT schools underperform for high-SES students is subject to a formal national review because ACARA knows it doesn’t work properly in the ACT.

1.4 NAPLAN comparisons need more nuance

Slide 30 presents Year 9 NAPLAN scores by parental education, showing ACT students scoring at or slightly below the national average when parental education is held constant.31 On this, there are a few important things to understand:

  • The differences are small – typically 2 to 7 points on a scale with a mean around 550. For Year 9 Reading among students whose parents hold a Bachelor degree or above, the ACT score was 600.7 vs 600.0 nationally – essentially identical.32

  • The ACT’s confidence intervals are much wider than the national figures because of the much smaller sample size (the ACT has roughly 2% of Australia’s student population). The differences are not statistically significant.33

  • The same ACARA model issues described above directly affect these comparisons.34

  • Critically, the RoGS data used by Mr Eslake does not disaggregate NAPLAN results by school sector (public vs non-government). On the very same slide, he shows government-only expenditure. He is comparing public school spending against all-sector outcomes. This is misleading.35

The results are real, but they come at a cost to the workforce

The evidence above demonstrates that the ACT’s public school system delivers strong outcomes by any credible measure. But it would be wrong to read this and conclude that all is well. These results are achieved not because the system is adequately resourced, but because educators absorb the shortfall.

Four in five AEU members report being unable to complete their work during work hours. Nearly 60% deal with complex student needs without adequate support on a weekly basis. One in three gives up scheduled release time to cover for absent colleagues. The system’s strong outcomes are a credit to the people in it – not evidence that funding is sufficient, let alone excessive.

2. The per-student expenditure comparison is misleading

Slide 29 presents recurrent expenditure on government schools per FTE student in 2023-24, showing the ACT at around $26,000 including user cost of capital, compared with a national average of about $22,000.36 From this, Mr Eslake concludes ACT school spending is excessive. There are several things wrong with this.

2.1 Comparing the ACT to underfunded systems and concluding we spend too much

Mr Eslake’s chart shows the ACT spending more per government school student than other states and territories, and invites the reader to conclude that ACT spending is excessive. But the chart actually shows something else: that every other state and territory (except the NT) spends less than the ACT because they are funding their public schools below the national minimum.

The Schooling Resource Standard is a per-student funding amount – a base amount plus needs-based loadings for disadvantage, disability, school size and other factors. It was developed following the Gonski Review using data from reference schools where at least 80% of students achieved above the national minimum standard.37 In other words, the SRS represents an estimate of how much funding a school needs for only four in five students to reach minimum standards. It is a floor, not a ceiling.
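The base-plus-loadings structure described above can be sketched in a few lines. The loading names and dollar figures here are hypothetical placeholders, not actual SRS amounts – the point is only that the SRS is a per-student base with needs-based amounts added on top:

```python
# Illustrative only: loading names and dollar figures are hypothetical.
# The real SRS sets a national base amount plus needs-based loadings.

def srs_per_student(base: float, loadings: dict[str, float]) -> float:
    """SRS-style entitlement: a base amount plus needs-based loadings."""
    return base + sum(loadings.values())

entitlement = srs_per_student(
    base=14_000.0,
    loadings={
        "disability": 2_500.0,
        "socio_educational_disadvantage": 1_200.0,
        "small_school_size": 800.0,
    },
)
print(entitlement)  # 18500.0
```

Because the loadings scale with need, two systems spending different amounts per student may both be exactly at their SRS entitlement – which is why raw per-student spending comparisons say nothing about whether a system is over- or under-funded.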

The ACT is the only jurisdiction in Australia where public schools are funded to 100% of the SRS.38 Most other jurisdictions remain significantly below this benchmark. When Mr Eslake lines up ACT spending against states that are chronically underfunding their schools and concludes the ACT is the outlier, he has the analysis backwards. The ACT is not spending too much. The other states are spending too little – and they know it, which is why the BFSA commits them to reaching full funding over the decade to 2034.39

This does not mean there is no case for scrutiny of how education funding is used. But presenting a chart that compares a fully funded system to underfunded ones, without any acknowledgement of the national funding standard that explains the difference, is not a serious basis for policy.

2.2 The ACT Government spends more from its own budget because it receives less from the Commonwealth

Mr Eslake’s chart shows only state and territory government expenditure per student. It does not show Australian Government payments. When both are considered, the picture changes.

The ACT receives the lowest Australian Government payment per government school student of any jurisdiction in the country: $3,562, compared with a national average of $4,277.40 Tasmania receives $4,619, Queensland $4,396, South Australia $4,353 – all significantly more per student than the ACT.41

Because the ACT receives less from the Commonwealth, the ACT Government picks up a larger share of the total cost from its own budget. When total government expenditure (Australian plus state and territory) is considered, the ACT ($29,635 per student) is second to the Northern Territory ($32,223) but within range of NSW ($27,953) and Tasmania ($27,975) – a gap of around $1,700, not the $4,000-plus gap that Eslake’s state-and-territory-only chart implies.42
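The arithmetic here can be checked directly from the RoGS figures cited above:

```python
# Per-student figures, 2023-24, as cited in the text (RoGS table 4A.32).
act_total = 29_635          # Australian + territory government expenditure
act_commonwealth = 3_562    # lowest Commonwealth payment in the country
nsw_total = 27_953
tas_total = 27_975

# What the ACT Government funds from its own budget:
act_territory_only = act_total - act_commonwealth
print(act_territory_only)     # 26073 -- the ~$26,000 on Eslake's chart

# On the all-governments basis, the gap shrinks to around $1,700:
print(act_total - nsw_total)  # 1682
print(act_total - tas_total)  # 1660
```

The roughly $4,000 gap on Eslake’s chart is largely an artefact of which level of government writes the cheque, not of how much is spent on each student.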

In other words, the ACT Government’s per-student expenditure is higher partly because it is compensating for receiving less Commonwealth funding per student than any other jurisdiction. That is not evidence of overspending. It is the predictable consequence of the funding structure.

The BFSA is intended to address this imbalance. Under the agreement, the Commonwealth will increase its share of the SRS for government schools from 20% to 25% by 2034, with states and territories contributing at least 75%. For the ACT, which is already at 100% of the SRS, this means the Commonwealth will take on a larger proportion of the cost that the ACT Government currently bears alone. This is a welcome structural reform, but it underscores the point: the ACT’s higher state/territory expenditure per student reflects a funding structure in which the ACT Government has been compensating for lower Commonwealth contributions, not a spending problem.

2.3 The figure includes a notional cost that inflates the ACT

The RoGS figure Mr Eslake uses includes the “user cost of capital” – an imputed economic cost representing the opportunity cost of capital tied up in school buildings and land. This is not money the government actually spends. It is a theoretical figure, and it is disproportionately inflated for jurisdictions with high land values.

2.4 Small systems cost more per student – that’s a structural fact, not waste

The ACT’s government school system is just 6% of the size of NSW’s.43 A small system must still provide assessment frameworks, curriculum support, administrative systems, specialist services, and regulatory functions, and those fixed costs are spread across far fewer students. That is a structural reality of being a small jurisdiction, not evidence of profligacy.

3. The argument about private school share is wrong

On slide 28, Mr Eslake argues the ACT “ought to be able to spend less (per head) on education given a slightly smaller school-age cohort and greater use of private schools.”44

This argument does not follow. The ACT Government has an obligation to properly fund every student in the public system. The existence of a large private school sector does not reduce the needs of students in public schools, nor the cost of meeting those needs. A child with disability, a student learning English as an additional language, a young person experiencing family crisis – none of these students cost less to educate because a neighbouring family chose a private school.

What the high private school share does mean is that ACT public schools educate a student body with a particular profile. When 40% of students are in the non-government sector,45 the public system absorbs a disproportionate share of students with complex needs, students from disadvantaged backgrounds, and students with disability – the students who cost more to educate well. Suggesting the government should spend less per student because other students attend private schools ignores this reality entirely.

4. What the report ignores: the infrastructure divide

While Mr Eslake implies the ACT is over-investing in education, the reality on the ground in ACT public schools tells a very different story.

The AEU’s 2025 State of Our Schools survey of ACT public school principals found that 49% say external maintenance of their buildings is inadequate, 52% say internal maintenance is inadequate, 61% say their school lacks purpose-built structures to support students with disability, and only 36% are able to offer the full curriculum with their current infrastructure.46 Demountable classrooms across ACT public schools increased by 68% between 2022 and 2025, reaching 257 across 92 schools.47 Principals describe buildings identified as not fit for purpose, schools with no working lift for students with disability, mould requiring constant shutdowns, and preschool toilets with no enclosed roof.48

This is not evidence of a system that is swimming in money. It is evidence of a system under strain.

The AEU’s recent report The Building Divide in Australian Schools documents the wider context. Total capital investment in ACT private schools has outpaced investment in public schools by $293.7 million over the decade to 2023, and by $93.6 million in 2023 alone.49 In 2023, private school students in the ACT received twice the capital investment per student that public school students received.50 The five highest-spending private schools spent more on capital works than 87 of 89 non-special public schools combined.51

Much of this divide is driven by Commonwealth policy. The Coalition ceased Commonwealth capital funding to public schools in 2017, while expanding the Capital Grants Program reserved for private schools.52 The ACT Government itself joined with other state and territory governments in 2024 to call on the Commonwealth to address this, writing that jurisdictions are “experiencing unprecedented pressures from ongoing population growth, resulting in the need to build new schools, support additional infrastructure and upgrade existing facilities.”53

If we’re genuinely interested in the efficiency and adequacy of education spending in the ACT, the state of public school infrastructure is where the real pressure lies – not the recurrent expenditure comparisons Mr Eslake presents.

Conclusion

The Eslake Interim Report presents a picture of ACT school education that is not supported by the evidence. It uses the wrong outcome measures, relies on a socio-economic model that ACARA itself says doesn’t work for the ACT, presents expenditure data without essential context, and draws conclusions that the underlying data does not support.

But the deeper concern is where this analysis leads. If we accept the premise that ACT public schools spend too much for too little, the logical conclusion is to cut. That would be a serious mistake.

ACT public schools deliver among the best results in the country. This is not because they are over-funded, but because the workforce goes above and beyond what any reasonable employer should expect. Our members work unpaid hours, spend their own money on classroom resources, manage increasingly complex student needs without adequate specialist support, and teach in buildings that are ageing, overcrowded and frequently unfit for purpose. Ninety-two per cent of our members say their schools are not adequately resourced to deliver on the initiatives the system has committed to. This is not a system with room to cut. It is a system that needs its investment protected and, in many areas, increased.

 


[1] Saul Eslake, The Fiscal Sustainability of the Australian Capital Territory: Interim Report to the Legislative Assembly Select Committee on the Fiscal Sustainability of the ACT (Melbourne: Corinna Economic Advisory, 27 February 2026).

[2] AEU ACT Branch, Our Voices, Together: What AEU Members Say about Working in ACT Public Schools (Canberra: AEU ACT Branch, 2025), 14.

[3] AEU ACT Branch, Our Voices, Together, 12.

[4] AEU ACT Branch, AEU ACT Budget Submission 2024–25 (Canberra: AEU ACT Branch, 2024), 1. Estimate based on AEU survey of ACT public school teachers.

[5] AEU ACT Branch, Our Voices, Together, 11.

[6] “Eslake review proves policy failures to blame for the ACT debt crisis,” The Canberra Times, March 5, 2026.

[7] Benjamin Seeder, “Should Tasmania Close or Merge Schools to Help Improve the Bottom-line?,” The Advocate, July 9, 2025; “Saul’s Call: Close the Colleges,” Tasmanian Business Reporter, November 30, 2017.

[8] David Genford, “Respecting the Profession,” Australian Education Union Tasmania Branch, accessed 6 March 2026, https://aeutas.org.au/respecting-the-profession.

[9] Australian Curriculum, Assessment and Reporting Authority (ACARA), 2025 NAPLAN National Results Dataset (Sydney: ACARA, 2025).

[10] ACARA, 2025 NAPLAN Dataset.

[11] ACARA, 2025 NAPLAN Dataset, 2024 Results.

[12] Lisa De Bortoli, Catherine Underwood, and Sue Thomson, PISA 2022: Reporting Australia’s Results, Volume I: Student Performance and Equity in Education (Melbourne: Australian Council for Educational Research, 2023).

[13] De Bortoli, Underwood, and Thomson, PISA 2022, mathematical literacy results by jurisdiction.

[14] Kylie Hillman et al., Progress in International Reading Literacy Study: Australia’s Results from PIRLS 2021 (Melbourne: Australian Council for Educational Research, 2023), table 2.1.

[15] Hillman et al., PIRLS 2021.

[16] Hillman et al., PIRLS 2021, fig. 2.5.

[17] Hillman et al., PIRLS 2021.

[18] Hillman et al., PIRLS 2021, fig. 2.7 and international rankings.

[19] ACT Education Directorate, Annual Report 2024–25, 60.

[20] Productivity Commission, Report on Government Services 2026, pt. 4: School Education, table 4A.26.

[21] Productivity Commission, Report on Government Services 2026, table 4A.26.

[22] ACARA, Student Attainment dataset, National Report on Schooling in Australia data portal, sourced from ABS, Education and Work, Australia, May 2024.

[23] Australian Government, Better and Fairer Schools Agreement.

[24] Productivity Commission, Report on Government Services 2026, pt. 4: School Education, table 4A.56.

[25] Productivity Commission, Report on Government Services 2026, table 4A.56.

[26] Productivity Commission, Report on Government Services 2026, pt. 4: School Education, Indicator Framework and Indicator Results, “Attainment.”

[27] ACARA, National Report on Schooling in Australia 2024, chap. 8, sec. 8.2.

[28] ACARA, Student Attainment dataset, National Report on Schooling in Australia data portal, sourced from ABS, Education and Work, Australia, May 2024.

[29] Productivity Commission, Report on Government Services 2026, table 4A.56.

[30] Steve Croft (Senior Manager, Reporting, ACARA) to Bianca Hennessy (AEU ACT Branch), 3 November 2025, re: ACARA Enquiry #00011917, “Review of SEA for ACT.”

[31] Productivity Commission, Report on Government Services 2026, pt. 4: School Education, tables 4A.37, 4A.41, and 4A.45.

[32] Productivity Commission, Report on Government Services 2026, tables 4A.37, 4A.41, and 4A.45.

[33] Productivity Commission, Report on Government Services 2026, tables 4A.37, 4A.41, and 4A.45.

[34] Croft to Hennessy, 3 November 2025.

[35] Productivity Commission, Report on Government Services 2026, tables 4A.37, 4A.41, and 4A.45.

[36] Productivity Commission, Report on Government Services 2026, pt. 4: School Education, table 4A.32.

[37] David Gonski et al., Review of Funding for Schooling: Final Report (Canberra: Australian Government, December 2011).

[38] Australian Government Department of Education, SRS funding data.

[39] Australian Government, Better and Fairer Schools Agreement.

[40] Productivity Commission, Report on Government Services 2026, table 4A.32 (Australian Government payments for school education services excluding capital grants, per FTE government school student, 2023-24).

[41] Productivity Commission, Report on Government Services 2026, table 4A.32.

[42] Productivity Commission, Report on Government Services 2026, table 4A.32 (Australian and state and territory government expenditure, per FTE government school student, including user cost of capital, 2023-24).

[43] ABS, Schools, Australia, 2025, table 43a. ACT government school FTE: 46,104; NSW government school FTE: 773,377.

[44] Eslake, Fiscal Sustainability of the ACT, slide 28.

[45] ABS, Schools, Australia, 2024, table 43a. ACT non-government FTE: 31,019 of 76,916 total (40.3%).

[46] Australian Education Union (AEU), The Building Divide in Australian Schools (Melbourne: AEU, 2025), ACT chapter, citing AEU, 2025 State of Our Schools Survey.

[47] AEU, Building Divide, ACT chapter.

[48] AEU, Building Divide, ACT chapter.

[49] AEU, Building Divide, ACT chapter, sourced from ACARA Finance dataset 2009–2023.

[50] AEU, Building Divide, ACT chapter.

[51] AEU, Building Divide, ACT chapter.

[52] AEU, Building Divide, chap. 3.

[53] Joint letter from state and territory education ministers to the Commonwealth, 2024, cited in AEU, Building Divide, 103.

 
