Anthropic is coming to Australia. What does that mean for electricity prices?

Anthropic, the creators of Claude, are opening an office in Sydney as part of their expansion to Australia and New Zealand, which rank 4th and 8th globally on per capita Claude usage. What does this mean for Anthropic and Australia?

The most common topics people use Claude for in Australia. The full data explorer here is worth playing with.

What Anthropic is actually doing here

Initially, this just means that Anthropic will have a bigger focus on supporting customers of their products in Australia and New Zealand, such as Canva, Quantium, and Commonwealth Bank of Australia. However, Anthropic is also exploring expanding their compute capacity in Australia through third-party partners, and is “in early conversations about longer-term infrastructure in the region”. All of this would mean more data centre load in Australia, probably on top of the increased load we’re already expecting.

There are two main types of AI data centre load: training and inference. Training is the computationally intensive process of building the model by feeding it vast amounts of data, which happens once (or occasionally) for each model version. Inference is the ongoing, per-query compute that happens every time someone sends a message to Claude or uses the API.

Anthropic already routes some inference traffic through Australia via cloud partners like AWS and Google Cloud, which both have Sydney regions. But Anthropic says expanding local compute capacity is one of the most consistent requests it hears from Australian enterprises and government agencies, particularly those with data residency needs — and it’s actively exploring this through existing third-party infrastructure.

It seems unlikely that Anthropic will train their models in Australia anytime soon. Frontier model training requires large, concentrated compute clusters — Anthropic’s $50 billion infrastructure deal with Fluidstack is building these in Texas and New York. The language in Anthropic’s announcement is carefully scoped to “compute capacity” — which is inference language. They’re also in “early conversations about longer-term infrastructure,” but training at scale in Australia would be a much bigger and more distant proposition.[1]

How data centres can raise electricity prices

It’s worth mentioning Anthropic’s existing principles for ensuring that the electricity costs of their growing training and inference load aren’t socialised onto other consumers. Data centres (or any other growing source of load) can raise electricity prices in two main ways.

First, by requiring more generation capacity (or demand response). When new large loads like data centres connect to the grid, they increase total electricity demand. If that demand pushes up against supply constraints — particularly during peak periods — it can tighten the wholesale electricity market, driving up spot prices that flow through to all consumers. This can also bring forward the need for new generation investment. Demand response — paying large consumers to reduce their load during tight periods — can help, but it’s an additional cost borne by the system.[2]
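A stylised way to see the wholesale effect: in a merit-order wholesale market, generators are dispatched cheapest-first and the most expensive unit needed to meet demand sets the spot price for everyone. The sketch below uses a made-up supply stack and made-up demand figures, purely to illustrate the mechanism.

```python
# Toy merit-order dispatch: the most expensive generator needed sets the spot price.
# Capacities (MW) and offer prices ($/MWh) are illustrative, not real market data.
supply_stack = [
    ("wind and solar", 3000, 0),
    ("coal", 4000, 60),
    ("gas (combined cycle)", 2000, 110),
    ("gas (peaker)", 1000, 300),
]

def spot_price(demand_mw):
    """Dispatch cheapest-first and return the marginal generator's offer price."""
    dispatched = 0
    for _name, capacity_mw, offer_price in supply_stack:
        dispatched += capacity_mw
        if dispatched >= demand_mw:
            return offer_price
    raise ValueError("demand exceeds available capacity")

evening_peak = 8500            # MW of existing demand at the evening peak
new_data_centre_load = 600     # MW of additional, inflexible load

print(spot_price(evening_peak))                         # 110 $/MWh
print(spot_price(evening_peak + new_data_centre_load))  # 300 $/MWh
```

The extra load pushes dispatch into the peaking band, and that higher price applies to all energy traded in that interval, not just the data centre’s share.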

Second, by requiring more electricity network infrastructure to accommodate peak demand. Transmission and distribution network costs are, in simple terms, ultimately paid for by all electricity consumers (including you and me). They show up in our household electricity bills partly under the fixed daily charge,[3] and partly as a volumetric charge (the more energy you consume, the more of the total fixed network cost you pay for; we’ll come back to this later).
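As a rough sketch of how that recovery works (the tariff structure and numbers below are invented for illustration, not any actual network’s pricing):

```python
# Illustrative split of a household's network charges into a fixed daily component
# and a volumetric (per-kWh) component. All numbers are made up for the example.
fixed_daily_network_charge = 0.40   # $/day of the bill attributed to network costs
volumetric_network_rate = 0.12      # $/kWh of the bill attributed to network costs

def annual_network_bill(kwh_per_year):
    """Network portion of an annual household bill under this hypothetical tariff."""
    return 365 * fixed_daily_network_charge + volumetric_network_rate * kwh_per_year

print(annual_network_bill(5000))   # typical household: $746.0
print(annual_network_bill(9000))   # household that also charges an EV at home: $1226.0
```

The more energy a household pushes through the meter, the larger the share of the (mostly fixed) network cost it picks up, which is the mechanism the EV example below relies on.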

Anthropic has “committed” to:

  • Pay for 100% of the grid upgrades needed to interconnect their data centres
  • Procure new power and protect consumers from price increases
  • Reduce strain on the grid by investing in curtailment systems to cut power during peak demand
  • Be a responsible neighbour to the local communities around their data centres

It’s unclear whether Anthropic intends for these principles to apply to their global operations, or just in the US (“but AI companies shouldn’t leave American ratepayers to pick up the tab”).

When more load means lower bills

In theory, there is a pathway for increases in load to actually reduce electricity costs for other consumers. This is evidenced most neatly by electric vehicle (EV) charging. Because people charge their EVs throughout the day, EVs don’t add much to peak electricity demand, and so don’t require more electricity network to be built to accommodate that peak.[4] Modelling conducted by Energy Consumers Australia and CSIRO, as well as direct evidence from California, shows that as more people buy and charge EVs, they take on a greater share of the network costs relative to those who don’t own an EV. That’s because network costs are recovered at least in part through the volumetric component of the electricity bill in Australia (and many other places).

The result is that EV owners save money because driving an EV is cheaper over the life of the car than driving an internal combustion engine vehicle, and non-EV owners save money because they pay less for the network.

Annual savings from electric vehicles in 2023, 2030, 2040, and 2050 for EV-owning and non-EV-owning households. This analysis assumes the EV adoption targets in the 2022 ISP Step Change scenario are achieved. ECA
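A minimal sketch of that redistribution, assuming (as in the modelling above) that EV charging adds energy throughput without adding to peak demand, so the network’s total revenue requirement stays roughly flat. All figures are invented for illustration.

```python
# If the network's revenue requirement is fixed and recovered volumetrically,
# extra EV throughput lowers the per-kWh network charge everyone pays.
# All figures are invented for illustration.
network_revenue_requirement = 1_000_000_000   # $/year to be recovered from consumers
baseline_energy_kwh = 8_000_000_000           # kWh/year consumed before EV uptake
ev_energy_kwh = 1_000_000_000                 # extra kWh/year of (off-peak) EV charging

rate_before = network_revenue_requirement / baseline_energy_kwh
rate_after = network_revenue_requirement / (baseline_energy_kwh + ev_energy_kwh)

non_ev_household_kwh = 5000
print(round(non_ev_household_kwh * rate_before, 2))  # 625.0  ($/year before EV uptake)
print(round(non_ev_household_kwh * rate_after, 2))   # 555.56 ($/year after EV uptake)
```

The non-EV household’s network bill falls simply because the same fixed pie is spread over more kilowatt-hours.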

Note that for this to remain true, a sizeable portion of the network costs will need to be recovered via volumetric charges. The Australian Energy Market Commission recently floated the idea of recovering more network costs through fixed charges, which has sparked lively discussion. There are advantages and disadvantages to doing it this way, to be sure,[5] but it’s worth noting that it would mean increased electricity throughput from electric vehicles and other sources of load could end up raising bills for other consumers rather than lowering them.

Could data centres lower electricity prices too?

So more electricity demand from EVs could actually save non-EV owners money. Could the same be true of data centres for non-data-centre consumers? I don’t know, but it depends on how much data centres contribute to peak demand (both by bumping up the wholesale spot price during tight periods, and by hitting network constraints that require more network to be built; these don’t always occur at the same time). That in turn depends to a large degree on how flexible data centre electricity load can be: can they precool before a demand spike, ramp down their compute during peak periods, or rely on onsite batteries/self-generation to ride through the peaks?
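One way to picture that flexibility, using invented load shapes: a data centre that can shed or shift even a slice of its compute during the system peak contributes far less to the coincident peak that drives generation and network investment.

```python
# Evening load profiles in MW for 17:00, 18:00, 19:00 and 20:00.
# Shapes are invented purely to illustrate the point.
rest_of_grid  = [8000, 8600, 8800, 8300]   # rest of the system, peaking at 19:00
inflexible_dc = [500, 500, 500, 500]       # runs flat out regardless of grid conditions
flexible_dc   = [500, 350, 250, 500]       # ramps compute down through the 18:00-19:00 peak

def coincident_peak(grid_mw, dc_mw):
    """Highest combined demand across the evening, which drives network investment."""
    return max(g + d for g, d in zip(grid_mw, dc_mw))

print(coincident_peak(rest_of_grid, inflexible_dc))  # 9300 MW
print(coincident_peak(rest_of_grid, flexible_dc))    # 9050 MW
```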

A key metric here is Power Usage Effectiveness (PUE) — the ratio of a data centre’s total energy consumption to the energy used by its IT equipment alone. A PUE of 1.0 would mean every watt goes to computing; anything above that represents overhead from cooling, power distribution, and lighting. According to the Uptime Institute (2025), the global average PUE in 2025 was 1.54, but Google has an average of 1.09.
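For concreteness, the overhead implied by those figures falls straight out of the definition (a short Python sketch):

```python
# PUE = total facility energy / IT equipment energy,
# so the non-IT overhead share is 1 - 1/PUE.
def overhead_share(pue):
    """Fraction of total facility energy spent on cooling, power distribution, lighting, etc."""
    return 1 - 1 / pue

print(f"{overhead_share(1.54):.0%}")  # ~35% overhead at the 2025 global average
print(f"{overhead_share(1.09):.0%}")  # ~8% overhead at Google's reported average
```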

On flexibility, the evidence is growing but mixed. A Duke University study estimated that curtailing data centre loads for just 0.25% of their uptime could free up enough capacity to accommodate 76 GW of new load. An ACEEE white paper notes that a software platform tested at an Oracle data centre cut power consumption by 25% during peak grid hours. And a broader academic study published via ScienceDirect found that participation in demand response programs can reduce data centres’ energy purchase costs by up to 24%.

If data centres lower the per-unit cost of electricity (by increasing network utilisation) by more than they raise it (by pushing up wholesale prices and causing more network costs to be socialised to other consumers), they’ll lower electricity costs for consumers. If not, they’ll raise them.
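Expressed as a back-of-envelope condition, with entirely invented figures standing in for each effect:

```python
# Back-of-envelope test of the condition above. Every figure is an invented
# placeholder; the point is only the shape of the comparison.
utilisation_saving = 40_000_000       # $/yr of fixed network cost spread over more kWh
wholesale_price_impact = 25_000_000   # $/yr of higher spot prices paid by other consumers
socialised_network_cost = 10_000_000  # $/yr of augmentation not funded by the data centre

net_impact = utilisation_saving - wholesale_price_impact - socialised_network_cost
print("other consumers better off" if net_impact > 0 else "other consumers worse off")
```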

Electricity costs are a small percentage of total training costs for frontier models (~2-6%), and my intuition has been that training runs would be relatively insensitive to changes in electricity price. In other words, even when electricity prices are high or they’re offered a lot of money to ramp down to meet a network need, why wouldn’t they just want to let those GPUs rip and make even more money? I can’t find electricity costs for inference operations specifically, but estimates of electricity’s share of total data centre operating costs range from 15-25% up to 40-60%, so perhaps for non-training compute, demand flexibility will be attractive.

All views are my own, and do not represent my current or previous employers.


[1] Or maybe not, who knows?

[2] I wrote about the need for more generation capacity and the levers Australia uses to achieve this here.

[3] Although confusingly for many, not all of the daily charge is used to pay for network costs.

[4] The best analogy for this was written by my former colleague Ashley Bradshaw. Department stores are relatively empty most of the time, but they’re built with peak demand (the month of Christmas) in mind, not average demand. The same is even more true for electricity networks.

[5] Volumetric charges incentivise solar, batteries, and energy efficiency, and spread the benefits of increased EV adoption across all users, but they may be unfair to renters, apartment dwellers, and low-income households who cannot access consumer energy resources (CER) and end up paying disproportionately for the network. Fixed charges provide more predictable revenue for networks and prevent solar/battery owners from avoiding their fair share of network costs.
