Microsoft is planning to start using small nuclear reactors to power its data centres as it prepares for the energy-hungry era of AI.

The strategic move into nuclear energy was quietly revealed through a job listing in late September for the role of ‘Principal Program Manager Nuclear Technology’.

According to the job posting, Microsoft is looking for someone “who will be responsible for maturing and implementing a global Small Modular Reactor (SMR) and microreactor energy strategy”.

The job, which is no longer open for applications, goes beyond just assessing the viability of using nuclear power for Microsoft data centres as the successful candidate will be in charge of developing “a clear and adaptable roadmap for the technology’s integration” along with choosing the project’s “technology partners and solutions”.

SMRs are an emerging form of nuclear fission reactor that uses the energy from a nuclear chain reaction to create steam that turns turbines and generates electricity.

It’s the same concept that has long been used for nuclear power in conventional reactors and remains controversial because of catastrophic disasters like those in Chernobyl and Fukushima, the issue of nuclear waste, and concerns around the proliferation of nuclear weapons.

Because nuclear energy doesn’t produce carbon dioxide, its proponents see the technology as a viable fossil fuel alternative.

SMRs seem to be creating renewed interest in nuclear power because they are smaller and more flexible than traditional reactors and could potentially be manufactured at commercial scale.

OpenAI CEO Sam Altman has ties with a company building small scale nuclear reactors to power his company’s expanding AI ambitions.

A proposal to replace Australia’s existing coal-fired power stations with SMRs was panned after government estimates put the plan at a cost of $387 billion.

The SMRs were estimated to cost $18,167 per kilowatt of capacity in 2030, vastly more expensive than large-scale solar ($1,058 per kilowatt) and onshore wind ($1,989 per kilowatt).

The information economy is hungrier for electricity than ever, especially as data centres have expanded to include stacks of GPUs for training AI systems.

For Microsoft, the increase in AI and cloud adoption over the last few years has also seen a spike in the amount of water it is using just to keep its hardware cool.

In the race to find more efficient ways of managing its server farms, Microsoft famously left a data centre on the sea floor for two years as part of its Project Natick, testing the viability of an alternative way to cool its systems.

Three years after the pilot ended, Microsoft has yet to release its findings on the project's viability.