It’s easy to overlook that every digital action and interaction consumes energy. Behind nearly every chatbot, streaming video, software visualization, or datacenter calculation lies a power source that emits carbon dioxide (CO2).
As a result, a new technique is taking shape: carbon-aware scheduling.
As the name implies, “Carbon-aware scheduling is about shifting workloads to take advantage of lower carbon footprints,” said Sammy Lakshmanan, U.S. Sustainability Principal for consulting firm PwC. “By timing energy-intensive tasks to run when renewable sources like wind and solar are most abundant, datacenters can significantly reduce their carbon footprint.”
That can mean shifting calculations from one location to another at different times of the day or night. “Computing consumes energy, but unlike transportation, manufacturing, and other conventional uses, it delivers an unusual level of flexibility,” said David Irwin, a professor in the Electrical and Computer Engineering Department at the University of Massachusetts, Amherst (UMass Amherst).
The concept is gaining traction. Carbon-aware scheduling tools analyze and optimize the trade-offs among carbon emissions, cost, and performance. Together with carbon accounting software used within an organization or through a cloud provider, they’re able to trim datacenter carbon emissions by an average of 21%, according to one group of researchers.
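The core idea of time-shifting can be illustrated with a minimal sketch. Assuming a hypothetical hourly forecast of grid carbon intensity, a scheduler picks the start time that minimizes a flexible job's emissions while still meeting its deadline. The forecast values and the `best_start_hour` helper below are illustrative, not drawn from any particular tool:

```python
# Hypothetical hourly forecast of grid carbon intensity (gCO2/kWh).
# Real schedulers pull figures like these from grid-data services.
forecast = {
    0: 420, 1: 410, 2: 390, 3: 380, 4: 370, 5: 360,
    6: 340, 7: 300, 8: 260, 9: 210, 10: 180, 11: 160,
    12: 150, 13: 155, 14: 170, 15: 200, 16: 250, 17: 310,
    18: 380, 19: 430, 20: 450, 21: 440, 22: 435, 23: 425,
}

def best_start_hour(duration_hours: int, deadline_hour: int) -> int:
    """Pick the start hour that minimizes total carbon intensity
    over the job's runtime while still finishing by the deadline."""
    candidates = range(0, deadline_hour - duration_hours + 1)
    return min(
        candidates,
        key=lambda h: sum(forecast[h + i] for i in range(duration_hours)),
    )

# A three-hour batch job due by hour 20 lands at midday, when solar
# generation pushes grid carbon intensity to its daily low.
print(best_start_hour(duration_hours=3, deadline_hour=20))  # 11
```

The same greedy search generalizes to real deployments, where the forecast comes from a live carbon-intensity feed rather than a hard-coded table.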
Feeling the Carbon Crunch
Carbon-aware scheduling taps into a growing challenge: understanding how compute loads intersect with sustainability goals. “What began as a theoretical concept a few years ago has matured into an operational strategy, powered by real-time data, AI, and closer coordination with energy providers,” Lakshmanan said.
The software factors in everything from the nature of workloads and utility purchase agreements to the composition of power sources on a grid, including coal, natural gas, and renewables like solar and wind.
The primary benefit is a measurable reduction in carbon emissions without major infrastructure changes or any hit to operational efficiency. The approach can also trim energy costs: shifting AI workloads for a single large project can save thousands of megawatt hours (MWh), enough to power hundreds of homes for more than a year.
“Batch workloads—like model training, data analytics, backups, or video rendering—are ideal because they’re typically flexible in when and where they run,” said Scott Likens, Global Chief AI Engineer for PwC.
“These types of workloads do not have pressing deadlines or users waiting for a timely response,” said Benjamin C. Lee, a professor in the Electrical and Systems Engineering and Computer and Information Science departments at the University of Pennsylvania.
A variety of proprietary and open-source tools has emerged, including the Green Software Foundation’s Carbon-Aware SDK, Kepler (Kubernetes-based Efficient Power Level Exporter), Scaphandre, Cloud Carbon Footprint, Gaia, and Carbon Aware Kubernetes. These tools typically provide near-real-time visibility into carbon intensity across global power grids, as well as scheduling hooks for cloud providers and applications.
The gains can be substantial. For example, Umeå University in Sweden reported up to a 28.6% reduction in emissions by aligning workloads with periods of lower grid carbon intensity (though it acknowledged some performance trade-offs). Another group of researchers from the University of California Berkeley, the Massachusetts Institute of Technology, UMass Amherst, and Caltech achieved a 32.9% reduction in emissions using carbon scheduling techniques.
“With the right data and tools, it’s possible to make informed decisions about power generation and carbon consumption based on thousands of locations around the world,” Irwin said.
Power Plays
Not surprisingly, carbon-aware scheduling also serves up some challenges and limitations. Irwin pointed out that many datacenters running AI training are already maxed out, with little or no spare GPU capacity. “If every chip is already working flat-out, you can’t shift workloads,” he explained. “You must have spare capacity to shift or reschedule tasks.”
There are also potential geographic trade-offs, including latency, daily and seasonal swings in carbon intensity, power fluctuations, and a need to adhere to data sovereignty regulations. The software must be able to accommodate these issues. “There can be increased complexity in scheduling and the need for advanced forecasting tools,” Lakshmanan said.
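A rough sketch shows how a location-aware scheduler might weigh those trade-offs, choosing the cleanest region that still satisfies latency and data-sovereignty constraints. The region names, numbers, and `pick_region` helper are hypothetical:

```python
# Hypothetical per-region data: grid carbon intensity (gCO2/kWh),
# round-trip latency to users (ms), and governing jurisdiction.
regions = {
    "us-east":   {"carbon": 390, "latency_ms": 20,  "jurisdiction": "US"},
    "us-west":   {"carbon": 240, "latency_ms": 70,  "jurisdiction": "US"},
    "eu-north":  {"carbon": 40,  "latency_ms": 110, "jurisdiction": "EU"},
    "asia-east": {"carbon": 520, "latency_ms": 180, "jurisdiction": "APAC"},
}

def pick_region(max_latency_ms, allowed_jurisdictions):
    """Choose the lowest-carbon region satisfying latency and
    data-sovereignty constraints; return None if nothing qualifies."""
    eligible = {
        name: r for name, r in regions.items()
        if r["latency_ms"] <= max_latency_ms
        and r["jurisdiction"] in allowed_jurisdictions
    }
    if not eligible:
        return None
    return min(eligible, key=lambda name: eligible[name]["carbon"])

# A latency-tolerant job restricted to U.S. data can move to the cleaner
# west-coast grid; a strict 50-ms budget pins it to the east coast.
print(pick_region(100, {"US"}))  # us-west
print(pick_region(50, {"US"}))   # us-east
```

In practice, carbon intensity in each region also swings by hour and season, so a production scheduler would combine this placement decision with the time-shifting logic described earlier.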
Still another concern is embodied carbon, which Irwin said accounts for a considerable portion of all business-related carbon emissions. “Although carbon-aware scheduling addresses operational emissions, it doesn’t factor in the carbon embedded in servers and infrastructure,” he said. “Overprovisioning to allow greater scheduling flexibility could unintentionally increase embodied emissions.”
In addition, results can vary significantly depending on factors such as workload flexibility, how often a scheduler adjusts to changes, and the source composition of the energy grid, Likens explained.
Datacenters also normally operate with a power allocation budget, typically measured in megawatts, for internal work teams and subscribers, Lee noted. This means various user groups must collaborate to achieve optimal scheduling: if one group exceeds its budget, another may experience performance degradation, or the entire datacenter may suffer.
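A toy illustration of the budget model Lee describes, using hypothetical allocations for three tenant groups that share a facility-wide cap:

```python
# Hypothetical power budgets (MW) for tenant groups sharing a
# datacenter with a 10 MW facility-wide cap.
FACILITY_CAP_MW = 10.0
budgets = {"ml-training": 5.0, "analytics": 3.0, "web-services": 2.0}
usage = {"ml-training": 4.2, "analytics": 3.4, "web-services": 1.8}

def over_budget(budgets, usage):
    """Return the groups exceeding their allocation; throttling them
    protects other tenants and the facility-wide cap."""
    return [g for g in budgets if usage[g] > budgets[g]]

# One tenant is over its slice even though the facility as a whole
# is still under its cap (9.4 MW of 10 MW).
print(over_budget(budgets, usage))                 # ['analytics']
print(sum(usage.values()) <= FACILITY_CAP_MW)      # True
```

Real facilities layer carbon-aware decisions on top of exactly this kind of per-group accounting, which is why cross-team coordination matters.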
Getting to Net Zero
A broader challenge is that the economics of carbon-aware scheduling don’t always add up. Many organizations lack a direct financial incentive to adopt the technology. For now, there are few tax incentives or regulatory frameworks to reward it, and any such program requires significant upfront planning and engineering.
As a result, “Most organizations continue to optimize computational resources based on cost, performance, and uptime. Carbon scheduling isn’t on their radar,” Irwin said.
That could soon change. “As AI workloads grow and sustainability regulations tighten, carbon-aware scheduling is likely to become a strategic necessity,” Likens said. “We’ll see deeper integration with cloud platforms, offering more granular, real-time carbon data and native support for low-carbon scheduling across regions and time zones.”
What’s more, AI infrastructure could become “carbon-smart,” he noted. “Schedulers could automatically choose when and where to train based on emissions impact, energy cost, and workload urgency.”
Meanwhile, some researchers, like Lee, are developing more sophisticated techniques for reducing emissions. For example, he has experimented with multi-agent game theory to help processors make smart, independent decisions about power use. Each chip weighs its performance needs, anticipates power demand from others, and coordinates power (and eventually carbon) draw with others in mind. “It permits decentralized decision making in response to carbon signals,” he said.
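The flavor of that decentralized decision making can be sketched with a simple best-response loop from game theory, in which each simulated chip repeatedly re-chooses its power level given what the others are drawing. This is an illustrative toy with made-up numbers, not Lee's actual technique:

```python
# Illustrative only: chips best-respond to one another's power draw
# under a shared cap, trading performance value against a congestion
# penalty that kicks in when total demand exceeds the cap.
CAP_W = 300.0            # shared power cap (watts), hypothetical
LEVELS = [50, 100, 150]  # power levels a chip can choose

def payoff(my_power, others_total):
    perf = my_power ** 0.5  # diminishing returns on extra power
    congestion = max(0.0, (my_power + others_total) - CAP_W)
    return perf - 0.1 * congestion  # penalized when the cap is exceeded

def best_response(others_total):
    return max(LEVELS, key=lambda p: payoff(p, others_total))

# Three chips start greedy, then settle down as each reacts to the
# others; total demand converges to the shared cap.
choices = [150, 150, 150]
for _ in range(10):
    for i in range(3):
        others = sum(choices) - choices[i]
        choices[i] = best_response(others)
print(choices, sum(choices))  # [50, 100, 150] 300
```

In a real system, the congestion penalty could track a live carbon signal instead of a fixed power cap, which is the direction Lee describes.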
Ultimately, the success of carbon-aware computing will depend largely on public and shareholder pressure, as well as new market incentives and regulatory controls. “We’ve already squeezed a lot out of energy efficiency,” Irwin concluded. “Carbon-aware scheduling is a next step—but it isn’t a silver bullet. We also need cleaner grids, better embodied carbon accounting, and ultimately, fewer unnecessary computations.”
Samuel Greengard is an author and journalist based in West Linn, OR, USA.