
AI may be our most powerful tool, but without limits it could undermine the very sustainability goals it aims to support. Mitali Badkul, a full-time MBA student at Trinity Business School, argues that the real challenge is not building smarter systems, but designing them to respect environmental boundaries. Unless we align innovation with restraint, we risk accelerating the crisis we are trying to solve.
Wired for Excess: AI, sustainability, and the loop we must break, by Mitali Badkul.
“Na hi jñānena sadṛśam pavitram iha vidyate” – There is nothing more sacred or purifying in this world than true knowledge (Bhagavad Gita).
But true knowledge is not just about building intelligence, it is about knowing how, when, and why to use it. Are we applying wisdom or simply creating smarter systems that quietly amplify our excesses?

When early humans discovered fire, they unlocked warmth, safety, and the ability to cook. But fire also brought destruction: forests burned, ecosystems shifted, and overuse crept in. Every innovation, it seems, carries both its glow and its shadow.
Centuries later, coal told a similar story. James Watt’s improved steam engine made coal use more efficient, but instead of reducing consumption, it multiplied it. This counterintuitive outcome became known as the Jevons Paradox: when efficiency increases, overall demand often rises too. It reminds us that technology alone does not guarantee sustainability unless it is designed with restraint.
That paradox is quietly resurfacing today. As AI systems become more efficient and widely adopted, their total energy consumption is rising, not falling. Investor’s Business Daily (IBD, 2025) highlights how tools like DeepSeek risk repeating this dynamic, where the very features making AI more powerful also drive greater usage. Echoing this concern, Microsoft CEO Satya Nadella recently warned that AI is “headed straight into a Jevons Paradox,” where smarter computation may only accelerate environmental strain (Outlook Business, 2025).
Today, artificial intelligence stands as one of our most powerful tools yet. From flood prediction and grid optimization to sustainable agriculture, its applications deliver real environmental wins. But power without reflection often leads us back to old patterns. Higher crop yields often lead to land expansion, not restoration. Faster logistics means more consumption, not less. The smarter we get, the more we take. Every click, every “smart” suggestion, carries a carbon cost.
Efficiency without restraint becomes indulgence, and that is not sustainable.
In science, this is called a feedback loop. In life, it is a pattern we keep failing to learn from. If we do not pause to reflect, we risk building the future with the same flawed logic that shaped our past. And beyond how we use AI lies a deeper concern: how we power it.
AI: The Hidden Toll of Intelligence
Training and operating large AI models like GPT-4 have significant environmental impacts. A recent study cited by Sustainability News (2025) revealed that ChatGPT alone generates over 260,930 kilograms of CO₂ each month, equivalent to 260 transatlantic flights between New York and London. While one page view produces just 1.59 grams of CO₂, its 164 million monthly users make the cumulative footprint staggering.
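The scale of that footprint follows from simple multiplication. A minimal back-of-envelope sketch using the cited figures (note: reconciling the per-view and monthly totals requires assuming roughly one page view per user per month, an assumption on our part, not a figure from the study):

```python
# Back-of-envelope check of the cited ChatGPT carbon figures.
# Assumption (ours, not the study's): ~1 page view per user per month.
CO2_PER_VIEW_G = 1.59        # grams of CO2 per page view (cited)
MONTHLY_USERS = 164_000_000  # monthly users (cited)

# Cumulative monthly footprint, converted from grams to kilograms.
monthly_co2_kg = CO2_PER_VIEW_G * MONTHLY_USERS / 1000
print(f"~{monthly_co2_kg:,.0f} kg CO2 per month")  # ~260,760 kg

# The cited equivalence of ~260 New York-London flights implies
# roughly one tonne of CO2 per flight, a plausible order of magnitude.
implied_per_flight_kg = monthly_co2_kg / 260
print(f"~{implied_per_flight_kg:,.0f} kg CO2 per flight implied")
```

The arithmetic lands within rounding distance of the cited 260,930 kg, which is the point: individually trivial emissions become staggering at platform scale.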
Recent developments have made this trend impossible to ignore. When OpenAI launched an image generation feature inside ChatGPT, it went viral overnight. The surge in user demand overwhelmed their infrastructure, prompting CEO Sam Altman to admit their GPUs were “melting”, leading to temporary usage limits (CNBC, 2025).
This was not just a technical bottleneck; it was a glimpse into a deeper imbalance. What starts as a creative tool quickly scales into massive, unseen consumption. There is no carbon meter at the interface, no reminder of the energy beneath the prompt. This is the Wired for Excess dilemma in action: even our most advanced tools are designed for scale, not self-restraint. As AI becomes more integrated into daily life, its energy demands may spiral in proportion to our appetite for convenience and novelty, not necessity.
According to the International Energy Agency (2024), global electricity demand from data centres, AI, and the cryptocurrency sector could double by 2026, reaching over 1,000 terawatt-hours (TWh). To put this in perspective, this figure is roughly equivalent to the electricity consumption of Japan. The IEA highlights that the rapid scaling of AI models, especially large language models, requires immense computing power, and consequently, vast amounts of electricity. Despite this, there is limited transparency or regulation surrounding the energy sources used, leading to substantial uncertainty about their carbon intensity.
If sustainability means reducing harm to future generations, then this invisible energy toll must be addressed with the same urgency as more visible emissions sources or we risk losing the chance to prevent it.

A Mirror from Nature
In nature, balance is intelligence. Bees adapt their pollination routes to shifting climates. Forests regrow after fires. Ecosystems adjust, until they can’t. In these systems, feedback loops keep the whole in check. The goal is equilibrium, not maximization.
Coral reefs, for instance, respond to environmental stress through bleaching, a warning signal of imbalance. Migratory birds shift routes over time as climate patterns evolve, demonstrating natural learning and adaptation. These systems sense when to pause, retreat, or shift behavior. In contrast, many AI systems are built to scale indefinitely, without feedback loops that prompt re-evaluation. Learning from nature means embracing not just growth, but timely course correction.
AI, however, optimizes without awareness. It is designed to achieve a target – engagement, yield, speed – but rarely considers the broader consequence of hitting that target. It lacks boundaries, unless we consciously build them.
If AI is to support sustainability, not just productivity, it must learn to sense limits, not just hit metrics. As Eastern wisdom reminds us: “Ati sarvatra varjayet” – Excess should be avoided in everything. This is not just philosophy, it is systems thinking. A design principle. A cautionary tale. A quiet challenge to every line of code we write.
AI and Sustainability: Designing for Sufficiency
While many AI applications risk fuelling excess, there are also encouraging examples where AI aligns with sustainability. In marine conservation, AI-powered systems track illegal fishing patterns and monitor coral reef health. In energy management, smart algorithms balance demand and optimize the use of renewables in national grids. In agriculture, predictive AI helps prevent water waste and crop loss through precision irrigation.
These are not just optimizations, they are interventions that prevent overuse, reduce harm, and restore balance. When designed with restraint and regeneration in mind, AI becomes more than a mirror of human habits, it becomes a partner in protecting the planet. One standout example comes from Google’s DeepMind, which applied AI to reduce the energy used for cooling data centres by 40% – a measurable, large-scale reduction in environmental impact (DeepMind, 2016).
Throughout history, human innovation has pursued more – more output, more speed, more reach. But sustainability asks us to pursue enough. AI’s role must now evolve from maximizing performance to enabling sufficiency. Sustainability is not about halting innovation, it is about aligning progress with planetary limits.
This means building carbon-conscious architectures, training models on renewable energy, and optimizing for efficiency throughout the software lifecycle. McKinsey’s Green Software Imperative outlines actionable steps: energy-aware coding, green data centre design, and digital infrastructure that scales responsibly (McKinsey Digital, 2022). But these are more than technical fixes. They are philosophical resets. We must reimagine intelligence not as acceleration, but as alignment.
One approach is to introduce “ethical friction”: deliberate slowdowns, nudges, or design constraints that surface the hidden costs of digital actions. Imagine if our systems signaled energy usage next to high-consumption queries or warned users before running resource-intensive AI models. These moments of pause reconnect users with the real-world impact of their decisions.
As Riley Barrett explains in Designing for Ethical Friction, friction can be a tool, not a bug, a design feature that restores agency and accountability in our digital experiences. Instead of making everything effortless, friction reminds us that effort often matters.
Just as a seatbelt gently resists until buckled, AI could subtly resist when overused, prompting reflection instead of indulgence (Barrett, 2023). The essence of American environmentalist Wendell Berry’s work reminds us that the world is not merely a resource to be used, but a trust passed on to future generations. In the East, the concept of samatvam – balance and equanimity – guides right action. Whether viewed through the lens of Western environmental ethics or samatvam, sustainability demands responsibility. It is not about rejecting progress, but about asking better questions before we build. Perhaps sufficiency is not a limitation, but a form of intelligence we have forgotten to apply to our technologies.

AI and Sustainability: Progress without Boundaries?
We began with fire, our first great innovation. It warmed, protected, and empowered us. But it also burned. Every chapter of progress since – coal, oil, automation – has carried the same duality: power and overuse.
AI is no different. Its ability to accelerate sustainability efforts is real. It is helping us map climate risks, monitor emissions, and optimize resource use (Mendelsohn, 2024). But left unchecked, it may also contribute to the very crisis it aims to solve through the rebound effect, rising energy demands, and silent overconsumption.
The Jevons Paradox, first described in 1865, is more relevant than ever. As computation becomes faster, cheaper, and more embedded into everyday life, our demand for energy-intensive digital activity only grows. Jevons warned not just of economic rebound, but of the illusion of progress without boundaries. His insight echoes through today’s AI dilemma.
Intelligence without awareness is just momentum. Progress without limits is just repetition.
Sustainability does not ask for smarter systems, it asks for wiser ones. We must design AI with intent, ethics, and humility. Embed balance into the machine. Make the invisible visible. And ensure that what we build serves not just the present but the future. So the question is not what AI can do for sustainability. It is what we are willing to do, to ensure AI itself remains sustainable.
As users, designers, and citizens, we too hold a share of responsibility. What we ask from AI and what we ignore will shape the path it takes.
Will we let it run wild, repeating history’s loops? Or will we teach it to walk wisely and finally break the cycle?

Useful links:
- Link up with Mitali Badkul on LinkedIn
- Read a related article: Code Green: How AI is Reshaping Sustainable Finance
- Download this and the other winning student articles in Global Voice magazine #32
- Discover Trinity Business School, Trinity College Dublin
- Apply for the Trinity MBA.
Learn more about the Council on Business & Society
The Council on Business & Society (CoBS), visionary in its conception and purpose, was created in 2011 and is dedicated to promoting responsible leadership and tackling issues at the crossroads of business, society, and planet. Its focus spans sustainability, diversity, social impact, social enterprise, employee wellbeing, ethical finance, ethical leadership, and the role responsible business plays in contributing to the common good.
- Follow the CoBS on LinkedIn
- Download magazines and learning content from the CoBS website downloads page.
Member schools of the Council on Business & Society.
- ESSEC Business School, France, Singapore, Morocco
- FGV-EAESP, Brazil
- School of Management Fudan University, China
- IE Business School, Spain
- Indian Institute of Management Bangalore, India
- Keio Business School, Japan
- Monash Business School, Australia, Malaysia, Indonesia
- Olin Business School, USA
- Smith School of Business, Queen’s University, Canada
- Stellenbosch Business School, South Africa
- Trinity Business School, Trinity College Dublin, Ireland
- Warwick Business School, United Kingdom.

