A truly excellent primer on generative AI and the climate justice movement. A very strong context section, which really got me thinking about all of the complexities in this body of work. It outlines seven principles for how campaigners should approach both the use and regulation of AI, explaining each and then breaking it down into “responsible practices”, “advocating for change” and “policy recommendations”.
- “Just enough” approach - Just enough Internet: Why public service Internet should be a model of restraint
- AI is increasingly positioned as the key engine of (infinite) economic growth through productivity gains, with growth prioritised at all costs
- Risk: expansion of AI coming into conflict with other policy priorities and environmental mandates
- “We want to ensure these technologies support the fight for a fair, fossil-free future, not undermine it.”
- “technologies can be designed for both good and bad purposes and have both positive and negative outcomes. There are both enlightened, compassionate users and builders as well as cynical and uncaring ones.”
- “campaigners … must lean on the expertise of leaders and organisations in both technology and sustainability. Blanket narratives about the use of AI aren’t helpful in the current socio-political climate, as they create a binary “them against us” approach which can turn off everyday users of technology.”
- Campaigners are falling into the trap of reducing the debate to a binary “AI = bad, no AI = good”
Role of GenAI in climate, nature, rights
Energy
- Problem: we don’t know how much energy GenAI is using, but we do know that it’s increasing exponentially
- And newly built renewables are not decarbonising the existing grid, just covering new demand (“Between 2017 and 2023, all additional wind energy generation in Ireland was absorbed by data centres, for example.”)
- “Given that data can be moved around the world quickly and easily, Big Tech companies are likely to build data centres wherever it’s cheapest for them — and it may be cheapest to consume electricity in places which have the fewest rules around energy mixes.”
- “the same companies promoting AI-led climate solutions are also promoting their AI systems to the fossil fuel industry to optimise the extraction of resources previously considered unviable.”
Environmental
- Tech firms want AI to get ever faster, so they need specialised chips: GPUs (Graphics Processing Units), which outperform standard CPUs (Central Processing Units) for these workloads.
- “As the demand [for AI] grows so too does the expansion of mining operations for the benefit of companies and consumers in the Global North at the expense of people and the environment in the Global South”
- Displacement effects: “the key minerals and metals needed for AI expansion are “nearly the same” as those required for transition to a low-carbon economy, putting AI in direct competition with emerging renewable and battery technologies.”
Rights and Justice
- “good examples of how generative AI is supporting such progressive initiatives are harder to come by [than machine learning and predictive analytics for the SDGs]”
- “The waste and pollution, ecosystem collapse and worker exploitation associated with AI advances disproportionately impact the communities who are already most vulnerable to the climate crisis. It’s a form of environmental colonialism when lower-income countries bear the brunt of technological advancements while richer countries reap most of the benefits.”
Principles
1. Curiosity = opportunities for better choices
- “Working with communities to discuss, question and explore how they can or want to use AI, and, more importantly, whether a project should be using AI at all, are essential steps in the ethical use of AI.”