DATE:
January 14, 2026
TO:
Board of Supervisors
SUBJECT:
PROTECTING SAN DIEGANS FROM THE IMPACTS ASSOCIATED WITH LARGE ARTIFICIAL INTELLIGENCE (AI) DATA CENTERS (DISTRICTS: ALL)
OVERVIEW
Large artificial intelligence (AI) data centers are being developed across the country at a rapid pace. They typically range in size from hundreds of thousands to millions of square feet and are powering the unprecedented societal shift toward AI. These facilities house and interconnect thousands of advanced computer chips, particularly graphics processing units (GPUs), which perform the data- and power-intensive work of training large language models, machine learning networks, and other data-heavy processes driving the development of AI technologies such as OpenAI's ChatGPT and xAI's Grok, among many others.
AI data centers consume enormous amounts of electricity and water and can significantly strain local infrastructure and grid capacity. For example, a single moderately sized facility of 100 megawatts (MW) consumes as much electricity annually as roughly 100,000 households, according to the International Energy Agency (IEA). And that is just the beginning. A single 100 MW project, as large as it is, pales in comparison to facilities currently being proposed or constructed that approach or exceed 1,000 MW, which would be capable of consuming electricity equivalent to that of over a million households. OpenAI's Stargate Project alone, for instance, plans to develop 10 gigawatts (10,000 MW) of AI data center capacity by 2029.
To meet the ever-growing power demand of emerging AI systems, large data centers are quickly becoming one of the fastest-growing electricity users in the country, with their power demand projected to potentially double over the next decade. Large technology companies are seeking available land with proximity to existing and/or pl...