Now that you know a little bit about what AI is and how it is trained, it’s time to explore what else makes this technology possible. AI isn’t magic; it’s a technical tool that requires physical resources to run. Everything AI does, from recommending videos to helping a car drive, uses energy and has an impact on the environment.
In the next section, you’ll get to act as an AI engineer and build your own sustainable model. You’ll make decisions that impact the model’s performance, as well as its energy consumption and environmental impact.
Select each accordion to learn more about the four variables you will get to adjust:

AI models run on powerful computer chips, but some chips are more resource-efficient than others. Producing and disposing of chips can lead to unsustainable mining practices and excessive electronic waste.
A Tensor Processing Unit (TPU) is an efficient choice, optimized for AI tasks with lower energy usage than a Graphics Processing Unit (GPU) or standard Central Processing Unit (CPU).

AI models learn from data. A small dataset is easy to process and store, but it can limit a model’s accuracy and makes overfitting more likely. A large dataset offers vast information, but comes with high storage and processing costs. A medium-sized dataset often provides the best balance.
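The dataset-size trade-off can be seen in a small experiment. The sketch below (illustrative numbers, not from any real training run) fits a very flexible polynomial model to a tiny dataset and to a larger one, then measures error on fresh test data; with too little data, the model memorizes noise and generalizes poorly.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_samples(n):
    # Hypothetical data: a sine curve plus measurement noise.
    x = np.linspace(0, 1, n)
    return x, np.sin(2 * np.pi * x) + rng.normal(0, 0.2, n)

# Fit the same flexible degree-7 polynomial to a tiny vs. a larger training set.
x_small, y_small = noisy_samples(8)
x_large, y_large = noisy_samples(200)
coef_small = np.polyfit(x_small, y_small, 7)
coef_large = np.polyfit(x_large, y_large, 7)

# Evaluate both fits on fresh test data drawn from the same process.
x_test, y_test = noisy_samples(500)
err_small = np.mean((np.polyval(coef_small, x_test) - y_test) ** 2)
err_large = np.mean((np.polyval(coef_large, x_test) - y_test) ** 2)
print(err_small, err_large)  # the tiny-dataset fit has much higher test error
```

The tiny dataset lets the flexible model pass through every noisy point exactly, which is why its test error balloons, while the larger dataset forces it to capture the underlying pattern instead.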

AI hardware generates a lot of heat, and keeping it cool can be a huge challenge. Traditional air conditioning (AC) is energy-intensive.
Water cooling is efficient, but it can consume millions of gallons of water.
Natural air cooling is the most sustainable choice in certain climates, using very little energy or water.
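Engineers often compare cooling options with a standard metric called Power Usage Effectiveness (PUE): total facility energy divided by the energy used by the computing hardware itself. The sketch below uses illustrative PUE values, not measurements from any specific data center.

```python
# PUE = total facility energy / IT equipment energy.
# Cooling overhead accounts for much of the gap above 1.0.

def facility_energy_kwh(it_energy_kwh: float, pue: float) -> float:
    """Estimated total facility energy for a given IT load and PUE."""
    return it_energy_kwh * pue

it_load = 1000  # kWh used by the AI hardware itself (hypothetical)
print(facility_energy_kwh(it_load, 1.6))  # 1600.0 -- typical for traditional AC
print(facility_energy_kwh(it_load, 1.1))  # 1100.0 -- efficient natural air cooling
```

A lower PUE means less energy is spent on overhead like cooling, which is why natural air cooling can be so attractive in the right climate.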

The type of energy that powers the AI system is another important choice to consider. It determines whether the system runs on non-renewable sources like fossil fuels, renewable sources like wind and solar, or a grid that mixes the two.
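The effect of the energy source can be estimated with simple arithmetic: emissions are roughly power draw, times running time, times the grid's carbon intensity. All numbers in the sketch below are illustrative assumptions, not measured values for any real system.

```python
def training_emissions_kg(power_kw: float, hours: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Back-of-the-envelope CO2 estimate: energy used x grid carbon intensity."""
    return power_kw * hours * grid_kg_co2_per_kwh

# Hypothetical 10 kW of AI hardware training for 100 hours:
fossil = training_emissions_kg(10, 100, 0.7)      # coal-heavy grid (assumed intensity)
renewable = training_emissions_kg(10, 100, 0.05)  # wind/solar-heavy grid (assumed)
print(fossil, renewable)  # 700.0 vs 50.0 kg of CO2
```

The same hardware doing the same work emits far less on a cleaner grid, which is why the energy source is one of the levers in the activity.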