However, that reputation is being confronted with a new reality: tech giants like Microsoft, Amazon, Google, and Meta (the parent company of Facebook, Instagram, and WhatsApp) are pouring resources into generative artificial intelligence.
First, AI poses a threat to the environment. Training the large language models behind tools such as ChatGPT requires power-hungry graphics processing units (GPUs) and new data centers, which together consume enormous amounts of water and electricity.
Amazon’s data center empire in Northern Virginia, USA, uses more electricity than the entire grid of Seattle, the city where the company is headquartered. Google’s data centers used 19.7 billion liters of water in 2022, 20% more than the year before. Meta’s AI model, Llama 2, also seems thirsty.
The key players in the AI race are just as eager to publicize how they compensate for this growing environmental impact, with programs like Microsoft’s commitment that its data centers in Arizona will use no water for more than half of the year.
Or Google, which recently announced a partnership with Nvidia, the largest maker of artificial intelligence chips, and which has set the goal of replenishing 120% of the fresh water used by its offices and data centers by 2030.
According to Adrienne Russell, co-director of the Center for Journalism, Media, and Democracy at the University of Washington, USA, these efforts may be nothing more than clever marketing: “The technology industry has made a long, coordinated effort to make digital innovation seem compatible with sustainability, when it is not.”
Russell cites the shift to cloud computing and the way Apple markets its products as examples of technology companies trying to associate themselves with counterculture, freedom, digital innovation, and sustainability.
That same playbook is now being used to promote AI as an improvement from an environmental perspective.
In August, during Nvidia’s financial results presentation, CEO Jensen Huang claimed that “accelerated computing,” an AI-based technology marketed by Nvidia itself, is cheaper and more energy-efficient than “general purpose computing,” which he described as more expensive and comparatively more polluting.
The data suggests otherwise. A recent report from analytics firm Cowen estimates that AI data centers could require more than five times the power of traditional facilities. The GPUs, typically from Nvidia, each consume up to 400 watts, so an artificial intelligence server can draw 2 kilowatts. A typical cloud computing server uses between 300 and 500 watts, according to Shaolei Ren, a researcher at the University of California, Riverside, who has studied how new AI models use resources.
“There are studies filled with the untrue idea that sustainability and digital innovation go hand in hand: that you can keep growing, that everything is massively scalable and still good, that somehow technology works for everyone,” says Russell, a professor at the University of Washington.
The momentum around artificial intelligence and its environmental footprint is likely to increase as companies look to integrate large language models into more of their operations.
Russell thinks it would be better to focus on other, more sustainable innovations, such as mesh Wi-Fi networks or privacy-by-design initiatives, through which communities set up their own privacy and internet connectivity controls on their own terms, in a way that is less dependent on big tech companies.
“If we manage to identify examples, however small, of people designing sustainable technologies, we can begin to imagine alternatives and criticize these huge technologies that are not sustainable from either an ecological or a social point of view,” says the professor.