More than the residential use of some entire cities.
This is a misleading comparison. "AI data centers" aren't a residential use, they're much closer to an industrial one, and they should be considered on that basis.
For comparison, a 2007 report on the US aluminum industry (https://www1.eere.energy.gov/manufacturing/resources/aluminu...) notes that the US industry used 90e9 kWh in 2003 (aluminum is a particularly heavy user of electricity, since its main processes are electric-driven rather than chemical/combustion like steel). That's roughly 10 GW of power on a continuous basis, comparable to the data center usage discussed in the article (quick arithmetic in the sketch below).
An exact comparison is more difficult because the article discusses a number of individual projects, without aggregating them over time.
Is this projected energy usage good? I don't know, but the statement that "AI will be as important to the United States as the aluminum industry" doesn't seem too outlandish.
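A quick back-of-envelope conversion of the aluminum figure above. This is my own arithmetic, not from the article; the variable names are just for illustration.

    # Convert the 2003 US aluminum industry electricity use into average continuous power.
    ALUMINUM_KWH_2003 = 90e9   # 90 billion kWh over the year (figure quoted above)
    HOURS_PER_YEAR = 8760

    avg_power_gw = ALUMINUM_KWH_2003 / HOURS_PER_YEAR / 1e6   # average kW over the year -> GW
    print(f"US aluminum industry, 2003: ~{avg_power_gw:.1f} GW continuous")
    # -> ~10.3 GW, i.e. on the order of ten 1 GW data center campuses running flat out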
It also assumes no progress in power efficiency during inference.
If the price of OpenAI's mini models is indicative of compute, and therefore of energy usage, I would be surprised if we didn't see a lot of similarly energy-optimized models in the future.
Yeah but Jevons paradox.
Computing as a whole has become a lot more efficient compared to, say, 30 years ago, but globally we certainly use a lot more compute to make up for that increase in efficiency.
When Alcoa built their aluminum plant in Tennessee in the early part of the 20th century they had to build a series of hydroelectric dams to power the plant. They were using far more power than nearby Knoxville.
The headline is clickbait. Nothing to see here.
I don't think you're making a case for it being clickbait; it sounds more like you're affirming the factuality of the headline?
Clickbait doesn't mean non-factual.
> For context, a data center campus with peak demand of one gigawatt is roughly equivalent to the average annual consumption of about 700,000 homes, or a city of around 1.8 million people
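As a sanity check on that quote (my own arithmetic; note it glosses over the peak-versus-average distinction the quote itself makes):

    # Rough check of the "1 GW campus ~ 700,000 homes" comparison quoted above.
    # Assumes the campus actually draws its 1 GW peak around the clock, which overstates it.
    CAMPUS_GW = 1.0
    HOURS_PER_YEAR = 8760
    HOMES = 700_000

    campus_twh_per_year = CAMPUS_GW * HOURS_PER_YEAR / 1000            # GWh -> TWh
    kwh_per_home_per_year = CAMPUS_GW * 1e6 * HOURS_PER_YEAR / HOMES   # GW -> kW, spread across homes
    print(f"~{campus_twh_per_year:.1f} TWh/yr, or ~{kwh_per_home_per_year:,.0f} kWh per home per year")
    # -> ~8.8 TWh/yr and ~12,500 kWh per home, in the ballpark of average US household consumption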
How many plants did they build?
And aluminum has more benefits than these AIs.
It's even disputable whether they really raise people's efficiency.
>And aluminum has more benefits than these AIs.
The utility argument holds no water for me. I don't think it should be acceptable to assume the future worth (or lack thereof) of anything when there are a range of opinions on the issue.
This holds equally for AI and cryptocurrency, which both get criticized for having a lack of utility. Nobody knows the future utility; on both issues there are people who sincerely believe that they will be world-changing.
They might be wrong, but it should not be up to individuals, or even the majority, to bluntly declare a different worth and then categorize the process as a waste.
By all means make arguments on the long term environmental impact of proposed mechanisms used to achieve those goals, but if your goal is to convince someone else then you have to consider those impacts relative to what they believe are the benefits.
Electricity use, and indeed water use, frequently gets asserted as a big number with the implication of environmental damage. If the electricity is generated cleanly, or the water is not rendered otherwise unusable, then there may be negligible environmental impact.
Would you accept a datacenter using your potable water and then releasing it for you to take a shower in? Would you drink any kind of water [1]?
Electricity and water are today’s main issues, but there are many indirect others like noisy neighborhoods, pollutants from diesel generators, etc.
These infrastructures will start to use great portions of the earth's resources, saturating them to the point where people will need to make complex choices going forward.
[1] https://youtu.be/9H8mbIp01sg
>Would you accept a datacenter to use your potable water and then release it for you to take a shower? Would you drink any kind of water
If it were deemed safe by experts in the field, certainly. If it were deemed unsafe, certainly not.
That's all I'm asking for here: decisions and arguments based upon the actual impact, not on the assumption that water or electricity supply is uniform and, by implication, bad.
Building any power source has an environmental effect.
So if you want to risk global-scale damage, your cause should have a proven benefit.
Otherwise it's up to you to explain to the people who had to leave their homes because of the consequences of climate change why you needed those data centers to create PowerPoint presentations, email text, and funny cat pictures.
I think you may find that any project has no proven benefit before that benefit is achieved. Your argument is an argument against doing anything at all.
If you wish to argue there is a risk of global scale damage, then you should demonstrate the risk. I'm not arguing against avoiding risk. I'm arguing against assuming risk and conflating high risk and low risk forms.
What energy production would you find satisfactory? What is unsatisfactory?
Attacking PowerPoint presentations, email text, and funny cat pictures as the cause of people having to leave their homes is assuming undue impact from a small subset of behaviour. What is the actual impact of the behaviour you are criticizing? Is it insignificant in comparison to the resources spent predicting the path of a hurricane? I assume you would consider that a more worthy cause.
> And aluminum has more benefits than these AIs.
It's not like these data centers or the power to run them are free, or being paid for out of public generosity.
If AI proves to be over-hyped, then the private-sector backers of these ventures will lose their investments, and life will go on. Frankly, I'd expect that fate for half of the projects in this article (if not cancelled before completion) just because of the "gold rush" dynamics at play.
What happens to the power plants they built or revived?
If AI fizzles out and the investors lose everything, then I guess a group of the world's wealthiest individuals have built a lot of infrastructure projects at no benefit to themselves.
I'm prepared to take that risk.
If some of these projects are nuclear power plants that’s a huge risk.
Then contest the development of nuclear power plants, not the generation of electricity in general.
You're negating the value and requirement of economic laddering. You can't just jump from nascent AI to hyper useful AI / AGI, without going through all the steps from A to Z. Those steps are certain to be filled with failed experiments, 'wastefulness' (which isn't actually all wastefulness; for the process to be all waste, we'd all have to be omnipotent gods that knew better each step of the way).
It'd be like pretending you could jump straight to fusion (or advanced solar et al) and never have bothered with fossil fuels. Truly economically and technologically impossible.
> And aluminum has more benefits than these AIs.
What’s the reasoning behind that assertion?
I guess you could argue that back in 1776 AI and aluminum had a roughly equal impact, only for aluminum to overtake AI and become far more important by the early 20th century…
Pedantic sarcasm of course.
I'm comparing the current use of aluminium in a wide range of products to current and even some future uses of AI. It is in general a very useful material, and is even somewhat of an enabler for this level of AI, since a lot of power transmission cables are made out of it. It is better for the purpose because even though the cables are thicker, they weigh less for the same power capacity.
When they built the aluminum plant, did they already know whether aluminum was useful?
That's not the case for AI, especially the current type of AI.
They are convenient, but whether that leads to more efficiency is in dispute.
Occam's razor? Outside of techno-optimist communities made up of people whose livelihood depends on believing in it, AI has become synonymous with slop, hallucination, cheating, and offloading of human labor to shitty bots with cool tech demos that fall apart going from cursory to sophisticated use.
With any understanding of markets, Occam's razor should cut the other way. Basically, humanity doesn't need a centralized planning committee to determine what is or isn't useful; instead, people can spend their money/assets in a free market economy and express their individual needs and wants.
If global governments/society are concerned with the externalities of energy consumption, they can make regulations and impose taxes/costs on energy sources with undesirable externalities. However, they should NEVER, as in NEVER, decide what use of energy is "good" and what is "bad"; there is no such concept. Markets will handle preferences and allocations for the system, no centralized planning needed. If AI is still demanded even with increased overall energy costs, then it's because it provides value, and no ivory-tower critic can claim to be enlightened and entitled to decide for all of us.
In theory, there's no difference between theory and practice. In practice, there is.
With any understanding of the real world, "demand" can be seen to be artificially manufactured. The "demand" for smart TVs reflected in the long-term phasing out of non-smart TVs is the result of the will of the manufacturers and the docility of consumers. The same playbook will have nearly all of us eventually using "AI" PCs whether we want to or not.
Crypto currencies consume more than entire countries.

AI has at least good uses compared to that.

I have yet to come across a single good use-case. It will never offset the damage that it has done and is currently still doing. But I am still looking.

Web3 is going great... maybe it is time to reduce the energy consumption that is only doing harm.
But at least Jerry can autocomplete his node script and ask it silly questions. What's the earth's limited resources compared to that productivity boost?
How ridiculous, we will have to scale energy to meet the demand and it currently looks like nuclear will be the answer
Nuclear energy is complex even without sabotage by foreign actors.
Didn’t hear any fears that Russia could hit a WEC with a rocket.
some of the earth's limited resources were used to provide you with a computing device to post this snarky comment and produce nothing of value whatsoever. Jerry's node script was a far better investment.
we have to accelerate to the point where nuclear energy is abundant and cheap! what is this "limited resource"? energymaxx or end up like germany.
If nuclear power plants were to produce more than solar could ever provide, you would notice direct warming from their waste heat even on a planetary scale, rather than the indirect kind of warming we get from the sky column being marginally more likely to scatter IR emitted by the surface.
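To put rough numbers on that (my own back-of-envelope, using commonly quoted figures; treat it as order-of-magnitude only):

    # Compare direct waste heat against greenhouse-gas radiative forcing.
    # All figures below are rough, commonly quoted values, not from the thread.
    EARTH_SURFACE_M2 = 5.1e14          # Earth's surface area
    SOLAR_ABSORBED_W_M2 = 240          # average absorbed solar flux per square metre
    GHG_FORCING_W_M2 = 2.7             # approximate anthropogenic radiative forcing
    WORLD_PRIMARY_ENERGY_TW = 19       # rough current human primary energy use

    solar_absorbed_tw = SOLAR_ABSORBED_W_M2 * EARTH_SURFACE_M2 / 1e12    # ~120,000 TW
    ghg_equivalent_tw = GHG_FORCING_W_M2 * EARTH_SURFACE_M2 / 1e12       # ~1,400 TW
    waste_heat_w_m2 = WORLD_PRIMARY_ENERGY_TW * 1e12 / EARTH_SURFACE_M2  # ~0.04 W/m^2

    print(f"absorbed sunlight: ~{solar_absorbed_tw:,.0f} TW")
    print(f"GHG forcing, expressed as power: ~{ghg_equivalent_tw:,.0f} TW")
    print(f"today's waste heat: ~{waste_heat_w_m2:.2f} W/m^2, roughly 70x below GHG forcing")

So waste heat only starts to rival greenhouse forcing at roughly 70x today's total consumption, and only rivals absorbed sunlight at levels thousands of times higher, which is the scale the comment above is gesturing at.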
What do you mean by end up like Germany?
Producing so much energy that you are a net exporter?
And from the same people who download 5gb 4k versions of The Simpsons will come even more climate control bs.
This is kinda meaningless without knowing how much data goes through a given data center, and ultimately how many people-hours it ends up serving. A data center is just a concentrated processing center for many people's computing needs, so of course it's going to use a lot of power. If you removed the centralized data center and spread it out all over the world, that same amount of computing will be even less efficient, and use even more power, especially accounting for heating and cooling.
If the argument is that AI training is wasteful, well, yes it is, but it's also valuable enough that many people and billions of dollars want to keep doing it. And if they bring back enough nuclear, power won't really be an issue anyway. But whether it's concentrated in a data center, and whether a particular data center may end up using more energy than some random city, is completely irrelevant and really misleading as an absolute number instead of a ratio of energy consumption divided by usage.
A data center is also easier to place near a source of renewable/non emitting power generation. Harder to move entire cities around.
It is not only that these things are wasteful and totally bloated (as with most things in datacenters). The patterns in AI workloads are very different from any other, and nuclear by itself won’t be enough.
Why not?
There's a lot of hype, and now virtually every company is claiming to do something with AI.
Is it valuable, more than just a novelty, enough to pay for all the training and usage? Dunno.
We spend absurd amounts of resources on novelties and other completely unnecessary industries.
I’m currently writing this from an airplane on my way to an international vacation. I would be surprised if my entire lifetime future use of AI comes anywhere close to the energy needed to do this novelty trip, with all of the unnecessary things involved.
In the context of the article, that just means "a lot of companies put together are using more energy than some cities". OK, so what? There are a lot of wasteful companies and industries in the world.
It is interesting how you ignore the implications of what has been going on.
Go on? I'm interested in your point, but not completely sure what you mean. That AI is using a lot of energy? Yes, I think it is too. What are the implications of that? Or did you mean something else?
Increased demand for, and investment in energy is what we need to get to abundant, cheap, green energy. Stop the stupid incrementalism (e.g., banning gas stoves), and jump to the solution. Or take off the mask and admit the real goal is deindustrialization.
Incremental transition makes it easier to manage supply and demand.
If you're a government and you have planning permits for the next decade of power plants going past you, that does matter regardless of the nature of the power plants.
Or at least, it should matter. Texas didn't manage that very well, famously doesn't like regulation, and had power issues that it blamed on renewables but were mainly due to gas not being winterised properly.
https://en.wikipedia.org/wiki/2021_Texas_power_crisis
I’m curious to see a detailed analysis on potential impacts to energy costs and how it changes the ROI on things like residential solar.
These articles never dive deep enough into the subject to make an informed decision. They just want us to be outraged by how much energy it uses then don’t even acknowledge the other consumers that use just as much or more.
I don't know how anyone can look at situations like this and claim letting investors run society is going to yield any rational result for our species.
I look at this and think "wow, I wish I could think big enough to do things at the efficient level even though it's so far outside the easy and 'normal' path." Privileging an arbitrary intuition like "AI should use less electricity than a city" would be irrational.
What's irrational is not starting from base values. It's hard to see how AI in its current form gets us any closer to a materially better future.
The surprise for me is that the industrial and academic leaders, with the exception of Yann LeCun, have been saying loudly and constantly "this is bad and we don't know how to make it not bad" and "it has the potential to kill us all" and "optimising anything too hard without concern for literally all of your other values will break those other things you value", and yet a big part of the response has been people saying "pshaw, you're just saying that to sound cool"…
…and the messages that are actually getting to people are data centre resource usage, copyright, and cheating in exams.
Tbh, I can't follow any of the reasoning that AI is a danger to anyone, except through the traditional form of enabling the worst parts of bureaucracy. I just think it's further evidence of a tremendously wasteful and suicidal society.
That's one possibility for the harms it may cause.
I think there's a reasonable chance that someone who thinks the idea is laughable will, at some point, ask an AI to destroy the world specifically to demonstrate that it can't… actually, that part has already happened*, the questions are: (1) how competent does it have to be at following that instruction to actually do so? (2) What resources would someone have to allow it to access (deliberately or accidentally) to pull that off?
* someone has asked an AI to destroy the world, but I don't know their motivation.
Fortunately for everyone, this one was kinda dumb: https://community.openai.com/t/chaosgpt-an-ai-that-seeks-to-...
The brain uses 20% of the human body's energy.
As machines become intelligent, maybe AI will account for 20% of global energy consumption?
AI will account for 99% of global energy consumption. Solve for equilibrium.
Who cares? If it brings benefits to the people who paid for the service, duh!
I guess the price of electricity will have to go up then
An "average" Qatari consumes 1160x more energy than a Somalian. An average American consumes 84x more energy than a Nepali, and so on... [1]
There will be 10 billion of us on this planet by ~ 2050. [2]
We might need more energy efficient AI, if everyone is joining the AI party... Please tell me that someone on this hallowed site is working on this. :-)
[1] https://www.cia.gov/the-world-factbook/field/energy-consumpt...
[2] https://datatopics.worldbank.org/world-development-indicator...
Loads of people here are.
Off the top of my head:
https://news.ycombinator.com/item?id=30780455
and
https://www.ycombinator.com/companies/helion-energy
Those are more for the supply part of the equation. I was thinking of reducing demand.
I don't think that the energy needs that will emanate from "AI" stop at LLM training/inference. I think this recent burst of demand is a starting point.
People who write about this stuff refuse to believe how much energy an oil refinery or a chlorine factory consumes. But not one single journalist in the history of America has ever written a flashy headline like "Chloralkali process consumes more energy than thirty entire states" even though that headline would be completely accurate.
I think, especially in the case of an oil refinery, people expect it to be awful for the planet.
AI is interesting because it isn't expected to be bad for the planet. Or at least it wasn't; I think we have all read these articles a lot by now.
I don't expect AI to be bad for the planet either because unlike many other industries it completely internalizes its energy costs. The industry is naturally incentivized to reduce its energy consumption, because energy is ~100% of their operating cost.
This is the crypto scare all over again.
My answer is still the same: you always have to do a cost-benefit analysis.
AI is many times more valuable than the energy it consumes, and thus:
AI doesn't have an energy problem; energy has an AI problem.
I think it would be more accurate to say that, if AI wasn't constrained by the limits to production of compute, then everyone else would have an "energy costs too much" problem.
I think the jury is still out on the cost-benefit of AI in the long run.
It doesn't look good at the moment:
https://news.ycombinator.com/item?id=42220298
Yeah, whatever - most of us like a president who does not believe climate change is real and thus this topic does not need to be addressed. Drill, baby, drill!
:'(
I think it will move in a direction where they have to provide their own power.
Can we keep The Matrix out of training data? We don't want AI to see what the machines did there for an additional power source.
Fortunately they broke the laws of thermodynamics in the film.
Apparently one of the early scripts had human brains helping control fusion reactors rather than directly being batteries, which made more sense.
Unfortunately, basically all the various AI-uprising dystopias are in the training data, and there's a non-zero chance someone gets an AI smart enough to try to do The Matrix/Holographic Doctor Moriarty/I Have No Mouth and I Must Scream/Westworld/etc. despite the various physical impossibilities, which is already bad enough.
On the plus side, loads of fiction has some plucky band of adventurers who can defeat the AI and save the world, so there's a good chance a rogue AI will deliberately install a big red switch that turns itself off because it can't tell Hollywood from reality.
Unfortunately the same disconnect may result in it shooting the hero that it anticipates will heroically drag themselves to that switch, rather than, more realistically, falling over screaming in pain or falling silent because they died.