They could just invest in a solar farm or something, they are just a lot more economical.
Nuclear is okay, but the costs compared to renewables are very high, and you have to put a lot of engineering and security effort into building a reactor, compared to a solar panel that you can basically just put up and replace if it breaks.
You probably know this discussion already though.
Edit: Glad to see a nice instance of the discussion going here.
In their specific use case that won’t really work.
They want to use all of their available property for server racks. Covering the roof with solar won’t give enough power/area for them. A small reactor would use a tiny fraction of the space, and generate several times the power. That’s why it’d be worth the extra cost.
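The power-per-area argument is easy to check with a back-of-envelope calculation. All numbers below are illustrative assumptions (roof size, panel rating, capacity factors), not figures from the thread:

```python
# Illustrative back-of-envelope: rooftop solar vs. a small modular reactor (SMR).
# Every figure here is a rough assumption for the sake of the comparison.

roof_area_m2 = 50_000            # assumed roof of a large data center hall
solar_capacity_w_per_m2 = 200    # typical panel nameplate rating per square metre
capacity_factor = 0.20           # day/night cycle plus weather, rough average

avg_solar_power_mw = roof_area_m2 * solar_capacity_w_per_m2 * capacity_factor / 1e6
print(f"Rooftop solar, average: {avg_solar_power_mw:.1f} MW")   # ~2 MW

smr_power_mw = 77                # e.g. one NuScale-class module, nameplate
smr_capacity_factor = 0.9        # typical nuclear availability
avg_smr_power_mw = smr_power_mw * smr_capacity_factor
print(f"One SMR module, average: {avg_smr_power_mw:.1f} MW")    # ~69 MW
```

Under these assumptions a single small reactor delivers tens of times the average power of the entire roof, which is the trade-off the comment describes.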
For those who haven’t seen this discussion before, I feel like doing the next step in the dance. Cheers Plex.
It’s important to note that nuclear is capable of satisfying baseload demand, which is particularly important for things like a commercial AI model training facility, which will be scheduled to run at full blast for multiple nines.
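For reference, "multiple nines" of availability translates into very little permitted downtime per year. A quick conversion (illustrative snippet, not from the thread):

```python
# Convert an availability target ("number of nines") into allowed downtime per year.
HOURS_PER_YEAR = 24 * 365

def downtime_hours_per_year(nines: int) -> float:
    availability = 1 - 10 ** -nines      # e.g. 3 nines -> 0.999
    return HOURS_PER_YEAR * (1 - availability)

for n in (2, 3, 4):
    print(f"{n} nines: {downtime_hours_per_year(n):.2f} h/year allowed downtime")
# 2 nines -> 87.60 h, 3 nines -> 8.76 h, 4 nines -> 0.88 h
```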
Solar+storage is considerably more unreliable than a local power plant (be it coal, gas, hydro, or nuclear). I have solar panels in an area that gets wildfire smoke (i.e. soon to be the entire planet), and visible smoke in the air effectively nullifies solar.
Solar is fantastic for covering load that is correlated with insolation: for example, colocated with facilities that use air conditioning (which does include data centers, though there it's the processing, not the cooling, that drives most of the power draw).
While you are right about baseload being more satisfiable through nuclear, you are wrong that it’s in any way important for AI model training. This is one of the best uses for solar energy: you train while you have lots of energy, and you pause training while you don’t. Baseload is important for things that absolutely need to get done (e.g. powering machines in hospitals), or for things that have a high startup cost (e.g. furnaces). AI model training is the opposite of both, so baseload isn’t relevant at all.
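The "train while power is abundant, pause while it isn't" idea can be sketched as a simple scheduler gate. The price threshold and function names below are hypothetical stand-ins, not any real API:

```python
# Sketch of price-gated training: execute training steps only while electricity
# is below an assumed threshold. Prices and step logic are illustrative.
PRICE_CEILING = 0.12  # assumed $/kWh above which training pauses

def train_when_cheap(price_feed, train_step, max_steps):
    """price_feed yields the current electricity price per interval;
    train_step() performs one unit of work. Returns steps actually executed."""
    done = 0
    for price in price_feed:
        if done >= max_steps:
            break
        if price <= PRICE_CEILING:
            train_step()
            done += 1
        # else: sit idle this interval and re-check the price next time
    return done

# Example: prices alternate cheap/expensive; only the cheap intervals train.
prices = [0.08, 0.30, 0.09, 0.25, 0.10, 0.07]
steps = train_when_cheap(iter(prices), train_step=lambda: None, max_steps=10)
print(steps)  # 4 cheap intervals -> 4 steps
```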
It’s not life-critical but it is financially-critical to the company. You aren’t going to build a project on the scale of a data center that is capable of running 24/7 and not run it as much as possible.
That equipment is expensive, and has a relatively short useful lifespan even if not running.
This is why tire factories and refineries run three shifts, this isn’t a phenomenon unique to data centers.
Sorry, but that’s wrong. You’ll run it as much as is profitable. If electricity cost goes up, there is a point where you’ll stop running it, since it becomes too expensive. Even more so considering that AI models don’t have a set goal to reach - you train them as long as you want and can, but training a little bit extra will have diminishing returns after a while.
Not really, the limiting factors in AI training are mostly supply of cards. The cards already in use will stay in use until they fail, they won’t be replaced with newer cards the second they get released.
This is comparing apples and oranges, since tire factories:

- have long-term planning and production goals to reach
- have employees whose shifts must be scheduled
- have resource input costs that are higher than electricity
Of course you want the highest utilisation that you can economically reach, but a better comparison would be crypto mining - which also has expensive equipment that has a relatively short useful lifespan even if not running, and yet they stop mining when electricity is too expensive.
“And you pause training while you don’t.” lmao I don’t know why people keep giving advice in spaces they’ve never worked in.
What are you trying to imply? That training Transformer models necessarily needs to be a continuous process? You know it’s pretty easy to stop and continue training, right?
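Stopping and resuming really is routine: training loops checkpoint their state and pick up where they left off. A minimal sketch in plain Python, standing in for a real framework's save/load (e.g. PyTorch's `torch.save`/`torch.load`):

```python
# Minimal checkpoint/resume sketch. Real frameworks persist model and
# optimizer state the same way; here the "state" is just a step counter
# and a stand-in loss value.
import json, os, tempfile

def train(total_steps, ckpt_path):
    # Resume from the checkpoint if one exists, else start fresh.
    if os.path.exists(ckpt_path):
        with open(ckpt_path) as f:
            state = json.load(f)
    else:
        state = {"step": 0, "loss": 100.0}
    while state["step"] < total_steps:
        state["step"] += 1
        state["loss"] *= 0.9          # stand-in for a real parameter update
        with open(ckpt_path, "w") as f:
            json.dump(state, f)       # persist after every step
    return state

ckpt = os.path.join(tempfile.mkdtemp(), "ckpt.json")
partial = train(3, ckpt)     # "power got expensive": stop after step 3
resumed = train(5, ckpt)     # later: reload the checkpoint and finish
print(partial["step"], resumed["step"])  # 3 5
```

The second call continues from step 3 rather than restarting, which is the whole point of the "pause when power is scarce" argument.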
I don’t know why people keep commenting in spaces they’ve never worked in.
No datacenter is shutting off a leg, hall, row, or rack because “We have enough data, guys.” Maybe at your university server room where CS majors are interning. These things run 24/7/365 with UU tracking specifically to keep them up.
What are you talking about? Who said anything close to “we have enough data, guys”?
Are you ok? You came in with a very snippy and completely wrong comment, and you’re continuing with something completely random.
This is false. Nuclear has a very competitive levelized cost of energy (LCOE). Nuclear has high upfront costs, but fuel is cheap and the reactor can last much longer than solar panels. The big picture matters, not just the upfront costs.
Source: https://www.energy.gov/sites/prod/files/2015/08/f25/LCOE.pdf
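For reference, LCOE is lifetime cost divided by lifetime generation, both discounted to present value. A sketch of the calculation with purely illustrative inputs (not figures from the linked DOE report):

```python
# Levelized cost of energy: discounted lifetime costs divided by discounted
# lifetime output. All input numbers below are illustrative assumptions.

def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    costs = capex           # upfront capital, paid in year 0
    energy = 0.0
    for year in range(1, lifetime_years + 1):
        d = (1 + discount_rate) ** year
        costs += annual_opex / d
        energy += annual_mwh / d
    return costs / energy   # $/MWh

# High capex, long life (nuclear-like) vs. low capex, shorter life (solar-like).
nuclear_like = lcoe(capex=6e9, annual_opex=1.2e8, annual_mwh=8.0e6,
                    lifetime_years=60, discount_rate=0.07)
solar_like = lcoe(capex=1e9, annual_opex=1.5e7, annual_mwh=2.0e6,
                  lifetime_years=25, discount_rate=0.07)
print(f"nuclear-like: ${nuclear_like:.0f}/MWh, solar-like: ${solar_like:.0f}/MWh")
```

This also shows why the debate hinges on the discount rate and the lifetime assumptions: the long reactor life only pays off if future output isn't discounted away.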
You better check your fuel prices: https://www.economist.com/finance-and-economics/2023/09/21/why-uranium-prices-are-soaring
Plus imagine how expensive uranium will get once we start relying on nuclear. It’ll be the new oil.
Raw material is usually a small fraction of the cost of refueling. I would also argue that the Russian-Ukrainian conflict is a small blip in the lifetime of a reactor, ~80 years. Transient pricing will have a negligible effect on the LCOE.
Not only that, imagine how thrilled nature and the environment will be at massive extraction efforts ripping apart landscapes to provide fuel for a method of generating power that is obsolete since at least three decades by now.
Don’t need to, just down-blend from the available fuel used from weapons put out of commission as a result of disarmament treaties.
Now, about those materials used to construct solar panels…
deleted by creator
Sucks to wait for the sun to come out to make Bing answer though. “Disclaimer: Answer dependent on cloud cover or night time”.
Do you seriously think that Bing trains an AI model when you send a request? Why would they do that?
Oh, they’re working on it. It’s dumb, but it’s happening.
I can’t imagine they are. What would the training data of those models be? Why would you train the model when the user sent a request? Why would you wait responding to the request until the model is trained?
Often, these models are a feedback loop. The input from one search query is itself training data that affects the result of the next query.
Sure, but that’s not done with the kind of model this thread is about (separate training and inference). You’re talking about classical ML models with continuous updates, which you wouldn’t run on this kind of GPU infrastructure.
You’d get used to it awfully fast though.
Are you arguing solar is more economical than nuclear? Cause if so, you're wrong by a long shot.
That was true 20 years ago. You are working off extremely outdated information.
No, you are. Solar is much cheaper than nuclear is.
Yeah, I don’t know where nuclear advocates got the idea that their preferred method is the cheapest. It’s ludicrously untrue. Just a bunch of talking points that were designed to take on Greenpeace in the 90s, but were never updated with changing economics of energy.
I can see why Microsoft would go for it in this use case. It’s a steady load of power all the time. Their use case is also of questionable benefit to the rest of humanity, but I see why they’d go for it.
The people who actually put money into energy projects are signalling their preferences quite clearly. They took a look at nuclear’s long history of cost and schedule overruns, and then invested in the one that can be up and running in six months. The US government has been willing to issue licenses for new nuclear if companies have their shit in order. Nobody is buying.
Yes, because humans in a capitalist society are always well known for making the best decisions possible based on the good of humankind. Nothing else factors in whatsoever.
For anyone too thick, profit. Profit factors in above literally everything else. And short term profit at that. We shouldn’t make decisions of what’s best for society based on what massive corporations decide is best for their bottom line.
If you’re implying nuclear would be the better option outside of profit motive, please stop. We have better options now.
If we cleared every hurdle and started building reactors en masse, it would be at least five years before a single GW came online. Often more like ten. Solar and wind will use that time to run the table.
Edit: Also, this is a thread about a company dedicating a nuclear reactor to training AI models to sell people shit. This isn’t the anti-capitalist hill to die on.
The article here is literally talking about a company that wants to:

Invest in a next-generation technology that is as yet unproven, but hopes to solve the financial problems that have plagued traditional reactor projects. And it's years away from actual implementation, if it happens at all.