Coding, Data Science, A.I. catch-All | Grok update goes MechaHitler

  • Thread starter: nycfan
  • Replies: 368
  • Views: 11K
  • Forum: Off-Topic

I'm glad we don't, but it's a little surprising to me that we don't charge different rates for different uses of electricity. I could definitely see a scenario where people vote to have industrial and commercial users subsidize residential electricity.

I believe right now most industrial users tend to pay lower rates, but that's mostly because industrial users draw electricity in a steadier, more predictable way, which makes it easier to maximize the efficiency of power plants and other infrastructure, not because political policy dictates that industrial companies pay less.
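To make the "steady and predictable" point concrete, here's a rough sketch of load factor, the ratio utilities often use to describe how steady a customer's demand is. The specific numbers are assumptions for illustration, not real tariff data:

```python
# Load factor = average demand / peak demand. A steadier load (higher
# load factor) spreads fixed capacity costs over more kWh, which is one
# reason industrial customers often end up with lower per-kWh rates.

def load_factor(avg_kw: float, peak_kw: float) -> float:
    """Ratio of average demand to peak demand (0 to 1)."""
    return avg_kw / peak_kw

# Assumed example numbers, not real data:
factory = load_factor(avg_kw=900, peak_kw=1000)    # steady industrial load
household = load_factor(avg_kw=1.2, peak_kw=6.0)   # spiky residential load

print(f"factory load factor:   {factory:.2f}")     # 0.90
print(f"household load factor: {household:.2f}")   # 0.20
```

The factory runs near its peak all the time, so the infrastructure sized for its peak is almost always earning; the household's peak (dinner time, AC) sits idle most of the day.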
 
I think we may be headed in rather the opposite direction, at least in the near term, in terms of how resources are prioritized when the grid is under high stress: the data centers and other industry will come first. That's not exactly new, but I think it may be amplified quite a bit.
 
The AI companies might be able to train their models when electricity demand is low. There's no technical reason they couldn't train hard at night, when power demand is low, and pull back during the day. The AI companies would prefer not to do that, since running 100% of the time maximizes the return on their AI chip investment, but it is possible.

The inference portion (ask a question, get an answer, in the case of LLMs) would need to run when people are active, but I think its power needs are much lower than training's.
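The scheduling idea above can be sketched in a few lines. This is just an illustration assuming a fixed afternoon peak window; real utilities publish time-of-use schedules, and the hour range here is made up:

```python
# A rough sketch of demand-aware training scheduling: run GPU-heavy
# training only outside an assumed peak-demand window.
from datetime import datetime

PEAK_HOURS = range(14, 20)  # assumption: 2pm-8pm is peak grid demand

def should_train(now: datetime) -> bool:
    """True if training should run at full rate (off-peak hours)."""
    return now.hour not in PEAK_HOURS

# Overnight: train at full rate; late afternoon: pull back.
print(should_train(datetime(2025, 7, 10, 3)))   # True  (3am, off-peak)
print(should_train(datetime(2025, 7, 10, 16)))  # False (4pm, peak)
```

A real version would probably key off a price signal or utility demand-response API rather than the clock, but the principle is the same: the chips idle (or downclock) exactly when the grid is most stressed.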
 