• Match!! · 10 months ago

    openAI could just stop training worse models for a bit

    • enkers@sh.itjust.works · 10 months ago

      Isn’t this exactly why they’re doing it? They’re trying to make their product economically viable by optimizing compute cost relative to perceived usefulness.

  • spaduf@slrpnk.net · 10 months ago (edited)

    The costs are significant and growing, but we should put some things into perspective to really tackle the problem efficiently. As an individual, heavy usage of these tools (something like 1000 images generated) still produces roughly the same emissions as driving across town, and generating text is pretty much negligible in all scenarios.
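    As a sanity check, that comparison can be sketched as a back-of-envelope calculation. Every number below is an assumed placeholder for illustration (per-image energy, grid carbon intensity, car emissions, and trip length), not a measured figure:

```python
# Rough comparison: 1000 generated images vs. one drive across town.
# All constants are assumptions, not measurements.
WH_PER_IMAGE = 3.0          # assumed energy per image generation (Wh)
GRID_G_CO2_PER_KWH = 400.0  # assumed average grid intensity (g CO2/kWh)
CAR_G_CO2_PER_KM = 200.0    # assumed petrol-car emissions (g CO2/km)
TRIP_KM = 10.0              # assumed "across town" distance (km)

def image_emissions_g(n_images: int) -> float:
    """Grams of CO2 for n generated images under the assumptions above."""
    kwh = n_images * WH_PER_IMAGE / 1000.0
    return kwh * GRID_G_CO2_PER_KWH

def drive_emissions_g(km: float) -> float:
    """Grams of CO2 for a car trip of the given length."""
    return km * CAR_G_CO2_PER_KM

images = image_emissions_g(1000)    # 3 kWh -> 1200 g CO2
drive = drive_emissions_g(TRIP_KM)  # 2000 g CO2
print(f"1000 images: {images:.0f} g CO2, drive: {drive:.0f} g CO2")
```

    Under these assumed inputs the two come out within a factor of two of each other, which is consistent with the "same rough level of emissions" framing; different assumptions shift the numbers but not the order of magnitude.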

    Where we really need to be concerned is video generation (which could easily blow current energy usage out of the water) and water usage in these massive data centers. However, most of the current research on the subject does a pretty poor job of separating water usage for “AI” from general usage. This is why the next step is enforcing transparency, so we can get a picture of how things are shaping up as this technology develops.

    All that said, there is some pretty low-hanging fruit when it comes to improving efficiency. A lot of these models are essentially first passes on a project, and efficiency will improve simply as they start to target edge and local deployment. Similarly, these water-cooling systems are predicated on some fairly wasteful ideas, namely that cool fresh water is abundant and does not warrant preservation. Simply factoring in that this is clearly no longer the case will go a long way towards reducing that usage.

    • spaduf@slrpnk.net · 10 months ago (edited)

      To address the article a little more directly: it’s notable that the article begins with Sam Altman’s take on the subject. His feelings are based on two fundamentally flawed premises:

      1. These models MUST get bigger for the improvements that their users DEMAND.
      2. The only solution to any environmental criticism is FUSION, a technology that Altman has personally invested in.

      2 is ridiculous just on the face of it, but I think folks may have a harder time understanding why 1 is problematic. It is true that OpenAI’s business model essentializes the idea that these models can’t ever be run locally, but the incentive to use their cloud services is quickly diminishing as smaller, local models catch up. This cycle will likely continue until local models are good enough to serve the needs of the vast majority of people, especially as specialized hardware makes its way into more and more consumer devices.

  • rowrowrowyourboat@sh.itjust.works · 10 months ago

    one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

    And it’s not just energy. Generative AI systems need enormous amounts of fresh water to cool their processors and generate electricity. In West Des Moines, Iowa, a giant data-centre cluster serves OpenAI’s most advanced model, GPT-4. A lawsuit by local residents revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use — increases of 20% and 34%, respectively, in one year, according to the companies’ environmental reports. One preprint suggests that, globally, the demand for water for AI could be half that of the United Kingdom by 2027.

  • Wanderer@lemm.ee · 10 months ago

    Energy use, or at least the amount of usable energy we consume, is always going to increase.

    It’s not about stopping that; it’s about making sure renewables grow a lot faster than that.

    The same amount of energy as 33,000 homes for a tool like this? That’s peanuts.
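    For scale, here is a rough sketch of what 33,000 homes’ worth of electricity looks like against total US residential use. The household count and per-home usage are assumed round numbers for illustration, not sourced figures:

```python
# Scale check for the "33,000 homes is peanuts" claim.
# Both constants below are assumed round numbers, not sourced data.
KWH_PER_HOME_PER_YEAR = 10_500.0  # assumed average US household usage
US_HOUSEHOLDS = 130_000_000       # assumed number of US households

chatgpt_kwh = 33_000 * KWH_PER_HOME_PER_YEAR       # total load attributed above
total_kwh = US_HOUSEHOLDS * KWH_PER_HOME_PER_YEAR  # all US residential use
share = chatgpt_kwh / total_kwh                    # 33,000 / 130,000,000 homes

print(f"~{chatgpt_kwh / 1e9:.2f} TWh/yr, {share:.4%} of US residential electricity")
```

    Under these assumptions the load works out to a few hundredths of a percent of residential electricity, though the article’s point is that this figure is growing quickly.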

    The real problem no one wants to talk about is population growth. We need fewer people, but people are going to use more energy and have better standards of living. But the world’s probably fucking doomed anyway.

    • milicent_bystandr@lemm.ee · 10 months ago

      > The real problem no one wants to talk about is population growth.

      I don’t know how young you are, but this has certainly been talked about. A lot. tl;dr: there is no tl;dr for this issue, but a significant aspect is inequality rather than absolute population size.

    • riodoro1@lemmy.world · 10 months ago

      > The real problem no one wants to talk about is population growth.

      But all muh unborn Mozarts and shit. But muh instincts. BUT MUH RIGHTS.

      People are not ready for this discussion, and the fact that every developed nation makes fucking free parking spots for large families, tax write-offs and whatever the fuck proves it. Let’s not fix this planet, instead breed like fucking rabbits because somehow “we are going extinct” while surpassing 8 billion in population. We are fucked because we can’t keep our instincts at bay. Our instinct is to reproduce and own as much useless crap as possible. Turns out it isn’t really compatible with how the universe works, but that won’t fucking stop us from trying.

      Carbon capture your children.