• danc4498@lemmy.world · ↑1 · 2 hours ago

    You don’t know if they’re sentient or not. I bet if the AI becomes sentient it won’t let humans know.

  • prof@infosec.pub · ↑12 · 7 hours ago

    Okay, I agree, but you can’t tell me some server rooms don’t look cool af.

    • stoy@lemmy.zip · ↑9 · 7 hours ago

      That is correct. I have worked in a server room that had glass floor panels with gamer lighting under them, mood lighting in the ceiling, and a huge viewing window in the wall facing the rest of the office.

      It looked damn cool.

  • Katzelle3@lemmy.world · ↑13 · 11 hours ago

    The left image looks like an old Cray computer. Could be an interesting indicator of the age of the image or the artist who made it.

      • otacon239@lemmy.world · ↑8 · edited · 11 hours ago

        I think their point stands. I wonder if the creator of the show was directly influenced by Cray.

        • cm0002@lemmy.world (OP) · ↑2 · 11 hours ago

          I so completely and utterly misread their comment, I knew that was a big hit I took lolol

    • antimidas@sopuli.xyz · ↑1 · edited · 8 hours ago

      Cray (the company) often had interesting designs that probably ended up influencing a lot of sci-fi. CDC (Control Data Corporation) had interesting designs as well, prior to that, and Cray (the person) worked there before founding his own company.

      One other supercomputer line with iconic looks is the Connection Machine, which is IMO one of the coolest-looking computers ever made.

  • blarth@thelemmy.club · ↑9 ↓1 · 10 hours ago

    They are going to be powered by small modular reactors, though. So I guess there’s one extra box you can tick.

    • hansolo@lemm.ee · ↑10 · 10 hours ago

      We can tick the box on spec now and check back in 10 years to see if they ever actually developed a commercially viable reactor.

    • Tar_Alcaran@sh.itjust.works · ↑2 · 2 hours ago

      AI companies are completely unable to afford to keep running, and I’d be incredibly surprised if they’re still seriously around in 2028.

  • vivendi@programming.dev · ↑4 ↓9 · 9 hours ago

    This is… incredibly dumb.

    The whole climate impact of AI is overstated a million times over. You can run a perfectly capable, GPT-4o-destroying LLM on your own GPU right now. What is the difference between jonkling to vidya gaems and running an LLM for doing something productive for a fucking change?

    The servers running the network you posted this shit over, the servers used to develop your favorite gacha coom game, the servers used to deliver your shit-ass Xitter feed or your Netflix garbage, use more power than AI, and LLMs are actually actively more productive than that shit.
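The gaming-versus-local-LLM comparison can be put in rough numbers; every figure in this sketch (GPU draw, token rate, answer length) is an assumption for illustration, not a measurement:

```python
# Back-of-envelope: energy of one gaming session vs. local LLM answers
# on the same GPU. All numbers below are illustrative assumptions.

GPU_POWER_W = 300        # assumed draw of a consumer GPU under load
GAMING_HOURS = 2         # assumed length of an evening gaming session

TOKENS_PER_SECOND = 40   # assumed local LLM generation speed
TOKENS_PER_ANSWER = 500  # assumed length of one answer

gaming_wh = GPU_POWER_W * GAMING_HOURS                 # watt-hours used gaming
seconds_per_answer = TOKENS_PER_ANSWER / TOKENS_PER_SECOND
answer_wh = GPU_POWER_W * seconds_per_answer / 3600    # watt-hours per answer

answers_per_session = gaming_wh / answer_wh
print(f"One gaming session ~ {gaming_wh:.0f} Wh")
print(f"One LLM answer     ~ {answer_wh:.2f} Wh")
print(f"Equivalent answers : {answers_per_session:.0f}")
```

Under these assumptions, one gaming session costs about as much energy as several hundred locally generated answers; swap in your own hardware numbers to re-run the comparison.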

    • Tar_Alcaran@sh.itjust.works · ↑1 · 2 hours ago

      Weird, then, that OpenAI loses money on every query, loses money on every paid subscription, and is on track to lose billions this year.

    • VonReposti@feddit.dk · ↑6 ↓2 · edited · 8 hours ago

      You can run a fully fledged LLM at home, but you can’t train the model. The latter is a huge contributor to power consumption. Running it is peanuts in comparison.

      • vivendi@programming.dev · ↑5 ↓4 · 8 hours ago

        That is a one-time cost. Do you add the cost of every single watt of electricity ever generated, the cost of building factories, extracting materials and so on, to the “climate impact” of electric rail service or EVs?

        This is just hysteria, I’m going to be absolutely deadass with you.

        Also, you theoretically could train an LLM entirely on your own PC, it’d just take a bit of time.
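The one-time-cost argument amounts to amortizing training energy over the model's lifetime queries. A numerical sketch; the training energy, query volume, and service lifetime below are illustrative assumptions, not figures from any vendor:

```python
# Amortize an assumed one-time training energy budget over the queries
# the model serves before retirement. All figures are assumptions.

TRAINING_ENERGY_GWH = 10       # assumed total training energy (GWh)
QUERIES_PER_DAY = 100_000_000  # assumed global query volume
LIFETIME_DAYS = 365            # assumed lifetime before retraining

total_queries = QUERIES_PER_DAY * LIFETIME_DAYS
amortized_wh_per_query = TRAINING_ENERGY_GWH * 1e9 / total_queries  # Wh

print(f"Amortized training energy: {amortized_wh_per_query:.3f} Wh/query")
```

With these placeholder numbers the amortized training cost comes out to a fraction of a watt-hour per query; the conclusion flips if the model is retrained often or serves far fewer queries.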

        • VonReposti@feddit.dk · ↑2 · 7 hours ago

          Yes, I do, in fact. We need to lower the environmental impact of production too; consumption is just a drop in the bucket. To put it in perspective: I can run my PC from a second-hand generator, and most low-end generators could probably run tens of my PCs. A datacenter training the high-end LLMs that I could be running needs a nuclear power plant’s worth of power. We are talking multiple orders of magnitude of difference.

          I suppose you don’t consider the coal-powered electricity that powers your EV when reflecting on your impact, too?
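The orders-of-magnitude claim checks out on a back-of-envelope basis; the generator, PC, and training-cluster wattages here are assumed figures, not measurements:

```python
# Rough scale comparison: a small generator vs. a training datacenter.
# All wattages below are illustrative assumptions.
import math

GENERATOR_W = 2_000               # assumed small second-hand generator
PC_W = 200                        # assumed desktop PC draw
TRAINING_CLUSTER_W = 30_000_000   # assumed ~30 MW training datacenter

pcs_per_generator = GENERATOR_W // PC_W
orders_of_magnitude = math.log10(TRAINING_CLUSTER_W / GENERATOR_W)

print(f"PCs per generator      : {pcs_per_generator}")
print(f"Orders of magnitude gap: {orders_of_magnitude:.1f}")
```

Under these assumptions one generator runs about ten PCs, while the training cluster sits roughly four orders of magnitude above the generator.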

          • vivendi@programming.dev · ↑2 ↓1 · 7 hours ago

            OK, you walk the walk, I respect that.

            If you could be arsed, it’s entirely possible to train an LLM on 10 PCs using strictly open-source and ethical datasets available on Hugging Face (although not something that is going to beat SOTA models).
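A rough sense of how long ten PCs would take, using the common 6·N·D rule of thumb for training compute (about 6 FLOPs per parameter per training token); the model size, token count, and per-GPU throughput are all assumed figures:

```python
# Estimate training time for a small LLM on a handful of consumer GPUs
# using the 6 * params * tokens FLOPs rule of thumb. Assumed figures.

PARAMS = 1_500_000_000        # assumed GPT-2-XL-scale model
TOKENS = 30_000_000_000       # assumed ~20 training tokens per parameter
GPU_FLOPS = 2e13              # assumed ~20 TFLOP/s sustained per GPU
N_GPUS = 10                   # one GPU per PC

total_flops = 6 * PARAMS * TOKENS
days = total_flops / (GPU_FLOPS * N_GPUS) / 86_400

print(f"Rough training time: {days:.1f} days")
```

With these placeholder numbers the run lands in the couple-of-weeks range, which matches the "a bit of time, but feasible" framing; a SOTA-scale model would push this into years.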