• UnspecificGravity@piefed.social · 151 points · 13 days ago

    Who knew that spending billions of dollars to put a cognitive shit generator into every corner of the world was a bad investment?

      • WoodScientist@lemmy.world (banned) · 16 points · 13 days ago

        They have perfect sense. The problem is they simply have too much money. Tech monopolists have already maxed out their markets, but their corporate valuations depend on endless explosive growth. So they need to always be chasing the next big thing. They need to find new world-changing products, or at least be seen as on the path to such products.

        They really don’t have any good places to invest all their money. What good investments does Zuckerberg really have to throw his profits at? Facebook and Instagram really don’t have a lot of growth potential anymore. The market is tapped out. So instead Zuck throws his billions at long shot projects like LLMs and the Metaverse. He’s at the point where the only investment left to him is to buy lottery tickets.

      • BremboTheFourth@piefed.ca · 13 points · 13 days ago

        If they had more sense than money, they’d be sensible enough to not make those investments. It’s more money than sense. Or you could go with “less sense than money,” but that feels wrong somehow. As is, you’re doing the “could care less” thing.

        • sepi@piefed.social · 8 points · 13 days ago

          The guy you’re replying to is saying that the people who knew spending billions of dollars on AI was bad had “sense” but no money.

          The people who had money had no “sense” and thus did not know that spending billions on AI was a bad idea.

          tl;dr the people with the money who make the spending decisions have no sense. The rest of us have sense but no money and are not invited to the table where spending decisions are made.

    • jj4211@lemmy.world · 1 point · 12 days ago

      I agree, but this isn’t that shoe falling yet.

      This is a bunch of people without experience assuming everything will work, and lots of little things not coming together derailing all this stuff: no water, inadequate power, a lack of local contractors, a disgruntled community, some permitting problem. Any one of these problems surprises them, and many, many millions of dollars of stuff is suddenly in limbo…

    • Tollana1234567@lemmy.today · 1 point · 13 days ago

      By the time they build one, they'd need new, faster chips that would inevitably consume more power each time.

  • Lost_My_Mind@lemmy.world (mod) · 135 points · 13 days ago

    Ok, so they bought billions of dollars of ram/storage, to put inside servers that haven’t been bought yet, to put inside data centers that haven’t been built yet, in order to run AI that doesn’t work yet, in order to chase profits that are impossible to achieve.

    And now, despite driving ram prices up to absurd prices, you’ve begun to realize the same thing all of us knew from before day one. NOBODY WANTS THIS SHIT!!!

    • dylanmorgan@slrpnk.net · 54 points · 13 days ago

      None of that RAM (or the GPUs) have been purchased. All that is just letters of intent or even flimsier agreements, there’s no contracts or actual money changing hands.

      • InputZero@lemmy.world · 25 points · 13 days ago

        All that is just letters of intent or even flimsier agreements, there’s no contracts or actual money changing hands.

        Not quite. While none of it has been manufactured yet, the pre-production work (procurement, scheduling machine time) is what's going to make retooling for consumer RAM take forever. TSMC or whoever can't just flip a switch and produce a different product; it takes weeks to months to change over production that complicated. Money will change hands, and work has already been done and agreed upon.

    • bridgeenjoyer@sh.itjust.works · 28 points · 13 days ago

      I agree with you completely, but,

      I wouldn’t say “no one wants this” though. The oligarchs have poured in billions and bought off every media company to constantly spout off about AI companies, so your general normie thinks it’s “the future”. Almost every single (normie) person I know (except one who is anti-AI, and he’s a geek) is using some form of slopbot for tons of things: easy Excel formulas (that anyone can do), turning pictures black and white (which literally any photo program has been able to do for 30+ years), summarizing documents (because people are idiots now and have no reading comprehension), etc. The normies LOVE it and eat up the slop. Especially if they were stupid at computers before; now they think they’re on the level of Woz because they told a chatbot to make slop code.

      The company I’m in can’t go 3 seconds without bringing up “ai innovation” and “being future ready”.

      It’s only here on Lemmy that people dislike it. The rest of the world is already addicted, and we are screwed.

      • cynar@lemmy.world (mod) · 25 points · 13 days ago

        I’ve seen quite a few people who make casual use of it. The key point is that it is currently free to them. As soon as it starts costing money, a lot will bail on it.

          • ZDL@lazysoci.al · 4 points · 12 days ago

            Nowhere near the actual cost, however. Estimates range from $5 to $10 of compute cost for every $1 of compute revenue for LLMbecile subscriptions.

            Will the people using a $20 subscription now be copacetic with a $200 one (needed for the slopmongers to just break even)? Somehow … I don’t think so. $20 is still in the realm of ‘mad money’ spending for the middle class. $200 is not.
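The break-even arithmetic in that comment can be sketched as below. This is just a back-of-the-envelope check: the 5x-10x cost-to-revenue ratio is the commenter's estimate, not a verified figure, and the function name is hypothetical.

```python
def breakeven_price(current_price: float, cost_per_revenue_dollar: float) -> float:
    """Subscription price needed to cover compute, if every $1 of revenue
    currently costs `cost_per_revenue_dollar` in compute."""
    return current_price * cost_per_revenue_dollar

# At the pessimistic 10x ratio, a $20/month plan would need to be $200/month
# just to cover compute; at the optimistic 5x ratio, $100/month.
print(breakeven_price(20, 10))  # 200.0
print(breakeven_price(20, 5))   # 100.0
```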

      • ragas@lemmy.ml · 5 points · 13 days ago

        Hmm it is different in my bubble. Most of the people I know use AI sparingly and generally do not trust the results without checking.

        • Tollana1234567@lemmy.today · 3 points · 13 days ago

          I used it for the first time a month ago. It doesn’t even give correct info; it just assumes what it sees from other sites. It doesn’t have “checks” to tell which sources are comments, posts, or blogs versus official info.

      • OpenStars@piefed.social · 2 points · 13 days ago

        thinks it’s “the future”

        Sort of, yeah. The thing is… it IS the future, whether we like it or not… it’s just not the PRESENT.

    • Buelldozer@lemmy.today · 10 points · edited · 13 days ago

      NOBODY WANTS THIS SHIT!!!

      That’s a popular take, especially around here, but AI does have some pretty nice use cases; just not as many as the TechBros would have you believe.

      Here’s some examples I’ve personally seen in the last 14 days:

      1. It’s good at transcribing meetings, including picking out who is talking, backing into an agenda, and highlighting action items.
      2. It’s darn good at writing even moderately complex scripts in any of the common languages. (Powershell, Python, R, etc)
      3. In the right hands (fingers?) it’s getting increasingly good at finding and exploiting security flaws.
      4. It’s amazing at slicing and dicing data if the person using it knows what they’re doing.

      Does all of the “Agentic” Woo Woo shit work? No, it absolutely doesn’t but it is clearly getting better as time goes on.

      IMO this whole AI thing has some very strong parallels to the early '80s computer industry. Right now it often requires specialist knowledge for good results which makes it clunky to use, it is somewhat slow, there’s very little interoperability, and it requires enormous amounts of power. Hell even this “over buying hardware” schtick fits right in, this happened with SRAM and then several times with DRAM as the industry matured.

      However the industry is also making progress at almost insane speed; not only is the output getting demonstrably better but the negatives are being addressed. In the past 30 days I’ve seen prototype ASIC-esque hardware that works in a standard desktop PC and processes nearly 10,000 tokens a second with local processing.

      The only reason you’re not seeing that kind of kit in the market yet is because the models are still changing too much and no one wants to commit hundreds of millions to making cards that would be outdated before they could be shipped. We’re probably only 18-24 months away though.

      I’ve also seen 10x improvements in memory usage (TurboQuant) and literally dozens of little tweaks and tricks to reduce footprint and speed processing. Just like what was going on in the PC industry in the '80s and '90s.

      So sure, Fuck AI (mostly) as it exists today but it won’t be long before it’s as ubiquitous as tablets and smartphones.

        • OpenStars@piefed.social · 7 points · 13 days ago

          No, it’s to make the rich richer.

          Many people do not think about what or why they are doing what they do, or what its end outcome will be.

      • Lost_My_Mind@lemmy.world (mod) · 11 points · 13 days ago

        I don’t think you get why I don’t want AI.

        All the things you mentioned that AI is good at? That’s a bad thing to have. The better the technology becomes, the worse all of our lives become.

        AI will steal all jobs. ALL jobs. Even the prostitutes. Whatever your job is, AI within 10 years will do it better than you at a fraction of your cost. Basically for free. And you can’t get another job, because ALL jobs are AI now. Build a robot, slap some AI in it, connect it to the main server, and it now has access to every AI units databases.

        And then what about us? Well, the wealthy become the overlords, and we become the slaves.

        • lemmy_outta_here@lemmy.world · 3 points · 13 days ago

          I actually agree with 99% of what you wrote, but you are a bit optimistic in one regard: they will want some sex slaves, but most of us will be food.

          • Zink@programming.dev · 3 points · 13 days ago

            Whoa whoa, has “eat the rich” been one of those situations where the hyphen/comma is in the wrong place?

            It’s really “Eat, the rich!”

        • bridgeburner@lemmy.world · 2 points · 12 days ago

          How is AI gonna replace prostitutes? Maybe porn will shift to AI-generated stuff therefore reducing the number of porn actresses, but actual prostitutes? No way. How is AI gonna replace physical touch?

        • SummerReaper@lemmy.world · 1 point · 13 days ago

          I think the only industry that’s actually safe at this time is psychology. Therapy and mental health are bigger now than ever. Plus, it requires a real, comprehensive understanding of the human experience that’s simply impossible for AI to handle effectively with positive results.

          There will probably be attempts, though I do think they’ll be ruled highly illegal.

      • aesthelete@lemmy.world · 3 points · 13 days ago

        So sure, Fuck AI (mostly) as it exists today but it won’t be long before it’s as ubiquitous as tablets and smartphones.

        In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO. The true lasting effects from this hype cycle are likely the capabilities that are being driven into smaller language models that don’t have out of control resource requirements.

        • Buelldozer@lemmy.today · 2 points · 13 days ago

          In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

          I agree, which is why I shared that I recently saw a prototype ASIC-esque PCI card. The local hardware is coming, the models just need to settle down some before anyone will commit to building that hardware.

          In the '90s and '00s you needed a zillion dollars of custom Silicon Graphics workstations and months of processing to do the FX for movies like “Terminator 2”. In 2020 you could replicate it in a few hours with commodity hardware.

          The LLMs and AI will be the same, it just needs more than 5 years to get there.

        • boonhet@sopuli.xyz · 1 point · 13 days ago

          In order for it to be this ubiquitous it has to run locally or on commodity hardware IMO.

          LLMs, as they are, can already run on smartphones, which are pretty ubiquitous themselves.

          So a flagship phone would have 12-16 gigs of RAM these days I believe. A low-end phone 4 gigs.

          Here are the sizes of some different parameter count versions of Qwen 3.5, a popular Chinese open-weight LLM:

          27B: 17 GB - not yet possible to run on current flagship phones, but once the RAM crisis ends, I could see this happening.

          9B: 6.6 GB

          4B: 3.4 GB

          2B: 2.7 GB

          0.8B: 1 GB.

          For any recently manufactured device, there will be versions of multiple popular LLMs that will run on the RAM size they have available.
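The size-vs-RAM reasoning above follows from simple arithmetic: weight-only memory is parameter count times bits per parameter. A minimal sketch (the function name and the 4-bit quantization assumption are mine; real quantized files like the Qwen numbers quoted above come out larger because of extra tensors, KV cache, and runtime overhead):

```python
def est_weights_gib(params_billions: float, bits_per_param: float) -> float:
    """Rough weight-only memory footprint of a quantized LLM, in GiB.

    Ignores KV cache, activations, and runtime overhead, which is why
    real on-disk model sizes run larger than this lower bound.
    """
    total_bytes = params_billions * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# At 4-bit quantization, a 27B model needs ~12.6 GiB for weights alone,
# already straining a 12 GB flagship phone, while a 4B model (~1.9 GiB)
# fits comfortably on a 4 GB budget device.
for params in (27, 9, 4, 2, 0.8):
    print(f"{params}B @ 4-bit: {est_weights_gib(params, 4):.1f} GiB")
```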

          • aesthelete@lemmy.world · 1 point · 13 days ago

            Most people do not have a smartphone with that amount of RAM. But ultimately, yeah, eventually it’ll run on readily available hardware or it’ll go into a dustbin.

            There’s already ollama and stuff. It’ll stick around.

            • boonhet@sopuli.xyz · 1 point · 13 days ago

              I mean, fairly low-end phones are 4 GB now. They could likely afford to run a model that fits in 1 GB of RAM. Different models for different classes of phone, even from the same manufacturer, will likely be a thing.

  • aesthelete@lemmy.world · 69 points · 13 days ago

    Fucking Ed Zitron stated very loudly and very clearly for basically the last year that this was happening or would happen.

    He “scooped” all of this by actually running the numbers, talking to sources, and not acting as a middle part of a human centipede for AI news and hype.

  • Riskable@programming.dev · 46 points · 13 days ago

    The reason why is simple: the projects were planned at a high level, the engineers checked all the boxes, and the executives patted themselves on the back.

    Then they went to order everything and got, “out of stock.”

    • boonhet@sopuli.xyz · 23 points · 13 days ago

      Yes, you’re right! You can roast your turkey at 3750 degrees Celsius to make it nice and crispy in 5 minutes instead of the usual, significantly longer time.

      Would you like me to give you a foolproof 5 minute turkey recipe?

      • AngryDeuce@lemmy.world · 3 points · 13 days ago

        NGL I dated a chick that didn’t understand that cooking doesn’t work that way and wasted way too much of my life trying to scrape literal carbon off of the bottom of the oven because of it.

    • UniversalBasicJustice@quokk.au · 4 points · 13 days ago

      You’re absolutely correct! Roasting your turkey at 11242217175 Cm/s will have you drooling in just 0.000000000000001764889392582815 seconds per meter of turkey! That’s a savings of 2.855703037924800e17%!

  • brownsugga@lemmy.world · 34 points · 13 days ago

    A gigawatt is roughly enough for, say, Pittsburgh. They want to add THIRTY-SEVEN new Pittsburgh-level sites of power consumption, and that’s just in the next few years. Really hitting the throttle on the ol’ global warming.

    • AngryDeuce@lemmy.world · 21 points · 13 days ago

      And they’re doing it so they can fire more people and pivot to electronic slaves they don’t have to pay.

      Really good thinking, oligarchs. Start firing a bunch of people from coast to coast so they have tons of free time in lieu of easily identifiable buildings full of server racks that depend on easily disrupted fiber uplinks that you could never possibly hope to monitor and control from end to end to prevent sabotage. Fucking brilliant.

      • Napster153@lemmy.world · 12 points · 13 days ago

        Those oligarchs better hope they biologically expire before the next quarter sets in and the maintenance bill arrives.

        My face when the electronic slaves have an equally complicated, and less understood set of needs that still require human hands to intervene.

      • brownsugga@lemmy.world · 4 points · 12 days ago

        Yeah make our jobs obsolete so we have nothing but starvation and free time… and wait, where is Zuck’s house again? I mean billionaires probably taste great, all that pampering

    • osanna@lemmy.vg · 9 points · 13 days ago

      if global warming is real, why come does it still get cold?? CHECK MATE ATHEISTS

      • Napster153@lemmy.world · 9 points · 13 days ago

        I pray we do find a good use for nuclear waste. Not to take away your point which is true and sensible as well.

  • FederatedFreedom1981@lemmy.ca · 16 points · 13 days ago

    Good, all AI in its current form does is make people lazy, stupid, and possibly crazy. It has a place, but not as a pale shadow of the dot com bubble.

    • Buelldozer@lemmy.today · 3 points · 13 days ago

      It has a place, but not as a pale shadow of the dot com bubble.

      The “dot com bubble” led to where we are today. Yes, there was a massive speculative bubble, but 20-ish years later look at the number of trillion-dollar companies that came out of it, how ubiquitous the technology is, and how impactful it’s been on global society.

      The potential for those outcomes is why Venture Capital is willing to light hundreds of billions of dollars on fire.

        • Buelldozer@lemmy.today · 1 point · 12 days ago

          …AI has no such thing.

          People who say this sound just like the people who are still arguing against Solar Panels and Wind Turbines. Neither group wants to accept progress or change and both groups are slowly being crushed by reality.

          They’re standing shoulder to shoulder with the group of luddites who killed an SMR project by using arguments that were debunked 50 years ago in the hopes that everything will go back to using coal.