Matthias Scheutz, Karol Family Applied Technology Professor, compared this inefficiency to everyday AI tools. “These systems are just trying to predict the next word or action in a sequence, but that can be imperfect, and they can come up with inaccurate results or hallucinations. Their energy expense is often disproportionate to the task. For example, when you search on Google, the AI summary at the top of the page consumes up to 100 times more energy than the generation of the website listings.”

As AI adoption accelerates across industries, demand for computing power continues to climb. Companies are building increasingly large data centers, some of which require hundreds of megawatts of electricity. That level of consumption can exceed the needs of entire small cities.

  • eleijeep@piefed.social · 36 points · 11 days ago

    Lots of talk about “neuro-symbolic” AI, which just sounds like the connectionists finally conceding that the symbolic approach is necessary, but trying desperately to cling on to their dignity by putting a “neural” in front.

    It’s a discussion we’ve been having since the 70s, but where is the progress in symbolic AI algorithms? Are we still building the Semantic Web? Can we solve the frame problem?

    Is this just more hype to fuel the bubble?

  • Ludicrous0251@piefed.zip · 35 points · 11 days ago

    Just one more AI revolution and we’ll have AGI, trust me guys. This one is gonna be like 100x better at… well, AI stuff obviously.

  • lb_o@lemmy.world · 9 points · 11 days ago

    If they claim their model runs faster, then why not? Let’s hope it can curb AI-driven energy overconsumption.

  • mr_account@lemmy.world · 8 points · 11 days ago

    I’m just going to focus on the two words “AI could” in that title and ignore everything after that.

  • gandalf_der_12te@discuss.tchncs.de · 7 points · 11 days ago

    We already had symbolic software; it was simply called software before AI arrived. Now it should probably be called “classical software” or “deterministic software”.

    • lauha@lemmy.world · 8 points · 10 days ago

      Context-based text prediction by classical software used to be really good ten years ago: Google search suggestions, Excel auto-fill, and so on.

      • gravitas_deficiency@sh.itjust.works · 3 points · 10 days ago

        Now we can make less accurate and less helpful predictions with an order of magnitude more power and brute-force compute, and it’s all fine because the profusion of fancy buzzwords, which the VC dipshits don’t even fucking understand the basis of, lets said VCs circle-jerk their investment portfolios to the moon.
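For context, the “classical” context-based prediction the thread refers to can be sketched as a deterministic frequency model: count which word follows which, then always suggest the most common follower. This is a minimal, illustrative bigram example, not the implementation behind any particular product mentioned above; the corpus and function names are made up.

```python
from collections import Counter, defaultdict


def build_bigram_model(corpus: str) -> dict:
    """Map each word to a Counter of the words observed to follow it."""
    words = corpus.lower().split()
    model = defaultdict(Counter)
    for current, following in zip(words, words[1:]):
        model[current][following] += 1
    return model


def predict_next(model: dict, word: str):
    """Deterministically return the most frequent follower, or None."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]


# Illustrative corpus: "cat" follows "the" twice, "mat" only once.
model = build_bigram_model("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # prints "cat"
```

Unlike a large language model, this lookup costs a handful of dictionary operations per prediction and always returns the same answer for the same input, which is roughly the trade-off the commenters are pointing at.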