Inspired by another post.

Quick sources.

According to Google, a single search requires about 0.0003 kWh of energy.

https://www.rwdigital.ca/blog/how-much-energy-do-google-search-and-chatgpt-use/

Each ChatGPT query consumes an estimated 2.9 Wh of electricity…

https://balkangreenenergynews.com/chatgpt-consumes-enough-power-in-one-year-to-charge-over-three-million-electric-cars/


Edit: I’m an idiot for not even considering conversions. I simply pasted the numbers from the sources. Apologies.

0.0003 kWh is 0.3 Wh, and 2.9 Wh is 0.0029 kWh.

I think a regular search is effectively one-tenth of a ChatGPT prompt.

…according to a simple calculator, and a lot of commenters who’ve now accidentally made this funnier.

I’m not an electrician.
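If it helps, the conversion and the one-tenth figure can be double-checked with a few lines of Python (both per-query figures are taken straight from the two sources above):

```python
# Per-query energy figures quoted in the linked sources
google_search_kwh = 0.0003   # kWh per Google search (per Google)
chatgpt_query_wh = 2.9       # Wh per ChatGPT query (estimate)

# Put both in the same unit: 1 kWh = 1000 Wh
google_search_wh = google_search_kwh * 1000    # 0.3 Wh
chatgpt_query_kwh = chatgpt_query_wh / 1000    # 0.0029 kWh

# A regular search comes out to roughly one-tenth of a ChatGPT query
ratio = google_search_wh / chatgpt_query_wh
print(round(ratio, 2))  # 0.1
```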



Okay, after some more rambling, here are some edits. Take your pick:

      • BigDanishGuy@sh.itjust.works · 4 points · 7 months ago

        While I do agree with you, kWh is what’s on my electric bill. Meaning it’s easier to relate to.

        I guess it should have been expressed in the base unit without an SI prefix, because writing 0.3×10^-3 kWh just seems silly.

    • callouscomic@lemmy.zipOP · 6 points · 7 months ago

      Hmm, great point. I simply copied the figures from the sources. I dunno why I didn’t consider converting one of them.

      Edited post to add conversions. Thank you.

      But now I just think the meme is even funnier for a new accidental reason.

      • BigDanishGuy@sh.itjust.works · 4 points · 7 months ago

        I thought the joke was about people not being able to convert. It took a couple of clock cycles on the old meat CPU to figure out that it was an anti AI meme.

    • MudMan@fedia.io · 29 points · 7 months ago

      The unit bullshittery going on in this meme is frying my brain. I need a nap.

      If I can stay awake long enough maybe I can work out how this works. My PC draws around 500 W under load, so the time I just spent playing Abyssus was burning about 3 GPT searches a minute, by that estimation. It’s still much less than 30, and I can probably find whatever I’m looking for in much less than that, but still.

      I wonder if it’s supposed to be better or worse if I decide to burn all that at home by running a local LLM. I don’t think my GPU is more power efficient than their data centers, and it’d almost certainly run longer than 20 seconds, but I do have pretty green power sources in this area and it is air cooled.

      I guess it depends on whether my office gets hot enough to make me turn on the AC.
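For what it’s worth, the back-of-the-envelope estimate in the comment above checks out. Here’s the same arithmetic as a quick Python sketch (the 500 W draw is the commenter’s own figure; 2.9 Wh per query is from the sources in the post):

```python
pc_draw_w = 500.0    # gaming PC power draw under load (commenter's figure)
chatgpt_wh = 2.9     # Wh per ChatGPT query (from the sources above)

wh_per_minute = pc_draw_w / 60                   # ≈ 8.33 Wh burned per minute of play
queries_per_minute = wh_per_minute / chatgpt_wh  # how many queries that equals
print(round(queries_per_minute, 1))  # 2.9, i.e. "about 3 GPT searches a minute"
```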

    • EisFrei@lemmy.world · 8 points · 7 months ago

      While we’re at it: Ten times less shouldn’t be a thing.

      Ten percent, 0.1 times, a tenth …

  • djmikeale@feddit.dk · 8 points · 7 months ago

    I look up recipes a lot. I’d love to know the actual cost of visiting a site, given all of the JavaScript, cookies, tracking, images, etc. that have to be loaded and that I don’t really care about.

    • Buddahriffic@lemmy.world · 2 points · 7 months ago

      I use it to help figure out the order to cook things like stir-fries with whatever ingredients I happen to have, which is an improvement over my previous approach of “just throw it all in, stop when I feel like it’s time to eat, then wonder why the meat is so tough”.

      It also helped me figure out that I’ve been steaming food instead of frying it for a long time. Though my cooking got a lot messier when I corrected that and I went from never burning anything to occasionally burning some things.

  • lime!@feddit.nu · 7 points · 7 months ago

    now calculate how much energy this meme took to make. or how much energy all the page views take.

    the easiest reference for this stuff is how much more efficient a thing is compared to an old-fashioned lightbulb. The standard lightbulb is 50 W, which means leaving it on for an hour consumes 50 Wh. so with that amount of energy, you could do about 17 chatgpt prompts. or about 167 regular searches. or run your (1500 W) vacuum cleaner for two minutes. or rev your car for 1.9 seconds, assuming your engine is around 130 bhp.
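The lightbulb equivalences in the comment above are easy to re-derive. A quick Python sketch using the same assumptions (50 W bulb, 2.9 Wh per prompt, 0.3 Wh per search, 1500 W vacuum, 130 bhp engine, 1 bhp ≈ 745.7 W):

```python
budget_wh = 50.0 * 1   # a 50 W bulb left on for one hour = 50 Wh

prompts = budget_wh / 2.9             # ChatGPT prompts per bulb-hour
searches = budget_wh / 0.3            # regular searches per bulb-hour
vacuum_min = budget_wh / 1500 * 60    # minutes of a 1500 W vacuum
bhp_w = 130 * 745.7                   # 130 bhp engine in watts
car_s = budget_wh * 3600 / bhp_w      # seconds of engine output (Wh -> J -> s)

print(round(prompts), round(searches), vacuum_min, round(car_s, 1))
# 17 167 2.0 1.9
```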

    • callouscomic@lemmy.zipOP · 2 points · 7 months ago

      One of the sources I linked literally gave you your lightbulb answer, but you didn’t check. You just went straight to some kind of mansplaining.

      Also, it’s a fucking meme. I don’t give that much of a shit.

      • [migrated to PieFed]@beehaw.org · 1 point · 7 months ago

        Ah, well, good thing your feeling is more important than any potential source of actual data.

        That’s not at all what it says; sarcasm is uncalled for.

    • callouscomic@lemmy.zipOP · 1 point · 7 months ago

      > Next time try making the meme without a pedophile.

      Got it. I won’t make a meme with you next time.

  • hungryphrog · 3 points · 7 months ago

    And chances are that option 2 gives you misinformation, and of course doesn’t cite sources.

  • sigezayaq@startrek.website · 3 points · 7 months ago

    You don’t take into account that Google made its search worse on purpose, so now you have to make two or more queries to find what you need.

  • FreudianCafe@lemmy.ml · 3 points · 7 months ago

    The error with the measuring units is so grotesque it looks like bad faith to ride the “AI bad” wave, but who knows.

  • rozlav · 1 point · 7 months ago

    Anyone wanna correct it then? I currently don’t have a computer for this; maybe there’s an Android FOSS meme-maker app though?