• Nougat@fedia.io · 18 points · 2 years ago

    Read the fine print on the milk one:

    … as well as clam juice, glue, sunscreen, toothpaste, and hand lotion.

  • brian@lemmy.ca · 14 points · 2 years ago

    While it’s amusing that it surfaces these Onion articles, it’s also a bit worrying that search queries can be worded in a way that invites such stark confirmation bias.

    It’s very similar to asking ChatGPT the same question phrased differently and getting entirely different answers.

    • JPAKx4 · 4 points · 2 years ago

      It learned from human speech patterns, which include people biting the Onion.

  • Corroded@leminal.space · 6 points · 2 years ago

    I imagine this is going to get fixed in the next couple of months, but I feel like it would be really funny if Google kept a fork of this current iteration.

  • SkyNTP@lemmy.ml · 5 points · 2 years ago

    The only idiots here are the humans who bought into the AI craze without understanding how Large Language Models actually work or what their limitations are.