It wants to seem smart, so it gives lengthy replies even when it doesn’t know what it’s talking about.

In an attempt to be liked, it agrees with almost everything you say, even if it just contradicted your opinion.

When it doesn’t know something, it makes shit up and presents it as fact instead of admitting it doesn’t know.

It pulls opinions out of its nonexistent ass about the depth and meaning of a work of fiction, based on info it clearly didn’t know until you told it.

It often forgets what you just said and spouts bullshit you already told it was wrong

    • prole · 1 month ago

      Lol they wouldn’t pay for them

    • Bronzebeard@lemmy.zip · 1 month ago

      You can find most college textbooks scanned online, through torrents or PDF download sites. These companies are known to have downloaded numerous torrented book collections.

      • anomnom@sh.itjust.works · 1 month ago

        Thanks. Way to forget that it’s fancy autocomplete.

        Yeah, they probably pirated it, but apparently they don’t weight knowledge sources very well.

        That seems to be the big missing piece in all this gen AI.

        I wonder if the selection of images online tends to be the higher-quality subset of all imagery, whereas writing is all over the place, including a large quantity of shitposts. Could that make training image generators easier than training text ones?