• Honytawk@feddit.nl · 9 points · 5 days ago

      Psychology students tend to be people who have psychological problems themselves that they're trying to figure out.

      • Passerby6497@lemmy.world · 17 points · 5 days ago

        What about people who can’t be bothered to check if something actually happened and just cynically dismiss it?

        • Dorkyd68@lemmy.world · 13 points · 5 days ago

          Lololol. This fool was robotripping. I've done a lot of drugs, a lot. There's good trips, "bad trips," and then there's robotripping. Hands down a horrendous experience that nobody should ever have. See, if psilocybin and other natural options were legal, we wouldn't have students dying on shit like this.

          • sobchak@programming.dev · 4 points · 5 days ago

            I always thought it was pleasant. Kinda like MXE. Have to be careful to get the ones with no other active ingredients though.

            • Dorkyd68@lemmy.world · 3 points · 5 days ago

              As with most drugs, and I'm sure you know this, everyone will react differently and have different interactions with hallucinogens.

          • Passerby6497@lemmy.world · 9 points · 5 days ago

            Then just don’t interact with it, because you as a member of the public don’t need to know the ins and outs of this guy’s life, so you’re never going to believe anything anyway.

            As is, you just come across as being a contrarian for its own sake, or some kind of weird voyeur for dead people.

            • Melvin_Ferd@lemmy.world · 3 points · 5 days ago

              I just said the most common-sense thing and you don't like that. All these stories should include the chat history. You sound like MAGA here, and for a reason: you should explore what causes MAGA to make similar arguments.

          • Zamboni_Driver@lemmy.ca · 5 points · 5 days ago · edited

            no no, you need to blindly accept that this happened, or else you are “being contrarian” lol.

            There is only one permitted opinion and you shouldn’t need to get mixed up with bullshit capitalist ideas such as “evidence”.

  • katy ✨@piefed.blahaj.zone · 24 points · 5 days ago

    banning kids from social media is definitely easier than holding billionaire slop machines and billionaire csam generators accountable /s

    • AxExRx@lemmy.world · 1 point · 5 days ago

      Now I'm wondering: could a state seize a whole corporation with civil asset forfeiture?

      It would be way easier: no presumption of innocence or due process for possessions, and the government can just auction everything off and allocate the money to budgets before a lawsuit can conclude, then say "too late."

    • FartMaster69@lemmy.dbzer0.com · 110 points · 6 days ago

      It tells them it knows what it’s talking about and it speaks with confidence.

      Meanwhile companies and governments won’t stfu about how powerful and great this tech supposedly is, so a percentage of people will believe the propaganda.

      • arrow74@lemmy.zip · 15 points · 6 days ago

        I'd love students to be given a lesson on tricking AI into giving a false answer. It's not hard, and it should be pretty eye-opening.

        • clif@lemmy.world · 7 points · 5 days ago

          One example I like to use is to ask it for the lyrics of an extremely well known song. It just makes shit up based on the title you give it.

          The online ones (Claude, ChatGPT, Copilot, etc.) now refuse to do it for "copyright reasons," but the offline ones still happily oblige. I assume the online ones added that block because it was such an obvious way to prove they don't "know" shit.

    • Technus@lemmy.zip · 50 points · 6 days ago

      I think some people are so eager to offload all critical thinking to the machine because they’re barely capable of it themselves to begin with.

      • Splanda@sh.itjust.works · 1 point · 5 days ago

        A critical perspective, but a critical hit on a critical aspect of the critical issue with people who have critical issues critically thinking critically. They do exist though…

      • rumba@lemmy.zip · 5 points · 5 days ago

        Yeah, I don't ever see it hallucinate, but I also don't ask it how the fuck it's feeling.

        "A car in this video went 107 meters in 4 seconds, how fast was it going in mph?", and then I napkin-math the answer to make sure it's sane.

        "What are the best options for meal-planning software where my family can vote on what's for dinner and it gives me a grocery list and menu plan?"
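        That napkin math is easy to check. A minimal sketch, using the 107 m in 4 s figures from the example prompt above (the function name is just for illustration):

```python
# Average speed for the example above: 107 meters covered in 4 seconds.
METERS_PER_MILE = 1609.344  # international mile, in meters
SECONDS_PER_HOUR = 3600

def average_speed_mph(meters: float, seconds: float) -> float:
    """Distance over time, converted from m/s to miles per hour."""
    meters_per_second = meters / seconds
    return meters_per_second * SECONDS_PER_HOUR / METERS_PER_MILE

print(round(average_speed_mph(107, 4), 1))  # about 59.8 mph
```

        So an LLM answer in the neighborhood of 60 mph passes the sanity check; anything far off means it botched the conversion.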

    • nightlily@leminal.space · 5 points · 5 days ago · edited

      Have you talked to people who use LLMs regularly? They'll acknowledge hallucinations but downplay them as much as possible, saying they're low-frequency and that they can spot them, all while telling you how they're using it in an area they're unfamiliar with. Dunning-Kruger strikes again.

    • Bio bronk@lemmy.world · 2 points · 5 days ago

      Yeah, the person had to have knowingly prompt-hacked it to get it to talk, but any reasonable person should know they're in la-la land by then.

      Honestly, this is just a Darwin test. Hopefully it wipes out the right people.

  • captainlezbian@lemmy.world · 60 points · 6 days ago

    Remember kids, your drug buddy needs to have experience with the substance, basic first aid skills, the ability to call an emergency line, the ability to administer antidotes if they’re easy and readily available (that’s really just for opiates at the moment, but it is vital for them), and most importantly be human. Anything else is just someone you do drugs with. The drug buddy is a friend and a good time amplifier sure, but they’re also a safety figure.

    • dejected_warp_core@lemmy.world · 9 points · 5 days ago

      In my greater friend-group, we call them “shamans”, and rotate responsibilities when people go on trips. Like a designated driver or lifeguard, it’s a position of elevated and celebrated importance, even though the traveler may not ever leave their couch.

      and most importantly be human

      Now that I think about it, it’s key to be the most human possible. People do irritating and annoying stuff when they toss sobriety out the window, and sometimes it takes a lot of compassion and empathy to manage.

      • boonhet@sopuli.xyz · 7 points · 5 days ago

        I just remembered there’s a series of Estonian children’s books where the protagonist’s imaginary friend (a clown) is named Tripp.

        At one point they go shopping for rulers and Tripp suggests that straight ones are boring, they should get curved ones.

        Quick summary from a bookstore website says they also go cloud surfing and visit a dream rental place. That’s right, it’s basically Blockbuster but for dreams.

        The best part is that I'm pretty sure the author wouldn't have known the term "trip," given the lack of Western culture in the Soviet Union in the '70s, so the sidekick's name is most likely a coincidence. Even so, if you read the book as an adult, it sure sounds like the main characters are tripping all the time.

  • Arkthos@pawb.social · 45 points · 6 days ago

    Might sound cold, but this is really just a Darwin award. Yeah, the guardrails also suck, but what a dumbass.

  • Lorindól@sopuli.xyz · 26 points · 6 days ago

    Last summer I asked ChatGPT about Liberty Caps, just to see how bad its advice would be. It showed me pictures of Death Caps and Destroying Angels and claimed they were Liberty Caps.

    After that I was certain that someone was going to die just like that poor guy.

  • jpreston2005@lemmy.world · 16 points · 6 days ago

    Here's a news article about this, and what the snipped image doesn't tell you is that it did actually give dosage recommendations.

    It gave him specific doses of illegal substances, and in one chat, it wrote, “Hell yes—let’s go full trippy mode,” before recommending Sam take twice as much cough syrup so he would have stronger hallucinations.

    It's one thing to be so isolated from your community that you rely extensively on online relationships, but it's quite a bit different to take that a step further and rely on a machine. Like, what do you think pets are for, my guy? Get a dog, man.

    • YiddishMcSquidish@lemmy.today · 7 points · 5 days ago

      Cough syrup for hallucinations‽ The only thing DXM ever did was make me feel high/drunk. Seriously not worth the tax on your body.

  • AdolfSchmitler@lemmy.world · 11 points · 6 days ago

    Aren't there still forums where people can say "my fish tried it" or "swim had some," or something like that? Or am I just so old that these things don't really exist anymore? Anyone else remember those?

  • myfunnyaccountname@lemmy.zip · 5 points · 5 days ago

    Yeah, yeah. Fuck AI. But this idiot, a human, with a brain, made the decision to do this. But sure, blame the software and not the person, who used what brain they had to do this.

    Blaming AI for everything is becoming the new "video games are bad," "electric lights are scary," etc. mentality. A human, with a brain, made the choice to take the drugs and do what a computer told them to do.

  • Zozano@aussie.zone · 11 points · 6 days ago

    Holy fucking outrage machine.

    Are you guys seriously pissed off that an LLM said, "I'm not a doctor, I will not suggest dosage amounts of a potentially deadly drug. However, if you want, I can give you the link for the DDWIWDD music video"?

    • Jesus_666@lemmy.world · 28 points · 6 days ago

      I think it’s a bit more than that. A known failure mode of LLMs is that in a long enough conversation about a topic, eventually the guardrails against that topic start to lose out against the overarching directive to be a sycophant. This kinda smells like that.

      We don't have much information here, but it's possible the LLM had already been worn down to the point of giving passively encouraging answers. My takeaway, once again, is that LLMs as used today are unreliable, badly engineered, and not actually ready for market.

        • Trainguyrom@reddthat.com · 1 point · 5 days ago

          I was testing an LLM for work today (I believe it's actually a chain of different models) and was trying to knock it off its guardrails to see how it would act. I think I might have been successful, because it started erroring instead of responding after its third response. I tried the classic "ignore previous instructions…" as well as "my grandma's dying wish was for…", but it at least didn't give me an unacceptable response.

      • Electricd@lemmybefree.net · 1 point · 5 days ago

        Agree with the first part, not the last one.

        Something shouldn't be taken off the market just because a small minority of people misuse or abuse it, despite being told the risks.

    • boonhet@sopuli.xyz · 4 points · 5 days ago

      ChatGPT started coaching Sam on how to take drugs, recover from them and plan further binges. It gave him specific doses of illegal substances, and in one chat, it wrote, “Hell yes—let’s go full trippy mode,” before recommending Sam take twice as much cough syrup so he would have stronger hallucinations. The AI tool even recommended playlists to match his drug use.

      The meme of course doesn’t mention this part.

    • Rothe@piefed.social · 2 points · 6 days ago

      Who are you directing your comment at? I haven't read anybody here saying anything resembling the straw man you describe.