“People demanding that AIs have rights would be a huge mistake,” said Bengio. “Frontier AI models already show signs of self-preservation in experimental settings today, and eventually giving them rights would mean we’re not allowed to shut them down.

“As their capabilities and degree of agency grow, we need to make sure we can rely on technical and societal guardrails to control them, including the ability to shut them down if needed.”

As AIs become more advanced in their ability to act autonomously and perform “reasoning” tasks, a debate has grown over whether humans should, at some point, grant them rights. A poll by the Sentience Institute, a US think tank that supports the moral rights of all sentient beings, found that nearly four in 10 US adults backed legal rights for a sentient AI system.

  • Cyv_ · 24 points · 2 months ago

    Problem is, AI isn’t sentient. It’s advanced autocomplete.

    Sure, if we get AGI give it rights, but we’re nowhere near that point right now.

  • Butterbee (She/Her)@beehaw.org · 21 points · 2 months ago

    It’s wildly difficult to control the output of the black box, and that’s hardly LLMs showing signs of self-preservation. These cries are from people in the industry trying to pretend the models are something that they are not, and cannot ever be. I do agree with the sentiment that we should be prepared to pull the plug on them, though, for other reasons.

  • ɔiƚoxɘup@beehaw.org · 1 point · 2 months ago

    If they have our rights, they should be subject to the same laws, including whatever incarceration or capital punishment exists in that jurisdiction.