It’s not always easy to distinguish between existentialism and a bad mood.

  • 20 Posts
  • 697 Comments
Joined 3 years ago
Cake day: July 2, 2023

  • I use AI sparingly to make sure the company-paid subscription is a net loss for the AI vendor.

    Hey, it could happen.

    Overall, I think it was a bit cookie-cutter for an article of this type, but maybe it's just the preaching-to-the-choir effect. Even the fact that he ostensibly quit his job over this stuff doesn't hit as hard as it should; it comes off as if he could have done so at any time, but this way he gets to grandstand about it.

    Also stuff like this:

    It wasn’t a bad job, not by most metrics. It ticked the boxes a job is supposed to tick: good pay. Health insurance. Remote work. Time off. Nice coworkers.

    sounds like it should be in a how do you do, fellow workers copypasta.





  • Their heart seems to be in the right place (police interrogation will be exploitative and brainwashy, with no real consequences for the interrogators), but they sure chose the dumbest possible way to make their point:

    Despite the claims of AI evangelists, chatbots aren’t people and haven’t achieved sentience. The differences between a chatbot and a real person, however, make Heaton’s ability to elicit a false confession more disturbing, not less.

    “ChatGPT lacks many of the vulnerabilities that make people more likely to falsely confess — like stress, fatigue, and sleep deprivation,” said Saul Kassin, a professor emeritus at John Jay College who wrote the book on false confessions. “If ChatGPT can be induced into a false confession, then who isn’t vulnerable?”





  • (1) A lot of the stuff Netflix axed as soon as it hit three seasons, instead of paying the guild-mandated raises, like Santa Clarita Diet. Also on the subject of Netflix, the cancellation of Dark Crystal: Age of Resistance after one season was completely unforgivable, a really high-quality labor of love that didn't deserve to be done in like this.

    (2) Recency bias, but Redliners by David Drake, I guess; military sci-fi should fit the style. It's the story of a mid-alien-war settler expedition into non-enemy territory that, on the dl, was supposed to work as a more active means of reintegrating its guard detachment of PTSD'd-out veterans (the titular redliners) back into society, and that goes immediately, terribly awry.

    (3) The Matrix and Oldboy, for perfecting various aspects of modern filmmaking into cultural milestonehood, but realistically any aspirant would probably be far better served by binging MST3K and Garth Marenghi's Darkplace.

    (4) Star Trek, probably. Its parodies already seem more in the spirit of the original than the current series anyway.





  • I checked it out because I was curious whether CEV was some international-relations initialism I'd never heard of; turns out it's just My Guess About What He Wants in rationalese.

    Excerpt from the definition of Coherent Extrapolated Volition, or how to damage your optic nerve from too much eye-rolling:

    Extrapolated volition is the metaethical theory that when we ask “What is right?”, then insofar as we’re asking something meaningful, we’re asking “What would a counterfactual idealized version of myself want* if it knew all the facts, had considered all the arguments, and had perfect self-knowledge and self-control?” (As a metaethical theory, this would make “What is right?” a mixed logical and empirical question, a function over possible states of the world.)

    A very simple example of extrapolated volition might be to consider somebody who asks you to bring them orange juice from the refrigerator. You open the refrigerator and see no orange juice, but there’s lemonade. You imagine that your friend would want you to bring them lemonade if they knew everything you knew about the refrigerator, so you bring them lemonade instead. On an abstract level, we can say that you “extrapolated” your friend’s “volition”, in other words, you took your model of their mind and decision process, or your model of their “volition”, and you imagined a counterfactual version of their mind that had better information about the contents of your refrigerator, thereby “extrapolating” this volition.