• Shizzymcjizzles@lemmy.dbzer0.com · 12 days ago

    My friend is a full-stack programmer with over 15 years of experience at one of the largest financial institutions, so he can handle what you’re talking about, no problem. But what IS a huge problem is that he only has the requisite knowledge now because he spent years learning best practices by doing the grunt work that’s about to disappear. In a few years there may be no one left with the skills to do things right, and then what you’re describing will absolutely happen and build quality will go to hell. Big tech’s assumption is that by then the models will have improved enough that it won’t matter.

    • ramble81@lemmy.zip · 12 days ago

      That’s a hell of an assumption. Since we’re whipping out credentials, I’ve been in IT almost 30 years and I can tell you it’s not going to work like that.

      • Buelldozer@lemmy.today · 12 days ago

        > Since we’re whipping out credentials, I’ve been in IT almost 30 years and I can tell you it’s not going to work like that.

        I’m not the person you were replying to, but I’ve also been in tech since 1996, and lots of things have worked just like that. All successful technology starts off barely functional and improves over time until nearly all members of its intended audience can successfully use it.

        As an example, in 1996 setting up a router was a specialty task that required training; by 2016 any moron could buy one off the shelf and have it running in an hour. Likewise, writing basic HTML was a specialty skill in 1996, but by 2003 you could do it with Microsoft Word. Smartphones are another example: they went from barely functional Windows Mobile and BlackBerry devices, which required ridiculous amounts of back-end skill just to deliver email, to iPhones and Androids that any numskull can use for nearly anything at all.

        My point is this: too many people are stuck on the “What use is a newborn baby?” question without realizing that the infant is growing up at blinding speed. AI is also the first technology to carry the promise, real or not, of self-improvement once it reaches sufficient maturity. Assuming that happens, all further improvement will be increasingly automatic and come even faster.

        AI isn’t going away and it’s only going to get better as time goes on.

        • Shizzymcjizzles@lemmy.dbzer0.com · 12 days ago

          You get it. I don’t understand the people in tech burying their heads in the sand. If the question were AGI, its very viability would be fair to dispute. But plain old AI is already here, and it’s not even a baby anymore.

        • ramble81@lemmy.zip · 12 days ago

          Thank you for assuming what I do or don’t do, or what I’m plugged into or not.

          • Shizzymcjizzles@lemmy.dbzer0.com · 12 days ago

            There’s no assumption made there. In IT, 30 years of experience makes you a dinosaur. And you’re questioning what I’m talking about as if the jury were still out when it’s a fait accompli. You’re clearly not plugged in.