  • YoSoySnekBoi@kbin.earth · 67 points · 14 days ago

    250% price hikes, the telltale sign of a functioning economy. But don’t worry guys, the number is still going up, so it’s your fault you can’t afford our products.

  • DacoTaco@lemmy.world · 29 points · 14 days ago

    Yeah, Nvidia needs to fuck off. They have no reason to do this, since they don’t even supply the RAM to the board partners anymore.
    The entire price increase lands on the board partners, not on them. Fuck off, Nvidia.

  • tehsillz@lemmy.world · 18 points · 14 days ago

    There are plenty of good DOS games still to play, btw… add in Amiga and NES emulators and we don’t really need this stuff anyway.

  • Blackmist@feddit.uk · 15 points · 14 days ago

    Pretty much no gamer is buying that shit anyway.

    The last “top tier” card I owned was the Radeon 7870.

  • frunch@lemmy.world · 12 points · 13 days ago

    There has to be a point where we stop giving a fuck about graphics capabilities. It feels like we’ve already peaked, and any further progress over the past several years has been marginal. I mean, yes, if you need ultra 4K or 8K or 16K or whatever the fuck they’re up to, then you’re gonna have to drop however much coin they demand. The truth is nobody ever needed graphics at that level, especially for video games. Really good games have existed despite having potato graphics, and some of the most visually stunning games don’t have the level of popularity or fandom of some of those potato-graphics games.

    When I come across articles like this, I like to think back to the early days of video gaming, when real ingenuity was often what moved the bar: finding ways to use chips in ways they weren’t intended to operate to make them do what the creators wanted. Without those extra layers of friction, it seems more of the attention has shifted to the graphics over the content of the game itself. There’s still plenty of good stuff out there, no doubt, and graphics do matter, but I feel most of the responsibility is on the developers at this point. You don’t need the most graphically capable card, and even if you bought it, aren’t there still other high-priced items you’ll also need to fully unlock its capabilities? Then you need games that actually take advantage of all those capabilities. This all reminds me of audiophiles in their ever-persistent hunt not only for the perfect combination of equipment, but also for music that’s been produced at a high enough level to need such sophisticated equipment to hear the differences in the first place.

    We are getting priced out of high-end home computing, in any case.

  • Luffy@lemmy.ml · 11 points · 14 days ago · edited

    Why even buy a new graphics card nowadays?

    I currently have an RTX 3060 Ti, and all the games that run badly on it won’t run better on a more recent card anyway.

    You can’t outrun bad optimisation, no matter how good the card is. And I don’t need to put up with it anyway; I still have a massive backlog of Stalker, RE and Hollow Knight, or even Dark Souls, to get through before I feel any need to touch the absolute liquid dumps that are Cronos, Stalker 2 and the like, with their 20 fps stutters.

    Also, the stuff that already runs, runs more smoothly than I would ever need it to.

    • paraphrand@lemmy.world · 7 points · 14 days ago

      So, any comparisons I might dig up that show the performance difference between that card and a 5070 are just nonsense? What?

      • Luffy@lemmy.ml · 2 points · 14 days ago · edited

        “just nonsense”

        Sure. I mean, of course it makes sense to pay 600€ so that instead of playing Hollow Knight at 144 fps, I can play Hollow Knight at 144 fps.

        And for the other games, instead of Stalker 2 or Cronos stuttering unplayably at 20 fps, they’d now stutter like hell at 40 fps.

        I don’t see any reason to upgrade. The games I mentioned still run badly even on a reasonably new GPU. Everything else still runs fine on the GPU I have.

    • Xilence@sh.itjust.works · 2 points · 13 days ago

      Out of curiosity, are you gaming on Windows 11? I get ~70 fps at 1440p in STALKER 2 on Linux with a 9060 XT.

      • Luffy@lemmy.ml · 1 point · 13 days ago

        A 9060 XT isn’t a 3060 Ti.

        And no, I played it on Bazzite and on Arch. I didn’t check the exact fps, but it stutters beyond enjoyment for me.

  • Bazell@lemmy.zip · 6 points · 13 days ago

    Who really needs an RTX 5090 to live, anyway? Everything past the 5070 is overpriced specialist tech that the average person doesn’t really need.

  • xep@discuss.online · 5 points · 12 days ago

    Since the 5090 has a 12VHPWR connector, there is a high chance you are actually setting your own money on fire.

  • sonofearth@lemmy.world · 5 points · 13 days ago

    It’s just sad that Western countries don’t have the concept of an MRP (maximum retail price). With one in place, at least for a while, you wouldn’t have to pay this much.

  • humanspiral@lemmy.ca · 2 points · 13 days ago

    Also, there’s no way their next-generation AI chips will come in at a lower price per TFLOPS or per GB of RAM than the ones made a couple of months ago. Absurdly, Nvidia and its US datacenter partners are sitting on record inventory levels of undeployed AI chips. A crash in chip/memory prices seems more likely than this scenario playing out.