• ryper@lemmy.ca · 134 points · 8 months ago

    The full tweet:

    Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn’t build it if there wasn’t a market for it. If 8GB isn’t right for you then there’s 16GB. Same GPU, no compromise, just memory options.

    I don’t think he’s that far off; eSports games don’t have the same requirements as AAA single-player games.

  • mormund@feddit.org · 26 points · 8 months ago

    Guess I’ll stick with my GTX 1070 Ti until next century, when GPU manufacturers have passed the bong to someone else. Prices are insane for the performance they provide these days.

    • skulblaka@sh.itjust.works · 2 points · 8 months ago

      Same. I’ve encountered exactly one game, ever, that I couldn’t play with that card, and that was last month with Doom: The Dark Ages, which won’t even boot without hardware ray tracing support.

      Literally never had a single other problem over the past 7 years of use. I played Cyberpunk 2077 with that card. I’m currently playing Clair Obscur with that card and it looks stupendously beautiful on it.

    • inclementimmigrant@lemmy.world (OP) · 4 points · edited · 8 months ago

      I mean honestly, yeah. With just 4 GB more memory they could have won the low end and not screwed over gamers.

      They really seem to have forgotten their roots in the GPU market, which is a damn shame.

  • edgemaster72@lemmy.world · 20 points · 8 months ago

    Then put 8GB in a 9060 non-XT and sell it for $200. You’re just wasting dies that could’ve been used to make more 16GB cards available (or at least a 12 GB version instead of 8).

    • DtA@lemmy.ca · 1 point · 8 months ago

      That wouldn’t work. AMD uses a lot of cheaper, slower memory chips in unison to achieve high speeds; that’s why their cards have more VRAM than Nvidia’s, not because the amount matters, but because more memory chips together means a wider bus and higher combined speeds.

      Nvidia uses really expensive, high-speed chips so they can use fewer memory chips and still get the same memory speed.

      Then AMD lied to and manipulated gamers by advertising that you need 16GB of VRAM.

      Memory speed > memory amount
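
      To make the arithmetic in this argument concrete, here is a minimal back-of-the-envelope sketch; the chip counts and per-pin speeds are illustrative assumptions, not the specs of any particular card:

      ```python
      # Rough sketch of the bandwidth math above: total bandwidth depends on how
      # many 32-bit GDDR chips sit on the bus and how fast each pin runs.
      # Chip counts and per-pin speeds below are illustrative assumptions only.

      def bandwidth_gb_s(num_chips: int, gbps_per_pin: float, bits_per_chip: int = 32) -> float:
          """Bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps."""
          bus_width_bits = num_chips * bits_per_chip
          return bus_width_bits / 8 * gbps_per_pin

      # Many cheaper, slower chips on a wide bus vs. fewer, faster chips on a narrow bus:
      print(bandwidth_gb_s(num_chips=8, gbps_per_pin=18))  # 576.0 GB/s (256-bit bus)
      print(bandwidth_gb_s(num_chips=4, gbps_per_pin=24))  # 384.0 GB/s (128-bit bus)
      ```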

  • fox2263@lemmy.world · 19 points · 8 months ago

    Do you just not want more money?

    Nvidia have dropped the ball epically and you have a golden opportunity to regain some GPU share here.

  • SomeRandomNoob@discuss.tchncs.de · 17 points · 8 months ago

    IMHO the problem is only partly the 8GB of VRAM (for 1080p). An at least equal part of the problem is the shitty optimisation of some game engines, especially Unreal Engine 5.

    • MangoPenguin · 7 points · 8 months ago

      Yeah, seeing a cool game and then seeing it’s made in UE5 really puts a damper on things. I wish the engine had more work put into performance optimization.

    • BlameTheAntifa@lemmy.world · 1 point · 8 months ago

      There is nothing wrong with Unreal Engine and UE5 is not meaningfully different than UE4. The problem is that developers only “optimize” to pass console certifications while PC gamers are left out in the cold. It also doesn’t help that PC gamers have a lot more options and will often insist on choosing settings that are far beyond the capabilities of their particular hardware.

  • CrowAirbrush@lemmy.world · 11 points · 8 months ago

    I just ditched my 8GB card because it wasn’t doing the trick well enough at 1080p, and especially not at 1440p.

    So if I get this straight, AMD agrees that games need to be optimized better.

    I hate upscaling and frame gen with a passion; it never feels right and often looks messy too.

    The First Descendant became a 480p mess when there were a bunch of enemies on screen, even though I have a 24GB card and a pretty decent PC to accompany it.

    I’m now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch that game within 10 hours.

    • BlameTheAntifa@lemmy.world · 1 point · 8 months ago

      FSR4 appears to solve a lot of problems with both upscaling and frame gen – not just in FSR, but generally. They seem to have fixed disocclusion trails, which is a problem even DLSS suffers from.

  • Lucy :3@feddit.org · 11 points · 8 months ago

    Oh fuck you AMD. NVidia fucked up with the 4060 already, and again with the 5060.

    • ryper@lemmy.ca · 10 points · 8 months ago

      Last month’s Steam survey had 1080p as the most common primary display resolution at about 55%, while 4k was at 4.57%.

  • Einar@lemm.ee · 6 points · 8 months ago

    I wish.

    Send one of these guys by my place. I’ll show them what 8GB cannot do…

  • xploit@lemmy.world · 6 points · 8 months ago

    Oh, so it’s not that many players are FORCED to play at 1080p because AMD’s and Novideo’s “affordable” garbage can’t cope with anything more while keeping a game smooth? Or, better yet, the game detected we’re running on a calculator here, so it took pity on us and set the graphics bar low.

    • insomniac_lemon@lemmy.cafe · 3 points · 8 months ago

      Hey, give a little credit to our public schools (poorly-optimized eye-candy) new games! (where 10-20GiB is now considered small)

    • alphabethunter@lemmy.world · 5 points · 8 months ago

      He is only testing AAA games at top settings, and that’s the point AMD is “making”. Most PC gamers are out there playing esports titles at the lowest possible settings at 1080p to get the maximum FPS possible. They’re not wrong, but you could still say it’s ridiculous to buy a brand-new modern card while only expecting to run esports titles. Most people I know who buy modern GPUs will decide to play the new hot games.