• latenightnoir@lemmy.blahaj.zone · 0 points · edited · 4 days ago

    The first F.E.A.R. had excellent dynamic lighting; I’d argue it was the epitome of relevant dynamic lighting. It didn’t need to set your GPU on fire for it, and it didn’t have to sacrifice two thirds of its framerate for it: it had it all figured out. It did need work on textures, but even those looked at least believable thanks to the lighting system. We really didn’t need more than that.

    RT is nothing but eye candy and a pointless resource hog meant to sell us GPUs with redundant compute capacity which don’t even guarantee that the game’ll run any better! And it’s not just RT: it’s 4K textures, it’s upscaling, it’s ambient occlusion. All of these things hog resources without any major visual improvement.

    Upgraded from a 3060 to a 4080 Super to play STALKER 2 at more than 25 frames per second. Got the GPU, same basic settings, increased the resolution a bit, +10 FPS… Totes worth the money…

    Edit: not blaming GSC for it, they’re just victims of the AAA disease.

    Edit 2: to be clear, my CPU’s an i7, so I doubt it had much to do with the STALKER bottleneck, considering it barely reached 60% usage, while my GPU was panting…

    Edit 3: while re-reading this, it hit me that I sound like the Luddite Boss, so I need to clarify this for myself more than anyone else: I am not against technological advancement, I want tech in my eyeballs (literally), I am against “advancements” which exist solely as marketing accolades.

    • AdrianTheFrog@lemmy.world · 1 point · 3 days ago

      Really? Ambient occlusion used to be the first thing I would turn on. Anyway, 4K textures barely add any cost on the GPU, because they don’t use any compute, just VRAM, and VRAM is very cheap ($3.36/GB of GDDR6). The only reason consumer cards are limited in VRAM is to prevent them from being used for professional and AI applications. If they had a comparable ratio of VRAM to compute, they would be an insanely better value than workstation cards, and manufacturers don’t want to draw sales away from that very profitable market.
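
      Back-of-envelope, the "just VRAM" point checks out: a texture’s memory footprint is resolution times bytes per texel, with no per-frame compute attached. A quick sketch (the RGBA8 and BC7 sizes are standard format figures, not numbers from this thread):

```python
# Rough VRAM footprint of a single 4K (4096x4096) texture.
# Standard format sizes: RGBA8 is 4 bytes/texel; BC7 block
# compression is 1 byte/texel; a full mip chain adds about a third.
texels = 4096 * 4096

rgba8 = texels * 4                  # uncompressed
bc7 = texels * 1                    # BC7-compressed
with_mips = bc7 * 4 / 3             # compressed + full mip chain

print(rgba8 / 2**20)                # 64.0 (MiB)
print(round(with_mips / 2**20, 1))  # 21.3 (MiB)
```

      So even dozens of compressed 4K textures fit in a few GB of VRAM, which is why the cost shows up in memory size rather than in frame time.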

    • daellat@lemmy.world · 0 points · 4 days ago

      CPU % usage is not a great stat. If, on a 10-core CPU, the main thread is maxed and the other cores sit at 20%, it would read 28% overall, but you’re still CPU-limited.
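
      The arithmetic behind that 28% figure takes a few lines (the per-core numbers are the hypothetical ones from the comment, not a real measurement):

```python
# Hypothetical 10-core CPU: the game's main thread pins core 0
# while the other nine cores idle along at 20% each.
per_core = [100] + [20] * 9

overall = sum(per_core) / len(per_core)
print(overall)  # 28.0 -- the graph says "mostly idle", core 0 says otherwise
```

      Overall-utilisation graphs average exactly like this, which is why a single-thread bottleneck can hide behind a low number.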

      Even the 7800X3D is CPU-limited in STALKER 2 in any NPC area.

      • latenightnoir@lemmy.blahaj.zone · 0 points · edited · 4 days ago

        Sorry, yeah, forgot the deets. It’s a 9700K; none of the cores were overworked, and 60% seemed to be the average usage across them.

        And, yeah, checked in NPC-heavy areas, where the stuttering, lag, and frame times were the worst, and I didn’t have it set to “Ridiculous” - using a combination of High for textures and Med for effects (like shadows and lighting), running it at 1080p on the 3060 and 1440p on the 4080 Super (bumped it up to native, basically). Exclusively on SSD, 32 Gigs of RAM.

        Edit: no upscaling because the input lag was horrid.

        • AdrianTheFrog@lemmy.world · 1 point · 3 days ago

          Still, even if every thread looks like it’s always at 60%, a load that appears and disappears very quickly gets averaged out on the graph (as it could in an unoptimised or unusual situation), so it could still be a factor. I think the only real way to know is to benchmark. You could try underclocking your CPU and seeing if performance gets worse, if you really want to know.

        • WhiteBurrito@lemmy.world · 0 points · 4 days ago

          I think that’s part of it… The 9700K is like 7 years old at this point, and I’m all for holding off on upgrades if it does what you want, but eventually you’ll have to if you want to actually take advantage of that 4080.

          • latenightnoir@lemmy.blahaj.zone · 0 points · edited · 4 days ago

            Well aware of that, but no game has ever had issues with it so far, so…

            And I even run it without any OC, because it handles everything I throw at it juust fine.

            Edit: plus, to be honest… if things keep going the way they’re going, I can see a clear cut-off point for me around gaming… Very few new games I’d actually want to play, and I own every game I’ve ever enjoyed playing.

            Edit 2: as an example, I can run Cyberpunk 2077 with everything cranked up to 11 and my system’s actually chillin’.

            • daellat@lemmy.world · 1 point · 3 days ago

              I think you’d be surprised at how much headroom a fast CPU will give you in some games at that resolution and with that GPU. I’ve got a similarly performing one (7900 XTX) and I swear by my 7800X3D. Every game (apart from STALKER 2) just feels so responsive. But it’s no cheap upgrade, that’s for sure.

    • faintwhenfree@lemmus.org · 0 points · 4 days ago

      I remember reading that the real sell to developers is fewer calculations: currently, textures have to be designed for different lighting, which would require pre-rendering the same textures across multiple lighting setups. And that is time- and resource-intensive for developers.

      Ray tracing is a simpler solution. I’m not an expert, but that seemed sensible to me.

      • AdrianTheFrog@lemmy.world · 1 point · 3 days ago

        I heard the Source 2 editor has (relatively offline, think Blender-viewport-style) ray tracing as an option, even though no Source 2 games support any sort of real-time RT. It’s just so artists can estimate what the light bake will look like without actually having to wait for it.

        So what people are talking about there is lightmaps: essentially a whole other texture, on top of everything else, that holds diffuse lighting information. It’s “baked” in a lengthy ray-tracing process that can take anywhere from seconds to hours to days, depending on how fast the baking system is and how hard the level is to light. This puts the ray-traced lighting information directly into a texture, so it can be read in fractions of a millisecond like any other texture. It’s great for performance, but it can’t be quickly previewed, can’t show the influence of moving objects, and technically can’t be applied to any surface with a roughness other than full. So it covers most diffuse objects but basically no metallic ones; those usually use light probes and bent normals, and sometimes take lightmap information anyway, although that isn’t technically correct and can produce weird results in some cases.

        The solution to lighting dynamic objects in a scene with lightmaps is through a grid of pre baked light probes. These give lighting to dynamic objects but don’t receive it from them.
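
        To make the “read like any other texture” point concrete, here’s a toy sketch of a lightmap lookup; the 2×2 map, the UV convention, and the bilinear filter are illustrative, not any particular engine’s API:

```python
# Toy sketch: baked lighting is just a texture read at runtime.
# The lightmap contents and sampling convention here are made up
# for illustration.

def sample_lightmap(lightmap, u, v):
    """Bilinearly sample baked diffuse lighting at (u, v) in [0, 1]."""
    h, w = len(lightmap), len(lightmap[0])
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = lightmap[y0][x0] * (1 - fx) + lightmap[y0][x1] * fx
    bot = lightmap[y1][x0] * (1 - fx) + lightmap[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# A 2x2 baked map: a bright corner fading to a dark one.
baked = [[1.0, 0.5],
         [0.5, 0.0]]
print(sample_lightmap(baked, 0.5, 0.5))  # 0.5 -- one filtered read, no rays at runtime
```

        The expensive ray tracing happened once, at bake time; at runtime the shader only pays for that filtered fetch, which is exactly why lightmaps are fast, and exactly why they can’t react to moving objects.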

      • latenightnoir@lemmy.blahaj.zone · 0 points · 4 days ago

        Honestly, this would never have been an issue if we hadn’t switched to “release fast, fuck quality, crunch ya’ plebs!” It’s yet another solution to a self-generated problem.

        • faintwhenfree@lemmus.org · 1 point · 4 days ago

          I don’t know who is downvoting you, but releasing fast at the cost of quality definitely makes the problem worse, because people keep buying half-baked, unfinished games.