• MudMan@fedia.io · 6 hours ago

    I agree that it’s a meme comparison anyway. I just found it pertinent to call out that remasters have been around for a long time.

    I don’t know that I agree on the rest. I don’t think I’m aware of a lazy game developer. That’s a pretty rare breed. TAA isn’t a bad thing (how quickly we forget the era when FXAA vaseline-smearing was considered valid antialiasing for 720p games), and sue me, but I do like good visuals.

    I do believe we are in a very weird quagmire of a transitional period, where we’re using what is effectively a VFX suite to make games that aren’t meant to run in real time on most of the hardware actually running them, and that are simultaneously too expensive, too large, and aimed at waaay too many hardware configs. It’s a mess out there and it’ll continue to be a mess, because the days of a 1080 Ti being a “set to Ultra and forget it” deal were officially over the moment we decided we were going to sell people 4K monitors running at 240 Hz alongside games built for real-time ray tracing.

    It’s not the only time GPUs and software have been stuck in a weird interaction (hey, remember when every GPU had its own incompatible graphics API? I do), but it’s up there.

    • warm@kbin.earth · 6 hours ago

      TAA is absolutely a bad thing, I’m sorry, and it’s way worse than FXAA, especially when combined with the new ML upscaling shit.
      It’s only really a problem with big games, or more specifically UE5 games, since temporal AA is baked into the engine.

      Yeah, there was that perfect moment in time where you could just set everything to max, have some nice SMAA on, and be happy with >120fps. The 4K chase started, yeah, but the hardware we have now is ridiculously powerful and could run 4K 120fps natively no problem, if the time were spent achieving that rather than throwing in more lighting effects no one asked for, speedrunning development, and then slapping DLSS on at the end to try and reach playable framerates, making the end product a blurry, ghosting mess. Ugh.

      • MudMan@fedia.io · 5 hours ago

        Hell, no. 120 fps wasn’t even a thing. That flash-in-the-pan moment was when 1080p60 was the PC standard and 720p30 the console standard, and the way the hardware worked, you could hit max settings on a decent PC every time. It lasted like three or four years and it was wonderful.

        By the time we started going above the NTSC spec on displays, the race was lost. The 20 series came out, people started getting uppity about framerate while playing some 20-year-old game, and it all went to crap on the PC front.

        As for AA, I don’t think you remember FXAA well, at least not in relation to what we have now. ML upscaling is so much sharper than any tech we had a couple of gens ago, short of MSAA (and frankly even MSAA). The problems that have become familiar in many UE5 games aren’t intrinsic to the tech; they have a lot to do with what the engine does out of the box and just how out of spec some of the feature work is.

        I feel like people have gotten stuck on some memes (no motion blur! DLSS bad! TAA bad!) that are mostly nostalgia for how sharp 1080p used to look compared to garbage-tier sub-720p, sub-30fps console games. It’s getting to the point where I have so many major gripes with a lot of modern games, but it becomes one of those conversations you can’t have in public because it gets derailed immediately.

        In any case, I think we can at least agree that it’s been an awkward couple of generations of PC hardware and software, for whatever reason, and that GPUs, engines, and displays need to get realigned so people can just fire up games and expect them to look and run as designed.