Subject: Re: New Nvidia cards - 5xxx series
From: rstowleigh (at) *nospam* x-nospam-x.com (Rin Stowleigh)
Newsgroups: comp.sys.ibm.pc.games.action
Date: 11 Jan 2025, 13:13:04
Message-ID: <dtm4ojdvmu5t3hhavpjmhetlgb79me2jkm@4ax.com>
User-Agent: Forte Agent 4.0/32.1071
On Sat, 11 Jan 2025 11:21:27 +0100, "Werner P." <werpu@gmx.at> wrote:

> On 09.01.25 at 13:56, Rin Stowleigh wrote:
>> Guess I'll be sticking with my RTX 4080 S for now. The new cards seem
>> to be getting their performance boost mostly artificially from frame
>> generation. Which might be fine for certain single player games but
>> just adds input lag, and that sucks for multiplayer.
>> Not that I really need more frames right now anyway.
>> At least it's nice to know that driver technology is finally
>> addressing the death of Moore's law, so after having switched to 1440p
>> for gaming I won't have to worry about ever going back to 1080p.
>> There's a discussion of the issue here if interested (and no shortage
>> of other YT rants on the subject if you look for them).
>> https://www.youtube.com/watch?v=_rk5ZTqgqRY
> AMD seems to finally have gotten AI upscaling right with the next
> iteration of cards, and Intel is swiftly getting better at the lower
> end. I can see the days of my trusty 2080 coming to an end within the
> next 2 years, and then it will be either Intel or, more likely, AMD!
> AI upscaling was the main reason for me to stick with Nvidia, but the
> Linux support, which is important to me, has always been a huge pain,
> especially when it came to Wayland. Add on top the immense price hikes
> on their cards, and I will leave the camp with the next iteration, like
> I left Intel for AMD when Ryzen came out!
>
Nvidia is still so far ahead in the GPU race overall that if you ever
have any interest in non-gaming uses, like running generative AI
locally or any form of content creation that can use the GPU, there is
still no contest.
On the CPU side, Intel had some misfortune with the whole
over-volting thing.
But don't believe any of the early videos comparing the Core Ultra 9
chips to the newer Ryzens, where certain games like FC6 and Cyberpunk
showed dramatically higher framerates on the Ryzens. A Windows update
released a few weeks after the Core Ultra chips came out fixed a
flawed driver and corrected the performance, and some games released
patches that did the same -- for example, the December Cyberpunk patch
brought framerates around 30% higher than what had been reported upon
the initial release of the Core Ultras. Sadly, it seems not many YT
click-bait artists are updating their benchmarks, I guess because
slamming Intel generates more views (and thus revenue) than videos
that say "sorry, we jumped the gun and were wrong".
I wasn't optimizing for gaming when I got the 285K, more for
everything else (software development and content creation). However,
at least on ultra settings at 1440p, most of my gaming benchmarks are
close enough to the best anyone else is getting on an Intel i9-14900K
or a high-end Ryzen chip at the same settings on an RTX 4080 Super,
and yet the CPU and GPU stay staggeringly cool in this system... I
never hear the fans come on in this rig at all, even though in every
benchmark it is between 2.5x and 3.5x the speed of my last system.