On Tue, 27 May 2025 12:02:32 -0400, in comp.sys.ibm.pc.games.action,
Spalls Hurgenson wrote:
> Still, I think Azor has a point. Gamers have largely spoken (through
> their wallets) on the issue, and what they've said is that "HD is
> more than good enough". We've come to the point where video game
> visuals are more than satisfactory already, and the added cost of
> upgrading to higher-end hardware --be it video cards, monitors or
> whatever-- just isn't worth the price. In fact, with the popularity of
> handhelds like the Nintendo Switch and Valve's Steam Deck, HD
> resolutions got a second lease on life; the small screen size of
> those devices makes the pixel density of 1080p more than sufficient.
>
Exactly. The only resolution upgrade that has caught my eye is 1440p, and
the accompanying video card price rules it out, afaic.
Fact: Nvidia cards are just stupid expensive. I'd wanna throw an RTX
xx70 or xx70 Ti at 1440p, and I'm just not interested in spending that
kind of money on it with the half-assed effort the RTX 5070 is. The 4070
too, for that matter. I don't know why Nvidia even bothers if they're
going to nerf their hardware like that.
So, 1080p with real HDR and stupid refresh, with adaptive sync, is it.
To give you an idea, I game rn on a 27" 1000-nit screen with G-Sync (and
FreeSync) that has a max refresh of 240 Hz.
I use 144 Hz for day-to-day desktop use.
For gaming, I limit frame generation to 75 fps so the card doesn't get
hot. Sometimes even passive cooling works. Because it's adaptive refresh,
maxing out at 75-90 frames on a 120 Hz mode is just fine. Rendering ~60 fps
to 120 Hz seems to be the sweet spot for me. Steady, fluid, beautiful.
For that experience, the monitor cost around $210(!), and the card, a
4060 Ti with 16GB RAM, was $490ish. Which is almost as much as my 1080
cost years ago ($550).
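If anyone wonders what a frame cap actually does mechanically, here's a
toy Python sketch (my own made-up example, not what any driver or game
really runs): render the frame, then sleep off whatever's left of the
budget. With adaptive sync the panel refreshes whenever the frame lands,
so the card loafs along instead of redlining. That's the whole trick
behind running cool.

import time

TARGET_FPS = 75                  # cap well under the panel's max refresh
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~13.3 ms per frame

def render_frame():
    # stand-in for the real update + draw work
    time.sleep(0.005)            # pretend rendering took 5 ms

def game_loop(frames=300):
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # sleep off the unused budget; the GPU idles instead of
            # racing ahead, which is why the card stays cool
            time.sleep(FRAME_BUDGET - elapsed)

if __name__ == "__main__":
    game_loop()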
Why anyone would want to go to 4k with an RTX xx80 minimum card at these
prices is beyond me. "But... but, bigger numbers!" WTH?*
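For scale, here's the plain arithmetic (nothing vendor-specific about it):

# Raw pixels per frame at common resolutions -- this alone is most of
# why 4K wants an xx80-class card while 1080p doesn't.
RESOLUTIONS = {"1080p": (1920, 1080),
               "1440p": (2560, 1440),
               "4K":    (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px/frame ({pixels / base:.2f}x 1080p)")

That's 4x the pixels of 1080p, every frame, before you even think about
ray tracing.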
If AMD can come in lower than Nvidia on that, they will win gaming.
Nvidia doesn't even seem to care about the gaming market any more.
> So AMD is probably right to bet on lower-powered, less-expensive cards
> that only sport 4-8GB RAM. Nvidia might be all the rage in the news
> with their 12/16/24/32GB monstrosities (complete with 600W power
> requirements... I think my toaster uses less!) but outside of
> gamer-super-enthusiasts (and crypto/AI-bros) there's not really much
> call for that sort of performance.
>
I would prefer 12-16 GB, just for the headroom. 4 GB is a no-go.
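Rough, back-of-the-envelope math on why (all ballpark assumptions of
mine, not numbers measured from any real game):

# Back-of-the-envelope VRAM math -- my own ballpark assumptions, not
# figures from any real game -- for why 4 GB is a no-go.

def mib(n_bytes):
    return n_bytes / (1024 ** 2)

def render_targets_mib(width, height, bytes_per_pixel=4, buffers=3):
    # e.g. color + depth + one post-processing buffer, uncompressed
    return mib(width * height * bytes_per_pixel * buffers)

def texture_mib(size, bytes_per_texel=1.0, mip_factor=4 / 3):
    # BC7-style block compression is ~1 byte per texel; a full mip
    # chain adds roughly another third on top
    return mib(size * size * bytes_per_texel * mip_factor)

if __name__ == "__main__":
    targets = render_targets_mib(2560, 1440)
    one_tex = texture_mib(4096)
    print(f"1440p render targets:   {targets:6.0f} MiB")
    print(f"one 4096x4096 texture:  {one_tex:6.0f} MiB")
    print(f"200 of those textures:  {200 * one_tex:6.0f} MiB")
    # ~42 MiB of targets plus ~4.2 GiB of textures alone; a 4 GB card
    # is already evicting before geometry, shaders and the OS take a cut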
> I'm sure the developers will be pissed though. For too long they've
> been riding on the wave of ever-more powerful machines looming on the
> horizon to excuse their sloppy code. Maybe now we'll get some
> optimization in.
Nah. That'll *NEVER* happen. Besides, what's sloppy at this point is the
drivers, not the game code. The worst the game code gets is waiting for
the shaders to pre-compile, imo. If AMD does really good Vulkan, and
decent generalized D3D 12.2 for any game that doesn't do Vulkan, that'll
be slay. All this "game ready" optimization just seems to break shit. I
haven't gone to studio drivers yet, but I probably should.
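At least the pre-compile wait is a one-time cost, since the results get
cached. Toy sketch of the pattern (not any real driver's or engine's
code, just the shape of the idea):

# Toy shader-cache sketch: hash the source, compile on a miss, load
# from disk on a hit. Shows why the shader wait only stings on the
# first run.

import hashlib
import pathlib
import time

CACHE_DIR = pathlib.Path("shader_cache")    # made-up location

def compile_shader(source: str) -> bytes:
    time.sleep(0.5)                          # pretend compiling is slow
    return source.encode()                   # stand-in for real bytecode

def get_shader(source: str) -> bytes:
    CACHE_DIR.mkdir(exist_ok=True)
    blob_path = CACHE_DIR / hashlib.sha256(source.encode()).hexdigest()
    if blob_path.exists():                   # warm run: instant
        return blob_path.read_bytes()
    blob = compile_shader(source)            # cold run: the wait/stutter
    blob_path.write_bytes(blob)
    return blob

if __name__ == "__main__":
    for run in ("cold", "warm"):
        t0 = time.perf_counter()
        get_shader("void main() { /* ... */ }")
        print(f"{run} run: {time.perf_counter() - t0:.2f}s")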
For future purchases, ray tracing is the only advance I want, and not as
a proprietary gimmick, the way bump mapping, tessellation, and hardware
T&L first were. When it's mature. When you can render a full scene. Maybe
around Windows 15 and D3D 14.5?
And if that never happens, big deal; I'm fine with my 4060 Ti.
Low-end 50xx cards hold no interest for me at all.
So I'd be happy to switch to AMD after this card if they do this right.
It's a refreshing strategy.
--
Zag

What's the point of growing up if you can't be childish sometimes?
                                            ...Terrance Dicks, BBC

* 4k from 10' on anything less than a 90" TV screen is stupid too. Mine
is 55". Even 720p is fine in that case, though I prefer 1080p. Bigger.
Numbers. Are. For. Weenies.
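For anyone who wants the rough math behind that, assuming the usual rule
of thumb that 20/20 vision tops out around 60 pixels per degree (my own
quick calculator, nothing official):

# Pixels-per-degree for a 55" 16:9 screen viewed from 10 feet, versus
# the ~60 px/deg that 20/20 vision can resolve (rule-of-thumb figure).
import math

def pixels_per_degree(diag_in, horiz_res, distance_ft, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # screen width
    distance_in = distance_ft * 12
    degrees = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return horiz_res / degrees

for horiz, name in ((1280, "720p"), (1920, "1080p"), (3840, "4K")):
    ppd = pixels_per_degree(55, horiz, 10)
    print(f'55" {name} at 10 ft: {ppd:.0f} px/deg')

720p lands right around the limit from that distance, 1080p is already
past it, and 4K is roughly triple it. Same math is why 1080p looks
pin-sharp on a little handheld screen.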