They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 at 1080p. It just turns out that when you take those system requirements and put them on a PC, especially one with a 1440p or 2160p ultrawide, they translate into pretty top-of-the-line hardware. Particularly if, as a PC gamer, you expect to run it at 90fps and not the 30-40 that is typical for consoles.
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
That's a waste of image quality for most people. You have to sit very close to a 4K display to be able to perceive the full resolution. On a PC you could be 2 feet from a huge gaming monitor, but an extremely small percentage of console players have the TV size and viewing-distance ratio where they would get much out of full 4K. It's much better to spend the compute on a higher framerate or higher detail settings.
I think higher detail is where most of it goes. A lower resolution, upscaled image of a detailed scene, at medium framerate reads to most normal people as "better" than a less-detailed scene rendered at native 4k, especially when it's in motion.
> You have to sit very close to a 4k display to be able to perceive the full resolution.
Wait, are you sure you don't have that backward? IIUC, you don't[] notice the difference between a 2K display and a 4K display until you get up to larger screen sizes (say 60+ inches give or take a dozen inches; I don't have exact numbers :) ) and with those the optimal viewing range is like 4-8 feet away (depending on the screen size).
Either that, or I'm missing something...
[]Generally, anyway. A 4K resolution should definitely be visible at 1-2 feet away as noticeably crisper, but only slightly.
My first 4K screen was a 24" computer display and let me tell you, the difference between that and a 24" 1080p display is night and day from 1-2 feet away. Those pixels were gloriously dense. Smoothest text rendering you've ever seen.
I didn't use it for gaming though, and I've "downgraded" resolution to 2x 1440p (and much higher refresh rates) since then. But more pixels is great if you can afford it.
It's one thing to say you don't need higher resolution and fewer pixels work fine, but all the people in the comments acting like you can't see the difference make me wonder if they've ever seen a 4K TV before.
I still use 4K@24", unfortunately they're getting scarce. 4K@27" is where it's at now unfortunately. But I'll never go back to normal DPI. Every time at the office it bugs me how bad regular DPI is.
That's fair, but it makes me wonder if perhaps it's not the resolution that makes it crisper but other factors that come along with that price point, such as refresh rate, HDR, LCD layer quality, etc.
For example, I have two 1920x1080 monitors, but one is 160 Hz and the other is only 60 Hz, and the difference is night and day between them.
It’s best to think about this as angular resolution. Even a very small screen could take up an optimal amount of your field of view if held close. You get the max benefit from a 4k display when it is about 80% of the diagonal screen distance away from your eyes. So for a 28 inch monitor, that’s a little less than 2 feet, a pretty typical desk setup.
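To put rough numbers on it, here's a back-of-the-envelope sketch assuming ~1 arcminute per pixel as the acuity limit (that threshold, the 16:9 shape, and the function name are my own assumptions, just for illustration):

```python
import math

def acuity_distance_in(diagonal_in, horiz_px, vert_px, arcmin_per_px=1.0):
    """Distance (inches) at which one pixel subtends `arcmin_per_px` arcminutes.
    ~1 arcminute per pixel is a common stand-in for 20/20 acuity; sit farther
    back than this and you can no longer resolve individual pixels."""
    aspect = horiz_px / vert_px
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / horiz_px                  # physical size of one pixel
    return pixel_pitch_in / math.tan(math.radians(arcmin_per_px / 60.0))

for name, diag in [('28" monitor', 28), ('65" TV', 65)]:
    d = acuity_distance_in(diag, 3840, 2160)
    print(f'4K {name}: ~{d:.0f} in (~{d / 12:.1f} ft, ~{d / diag:.2f}x the diagonal)')

# 4K 28" monitor: ~22 in (~1.8 ft, ~0.78x the diagonal) -> typical desk distance
# 4K 65" TV:      ~51 in (~4.2 ft, ~0.78x the diagonal) -> closer than most couches
```

The ~0.8x-the-diagonal figure falls out of the geometry: at that distance a 4K pixel sits right at the acuity threshold, which is also why a 65" TV has to be sat at roughly 4 feet before full 4K pays off.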
Assuming you can render natively at high FPS, 4k makes a bigger difference on rendered images than live action because it essentially brute forces antialiasing.
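It's essentially the same principle as supersampling: render more samples per edge than the eye (or the output grid) resolves and the jaggies average out. A minimal numpy sketch of that averaging step, with a toy edge and function name of my own invention:

```python
import numpy as np

def box_downsample(img, factor=2):
    """Average each `factor` x `factor` block of pixels -- the core of
    supersampling AA: excess resolution gets averaged into smooth edges."""
    h, w = img.shape[:2]
    h, w = h - h % factor, w - w % factor              # crop to a multiple of `factor`
    blocks = img[:h, :w].reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# Toy 4K frame with a hard vertical edge that doesn't line up with the 1080p grid.
hires = np.zeros((2160, 3840, 3), dtype=np.float32)
hires[:, 1921:] = 1.0
lores = box_downsample(hires, factor=2)                # 1080p-sized result
print(lores.shape, lores[0, 959:962, 0])               # the boundary pixel becomes a blended grey
```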
I think you're underestimating the computing power required to render (natively) at 4K. Some modern games can't even natively render at 1440p on high-end PCs.
1440p and 2160p are a total waste of pixels when 1080p is already at the level of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super crisp text, but not for a game. HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
> 1440p and 2160p is a total waste of pixels, when 1080p is already at the level of human visual acuity.
Wow, what a load of bullshit. I bet you also think the human eye can't see more than 30 fps?
If you're sitting 15+ feet away from your screen, yeah, you can't tell the difference. But for most people, with their eyes only being 2-3 feet away from their monitor, the difference is absolutely noticeable.
> HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
HDR is an absolute game-changer, for sure. Ray-tracing is as well, especially once you learn to notice the artifacts created by shortcuts required to get reflections in raster-based rendering. It's like bad kerning. Something you never noticed before will suddenly stick out like a sore thumb and will bother the hell out of you.
Text rendering alone makes it worthwhile. 1080p densities are not high enough to render text accurately without artefacts. If you double the pixel density, then it becomes (mostly) possible to render text weight accurately, and things like "rhythm" and "density", which real typographers concerned themselves with, start to become apparent.
You're probably looking up close at a small portion of the screen - you'll always be able to "see the pixels" in that situation. If you sit far back enough to keep the whole of the screen comfortably in your visual field, the argument applies.
You are absolutely wrong on this subject. Importantly, what matters is PPI, not resolution. 1080p would look like crap in a movie theater or on a 55" TV, for example, while it'll look amazing on a 7" monitor.
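For concreteness, a quick PPI calculation (my own back-of-the-envelope, assuming a 16:9 1080p panel):

```python
import math

def ppi(horiz_px, vert_px, diagonal_in):
    """Pixels per inch along the panel's diagonal."""
    return math.hypot(horiz_px, vert_px) / diagonal_in

for size in (7, 24, 55):
    print(f'1080p at {size}": {ppi(1920, 1080, size):.0f} PPI')

# 1080p at 7":  ~315 PPI (very sharp)
# 1080p at 24":  ~92 PPI (typical desktop monitor)
# 1080p at 55":  ~40 PPI (visibly coarse up close)
```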