I CAN'T, THE FOMO IS TOO STRONG and I hate myself because of it!!!!
You can't buy one though m8, we all have to stay strong and HOLD!
Game seems to be heavy on the CPU for whatever reason... CPU limited at 4K with only a 17% performance bump. Does that seem off to anyone else?
Something to do with texture streaming, iirc.
Seems very suss to be cpu limited at 4K.
DLSS 2 giving only 17%, when it's normally more like a 60-100% improvement or something.
Watch the video I posted above, this particular game hits the system hard... he is seeing a noticeable difference going from PCIe 3.0 to 4.0 and is saying memory bandwidth is having a noticeable impact as well; with RT on at 4K it seems to be at its worst.
I don't buy the whole CPU limited thing either, something is extremely off about all this, and the embargo on frame-rate figures until just before release is WTF, yet only one set of fps figures has been released, in the form of a small video from Nvidia.
I don't trust this one bit. There is no way a 12900K is somehow CPU limited, let alone any of the other 12th-gen chips.
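As a sanity check on the numbers being thrown around: if the game really does hit a CPU ceiling, the displayed frame rate is roughly the minimum of what the CPU and the GPU can each deliver, so DLSS gains flatten out as soon as the GPU side clears the CPU cap. A minimal sketch with made-up figures (the 70 fps CPU cap and both GPU numbers below are purely illustrative, not benchmark data), just to show how a ~17% uplift can fall out of that:

```python
# Toy bottleneck model: displayed fps is capped by the slower of CPU and GPU.
# All numbers below are illustrative, not measurements.

def displayed_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is limited by whichever side finishes its work last."""
    return min(cpu_fps, gpu_fps)

cpu_cap = 70.0          # hypothetical CPU-side limit (simulation + draw submission)
gpu_native_4k = 60.0    # hypothetical native-4K GPU throughput
gpu_with_dlss = 100.0   # hypothetical GPU throughput with DLSS upscaling

native = displayed_fps(cpu_cap, gpu_native_4k)   # 60 fps, GPU-bound
dlss = displayed_fps(cpu_cap, gpu_with_dlss)     # 70 fps, now CPU-bound

print(f"Native 4K: {native:.0f} fps, with DLSS: {dlss:.0f} fps "
      f"({(dlss / native - 1) * 100:.0f}% uplift, even though the GPU side "
      f"got {(gpu_with_dlss / gpu_native_4k - 1) * 100:.0f}% faster)")
```

Under those assumed numbers the measured uplift comes out at 17% despite a much larger GPU-side speedup, which is what a CPU-bound result would look like.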
For some reason the press hype around this doesn't sit right, like something fishy is afoot. There just doesn't seem to be any genuine excitement from the press about this, and all we are seeing is the mass press repeating the exact same script Nvidia have given them.
I have a bad feeling about this....
Maybe it's just Spiderman and other titles that have something engineered into the code for such a time as this... The long game so to speak?
Photoshop. The flanking girls are both the same height at 6'3".
Is the 4090 really tall, or are the 1060 and 2070 Super really short? Either way, my answer is yes, yes, yes, yes.
DLSS 3 is stupid anyway. Just decrease some settings to keep the frames high and use DLSS 2 => problem solved without the latency increase and the artifacts that come with DLSS 3.
Why use DLSS 3? To brag that you play at "ultra settings"? If you don't notice the artifacts due to the high frame rates, you won't notice you're playing at high settings instead of ultra either.
I heard they are already working on DLSS 4 and the next frame generation will be so advanced you won't even have to start the game to use it. Just think about what you want to play and Nvidia does the rest.
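To make the latency argument above concrete: DLSS 3 frame generation inserts one AI-generated frame between each pair of rendered frames, so displayed fps roughly doubles while input is still only sampled on the rendered frames, and a frame has to be held back for interpolation. A toy model, where the latency formula is a rough assumption rather than measured DLSS 3 / Reflex behaviour:

```python
# Toy comparison: frame generation vs. simply rendering faster.
# The "latency" figure is a crude stand-in (base frame time plus one held
# frame for interpolation), not a measurement of real DLSS 3 behaviour.

def with_frame_gen(rendered_fps: float) -> tuple[float, float]:
    """Displayed fps doubles; latency tracks the rendered rate plus roughly
    one extra rendered-frame interval held back for interpolation."""
    frame_time_ms = 1000.0 / rendered_fps
    return rendered_fps * 2, frame_time_ms * 2

def without_frame_gen(rendered_fps: float) -> tuple[float, float]:
    """No generated frames: displayed fps equals rendered fps."""
    return rendered_fps, 1000.0 / rendered_fps

for label, (fps, lat) in [("60 fps rendered + frame gen", with_frame_gen(60)),
                          ("120 fps actually rendered", without_frame_gen(120))]:
    print(f"{label}: {fps:.0f} fps displayed, ~{lat:.1f} ms per real frame")
```

Both cases show 120 fps on a counter, but the frame-generated one still responds like a (slightly worse than) 60 fps game, which is the core of the complaint about using it from a low base frame rate.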
This is exactly how I see it; I can imagine DLSS 3 being utterly horrendous at 60 fps.
Given it's a PS4 game, it looks incredible, but given it used to run on Jaguar cores and the equivalent of something like a 7850, it must be optimised like absolute dogcrap outside of its original hardware config.
Surely this just highlights poor optimisation in the game's development then, since barely any other game exhibits the same issue. This game and the new Spider-Man are the only PC games using the Insomniac engine too, so I would write this off as a one-off TBH. That satisfies me now. Plus, I didn't find Spider-Man all that gripping anyway lol.
Agree. Reviewers will need two charts from now on: fps and ffps (fake frames per second).
The way I see it, DLSS 3 is more of a marketing tool, in my opinion. I don't know how other people will see it, but it's a way to say "hey, we can get 4x the performance compared to the competition", and while technically not correct as it's not apples to apples, they are right. And doesn't it also seem that Nvidia would then be able to lean on this while producing cheaper and cheaper products, charging more and more, and slapping DLSS 3 on them to get you the performance?
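If reviewers did split their charts into the two numbers suggested above, the bookkeeping would be simple, since frame generation produces one generated frame per rendered frame. A minimal sketch (the 112 fps input is an arbitrary illustrative figure):

```python
# Split a displayed frame rate into rendered fps and generated "ffps",
# assuming frame generation inserts one generated frame per rendered frame.

def split_fps(displayed_fps: float, frame_gen_on: bool) -> tuple[float, float]:
    """Return (rendered fps, generated 'ffps') for a displayed frame rate."""
    if not frame_gen_on:
        return displayed_fps, 0.0
    rendered = displayed_fps / 2      # every other displayed frame is rendered
    return rendered, displayed_fps - rendered

fps, ffps = split_fps(112.0, frame_gen_on=True)   # illustrative number only
print(f"Chart entry: {fps:.0f} fps (rendered) + {ffps:.0f} ffps (generated)")
```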
Ahhhhahahahaha, how much is Nvidia paying you?
xx70 is just a name. There's no defining feature of xx70 other than being more expensive/faster than xx60 and cheaper/slower than xx80. I agree that having two quite distinct cards called a 4080 is needlessly confusing, but that doesn't make it an "xx70" or a "straight rip off". As for price/performance, it looks like both 4080s are much better value than similarly priced 30xx cards (based on what Nvidia's claiming), and they'll almost certainly offer better bang for buck than the 4090 in real-world usage.