We know what AMD are releasing: a 7700XT renamed as a 7800XT, since the real one got renamed to a 7900XT and price-jacked. Don't expect it to be much faster than a 6800XT, as it will likely have just 60 CUs.
Far Cry 6 at 4K while using the optional texture pack.
No, I meant for modern, finished games.
Are there such things anymore?
Everyone seems to forget this: the whole stack has shifted up one tier in terms of pricing. You can see this just by looking at the current pricing of the 4080, which is a 70-class card but priced one tier up, except it's also been gouged a lot higher out of greed.
Yup... second time they've done this.
Last time was the 680 which were the mid tier chips they released as high end because AMD had fallen so far behind. They later released the 780, which was the real high end chip.
If only we had real competition in the market... hopefully AMD will catch up.
But seeing as both AMD & Nvidia's CEOs are of the same family - I have a funny feeling there may be some verbal agreements in the background stopping it.
I want it for the performance boost - in theory we're now 2 generations behind what NVidia are actually capable of, but it's possible they scaled back R&D to cut costs.
Price fixing and market manipulation, who would have thought it could happen!
Unfortunately very true.
Other than the 4090, AMD have caught up. Raster performance is strong, FSR competes with DLSS and RT is usable. Not sure what else people expect them to do?
They have caught up with two-year-old Ampere RT; they are still behind in RT. They're worse in VR and power efficiency/consumption.
FSR is hit and miss too, although it's definitely less of an issue now since uptake is quicker/better.
They don't have a frame generation competitor yet.
Etc. Etc.
It's basically like RDNA 2 vs Ampere all over again, i.e. if you want to save some money and only care about raster, then go for AMD.
I love that nVidia buyers are excited for fake frames. Spending thousands on hardware that comes with software to make it feel faster. Like someone else said, nVidia are great at inventing problems so they can sell you the solution.
The fake frames are only an issue when you don't have enough base frames to work with; 60+ base becoming 120+ works well, not so much on the lower end.
It works best in the place it's needed least. I'm not completely dismissing it and I can see benefits, but currently the main one appears to be skewing bar charts at the high end.
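For anyone who wants the arithmetic behind that: interpolation-style frame generation has to hold back one real frame so it has two frames to interpolate between, so the added input delay is roughly one base frame time. A rough sketch (the one-frame buffering is a simplifying assumption; real DLSS 3 pipelines add their own overhead on top):

```python
# Sketch: why generated frames hurt more at low base frame rates.
# Assumes the generator buffers one real frame so it can interpolate
# between two known frames (a simplification of how interpolation-based
# frame generation broadly works).

def added_latency_ms(base_fps: float) -> float:
    """Extra presentation delay from holding back one real frame."""
    return 1000.0 / base_fps

for base_fps in (30, 60, 120):
    print(f"{base_fps:>3} fps base -> shown as {base_fps * 2} fps, "
          f"~{added_latency_ms(base_fps):.1f} ms extra latency")
```

At a 120fps base the penalty is about 8ms; at a 30fps base it's about 33ms, which is the "works best where it's needed least" problem in numbers.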
DLSS 2
DLSS 3
Frame generation (lame)
Ray tracing.
Tensor cores for AI processing.
Floating point performance (e.g. 48.74 TFLOPS - see the quick sum below this list)
Huge amounts of VRAM
G.Sync
'Low latency' modes
Support for very high refresh rates
Fancy new power connectors
Ampere +++
Do these things seem familiar?
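Take the TFLOPS line as an example: that headline number is just spec-sheet arithmetic, not a measurement. A quick sketch of where a figure like 48.74 TFLOPS comes from (assuming RTX 4080-ish specs of 9728 shader cores at a ~2.505GHz boost clock; the exact numbers are illustrative):

```python
# A headline FP32 TFLOPS figure is just cores x clock x 2, because
# one fused multiply-add (FMA) per cycle counts as two FLOPs.
cores = 9728              # RTX 4080-style shader core count (assumed)
boost_clock_hz = 2.505e9  # ~2.505 GHz boost clock (assumed)
flops_per_cycle = 2       # one FMA = two floating-point ops

tflops = cores * boost_clock_hz * flops_per_cycle / 1e12
print(f"~{tflops:.2f} TFLOPS")  # ~48.74 TFLOPS
```

Nothing in that sum says how well those shaders are actually fed, which is rather the point.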
None of these things are direct indicators of the performance of the GPU itself. They don't indicate a graphics card with more processing cores, or higher pixel rates / texture rates. Nor do they indicate the overall 3D graphics (rasterising) performance of the card.
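The quoted pixel and texture rates are peak numbers derived the same way, just unit counts multiplied by clock; another sketch with assumed RTX 4080-ish figures (112 ROPs, 304 TMUs):

```python
# Spec-sheet fill rates are just functional-unit counts times clock.
rops = 112              # render output units (RTX 4080-style, assumed)
tmus = 304              # texture mapping units (assumed)
boost_clock_ghz = 2.505

print(f"pixel rate:   ~{rops * boost_clock_ghz:.1f} GPixel/s")   # ~280.6
print(f"texture rate: ~{tmus * boost_clock_ghz:.1f} GTexel/s")   # ~761.5
```

They're ceilings that assume every unit is busy every cycle, which real games never manage.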
We see these features on the box and decide they are must-have features, and it pushes prices up. I admit that DLSS 2 is a very nice thing to have if you have a 1440p/4K monitor. But AMD is now competitive in terms of their resolution upscaling and detail enhancement technologies, so I think we should basically all consider buying AMD this time around...
Of course, I may end up being a big hypocrite if the RTX 4070 / 4070 Ti prices seem affordable. Otherwise, based on the prices of the RTX 4080, Nvidia can do one. They have reverted to type, and we are seeing similar prices to the RTX 2080 Ti (or even higher) again. And endless product variations and re-releases.
I can watch all the videos I'd like, but it doesn't replace trying it where latency is a potential issue. Reserving judgement until I try it myself, which at these prices will be when I'm old, grey, and my vision needs upscaling, not my GPU!
If the 4080 and 4070 Ti are best suited for FG, they should be running things at a high base frame rate anyway, so FG conversely becomes less important, theoretically. It's like DLSS or FSR - no need to turn it on if you're already getting huge frame rates.
I don't decide they are "must have" features, I look at reviews and tech videos from the likes of Digital Foundry to see if it's marketing or a genuine feature.
I've been gaming on a 4k screen now since late 2014; that was on an AMD 295x2 GPU, and performance wasn't great back then.
By 2015 I had upgraded to Titan X Maxwell GPUs in SLI, and suddenly, thanks to G-Sync, I was able to smooth out the variable frame rates when they couldn't do a locked 60fps in a lot of games.
Then the 2080Ti came along with ray tracing, and it wasn't really up to the task of running ray tracing at 4k/decent frame rates. But then Nvidia introduced DLSS - version 1.0 wasn't great, but within months we had DLSS 2.0, and that was a bit of a game-changer really for playing stuff like 'Control'.
The 3090 then came along and gave you a genuine locked 60fps with DLSS 2.
Now 4090 is here and you can easily get a locked 120fps most of the time.
So anyway, that was a lot of waffle but here is what I do use:
DLSS 2
Ray tracing.
Tensor cores for AI processing (DLSS uses these)
Huge amounts of VRAM - Useful for 4k/Ultra textures. Remember the 3080 with 10GB VRAM stuttering on DOOM Eternal?
G.Sync
'Low latency'
Support for very high refresh rates - I use 144Hz on my Alienware OLED and 120Hz on my G2 OLED.
These aren't gimmicks, they are genuinely useful features.
DLSS 3 is a gimmick to me at the moment, but might improve over time.