"Sucks at RT" appears, from the AMD promo material, to mean "around equal to or maybe better than a 3090" - a card which Nvidia still seem to be trying to punt at over £1k, while telling everyone in their 4xxx launch video that 'most gamers should be satisfied with 3000 series performance'.
But sure, it's AMD giving bad value for money. LOL.
I expected them to put the compute units in chiplets but it looks like they just did a normal GPU and added cache chiplets. Is this correct?

Sort of - they've moved the memory controllers and infinity cache out to chiplets, freeing up space on the "main" GPU die, allowing either more room for compute units, or to allow the main GPU die to be smaller/cheaper.
No one even knows what their performance really is yet - AMD's figures could be with the camera pointing directly at the ground or the sky.

I would say buy what you want at the price you are happy with.
Until these 7900 series prices were revealed my only upgrade path was the 4080 16GB, and I was going to have to pay £1200 plus to get it.
Both the 7900 cards are a lot faster than a 3090Ti in raster and about on par for RT. As you go down the stack it means that those of us on 3080 or 6800 have an upgrade path for the same inflation adjusted price.
Nvidia’s plans for a £1200+ 4070 (aka the 4080 16GB) have just been sunk.
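The "same inflation adjusted price" point can be sketched with a quick calculation. The 3080's £649 UK launch MSRP is real; the ~14% cumulative inflation figure is an assumption for illustration only:

```python
# Rough illustration: a 3080's £649 launch MSRP (Sept 2020), adjusted
# for an ASSUMED ~14% cumulative UK inflation over 2020-2022.
LAUNCH_PRICE_3080 = 649       # GBP, Sept 2020 UK launch MSRP
ASSUMED_CUM_INFLATION = 0.14  # assumption, for illustration only

adjusted = LAUNCH_PRICE_3080 * (1 + ASSUMED_CUM_INFLATION)
print(f"Inflation-adjusted 3080 price: £{adjusted:.0f}")  # ~£740
```

On that (assumed) basis, a next-generation card priced around the mid-£700s would be the "same price" upgrade for a 3080 owner.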
Hence the vested interest in pushing DLSS, FSR, XeSS - it lowers the barrier to entry and lets people feel like they're getting a true premium experience and not a Fiesta with flame stickers on the side.

It doesn't matter how fast an RTX4090 is in RT if Nvidia rebrands the RTX4060 as a £900 RTX4080 12GB, or a £450 RTX3070 replacement as a £1300 RTX4080 16GB. That means the actual RTX4060 will be a rebranded RTX3050 replacement with another mediocre performance jump.
Nvidia is also stagnating RT, by saying most of the sub-£1000 market should stick with last generation RT performance, which is dire as my RTX3060TI is still useless at it unless you cheat with DLSS2/FSR.
Basically I am having to degrade image quality just to make some improvements in other areas. So even if AMD "only" matches Ampere RT performance, the reality is that Nvidia is only interested in selling Ampere-level RT performance to most average gamers.
Things such as tessellation have existed since the ATI 8000 series in 2001, yet it took a decade to gain real traction - only during the GCN/Kepler/Maxwell era was it applied in any liberal way.
The most important market is the mainstream market and consoles, because they make up the bulk of sales. So all the new tech means nothing if the average person has a rubbish dGPU or a console. This is determined by price. We are entering a massive global recession - Nvidia is more worried about margins, and so is AMD to a lesser degree. Nvidia/AMD/Intel are all showing massive projected revenue losses - unless the prices drop, most people will be sticking with what they have, i.e. last generation RT performance.
PCMR hardware enthusiasts still don't get it, after 20 years of wondering why games don't look as good as all the tech demos. The same goes for why games don't use 16 cores well: most games use four to six at best, because that is what the affordable CPUs have. They might only need 8 cores one day because of consoles. This is why I was so critical of Zen4 pricing.
If you want more devs to target higher RT effects and more cores, prices need to drop to enable mass market adoption, along with big performance jumps at similar prices. If companies like Nvidia want to sell the RTX4060 for £900 and the RTX4070 for £1300, a couple of PCMR hardware enthusiasts buying an RTX4090 or RX7900XTX won't change much. Even Cyberpunk 2077 and The Witcher 3 were downgraded because console and average PC hardware wasn't good enough. But the games can't be viable without those mass sales.
This is what happened until about 6-7 years ago, when everything seems to have gone down the drain for the mainstream dGPU buyer. This is why games are stagnating visually - devs are targeting maximum sales.
Outside the odd AMD/ATI/Nvidia/Intel tech demo game, which they throw money at to sell their new wares, most games in the last 20 years have been targeted towards maximum sales. That means the average hardware spec of a gaming PC and the consoles will determine the level of RT effects used in most games. The biggest selling games in the world are MOBAs and MMORPGs. Almost all of them can run OK on a relatively slow PC and have cartoony graphics.
Increasingly Microsoft and Sony are buying up game devs and these companies have deep investments in consoles. So ultimately most popular games will just stick with hybrid RT effects for the immediate future.
Not as good as the best =/= sucks. Does Ampere suck at RT?

Once games start taking advantage of the raw horsepower of the 4090 and increase the number of bounces in RT, Ampere won't hold up as well. Plague Tale is an excellent example - the 3090 will not be able to do RT once they patch it in. RT Overdrive mode in Cyberpunk would be unplayable on a 3090.
Sort of - they've moved the memory controllers and infinity cache out to chiplets, freeing up space on the "main" GPU die, allowing either more room for compute units, or to allow the main GPU die to be smaller/cheaper.

That makes more sense, but it does mean it’s not going to be as scalable as some think - you cannot add more CUs, as they're all in the main die. That said, they could do a bigger main die version.
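As a back-of-envelope illustration of why moving the cache and memory controllers off-die helps, here is a sketch using the widely reported approximate Navi 31 figures (a ~304 mm² GCD on 5nm plus six ~37.5 mm² MCDs on 6nm) - treat the exact numbers as assumptions, not official specs:

```python
# Back-of-envelope: why the chiplet split helps cost for Navi 31.
# Figures are approximate public numbers, NOT official AMD specs.
GCD_MM2 = 304.35   # graphics compute die (5nm), approx.
MCD_MM2 = 37.5     # one memory/cache die (6nm), approx.
NUM_MCDS = 6

total_silicon = GCD_MM2 + NUM_MCDS * MCD_MM2
print(f"Total silicon: {total_silicon:.1f} mm^2")

# A hypothetical monolithic design would need all of that area on the
# expensive leading-edge node; with chiplets only the GCD uses 5nm.
share = GCD_MM2 / total_silicon
print(f"5nm wafer area needed: {GCD_MM2:.1f} mm^2 ({share:.0%} of total)")
```

In other words, only a bit over half the silicon sits on the expensive node, and a smaller GCD also yields better per wafer - which is the "smaller/cheaper" option described above.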
Nearly all AAA games are coming with RT, and AMD have again dropped the ball on that front.
Ampere is over 2 years old.....
RT usually makes a bigger difference than going from say high to ultra settings. Most of us here are enthusiasts and want the best.
Call of Duty MWII released last week with zero RT, even though the previous three titles (2019-21) had support. Is this a reflection of consumers' priorities?

It's a multiplayer shooter. Why would it support RT? Even if it did, no one would use it, as performance is key in MP shooters. Also, the previous titles had practically non-existent RT - just RT shadows. I also disabled it in BF2042. These games already look like **** when it comes to graphics (that's by design), so RT is like putting lipstick on a pig.
It's a multiplayer shooter. Why would it support RT? Even if it did, no one would use it, as performance is key in MP shooters.

So it's a multiplayer shooter, probably the most popular genre of game of them all, which RT shouldn't try to run on?
Ah, there's some more of that low rent trolling we were just talking about.

Why does caring about RT = trolling? I want a GPU which is good at both raster and RT, or at the very least completely smokes the 4090 in raster (we can use VSR to upscale and downscale to get beyond the 120Hz limitation). I don't mind paying high prices for this hobby, but AMD had nothing for a customer like me yesterday.
The 7900 XTX, if we believe the figures they gave (and for now we have no others), equals or beats Ampere in RT, and looks to be approaching the 40 series in raster performance (certainly outshining Ampere there). For 60% of the price of the only available 40 series card, and with much better power consumption.
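The "60% of the price" claim roughly checks out against the US launch MSRPs ($999 for the 7900 XTX vs $1,599 for the RTX 4090); a quick sanity check:

```python
# Sanity-check the "60% of the price" claim using US launch MSRPs.
XTX_MSRP = 999       # RX 7900 XTX launch MSRP, USD
RTX4090_MSRP = 1599  # RTX 4090 launch MSRP, USD

ratio = XTX_MSRP / RTX4090_MSRP
print(f"7900 XTX costs {ratio:.0%} of a 4090")  # ~62%
```

UK street pricing will differ (the 4090 is closer to £1,700 here), but the rough proportion holds.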
That certainly looks more like “best” to me than a halo card which costs £1700 and burns the house down while you play. “Most” of us won’t be buying a 4090, even here on an enthusiasts’ forum, so it’s of interest what else the market can offer at various price points.
The 7900XTX is not competing with a 3080Ti. They are different generations for a start.

People upgrading from a 3080 Ti, or any 3000 series card, will be looking at it as an upgrade route. Maybe it makes sense for those on a 3070 or lower, but I doubt that customer base would be looking for a card which costs over a grand.