RDNA 3 rumours Q3/4 2022

It is native. And I don't really care what you or Nvidia want me to think. Games have built-in TAA, and DLSS is an improvement over that. Period. Everything else is fanboy drivel and I want no part in it.

You say you want no part in it, yet you keep proving my point. You have accepted the low bar that Nvidia have set for IQ. All so they can keep telling you “better than native”.
 
Not that I'm happy about these prices... but comments like this really bother me, as they show a significant failure to comprehend basic economics.

To preface... I'm annoyed about the devaluation and over-inflation of tiers that we see from Nvidia... but that's only because they have no competition in the market.

But when you look at actual pricing changes over the last 20 years & account for exchange rates and inflation... the prices aren't all that crazy.

The Radeon X800 XT was released in 2004... it was $499. I seem to remember getting one for about £350.

Exchange rate was something like 1.9 at the time, so even with VAT, I paid a bit over the odds because stock was in short supply.

Now the exchange rate is 1.1... so even at base pricing with VAT, we are looking at around £550, just with the exchange rate.

Then we look at inflation... that comes to around £900.

Then there are various tech-specific supply chain restrictions, limits, expenses & other things that have happened all over the place since then... so, if we stay AMD/ATi-specific... we are looking at a grand total of £100 price increase... in 18 years... 100 quid in EIGHTEEN YEARS.
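For anyone who wants to sanity-check the arithmetic above, here is a rough sketch of the calculation being described. The exchange rates, VAT treatment and cumulative UK inflation multiplier are approximations taken from the post (VAT is applied at a flat 20% for simplicity), not authoritative figures.

```python
# Rough sketch of the price comparison described in the post above.
# All figures are approximations quoted in the post, not authoritative data.

usd_msrp_2004 = 499              # Radeon X800 XT launch MSRP in USD
gbp_per_usd_2004 = 1 / 1.9       # roughly 1.9 USD to the pound in 2004
gbp_per_usd_2022 = 1 / 1.1       # roughly 1.1 USD to the pound in late 2022
vat = 1.20                       # UK VAT applied at a flat 20% for simplicity
uk_inflation_2004_to_2022 = 1.6  # assumed cumulative CPI multiplier (~60%)

price_2004 = usd_msrp_2004 * gbp_per_usd_2004 * vat
price_2022_fx_only = usd_msrp_2004 * gbp_per_usd_2022 * vat
price_2022_fx_and_inflation = price_2022_fx_only * uk_inflation_2004_to_2022

print(f"2004 price at the 2004 exchange rate:   £{price_2004:.0f}")
print(f"Same USD MSRP at today's exchange rate: £{price_2022_fx_only:.0f}")
print(f"...and adjusted for UK inflation:       £{price_2022_fx_and_inflation:.0f}")
```

With those assumed inputs, that works out to roughly £315 then, ~£545 at today's exchange rate, and ~£870 once inflation is included, which is where the "around £900" figure comes from.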

Ooooooh... massive price increase... The fact that your wages haven't tracked exchange rates and inflation is NOT the problem of these companies - it's a problem for your governments and beyond. Get your head on straight and look at the wider world.
I've already posted this before and I'll post it again: a 1080 Ti these days would be around £830. Not £1k or £1.5k or £1.7k. Stop justifying paying out of your rear when companies ask for whatever they want.
 
RT may well be the future, but until someone comes up with the Model T of the RT world it's just going to be the exclusive preserve of well-heeled gamers. People are going to stick with their horses and carts because the small advantages offered by these expensive Mercedes and Rolls-Royces simply aren't affordable.
Well, the other thing is that if we consider what it would require to do "full RT everything", the costs would be huge. (POV-Ray still gets used as a CPU benchmark by certain sites, and none of those scenes render anywhere near real time.)

And this is at the same time as node progress has slowed down a lot (and cost per transistor is barely moving with new nodes).

So, brute-force RT is going to run into all kinds of obstacles, even if some people wouldn't mind spending £10k on a 4000W monster card.
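To put some illustrative numbers on why brute-force path tracing is so demanding, here is a back-of-the-envelope sketch. The resolution, frame rate, sample count and bounce count are hypothetical example values, not figures from this thread or any vendor.

```python
# Back-of-the-envelope ray budget for brute-force path tracing one frame.
# Every parameter below is an illustrative assumption, not a measured figure.

width, height = 3840, 2160    # 4K output resolution
fps = 60                      # target frame rate
samples_per_pixel = 64        # samples for a reasonably clean image without a denoiser
rays_per_sample = 4           # primary ray plus a few bounces per sample

rays_per_frame = width * height * samples_per_pixel * rays_per_sample
rays_per_second = rays_per_frame * fps

print(f"Rays per frame:  {rays_per_frame / 1e9:.1f} billion")
print(f"Rays per second: {rays_per_second / 1e9:.0f} billion")
# Far beyond what consumer hardware can sustain once shading, BVH traversal
# and memory traffic are included, which is why real games fall back on low
# sample counts, denoisers and hybrid raster techniques instead of brute force.
```

Even with these fairly modest assumptions you end up needing on the order of a hundred billion rays per second, before any shading work is counted.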

Now people may look down on raster lighting, but there are some clever "cheats" being used there, and I think RT will similarly need some clever thinking.

And I don't mean upscalers.

Of course, my other gripe is that in games I don't necessarily want ultimate realism, as I don't want scenes which are 90% darkness, being blinded by 1200-nit HDR explosions, excessive lens flares or similar. Even the ENB lighting mods in the likes of Skyrim tend to be too extreme for me.
 
Snipped the post to save space
IMHO, after reading this, all videos from this person on YouTube or Twitter should be blocked from being posted on the forums.

I see the leakers just had to cash out a few days before the official announcement, and the one you highlighted seems to be cashing out after, lol. But there's no point in trying to hold them accountable, as the PC community seems to forget five minutes later and forgive them. Remember AdoredTV with his wild Zen2 predictions and the multiple meltdowns that followed? It got so bad that a lot of people wanted anything to do with him banned from the AMD subreddit.

No AMD store purchases for people in the UK since Brexit. Retailers only for us............
I've got a leak that nobody knows about (that I didn't just make up :D ): people only want to return to the EU so they can get AMD cards cheaper. :cry:
 
I would say buy what you want at the price you are happy with.

Until these 7900 series prices were revealed, my only upgrade path was the 4080 16GB, and I was going to have to pay £1200-plus to get it.

Both of the 7900 cards are a lot faster than a 3090 Ti in raster and about on par for RT. As you go down the stack, it means those of us on a 3080 or 6800 have an upgrade path at the same inflation-adjusted price.

Nvidia’s plans for a £1200+ 4070 (aka the 4080 16GB) have just been sunk.
Which is great for you/others, as it gives you an option at the price point you are happy to pay for the performance you get (obviously we would all like to pay less). Hopefully Nvidia reacts, and then everyone is a winner.

I will give the 7900XTX a try at some point I think as I did like the 6800XT when I had one.
 
It's going to be interesting to see whether AMD increases prices on its lower product stack and how much of a performance increase we get, because I was pretty confident Nvidia would be increasing the price of its lower stack.
 
I actually look at this like I looked at AMD's first stab at chiplet designs on the CPU side: it might not have been perfect to begin with, but it proved successful after a few generations. I can see the same happening on the GPU side with time.

These new cards might not have the outright performance of the 4090 but the price is right and it gives me hope for future generations of AMD cards.
 
The RX 7000 series is the testbed for GPU chiplets, which could explain why AMD chose not to try to compete with the 4090: they do not know how chiplets will perform in the mass market at the extreme end, so they chose caution.

Once properly benchmarked the 7900XTX will most likely be around 80% of the performance of a 4090 and the 7900XT will probably outperform the 4080, or match it, at a lower price.
 
That's fine. The reference card will most likely cost less, so you won't be missing much given how much extra the AIB cards will cost - they'll probably just run cooler.
 
People downplaying RT are straight-up Luddites. Ray tracing is what decades of GPU progress has been building towards. Ray tracing and HDR are transformative for picture quality and put the same game years ahead in visual quality and fidelity compared to the version without.

While HDR is "free" (the cost to enjoy it is a high-end display such as the AW3423DW), RT needs a lot of GPU power for now. In the end, we should be encouraging both vendors to be in an arms race to deliver 2x+ generational performance in RT while overtaking each other and swapping places. The more robust RT performance is across the board, the more inclined developers are to use it. You need the raw power to be there along with the toolset for game devs to invest their time.

I don’t care about logos. I care about end results. I don’t want to see RT stagnation because one of the two players has decided to phone it in and play the value card.
It doesn't matter how fast an RTX4090 is in RT if Nvidia rebrands the RTX4060 as a £900 RTX4080 12GB, or a £450 RTX3070 replacement as a £1300 RTX4080 16GB. That means the RTX4060 will be a rebranded RTX3050 replacement with another mediocre performance jump.

Nvidia is also stagnating RT by saying most of the sub-£1000 market should stick with last-generation RT performance, which is dire, as my RTX3060TI is still useless at it unless you cheat with DLSS2/FSR.

Basically I am having to degrade image quality just to make some improvements in other areas. So even if AMD "only" matches Ampere performance, the reality is that Nvidia is only interested in selling Ampere-level RT performance to most average gamers.

Things such as tessellation have existed since the ATI Radeon 8000 series in 2001, yet it took years to gain traction. It only really did so after a decade, and only during the GCN/Kepler/Maxwell era was it applied in any liberal way.

The most important market is the mainstream market and consoles, because they make up the bulk of sales. So all the new tech means nothing if the average person has a rubbish dGPU or a console. This is determined by price. We are entering a massive global recession - Nvidia is more worried about margins, and so is AMD to a lesser degree. Nvidia/AMD/Intel are all showing massive projected revenue losses - unless prices drop, most people will be sticking with what they have, i.e. last-generation RT performance.

PCMR hardware enthusiasts still don't get it after 20 years of wondering why games don't look as good as all the tech demos. The same goes for why games don't use 16 cores well: again, most games use four to six cores at best, because that is what the affordable CPUs have. They might only need eight cores one day because of consoles. This is why I was so critical of Zen4 pricing.

If you want more devs to target higher RT effects and more cores, prices need to drop to enable mass-market adoption, along with big performance jumps at similar prices. If companies like Nvidia want to sell the RTX4060 for £900 and the RTX4070 for £1300, a couple of PCMR hardware enthusiasts buying an RTX4090 or RX7900XTX won't change much. Even Cyberpunk 2077 and The Witcher 3 were downgraded because console and average PC hardware wasn't good enough. But the games can't be viable without those mass sales.

This is what happened until about 6-7 years ago, when everything seems to have gone down the drain for the mainstream dGPU buyer. This is why games are stagnating visually - devs are targeting maximum sales.

Outside the odd AMD/ATI/Nvidia/Intel tech-demo game which they throw money at to sell their new wares, most games in the last 20 years have been targeted towards maximum sales. That means the average hardware spec of a gaming PC and the consoles will determine the level of RT effects used in most games. The biggest-selling games in the world are MOBAs and MMORPGs. Almost all of them can run OK on a relatively slow PC and have cartoony graphics.

Increasingly Microsoft and Sony are buying up game devs and these companies have deep investments in consoles. So ultimately most popular games will just stick with hybrid RT effects for the immediate future.
 
According to coreteks, the 7900XTX gains an extra 3% performance from overclocking on current drivers.
I'm guessing current drivers can't see or control the independent clock speeds of the shaders and the front end, so you may find one or the other is holding things back. It could even be the MCDs holding things back, and I'd assume those can also be overclocked separately, although they didn't mention anything about this, so who knows.
 