It's getting posted; people can make their excuses for it all they like, it's not getting silenced.
But it's for real:
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
you'll have the 4000 series owners complaining mind
Right, it won't have DLSS 3.
Every time someone posts "BuT AmdEE No DLsS" I'm going to post this....
I mean, it helps to be realistic. I don't care about the branding, just what they price the Founders Editions at.
It's not going to be £400-£500 for a card that has better performance than the RTX3080 (as much as I wish it could be), even if RDNA3 is brilliant.
you'll have the 4000 series owners complaining mind
i would beg to differ.. have been looking up 4090 numbers, and i think rtx 50 will make 4k mainstream - i think a big shift is coming
People then wonder why GFX in games has stagnated and blame consoles.
Oh well, lol. This is why people should ignore upscaling features when buying cards now.
i would beg to differ.. have been looking up 4090 numbers, and i think rtx 50 will make 4k mainstream - i think a big shift is coming
also depending on how the mcm approach works out for amd, MAYBE prices will come under control
this is like stuff for future youtube documentaries. though i fully understand that prices have been spoiling the moment
Don't blame you mate. It would just be the most obvious conclusion to draw, seeing as nvidia have had a head start; they may well be ahead, but it could be by a negligible amount for all we know. Though AMD have surprised us before: their raster performance being on par with nvidia in the 6 series compared to what it was in the 5 series was a massive leap. Before launch people were saying that if they could equal the 2080Ti it would be a big jump, but they far exceeded that. Maybe they can do it again on RT performance, who knows?
Not really in the market for a top end card at the minute so i really couldn't give a flying **** one way or the other, might get a 3060 or something to aid in video rendering as my gaming interest has basically vanished.
I remember the press from dlss being quite bad tbh mate. Maybe we follow different ppl on YouTube.
People said DLSS 1 was "better than Native". That didn't stop when it was proven not to be, and of course they continued on pretending science fiction is real through every new iteration of it. Even some tech journalists got on board with that marketing, because they don't understand you cannot create something from nothing - the absence of something is not the basis for anything.
I think you seem a bit obsessed. The impression I got was that the RTX4080 12GB has a bit better performance in non-RT games and better RT performance than the RTX 3080. Considering it will now be marketed as the RTX 4070, that seems OK to me. The cooling is likely to be improved too.
Only issue is pricing, it needs to come down a lot.
But the problem again is the high end. If the existing tiers were kept it might happen, but the previous 104 series dGPUs had 60% to 70% of the transistors of the top 102 series dGPUs.
The AD104 only has 48% of the transistors of the AD102, under half the shaders and half the memory bandwidth:
Nvidia "unlaunch" 4060ti 12GB erm 4080 12GB....
i always felt that the 4080 12g could only have had a very limited impact, perhaps OEM channels where you'd have corporate/govt or other ignorant buyers buying on behalf of other people..OEMs also tend to obfuscate things, because they're selling the whole package i don't see how an individual...forums.overclockers.co.uk
It's very close to what you would expect from a 106 series dGPU. The AD103 is now where a 104 series dGPU would be, ie, 60% to 70% of the transistor count of the top chip.
Since it is being sold as an RTX4080 12GB (or maybe an RTX4070), it probably is half an RTX4090.
RTX3070 or RTX3070TI level performance if it is clocked very high? Even if it was an RTX3080, it would be around 20% to 25% faster than an RTX3070. Now think how much it was priced at: £900. Even if it was rebranded as an RTX4070 12GB, it would be over £500. The chip itself will be impressive, but Nvidia does not want to pass this onto the consumer. If it was sold as an RTX3060 replacement it would be a very big upgrade. But Nvidia doesn't want to do this!
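For anyone who wants to sanity-check those 48% and 60%-70% figures, here's a quick back-of-the-envelope sketch. It's only a rough aid: the transistor counts are quoted from public spec listings from memory, so treat them as approximate rather than authoritative.

```python
# Back-of-the-envelope check on the transistor-count ratios discussed above.
# Transistor counts are quoted from public spec listings, from memory -
# approximate, not official.
dies = {
    "TU102": 18.6e9, "TU104": 13.6e9,                    # Turing (RTX 20 series)
    "GA102": 28.3e9, "GA104": 17.4e9,                    # Ampere (RTX 30 series)
    "AD102": 76.3e9, "AD103": 45.9e9, "AD104": 35.8e9,   # Ada (RTX 40 series)
}

def ratio(small: str, big: str) -> str:
    """Transistor count of `small` as a percentage of `big`."""
    return f"{small} vs {big}: {dies[small] / dies[big]:.0%}"

print(ratio("TU104", "TU102"))  # ~73% - where a 104-class die used to sit
print(ratio("GA104", "GA102"))  # ~61%
print(ratio("AD103", "AD102"))  # ~60% - AD103 now sits where a 104 used to
print(ratio("AD104", "AD102"))  # ~47-48% - closer to a traditional 106-class gap
```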
The 4080 16gb is half the 4090, the 4080 12gb is less than half. For some reason the 4090 proportionally performs better than its core count would suggest. I'm not sure why this is - it's not the clock speed, the lower end cards have the same or faster clocks. I believe it's the amount of cache the 4090 has.
You are not wrong on the cache at all. The massive cache the 4090 has puts it far out ahead of the lower stack.
It could be that - the lower AMD RDNA2 cards have less cache relative to shader count than the higher end ones too. The problem though is not the GPUs themselves, it's how Nvidia is marketing them. The AD104 has only 48% of the transistors of an AD102 and, as you said, less than half the shaders, memory bandwidth, etc too. Even if it has higher clockspeeds, how was it worth £900?? How is it even worth £600? So we essentially get RTX3070TI to RTX3080 level performance with slightly more VRAM, better RT performance and similar rasterised performance (but it might have problems at higher resolutions).
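To put some rough numbers on the "half the 4090 / less than half" point, here's a small sketch. The shader counts, L2 cache sizes and memory bandwidths are quoted from the launch spec sheets from memory, so treat them as approximate.

```python
# Rough spec ratios behind the "4080 16GB is half a 4090, 4080 12GB is less
# than half" point. Figures quoted from memory of the launch specs -
# approximate, not official.
cards = {
    #                shaders, L2 cache (MB), memory bandwidth (GB/s)
    "RTX 4090":      (16384, 72, 1008),
    "RTX 4080 16GB": (9728,  64, 717),
    "RTX 4080 12GB": (7680,  48, 504),
}

base = cards["RTX 4090"]
for name, spec in cards.items():
    shaders, l2, bw = (s / b for s, b in zip(spec, base))
    print(f"{name:14s} shaders {shaders:.0%}, L2 {l2:.0%}, bandwidth {bw:.0%} of a 4090")
```

On those numbers the 12GB card lands at roughly half a 4090 on shaders and bandwidth, which is the comparison being made above.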
AMD will price their stuff as high as they can. After all, nVIDIA did have a better card than AMD in terms of price/performance (at MSRP at least) - the 3060ti.
Ryzen 5800X could have been sold at around $101 at a profit. Instead they've chosen to charge almost 4.5x as much.