The thread which sometimes talks about RDNA2

AMD probably has a fat bonus for hitting a specific number of consoles pre-Xmas, from both Sony and MS. We all know their console margins are generally low, though probably a little better this gen compared to last.

Once they meet initial demand, I fully expect them to swing production back towards CPUs and GPUs, as there is a lot of money there for them.

I can imagine console demand drops off massively both post-launch and post-holiday season, so there will be no need to keep allocations to that market so high, and ramping up for CPUs and GPUs can begin.

Could see a surge of CPU and GPU availability shortly after Xmas, and definitely from February onwards, I reckon.

How long does it take, once the wafer is made, to turn it into boxed products?

Had a look at the TSMC website and a quick Google, and it appears it currently takes between 11 and 13 weeks on average to produce a chip. The below is direct from the TSMC site:

TSMC's 7nm Fin Field-Effect Transistor (FinFET) (N7) process technology sets the industry pace for 7nm process technology development by delivering 256Mb SRAM with double-digit yields in June 2016. In 2019, in the N7 process node's second year of volume production, customers taped out more than 110 new generation products on N7. In addition, 7nm FinFET plus (N7+) technology entered full-scale production in 2019 and delivered customer 7nm products to market in high volume. N7+ technology is the first commercially available extreme ultraviolet (EUV) enabled foundry manufacturing process technology in the world. Its success is a testament to TSMC's world-leading capabilities in EUV volume production and paves a solid foundation for N6 and more advanced technologies.

Meanwhile, TSMC's 6nm FinFET (N6) technology successfully completed product yield verification in 2019. Thanks to the reduction in mask layers and process complexity achieved through EUV lithography technology, N6 technology can achieve better yield and shorter production cycles compared to N7 technology in the manufacture of the same products. In addition, the N6 manufacturing process delivers 18% higher logic density over the N7 process. At the same time, its design rules are fully compatible with TSMC's proven N7 technology, allowing its comprehensive design ecosystem to be reused. As a result, it offers a seamless migration path with a fast design cycle and very limited engineering resources, allowing customers not only to achieve the product benefits of the new technology offering but also to significantly reduce their product design cycle time and time-to-market.

Risk production of N6 technology started in the first quarter of 2020, with volume production planned before the end of 2020. TSMC's N6 technology provides customers with additional cost-effective benefits while extending the industry-leading power and performance of the 7nm family for a broad array of applications, ranging from high- to mid-end mobile, consumer applications, AI, networking, 5G infrastructure, GPU, and high-performance computing.

Compared to its 10nm FinFET process, TSMC's 7nm FinFET features 1.6X logic density, ~20% speed improvement, and ~40% power reduction. TSMC set another industry record by launching two separate 7nm FinFET tracks: one optimized for mobile applications, the other for high performance computing applications.
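As a rough sanity check on what a console ramp costs in wafers, here's a back-of-envelope dies-per-wafer estimate. The die size, defect density, and the gross-die approximation are all assumptions for illustration, not TSMC figures:

```python
import math

# Back-of-envelope dies-per-wafer estimate. All inputs are illustrative
# assumptions, not TSMC figures.
WAFER_DIAMETER_MM = 300.0   # standard 300 mm wafer
DIE_AREA_MM2 = 360.0        # assumed size of a console SoC
DEFECT_DENSITY = 0.09       # assumed defects per cm^2 on a mature node

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
# Common gross-die approximation: area ratio minus an edge-loss term.
gross_dies = (wafer_area / DIE_AREA_MM2
              - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * DIE_AREA_MM2))
# Simple Poisson yield model: Y = exp(-D0 * A), with A in cm^2.
yield_fraction = math.exp(-DEFECT_DENSITY * DIE_AREA_MM2 / 100)
good_dies = gross_dies * yield_fraction

print(f"~{gross_dies:.0f} gross dies, ~{good_dies:.0f} good dies per wafer")
# ~161 gross and ~117 good dies per wafer: a million consoles is roughly
# 8,600 wafers, all queued behind that 11-13 week cycle time.
```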
 
Glorious! :p

The 3070 is weaker than the 2080 Ti for some reason (opinion). It's nothing to do with VRAM. Not in every game; in Legion it should be ahead by a few FPS. You can run the HD texture pack on the 3070 without issue. You can see that the 2080 Ti runs way ahead of the 3070 at 4K with RT and DLSS. The 6800 gets about 31 fps with RT on at 4K, vs 43.2 fps on the 3070 and 48.6 fps on the 2080 Ti. The 3080 and 3090 go above 60 fps. https://www.legitreviews.com/amd-radeon-rx-6800-xt-and-radeon-rx-6800-review_223774/3 https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/11/main-one-legion-1-880x544.png

https://cdn.mos.cms.futurecdn.net/DUGmphh9kBM2mjZtWeChJF-970-80.png

**Do Not Hotlink Images**
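To put the fps figures quoted above in relative terms, here's the arithmetic on them (just the quoted numbers, not new benchmarks):

```python
# Relative performance from the Legion 4K + RT fps figures quoted above.
results = {"RX 6800": 31.0, "RTX 3070": 43.2, "RTX 2080 Ti": 48.6}

baseline = results["RX 6800"]
for card, fps in results.items():
    print(f"{card}: {fps} fps ({fps / baseline - 1:+.0%} vs RX 6800)")
# The RTX 3070 lands ~39% ahead of the RX 6800 and the 2080 Ti ~57% ahead
# on these numbers, with the 2080 Ti ~13% ahead of the 3070.
```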
 
That's valid speculation. AMD is supplying:
-XSX
-PS5
-Zen 3
-RDNA 2
(more than likely in that order)

All on one node from TSMC. If true, I would give AMD a pass on availability and just wait it out. It's not like AMD is just in the GPU market selling dies to miners.
:D


To correct you, it's PS5 up at the top, as confirmed by the TSMC news today where the supply was broken down by product: TSMC is producing DOUBLE the number of PS5 SoCs compared to Xbox SoCs. Even though MS and Sony won't directly publish transparent numbers, we know from TSMC now that Sony is building twice the number of next-gen consoles compared to Microsoft.
 
If you want to see a beast, look here: 6800 XT 8K gaming. Half the price of the 3090, and it does pretty well.


The 30 series is better placed to do 8K, with double the cores, much higher memory bandwidth and DLSS. 8K games really need that, and the 6900 XT and 6800 XT are not suited. The 6000 series has half the number of cores and much lower memory bandwidth. With the weak RT of the 6900 XT, only the 3090 can do 8K and RT, with DLSS of course. The video cherry-picks its games for the 8K resolution, which means it's more or less a waste of time. You can do that with any card, but the 3090 is the best placed to run 8K @ 60fps.
 
The 30 series is better placed to do 8K, with double the cores, much higher memory bandwidth and DLSS. 8K games really need that, and the 6900 XT and 6800 XT are not suited. The 6000 series has half the number of cores and much lower memory bandwidth. With the weak RT of the 6900 XT, only the 3090 can do 8K and RT, with DLSS of course. The video cherry-picks its games for the 8K resolution, which means it's more or less a waste of time. You can do that with any card, but the 3090 is the best placed to run 8K @ 60fps.
It seems like you do the same shilling on other websites. I hope you get paid for this.
 
@zx128k Who are you trying to convince and what are you trying to prove? All these arguments are so pointless when you can't get any card. How much better value one is against another depends on the region you live in. I can't get an RTX 3000 card cheap where I come from, for example, so I bought AMD. I think the 6800 XT at 720 euros, or 640 pounds to make it easier for you, is better than the 850 I would pay for a 3070, or the 1100 I would pay for a 3080, and they are not even in stock, none of them, now. And most will not see them for months. So why does it matter what GPU can do (barely) 8K? Nobody gives a damn about that. People play at 1080p and 1440p; that is where the big market is, not false dreams like 8K. Not even 4K is a major market.
 
It seems like you do the same shilling on other websites. I hope you get paid for this.
The 6800 XT can't do 8K; there is not enough memory bandwidth, and there are not enough shaders. Pure hardware issue. Modern GPUs don't have the bandwidth; that's why DLSS exists. I mean, the sheer ignorance here is appalling.
 
@zx128k Who are you trying to convince and what are you trying to prove? All these arguments are so pointless when you can't get any card. How much better value one is against another depends on the region you live in. I can't get an RTX 3000 card cheap where I come from, for example, so I bought AMD. I think the 6800 XT at 720 euros, or 640 pounds to make it easier for you, is better than the 850 I would pay for a 3070, or the 1100 I would pay for a 3080, and they are not even in stock, none of them, now. And most will not see them for months. So why does it matter what GPU can do (barely) 8K? Nobody gives a damn about that. People play at 1080p and 1440p; that is where the big market is, not false dreams like 8K. Not even 4K is a major market.

All I am doing is nailing people here every time they mislead, and posting a link to the real facts. Posting an informed opinion is not against the rules of any forum. Brigading a thread with pro-AMD nonsense so that other normal people can't post? Well, work it out.

You are lucky if you got either the 3080 or the 6800 XT. The 3090 can do 8K with DLSS ultra performance mode. No card can do 8K @ 60fps in many modern games with native rendering. You can do 8K with older games. The 6000 series has poor memory bandwidth for 8K gaming and too few cores. Today you are happy if you can do 4K @ 60fps in an RT title.

If your system is not Ryzen 5000 based, then it's likely the 6800 XT will lag the 3080 at all resolutions and be slower in RT games: https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/35.html https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/30.html. On a Ryzen 5000 based system the benchmarks are closer, but the 3080 wins at 4K. At 4K the 6800 XT's lower core count will put the 3080 ahead. https://www.guru3d.com/news_story/r...t_to_be_7_4_slower_than_geforce_rtx_3080.html The 3080 has more memory bandwidth, which helps at higher resolutions and with RT denoising.
 
The 6800 XT can't do 8K; there is not enough memory bandwidth, and there are not enough shaders. Pure hardware issue. Modern GPUs don't have the bandwidth; that's why DLSS exists. I mean, the sheer ignorance here is appalling.
Accuses others of cherry-picking. Calls other people ignorant. Talks about nailing people who are misleading.

Believes that you can compare shader counts across different architectures.

[Leonardo DiCaprio laughing meme from Django Unchained]
 
Accuses others of cherry-picking. Calls other people ignorant. Talks about nailing people who are misleading.

Believes that you can compare shader counts across different architectures.

Honestly, refer back to my previous post on behaviour like this. IQ @ 4K seems a little fuzzier on team green: it's not an issue. But reflections in an RT puddle being poor on AMD?

GET THE MICROSCOPES OUT BOYS! THIS ****'S TERRIBAD
 
Accuses others of cherry-picking. Calls other people ignorant. Talks about nailing people who are misleading.

Believes that you can compare shader counts across different architectures.

[Leonardo DiCaprio laughing meme from Django Unchained]

You need more shaders at 8K; no card has the shader count or memory bandwidth required. Like I stated, you're ignorant. The 3090 can run native in cherry-picked games at 60fps (https://youtu.be/3IiFbahWaqk), but a modern AAA game requires DLSS 2.1 with ultra performance mode, upscaling 1440p to 8K (Death Stranding: https://youtu.be/iBsnb7SZPQI, note it stays above 60fps). Control gets 9fps at 8K, but with DLSS 2.1 you can get 50fps by rendering natively at 1440p and upscaling. That way you can get a playable 60fps. The 6800 XT is far worse off at 8K because the card lacks the performance. At 8K you are hammering the shaders and the memory bandwidth, both of which the 6800 XT has less of. 8K gaming is upscaled 1440p-to-4K gaming.

Then there is the VRAM: 8K can hit 20GB in games like Control with DLSS enabled.
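The pixel-count arithmetic behind that, for anyone who wants it spelled out (standard resolution figures; the 1/3-per-axis scale is how DLSS ultra performance mode works, as I understand it):

```python
# Pixel-count arithmetic behind "8K gaming is upscaled 1440p gaming".
# DLSS ultra performance mode renders at 1/3 of the target resolution
# per axis, i.e. 1/9 of the pixels.
TARGET_W, TARGET_H = 7680, 4320              # 8K
render_w, render_h = TARGET_W // 3, TARGET_H // 3

native_mpix = TARGET_W * TARGET_H / 1e6
render_mpix = render_w * render_h / 1e6
print(f"Native 8K: {native_mpix:.1f} MPix per frame")
print(f"DLSS ultra performance input: {render_w}x{render_h} "
      f"({render_mpix:.1f} MPix)")
# 33.2 MPix shaded natively vs 3.7 MPix shaded then upscaled: a 9x cut
# in shading work, which is how Control goes from 9fps to ~50fps.
```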
 
You need more shaders at 8K; no card has the shader count or memory bandwidth required. Like I stated, you're ignorant. The 3090 can run native in cherry-picked games, but an AAA game requires DLSS 2.1 with ultra performance mode, upscaling 4K to 8K. That way you can get a playable 60fps. The 6800 XT is far worse off at 8K because the card lacks the performance. At 8K you are hammering the shaders and the memory bandwidth, both of which the 6800 XT has less of. 8K gaming is upscaled 4K gaming.

"Shaders" are not created equal, you cant compare them 1:1

You are correct in saying that no current GPU can do 8K (even the 3090 cant really do it), but saying that its because "Big Navi has less bandwidth / less shaders" is irrelevant because, at 8K NVidia cards also have the same issues.

HOWEVER .. DLSS is not a silver bullet as :

* It doesn't work on every game
* The performance gains are not consistent across all games that do support it
* IQ issues in some games (mainly blur in motion)
* Vendor locked

Its a very cool technology that I think will end up allowing us to run at these silly high resolutions at OK FPS at some point, but for now being able to render at the native resolution is still the only real option (And for 8K neither side can achieve that today, we have only JUST gotten true 4K playable cards)
 
"Shaders" are not created equal, you cant compare them 1:1

You are correct in saying that no current GPU can do 8K (even the 3090 cant really do it), but saying that its because "Big Navi has less bandwidth / less shaders" is irrelevant because, at 8K NVidia cards also have the same issues.

HOWEVER .. DLSS is not a silver bullet as :

* It doesn't work on every game
* The performance gains are not consistent across all games that do support it
* IQ issues in some games (mainly blur in motion)
* Vendor locked

Its a very cool technology that I think will end up allowing us to run at these silly high resolutions at OK FPS at some point, but for now being able to render at the native resolution is still the only real option (And for 8K neither side can achieve that today, we have only JUST gotten true 4K playable cards)

You need a vast number of cores/CUs at 8K; it's all the processing that is required. Then there is the massive amount of memory bandwidth required. Modern cards can't do 8K at 60fps or above in modern AAA games; the hardware does not exist, Nvidia or AMD. That's why DLSS exists. AMD does not have a DLSS equivalent, so their performance is subpar at 8K. No upscaler can compete with DLSS for quality. This is why MS wants super resolution for Xbox.
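A rough sketch of the bandwidth side of that argument. The spec-sheet bandwidth figures are from memory, and the calculation ignores on-die caches (AMD's Infinity Cache exists precisely to stretch this budget), so treat it as illustrative:

```python
# Rough bytes-per-pixel-per-frame budget at 8K60. Spec-sheet bandwidth
# figures from memory (treat as assumptions); ignores on-die caches such
# as AMD's Infinity Cache, which exists to stretch exactly this budget.
PIXELS_8K = 7680 * 4320
FPS = 60

bandwidth_gb_s = {"RX 6800 XT": 512, "RTX 3080": 760, "RTX 3090": 936}

for card, gb_s in bandwidth_gb_s.items():
    bytes_per_pixel = gb_s * 1e9 / (PIXELS_8K * FPS)
    print(f"{card}: ~{bytes_per_pixel:.0f} bytes per pixel per frame")
# ~257 vs ~470 bytes per pixel per frame (6800 XT vs 3090): every
# G-buffer pass, texture fetch and RT denoise has to fit in that budget.
```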
 