RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
Do they need to be discrete chiplets for MCM, or could they employ a similar approach to Apple's M1 Ultra, effectively dies glued together with interconnects?

There are a variety of ways MCM-type approaches could be used for GPUs. As with CPUs, discrete chiplets on a traditional bus are easy to do and scale well for compute tasks, but that layout can't really overcome the limitations of SLI or Crossfire when it comes to gaming use. Something similar could be done with the cores having more complex interconnects and/or side ports for memory access, etc., but so far that only gives a fairly small performance gain, still has many of the limitations of SLI/CF for gaming, and increases complexity a lot. If the interconnects within the interposer are high enough performance, other possibilities open up: moving large areas like some types of cache out of the main package so as to fit more of everything else in it (which might be what AMD are doing with RDNA3); spreading the functionality of a GPU out over multiple packages, which might involve splitting out the processing and command parts of the GPU; or having dumb chiplets that are reprogrammable on the fly (a bit like Intel's Larrabee), where the workload can be divided out as needed, repurposing the resources of processing blocks as each workload requires.

Some general information on the nVidia work here: https://research.nvidia.com/sites/default/files/publications/ISCA_2017_MCMGPU.pdf
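As a rough illustration of why the interconnect is the crux of all these designs, here's a toy roofline-style model. All numbers, and the bytes-per-FLOP intensity, are purely hypothetical and don't describe any real GPU: the point is just that adding chiplets only helps until the inter-chiplet links can no longer feed them.

```python
def effective_tflops(chiplets, tflops_each, link_gbs, bytes_per_flop=0.05):
    """Toy roofline-style model of an MCM GPU (illustrative only).

    chiplets       -- number of compute chiplets
    tflops_each    -- peak TFLOP/s per chiplet
    link_gbs       -- total inter-chiplet link bandwidth in GB/s
    bytes_per_flop -- hypothetical data traffic the workload needs per FLOP
    """
    compute = chiplets * tflops_each  # TFLOP/s if the links never starve us
    # TFLOP/s the links can actually feed: (GB/s -> TB/s) / bytes-per-FLOP
    feedable = (link_gbs / 1000.0) / bytes_per_flop
    return min(compute, feedable)

# With a 2000 GB/s interposer, 4 chiplets saturate it; 8 gain nothing:
four = effective_tflops(4, 10, 2000)   # compute-bound
eight = effective_tflops(8, 10, 2000)  # bandwidth-bound, same result
```

Doubling chiplets past the bandwidth ceiling buys nothing, which is why the "high enough performance interposer" condition above does all the heavy lifting.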
 
It can't know its position either, which is where the tunnelling bit comes in: the particle can potentially find itself somewhere it isn't supposed to be, i.e. it has "tunnelled" out. That's how radioactive decay happens.
True, tunnelling can involve either position or energy levels.

I could be mistaken, but I think the damage aspect only occurs when it's energy levels, as that's when the particle actually moves across the barrier as opposed to teleporting or super-positioning. With the latter (position) changes you get a potentially flipped bit, but the logic gate itself remains undamaged.
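For a sense of scale, the standard rectangular-barrier WKB estimate shows why gate-oxide thickness matters so much here. This is a textbook approximation, not a model of any specific transistor; the 3 eV barrier and nanometre widths below are illustrative values only.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # one electronvolt in joules

def tunnel_probability(barrier_ev, energy_ev, width_m):
    """WKB estimate T ~ exp(-2*kappa*d) for a rectangular barrier.

    barrier_ev -- barrier height in eV (must exceed energy_ev)
    energy_ev  -- particle energy in eV
    width_m    -- barrier width in metres
    """
    kappa = math.sqrt(2 * M_E * (barrier_ev - energy_ev) * EV) / HBAR
    return math.exp(-2 * kappa * width_m)

# A 1 eV electron against a hypothetical 3 eV oxide barrier:
thin = tunnel_probability(3.0, 1.0, 1e-9)   # 1 nm oxide
thick = tunnel_probability(3.0, 1.0, 2e-9)  # 2 nm oxide
```

The probability falls off exponentially with width (doubling the barrier squares the already tiny probability), which is why shrinking oxides a fraction of a nanometre makes leakage and stray bit-flips dramatically more likely.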
 
Console refreshes are such a bad idea; all they do is make the leap from one generation to the next look like far less of an upgrade when there's one in the middle that's also an upgrade. So you're going from, say, a 6700 XT to a 7700 XT, when it could be going from a 6700 XT to an 8700 XT.

Dunno who came up with this idea first, whether it was MS or Sony; a console used to last 5+ years, and now they're shoehorning in an upgrade midway through.
 
Dunno who came up with this idea first, whether it was MS or Sony; a console used to last 5+ years, and now they're shoehorning in an upgrade midway through.
I'm not sure either, but it makes about as much sense as upgrading a GPU. GPUs tend to get upgraded more often than other parts and, somehow, upgrading a whole console can be cheaper than upgrading a GPU.

I appreciate that discrete GPUs are more powerful, but the theory is the same.
 
There is, IMHO, an interesting little problem coming down the line in the next few years, where displays, and therefore demand for high-performance GPUs, potentially fall off a cliff.

Once you can drive 4K at high refresh rates, it's really only VR displays that push the need for more. So whilst there will always be an enthusiast market for the fastest, the vast majority of people upgrade because they need to. Feasibly that is what we can buy into next gen: a top-end GPU that could potentially last you 4-6 years as that level of performance trickles down to the mainstream tiers.

So yeah, I think that in as little as two years from now, demand at the top end is going to drop significantly.
 
There is, IMHO, an interesting little problem coming down the line in the next few years, where displays, and therefore demand for high-performance GPUs, potentially fall off a cliff.

Once you can drive 4K at high refresh rates, it's really only VR displays that push the need for more. So whilst there will always be an enthusiast market for the fastest, the vast majority of people upgrade because they need to. Feasibly that is what we can buy into next gen: a top-end GPU that could potentially last you 4-6 years as that level of performance trickles down to the mainstream tiers.

So yeah, I think that in as little as two years from now, demand at the top end is going to drop significantly.

Interesting observation, and I think you're right. Sort of.

However, Nvidia has already found a way to keep the gravy train going: ray tracing. RT is the new tessellation. GPUs getting too powerful for that to matter? No problem, keep cranking it up, and up, and up...

Not selling more GPUs each year than the last is not an option.
 
Interesting observation, and I think you're right. Sort of.

However, Nvidia has already found a way to keep the gravy train going: ray tracing. RT is the new tessellation. GPUs getting too powerful for that to matter? No problem, keep cranking it up, and up, and up...

Not selling more GPUs each year than the last is not an option.

I'm still waiting for someone to do proper voxel acceleration. Imagine Teardown or Minecraft with much more fine-grained detail, instead of looking like the 1990s called back...
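The core of any voxel renderer, hardware-accelerated or not, is walking a ray through a grid cell by cell. A minimal sketch of the classic 3D DDA traversal (after Amanatides & Woo; the function name and parameters here are my own, purely illustrative):

```python
import math

def traverse_voxels(origin, direction, max_steps=64):
    """Yield integer (x, y, z) voxel coordinates along a ray through a
    unit grid, in visiting order (3D DDA, after Amanatides & Woo)."""
    x, y, z = (int(math.floor(c)) for c in origin)
    step, t_max, t_delta = [], [], []
    for o, d, v in zip(origin, direction, (x, y, z)):
        if d > 0:
            step.append(1)
            t_max.append((v + 1 - o) / d)   # t at next +axis boundary
            t_delta.append(1 / d)           # t to cross one whole cell
        elif d < 0:
            step.append(-1)
            t_max.append((v - o) / d)
            t_delta.append(-1 / d)
        else:  # ray parallel to this axis: never crosses its boundaries
            step.append(0)
            t_max.append(math.inf)
            t_delta.append(math.inf)
    pos = [x, y, z]
    for _ in range(max_steps):
        yield tuple(pos)
        axis = t_max.index(min(t_max))  # nearest boundary decides the step
        pos[axis] += step[axis]
        t_max[axis] += t_delta[axis]
```

A renderer would stop at the first non-empty voxel instead of running to `max_steps`; real engines layer an octree or brick map on top so empty space is skipped in large jumps rather than cell by cell.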
 
There is, IMHO, an interesting little problem coming down the line in the next few years, where displays, and therefore demand for high-performance GPUs, potentially fall off a cliff.

Once you can drive 4K at high refresh rates, it's really only VR displays that push the need for more. So whilst there will always be an enthusiast market for the fastest, the vast majority of people upgrade because they need to. Feasibly that is what we can buy into next gen: a top-end GPU that could potentially last you 4-6 years as that level of performance trickles down to the mainstream tiers.

So yeah, I think that in as little as two years from now, demand at the top end is going to drop significantly.

Doubtful. The goal for graphics has always been to make them virtually indistinguishable from real life, and we're still a long way off that. Games still look very much like games; you can take a screenshot and have a scene look pretty real, but only in certain situations, and usually when characters are in the picture you know immediately it's a game.
 
Dunno who came up with this idea first, whether it was MS or Sony; a console used to last 5+ years, and now they're shoehorning in an upgrade midway through.

At least in regards to the Xbox and PlayStation, there have nearly always been mid-life refreshes and upgrades, sometimes more than one. I think the only console that didn't get a refresh/upgrade of any kind was the original Xbox, but it only lasted 4 years anyway before being replaced by the Xbox 360.
 
At least in regards to the Xbox and PlayStation, there have nearly always been mid-life refreshes and upgrades, sometimes more than one. I think the only console that didn't get a refresh/upgrade of any kind was the original Xbox, but it only lasted 4 years anyway before being replaced by the Xbox 360.
Those refreshes were all aesthetic, such as a new 'slim' version, and didn't improve performance. The PS4 and Xbox One refreshes actually had a faster CPU and GPU; I don't recall a console getting upgrades like that before. Quite the opposite, in fact: the Mega Drive refresh was considered a downgrade due to the removal of the headphone jack.
 