Now's your time AMD

So whatever we call it, there isn't much room left before we run out of process shrinks to increase performance? Are we looking at the mid-2020s before new ideas will be required?
A lot of people are working on new ideas and have been for several years. New transistor designs, new materials, etc.

They have to try a lot of stuff, because not all of it ends up being viable to mass-produce (or even make products from). Like carbon nanotubes, which were being hyped a couple of years ago, and have great properties but can't actually be turned into processors.

But yeah, we're approaching the limits of what we can do with our current understanding of physics.
 
That's when I think it will get interesting. Necessity is the mother of invention, as they say. Current PC design does appear to be running out of road. I'm looking forward to a revolution rather than an evolution; corporate types hate that (too much risk), but I think they'll have it forced on them.
 
Current computing technology took a leap forward with the race to the Moon, as the need for smaller transistors resulted in silicon chips. Perhaps when we push to Mars there will be another breakthrough. I've heard this mentioned a few times elsewhere.
 
We aren't really even doing proper die shrinks at this point in time.

True, the whole die-shrink thing can be difficult to decipher, with everyone using different names to describe the same size of process. It's only to be expected that performance will get closer and closer. Features like RT or G-Sync may be the only way to differentiate in a few generations.
 
Yes, totally agree. AMD will likely come in £50-£80 cheaper when they release. But if Nvidia has successfully been selling at £1,300... well, the AMD cards will likely still be £1,250.

Honestly, we only have ourselves to blame. Preordering cards at over a grand with no performance figures just confirms to Nvidia that we are morons who will pay anything for anything.

It is a sad situation and the RTX cards today will be useless for RT gaming within the next couple of years.

So whatever we call it, there isn't much room left before we run out of process shrinks to increase performance? Are we looking at the mid-2020s before new ideas will be required?

The ideas are already in play, as they know they're reaching the limits of what's possible with current tech. Nvidia are doing it now with ray tracing: they're making it so we're unable to run games with ray tracing very well, so the next five or more years will be about us slowly getting back to being able to run 4K 60 with RT. That's if they get their way, and to be honest it's likely they will, as AMD and Intel will both face the same issue. If ray tracing catches on, I wouldn't be surprised if the shrinking stops at 7nm and stays there for a few years until the RT tech catches up.
 
But the thing is, we need something like 200x the power of the 2080 Ti to do proper real-time ray tracing.

If shrinks stop at 7nm, we're looking at either some beastly sized chips, or a separate RT card, or something entirely different.

Even if the "shrinks" don't stop, you're typically not looking at more than +50% each gen, so how we get from here to 200x today's perf is unknown.

Edit: in fact, 200x today's perf is such a pipe dream that it's likely not possible at all (rough numbers below).

Which would mean real time (proper) ray-tracing is not possible at all.
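
For a rough sense of scale, here's a back-of-envelope sketch in Python (assuming a steady +50% uplift per generation and a two-year cadence, both of which are generous, and treating the 200x figure above as the target):

import math

target_uplift = 200       # rough factor quoted above for "proper" real-time RT
uplift_per_gen = 1.5      # assumed +50% per generation (optimistic)
years_per_gen = 2         # assumed two-year product cadence

gens_needed = math.log(target_uplift) / math.log(uplift_per_gen)
print(f"Generations needed: {gens_needed:.1f}")                      # ~13.1
print(f"Years at that cadence: {gens_needed * years_per_gen:.0f}")   # ~26

Even on those generous assumptions, that's roughly 13 generations, or about a quarter of a century, from raw per-generation uplift alone.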
 
Well, it's savings money really, but as AMD have been so good to me, I'm going to hold out until AMD's next GPUs come out then treat myself :)
 
Which would mean real time (proper) ray-tracing is not possible at all.

It wouldn't be a surprise if it turns out it isn't possible. As for Nvidia & co., they'll be more than happy to do what Nvidia has done with Turing and put the focus on ray tracing performance whether it's possible to get there or not. Like I said, they need a new reason to make us believe we need new cards on a regular basis, because they're getting towards the limit of what can be done with what they have. That's why ray tracing's suddenly being called the Holy Grail of visuals, and I've got to admit I never knew how badly I needed it until they showed me the way, and for that I'm grateful. :rolleyes:
 
or a separate RT card, or something entirely different.

Though add-in cards don't generally take off too well, a separate RT card isn't outside the realm of possibility, and ray tracing is much more favourable towards multi-core scaling than other areas of rendering. I'd imagine it would be potentially feasible, though probably somewhat unrealistic, to have systems with, say, 3x add-in RT cards with 4 RT cores each on a refined 7nm. (Obviously the high-end systems for professional work will probably end up something like that; they already are that kind of style.)
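
To illustrate why ray tracing suits that kind of multi-card scaling, here's a minimal sketch (the three-card split and frame size are purely illustrative, not based on any real product): each pixel's rays can be traced independently, so a frame can be carved into bands and handed to separate devices with very little communication between them.

# Toy sketch: per-pixel ray work is independent, so a frame can be split
# into row bands and handed to separate (hypothetical) add-in RT cards.
NUM_CARDS = 3        # illustrative card count only
HEIGHT = 2160        # rows in a 4K frame

def trace_rows(card_id, row_start, row_end):
    # Stand-in for the real ray tracing work; each card touches only its own rows.
    return [(card_id, y) for y in range(row_start, row_end)]

band = HEIGHT // NUM_CARDS
results = []
for card in range(NUM_CARDS):
    start = card * band
    end = HEIGHT if card == NUM_CARDS - 1 else start + band
    results.extend(trace_rows(card, start, end))

print(len(results))  # 2160 rows covered, nothing shared between the "cards"

In practice each card would still need its own copy of the scene data, so the real-world picture is messier, but the per-ray independence is what makes the idea plausible at all.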
 