
NVIDIA ‘Ampere’ 8nm Graphics Cards

With Ampere, they may do what they're doing now: have two sets of cards out, ones that have hardware RT and ones that don't, and the ones that don't could get RT via extra AICs (add-in cards).
 
The comparisons to PhysX are a bit daft; it's almost like people forgot they did the complete opposite with it. They bought a company with a discrete PPU and then made their own GPUs do it instead, making the PPU redundant...

PhysX never got integrated into DirectX and never gained general traction, so the extent to which it was/is used could be done on the GPU. If it had come down to really complex stuff with industry-wide implementation, then a PPU would have hit the market again.

Speaking of which, last time nVIDIA blocked AMD users from using PhysX even with a dedicated nVIDIA card in the system. If they do end up selling a dedicated RT processor separately, it would be interesting to see whether they block AMD once more. Plus, a lot of gamers could just buy that one if their current card offers enough raster performance. :)
 
I was not talking about his comment section; I was actually talking about the attitude in these forums. There is always somebody with the 'alreet guys' or 'I can't listen to his voice' attitude.

Fair enough, I don't get that as I don't mind a Scottish accent. It's probably just an easy way to have a dig at him. I'll give all the YouTubers credit, they put far more effort into producing content than I could. If I don't like their content I just look elsewhere :)
 
Speaking of which, last time nVIDIA blocked AMD users from using PhysX even with a dedicated nVIDIA card in the system. If they do end up selling a dedicated RT processor separately, it would be interesting to see whether they block AMD once more. Plus, a lot of gamers could just buy that one if their current card offers enough raster performance.

That would be an interesting one if they sold a stand-alone RT processor; making it nVidia GPU exclusive could be a tricky move.
 
Just can't see Nvidia trying to do a stand-alone RT card.

It's going to be much better to just have multiple dies on the same card, interlinked with a high-speed connection.
 
Just can't see Nvidia trying to do a stand-alone RT card.

It's going to be much better to just have multiple dies on the same card, interlinked with a high-speed connection.

Can't really see it unless they are going to bet the house on pushing games with a feature-complete path-tracing implementation as the primary and/or only render path, and even then you have things like laptop users, etc., who wouldn't be able to utilise a stand-alone option.
 
Adored is probably the best commentator around. He gets what he sees as a leak and gives commentary around it. Most of the time he tells you there is plenty of salt involved to cover what he is saying. The actual amount of hate he gets for having a Scottish accent is infuriating, as I thought we were past that crap, but obviously not. I guess the next English guy I see in Scotland is getting a beating just because he sounds like he does. Some of you guys should be ashamed. Areet guys, with a slap.

His Scottish accent is really bad; most people with a Scottish accent I actually really like, as it makes them sound more intelligent. :)
 
His voice always reminds me of Spud, particularly from the first Trainspotting film, as some of his "leaks" are a bit like when Spud greeted in the kitchen that morning.
 
Using chips on the back of the board is nothing new: look at Apple with their RAM chips on the back of the motherboard. The problem with this traversal chip idea is that Ampere is already out in the wild without it. I wonder if Nvidia have simply done an Apple?


The GA100 is out in the wild, and they never mentioned ray tracing once; it doesn't appear in any schematic. While no one would buy a GA100 for ray tracing, so there is no need to market it, there is also a good chance that it fundamentally doesn't have any RT hardware. Ampere has gone even further in diverging consumer gaming and graphics from the HPC and AI markets. No gaming part will have anywhere near the number or complexity of Tensor units, for example.


Therefore we basically know nothing about consumer Ampere ray-tracing hardware. However, Turing dedicated a measly 5% of the die area to RT cores, and the 7nm+ process doubles transistor density. Given nothing but basic scaling of transistor counts, and allowing a healthier portion of the die to be RT dedicated, we can easily imagine 5x the RT performance. Importantly, ray tracing is far more scalable than even the rasterization workloads, so achieving such a scaling factor with brute force is much easier.
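
To make that back-of-the-envelope arithmetic explicit, here is a minimal sketch in Python. Only the 5% Turing die share and the 2x density gain are from the post above; the 12.5% Ampere die share is a purely hypothetical number chosen to show how a "healthier portion of the die" gets you to 5x, assuming RT performance simply tracks the transistor budget:

# Back-of-envelope RT scaling estimate. The 5% die share and 2x density
# gain are the figures quoted above; the 12.5% share is a hypothetical
# assumption, and real performance won't scale purely with transistors.

turing_rt_die_share = 0.05    # ~5% of Turing's die area spent on RT cores
density_gain = 2.0            # 7nm-class node roughly doubles transistor density
ampere_rt_die_share = 0.125   # hypothetical: give RT a healthier 12.5% share

# If RT throughput tracks the transistors thrown at it, the speedup is just
# (new die share / old die share) * density gain.
rt_speedup = (ampere_rt_die_share / turing_rt_die_share) * density_gain
print(f"Estimated RT speedup: {rt_speedup:.1f}x")  # -> 5.0x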
 
I'm just happy that the gamble paid off. Nvidia's push for RT was enough to kickstart the whole market.

At the PS5 event, the majority of games shown off were using ray tracing, so mass adoption has been successful. By the end of this year most new AAA games will support ray tracing out of the box.
 
I'm just happy that the gamble paid off. Nvidia's push for RT was enough to kickstart the whole market.

At the PS5 event, the majority of games shown off were using ray tracing, so mass adoption has been successful. By the end of this year most new AAA games will support ray tracing out of the box.
I don't know about that. The consoles will have been in development well before two years ago. Nvidia likely knew what was coming and wanted to get ahead, hence why their first implementation was so ****.
 
People can only go on their experiences at the end of the day. You have had negative experiences with H2O, whereas I have had nothing but positive ones. I've been watercooling since 2007, all positive experiences.

I'm of course not suggesting that H2O caused the issue with my 2080 Tis; I just thought it was funny that someone was clutching at straws rather than admit that watercooling doesn't make a big difference to overclocking, and instead suggested it would give a benefit to longevity.
 
I don't know about that. The consoles will have been in development well before two years ago. Nvidia likely knew what was coming and wanted to get ahead, hence why their first implementation was so ****.

That doesn't necessarily mean anything. Consoles have changed development plans mid-stream depending on the hardware and technological advances that come along. The Xbox One release is an example: development kits were sent out using VLIW4 (TeraScale) hardware, but the final Xbox One used GCN hardware.

Remember, the first ray-tracing demos from Microsoft and Nvidia were in March 2018, and DXR was rumoured to be in the hands of developers sometime in early 2017. So Nvidia would have had to have been working on ray tracing before that.

AMD filed a patent for ray tracing in December 2017. They submitted another patent in June 2019, which seems to be just an update on their previous submission.

But all of Nvidia's patents that deal with ray tracing, in their current form, were filed between 2014 and 2016.

So, draw your own conclusions.


Maybe not though.
 