
NVIDIA ‘Ampere’ 8nm Graphics Cards

HDDs I only use for storage. Recently picked up two 12TB externals. For the OS I use my Optane, and I have a 1TB NVMe waiting in reserve if and when needed :D

Soon your setup will need to be upgraded to something like mine when RTX IO comes out and Cyberpunk 2077 gets updated ;)


Why?
 
There are some rumours flying that not all NVMe drives will support it.
It may end up being the case that older ones won't work, or that you need PCIe 4.0 drives as a minimum, which would mean even I have to upgrade. But I think what I have will be fine.
 
Stolen from Guru3D

No benchmarks have leaked; how is that possible? https://www.guru3d.com/articles-pages/geforce-rtx-3080-and-3090-what-we-know,1.html
I'll let you in on that secret. The AIB partners have all been prepping their cards for months now; they have had the products and engineering boards for a while. NVIDIA, however, has not released a driver that works with anything other than the test software they supply. So get this: I am writing this article on September 1st, hours before the presentation, and still the board partners have no idea what the performance is going to be like. We can expand on that, as the board partners do not even know the thermal behaviour of their products. NVIDIA has provided them with test software that works with the driver. Basically, these are DOS-like applications that run stress tests. No output is given other than PASS or FAIL. We know the names of these test applications: the NVfulcrum test and the NVUberstress test. For thermals, there is another unnamed stress test, but here again the board partners can only see PASS or FAIL. Well, we assume they have tested with thermal probes. What is the point of this paragraph? Well, to show you the secrecy that NVIDIA applied to this Ampere project.

From what I saw in the reveal, Ampere with DLSS 2 renders everything @ 1080p and then, through its new reconstruction/interpolation, fills in the gaps up to 4K, AND Nvidia claims the result has more detail than native 4K.

I'm waiting for this to be unpicked by reviews, as ALL we have seen is Nvidia's marketing and literally nothing else; there have been no leaks because Nvidia hasn't given board partners a driver that works with real-world tests.

But if those pro gamers who played 8K @ 60fps were impressed, then it's still all very exciting.
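To put that upscaling claim in perspective, here's the raw pixel arithmetic (standard resolution figures, nothing from Nvidia's materials). Reconstructing 4K from a 1080p render means filling in three of every four output pixels:

```python
# Pixel counts for the two resolutions being discussed
full_hd = 1920 * 1080   # 2,073,600 pixels (the internal render)
uhd_4k = 3840 * 2160    # 8,294,400 pixels (the output target)

ratio = uhd_4k // full_hd
print(f"4K has {ratio}x the pixels of 1080p")
print(f"Share of output pixels DLSS must reconstruct: {1 - full_hd / uhd_4k:.0%}")
```

So the "more detail than native" claim rests on the network inventing 75% of the frame convincingly, which is exactly why independent image-quality reviews matter.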
 
Stolen from Guru3D

No benchmarks have leaked; how is that possible? https://www.guru3d.com/articles-pages/geforce-rtx-3080-and-3090-what-we-know,1.html

Read this too. I think it explains the ghetto fan on the KFA2 card: a sensor somewhere tripped and they didn't have time to fully solve it, so they zip-tied a fan to the back of the card (I've done that before, worked well!)
 
Does any reviewer do reviews based on power consumption? Could a well-clocked 2080 Ti be given as much juice as a 3080, for instance, to see how close it can come? Or restrict the 3080 down to 2080 Ti levels?

This is a good point. Putting the two cards at the same power level will almost certainly eat into the 3080's performance lead.

It also makes overclocking headroom more of a question.
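A power-matched comparison like the one suggested boils down to performance per watt. A minimal sketch of how you'd summarise it; the FPS and wattage figures below are placeholders for illustration, not measurements from any review:

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

# Placeholder numbers purely for illustration -- not real benchmark data.
cards = {
    "2080 Ti @ 250 W": perf_per_watt(100, 250),
    "3080 @ 320 W": perf_per_watt(130, 320),
    "3080 capped to 250 W (hypothetical)": perf_per_watt(115, 250),
}
for name, eff in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {eff:.3f} fps/W")
```

The interesting number for reviewers is whether the efficiency gap survives when both cards are forced to the same power budget.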
 
I'm waiting for this to be unpicked by reviews, as ALL we have seen is Nvidia's marketing and literally nothing else; there have been no leaks because Nvidia hasn't given board partners a driver that works with real-world tests.
Digital Foundry's early look wasn't marketing. Sure, it had some Nvidia-imposed restrictions, but they ran real games without oversight.
 
You need NVMe drives for it to work, and I reckon PCIe 4.0 will likely work better with it too. It may even be essential.

I wonder if the superior IO performance of Optane will help.


I can put Optane in my NAS, but I can't see it being better than 4x 1TB SSDs for cache.
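If PCIe generation does end up mattering for RTX IO, on Linux you can check what link an NVMe drive actually negotiated via sysfs. A sketch assuming the standard `current_link_speed` attribute; the exact device path varies per machine:

```python
from pathlib import Path

def link_speed(sysfs_dev: str) -> str:
    """Return the negotiated PCIe link speed for a device, read from sysfs.

    Reports e.g. "8.0 GT/s PCIe" for gen 3 or "16.0 GT/s PCIe" for gen 4.
    """
    return Path(sysfs_dev, "current_link_speed").read_text().strip()

# Typical usage (device path is machine-specific):
# print(link_speed("/sys/class/nvme/nvme0/device"))
```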
 
I am not 100% sure, but I think for it to work it will need to go directly into your mobo's NVMe slot.


As said, I have an Optane in my laptop.
It only helps as a cache, so I don't see the point of it over an NVMe.
 
Digital Foundry's early look wasn't marketing. Sure, it had some Nvidia-imposed restrictions, but they ran real games without oversight.

TBF it was marketing, as they controlled the test conditions, test sequences and games, and didn't allow the wider tech press to have a look either. Best to wait for the reviews to drop to see tests on games outside the recommended test suite.
 
As said, I have an Optane in my laptop.
It only helps as a cache, so I don't see the point of it over an NVMe.
I suggest you go read about RTX IO and Microsoft DirectStorage.

Not sure what your laptop has got to do with anything, for starters.
 
I'm waiting purely because it's obvious Nvidia plans to slot in Super/Ti variants after AMD launch their cards.

Will have many more options by then, and let's face it, I may as well hang on to my 2080 Ti as long as possible at this point. Got bit with the 2080 Ti so don't wanna get bit with the quick refresh cards too xD

Fancy either the 3070 Ti 16GB or 3080 Ti 20GB with AIO.
 
I may as well hang on to my 2080 Ti as long as possible at this point. Got bit with the 2080 Ti so don't wanna get bit with the quick refresh cards too

I think most of the 2080ti and 1080ti price drop has happened now so it’s worth waiting to see what happens with AMD, consoles and Nvidia refresh. Prices won’t drop too much more for the next 6 months.
 
Speculating of course, but from what I have gathered the new RTX IO feature may not require a PCIe 4.0 NVMe. It will accelerate whatever it can, up to 2x speed.

Obviously pairing it with a PCIe 4.0 NVMe will be faster, but a PCIe 3.0 NVMe will still be accelerated.

Anyway... if needs be, just buy a new NVMe in 2021, but hopefully that won't be necessary.
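That speculation works out to rough bandwidth arithmetic. The link figures below are nominal x4 NVMe numbers, and the 2x factor is the rumoured GPU-decompression gain, not a confirmed spec:

```python
# Approximate usable bandwidth of an x4 NVMe link, in GB/s
PCIE3_X4 = 3.5
PCIE4_X4 = 7.0

def effective_throughput(link_gbs: float, accel: float = 2.0) -> float:
    """Effective delivery rate if GPU decompression roughly doubles what
    the link carries (compressed data in over PCIe, decompressed on the GPU)."""
    return link_gbs * accel

print(f"PCIe 3.0 drive: ~{effective_throughput(PCIE3_X4):.1f} GB/s effective")
print(f"PCIe 4.0 drive: ~{effective_throughput(PCIE4_X4):.1f} GB/s effective")
```

On that reading, a PCIe 3.0 drive accelerated would roughly match a PCIe 4.0 drive's raw rate, which fits the "faster with 4.0 but still accelerated on 3.0" rumour.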
 
I think most of the 2080ti and 1080ti price drop has happened now so it’s worth waiting to see what happens with AMD, consoles and Nvidia refresh. Prices won’t drop too much more for the next 6 months.

Yeah exactly man, the worst thing would be to pick up a 3080 at launch for it to be replaced by an improved Ti version with more VRAM a couple of months later lol.

Going to sit back and let the dust settle from both camps' launches and see where things are at.
 
I really don't see these upgraded memory cards coming soon.

Tell me, how would they price a 3070 Ti with 16GB of RAM against a 3080 mere months into the launch?

Is it more or less expensive?

Speculating on cards with more memory barely a week into launch would essentially make these cards DOA.

It would be a huge smack in the gob if these cards arrived before Xmas.

---

Only if supply of GDDR6 and GDDR6X is low would there be a reason for this.

Could also explain why AMD are waiting a bit longer, then.

But there was that article saying that GDDR6 supply is indeed low, and it won't be until next year that supplies pick up again.
 