
NVIDIA ‘Ampere’ 8nm Graphics Cards

They will. The 3080 will prove to be a mid-tier card in less than 6 months; Nvidia are up to clever tricks again, obviously holding back 12GB and 16GB Ti's to combat anything AMD launch. Once that happens, buyer's remorse will hit home. The 3080 is a monster of a card, no doubt, but £650 is still a huge chunk of money, and Nvidia have brainwashed everyone into thinking this is a flagship card at an incredible price.

Disagree with that.
It will not be a mid-tier card, just like a 1080/2080 is not a mid-tier card. Yes, there are Ti/Super variants, but these are the very top tier cards.
Everything is relative too, £650 isn't a massive amount in today's market. It's definitely a huge performance step up from last series.
 
The main thing for me now is seeing the performance of the 3090 compared to the 3080. If it is significantly different, I may just get the 3090.
Reports (mostly based on specs) so far suggest a ~20% increase at 4K or lower. If that is true, then for almost double the money a 3090 is not a smart buy, and one only for people who want the best without caring about value.
Middle of next year we will very likely see Super editions of the 3070/3080 with more VRAM. 3070 will likely get an upgrade to GDDR6X too as production increases and it gets cheaper.
 
Now that the prices on the 30 series have been revealed to be better than expected, it would take a bargain price to tempt me on a second hand 10 or 20 series card. Something tells me people are going to expect too much money for them though.

Was the same thing last time around. People wanted £300 for a second hand 980ti, so I just bought a 1070 for £400.
 
You won't see much difference; PCIe 4 is a gimmick.

Longer term we will see a difference; in fact the aim is such a big difference that it will fundamentally change how games are built. If you read about Microsoft's goals for DirectStorage, they note that storage speed has increased significantly. Many people are still bottlenecked by SATA, which tops out around 550MB/sec read/write simply due to the bus alone, but with NVMe we can get around 4GB/sec, and with PCIe 4.0 NVMe up to about 8GB/sec, which is storage that's over 10x faster. And game installs are increasing above 100GB in size; internally, devs like id Software are building texture packs at 300GB+ which are only scaled down to be shippable.

The point is, long gone are the days of engines that load a level, put everything that level needs into VRAM, and treat VRAM as the limit on level size before the next load. Engines support streaming in assets as and when they need them, and as game install sizes increase and disk speed increases by up to a factor of 10x over SATA, the CPU load of decompressing those assets off disk becomes the bottleneck. The fact that the next-gen consoles have baked this into their very architecture and specs is a good indication that the next generation of games will be built to leverage it: not just by increasing loading speeds of traditional games, but by moving away from confining players to small isolated areas used as portals for loading in the new parts of the world ahead of them. The goal is to be done with that constraint and have much more open, flowing worlds where loading is more seamless and players aren't funnelled through disguised loading zones.

Also, right now loading games is CPU-bottlenecked; you get zero benefit from faster drives for load times. I extensively tested my 2x Samsung 960 Pros in RAID 0 for game-loading benefits over a SATA SSD and there is none, despite a 5x increase in raw speed: starting a game and loading levels is strictly CPU-bottlenecked. And that's frustrating because CPUs are comparatively slow these days; GPUs are 10x the size and 10x faster, and you really don't want to be doing anything on the CPU unless you absolutely have to.
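To make that decompression bottleneck concrete, here's a minimal, self-contained sketch (all figures synthetic and purely illustrative, not from any real engine or DirectStorage itself) that times single-threaded zlib inflation and compares it to nominal drive read speeds:

```python
import os
import time
import zlib

# Build ~32 MB of somewhat-compressible data and compress it,
# standing in for a compressed game asset on disk.
payload = os.urandom(1024) * (32 * 1024)
blob = zlib.compress(payload, level=6)

# Time how fast a single core can inflate it.
start = time.perf_counter()
out = zlib.decompress(blob)
elapsed = time.perf_counter() - start
decomp_mb_s = len(out) / (1024 * 1024) / elapsed
print(f"single-core decompression: {decomp_mb_s:.0f} MB/s")

# Compare against round-number nominal drive reads: whichever side
# is slower is the effective bottleneck for streaming assets.
for name, read_mb_s in [("SATA SSD", 550),
                        ("PCIe 3.0 NVMe", 3500),
                        ("PCIe 4.0 NVMe", 7000)]:
    limiter = "CPU" if decomp_mb_s < read_mb_s else "drive"
    print(f"{name} ({read_mb_s} MB/s): bottleneck is the {limiter}")
```

The exact MB/s depends entirely on the machine and the compression format; the point is only that once drive reads outrun single-core inflate speed, the CPU becomes the limiter.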
 
Now that the prices on the 30 series have been revealed to be better than expected, it would take a bargain price to tempt me on a second hand 10 or 20 series card. Something tells me people are going to expect too much money for them though.

I was following a 2070 Super FE. He had it up before announcement for £490. I sent him a message after the announcement (when he declined my offer of £300) and suggested he was having a laugh expecting almost £500 and advised he checked out the release info. He thanked me (patronisingly) and said he should have accepted the £430 he was offered. Yesterday I got a notification to say he had dropped the price to £432.50. What an absolute joker!
 
The main thing for me now is seeing the performance of the 3090 compared to the 3080. If it is significantly different, I may just get the 3090.

My money is still on a 20% difference across the board. Whether that is significant is up to you. If a 3080 is giving you 100fps, do you really need the 120fps that the 3090 gives you?

On the other hand, if Cyberpunk is so demanding that it won't give a consistent 60fps at 4K and drops to 50fps, but a 3090 will give you a rock-solid 60fps, then it becomes significant.
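As a quick sanity check on those numbers (assuming a hypothetical flat 20% uplift, purely illustrative):

```python
# A flat ~20% uplift only "buys" you a target frame rate when the
# slower card sits just below that target.

def uplifted(base_fps: float, uplift: float = 0.20) -> float:
    """Frame rate after applying a fractional performance uplift."""
    return base_fps * (1 + uplift)

for base_fps in (100, 50):
    print(f"{base_fps} fps -> {uplifted(base_fps):.0f} fps")
```

So 100fps becomes 120fps (nice but arguably unnecessary), while 50fps becomes 60fps, which is exactly the case where the gap starts to matter.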
 
Cut down may be the wrong word; I just meant it's a scaled-back 3080 Ti. I'm not saying it's a bad card, it's a bonkers GPU, no one can argue that. It's just that I'm sure, and you might even agree, there's going to be a Ti within 6 months sitting around the £800 mark that really is the flagship gaming model. That's where the remorse might come in. There's no way this is the full range of 30 series cards; they've got so much room to move here depending on what AMD do.

Exactly.

People who don't think there will be a 3070ti/S and a 3080ti are delusional.

I await the threads where people complain about nvidia's practices because they only bought a 3070/3080 a few months ago :D
 
Amazed that around 30 2080 Tis sold yesterday on eBay for between £500 and £600. I thought people would be paying less for them by now.
 
Nvidia have said differently, and it's their product?

This is what they said in the Reddit Q&A

Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.
 
Interesting, I notice none are compatible with the Founders Edition. Might be better going for the cheapest AIB one if you're going to watercool.
Checked the configurator last night but it doesn't specify which AIB cards are compatible yet. You're right, the cheapest ones probably are "reference", as there doesn't seem to be much info from manufacturers regarding extra power phases etc.
 
This is what they said in the Reddit Q&A

Will customers find a performance degradation on PCIE 3.0?

System performance is impacted by many factors and the impact varies between applications. The impact is typically less than a few percent going from a x16 PCIE 4.0 to x16 PCIE 3.0. CPU selection often has a larger impact on performance. We look forward to new platforms that can fully take advantage of Gen4 capabilities for potential performance increases.

That is in reference to the GPUs themselves and GPU performance. The simple fact is that there's next to no benefit even going from PCIe 3.0 x8 to x16; the impact on benchmarks is negligible. The video cards aren't starved by PCIe bandwidth at that point, so the additional bandwidth of PCIe 4.0 over 3.0 is unnecessary.

The debate over PCIe 4.0 as a benefit really has more to do with RTX IO and DirectStorage, and the ability of SSDs to potentially reach up to 8GB/sec of data transfer with PCIe 4.0 x4.
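For reference, those link rates can be worked out from the spec's raw transfer rates (8 GT/s for Gen3, 16 GT/s for Gen4, both with 128b/130b encoding); real-world throughput is a bit lower due to protocol overhead:

```python
# Back-of-envelope PCIe bandwidth from raw transfer rates.
# 128b/130b line encoding applies to both Gen3 and Gen4.

def pcie_gb_s(gt_per_s: float, lanes: int) -> float:
    # GT/s * encoding efficiency / 8 bits-per-byte, times lane count.
    return gt_per_s * (128 / 130) / 8 * lanes

print(f"PCIe 3.0 x16: {pcie_gb_s(8, 16):.1f} GB/s")        # ~15.8
print(f"PCIe 4.0 x16: {pcie_gb_s(16, 16):.1f} GB/s")       # ~31.5
print(f"PCIe 3.0 x4 (NVMe): {pcie_gb_s(8, 4):.2f} GB/s")   # ~3.94
print(f"PCIe 4.0 x4 (NVMe): {pcie_gb_s(16, 4):.2f} GB/s")  # ~7.88
```

Which is where the "up to 8GB/sec" figure for a Gen4 x4 NVMe drive comes from: it's the theoretical link ceiling, not a sustained real-world number.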
 
It's not an issue a large portion of users will be affected by; however, if there are cards that jump up in performance, i.e. AMD's 5x00 in testing (hammer on box), there are conditions that need to be explored to eke this information out.

It's probably about as many games as there are for DLSS/raytracing... so definitely worth it ;)
 
That is in reference to the GPUs themselves and GPU performance. The simple fact is that there's next to no benefit even going from PCIe 3.0 x8 to x16; the impact on benchmarks is negligible. The video cards aren't starved by PCIe bandwidth at that point, so the additional bandwidth of PCIe 4.0 over 3.0 is unnecessary.

The debate over PCIe 4.0 as a benefit really has more to do with RTX IO and DirectStorage, and the ability of SSDs to potentially reach up to 8GB/sec of data transfer with PCIe 4.0 x4.
Yep, that last line is saying once they start pushing RTX IO through it, Intel will need to be on PCIe 4 lol
 
I really hope the RTX 3080 can run perfectly fine on a decent 650W PSU. The wait for reviews is painful...

If the rest of your system is pretty lean, then I'm pretty sure a quality 650W will be fine.

For me, with 6 hard drives and a 3900X, there is no way a 650W is going to be enough.
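A rough back-of-envelope for that system (the 320W board power for the 3080 and the 105W TDP for the 3900X are published figures; every other line is my guess, and this ignores transient power spikes entirely):

```python
# Crude PSU headroom estimate; not a substitute for a proper
# PSU calculator or for accounting for transient load spikes.

components = {
    "RTX 3080 (board power)": 320,
    "Ryzen 9 3900X (TDP)": 105,
    "motherboard / RAM": 50,
    "6 hard drives (~10W each)": 60,
    "fans / USB / misc": 25,
}

total = sum(components.values())
psu_watts = 650
headroom = psu_watts - total

print(f"estimated draw: {total}W")
print(f"headroom on a {psu_watts}W PSU: {headroom}W "
      f"(load factor {total / psu_watts:.0%})")
```

Even with generous guesses that lands around 85-90% load on a 650W unit, which is exactly the "technically enough but uncomfortably tight" territory being described.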
 