NVIDIA ‘Ampere’ 8nm Graphics Cards

A few of us are thinking ahead, though, to next year's games and the year after that.

Not everyone wants to upgrade every gen.

Yeah, but I would just like to see actual evidence of VRAM requirements and any detrimental effects. Granted, we all want more memory, but at 1440p I have never had an issue with 8GB of VRAM in any game so far. There's also the actual cost saving of less VRAM to consider.

What will be interesting is the comparison if third-party vendors release cards with more VRAM.
 
Not sure whether the 3090 will be faster than two 2080 Tis, especially in games like GTA V.

However, one 3090 will be better than two 2080 Tis in games where SLI can't be used.

One 2080 Ti is great, as it can just about maintain 4K 60fps in a lot of games, but there are a few it struggles with, and it falls a bit short in games like The Division 2 on occasion. I only need a little bit more power to make sure I can always hit a minimum of 4K 60fps in the games I want, and the 3090 should do that more than comfortably.

I haven't played GTA V in many, many months, but even at 4K it should still give a nice boost over one 2080 Ti, as the game still scales decently well with two 2080 Tis.

The ray tracing performance boost is what I'm really looking forward to, and the 3090 could be shaping up to be really strong at that. We'll soon find out tomorrow, though.
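Putting the single-card vs SLI question into rough numbers (everything below is a hypothetical placeholder, not a benchmark of any real card), here's a minimal sketch of why one faster card wins wherever SLI has no profile, while a 2x 2080 Ti setup only pulls ahead in titles that scale well:

```python
# Back-of-the-envelope sketch: all figures below are hypothetical placeholders,
# not benchmarks of any real card.

def sli_fps(single_card_fps, scaling):
    """Effective FPS of two identical cards given an SLI scaling factor (0 to 1)."""
    return single_card_fps * (1 + scaling)

def single_next_gen_fps(single_card_fps, uplift):
    """Effective FPS of one newer card given a fractional uplift over the old one."""
    return single_card_fps * (1 + uplift)

base_fps = 50.0      # assumed 2080 Ti minimum at 4K in a demanding title
good_scaling = 0.8   # assumed SLI scaling where a profile works well
no_scaling = 0.0     # titles without SLI support fall back to one card
gen_uplift = 0.4     # assumed uplift of the newer card over a 2080 Ti

print(f"2x 2080 Ti, SLI scales well: {sli_fps(base_fps, good_scaling):.0f} fps")
print(f"2x 2080 Ti, no SLI profile:  {sli_fps(base_fps, no_scaling):.0f} fps")
print(f"One next-gen card:           {single_next_gen_fps(base_fps, gen_uplift):.0f} fps")
```

With these made-up figures the SLI pair only wins where a good profile exists and drops back to single-card performance everywhere else, which is basically the point being made above about one 3090 vs two 2080 Tis.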
 
Yeah, but I would just like to see actual evidence of VRAM requirements and any detrimental effects. Granted, we all want more memory, but at 1440p I have never had an issue with 8GB of VRAM in any game so far. There's also the actual cost saving of less VRAM to consider.

What will be interesting is the comparison if third-party vendors release cards with more VRAM.
Like it or not (and a lot of PCMR types hate it), the consoles dictate the shape of the gaming landscape.

The consoles are about to have their usable memory doubled.

People who think that isn't going to have an impact on game development - or push up the requirements on PC hardware - will, I think, be proven wrong; perhaps not immediately, but steadily as more "next-gen" games are released.
 
Actually, I may have exaggerated my framerates in Control at 4K Ultra with RT... :p



So, 15fps might be doable with an RTX 3090 :D

Well, so much for my theory that it only looked a bit weird because of lower resolutions.

I can clearly see the grittiness around her head and a weird line running down her leg, so I guess RT looks like that even at 4K.
 
Because I'm an idiot and I'm on the brink of getting a new PC... do you think the current-gen Intel chips/mobos that are only PCIe 3 could hold something like the 3090 back, or do you think it's not really something to worry about?
 
Did it not annoy you that you were paying for tech that you weren't interested in?

I upgraded to a 2080 Ti over the 1080 Ti myself, not because of RTX, but because I needed that extra raster performance in games where the 1080 Ti couldn't quite keep up without SLI, as I like to play at 4K 60 at ultra settings. I have absolutely no regrets about upgrading. I have absolutely loved these cards for the pure performance; RTX is just a very nice bonus.
 
Because I'm an idiot and I'm on the brink of getting a new PC... do you think the current-gen Intel chips/mobos that are only PCIe 3 could hold something like the 3090 back, or do you think it's not really something to worry about?
Anyone who says they know is lying, but there's a possibility that it might, by a small amount. I would be surprised if we were talking a double-digit percentage impact, and it's possible it might have zero impact.
 
Because I'm an idiot and I'm on the brink of getting a new PC... do you think the current-gen Intel chips/mobos that are only PCIe 3 could hold something like the 3090 back, or do you think it's not really something to worry about?

We don't know :)

You would think, in theory, that yes, they could. But wait and see; you really don't have long now, and that question will be answered.
 
I upgraded to a 2080 Ti over the 1080 Ti myself, not because of RTX, but because I needed that extra raster performance in games where the 1080 Ti couldn't quite keep up without SLI, as I like to play at 4K 60 at ultra settings. I have absolutely no regrets about upgrading. I have absolutely loved these cards for the pure performance; RTX is just a very nice bonus.

I went from a 2080 Super to a 1080 Ti (long story, with good reason). I found I only played RT games because I had RT, and I'm sure when I upgrade to a 3000-series card I will be getting more RT games. I will definitely be replaying Control with RT and DLSS for my second playthrough, though, as I'm still confused about what happened at the end.
 
Do you think the current-gen Intel chips/mobos that are only PCIe 3 could hold something like the 3090 back, or do you think it's not really something to worry about?

The 3090? Absolutely. PCIe 3 already holds back the RX 5700 XT when streaming from a Gen 4 NVMe drive. I don't think the effect will be large on an x16 slot, but if you have to plug in another card, that x16 slot becomes an x8 slot and you will see a drop in performance. Plus, Intel are supposed to be releasing PCIe 4 boards at the end of the year.
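To put the x16 vs x8 point into rough numbers, here's a minimal sketch of the theoretical per-direction bandwidth at each link width, using the standard published per-lane rates (this says nothing about how much bandwidth any given game actually needs):

```python
# Theoretical PCIe bandwidth per link width (128b/130b encoding).
# These are the standard published rates, not measured game performance.

PER_LANE_GB_S = {
    "PCIe 3.0": 8 * (128 / 130) / 8,    # 8 GT/s per lane -> ~0.985 GB/s
    "PCIe 4.0": 16 * (128 / 130) / 8,   # 16 GT/s per lane -> ~1.969 GB/s
}

for gen, per_lane in PER_LANE_GB_S.items():
    for lanes in (16, 8):
        print(f"{gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s each way")
```

PCIe 3.0 x16 (~15.8 GB/s) lands at the same bandwidth as PCIe 4.0 x8, so the scenario to watch is a card dropping to 3.0 x8 (~7.9 GB/s) when another device takes lanes.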
 
I only want a graphics card for Flight Sim 2020... I didn't realise what the exchange rate was like, though; I might just be able to go for a 3080 if it comes in at £600.
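For anyone doing the same mental arithmetic, a trivial sketch of the US-price-to-UK-price sum; the MSRP and exchange rate below are placeholder assumptions, not announced figures:

```python
# Trivial sketch of converting a US MSRP to a rough UK shelf price.
# The MSRP and exchange rate are placeholder assumptions, not announced prices.

usd_msrp = 699.0       # hypothetical US price, which excludes sales tax
usd_per_gbp = 1.30     # hypothetical USD/GBP exchange rate
uk_vat = 0.20          # UK prices include 20% VAT

gbp_price = usd_msrp / usd_per_gbp * (1 + uk_vat)
print(f"~£{gbp_price:.0f}")   # ~£645 with these made-up numbers
```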
 
I'm probably going to sell my 9700K when Intel and AMD refresh their CPUs. I hope Intel will have Gen 4/5 by then, and maybe that will give a bit of a boost in GPU performance, which will be nice. :D Damn, I'm in Venice at the moment, and I will be watching the conference tomorrow with my card in hand, ready to buy a 3090. I hope it's not the rumoured $1900; that would be ridiculous.
 