
NVIDIA ‘Ampere’ 8nm Graphics Cards

The 3090 does not have the price tag of a Titan, and Nvidia are positioning it as a pure gaming GPU, unlike the Titan.

I know, it's even more expensive! I was referring to the Titan Pascal, which cost me £1150, and then the 1080 Ti came out.

If the 3090 only cost £1150 I would buy one in a heartbeat. But at £1500+ I am concerned that a 3080 Ti will come along and offer 95% of the performance for £500 less.
 
Anyone know how much of an upgrade the 3090 will be over this?


Will it run Dwarf Fortress with RTX?

Side-grade
 
I seriously doubt AMD have anything to touch Nvidia. I hear it every launch, but it's always a case of "ahh well, next time with RDNA X".

They may have something to compete with a 3070 (basically two-year-old performance, i.e. 2080 Ti level).

It's not AMD's fault that some silly Willys are walking around claiming things on AMD's behalf. RDNA 1 was never meant to be a flagship killer. They went after the market where most of the money was instead of the niche flagship market, and when you look at what they delivered I would argue they succeeded. You get 2070 Super-like performance for three quarters of the price. What is wrong with that?

The only real bomb in terms of AMD hype from the company itself was when Raja was talking up Vega, and the handling of the Radeon VII, imo. Now, I personally like Vega, but it was never able to deliver the same gaming performance per TFLOP that Nvidia could. But that's it really. Polaris was fine, and to my knowledge it wasn't AMD who claimed it would match a 980 Ti; that was just fanboy hype as usual (funny side note: the RX 580 actually did eventually manage to match the 980 Ti in a single title). The 290X was able to beat the Titan. What more do you want?
 
In those days, when the 1080 Ti came out, the Titan cost a lot less than the 3090. :eek:
Nvidia are doing what I predicted and wanted them to do ages ago: milk the people who are loaded by giving them a little bit more performance than the "mid range", and give us plebs a very nice increase in performance at similar pricing tiers to the prior gen.

As long as price for performance is improving nicely gen to gen, I am happy to upgrade. They got it right with Ampere; it will sell well. Us plebs will be happy with around 20% less performance for less than half the price, and the people who want the best will be happy they have it. Win-win.
 
I know, it's even more expensive! I was referring to the Titan Pascal, which cost me £1150, and then the 1080 Ti came out.

If the 3090 only cost £1150 I would buy one in a heartbeat. But at £1500+ I am concerned that a 3080 Ti will come along and offer 95% of the performance for £500 less.

It probably will, but I'm willing to bet we won't see a 95%-performance Ti till June next year. You'll have had all those months with a top-end card, not worrying about VRAM etc., and even come June it'll still be the top-end card.
 
I know, it's even more expensive! I was referring to the Titan Pascal, which cost me £1150, and then the 1080 Ti came out.

If the 3090 only cost £1150 I would buy one in a heartbeat. But at £1500+ I am concerned that a 3080 Ti will come along and offer 95% of the performance for £500 less.
That's the point: the massive price gap between the 3080 and the 3090. One card or two will definitely fill that gap; it's just a question of when.
 
I wonder if Nvidia will allow 10-bit output for the 30 series over HDMI 2.1 now.


If not, this limits the number of displays which can take advantage of HDMI 2.1 at 10-bit, as some devices and AVRs only have 40 Gbps ports.

The Q&A on reddit sort of answers this:

https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/

Will the 30 series cards be supporting 10bit 444 120fps ? Traditionally Nvidia consumer cards have only supported 8bit or 12bit output, and don’t do 10bit. The vast majority of hdr monitors/TVs on the market are 10bit.

[Qi Lin] The 30 series supports 10bit HDR. In fact, HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays.


However, the answer did not address whether 10-bit would work in 444 (or RGB) or just in 422 like it does now. So I'm waiting for a reviewer to tell us this.
 
Looking forward to seeing some real reviews with detailed benchmarks to see if all of this hype translates to real game performance. If it does, then I will weigh the cost of the upgrade against the performance improvement over my 1080 Ti before making a decision. I've certainly learned my lesson about getting over-excited before new tech comes out only for it to be a bit of a letdown, and if that means I miss out on the initial stock then I'm okay with that.
 
Taken from the Reddit Q&A Site:

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

[Justin Walker] We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
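
To put those 4-6 GB figures in context, here's a rough back-of-the-envelope check (my own assumed numbers, not anything from the Q&A): the render targets themselves are tiny at 4K, so most of the VRAM budget goes on textures and other assets rather than on resolution itself.

```python
# Rough, illustrative VRAM estimate for 4K rendering (assumed figures, not NVIDIA's).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4  # one 32-bit (e.g. RGBA8) render target

target_mib = WIDTH * HEIGHT * BYTES_PER_PIXEL / 2**20
print(f"One 32-bit 4K render target: ~{target_mib:.0f} MiB")  # ~32 MiB

# Even a generous deferred pipeline (G-buffer, depth, HDR buffer,
# post-processing chains) stays well under 1 GiB of render targets:
assumed_targets = 16
print(f"{assumed_targets} such targets: ~{assumed_targets * target_mib / 1024:.2f} GiB")

# So if a game reports 4-6 GB in use at 4K, the bulk of that is the
# texture/asset pool, which scales with settings more than with resolution.
```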
 
The Q&A on reddit sort of answers this:

https://www.reddit.com/r/nvidia/comments/ilhao8/nvidia_rtx_30series_you_asked_we_answered/

Will the 30 series cards be supporting 10bit 444 120fps ? Traditionally Nvidia consumer cards have only supported 8bit or 12bit output, and don’t do 10bit. The vast majority of hdr monitors/TVs on the market are 10bit.

[Qi Lin] The 30 series supports 10bit HDR. In fact, HDMI 2.1 can support up to 8K@60Hz with 12bit HDR, and that covers 10bit HDR displays.


However, the answer did not address whether 10-bit would work in 444 (or RGB) or just in 422 like it does now. So I'm waiting for a reviewer to tell us this.


Thanks. That's exactly the one thing holding me back from selling the RTX 2080 and buying a 3080 now.
If it will support 10-bit 4:4:4 RGB HDR at 120Hz, then it'll work with my CX and I can move forward with the upgrade.

If it doesn't, then I'm not that excited and can potentially wait until they eventually do add that feature (there are so many 40 Gbps devices in the wild now that I feel it's just a question of when Nvidia or AMD support it).

If NVIDIA continue to lock out 10-bit, then I'm perplexed, given that most HDMI 2.1 displays don't offer 12-bit-capable bandwidth outside of the C9 series OLEDs.
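
For what it's worth, a rough uncompressed data-rate calculation (my own ballpark; it ignores blanking intervals and HDMI 2.1 FRL encoding overhead, so real requirements are somewhat higher) shows why the 40 Gbps ports are the sticking point for 12-bit but look workable for 10-bit:

```python
# Rough video data-rate estimate for 4K 120Hz RGB / 4:4:4 at various bit depths.
# Ignores blanking and HDMI 2.1 FRL encoding overhead, so treat these as lower bounds.
WIDTH, HEIGHT, REFRESH = 3840, 2160, 120

for bits_per_channel in (8, 10, 12):
    bits_per_pixel = bits_per_channel * 3  # R, G and B (or Y, Cb, Cr at 4:4:4)
    gbps = WIDTH * HEIGHT * REFRESH * bits_per_pixel / 1e9
    print(f"{bits_per_channel}-bit 4:4:4: ~{gbps:.1f} Gbit/s of pixel data")

# Compare against the link: a full 48 Gbps HDMI 2.1 port vs the 40 Gbps ports on
# the LG CX and many AV receivers. In this naive estimate, 10-bit 4:4:4 (~29.9
# Gbit/s) leaves headroom on a 40 Gbps link, while 12-bit 4:4:4 (~35.8 Gbit/s)
# gets very tight once blanking and encoding overhead are added back in.
```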
 
My strategy is:

Buy 3090
Wear a smug face for the next 2 years knowing that it’s unlikely anything will be better
Skip 4000 series upgrade
Buy 5000 series top dog in 4 years

I’ll probably be looking at parenthood by the time the next gen comes out, so might not have time to be gaming anyway!!
 
Taken from the Reddit Q&A Site:

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

[Justin Walker] We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.


TBH, this crying about memory is just that: crying.

When I bought my RTX 2080 at launch, loads of users were crying about it then too. Yet here I am, years later, and the 8GB of VRAM has never held me back at all.
 
My strategy is:

Buy 3090
Wear a smug face for the next 2 years knowing that it’s unlikely anything will be better
Skip 4000 series upgrade
Buy 5000 series top dog in 4 years

I’ll probably be looking at parenthood by the time the next gen comes out, so might not have time to be gaming anyway!!


The fact they didn't show the 3090 smoking the 3080 makes me feel as if it's really not a good deal.
For the price of the 3090, you can get a 3080 now and a 3080 Super in the next two years and have two GPUs/rigs.

I'd much rather buy a 3080 and then a 4080 the year after for the same price than hold onto a 3090. You could probably also buy a 3080, sell it, buy a 3080 Ti and still be up money-wise.
 
The fact they didn't show the 3090 smoking the 3080 makes me feel as if it's really not a good deal.
For the price of the 3090, you can get a 3080 now and a 3080 Super in the next two years and have two GPUs/rigs.

I'd much rather buy a 3080 and then a 4080 the year after for the same price than hold onto a 3090. You could probably also buy a 3080, sell it, buy a 3080 Ti and still be up money-wise.

My worry is that the only real increase the 3090 offers is the VRAM and it will bring little else; I'm guessing 10-20% in games tops, but we'll see.
I'm 70% certain I'll go the 80 then 80 Ti route.
 
TBH, this crying about memory is just that: crying.

When I bought my RTX 2080 at launch, loads of users were crying about it then too. Yet here I am, years later, and the 8GB of VRAM has never held me back at all.

Who is Justin Walker?

Anyway... wow, 60-100 fps at 4K! I am thinking of pairing a 3080 with a 1440p monitor :eek:.

It may indeed be overkill at that resolution.

Also, on the subject of VRAM: as I posted yesterday, next-gen games may be streaming in more textures from NVMe via DirectStorage (RTX IO), so VRAM may not be as limiting as we think.
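
As a rough illustration of why streaming changes the picture (assumed ballpark figures, nothing official): a fast PCIe 4.0 NVMe drive with GPU decompression, as RTX IO proposes, could refill a sizeable texture working set in a fraction of a second, so a game only needs to keep nearby assets resident:

```python
# Ballpark texture-streaming math (assumed figures, purely illustrative).
nvme_read_gbps = 7.0   # GB/s, a fast PCIe 4.0 NVMe drive
compression    = 2.0   # assumed average compression ratio with GPU decompression
working_set_gb = 4.0   # textures needed for the area around the player

effective_gbps = nvme_read_gbps * compression
refill_seconds = working_set_gb / effective_gbps
print(f"Refilling a {working_set_gb:.0f} GB texture working set: ~{refill_seconds:.2f} s")
# ~0.29 s - quick enough that assets can be streamed in as you move,
# rather than kept permanently resident in VRAM "just in case".
```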

It might even be time to get a 4K monitor... but now this is getting expensive. :o
 