NVIDIA GeForce RTX 3090 Ti, the flagship reinvented

Agree. It's for these very reasons I've stuck with 1440p and my trusty 1080Ti. With sensible tweaking, all of those games can be made to run at 60-100 fps with no massive change in visual quality to my old eyes. Sure, I can't have ray-tracing, but nothing I've seen so far has left me so impressed I felt the need to upgrade.

As you say, 100+ fps @ 4k is really where we need to be. That's at least 6+ months away, IMHO - so I won't be funding a new car for Gibbo until that point. :D

Funnily enough, I was looking at a video on YouTube that popped up in my recommendations comparing a 1080Ti to a 30-series card (can't remember which one it was) and it held its own, beating it in many cases. It was a lower-end one of course, but it's crazy to me when we're two series on from then.

Nvidia really went all out on that card, and as you say, you might not get ray tracing but you can still game at high frame rates. I've not seen one game with ray tracing that's blown me away.
 
I've tried to resist writing this, but you've annoyed me now.

Rather than trying to play one side off against the other, why not learn a little about the tech involved? Today's PC tech is made up of AI, RT and legacy support. Stop living in the past.
I'm sorry?

You're the person who posted about "Fine wine....is corked", with a gif which has AMD falling over. With a chart that has a laser focus on RT, to the exclusion of any other metric. Every post you've made since has digs at "the others". "Knitting circle". "Stop living in the past".

You are the one who is being something of an arse here.

All tech uses greater amounts of power to achieve greater performance. That point you claim the 3090Ti is past doesn't exist for those who are not budget limited.
That's just wrong. Consider Colossus - one of the first computers. It ran at more than 8 kW to do less work than my Raspberry Pi. All technology uses an amount of power to do a specific amount of work. Progress lets us do more with less. But the point that *you concede* the 3090 Ti is past has nothing to do with money. It's a point on the power/performance curve. How that impacts people of different budgets might be worthy of discussion - but that's not your claim. You say it doesn't exist.

It's also worth mentioning again that you only have AMD to compare to, and they are reaping the efficiency of a smaller node; the three cards I listed show just how efficient Ampere is when using up-to-date tech.
And you're blind in your obsession. AMD are reaping the rewards of having a better tech node, true. But they've stopped, more or less, at the 6900XT. Whereas Nvidia started at the edge of reasonable power levels and have just accelerated into the distance. AMD have released some somewhat disappointing low-end cards, but they haven't followed Nvidia's insanity. 500W for a graphics card.

And more to the point, whose fault is it that they don't have the better node? Whose fault is it that AMD probably have a worse architecture? They are who they are, with the tech they have. Whinging about how it isn't fair that AMD has a node advantage doesn't change Nvidia's choice to sell graphics cards you could mistake for a small kettle.

If you want to debate, fine, just don't regurgitate the clickbait beggars.
I ask for more data. You provide a small, cherry-picked handful of data that can't be used for a full debate and only supports a very narrow view. That's clickbait to me.

Oh, and yet another dig with the "beggars" there. "People with money buy Nvidia", of course. Love the implication there.
[Edit: Sodding quote tags]
 
I've tried to resist writing this, but you've annoyed me now.

I'm sorry?

You're the person who posted about "Fine wine....is corked", with a gif which has AMD falling over. With a chart that has a laser focus on RT, to the exclusion of any other metric. Every post you've made since has digs at "the others". "Knitting circle". "Stop living in the past".

You are the one who is being something of an arse here.

Dry your tears, pull on your big boy pants...

The 'Fine wine' joke is the constant excuse from the AMD fanboy, hence the comment. It's been mentioned several times on here regarding AMD's RT performance. The GIF is fun but also a perfect summary of the state of play, as we have Intel and Nvidia backing RT and AI, both of which AMD has failed with. Yes, I know it's AMD's first attempt, but that's not an excuse that's going to get my cash.

That's just wrong. Consider Colossus - one of the first computers. It ran at more than 8 kW to do less work than my Raspberry Pi. All technology uses an amount of power to do a specific amount of work. Progress lets us do more with less. But the point that *you concede* the 3090 Ti is past has nothing to do with money. It's a point on the power/performance curve. How that impacts people of different budgets might be worthy of discussion - but that's not your claim. You say it doesn't exist.

Shame you can't purchase common sense. Did you really think I was comparing Colossus to the Pi, or are you just being an arse? Just since you brought it up though, how would you increase the performance of your Pi?

And you're blind in your obsession. AMD are reaping the rewards of having a better tech node, true. But they've stopped, more or less, at the 6900XT. Whereas Nvidia started at the edge of reasonable power levels and have just accelerated into the distance. AMD have released some somewhat disappointing low-end cards, but they haven't followed Nvidia's insanity. 500W for a graphics card.

There is no blind obsession. We have two discrete GPU manufacturers. One makes great general-purpose GPUs and one makes a gaming-orientated GPU which is budget limited due to console/mobile design. If AMD were leading, then I'd be using an AMD card at the moment. I agree AMD have not followed Nvidia; perhaps they should, as it may help their market share. BTW, leaks put Navi 31 at 500W.

And more to the point, whose fault is it that they don't have the better node? Whose fault is it that AMD probably have a worse architecture? They are who they are, with the tech they have. Whinging about how it isn't fair that AMD has a node advantage doesn't change Nvidia's choice to sell graphics cards you could mistake for a small kettle.

Fault? No one is claiming fault. Why would you fault them? There is no 'whinging', it's a fact: Nvidia are outperforming AMD with a far less efficient, cheaper node. A very good business move.

I ask for more data. You provide a small, cherry-picked handful of data that can't be used for a full debate and only supports a very narrow view. That's clickbait to me.

Oh, and yet another dig with the "beggars" there. "People with money buy Nvidia", of course. Love the implication there.
[Edit: Sodding quote tags]

I provided a small list as you claim not to be interested, while appearing far too lazy to do it yourself. Is it really cherry-picked, or is it simply sticking to what is worth considering by the average consumer, 3060Ti - 3080?

The 'beggars' comment is aimed at the current trend of tech tubers turned influencers who earn their money from clickbait. Little bit of a complex going on there?
 
Absolutely brutal, and that's vanilla RT, but we can easily expand refl. res & roughness threshold...

[screenshot]

Just looking at this review from computerbase now:

https://www.computerbase.de/2022-03...chmarks_mit_und_ohne_raytracing_in_3840__2160

Laughable from Nvidia, this release - anything to cash in now before the 40xx series, and setting things in motion for 40xx series launch prices, no doubt.


On a separate note, interesting that they have the 3080 12GB in their comparisons now too. I was expecting way more of an improvement over the 3080 10GB, since we have a couple on here "insisting" that the 10GB VRAM is holding back the 3080 at 4k, yet... looking at various games there, it seems that is not the case? (Only FC 6 shows the biggest improvement and well, let's not go there again....) There is an improvement, no doubt, but it looks like that is coming from the slightly better hardware of the 12GB model overall... particularly in Doom Eternal, which has been "quoted" to have "issues" with the 3080 10GB??? :confused:

Don't post up your link to a website, as it won't be trusted by some; that's what it comes down to now. Trusted websites (or cherry-picking) are subjective.

Indeed :cry:

Looks like computerbase aren't showing the results people want to "believe/see", so they might no longer be trustworthy.... :D ;) :p

Sure -
3080: ~340W @ 47 FPS
6900XT: ~300W @ 29 FPS
3060Ti: ~200W @ 27 FPS


Not interesting, perhaps you thought this was the knitting section?

Those figures are actually rather interesting and do show Nvidia are leagues ahead here on efficiency in RT, whereas RDNA 2 is more efficient with rasterization workloads.
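To put rough numbers on that, here's a quick back-of-the-envelope FPS-per-watt calculation using the approximate figures quoted above (just a sketch; the wattages and frame rates are the rough values from that post, not measured benchmark data):

# Rough FPS-per-watt comparison from the figures quoted above (thread estimates, not measured data)
cards = {
    "3080":   (340, 47),   # ~watts, ~FPS with RT
    "6900XT": (300, 29),
    "3060Ti": (200, 27),
}
for name, (watts, fps) in cards.items():
    print(f"{name}: {fps / watts:.3f} FPS per watt")
# Prints roughly: 3080 0.138, 6900XT 0.097, 3060Ti 0.135

On those numbers both Ampere cards land around 0.135-0.14 FPS/W in that RT test, versus roughly 0.10 for the 6900XT.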

Bearing in mind that the majority of Ampere cards can be undervolted massively too, e.g. my 3080 in heavy RT doesn't go past 280W at most. I think RDNA 2 cards are good for undervolting as well as being better for overclocking?

Another thing which has been shown when it comes to RT is that AMD's FSR doesn't perform anywhere near as well as DLSS, but without RT the performance gains are equal. All of this just goes to show that RT really is not a focus for AMD "currently". Hopefully that'll change, though, given Intel look to be pushing on the RT front too, which should be no surprise given all the games getting the RT treatment, so AMD would be silly not to make it a focus for RDNA 3....
 
Indeed :cry:

Looks like computerbase aren't showing the results people want to "believe/see", so they might no longer be trustworthy.... :D ;) :p

B-b-but it's the tech press, mate. Global, dontcha know. That's why one result will do... in another language. Someone didn't do a science degree and learn how to form an argument using data. One example is enough to evaluate. Question that, and you are accused of having a problem.

Even a 'colouring-in' degree (Geography) will have a module on statistics.
 
The problem is when those "small diminishing returns" become imperceptible to the human eye. I'd say that is certainly the case for the 3090Ti versus a decent OEM 3080 Ti.

Having just looked at OCUK 3080Ti prices versus 3090Ti prices, the average price delta seems to be about £700. That's a lot to pay for an imperceptible improvement. :)

And for those cases where very low framerates come into play (Cyberpunk, AC:V, WD:L and MSFS2020 @4k/Ultra, for example), you'd still end up tweaking the settings anyway to get vastly improved performance on any card, so even in those cases the 3090Ti makes little sense, IMHO.

Overclocked 3080 FE = 3090 FE STOCK

Overclocked 3090 FE = 3090 TI STOCK

The difference between an overclocked 3080 and a stock 3090 Ti is about 15%, or around 20-25 FPS - is this worth £1200 more? I think not; people are literally throwing their money away, imo.
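As a rough sketch of the cost per extra frame, using the ~£1200 price gap and the 20-25 FPS uplift quoted above (approximate figures, nothing more):

price_gap_gbp = 1200               # rough extra cost of a 3090 Ti over an overclocked 3080
for extra_fps in (20, 25):         # rough uplift range from the post above
    print(f"+{extra_fps} FPS costs about £{price_gap_gbp / extra_fps:.0f} per extra frame")
# +20 FPS -> about £60 per frame, +25 FPS -> about £48 per frame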
 
Yeah, I agree that for the difference in price the jump in performance is small; I just seem to suffer from FOMO.

The other argument is whether it's worth buying a card priced this high when it's only a few months before it's overtaken, but I'm not convinced the new cards are months away like some people believe. I can't see new cards arriving until the end of the year, with the 4090/Ti equivalent not until the first quarter of 2023 anyway.

So I do think there's some value to be had buying a high-end card now and getting a year's use out of it.

The games you mention are frustrating, as having a high-end card and not even being able to run at a solid 60 is annoying. I'm looking forward to cards that can game at 4k 120 comfortably. Yes, I can do that in certain games, but a lot of recent releases I play are nowhere near.

Forza Horizon 5 and Tiny Tina's Wonderlands can't run close to 120 maxed out at 4k.

Getting a year's use will probably cost you a grand, as I can't see anyone paying even £1400 for a used 3090Ti - more like £800 come March 2023.
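A quick depreciation sketch of that point - note the purchase price here is my own assumption (roughly what a 3090Ti retails for at the moment), not a figure from this thread:

purchase_price_gbp = 1900       # ASSUMPTION: rough current retail price of a 3090Ti
resale_march_2023_gbp = 800     # resale estimate from the post above
cost_of_year = purchase_price_gbp - resale_march_2023_gbp
print(f"Cost of a year's use: about £{cost_of_year}")   # ~£1100, i.e. roughly 'a grand'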
 
Getting a year's use will probably cost you a grand, as I can't see anyone paying even £1400 for a used 3090Ti - more like £800 come March 2023.

You could be right, I think it's going to depend on what the market is like. If we have a repeat of what happened with the 30 series then prices of cards could shoot up.

I just can't see the GPU situation being any better. In fact, I think it will be worse. I can see Nvidia starting MSRP at even higher prices. The 30 series launched at an OK-ish suggested retail price. These 40 series cards are going to start at hundreds more, and I've got a feeling that crypto (despite what people think) is going to boom again.

Plus scalpers are still going to try their luck; I've seen some idiots trying to sell the 3090Ti for more than retail. Thing is, no one will pay that when it's still in stock at retail lol.

I just wish pricing could go back to 980/1080Ti prices.
 
You could be right, I think it's going to depend on what the market is like. If we have a repeat of what happened with the 30 series then prices of cards could shoot up.

I just can't see the GPU situation being any better. In fact, I think it will be worse. I can see Nvidia starting MSRP at even higher prices. The 30 series launched at an OK-ish suggested retail price. These 40 series cards are going to start at hundreds more, and I've got a feeling that crypto (despite what people think) is going to boom again.

Plus scalpers are still going to try their luck; I've seen some idiots trying to sell the 3090Ti for more than retail. Thing is, no one will pay that when it's still in stock at retail lol.

I just wish pricing could go back to 980/1080Ti prices.

We are probably going to be in a bear market for 1-2 years before another coin takes over from Ethereum, so I think crypto will lose a lot of part-time miners and the market will flood with used GPUs.

My advice is not to waste £400-500 extra on a 3090Ti - enjoy what you have!
 
My advice is not to waste £400-500 extra on a 3090Ti - enjoy what you have!

Yeah, that's certainly the sensible choice. I've been watching more benchmarks, and in some games the performance difference is within the margin of error.

I've got my card up for sale anyway, just in case I get what I'm asking, but I'm not desperate to sell.
 
Looks like the 3090Ti was not just a TDP dry run, it's a PCB dry run too. People have noticed that a lot of PCBs have missing components - for example, Gigabyte cards have a slot on the PCB for a second PCIe 5 power connector but it's not soldered on, and the Founders Edition PCB for the 3090Ti has a whole bunch of missing components that the PCB has cutouts and pin-outs for.

Now this is not uncommon - we've seen it on several other cards that share PCBs this generation. What it tells us is that the 3090Ti shares its PCB with something else, something that is not Ampere; it's probably the 4080/4090/4090Ti.

https://videocardz.com/newz/nvidia-...ned-to-support-next-gen-rtx-40-ada-ad102-gpus
 