
NVIDIA ‘Ampere’ 8nm Graphics Cards

Associate
Joined
15 Feb 2017
Posts
2,146
Location
the ghetto
The 7990 was a joke. They were supposedly binning lower-voltage GPUs for these cards, and a lot of the reviews stated how "cool and quiet" it ran. Yet the product most of us who bought it got was a loud and hot pain in the ass. It's like at some point AMD realised they didn't have enough low-voltage dies to put on the card, so they just started using any they had to hand, which overwhelmed the cooler.

Here's the product launch video:



Notice the "cool and quiet" rhetoric.

And here's the same gimp claiming that "we won't have to worry about the 6990 being loud and hot". And yes, it was loud and hot.



I know I bought two of them for CrossFire when they slashed them to half price.. what was I thinking :eek:

had to stick them both under water in the end, didn’t need central heating in the winter :D
 
Caporegime
Joined
18 Oct 2002
Posts
39,267
Location
Ireland
I know I bought two of them for CrossFire when they slashed them to half price.. what was I thinking :eek:

had to stick them both under water in the end, didn’t need central heating in the winter :D


It really bugged me as it was an expensive card (even at half price), and it seemed like the reviewers' samples got all the low-voltage GPUs, which resulted in low noise and lower temps. Mine at one point was touching 100°C on one of the cores.
 
Associate
Joined
15 Feb 2017
Posts
2,146
Location
the ghetto
It really bugged me as it was an expensive card (even at half price), and it seemed like the reviewers' samples got all the low-voltage GPUs, which resulted in low noise and lower temps. Mine at one point was touching 100°C on one of the cores.

Yeah, one of mine did that, it's what forced me to put them on a custom loop.. never again
 
Soldato
Joined
19 Dec 2010
Posts
12,019
The true Titan (like the RTX Titan) could be multiple thousands.

The Titan cards aren't consumer cards anymore. With Volta, Nvidia said that they were moving the Titan brand into the prosumer market.

Sure, you can buy one if you want, if you've got money to burn, but buying one just for gaming is a complete waste of money.
 
Associate
Joined
23 Aug 2005
Posts
1,273
Weird cooler design. And there is a photo of one with a Sapphire logo on the back.. so that is one of the AIB coolers? So the card is split into three sections: 1) GPU with fan, 2) just heatpipes, no fan, passively cooled, 3) followed by a second fan (which just happens to face the other way for some reason). How is that better than a normal three-fan design? That's what happens when you put marketing in charge of thermodynamics.
 
Soldato
Joined
12 May 2014
Posts
5,225
Adored speculated that the reason the 3090 exists is that Nvidia are not 100% certain it can beat RDNA2, and they didn't want to risk devaluing the Titan branding by having it lose to AMD.

What does everyone think of this? Do you think this could be the reason or do you think it could be another reason?
 
Associate
Joined
20 Aug 2020
Posts
2,034
Location
South Wales
They haven't done a VRAM increase in ages, so they could be scrapping the Ti name this gen and changing the 3080 Ti to the 3090 instead. It makes more sense having the Ti model as a number increase anyway.
 
Soldato
Joined
6 Jan 2013
Posts
21,839
Location
Rollergirl
I always liked to think that I would jump on any decent AMD rival card. Unfortunately my experience with my Vega 64 has led me to believe the advantage would need to be really significant.

The only compromise I've ever had to make with an Nvidia card was on price.
 
Soldato
Joined
6 Aug 2009
Posts
7,070
I think nobody knows and we'll just have to wait and see. Each person can decide whether the new cards are too expensive; it's all relative to the individual, their opinions and their circumstances. However, that won't stop people bickering and throwing venom at each other. If there's a decent performance jump then I will probably upgrade my 2080Ti. However, if AMD released a decent alternative and I preferred it, I would happily switch brands, as I recently did with my processor.

I'll never understand how people can get so irate over PC parts, makes for great reading though :p. If I think something is too expensive I just don't buy it... I don't harbour ill feelings towards those who do.

Exactly. Computing does seem to attract a certain kind of personality though, some would say almost binary in their thinking :p
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,176
Location
Greater London
The Titan cards aren't consumer cards anymore. With Volta, Nvidia said that they were moving the Titan brand into the prosumer market.

Sure, you can buy one if you want, if you've got money to burn, but buying one just for gaming is a complete waste of money.
I do not see the RTX Titan that way. It is a gaming card as far as I am concerned. Quadros are pro cards.

I agree they are a waste of money, they are there to milk people who want the best. Makes a lot of business sense. Still a gaming card though.
 
Associate
Joined
20 Aug 2020
Posts
2,034
Location
South Wales
Haven’t Nvidia xx90 cards historically been dual GPU cards?

Wulf

They have, so it will be strange seeing that number on a single-GPU card if they decide to do it. They used the Ti suffix for a while in the past, until they eventually stopped using it after the GeForce 4, and they brought it back again with the 780 Ti. So I wouldn't be surprised if they drop Ti again.
 
Associate
Joined
19 Sep 2010
Posts
2,338
Location
The North
Watching this new gen release with eager eyes. Currently running a 1070, and while it's no slouch, I'm running a 1440p 155Hz monitor these days and having to turn down some settings in games, so there's certainly room for improvement. Also looking to go water cooled, for both the challenge/enjoyment and the headroom it gives.
 
Soldato
Joined
19 Dec 2010
Posts
12,019
I do not see the RTX Titan that way. It is a gaming card as far as I am concerned. Quadros are pro cards.

I agree they are a waste of money, they are there to milk people who want the best. Makes a lot of business sense. Still a gaming card though.


It wasn't launched as a gaming card, and it's not marketed as a gaming card. It's a prosumer card aimed at the professional market. It's for those people who need the performance of a Quadro but don't need the ECC memory or the software support, or can't afford it.

You know the only difference between the Quadro and the RTX Titan is the ECC memory and the professional software support?

And the benchmarks prove that it is a prosumer card: much better at professional applications than the 2080Ti, but barely any better in gaming situations. And that's what all the reviews say about it; they recommend it for deep learning, AI, etc., but not for gaming.

The Titan cards are now for professional applications. You can game on them, but they don't offer any real performance over the top gaming cards.

Nvidia did make this very clear when they launched the Titan V.
 
Associate
Joined
21 Apr 2007
Posts
2,483
Adored speculated that the reason the 3090 exists is that Nvidia are not 100% certain it can beat RDNA2, and they didn't want to risk devaluing the Titan branding by having it lose to AMD.

What does everyone think of this? Do you think this could be the reason or do you think it could be another reason?

I see two reasons: one is to give Nvidia options should an RDNA2 card outperform their top product (i.e. they could call it a Titan at any point, or not, as they see fit), and the second is to basically charge more for another high-end SKU. I question the value of having 24GB, because that's why consoles are using ultra-fast SSDs; to my mind you're better off doing the same in the PC world when it becomes available, rather than paying the inevitable premium for a GPU solution.
 