
NVIDIA ‘Ampere’ 8nm Graphics Cards

I said pages ago that anyone with a 2080Ti should not buy the 3080. By the time you take the huge loss on the 2080Ti and buy a 3080, the raw performance could be 15% better at best, and would not be worth the price premium. However, you are forgetting two things here: RTX and DLSS. Both of which the 3080 should be *far* better at, if they float your boat.

Personally I don't like RT as it sits, so I really don't care.

So what you are saying is that 2080 Ti users aren't getting an upgrade option (non-RT use), as the 3090 product is a halo item, and therefore should be ignored by anyone who wants to upgrade?

Have Nvidia panicked? Have they been all but forced by themselves to push a product into the mainstream that shouldn't be in the mainstream, in effect to counter any potential competitiveness from other companies? Are they trying to save face before people even realise it needed to be saved? What I am asking is: should the 3090 have existed yet at all? Is this something they wanted to do, but maybe in 6-9 months, just to ensure they have the single fastest GPU covered 100%, without a shadow of a doubt?

The 2080 Ti + 5-10% at $799-899, also known as the 3080 (maybe), may not be what they wanted either. Maybe the whole product stack is a bit of a hot mess.
 
I can't believe I'm seeing posts here about 8 and 10GB being enough for years.

They are either the same people or their descendants, who said the same when Diamond (yeh, I'm going back some distance) brought out a 6MB (lol) card when the norm was 4MB.

"6MB will never be needed", they said.

Yeh, that aged well.
 
So what you are saying is that 2080 Ti users aren't getting an upgrade option (non-RT use), as the 3090 product is a halo item, and therefore should be ignored by anyone who wants to upgrade?

Have Nvidia panicked? Have they been all but forced by themselves to push a product into the mainstream that shouldn't be in the mainstream, in effect to counter any potential competitiveness from other companies? Are they trying to save face before people even realise it needed to be saved? What I am asking is: should the 3090 have existed yet at all? Is this something they wanted to do, but maybe in 6-9 months, just to ensure they have the single fastest GPU covered 100%, without a shadow of a doubt?

The 2080 Ti + 5-10% at $799-899, also known as the 3080 (maybe), may not be what they wanted either. Maybe the whole product stack is a bit of a hot mess.

You are trying to twist my words.

The 3090 is an upgrade, if you want it, for the 2080Ti. I think you would be crazy to "upgrade" but that is up to you. The 3080 will be an incremental upgrade. It's like buying a 1080, then selling it and buying a 1080Ti a few months later. Which, I would guess, loads of people did, and Jen was delighted. Same with the 980 and 980Ti.

The 2080Ti was a halo item too. "Look, here are the RT cards!" and the 2080Ti was £1300+. How you could think that isn't the halo card, I have no idea. You could have had RT by using the 2060.

Yes I think Nvidia have indeed panicked. Hence all of the changes to how they do things. Usually you would have got the 3080 and 3070 first to pull in the suckers, then the Ti or 3090 released later: two paydays. And seriously, if you don't think that is why they usually did it that way... I know plenty here who bought the 1080 and then bought the Ti a few months later. Mostly for their egos, but that is how Jen manages to "play" people, if he even does. He makes them feel inferior with lesser products. Hence why the first thing out of his mouth when he launches a new product is rubbishing the last one and making you feel like you must upgrade. Welcome to the world of sales.

The rumour has been, for ages and ages, that big Navi would perform at least as fast as the 2080Ti. See the "Nvidia killer" thread here, and how old is it?

Nvidia will either know (through spies) or will be able to clearly and easily work out how AMD's products will perform. We can do it ourselves: it's a simple calculation, and we can get a rough idea of the flops. That is why there are so many rumours that put the 3080 and 3090 where they are based on CUDA cores, clocks etc. It's simple mathematics and doesn't require rocket science.
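
To put rough numbers on that: the back-of-the-envelope sum is just CUDA cores x boost clock x 2 (one fused multiply-add per core per clock). Here's a minimal sketch of it in Python; the core counts and clocks are placeholder rumoured figures, not confirmed specs:

    # Rough peak FP32 throughput from rumoured specs.
    # Assumes 2 FLOPs per CUDA core per clock (one FMA).
    # Core counts and clocks below are illustrative placeholders only.
    def theoretical_tflops(cuda_cores, boost_clock_ghz):
        return cuda_cores * boost_clock_ghz * 2 / 1000

    for name, cores, clock in [("rumoured 3080", 8704, 1.71),
                               ("rumoured 3090", 10496, 1.70)]:
        print(f"{name}: ~{theoretical_tflops(cores, clock):.1f} TFLOPS FP32")

Plug in whatever leaked figures you believe and you get roughly the numbers the rumours quote. The same sum works for AMD's stream processors, which is exactly why nobody needs spies to make a decent guess.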

You have to take into account what is happening and how Nvidia are doing this. It's different to anything since the GTX 285, and they launched that to combat the 4890. So they know full well what they are doing, and why they are doing it this way.

As I said, usually it would be the 3080 and 3070 first at high prices, then the Ti at an even higher price. Only, many stopped playing that game because, let's be brutally honest here, they couldn't afford it. This forum has changed SO MUCH since the 10 series days it's not even funny. It used to be such a braggart's paradise, but that all changed with the much higher prices of the 20 series. Many only liked the game because they could buy the highest end card and show off. When that became either uneconomical, or they just didn't deem it worth it because they had grown an ounce of common sense, it stopped. It did not, however, stop those who could afford it and don't post on forums from buying it.

I'm not trying to upset anyone, BTW. I just speak how I see it, and how it is. I am brand neutral to the extreme, and I don't care who is giving me what I want so long as I can get it at the price I want it at.
 
I can't believe I'm seeing posts here about 8 and 10GB being enough for years.

They are either the same people or their descendants, who said the same when Diamond (yeh, I'm going back some distance) brought out a 6MB (lol) card when the norm was 4MB.

"6MB will never be needed", they said.

Yeh, that aged well.

I'm currently playing The Last Epoch and at 1440p that game eats up around 6GB of VRAM, and after an hour or so it maxes out my VRAM buffer of a measly 8GB. Now people can dismiss it as an early access title all they want, it doesn't change the fact that a game I enjoy is eating up all my VRAM :P. Thank god for HBCC. I would prefer my next card to have at least 14-16GB of VRAM.
 
I can't believe I'm seeing posts here about 8 and 10GB being enough for years.

They are either the same people or their descendants, who said the same when Diamond (yeh, I'm going back some distance) brought out a 6MB (lol) card when the norm was 4MB.

"6MB will never be needed", they said.

Yeh, that aged well.

There is a 20GB version of the 3080 as well.
 
I can't believe I'm seeing posts here about 8 and 10GB being enough for years.

They are either the same people or their descendants, who said the same when Diamond (yeh, I'm going back some distance) brought out a 6MB (lol) card when the norm was 4MB.

"6MB will never be needed", they said.

Yeh, that aged well.
Yeah it's ridiculous. Might not upgrade this gen, but when I do it'll be 16 GB or nothing.
 
It's obvious why Nvidia killed SLI.
What was the first thing we used to do when Nvidia released a new gen of cards? We'd have that thought... Hmmm, do I get a new card, or buy another second-hand "1080Ti" and go SLI?

Well, by killing SLI they put that debate to bed. A new card is your only option. Mo money for Nvidia!

Or maybe it's just because SLI sucks and game developers don't want to support it anyway
 
Massive hole though between the 8GB 3070 and the 20GB 3080 (assuming nobody is mad enough to buy the 10GB 3080).

However, if Nvidia is allowing double VRAM on the 3080 then they can easily do the same on the 3070 too and have 16GB 3070 models.

But for now I don't want a downgrade in VRAM - a 1080Ti or 2080Ti owner has no upgrade other than the 3090 until the higher-VRAM models of the 3080/3070 arrive.
 
There is a 20GB version of the 3080 as well.

There is not. It's just a rumour.

Remember, this is always true: on lower end, lower priced products there is ALWAYS a catch. Always. In this case it is clear to see that the 3080 "only" has 10GB VRAM. Less than the 2080Ti.

We are already seeing current gen games use 9.5GB+ at 4K. Work it out. There is your catch. It will last just long enough until Jen pops his head around the door again.

This is what I meant when I talked about linear railroading. It's to make you stop and think, and realise that you simply have to get the top end card. They are not stupid.
 
I'm not trying to upset anyone, BTW. I just speak how I see it, and how it is. I am brand neutral to the extreme, and I don't care who is giving me what I want so long as I can get it at the price I want it at.

Not sure why you'd upset anyone, it is a computer forum to discuss computer related things.

Yes I think Nvidia have indeed panicked. Hence all of the changes to how they do things.

I had this feeling too, it seems to be somewhat rushed and disjointed, the product stack as leaked just doesn't make sense.

The 2080Ti was a halo item too. "Look, here are the RT cards!" and the 2080Ti was £1300+. How you could think that isn't the halo card, I have no idea. You could have had RT by using the 2060.

IMO I don't think it was, I think it was an attempt to test the waters, and make hay while the sun shone, with no real high end competition to challenge them. That's the beauty of opinions though, you don't have to agree.
 
However, if Nvidia is allowing double VRAM on the 3080 then they can easily do the same on the 3070 too and have 16GB 3070 models.

But for now I don't want a downgrade in VRAM - a 1080Ti or 2080Ti owner has no upgrade other than the 3090 until the higher-VRAM models of the 3080/3070 arrive.
I'm wondering if they will be available at launch, or if nV will be total ducks and only release the half-memory cards at first, with the double-memory cards 6 months later (if at all).
 
This is what I meant when I talked about linear railroading. It's to make you stop and think, and realise that you simply have to get the top end card. They are not stupid.
Assuming no competition.

This strategy will backfire massively if AMD gives more memory and therefore longer lasting cards.
 
This strategy will backfire massively if AMD gives more memory and therefore longer lasting cards.

See the RX 580/RX 570 8GB cards: they are going to be around way longer than the GTX 1060 3GB, yet they cost more at the time. It's a funny old world sometimes.
 
Not sure why you'd upset anyone, it is a computer forum to discuss computer related things.

Good! I am chuffed that the place has changed (and it has! Or I wouldn't be posting here). It's nice to talk to people who understand where you are coming from without getting their knickers in a twist.


I had this feeling too, it seems to be somewhat rushed and disjointed, the product stack as leaked just doesn't make sense.

It's quite complicated really, but ultimately quite simple. Jen is on an all-time high. If he gets egg on his face, it takes much more than an extra release to put it right. Remember, as a share-driven company he MUST remain far ahead to keep those prices going up. As soon as he cocks up, they go down.

IMO I don't think it was, I think it was an attempt to test the waters, and make hay while the sun shone, with no real high end competition to challenge them. That's the beauty of opinions though, you don't have to agree.

I think they saw the Xbox Series X and **** themselves. I truly do. The TFLOPS were published, they looked at the core count, and they absolutely crapped a biggun. And that is why the 3090 turned up shortly after that.

Remember, Nvidia have had troubles: Samsung, delays, Turing etc. If AMD have not? Yeah, Nvidia could be in for a big surprise...

Foxeye - yes indeed, assuming no competition. However, if he assumed big Navi wasn't going to be competition? Oh boy, I think he would be in for a shock. Even if big Navi isn't as good as he suspects, I think it will still be better than he originally thought. Or he could realise that Ampere has been a PITA, and AMD may have just caught up a little bit (or caught up on Nvidia's problems).

Memory isn't the be-all and end-all unless you run out, like with the Fury X. That was a horror show: all of a sudden you start streaming from a paging file on your rusty hard drive. However, again, that is all about to change with the introduction of NVMe PCIe 4.0 storage in the consoles. Which may be why they have less than I thought? Or maybe they simply don't need it when the coding is "to the metal" like that and they won't have the bonkers settings PC games have? IDK.

I got rid of my 4K monitor after wasting three grand on GPUs: first three Titan Blacks, then two Fury Xs. Got rid of it and got a 1440p, and just recently I "upgraded" to another 1440p and a 2080Ti. This one is curved, and is 144Hz instead of 70Hz. That is more than enough for me, the 2080Ti is more than enough for that, and so on.

Getting on the 4K train was the daftest thing I ever did.

Edit. BTW just to cover one thing I missed.

The 2080Ti *never* dropped 1p in price. Not ever. We saw cheaper models, but they were just that - Palit and co.

Not even when it was getting long in the tooth did that price drop, the way it does with every other card in the stack. Just bear that in mind.
 
It doesn't need to be released; the spec is out and has been confirmed.

I said pages ago that anyone with a 2080Ti should not buy the 3080. By the time you take the huge loss on the 2080Ti and buy a 3080, the raw performance could be 15% better at best, and would not be worth the price premium. However, you are forgetting two things here: RTX and DLSS. Both of which the 3080 should be *far* better at, if they float your boat.

Personally I don't like RT as it sits, so I really don't care.

I just realized that Turing got a lot of people to settle for tiny performance gains...except when it comes to RT performance.

For some reason, people expect to get a lot more RT performance from one gen to the next. I wonder how people will react if Ampere gives tiny performance increases all around.
 