
Nvidia Ampere might launch as GeForce GTX 2070 and 2080 on April 12th

Sure, they might have got another couple of percent out of the Veyron by chucking a load more money at it, but that would have been well into diminishing returns. There is a vast gap between that and an upper 400 to 500 mm² Kepler core, never mind trying to push above 600 mm², where you start to incur ridiculous costs, if you can make it viable at all.


I'm pretty sure that if Nvidia had gone with a bigger die at the time of the 680 launch, it would have cost more money and there would have been less supply. The principle remains that the flagship is the best actually available product, not some fictional 'best possible' one.
 
Nvidia are really in it for the cash these days, aren't they? I'm not getting any more cards from them until they price them right.
The £££££ are getting beyond a joke.
But they are priced right. They're a business, not a charity, and they sell out at current prices.
I know as a consumer it sucks, but their pricing is spot on, all things considered.
Should mining pressures ease off, that may help a bit to bring prices down.
Also worth looking at FE cards IMO. The prices of those are not supply/demand driven: the 1080, 1070, etc. are either at their release prices or lower, though unfortunately they're hard to get hold of ATM. For example, no 1070 Ti is worth £600+ when the FE card is £419.99 including delivery.
 
That isn't really something to advocate - incrementing performance by, say, 5% and calling it the flagship, job done, is not something any consumer should ever defend.

The 680 not only had a decidedly mid-range core size, it had decidedly mid-range memory specs; the electronics on the board were all pretty low budget compared to the previous generation's high end (which was sold as the x80), and even the codename of the core was the mid-range one, as used in the previous three generations for the top of the mid-range line.

(It is also fairly well documented that the 680 was originally going to be the 670 Ti.)

This. Anyone calling the 680 a high-end card knows nothing about GPUs. It was also the bait and switch Nvidia used to increase the price of their high-end chip and make people believe there was a new performance tier. There wasn't; it was just the mid-range cards renamed to x80 and x70, while the high-end chip became the Titan and x80 Ti.
 
This. Anyone calling the 680 a high-end card knows nothing about GPUs.

Anyone not accepting that the 680 was the 'flagship', 'high end' Nvidia GPU at launch doesn't properly understand the English language! You don't get to redefine words to suit your own weird world view and not get called on it.


It was a new performance tier, as it was an improvement on the outgoing GTX 580. That Nvidia, for marketing reasons, decided to bring in the Titan and Ti cards to delineate two new performance brackets above the xx80 and xx70 in a given series is irrelevant to whether the 680, at launch, was a high-end flagship card, as Nvidia didn't have any Ti or Titan cards out at the same time.

The 680, at launch, was the high-end Nvidia flagship GPU, end of.

We don't rate GPUs by their memory bus width or die size (unless you are some sort of weirdo with a Freudian-esque size obsession); it's their ability to render as much as possible, as quickly as possible, that matters.

Hence whatever Nvidia consumer card can, at a given time, render the quickest is by definition their high-end flagship card (albeit the 'high end' part is dependent on what else is on the market at a given time).

The ATI HD 2900 XT was ATI's flagship consumer GPU at launch, with a wide memory bus and a big die with lots of transistors (more than the 8800 GTX Ultra), and it sucked because it performed badly against the competition at launch. That is not true of the 680. Die size and memory bus width are not, in and of themselves, relevant to whether a card is high end or a flagship; it's the performance, and pretty much only the performance, that matters as a qualifier for a card being 'high end' or a flagship.

(You could, correctly, say that a company produces no 'high end' products, as that is a comparison with other products on the market, unlike 'flagship', which only refers to products from the same company. For example, you might say that AMD currently do not produce any high-end gaming/consumer GPUs: Vega 64 is their 'flagship' but is not performance-competitive for gaming against Nvidia's flagship.)
 
I would suggest the position a particular card takes depends on where the card fits on the company's roadmap for its products, not the release date.

If you want to qualify it and say retrospectively that, for example, the GTX 970 is a mid-range card in the 900/Maxwell line-up, that's fine.

But at a given time a company's flagship is its fastest/best-performing product, by definition.
 
Anyone not accepting that the 680 was the 'flagship', 'high end' Nvidia GPU at launch doesn't properly understand the English language! You don't get to redefine words to suit your own weird world view and not get called on it.


It was a new performance tier, as it was an improvement on the outgoing GTX 580. That Nvidia, for marketing reasons, decided to bring in the Titan and Ti cards to delineate two new performance brackets above the xx80 and xx70 in a given series is irrelevant to whether the 680, at launch, was a high-end flagship card, as Nvidia didn't have any Ti or Titan cards out at the same time.

The 680, at launch, was the high-end Nvidia flagship GPU, end of.

We don't rate GPUs by their memory bus width or die size (unless you are some sort of weirdo with a Freudian-esque size obsession); it's their ability to render as much as possible, as quickly as possible, that matters.

Hence whatever Nvidia consumer card can, at a given time, render the quickest is by definition their high-end flagship card.

Sure, if you want to be pedantic about the meaning, then the 680 was the best card out at the time. But people who know GPUs know that it's really a mid-range card made from a mid-range chip, which is why die sizes etc. are important. To use a car analogy: if BMW had a manufacturing fault, couldn't release the 7 Series, and renamed the 5 Series to the 7 Series, sure, that's now the flagship car, but nobody who knows anything about cars would accept it. They would be asking where the real 7 Series was. That's what happened when the 680 launched: people were asking where the full-fat Kepler was, and why the mid-range part was being used for the 680.
 
This. Anyone calling the 680 a high-end card knows nothing about GPUs. It was also the bait and switch Nvidia used to increase the price of their high-end chip and make people believe there was a new performance tier. There wasn't; it was just the mid-range cards renamed to x80 and x70, while the high-end chip became the Titan and x80 Ti.

Is correct.
 
Sure, if you want to be pedantic about the meaning, then the 680 was the best card out at the time. But people who know GPUs know that it's really a mid-range card made from a mid-range chip, which is why die sizes etc. are important. To use a car analogy: if BMW had a manufacturing fault, couldn't release the 7 Series, and renamed the 5 Series to the 7 Series, sure, that's now the flagship car, but nobody who knows anything about cars would accept it. They would be asking where the real 7 Series was. That's what happened when the 680 launched: people were asking where the full-fat Kepler was, and why the mid-range part was being used for the 680.

Yep. As was mentioned earlier, the naming scheme was changed at the 680 to denote the top flagship GPU when it wasn't anything of the sort.

The 480 and 580 were the top, full-fat-chip flagship cards; the 680 wasn't the full-fat top-end chip.

It may have been the best card on release, but it wasn't the flagship chip. 'Best at the time' and 'flagship' are two different things.
 
Sure, if you want to be pedantic about the meaning, then the 680 was the best card out at the time. But people who know GPUs know that it's really a mid-range card made from a mid-range chip, which is why die sizes etc. are important. To use a car analogy: if BMW had a manufacturing fault, couldn't release the 7 Series, and renamed the 5 Series to the 7 Series, sure, that's now the flagship car, but nobody who knows anything about cars would accept it. They would be asking where the real 7 Series was. That's what happened when the 680 launched: people were asking where the full-fat Kepler was, and why the mid-range part was being used for the 680.

By pedantic you mean correct, right?

Your car analogy is flawed: a 7 Series BMW is a 7 Series due to its physical size compared with contemporary BMWs (i.e. the 7 Series is bigger than a 5 Series out at the same time), so BMW could not sell a 5 Series as a 7 Series without inserting a new intermediate-sized car as a new 5 Series.

You can also buy a 7 Series BMW with a smaller engine than another 5 Series.


The physical size of a car actually matters to a consumer. The memory bus width and die size don't, in and of themselves, matter. Just look at the 2900 XT vs the 8800 GTX: the latter had a smaller memory bus and fewer transistors but better performance.
 
It may have been the best card on release, but it wasn't the flagship chip. 'Best at the time' and 'flagship' are two different things.

By the definition of the word it was a flagship. You don't get to make up your own definitions for words and not have it pointed out that you are talking nonsense:

flagship
/ˈflaɡʃɪp/
noun
  1. the ship in a fleet which carries the commanding admiral.
    • the best or most important thing owned or produced by a particular organization.

At launch the 680 was the best consumer GPU on offer from Nvidia and as such, by definition, was their flagship consumer GPU at the time. Even using the same chip, the only faster card was the 690, which was just two 680s strapped onto one card.
 
By pedantic you mean correct, right?

Your car analogy is flawed: a 7 Series BMW is a 7 Series due to its physical size compared with contemporary BMWs (i.e. the 7 Series is bigger than a 5 Series out at the same time), so BMW could not sell a 5 Series as a 7 Series without inserting a new intermediate-sized car as a new 5 Series. The physical size of a car actually matters to a consumer. The memory bus width and die size don't, in and of themselves, matter. Just look at the 2900 XT vs the 8800 GTX: the latter had a smaller memory bus and fewer transistors but better performance.

If you want to live by the strictest definition of the word and not the spirit, what a sad world it is if that makes you happy.

You are comparing two different processes and two different architectures with the 2900 XT and the 8800 GTX. Comparing apples and oranges.

You and I will never agree on this.
 
I agree it is sad... that people are so bothered about the size of the die or the memory bus width. Look at the performance and the price to evaluate GPUs against the competition, be that other companies or the last generation from the same manufacturer. I take it you have come around to my way of thinking?
 