What could have been

Seeing that Nvidia are now releasing a mainstream GK110 part, it makes me wonder what shape things would be in if they had of planned and released these GK110 (780 and Titan) parts as a 680 and 670, with the outgoing 680 and 670 as the 660 Ti and 660.

Just something to think over while we wait for the official release of the new cards
 
Depends on price, but that kind of performance available for £550 would have made a big difference. Never Settle / 12.11 would have been minor as opposed to game-changing.

To me the biggest point of interest now is what AMD releases.
 
To be honest, that's what Nvidia should have done in the first place. However, they got greedy and just released cards that slightly beat AMD's (like Intel resting on their laurels at the moment). Then AMD managed to pull extra performance out of their drivers and retake the lead, so Nvidia screwed themselves (part of me suspects AMD released underperforming drivers on purpose to engineer this situation).
 
It is almost certainly what Nvidia planned to do; however, they clearly couldn't produce GK110 in the kind of quantities necessary to do that.

They couldn't believe their luck when they saw their mid-range part could match the 7970 with a decent overclock.
 
Seeing that Nvidia are now releasing a mainstream GK110 part, it makes me wonder what shape things would be in if they had of planned and released these GK110 (780 and Titan) parts as a 680 and 670, with the outgoing 680 and 670 as the 660 Ti and 660.

Just something to think over while we wait for the official release of the new cards

Nothing to think over; the GK110 was never going to be the 680.
 
It is almost certainly what Nvidia planned to do; however, they clearly couldn't produce GK110 in the kind of quantities necessary to do that.

They couldn't believe their luck when they saw their mid-range part could match the 7970 with a decent overclock.

???

The 680 was the top-end part because the one they wanted to use wasn't able to be produced...

You make no sense.
 
Depends on price, but that kind of performance available for £550 would have made a big difference. Never Settle / 12.11 would have been minor as opposed to game-changing.

To me the biggest point of interest now is what AMD releases.

Yep, they have a new high-end target to beat, at a cheaper price. The 780 at this price does not fit into the freak category.

I can see them moving pretty quickly to snuff out NV's edge here, especially with Maxwell not due until 2014. It would make a lot of sense for them to grab the mid to mid-upper section on price/performance as they did with the 7850/7950. If they get a range out in, say, October, that could give them six months or so of beating NV on bang for buck.
 
Nothing to think over; the GK110 was never going to be the 680.

You sure about that?

I am certain that right back at the beginning of the design phase the big chip was going to be the top-tier part, just as it has been for the last god knows how many series of cards.
Of course, somewhere along the way ideas changed; whether this was down to unmanufacturability, mid-range performance projections or the colour of the boardroom wallpaper we will probably never know.
 
Seeing that Nvidia are now releasing a mainstream GK110 part, it makes me wonder what shape things would be in if they had of planned and released these GK110 (780 and Titan) parts as a 680 and 670, with the outgoing 680 and 670 as the 660 Ti and 660.

Just something to think over while we wait for the official release of the new cards

"Had have", rather than "of".

It's unrealistic though; I think them using GK110 now was a bit of a knee-jerk reaction.

nVidia desperately wanted to get their top-end gaming chips down to smaller, more manageable sizes, which would slash production costs and increase profits greatly.
 
I can see them moving pretty quickly to snuff out NV's edge here, especially with Maxwell not due until 2014.

Personally I cannot see AMD being in a position to give us anything on a new process until very late in the year, probably December time. It will all depend on TSMC.
I suppose at least with these Nvidia refreshes they are covered for a while longer if TSMC do have problems.
 
It is almost certainly what Nvidia planned to do; however, they clearly couldn't produce GK110 in the kind of quantities necessary to do that.

They couldn't believe their luck when they saw their mid-range part could match the 7970 with a decent overclock.

You sure about that?

I am certain that right back at the beginning of the design phase the big chip was going to be the top-tier part, just as it has been for the last god knows how many series of cards.
Of course, somewhere along the way ideas changed; whether this was down to unmanufacturability, mid-range performance projections or the colour of the boardroom wallpaper we will probably never know.

GK110 was never going to be the GTX 680. Never.

nVidia very clearly wanted to split GeForce cards from CUDA/Compute usage.

This is why Kepler is heavily cut down when it comes to compute performance compared to Fermi.

nVidia realised that people were choosing GeForce cards over Quadro and Tesla cards.

It's one of a number of reasons why they chose to split it off the way they did, and why GK110 was never ever going to be the GTX 680.
 
GK110 was never going to be the GTX 680. Never.

nVidia very clearly wanted to split GeForce cards from CUDA/Compute usage.

This is why Kepler is heavily cut down when it comes to compute performance compared to Fermi.

nVidia realised that people were choosing GeForce cards over Quadro and Tesla cards.

It's one of a number of reasons why they chose to split it off the way they did, and why GK110 was never ever going to be the GTX 680.

Have you got a link to the Nvidia statement regarding this?
 
GK110 was never going to be the GTX 680. Never.

You seem very certain; do you have any evidence to support this theory, or is it just that?

Of course, no problem if it is just your theory; you are quite right that it could be how it was planned all along, despite going against the trend of the last few generations.

Either theory could be correct, but as above we will probably never know.
 
Where's your link to the nVidia statement saying that GK110 was originally supposed to be the GTX 680?

It is not me stating anything as fact. What statement do I need to give proof of?

You made a comment as fact; I am interested in reading the official Nvidia version of it. Why get so defensive about it?
 
You seem very certain; do you have any evidence to support this theory, or is it just that?

Of course, no problem if it is just your theory; you are quite right that it could be how it was planned all along, despite going against the trend of the last few generations.

Either theory could be correct, but as above we will probably never know.

It's about the technical aspects of what they've done with each GPU, and how and where they've crippled performance.

The make-up of the GK104 chips is very telling really, especially when compared to previous families.

I made quite a few large technical posts about it a little while back; I'll see if I can dig them out.

It is not me stating anything as fact. What statement do I need to give proof of?

You made a comment as fact; I am interested in reading the official Nvidia version of it. Why get so defensive about it?

I'm not being defensive; it just seemed like an odd comment to make when you probably know nVidia don't talk about this kind of thing openly.
 
Yep, looking at Gibbo's comments in the MSI thread, AMD could basically release the 9*** series any time they like from now onwards, which must mean it's still on 28nm, not 20nm as had been suggested.
 