Nvidia say they're underwhelmed by 7970

When AMD launched the fastest single card, they priced it above the Nvidia competitor.

Nvidia won't kill their business model: if the 'GTX660' performs similarly to the GTX580 and 7970, it will be priced similarly. Nvidia have plenty of other cards to sell at lower prices.

Prices will drop slowly over time as each vendor balances sales, stock, and the new parts they have lined up for launch.

AD
 
I'm sure that what you say would be great for a lot of us (key word: if), but I can't help thinking this is beyond wishful thinking.

Yes, but if we're going by rumours, then it'd be the case.


True, or they may decide to take the GTX 580 and 570 out of production completely and let the 660 fill the gap. This wouldn't be a really good strategy for them, but if it seriously dents AMD, then they may do it.
 
I agree. If a £200 card beats or rivals a £500 card, that is not good news for AMD.

Unless, of course, they do what they have always done for at least the last five years and price it above £300. If Nvidia have the fastest single-GPU card, they will charge for it. Even if it rivals an HD7970 and does not quite beat it, they will look at what's available at the time and price it accordingly. That is, again, what Nvidia have done.
 

The other alternative is that lower-end GK104 bins take the place of the cards under £300. The GTX 285, for example, was around £300 to £350, and it was the cheapest single-GPU card Nvidia had while they held the fastest-single-GPU title.
 
Arrogant as ever.

Oh, and 2GB VRAM? Tight gits.

These cards are supposed to be able to run three monitors.

Still, as the saying goes: treat 'em mean, keep 'em keen.

If your “arrogant as ever” comment is directed at Nvidia, I put it to you to go and reread the article this thread is about.
Not once in that article do Nvidia say anything about AMD. All we have is the author of the article relating his opinion and his feelings after meeting with Nvidia; there is nothing but rumour, and even then he doesn't link to his source for the leaked specs. Going to the site he mentions, “inpai.com.cn”, the first thing I noticed was an English-version button at the top of the page, which takes you to http://en.inpai.com.cn/. There is no mention there of the article the author was talking about; I wouldn't be surprised if it was just a post on their forums.
If, of course, the “arrogant as ever” comment wasn't directed at Nvidia, then please ignore this post and accept my apologies.
 
John Logie Baird was the first to invent the TV. Alexander Graham Bell was the first to invent the telephone.

Why, then, do the Scots not make the best phones or tellies?

The only one making zilch sense is you.

Except these aren't the same at all.
My post was in regards to a blanket statement.

Had he actually posted what he meant, I'd have agreed.
While AMD have held the single-GPU crown at certain times, it's because they've had their GPU out first (the 5XXX and 7XXX series); however, in the 5XXX instance Nvidia one-upped them with their launch, and it looks like the 7XXX will get one-upped too.

However, in the instance where Nvidia got their 5XX out before AMD's 6XXX, Nvidia's 580 still beat out the 6970.

So I'd have to agree that Nvidia, when it comes to single-GPU performance, are ahead of AMD, but on everything else not so much.
 
When 99% of gamers use a single monitor of 1920x1200 or less, why is 2GB tight? At this resolution, my GTX580 with its meagre 1.5GB has never caused me problems.

Let's see how the cards perform with games, and at resolutions, that the vast majority of people use. 3GB is unnecessary for the vast majority who own 7970s, and just adds to the cost. That is why AMD will soon release a 1.5GB version, and for the same reason 1GB 6950s were made.

If GK104 is released and turns out to be faster, more power-efficient, and cheaper than the 7970, at least AMD fans can cling to the fact that they have more VRAM, even if it does sit idle.
 
Even at 2560x1600, 2GB is plenty. People have just gone crazy, talking out of their bums as far as VRAM is concerned. GPU power is more important.
 

I'm not talking about the here and now; I'm talking about the future. No one buys these cards based on what they can do with games that are already out (well, apart from the very recently released BF3). When I got my 470 it had a huge amount of VRAM. Fast forward a year and the card had outdated itself by not having enough VRAM, and I had to replace it.

BF3 has shown that if we want graphics to progress, it's going to involve huge textures, which eat into VRAM. Also, resolutions are on the rise, and whilst it won't happen overnight, eventually we will all be using 1600p, as manufacturers don't like making loads of products that won't sell.

I.e., just because someone decides to be nostalgic in two years' time and wants a 1080p monitor, it doesn't mean there will be any to buy. If there are, they will probably be made in small numbers and will thus be expensive.

Not only that, but one of the aims of Kepler is to add three-screen gaming (Surround) on a single card, like AMD have with Eyefinity. 2GB is the bare minimum you will need to throw a game around at those resolutions.
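As a back-of-the-envelope check on the resolutions being argued about here: the raw framebuffer at any of these resolutions is actually tiny, so the real VRAM pressure at triple-screen resolutions comes from textures and render targets, not the display buffers themselves. A minimal sketch (assuming 32-bit colour and triple buffering, which are my assumptions, not figures from the thread):

```python
# Raw framebuffer footprint for the resolutions discussed in this thread.
# Assumes 4 bytes per pixel (8-bit RGBA) and triple buffering; actual VRAM
# use is dominated by textures and render targets, not the framebuffer.

BYTES_PER_PIXEL = 4
BUFFERS = 3

def framebuffer_mb(width, height, monitors=1):
    """Framebuffer size in MB for a given resolution and monitor count."""
    return width * height * monitors * BYTES_PER_PIXEL * BUFFERS / 1024**2

print(f"1920x1200 single : {framebuffer_mb(1920, 1200):6.1f} MB")
print(f"2560x1600 single : {framebuffer_mb(2560, 1600):6.1f} MB")
print(f"1920x1080 triple : {framebuffer_mb(1920, 1080, monitors=3):6.1f} MB")
```

Even triple-screen Surround only costs on the order of 70MB of framebuffer; the 2GB-versus-3GB argument is really about how much texture data games will stream in at those resolutions.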
 
Indeed, I was looking for a couple of 1600-resolution monitors the other day (to run in portrait-mode Eyefinity with my 30-incher) and found a Dell 4:3 monitor at £334, because it is now the oddball and is priced according to the cost of making a limited number of them compared to the bucketloads of 16:9 screens. Similarly, there has been a strange near-total withdrawal of the 16:10 laptop screens of 4-5 years ago (I can now only find Apple MacBook Pros with them).
 
So what is best, then: a "GTX760" with 2GB, or a 4GB version costing ~£50 more? Surely Nvidia starting with 2GB is the best option, and I am sure AMD would have done the same if they had a 256- or 512-bit memory bus.
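The bus-width point is why the capacities land where they do: with 32-bit GDDR5 chips, the bus width fixes the chip count, and chip count times chip density gives the "natural" VRAM sizes for a card. A rough sketch of that arithmetic (the 1Gbit/2Gbit densities are my assumption about the parts common at the time, not something stated in the thread):

```python
# Bus width -> number of 32-bit GDDR5 chips -> natural VRAM capacities.
# Assumes one 32-bit chip per channel and 1 Gbit or 2 Gbit chip densities.

CHIP_DENSITIES_GBIT = (1, 2)

def natural_capacities_gb(bus_width_bits):
    """VRAM sizes (GB) a card with this bus width naturally supports."""
    chips = bus_width_bits // 32
    return [chips * density / 8 for density in CHIP_DENSITIES_GBIT]

for bus in (256, 384, 512):
    print(f"{bus}-bit bus -> {natural_capacities_gb(bus)} GB")
```

This is why the 384-bit GTX 580 and HD 7970 ship with 1.5GB and 3GB respectively, while a 256-bit card naturally gets 1GB or 2GB.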
 
Yeah. It would be crazy to make a mid-range 256-bit memory bus card with 4GB as the baseline on a what-if basis. It would be an unnecessary expense for the majority who buy mid-range cards. By the time games come out that actually need 3GB-4GB, the GTX 760 will likely lack the GPU power to justify the extra RAM.
 
It's hardly a what-if, really.

In the space of two years we have gone from 512MB being overkill to 1.3GB not being enough. It doesn't take rocket science to figure out where we are headed.

And rightly so, IMO. Graphics have been moving along so slowly thanks to the 360, but when the next one releases you will see a flurry.

Game makers are just as competitive as GPU makers. They want their games to look better because, unsurprisingly, their audience consists of fickle, shallow gamers who seem to think that graphics are everything.

Duke Nukem Forever said it all when it was released. Even the supposed "true gamers" who reviewed it all picked it apart based on the graphics. Absolutely hilarious.
 
Are you joking, man? 512MB was overkill two years ago? Two years ago was January 2010, and 512MB was well below what was required then.

And it is a what-if. Just because BF3 needs 2GB for 1600p doesn't mean a mid-range card should have 4GB rather than 2GB as its baseline. There may very well be many games in the next few years that don't even require 2GB, with just a couple more that do; it's impossible to say. And even if that's not the case, you don't go sticking 4GB on a mid-range card. Their high-end cards will have 3GB, and that's what people gaming at high resolutions in future games will be aiming for. There has to be logic behind pairing GPUs with VRAM; it's not a matter of sticking on the maximum VRAM possible to support all resolutions in some possible future game.
 