** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

I think you should get used to gains not always being as big as before going forward. We'll get the odd time when major architectural changes coincide with a new node shrink or something, but otherwise, the improvements people have gotten used to are just getting much harder and more expensive to find.

In terms of the 1070 not being 20% more than the 980Ti like the 670 was over the 580:

1) The 1070 is more 'cut down' than the 670 was.

2) Pascal is not the same leap that Fermi->Kepler was. I think Pascal is more than just 'shrunk Maxwell' (you don't get these sorts of clock gains for free), but it's certainly not a major architectural change either. Pascal was only inserted into the roadmap a couple of years ago, whereas Maxwell and Volta were the main architectural 'revolutions' they were working on.

In terms of Nvidia/AMD 'milking' anything, do you think Nvidia just has Volta sitting on a shelf somewhere? Pascal was brought in *because* Volta is not ready and won't be for a little while. The alternative was that they stay on 28nm Maxwell until Volta was ready. You'd have been OK with that?

And neither Nvidia nor AMD can do anything about the length of time it takes for the next process shrink to come about. I'm sure Nvidia were originally planning to release Maxwell alongside 20nm way back when they first started working on it, but obviously those plans were scuppered. Releasing Maxwell on 28nm wasn't 'milking' anything; it was them doing what they could to release an improved product, which is what customers expect.

As I said, improvements are becoming harder and more expensive to come by. Moore's Law is slowly coming to an end, and that will continue until some serious breakthroughs are made.


The thing is, Nvidia mostly makes money from GPUs, and their margins have nearly doubled over the last four years or so, so the strategy is very sound financially, and kudos to Nvidia for managing people's expectations. From their perspective it's done very well for them, and hats off that they managed to make it work, but it is not so good for consumers.

The thing is though, if you look at it in a basic view, the 314mm2 GP104 is not doing that badly, but the pricing structure and product segmentation is just out of whack.

But the problem is that it is screwing over the sub-£250 market in the process.

Look at the last JPR report:

https://jonpeddie.com/images/uploads/news/graph-pr-2rev2.png


The Enthusiast market is above $300 (or around £200 in our money), and according to reports a while back the market above $449 (£310) is much smaller. This means that by its very pricing the GTX1070 is more niche.

Most cards sold are still under $300, but more people are spending $300+ on cards.

This is because the sub-$300 market has had such rubbish improvements that people are forced to spend more and more on cards by moving to the tier above.

I could even understand it if these were the straight 20nm node cards we might have had last year, but Nvidia is using second-generation TSMC 16nm and AMD second-generation GF/Samsung 14nm. These are basically third-generation 20nm-based processes.

I am now getting a bit worried about what Polaris 10 and the GTX1060 will bring to the table. I expect they will be £250 and around R9 390X/GTX980 level, which will make them the same price/performance as the GTX1070.

Sadly it also means only a 10% to 20% improvement over their predecessors. This means that instead of spending the same to get a performance improvement, people will now have to go to the segment above, again.

Realistically, only the launch of both the GTX1080TI and Vega 10 will make things more competitive, i.e. a full product stack from top to bottom.

Maybe we will need to agree to disagree and leave it at that, but I am just getting more and more disappointed by what we are seeing so far from Pascal and Polaris.

I am sure others will find the GTX1070 more exciting.
 
If we're talking 1080, yea, it's a bit extreme.

The 1070 though - it's cheaper than the 670 and 770 before it. It looks like a price hike compared to the 970, but the 970 was priced aggressively. Notice that the 970 *never* got a price cut and still costs today what it did the day it released? That's because it was basically price-cut from the get-go and remained good value for nearly two years.

I fully expect the 1070 will cost about what a 970 does within 6-9 months. Especially if AMD releases a highly competitive baby Vega or if P10 manages to surprise with more power than you or I are currently expecting.
 

The GTX670 was roughly the same dollar price as the lower end of the GTX1070 range, so that 20% improvement is what the GTX1080 got, at a stupid price increase.

The equivalent of the GTX1070 with Kepler was the GTX660TI at around $300, which was slightly faster than a GTX580 and had more VRAM.

I also agree with the last bit - we probably need to wait until the whole product stacks from AMD and Nvidia are launched to get better price/performance.
 
Once again man - we can't expect performance gains to always be what they were before. And you have to look at where they are in their architectural roadmaps. The 600 series was both a node shrink and a major architectural jump. Pascal is a more interim, minor architectural improvement because Volta isn't ready.

These companies are spending *more* money for *less* improvement. They certainly aren't going to cut prices for consumers just because Moore's Law is slowing down. Just not how it works.

But as I said, I think the 1070 will have some built-in margin for future price cuts. The 1080 probably has a huge margin. How low they go will depend on the competition and where GP104 and GP100 fit in.
 

The problem is people say that, but Nvidia has nearly doubled its margins in the last five years and is making record amounts of dosh, as I mentioned before.


The thing is, if AMD starts being better organised and does better marketing, I expect they will follow suit, and they have already made comments to that effect.

I think the only real way to get a decent price/performance jump is probably to wait until both companies release their entire range, including large-die GPUs.

Maybe with Pascal and Polaris being more evolutionary uarchs, it won't be such a great generation for a while! :(

Like I said we probably need to agree to disagree and leave it at that.
 
GALAX GeForce GTX 1070 HOF and GAMER spotted



http://videocardz.com/60651/galax-geforce-gtx-1070-hof-and-gamer-spotted
 
That explains why you notice when your frame rate drops below 60fps. With Vsync on, when you drop below 60fps, your frame rate actually drops to 30fps. Which is very noticeable.

I'm not sure about that. The FPS counter doesn't show that the FPS count is 30.
It shows 59-58, etc...
I notice FPS drops even without V-Sync...
 
This is people fundamentally not understanding how vsync works. A 60Hz monitor only refreshes every 16.6ms (1000ms per second divided by 60 refreshes per second). If it misses a refresh because a frame takes 16.7ms or more to render, vsync holds that same frame for the full 33.3ms, which is equivalent to 30fps. Your frame counter counts the actual number of frames being rendered, but the monitor works on fixed intervals of 16.6ms. In a situation where you are averaging 58fps, it means some frames are being rendered in under 16.6ms and some over, so the monitor is effectively flip-flopping between 60fps and 30fps. Once you drop below about 50fps in game, you are effectively missing every 16.6ms interval and the game is being displayed on your monitor at 30fps, because pretty much every frame is being held over for 33.3ms.

Basically vsync is evil and should be burned with fire.

And FPS counters are actually not a very good method of determining smoothness.
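
To illustrate the arithmetic above, here's a minimal sketch (mine, not from the thread) of idealised double-buffered vsync on a 60Hz panel: a finished frame goes on screen at the next 16.67ms refresh boundary, so any frame that takes longer than one interval to render holds an image for 33.3ms or more. The frame times and function name below are made up purely for illustration.

import math

REFRESH_MS = 1000.0 / 60.0  # one 60Hz refresh interval, ~16.67ms

def refreshes_held(render_times_ms):
    """How many 16.67ms refresh intervals each frame occupies on screen
    under idealised double-buffered vsync (render > 16.67ms => 2+ intervals)."""
    return [max(1, math.ceil(t / REFRESH_MS)) for t in render_times_ms]

# Hypothetical frame times averaging ~17.3ms, i.e. an fps counter reading of ~58.
frame_times = [16.0, 18.5, 16.2, 18.0, 16.4, 18.8, 16.1, 18.2]

held = refreshes_held(frame_times)
counter_fps = 1000.0 * len(frame_times) / sum(frame_times)  # what the fps counter reports
displayed_fps = 60.0 * len(frame_times) / sum(held)         # what the monitor actually delivers

print(held)                # [1, 2, 1, 2, 1, 2, 1, 2] -> alternating 60fps/30fps presentation
print(round(counter_fps))  # ~58
print(round(displayed_fps))# ~40: much worse than the counter suggests

This is a simplification (real drivers can triple-buffer or queue pre-rendered frames), but it shows why a counter reading "58fps" can correspond to a monitor that is constantly dropping to 30fps presentation.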
 
I just don't get what you are complaining about. First you say it's because the new cards aren't as big a jump in performance as you wanted. I respond, then you say, "Well, the price isn't good for what you get!", which is a different sort of argument.

Just because a company is making good profits doesn't mean they have reason to slash prices simply because Moore's Law isn't holding anymore. You seem to want it to be 2010 again, and it won't be. Ever.
 
Vsync is a necessity if you've got a fixed refresh rate monitor.

And you can always use adaptive vsync if you don't want it to spike to 33.3ms during drops. So long as you love screen tearing.

Yes, exactly - it was the best of a bad situation, but now that there are better options I'm glad to see the back of it. If someone was considering a second GPU or a new monitor to improve frametimes/image quality, the new monitor is a one-time purchase and it carries forward to GPU upgrades. It's like having auto-SLI for free going forwards.
 