
AMD Polaris architecture – GCN 4.0

Anyone would think we don't want a die shrunk new generation....

There is demand now for 4K or VR, and at last there is a real need for AMD/Nvidia to speed along towards much more performance for the price.

I just hope Polaris is a commercial hit for AMD, as CPU pricing shows how awful monopolies can be.
 
He is most likely heavily invested in NVIDIA, whether it be in his mind only or with stocks and shares etc.

For example, he seemed positively ecstatic that he had attended the Nvidia booth at CES, saying he couldn't reveal much due to NDAs etc.

Dunno what he imagines he's been privy to, if anything, assuming you're representing him correctly, but the only Pascal product at CES was the car product. They had nothing (even briefings) behind closed doors about either enterprise or consumer Pascal.

I assume they're waiting for GDC in March to talk or show something either publicly or behind closed doors.
 
Anyone would think we don't want a die shrunk new generation....

There is demand now for 4K or VR, and at last there is a real need for AMD/Nvidia to speed along towards much more performance for the price.

I just hope Polaris is a commercial hit for AMD, as CPU pricing shows how awful monopolies can be.

I also want to see Polaris do well, and especially Zen. But I wouldn't worry too much about pricing under a monopoly scenario. Ultimately it is the market that defines prices. Just because there is only a single provider doesn't mean people will suddenly accept that fact and be willing to pay twice the price for the same product; people pay what they are willing to pay regardless of the number of providers.

Nvidia would still aim to maximize profits, and doing that requires finding the optimal price points: selling a product at twice the price but a third of the volume is a big step backwards. Nvidia would still have to innovate and provide big incentives to force people to upgrade. Conversely, even without a monopoly Nvidia can sell a product at ridiculous prices like the Titan X, simply because there is a market for it. Similarly, Apple sell products at a huge profit margin despite mountains of competition for similar products, simply because the market dictates the price of Apple goods and Apple know how to persuade buyers.
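
To put rough numbers on that point (purely illustrative figures, nothing to do with anyone's actual pricing): doubling the price while only keeping a third of the volume actually cuts revenue, which is why even a monopolist can't price arbitrarily.

[code]
# Purely illustrative: doubling the price at a third of the volume reduces revenue.
base_price, base_volume = 1.0, 1.0                  # normalised current price and volume
new_revenue = (2 * base_price) * (base_volume / 3)  # twice the price, a third of the sales
print(f"Revenue relative to before: {new_revenue:.2f}x")  # -> 0.67x, a big step backwards
[/code]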


Intel is actually a poor example for you to use if you were to look at the history of CPU prices. Intel are often blamed for drip-feeding technology increases, but the fact is Intel and everyone else have struggled to get to new manufacturing processes, which is exactly what we see with GPUs stuck on 28nm.


Lastly, I really wouldn't worry about a monopoly on GPUs happening, because if AMD were to go bankrupt, Intel alone will give Nvidia plenty of competition in the coming years. Nvidia is much more worried about Intel than AMD, tbh.
 
All I care about is performance for the money; as long as it's not stupid money I don't care if it's Nvidia or AMD. 980Ti pricing is my limit, as that is already expensive enough for a GPU imo.


I'd like to see sensible pricing too. I discounted Fury as an upgrade this generation due to pricing, and 3xx offered too little over 2xx to make it a viable upgrade option.
 
Dunno what he imagines he's been privy to, if anything, assuming you're representing him correctly, but the only Pascal product at CES was the car product. They had nothing (even briefings) behind closed doors about either enterprise or consumer Pascal.

I assume they're waiting for GDC in March to talk or show something either publicly or behind closed doors.

Showing two Maxwell GPUs :)
 
Showing two Maxwell GPUs :)

Doesn't really matter if what they showed was Maxwell. Product will supposedly be Pascal.

But yeah, it doesn't bode well that they couldn't show the product with Pascal. I still think they're going to be very late to the party for any kind of volume product... and I think they'll be OK with that in the short term if they can keep the IBM contract customers happy - Pascal was, is and will remain most important for them in compute, an area where they were and are looking quite exposed against AMD. They can afford to be behind or take a beating in consumer (gaming) GPUs for a while, but not where all their margins are.
 
Dunno what he imagines he's been privy to, if anything, assuming you're representing him correctly, but the only Pascal product at CES was the car product. They had nothing (even briefings) behind closed doors about either enterprise or consumer Pascal.

I assume they're waiting for GDC in March to talk or show something either publicly or behind closed doors.

I'm not privy to anything GPU related. I can say that Pascal-based Drive PX2 hardware is on the way to select partners around March-April, although mass production is expected much later (which is irrelevant; Volvo will be using PX2 in 2017 model year cars). The GPU details are purposely not disclosed, but once the partners get hardware in a few months' time there is a chance of leaked information, so I'd expect a full announcement at GDC.

The twin-Pascal setup on the PX2 is 5-10x faster than a Titan X in real-world benchmarks, but not tested in games, only in compute tasks related to autonomous driving. Part of this comes from support for lower precision, which is sufficient for some deep learning models and other compute tasks, increasing throughput of certain functions by 2x. There are also two GPUs to achieve that.
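
Just to sketch how a 5-10x figure could plausibly break down (the per-GPU factor here is purely my assumption, nothing disclosed): two GPUs, the ~2x from lower precision mentioned above, and whatever per-GPU gain is left over a Titan X.

[code]
# Hypothetical decomposition of the claimed 5-10x over a Titan X.
# None of these factors are disclosed figures; they are assumptions for illustration.
num_gpus = 2          # the PX2 carries two Pascal GPUs
fp16_gain = 2.0       # ~2x throughput for functions that can use lower precision (as stated above)
per_gpu_gain = 1.5    # assumed per-GPU gain over a Titan X at full precision (pure guess)

total_speedup = num_gpus * fp16_gain * per_gpu_gain
print(f"~{total_speedup:.0f}x")   # ~6x with these assumptions, inside the quoted 5-10x range
[/code]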
 
Nvidia would still have to innovate and provide big incentives to force people to upgrade

Nope.

NVIDIA would simply do as they are already doing - sabotage the drivers so that older generations of GPUs do much worse than they should.

Remember the previous generation, 290X vs 780ti?

The 780ti (Kepler) was all round faster than the 290x when it was current gen.

Now that the 900 series (Maxwell) is out, the 290X utterly dominates the 780ti in all areas.

Now imagine AMD weren't around - NVIDIA would be even more ruthless with their driver 'optimizations' which cripple previous architectures, forcing people to upgrade every generation.

At least with Intel, if you buy an i5/i7 it will last for many years. Its performance won't decrease. Sure, the last few generations have only been incremental improvements, though at least the recommended price of the mainstream i7 hasn't changed at all.
 
Are you claiming twin Pascal is 5-10x faster than TX?

As I'd imagine the 12 CPU cores are contributing to the numbers, no?

It's just about double precision / mixed precision compute power, where Maxwell fares very badly. To put it in context, in FP64 double precision computing a 7970 (or 290X) is about 1.5 times faster than a 980ti, according to AnandTech reviews.

7970: 1/4 rate double precision
290X: 1/8 rate double precision
GM200: 1/32 rate double precision

So yeah, it's quite possible, especially with a chip made for the car manufacturers (it could have an even higher FP64 ratio than desktop chips) and with dual GPUs, but as it was compared to Maxwell it's not really that impressive.
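
As a rough back-of-envelope (the FP32 peaks here are approximate figures from memory, not taken from the reviews, and theoretical peaks will overstate the gap versus measured benchmarks), the FP64 peak is just the FP32 peak scaled by those ratios:

[code]
# Rough sketch: FP64 peak is the FP32 peak multiplied by the FP64 ratio.
# FP32 figures below are approximate, from memory - not from the reviews.
cards = {
    # name: (approx. peak FP32 in TFLOPS, FP64 ratio)
    "7970":           (3.8, 1 / 4),
    "290X":           (5.6, 1 / 8),
    "GM200 (980 Ti)": (5.6, 1 / 32),
}

for name, (fp32_tflops, fp64_ratio) in cards.items():
    fp64_tflops = fp32_tflops * fp64_ratio
    print(f"{name:16s} ~{fp64_tflops:.2f} TFLOPS FP64")
[/code]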
 
Nope.

NVIDIA would simply do as they are already doing - sabotage the drivers so that older generations of GPUs do much worse than they should.

.

What you are saying is completely disproven by empirical evidence that shows an increase in performance from Nvidia drivers and absolutely no signs of any sabotage.


People ask why I appear to defend Nvidia or attack AMD - well, posts like this are exactly why. Completely uninformed outright lies are passed off as facts. Yours is nothing more than a troll post.
 
What you are saying is completely disproven by empirical evidence that shows an increase in performance from Nvidia drivers and absolutely no signs of any sabotage.


People ask why I appear to defend Nvidia or attack AMD - well, posts like this are exactly why. Completely uninformed outright lies are passed off as facts. Yours is nothing more than a troll post.

Indeed, even a video to back it up :)

 
Can't really take anything from the Zauba shipping reports. I mean, the Fury X2 one was back on November 26th and hasn't seen the light of day yet, whereas according to that list the Fury X was 26th of June when it actually launched on the 24th of June. So it doesn't really mean much, except that it is getting closer, but we knew that as they showed the small card off at CES.
 
Nope.

NVIDIA would simply do as they are already doing - sabotage the drivers so that older generations of GPUs do much worse than they should.

Remember the previous generation, 290X vs 780ti?

The 780ti (Kepler) was all round faster than the 290x when it was current gen.

Now that the 900 series (Maxwell) is out, the 290X utterly dominates the 780ti in all areas.

Now imagine AMD weren't around - NVIDIA would be even more ruthless with their driver 'optimizations' which cripple previous architectures, forcing people to upgrade every generation.

At least with Intel, if you buy an i5/i7 it will last for many years. Its performance won't decrease. Sure, the last few generations have only been incremental improvements, though at least the recommended price of the mainstream i7 hasn't changed at all.

As much as I hate to agree with the above, there is some truth in it.
 
Nope.

NVIDIA would simply do as they are already doing - sabotage the drivers so that older generations of GPUs do much worse than they should.

Remember the previous generation, 290X vs 780ti?

The 780ti (Kepler) was all round faster than the 290x when it was current gen.

Now that the 900 series (Maxwell) is out, the 290X utterly dominates the 780ti in all areas.

Although I have thought for a long while that there is some funny business with Nvidia and older-gen cards, I believe the increasing performance gap between now and then is due to AMD's massive improvements in driver overhead in the past year.

On paper the 290X was always better specced than the 780ti/Titan at the time. It was the Titan killer in terms of theoretical performance, but the drivers let it down.

AMD have always had strong hardware; it's just the driver overhead that has let them down in the past.
 
Although I have thought for a long while that there is some funny business with Nvidia and older-gen cards, I believe the increasing performance gap between now and then is due to AMD's massive improvements in driver overhead in the past year.

On paper the 290X was always better specced than the 780ti/Titan at the time. It was the Titan killer in terms of theoretical performance, but the drivers let it down.

AMD have always had strong hardware; it's just the driver overhead that has let them down in the past.

If I recall correctly it was faster than the Titan on release; only the 780Ti was faster than the 290X.
 
Indeed, even a video to back it up :)

That isn't proof; there is empirical proof of Nvidia screwing up drivers. If Kepler performance was fine for a year, then woeful for 6 months after Maxwell launched, and then Nvidia got called on it and released a fixed driver, the fact that the driver 7 months later has good performance doesn't mean the drivers for those 6 months didn't suck.

Also, Nvidia ADMITTED their Kepler performance was a problem - they released a Kepler performance-fix driver... which apparently they didn't need to, because Kepler performance wasn't broken. The dozens of threads with thousands of posts on Nvidia's forums about subpar Kepler performance in multiple AAA titles - culminating in so much complaining about The Witcher 3, where the performance difference was embarrassingly bad, that Nvidia couldn't keep quiet any more, though they had ignored the 'problem' for 6 months by that point - are all proof nothing was the matter.

Now, if they had forgotten to optimise for Kepler in drivers for one game, and then everyone complained and they fixed it straight away, that would be one thing. But hundreds/thousands of posts about Kepler-specific performance problems were ignored for months and months before the embarrassing performance difference in The Witcher 3 was too much to ignore.

Before Nvidia made announcements that they'd release a fixed driver soon, we had discussions on here with Kepler performance being laughed at, with a 780 being taken for a ride by a 960. The worst part was all the Nvidia excusers who insisted that things like better tessellation performance on Maxwell were the reason for the performance difference. Drivers weren't bad for Kepler, just the architecture was insufficient for the monumental amount of tessellation (that wasn't evident anywhere) in The Witcher 3. Nvidia guys on here were coming out with any kind of BS excuse for why a 960 was genuinely faster than a core twice the size on the same process node. Then Nvidia came along and spoiled the weeks of excuses by saying Kepler performance was rubbish and fixed it very quickly and easily - too quickly and easily for something they ignored and pretended wasn't happening for 6 months.
 
If I recall correctly it was faster than the Titan on release; only the 780Ti was faster than the 290X.

I think they were all within a few fps of each other. But thinking back, I think the 780ti aftermarket cards were ahead of the Titans due to better cooling/clocks etc.

But as mentioned before, the Hawaii chips are a reasonable distance ahead of the 780ti and Titan now due to massive improvements in drivers.
 