
Q&A SESSION WITH JAMES PRIOR FROM AMD / OcUK STREAMED TOMORROW, ASK QUESTIONS!

I’m only really interested in 3D GPU performance in applications like Blender/Maya/ZBrush/Mudbox (not 3DS Max or V-Ray, because everyone using them adds excessive lens flare/softness, tries to emulate a wide-open f/1.4 lens, and never changes the AA algorithm on renders), but it was always a given that a decent GPU would also be good in games. My first was a GeForce 256 DDR, also the first "GPU", and I think that was about £200 at the time, alongside a 1 GHz Slot A Athlon. I’m not sure what that cost, but the total system was about £1,500 with a 10,000 RPM SCSI drive. I’m glad that AMD has decent processors, but I’ve never really trusted ATI/Radeon for GPUs; maybe I should test them out?
 
In terms of gaming perf, 290/390 are most similar to the 480.

Anyhow, will you deny that the 290 (and later the 390) were available at £250 for at least 2 full years before the 480 launched at £250 to £300? Remember the 290 had some pretty big price cuts soon after launch.

Just answer that question if you dare.

I won't entertain the idea of discussing your anecdotal, part-way-through-the-lifecycle figures.
 
I won't entertain the idea of discussing your anecdotal, part-way-through-the-lifecycle figures.
Because you've got nothing.

The 290 was £250 a few months after it launched. Together with the 390 (a rebrand), this perf cost £240 for over 2 years before the 480 launched.

That is the reality.

2 full years of 290/390 perf for £250. THEN the 480 came along, offering the same price for the same perf.

Because 2 years is a long time it really doesn't matter what the 290 (or the 290X) launched at 3 years ago. The price was stable and consistent for 2+ full years.

You can moan all you like about it but reality is biting you on the ass. 2 full years of £250=290/390 perf. Then the 480 comes along and offers £250=290/390/480 perf.

THEN the 580 comes along... and guess what? :p £250=290/390/480/580 perf.

It's so blindingly apparent to EVERYONE who isn't a die-hard AMD apologist. For you, with your AMD life-sized Raja comfort pillow and your AMD branded pyjamas, there is not much hope of a happy 2018 :p

And I will end by saying I dislike nV right now too - I keep having to say this because people assume I'm an AMD shill :p The reason I post a lot of anti-AMD stuff here is because there are so many AMD die-hards with a reality distortion field that would make Apple proud :p
 
2 full years of 290/390 perf for £250. THEN the 480 came along, offering the same price for the same perf.
The RX480 was noticeably cheaper than the 290/390, and it also used half the power. It's because all AMD had heard for years was people whining that their GPUs were too power-hungry/hot/etc., and sadly they didn't realise the whiners were just complaining for the sake of complaining, so they invested a lot of time/effort into making the next GPU really efficient, at the expense of performance.
 
The exchange rate substantially influences the prices you're discussing. Use dollars; it gives a more accurate comparison.

I don't disagree with your stance though. It's why I haven't bothered with AMD for a few years now. They just don't offer the level of performance I want, so my only option is nVidia.

Unfortunately, due to some CUDA software as well, I will be sticking with nVidia for the foreseeable future.
 
It is disappointing that AMD have had virtually the same performance at that price bracket for years, but the fact is they still compete well in that tier. That speaks volumes about the current state of the market.

At the same time, a direct comparison of prices is deceptive. £250 generally buys you a lot less hardware than it did even 2 years ago. For example, my monitor costs a good £150 more brand new than it did 20 months ago.
 
Common sense is a very dangerous and inaccurate thing; it led people to think for a long time that the earth was totally flat and not curved.

As to the distance of the Sun, yes we do need a scientist to say how far away it is, for all I know it could be 1 mile away and much smaller than it really is.
Haha, cracked me up and actually spot on :D
 
A bit of trivia for you: on the night of December 3rd 2017, which is tomorrow, we'll have 2017's only supermoon, which is when the moon is both full and at its closest point to earth, so remember to go out if it's a clear night and see the moon in all its glory.

Totally off topic, but I'm currently offshore some 70 km north-west of the Shetland Islands right now, and the moon was looking awesome tonight. I wish I had read this thread beforehand, as I would have taken my camera out.
 

Yep. So, basically the chap said they are ideally working towards making the 2019-and-later CPUs able to drop into current motherboards using a BIOS upgrade.

He also pretty much said AMD has two CPU teams - one working on the next-generation Zen, and the other working on putting the current Zen onto a "new GF process", hence confirming there is probably a refresh coming.

The rumoured Vega11 is also the internal name for the IGP in Raven Ridge.
 
Totally off topic, but I'm currently offshore some 70 km north-west of the Shetland Islands right now, and the moon was looking awesome tonight. I wish I had read this thread beforehand, as I would have taken my camera out.

I'd actually forgotten myself :rolleyes: so thanks for the reminder; I'm off out to take a look before it's too late.

EDIT: Too cloudy here in darkest Surrey :(

I used to live in the New Forest, Hampshire, and the skies there were amazing. The lack of local lighting meant pitch black was pitch black, and when the sky was clear of clouds the moon and stars would light up the forest beautifully.

The next really close one is in 2034 so remember to take your camera then :D
 
Common sense is a very dangerous and inaccurate thing; it led people to think for a long time that the earth was totally flat and not curved.

As to the distance of the Sun, yes we do need a scientist to say how far away it is, for all I know it could be 1 mile away and much smaller than it really is.

Common sense is dangerous, people... This needs a super special mention.

Possibly the most special post on OcUK ever.
 
The RX480 was noticeably cheaper than the 290/390, and it also used half the power. It's because all AMD had heard for years was people whining that their GPUs were too power-hungry/hot/etc., and sadly they didn't realise the whiners were just complaining for the sake of complaining, so they invested a lot of time/effort into making the next GPU really efficient, at the expense of performance.

Even with a smaller process node, it only just about matched the performance per watt of the older GTX 970 (an older product on an older process node); the less said about the PCIe power spec, the better.

#PoorVolta
 
Even with a smaller process node, it only just about matched the performance per watt of the older GTX 970 (an older product on an older process node); the less said about the PCIe power spec, the better.

#PoorVolta

The 970 is getting murdered. Who cares about power use as long as you can cool the card...
 
The 970 is getting murdered. Who cares about power use as long as you can cool the card...

More power means less room to scale up clocks and shaders for an architecture. AMD could add more shaders to a Polaris or Vega GPU, but it would destroy the traditional power brackets. A more complex PCB is needed due to the extra power components and cooling (and it would be easier and cheaper to make a quieter card with a lower-power chip). In the OEM space you're also adding cost to the PSU/cooling, so such people are less likely to use the chip. Power and heat matter even more when it comes to laptops.

How many Polaris/FX/Fermi-based laptops did you ever see?

In short, it's the power that's the issue when it comes to a CPU/GPU - it's the knock-on effects that result from it.
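As a back-of-envelope illustration of that power-bracket point (every number here is invented for illustration - these are not real board figures):

```python
# Toy model: a card's board power is a fixed cost (memory, VRM, fan) plus a
# per-shader-cluster cost, so the power bracket caps how many clusters fit.

def max_clusters(bracket_watts, base_watts, watts_per_cluster):
    """Whole shader clusters that fit in a power bracket after fixed costs."""
    return (bracket_watts - base_watts) // watts_per_cluster

# Hypothetical figures: 30 W fixed board cost, 5 W per shader cluster.
print(max_clusters(150, 30, 5))  # 150 W midrange bracket -> 24 clusters
print(max_clusters(300, 30, 5))  # 300 W high-end bracket -> 54 clusters
```

A chip whose clusters each burn more watts simply fits fewer of them into the same bracket, which is exactly the knock-on effect being described.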
 
More power means less room to scale up clocks and shaders for an architecture. AMD could add more shaders to a Polaris or Vega GPU, but it would destroy the traditional power brackets. A more complex PCB is needed due to the extra power components and cooling (and it would be easier and cheaper to make a quieter card with a lower-power chip). In the OEM space you're also adding cost to the PSU/cooling, so such people are less likely to use the chip. Power and heat matter even more when it comes to laptops.

How many Polaris/FX/Fermi-based laptops did you ever see?

In short, it's the power that's the issue when it comes to a CPU/GPU - it's the knock-on effects that result from it.
Away with your common sense posting style!

Spot on though :)
 

Well, to some lesser extent and in specific areas, but more power means higher potential performance. It's that simple.

The ~150 W 970 is bettered by cards pulling more power, regardless of clock speed.
 
Well, to some lesser extent and in specific areas, but more power means higher potential performance. It's that simple.

The ~150 W 970 is bettered by cards pulling more power, regardless of clock speed.

I just used the 970 as an example, as it used a similar amount of power for similar performance at the time of the RX480's release. Sure, more power means potentially more performance, but only if efficiency is good. The GTX 480 was a better-performing card than the 5870 (not by much at stock clocks, and 6 months late) but was considered a mess, as it was hot and consumed 1/3 more power. AMD had the power budget to smash the GTX 480 with a bigger chip, or what they ended up doing: the two-chip 5970.

You could compare the RX480/580 to the GTX 1060, the former using 1/3 more power, or to GP100. AMD have less power to play with to make a faster product; like it or not, products are built to constraints, whether that be power/cost/die size.

Put it this way: seeing as both AMD and Nvidia are looking towards multi-chip/die designs, if both AMD and Nvidia make a die that gives you 100 fps in Game X, but AMD's die consumes 100 W at stock clocks and Nvidia's consumes 75 W, Nvidia could use more dies to hit better performance within the same 150 W midrange and 300 W high-end power budgets.

Personally I think the dark horse in the coming years is Intel. Now they have a GPU effort with a huge amount of resources (people and money), together with their own in-house processes and manufacturing for better tuning and whatnot, I do see them as a real challenger now. Third time lucky, after the foul-ups that were the i740 and the cancelled Larrabee.
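The multi-die arithmetic above can be sketched like this, using the post's hypothetical dies (100 fps at 100 W vs 100 fps at 75 W) and assuming perfect scaling, which real multi-die cards wouldn't achieve:

```python
def dies_in_budget(budget_watts, watts_per_die):
    # Whole dies that fit inside a fixed board power budget.
    return budget_watts // watts_per_die

FPS_PER_DIE = 100  # both vendors' dies hit 100 fps in the hypothetical

for budget in (150, 300):  # midrange and high-end power brackets
    amd = dies_in_budget(budget, 100)    # 100 W per AMD die
    nvidia = dies_in_budget(budget, 75)  # 75 W per Nvidia die
    print(f"{budget} W: AMD {amd * FPS_PER_DIE} fps vs "
          f"Nvidia {nvidia * FPS_PER_DIE} fps")
# -> 150 W: AMD 100 fps vs Nvidia 200 fps
# -> 300 W: AMD 300 fps vs Nvidia 400 fps
```

In other words, the more efficient die wins at both brackets purely on die count, even though a single die from either vendor performs identically.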
 

For me, top-end cards should be all about the most performance for 350 watts. I would love to see cards rated by power use. Which do you put first: performance per watt, or watts to performance? Performance should come first in the gaming market.

I think APUs will take a big slice out of the graphics card market over the coming years. Intel are at 3.3 TFLOPS integrated into a 4c/8t chip now? A chip like that should offer the performance for 2-3 megapixel gaming.
 