
INTEL BRING THE BIG GUNS!

So, this time round it's AMD with great marketing for their CPUs and Intel that are in disarray - loving it!

The slow progression of market share shows that AMD still have an uphill battle, though, even with a great range of products.

I am happy that Intel seem to be *panicked, and hopefully they can turn things around too, because since Core 2 they have got away with mediocre updates, high prices and very little innovation.

*Note: I mean this in a positive way; I want Intel to make changes to their relatively stagnant, overpriced 4-core line-up.
 
Sad thing is, people will pay it. But we can at least thank AMD for forcing them to reduce the prices somewhat. Imagine if Ryzen wasn't around; I'd say that chip would be closer to £3k.
 
Nearly £2k for a chip, Intel have lost the plot.
Depends how you look at it. £1200 for a GPU that will hardly be good for a few years, vs a £2k CPU that will be good for many, many years. That makes the CPU look good value :)
The cost of something is not the same as its value.
Not cheap though!
 
Sad thing is, people will pay it. But we can at least thank AMD for forcing them to reduce the prices somewhat. Imagine if Ryzen wasn't around; I'd say that chip would be closer to £3k.

If Ryzen wasn't around, Intel wouldn't have gone beyond the 7920X. If you look back at the initial announcements, none of the review sites were told about it in the press packs, and Intel themselves weren't fully prepared, hence no sign of it in retail or partner systems yet.
 
[Screenshot: Capture.png]

@ 4.8GHz, 1.245v

Thanks :)


Thanks :).

You guys got any 7-Zip or other productivity tests done?

Still waiting on the TR4 EK block :o, Gibbo has said week commencing 4th Sep :(.
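
For anyone running their own numbers, 7-Zip ships a built-in LZMA benchmark (the "7z b" command) that reports a MIPS rating. A quick sketch that just shells out to it, assuming a 7-Zip/p7zip install with the executable on PATH (the exact binary name varies by platform):

# Minimal sketch: run 7-Zip's built-in LZMA benchmark ("7z b") and show its report.
# Assumes a 7-Zip/p7zip install; the binary name ("7z", "7za", "7zz") varies by platform/package.
import shutil
import subprocess

def run_7zip_benchmark() -> str:
    """Run `7z b` and return its raw output (the rating is reported in MIPS)."""
    exe = next((name for name in ("7z", "7za", "7zz") if shutil.which(name)), None)
    if exe is None:
        raise FileNotFoundError("No 7-Zip executable found on PATH")
    result = subprocess.run([exe, "b"], capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    print(run_7zip_benchmark())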
 
Depends how you look at it. £1200 for a GPU that will hardly be good for a few years, vs a £2k CPU that will be good for many, many years. That makes the CPU look good value :)
The cost of something is not the same as its value.
Not cheap though!

What? The CPU will have exactly the same performance two years from now as it does today; in two years you'll have 24+ cores at the same prices, with an updated, faster architecture and better memory bandwidth. It will devalue in terms of ultimate performance just as fast as GPUs do.

The supposed 80% extra performance that comes with a new process for GPUs applies to CPUs too; just because Intel hasn't bothered doesn't mean it can't or won't happen. Look what happened: the top-end chip last time around was £1500 with 10 cores, and this chip has 80% more cores. Being on the same process node it won't deliver the full 80% more performance because of power limits, but remake that 18-core on 10nm and it will likely have 80-100% more performance than the 10-core chips, only 2-3 years later.

If you're gaming then you'll gain nothing going from 10- to 18- to 32-core chips, but if you're buying a high-end chip for multi-core performance because you need that performance, then you'll gain massively in the space of a couple of years.
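
Very rough back-of-the-envelope for the power point above; the clock speeds and core counts below are illustrative assumptions, not measured figures:

# Crude model: multi-threaded throughput ~ cores x clock. The clocks are made-up
# numbers chosen to illustrate why 80% more cores on the same node gives less
# than 80% more throughput, while a node shrink can restore clocks.
def relative_throughput(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz

old_10c = relative_throughput(10, 4.0)        # last-gen 10-core at 4.0GHz (assumed)
same_node_18c = relative_throughput(18, 3.4)  # 80% more cores, clocks cut by the power budget (assumed)
shrunk_18c = relative_throughput(18, 4.0)     # the same 18 cores once a node shrink restores clocks (assumed)

for label, value in (("18 cores, same node", same_node_18c),
                     ("18 cores, node shrink", shrunk_18c)):
    print(f"{label}: {100 * (value / old_10c - 1):.0f}% more multi-threaded throughput than the 10-core")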
 
It's not quite the same because with GPUs you can really add more performance by doing the equivalent of "moar coars"; they essentially run heavily threaded applications. The same can be done for CPUs but it's overall less helpful than it is for GPUs.
 

It's exactly as helpful on CPUs. There are two groups of people who spend £1500+ on a multi-core CPU: those who game, benchmark games and run some basic applications, generally wasting their money, and those who actually run applications that use all the cores well. The former might be 0.0001% of all 10+ core CPU buyers; the latter, making up the bulk, will see amazing performance scaling from more cores.
 
How is that exactly as helpful? All games will be faster with a GTX 1080 over a GTX 1070 (even if they had the same core and memory clocks). Not all games or applications will be faster using 8 cores instead of 4. Thus, it is easier to advance GPU performance when reducing transistor sizes than it is for CPUs.
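
The usual way to put numbers on "more cores doesn't always help" is Amdahl's law: speedup = 1 / ((1 - p) + p/n) for a workload whose parallelisable fraction is p, run on n cores. A quick sketch with made-up fractions just to illustrate:

# Amdahl's law: theoretical speedup on n cores for a workload whose
# parallelisable fraction is p. The p values are made-up examples,
# not measurements of any real game or renderer.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

workloads = {
    "lightly threaded game (p=0.30, assumed)": 0.30,
    "well-threaded renderer (p=0.95, assumed)": 0.95,
}

for name, p in workloads.items():
    for cores in (4, 8, 18):
        print(f"{name}: {cores:2d} cores -> {amdahl_speedup(p, cores):.2f}x")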
 

Not all games do; older games become CPU-limited anyway, making faster GPUs pointless in the same way. New process nodes enable double the transistor count, which enables (in general) around an 80% potential performance improvement if transistor density doubles with the new process.

Just as some games are ROP/TMU-limited or clock-speed-limited, some software is thread-limited. But the people who buy 18-core CPUs, or 32-core server CPUs, only run software that can use all those cores anyway. Performance scales massively from new nodes every 2-3 years on CPUs just as it does on GPUs; just because the average consumer doesn't see the same difference in performance doesn't make it untrue.

Even if more of the software used by those who want more cores were single-threaded, you'd be talking about HOW you use the performance you have, not the performance available.

Every three years you can double the transistor count: you can potentially double GPU shader count, or you can double CPU core count. Whether you use that performance is entirely irrelevant to the statement that performance has improved. That, say, an 18-core 14nm chip can likely score 80% higher than a 10-core 22nm chip is fact; that you might not use all the available performance on the new chip is a completely different discussion.

You said you can't get extra performance every few years on CPUs like you can with GPUs, but you absolutely can. You just might not have software that utilises that available performance.
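
To put rough numbers on the "double the transistor count every three years" point; the cadence and starting core count are assumptions for illustration only, not a roadmap:

# Sketch of how core counts could scale if the transistor budget doubles
# roughly every process generation (~3 years, assumed cadence). Starting
# point of 10 cores is illustrative, not a product plan.
def projected_cores(base_cores: int, years: int, years_per_node: int = 3) -> int:
    generations = years // years_per_node
    return base_cores * (2 ** generations)

for years in (0, 3, 6, 9):
    print(f"+{years} years: room for ~{projected_cores(10, years)} cores in the same die budget")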
 