
Will next-gen games run better on AMD 8350 than 3770K?

Fair points ... I am considering moving on from my i7 920 @ 4.2GHz rig (having trouble with the motherboard) and so am looking for 'evidence' of what comes next ... I ain't convinced yet which way to go, tbh! I game at 2560x1600, so being CPU bound is a rarity!


Right now Intel is the safe choice.

If we start to see more of this (a performance advantage at normal resolutions), Intel is still not out of it, as the i7 3770K is just as good in Crysis 3; I think it will take a bit more from AMD for that not to be the case.
But if it does happen... the £100 price difference for the Intel will start to look ridiculous.

And Haswell is coming, but so is Steamroller. Rumours are that Steamroller will get a huge per-core IPC boost; some are predicting +30%, others even more, but I doubt that very much.

However, it's likely to be a significantly bigger percentage gain over Piledriver than Haswell's over Ivy Bridge.

No doubt Intel will still have the efficiency advantage, but Steamroller should be a very strong performer.

Again Intel's prices will just look ridiculous.

We may soon live in interesting times again :)
 
There is more to it though, look at platform choice.

For example - Intel gives you Intel SATA ports, which are superior to the naff Marvell controllers you get on AMD boards.

Games might start to move towards more threaded performance gains, but not everything will. I still think AMD have a lot of work to do before they can push out a CPU that is as good an all-round choice as, or better than, the Intel equivalent, and more than just gaming performance is involved.
 
For example - Intel gives you Intel SATA ports, which are superior to the naff Marvell controllers you get on AMD boards.

Not quite - Marvell controllers are usually provided on Intel boards as a secondary SATA controller. My AMD board has the following: SATA III run by the AMD SB950 chipset, 6 ports, plus a JMicron controller for SATA II. The AMD SATA ports are perfectly good for fast SSDs.

AMD SB950 controller:
6 x SATA 6Gb/s port(s), gray
Supports RAID 0, 1, 5, 10
JMicron® JMB362 controller:
2 x SATA 3Gb/s port(s), black
 
Just to add one thing about compilers and Bulldozer.

Visual Studio 2010 SP1 added better compiler support for the new instructions in Bulldozer. I expect that as time moves on and compilers are updated, software performance will improve on AMD's new chips.

EDIT: here's a link to the Visual Studio 2010 SP1 update.

From the article:

"This operating system Service Pack contains the necessary support to save and restore extended context used by XOP, FMA4 and AVX instructions - See more at: http://blogs.amd.com/developer/2011/03/10/visual-studio-2010-service-pack-1/#sthash.TjrPICbm.dpuf"

http://blogs.amd.com/developer/2011/03/10/visual-studio-2010-service-pack-1/

When people did those early tests of Bulldozer, they were based on code more optimised for Intel. Computer games take years of development and are often based on previous game engines, so it can take many years for compiler changes to filter into products. Not biased towards AMD btw - I've used Intel chips for the last 15 years!
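
Just to make the compiler point a bit more concrete, here's a rough sketch of my own (nothing from the AMD blog post itself; the GCC target names and flags shown are just the standard documented ones, used purely for illustration): the exact same multiply-add loop can be built generically or with the Bulldozer target, and only the latter lets the compiler use the newer FMA4/XOP/AVX instructions.

Code:
/* fma_demo.c - illustrative only: the same C source can end up using
 * plain SSE2 or Bulldozer's FMA4/XOP/AVX depending on compiler flags.
 *
 *   gcc -O3 -march=x86-64 fma_demo.c -o generic     (baseline x86-64/SSE2 code)
 *   gcc -O3 -march=bdver1 fma_demo.c -o bulldozer   (bdver1 = first-gen Bulldozer;
 *                                                     may emit FMA4/XOP/AVX)
 *
 * Compare the two binaries with "objdump -d" to see which instructions
 * the compiler actually chose.
 */
#include <stdio.h>

#define N 1024

static float a[N], b[N], c[N], out[N];

int main(void)
{
    int i;

    /* Fill the input arrays with some arbitrary values. */
    for (i = 0; i < N; i++) {
        a[i] = i * 0.5f;
        b[i] = i * 0.25f;
        c[i] = 1.0f;
    }

    /* A multiply-add loop: when targeting bdver1 the compiler can
     * vectorise this and may fuse the multiply and add into FMA4 ops. */
    for (i = 0; i < N; i++)
        out[i] = a[i] * b[i] + c[i];

    printf("out[100] = %f\n", out[100]);
    return 0;
}

None of that says how big the real-world difference is, of course - just that the toolchain has to know about the new instructions before shipping software can benefit from them, which is why early Bulldozer results built with older compilers look worse than they need to.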
 
There is more to it though, look at platform choice.

For example - Intel gives you Intel SATA ports, which are superior to the naff Marvell controllers you get on AMD boards.

Games might start to move towards more threaded performance gains, but not everything will. I still think AMD have a lot of work to do before they can push out a CPU that is as good an all-round choice as, or better than, the Intel equivalent, and more than just gaming performance is involved.

What nkata said.

The AMD SB950 controller is as good as anything Intel have.

[boot time comparison chart]

[attached benchmark image]
 
This is how you test a CPU in a gaming title.

It used to be, in 2003, and like relying on fps numbers to rate GPUs, it's also out of date. All it tells us is that AMD's IPC is worse... wow, stop the press. Like we didn't know that already.

I couldn't give a monkey's uncle how the latest and most demanding game runs on the latest hardware connected to a 10-year-old monitor. It's like testing a 2013 F1 car on 2003 tyres: interesting information, but ultimately pointless unless technology suddenly starts to go backwards.

I do R&D for a living, and I can tell you there's no way we'd waste our time testing a sample under conditions it will never actually be used in.

There is more to it though, look at platform choice.

For example - Intel gives you Intel SATA ports, which are superior to the naff Marvell controllers you get on AMD boards.

Nope, they have the AMD SATA controller as the main one.

Seems the call from 2012 was actually for you. :D

Games might start to move towards more threaded performance gains, but not everything will. I still think AMD have a lot of work to do before they can push out a CPU that is as good an all-round choice as, or better than, the Intel equivalent, and more than just gaming performance is involved.

Yes, I would agree.
 
Right now, to someone like me who only vaguely keeps an eye on how PC tech is progressing these days, this situation reminds me a lot of my own dilemma five(?) years ago...

"Hybrid" Q6600 quad cores were all the rage back then, with Intel then releasing Core2Quad Q9xxx and Core2Duo 8xxx CPUs. The dilemma was whether to go with instant full useage of the Core2 Duos or gamble on quad core useage coming through within the lifetime of my new system.

I chose the quad, but in hindsight I probably made the wrong choice, because my gaming has become largely focused on driving simulations, and only in the last year have new-generation sims like rFactor 2 and pCARS really pushed my Q9300 to its knees (even on a basic air-cooled 400x7.5 overclock).

These days I really need a graphics card upgrade more than a new CPU, but when the time comes, I foresee another "caught between a rock and a hard place" decision on whether to gamble on 8-core utilisation kicking in while the chip is still worth using. Given how my Q9300 has lasted 5 years, and that AMD's Piledriver has been around for several months already, I shall probably be tempted by more cores for the buck, whether that is AMD or Intel when the time arises.
 
Mertini, have you played Crysis 3? It looks astonishing; the level of detail and texture is off the planet, and the lighting, reflections and shadowing are beautiful.
I also once asked if we would ever see the level of detailed fine particle physics we saw in Metro 2033 again - well yes, it's all there.

Meh, it looks OK I guess... Its COD-style corridor-bore shooter design is crap.
 
This thread started from the Crysis 3 CPU benchmarks that initially emerged.

However, a recent Tom's Hardware review shows the FX-8350 with abysmal minimum frame rates compared to even a Core i5 3550.

So as you can see, Intel still hold the gaming crown, and absolutely NOTHING has changed for the Vishera chips.


/thread
 
\thread

Minimums in all the other reviews show this is NOT the case, so I would suggest that Tom's has an issue with their 990FX setup somewhere. Anyone with an ounce of impartiality would have spotted that and questioned it BEFORE posting an Intel-biased comment like that. :rolleyes:

Also, why no 7970GE? Another review that reeks of biasness (bit unsure of my spelling there :p).
 
\thread

Minimums in all the other reviews show this is NOT the case, so I would suggest that Tom's has an issue with their 990FX setup somewhere. Anyone with an ounce of impartiality would have spotted that and questioned it BEFORE posting an Intel-biased comment like that. :rolleyes:

Also, why no 7970GE? Another review that reeks of biasness (bit unsure of my spelling there :p).

:p

Biased it was, granted! ;)

This is why I hate benchmark websites - results vary even with the same hardware! :rolleyes:
 
Why can't they see these oddities themselves, though? I mean, surely they've seen the other results floating about. Maybe they've found a part of the game, or a specific setting, that doesn't agree with the AMD setup. Who knows, but surely you'd question it and try to find out what's going on rather than just jumping to an instant conclusion. Poor testing methods are so rife in this industry.

I had some odd results myself from Crysis 3, so I had a good look through my settings and sure enough found something that was hindering performance a little (10% actually).
 
From what I gather, the reviews that show the FX8350 doing well test the most CPU-bottlenecked part of the game, as it is highly multi-threaded due to the vegetation(?). Some of the other parts of the game probably don't push such a load, so that's probably where a Core i5 would do better.

Hence, you could argue that in that case the most important part of the Crysis 3 experience is that section, if you intend to test a CPU. The Core i3 CPUs show poor performance, since the frame rates appear to tumble in that part.

The same goes for the GPU: it should be tested during the most graphically demanding parts of the game.

Having said that, AMD's FX-8350 provides serviceable Crysis 3 game play. Despite the frame rate valley we experienced in our benchmark run, this CPU achieves smoother performance on average. Perhaps this is something Crytek will be able to address through a future update.

The FX8350 smoother?? Latency testing will be interesting.
 
From what I gather, the reviews that show the FX8350 doing well test the most CPU-bottlenecked part of the game, as it is highly multi-threaded due to the vegetation(?). Some of the other parts of the game probably don't push such a load, so that's probably where a Core i5 would do better.

Hence, you could argue that in that case the most important part of the Crysis 3 experience is that section, if you intend to test a CPU. The same goes for the GPU: it should be tested during the most graphically demanding parts of the game.



The FX8350 smoother?? Latency testing will be interesting.

I don't think they meant smoother than the Intel CPUs; I think smoother than what the 8350 benchmarks suggest.

Still, if someone gave you the choice of higher fps but higher latency versus lower fps and lower latency (the higher frame rate being 30 and the lower being 20), you would choose the higher-fps, higher-latency option.
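
Just to illustrate the fps-versus-latency point with some made-up numbers (purely a sketch of my own, nothing to do with the actual Crysis 3 results): a run can post the higher average fps and still contain the occasional very long frame, which is exactly what frame latency testing is meant to expose.

Code:
/* latency_demo.c - illustrative only, with invented frame times (in ms). */
#include <stdio.h>

static double avg_fps(const double *ms, int n)
{
    double total = 0.0;
    int i;

    for (i = 0; i < n; i++)
        total += ms[i];
    return 1000.0 * n / total;           /* average frames per second */
}

static double worst_frame(const double *ms, int n)
{
    double worst = 0.0;
    int i;

    for (i = 0; i < n; i++)
        if (ms[i] > worst)
            worst = ms[i];
    return worst;                        /* longest single frame, in ms */
}

int main(void)
{
    /* Run A: mostly 30 ms frames with occasional 100 ms hitches. */
    double run_a[] = { 30, 30, 30, 100, 30, 30, 30, 100, 30, 30 };
    /* Run B: steady 50 ms frames - slower on paper, but consistent. */
    double run_b[] = { 50, 50, 50, 50, 50, 50, 50, 50, 50, 50 };
    int n = 10;

    printf("Run A: %.1f avg fps, worst frame %.0f ms\n",
           avg_fps(run_a, n), worst_frame(run_a, n));
    printf("Run B: %.1f avg fps, worst frame %.0f ms\n",
           avg_fps(run_b, n), worst_frame(run_b, n));
    return 0;
}

With those invented numbers, Run A averages roughly 23 fps but stalls for 100 ms at a time, while Run B only averages 20 fps yet never exceeds 50 ms - which is why a lower-fps run can still feel smoother, and why the latency testing will be worth watching.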
 
I don't think they meant smoother than the Intel CPUs; I think smoother than what the 8350 benchmarks suggest.

Still, if someone gave you the choice of higher fps but higher latency versus lower fps and lower latency (the higher frame rate being 30 and the lower being 20), you would choose the higher-fps, higher-latency option.

The main thing is that the FX6300, which is a £100 CPU, is showing similar performance to much more expensive CPUs during the most CPU-intensive part of the game. Hence a stock or overclocked FX6300 seems a very good budget option for a Crysis 3 rig with a £180 HD7870LE/HD7870XT.

I suspect a locked Core i5 3350P for around £135 to £140 with an HD7850 2GB would have a poorer overall experience in the game.

The Core i3 CPUs, OTOH, really need replacing by Intel with a cheaper quad core IMHO, unless Haswell somehow improves the hardware implementation of HT massively.
 
Intel for games :)

At best the 8350 slightly beats the similarly priced 3570K on a few occasions, and this isn't going to change much with the new consoles.

And comparing it to an i7 is silly.
 