
Will next-gen games run better on AMD 8350 than 3770K?

Yeah, exactly, and this guy's claiming to be learning games development?

The stuff he's saying is clueless for someone who is just interested in games, never mind someone who wants to work in games development. Oh lawd.

Anyone with a decent PC should know that the DX version doesn't really dictate the graphical quality.
 
I don't know why there's so much hate on consoles. Sometimes it's nice to sit back in front of your large TV, controller in hand, and have a few relaxed games.

I'm 100% confident that the PS4/Xbox-whatever will be as good, if not better, graphically than the current crop of PCs when released.

As for the current PS3/Xbox 360, just remember these are old. The Xbox 360 was released in 2005, that's 8 years ago. I think games still look pretty good considering the age and specs. I tried playing Battlefield 3 on my AMD X6 @ 3.6GHz with an AMD 4670, and it barely managed 25FPS at 720p low. BF3 on my Xbox is a smooth experience on much lower specs. What was the GPU in the 360, something like an ATI X1800 XT? I remember being blown away by the 360's graphics when I first got it.

I'm sure in a few years PC gaming will be on another level again, after thousands spent on new CPUs and GPUs, compared to the £300 on a console.

As for the original question, I'm excited that AMD 8-cores are going to be in the PS4 and Xbox, because you'd hope this also means PC games will be better optimised for AMD. Which is good for me, since I don't have £250 to spend on Intel. I'm quite happy with my 'poor man's' FX8.
 
Because generally, they're lazy. They do the textures for the console version and then just use the same assets for the PC version.

That wasn't my point anyway, I was rebutting the notion that textures are made for the console version, then new, higher res textures are remade afterwards for the PC version.

It's an insane notion to suggest that textures are done twice.

I think I saw a video once describing how texture work was done in some studios. As far as I can recall, texture work is done at high resolution to begin with, then "downscaled" when put into use in a console game.

All it would take, from my understanding, is a revisit to the original work to "resave" it. But this of course takes time and costs money, and some developers don't consider it viable. My memory could of course be faulty.
 
I think I saw a video once describing how texture work was done in some studios. As far as I can recall, texture work is done at high resolution to begin with, then "downscaled" when put into use in a console game.

All it would take, from my understanding, is a revisit to the original work to "resave" it. But this of course takes time and costs money, and some developers don't consider it viable. My memory could of course be faulty.

That's exactly how it's done. The other guy was suggesting that the textures are done at console quality, and then new ones are made from scratch for the higher-res PC version, which just doesn't make any sense at all. Why would they do that when they can just make a really high-res one and downscale it for use on different systems?

Plus, creating high res content is much easier than creating small low res textures.
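The "author once at high res, downscale per platform" workflow being described can be sketched in a few lines. This is a hypothetical illustration, not any studio's actual pipeline: real tools use proper resampling filters (Lanczos etc.) and generate full mipmap chains, but the idea is just a 2x box filter applied to the master asset, shown here on a greyscale texture stored as a list of rows.

```python
def downscale_2x(texture):
    """Halve each dimension by averaging 2x2 blocks of texels."""
    h, w = len(texture), len(texture[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            block = (texture[y][x] + texture[y][x + 1] +
                     texture[y + 1][x] + texture[y + 1][x + 1])
            row.append(block // 4)  # average of the 2x2 block
        out.append(row)
    return out

# Author the master texture once at high resolution...
master = [[(x + y) % 256 for x in range(8)] for y in range(8)]
# ...then derive the console-quality asset by downscaling, no re-authoring.
console = downscale_2x(master)
print(len(master), len(console))  # 8 4
```

The point the posts above are making drops out of this: the expensive step (authoring `master`) happens once, and the per-platform "resave" is just another run of the cheap step.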
 
Of course the answer is no. The 3770k is faster than the 8350 in all regards, but it's ~£100 more so the world is fully in balance there and we can all sleep easily at night. :rolleyes:

The question should have been 'Will next-gen games run better on AMD 8350 than 3570K?', as those are clearly the competing CPUs, to which the answer is 'it's very likely'. The specs of the PS4 and NextBox make that obvious. The only question will be how many games are going to be made with the PS3 and X360 in mind, at least for the first year of new consoles.

By then though AMD and Intel are likely to have new CPUs around so we'll have to see what happens then. This 'interim' phase won't be clear cut.
 
Chances are games will still not use more than 4/6 cores at best. It'll take some years before we get games using 8 cores etc. Plus, if the gaming market was about to become 8-core friendly, you would have thought Intel would have had a CPU ready ;)
 
That's the assumption that I'm making, but maybe they didn't know that the new consoles would be 8 core and have taken their eye off the ball! :p
 
Chances are games will still not use more than 4/6 cores at best. It'll take some years before we get games using 8 cores etc. Plus, if the gaming market was about to become 8-core friendly, you would have thought Intel would have had a CPU ready ;)

They already have and this is what people fail to realise.

Frostbite 2, id Tech 5 and CryEngine 3 already support up to 8 threads. UE4 will probably do the same. The PS4 uses an 8-thread x86 CPU, and the next Xbox is probably going to have even more threads, it seems.

All the new generation multi-platform engines thread well.
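For anyone wondering what "threads well" means in practice: these engines break per-frame work into jobs and fan them out across a pool of worker threads. Here's a purely illustrative sketch of that pattern (the function and entity names are made up, and Python's GIL means this won't give real CPU parallelism the way a native C++ job system does; it only shows the shape of the approach):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_entity(entity_id):
    # Stand-in for per-entity work: animation, AI, physics, etc.
    return entity_id * entity_id

entities = range(1000)

# A job scheduler hands independent tasks to 8 workers, one per
# hardware thread, instead of running everything on one main thread.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(simulate_entity, entities))

print(sum(results))
```

The engine only scales to 8 cores to the extent that the frame can be split into enough independent jobs like this, which is why single-threaded games gain nothing from extra cores.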

People should be supporting this if anything, as it will be better for PC gaming, and moreover it would mean Core i7 CPUs should get cheaper, and Core i5 CPUs might drop to nearer £100.

The only people who would not want that are protecting their own investments.
 
While I agree the new engines do support up to 8 threads, they are not very well optimised. I mean, even if its IPC is lower than the i5's, the 8350 should decimate it if games can use 8 threads, no? But the benchmarks speak for themselves.
 
While I agree the new engines do support up to 8 threads, they are not very well optimised. I mean, even if its IPC is lower than the i5's, the 8350 should decimate it if games can use 8 threads, no? But the benchmarks speak for themselves.


What you mean like the Crysis 3 benchmarks where the FX-8350 decimates the i5 3570K?
 
While I agree the new engines do support up to 8 threads, they are not very well optimised. I mean, even if its IPC is lower than the i5's, the 8350 should decimate it if games can use 8 threads, no? But the benchmarks speak for themselves.

Another engine which threads well is id Tech5:

http://images.anandtech.com/graphs/graph4955/41704.png

http://www.extremetech.com/wp-content/uploads/2011/11/RageVT.png

So, that is two next-generation multi-platform engines which do well on multi-threaded CPUs. Frostbite 2 also does well, so that is three.

I have a Core i5 myself and I have no issue at all with an AMD CPU being nearly as good as or faster than it, or with any newer AMD or Intel CPU coming out tomorrow that decimates my current CPU.

In fact I want that to happen, so when I upgrade my current system I can get a Core i7 for a Core i5 level price.

I just get the slight impression that people do not want it to happen.
 
Win some, lose some... don't get too excited just yet, but the AMD does seem reasonable value though!

Also interesting that the GTX 680 & Radeon 7970 have the upper hand in different sections as well ...

http://pclab.pl/art52489-9.html

[attached benchmark charts]
 
Sorry - I assumed this was a CPU related thread and these low res ones have their place in that context. Maybe use a couple of Titans and benchmark at 1080p instead, hang on I'll get back to you on that one :D


There are already Titan and GTX 690 tests out there at 1080P and 1440P, all of them show the opposite to your link.

They didn't even bother to test or show results at resolutions that people actually use.

I also don't trust reviews that only show you low resolutions no one uses.

The idea that lowering the resolution is a dead certainty for showing the CPU's gaming performance is probably a misunderstanding.

At low res the game does not demand much from the hardware; in some games it may not even stress the CPU, and if Cool'n'Quiet is turned on the CPU may even drop into a power-saving state.

I'm not saying that's what's happening here, just that the review is incomplete, as it only covers one aspect, an aspect that is not even relevant.

The fact that every other review has the FX-8350 significantly faster than the 3570K at 1080P / 1440P on a Titan or GTX 690 is probably a good indicator that something else is going on.

One thing I have noticed with Crysis 3, which is different to any other game I know, is that applying any AA puts the load on the CPU. Too much AA and you end up with a MASSIVE CPU bottleneck; none at all and the CPU registers far less use. You also don't need any core to get anywhere near maxed out for it to start bottlenecking the GPU.
I don't know what CPU usage looks like, or what effect this has, with this game on Intel or the FX CPUs, but on my old Thuban it makes no sense: I can get what looks like a big CPU-caused bottleneck, and yet not one of my cores is over 60/70%. (On Planetside 2 I get one miserable lonely core that's constantly pegged at well over 90%, pile of crap game.) Stock vs overclocked also seems to have little effect on performance; having said that, 'I think' the CPU-NB and memory do. I need to do more testing.
The game seems to use the CPU differently to what, at least, my CPU was designed for.
That there is a whole different discussion on its own.
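The "CPU bottleneck with no core over 70%" observation isn't necessarily a contradiction, by the way. One possible explanation (an assumption on my part, the numbers below are invented): if the critical thread does its work in bursts and then stalls waiting on sync points, driver calls or other threads, task-manager-style utilisation averages the stalls in, even though that one thread still gates every frame.

```python
def core_utilisation(busy_ms_per_frame, frame_ms):
    # What a usage monitor reports: busy time as a share of wall time.
    return 100.0 * busy_ms_per_frame / frame_ms

# Frame takes 20 ms: the main thread computes for 12 ms, then waits
# 8 ms on syncs. It caps the game at 50 fps, yet only reads as "60%".
print(core_utilisation(12.0, 20.0))  # 60.0
```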

Far more information than they provide is needed to explain their discrepancy from other reviewers, that's for sure. :)
 
...

The game seems to use the CPU differently to what, at least, my CPU was designed for.
That there is a whole different discussion on its own.

Far more information than they provide is needed to explain their discrepancy from other reviewers, that's for sure. :)

Fair points ... I am considering moving on from my i7 920 @ 4.2GHz rig (having trouble with the motherboard) and so am looking for 'evidence' of what comes next ... I ain't convinced yet which way to go, tbh! I game at 2560x1600, so being CPU-bound is a rarity!
 