
New consoles = better CPU optimization?

Now we're really speculating :p

If a GTX 680 couldn't max it out at 1080p because of VRAM I'd be surprised; at 1440p and above, not so much, but then I'd want it to run out of GPU grunt before VRAM for me to be impressed.

Don't really want to see a 7970 maxing out BF4 yet either; it can't comfortably max out Tomb Raider at 1080p, and I want BF4 to be more demanding than that.
 
Yeah, I think it'll need 2x7950s to max out on ultra, maybe even 2x7970s. I think there's a really high chance that the new GPU will do the highest single-GPU settings possible though. Maybe it'll have a high texture option available only to GPUs with 3GB+.
 
The AMD involvement is also why I think nvidia "2GB is more than enough" fans might be annoyed with BF4 ultra settings. No AMD GPU capable of running it that high has less than 3GB.


It could be brown pants time for the "2GB will be enough forever" crowd. Before anyone mentions caching: the alpha only had one level playable, and caching only kicks in during the second map load. ;)

[Image: dR9XLzg.jpg - BF4 alpha VRAM usage graph]
 
I've always said AMD's VRAM algorithm is more efficient than Nvidia's, and that graph shows it too :p

It wouldn't be maxed out, but they could always drop the MSAA to 2x.

I guess the reason a lot of people think 2GB is enough is that for years 1GB was enough; it's only in the last year that you've needed more.

But from the graph, the potential is there for BF4 to need more than 2GB maxed out.
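
For a rough sense of where the memory actually goes at 1080p with 4x MSAA, here's a back-of-the-envelope sketch; the G-buffer layout and byte sizes are assumptions for illustration, not BF4's real renderer. The takeaway is that multisampled render targets alone don't get near 2GB, so it would be the texture and streaming pools that separate the 2GB and 3GB cards.

```python
# Back-of-the-envelope VRAM arithmetic for render targets at 1080p.
# The G-buffer layout and byte sizes below are assumptions for illustration,
# not DICE's actual Frostbite setup.

WIDTH, HEIGHT = 1920, 1080
PIXELS = WIDTH * HEIGHT

def target_mib(bytes_per_pixel, samples=1):
    """Size of one render target in MiB at the given MSAA sample count."""
    return PIXELS * bytes_per_pixel * samples / (1024 ** 2)

MSAA = 4

gbuffers    = 4 * target_mib(4, MSAA)  # four hypothetical RGBA8 G-buffer targets
depth       = target_mib(4, MSAA)      # 32-bit depth/stencil
backbuffers = 2 * target_mib(4)        # double-buffered, resolved (no MSAA)

render_targets = gbuffers + depth + backbuffers
print(f"Render targets with {MSAA}x MSAA: ~{render_targets:.0f} MiB")
print(f"Left on a 2GB card for textures, geometry and streaming: "
      f"~{2048 - render_targets:.0f} MiB")
```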
 
Agreed, and of course details can be lowered. Who would have thought that Ultra with 4x AA would exceed 2GB at 1080p, though? Wouldn't surprise me if AMD are trying to help push the VRAM usage up. It cuts out a large chunk of Nvidia's 2GB cards if they can, lol.

Two sides to this, though. The game is likely unoptimized, so maybe VRAM usage will come down. On the other side, look at the map: 75% or more of the textures are missing, yet the VRAM usage is still pretty high. One would speculate that usage will increase once more textures are added.


[Image: aYpzzSS.jpg - BF4 alpha screenshot]
 
If I were AMD, having got the biggest game of the year tied up and launching it alongside a new flagship GPU, I'd want it to run just well enough that people can't say it's screwing over Nvidia users, but at the same time have everyone on forums like this saying you should only consider buying AMD for BF4.
 
Dice won't let them do that. :D The vram usage is interesting though, can't wait to see how it plays out once the final version hits town.
 

A stock 7970 is already well ahead of a GTX 780 in Bioshock Infinite, and just edges out a Titan. With BF4 and the new GPU I can't see it not beating a Titan :)
 

http://www.techspot.com/review/655-bioshock-infinite-performance/page2.html
http://www.techspot.com/review/655-bioshock-infinite-performance/page3.html
http://www.techspot.com/review/655-bioshock-infinite-performance/page4.html


And another review:

http://www.guru3d.com/articles_pages/bioshock_infinite_graphics_performance_review_benchmark,6.html


Some Crossfire/SLI scaling: http://www.tomshardware.com/reviews/geforce-gtx-770-gk104-review,3519-13.html (unless the Titan and 680 are achieving over 100% scaling, a single 7970 isn't faster).
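
To make that scaling argument concrete, here's a rough sketch of the inference: if two cards can at best double a single card (100% scaling), the dual-card figure gives a lower bound on the single card. All of the FPS numbers below are hypothetical placeholders, not figures from the linked review.

```python
# Sketch of the "unless scaling is over 100%" argument.
# Every FPS figure here is a hypothetical placeholder, not a number
# taken from the Tom's Hardware review linked above.

def single_card_lower_bound(dual_card_fps, max_scaling=2.0):
    """If two cards can at best double one card (100% scaling),
    a single card must manage at least dual_card_fps / 2."""
    return dual_card_fps / max_scaling

hd7970_single_fps = 55.0  # hypothetical single-card claim being questioned

for name, dual_fps in [("Titan SLI", 120.0), ("GTX 680 SLI", 112.0)]:
    bound = single_card_lower_bound(dual_fps)
    print(f"{name}: {dual_fps:.0f} fps dual -> single card is at least "
          f"{bound:.0f} fps, vs the claimed {hd7970_single_fps:.0f} fps 7970")
```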


I see this: http://forums.anandtech.com/showthread.php?t=2309823 and it's been under scrutiny from the get-go, and it includes more benchmarks showing the same hierarchy as the ones above.
 
Are those on the latest drivers? I'm looking at this one:

[Image: 96msYPR.png - Bioshock Infinite benchmark chart]

(Source ht4u)

That's max settings, no AA. Latest AMD and nvidia drivers.
 
Why is it you only ever see the most AMD-positive benchmarks?
I own a 7970 GHz Edition, but I'm not going to kid myself lol.

Even assuming that benchmark is legit (in the sense that it reflects everyone running those cards at that resolution and those settings), and that AMD gained the lead because of the MSAA, a setup running 4x MSAA is in the vast minority. To say the 7970 is besting the 780/Titan is wrong when it's only doing so in one of about a hundred setups.
 
I'm not - in their set of benchmarks Bioshock Infinite is the only game where it beats a Titan, but that's on the latest drivers, which have given the game a good boost. Since it's an AMD-bundled game I'd expect it to do quite well.
 
Well that's why I've asked if the others are on the latest drivers. One mentions 13.3 beta, etc. If there's something wrong with their benchmark fair enough, but it's on 13.6 drivers and part of a fairly large test.
 
That VRAM graph is meaningless without knowing what the corresponding min/avg frame rates were. High VRAM usage is one thing; high VRAM usage while maintaining 60fps is quite another.
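
A quick sketch of the point: a VRAM figure only means much when it's logged alongside frame times, so you can see whether the minimums are holding up. The frame times and VRAM reading below are invented sample data.

```python
# Why a VRAM figure needs frame-time context.
# The frame times (ms) and VRAM reading below are invented sample data.

frame_times_ms = [14.2, 15.1, 16.7, 15.9, 33.4, 15.0, 14.8, 40.1, 15.3, 15.5]
vram_used_mb = 2350  # hypothetical reading taken during the same run

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)  # worst single frame in the sample

print(f"VRAM in use : {vram_used_mb} MB")
print(f"Average FPS : {avg_fps:.1f}")
print(f"Minimum FPS : {min_fps:.1f}")
# 2.3GB in use looks dramatic on its own, but it's the 33-40ms spikes
# (dips below 30fps) that would show whether a 2GB card is actually
# struggling or whether the engine is just caching opportunistically.
```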
 
Just wondering - seeing as both of the major new consoles will be shipping with AMD 8-core CPUs, will this make games more biased towards AMD?

In the past (PC-wise) games have usually run better on Intel chips, being designed to make optimum use of them, probably due to the use of the Intel compiler (which is obviously going to be biased :) )

What AMD CPU do these new consoles actually use? i.e. are they actually Bulldozer cores or whatever? I think they're x86-based, so I'm pretty sure they'll be better suited to games, with coders having to do little to get them 'ported' to PCs.

Not sure of course as I don't understand the architecture very well, and obviously still undecided if I'll get an xbox 180 ( :p ) or a PS4 yet, but FPS has always been reserved for the PC for me - nowt beats the gud ol mouse/keyboard combo for them type of games!

PS - I hate the thought of my console having 'MOWAR CORES' than my PC - its just not right! ;)

Consoles often have a somewhat odd OS setup, especially the Xbox, which runs a "gaming" OS side by side with an "application" OS, so some CPU cores are tied up doing OS work that (as far as I know) can never be used by the games themselves. That's different from the PC, where games and the OS share cores much more.

Consoles also often emulate in software functionality that on the PC is handled by dedicated hardware, i.e. networking and sound will often have a lot more CPU-processed functionality that's implemented in hardware on the PC (it's cheaper and quite effective when you've got lots of low-performance cores). But it does mean the on-paper core count comes down to a lot fewer cores actually working on the game than it first seems.

Overall the effectiveness of multi-threading in games can be increased quite a lot from where we are now, but at the end of the day the majority of game engines, and the games built on them, will always favor a CPU that can run 1-2 threads on high-performance (or dynamically boosted) cores plus a handful of lower-performance threads, over a CPU that has lots of low-to-medium performance cores. That's something AMD has worked on with Steamroller, which should help level the playing field against Intel when it comes to gaming once CPUs based on that architecture appear.
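
To put a rough number on the "few fast threads vs lots of slow cores" point, here's a toy Amdahl's-law-style model; the 40% serial fraction, the core counts and the per-core speed ratio are all illustrative assumptions, not measurements of any real engine or CPU.

```python
# Toy Amdahl's-law-style model of a game frame: a serial main-thread chunk
# plus work that splits across cores. All numbers are illustrative
# assumptions, not measurements of a real engine or CPU.

def frame_time_ms(serial_fraction, work_ms, cores, per_core_speed):
    """Frame time when the serial part runs on one core and the rest
    is divided perfectly across all cores."""
    serial = serial_fraction * work_ms / per_core_speed
    parallel = (1.0 - serial_fraction) * work_ms / (cores * per_core_speed)
    return serial + parallel

WORK_MS = 40.0         # total single-threaded work per frame (hypothetical)
SERIAL_FRACTION = 0.4  # share stuck on the main/render thread (hypothetical)

fast_quad = frame_time_ms(SERIAL_FRACTION, WORK_MS, cores=4, per_core_speed=1.0)
slow_octo = frame_time_ms(SERIAL_FRACTION, WORK_MS, cores=8, per_core_speed=0.6)

print(f"4 fast cores: {fast_quad:.1f} ms/frame ({1000 / fast_quad:.0f} fps)")
print(f"8 slow cores: {slow_octo:.1f} ms/frame ({1000 / slow_octo:.0f} fps)")
# The serial chunk dominates on the slower cores, so doubling the core count
# doesn't make up for the lower per-core speed until engines thread better.
```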
 