
**The New Official THIEF-Incorporating Benchmark Thread**

I thought Matt said it was just BF4 and DICE that had cocked up the memory (mis)management? :p

Thief and StarSwarm work fine on my 1GB HD 7770, and since posting that Mantle awareness thread I've got Mantle working at higher settings than DirectX and providing better performance. I'm happy, though I still think VRAM usage can be improved upon, and I wanted to raise that issue.
 
It'll be easier to draw conclusions once Mantle is integrated much earlier in the dev cycle rather than patched in, I think.

I don't think there's an inherent problem with the API causing it, but it is intriguing that it's present in two games by different devs on different engines.

Time will tell! :)
 
Mighty Mantle Crossfire @ 1080p, i7 2600K @ stock 3.5GHz, GPUs @ 977/1250

thief%20bench%20Mantle%20xfire.png
 
The difference is stark for me. Thief/StarSwarm uses a set amount of VRAM and does not increase past that point. Battlefield, which uses the Frostbite engine, tends to start at a certain amount of VRAM and then keep increasing until it reaches a plateau and levels off.
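
If anyone wants to check this on their own system, here's a minimal sketch for summarising a VRAM log over a session, so you can see whether usage sits flat (Thief/StarSwarm style) or climbs towards a plateau (Frostbite style). It assumes an Afterburner/GPU-Z style CSV log; the file name and the "Memory usage [MB]" column header are guesses and will differ per tool.

```python
import csv

# Assumed log format: one row per sample, VRAM in a "Memory usage [MB]" column.
# Adjust LOG_PATH and COLUMN to match whatever your monitoring tool writes out.
LOG_PATH = "vram_log.csv"
COLUMN = "Memory usage [MB]"

samples = []
with open(LOG_PATH, newline="") as f:
    for row in csv.DictReader(f):
        try:
            samples.append(float(row[COLUMN]))
        except (KeyError, ValueError):
            continue  # skip malformed rows

if samples:
    first, last = samples[0], samples[-1]
    print(f"samples: {len(samples)}")
    print(f"start:   {first:.0f} MB")
    print(f"peak:    {max(samples):.0f} MB")
    print(f"end:     {last:.0f} MB")
    # Flat usage ends close to where it started; a climb to a plateau
    # ends well above the starting figure.
    print("pattern:", "flat" if last - first < 100 else "climbing to a plateau")
```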
 
I played the BF4 campaign on Ultra at 4K internal res (1080p @ 200% resolution scale), minus any AA, the whole lot in three sittings on Mantle. VRAM peaked at 3970MB; I was willing it to hit the buffer limit just to see that.

Looks awesome: bright, sharp and with real depth. I might do it again in DX just to record it all with DVR.
 
So not real 4K then lol
 

I took a couple of screenshots earlier at 1440p with 150% resolution scale (4K internal), Ultra preset with 4x AA. It looks nice but I'm sure proper 4K looks better.

ScreenshotWin32-0002.png

ScreenshotWin32-0007.png
 

It will look better on a 4K display for sure, but I don't know what you mean by 'proper 4K'. It's the same thing as far as the render goes: think of resolution scaling as downsampling. The game is being rendered at 4K and then scaled down to fit whatever res you have set, so the image is a 4K-quality image displayed at 1080p in my case and 1440p in yours. :)

I think the end result also differs between you and me, as I run 1080p while you run 1440p; you're already closer to 4K than I am natively.
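
To put numbers on it, here's a quick sketch (plain Python, nothing game-specific) showing that 200% at 1080p and 150% at 1440p both land on the same 3840x2160 internal render resolution. It assumes the resolution scale multiplies each axis, which is how the posts above use it.

```python
def internal_resolution(width, height, scale_percent):
    """Internal render resolution when a per-axis resolution scale is applied."""
    s = scale_percent / 100.0
    return round(width * s), round(height * s)

print(internal_resolution(1920, 1080, 200))  # (3840, 2160) -> 4K from 1080p @ 200%
print(internal_resolution(2560, 1440, 150))  # (3840, 2160) -> 4K from 1440p @ 150%
print(internal_resolution(2560, 1440, 100))  # (2560, 1440) -> native, no scaling
```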
 
Q6600 @ 3.6GHz, R9 290 @ 1060/1400, Catalyst 14.4 WHQL.

DirectX: Min: 12.8, Max: 68.6, Avg: 28.5
thief_dx.png

Mantle: Min: 29.0, Max: 79.6, Avg: 44.5
thief_mantle.png

Quite a difference on this aging CPU. At these settings even the menu screen is juddery with DirectX, yet nice and smooth with Mantle.
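
Putting numbers on that difference, here's a quick sketch of the relative uplift using the figures from the run above:

```python
# Mantle vs DirectX uplift from the Q6600 / R9 290 run posted above.
dx = {"min": 12.8, "max": 68.6, "avg": 28.5}
mantle = {"min": 29.0, "max": 79.6, "avg": 44.5}

for key in ("min", "max", "avg"):
    gain = (mantle[key] / dx[key] - 1) * 100
    print(f"{key}: {dx[key]:.1f} -> {mantle[key]:.1f} fps  (+{gain:.0f}%)")

# min: 12.8 -> 29.0 fps  (+127%)
# max: 68.6 -> 79.6 fps  (+16%)
# avg: 28.5 -> 44.5 fps  (+56%)
```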
 

Looks like Mantle might breathe a bit of life into your aging CPU behemoth. :)
 
Sorry, I had the image set to private.

Mantle Crossfire @ 1080p, i7 2600K @ stock 3.5GHz, GPUs @ 977/1250
thief%20bench%20Mantle%20xfire.png

You need to overclock your CPU, bud, and get some faster memory. :)

Third stock run and my best yet; I'll submit this one, please.

[email protected]
16GB DDR3 2133MHz 9-10-10-21-1T
1080p
290P Crossfire, stock @ 975/1250
14.6 Beta

2hRFnFi.jpg

And yes, my system is beating Kaap's. Lol. (Sorry Kaap :p)
 