**THE NVIDIA DRIVERS THREAD**

No, he's saying he used the highest quality settings. Reducing settings increases performance, so either Nvidia used the highest settings, in which case the performance is very questionable, or Nvidia used lower settings, which makes the performance even more questionable.

You're missing the point: humbug knows his settings, and you can't go higher than his settings, so that is the reference point.

If humbug had been using, let's say, medium settings, then the highest settings might reduce performance enough that the numbers could/should be that low, but he used the highest settings. Nvidia's numbers are BS if they used the highest settings; if they used anything lower, the numbers are simply BS to an even larger degree.

No, he isn't. Humbug is saying that his 7870XT on low settings beats a 780 Ti on unknown (probably max) settings, and he can't see why that is a faulty premise.

I ran Star Swarm in Mantle on a low setting, which leaves me completely CPU bound. My GPU is much, much weaker than a 780 Ti, and yet with a 4-year-old AMD CPU I was able to get 102 FPS. If the much more powerful 780 Ti is only getting 70 FPS, then it must be because it's bottlenecked by that 3960X in DX: despite that CPU being so much more powerful than mine, the performance is 30% less than what I get in Mantle.

With a vastly more powerful CPU like the 3960X I would get about 140 FPS in Mantle. Add a 290X into the mix and I would get about 300 or 350.
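
A quick sanity check on that "30% less" figure, using only the numbers quoted above:

\[ \frac{102 - 70}{102} \approx 0.31 \]

i.e. the 780 Ti's 70 FPS DX result is roughly 31% below the 102 FPS Mantle result.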

The thing with the Star Swarm demo is that it is always CPU bound; I never see more than 50% GPU usage even with deferred contexts enabled, so yes, it is entirely plausible that a lower-end GPU gets close to the performance of a top-tier card that is running at 50%.
Although I never see 100% usage on any core either, in DX or in Mantle, so exactly where this supposed bottleneck is occurring is a quandary.
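
For anyone wondering, "deferred contexts" here refers to the D3D11 multithreading path where worker threads record command lists that the single immediate context then replays. A minimal sketch of the pattern, just to illustrate the mechanism being discussed (the BuildSceneCommands helper and the per-thread scene split are made up for the example, and error handling is omitted):

```cpp
// Sketch of D3D11 deferred contexts: worker threads record command lists,
// the single immediate context replays them. Illustrative only.
#include <d3d11.h>
#include <thread>
#include <vector>

// Hypothetical helper: records the draw calls for one slice of the scene.
void BuildSceneCommands(ID3D11DeviceContext* ctx, int slice);

void RenderWithDeferredContexts(ID3D11Device* device,
                                ID3D11DeviceContext* immediate,
                                int workerCount)
{
    std::vector<ID3D11DeviceContext*> deferred(workerCount);
    std::vector<ID3D11CommandList*>   lists(workerCount, nullptr);

    for (int i = 0; i < workerCount; ++i)
        device->CreateDeferredContext(0, &deferred[i]);

    // Each worker records its share of the frame on its own deferred context.
    std::vector<std::thread> workers;
    for (int i = 0; i < workerCount; ++i)
        workers.emplace_back([&, i] {
            BuildSceneCommands(deferred[i], i);
            deferred[i]->FinishCommandList(FALSE, &lists[i]);
        });
    for (auto& t : workers) t.join();

    // Final submission still funnels through the one immediate context.
    for (int i = 0; i < workerCount; ++i) {
        immediate->ExecuteCommandList(lists[i], FALSE);
        lists[i]->Release();
        deferred[i]->Release();
    }
}
```

Because submission still goes through that one immediate context, and the driver may spread its own work over several threads, the CPU cost can end up smeared across cores rather than pegging a single one, which would be consistent with the usage pattern described above.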
 
What is it with you, Humbug, putting settings to low and then saying "I challenge you"... Give it up, Humbug, please. You are, for some reason, cheesed off with this news and resorting to stomping around the forum, making out your 7870 is a 390X in disguise and lambasting any good news with this tripe.

@ DM, I take it you didn't pick up on the AMD slides when they do their tech demos and purposefully start the chart at a high number to make the results look much better than they actually are?

I need to keep tweeting people to find out when these new drivers are coming. Time to make myself look a proper brown nose :D

[Star Swarm results chart]


For the record, DX11 is kicking Mantle's behind.

Ohhh and StarSwarm is the most inconsistent bench I have ever seen and pretty much useless for comparison.
 
Greg, do we have any idea when this new Nvidia driver will be available?
I'm still in shock that Microsoft and Nvidia have sprung this :eek: I did wonder how quiet Nvidia was being and how much limelight AMD was getting; now we know :cool:
 
Look at your results again??

Humbug, with a friggin' ancient Phenom II X6 1090T at stock and a £120 graphics card, is scoring slightly higher than a system with a CPU which probably costs more than the Phenom II and HD7870, while being grossly overclocked. The same goes for your result. It could be twice as fast as his, but it should be, since you have a £350+ CPU, £700+ graphics card, mega-cooling, etc. It would kind of be rubbish if it wasn't, no?? :p

Plus, if Star Swarm is inconsistent, then why bother even showing that chart?? A bit pointless??

It's like with 3DMark or Unigine: they don't necessarily equal what you see in the real world.

TH tests Thief with Mantle:

http://www.tomshardware.com/reviews/thief-mantle-benchmarks,3773.html

An FX4170 just about matches a Core i7 4770K!

[Benchmark charts from the Tom's Hardware Thief review showing Mantle vs DirectX results]

That's in an actual game, and people can deflect all they want. Those graphs alone say a lot, even with £70 and £130 graphics cards, in a VERY CPU-limited game, i.e. one that uses up to 4 threads but prioritises 2 threads.

Things like Mantle are for improving CPU efficiency, which is what has been stated all along. Heck, this is what they are talking about with DX12.

Heck, even the multi-threaded NV drivers are too.

It's for the mass market. The majority of gamers don't own super-high-end CPUs, let alone normal Core i7s, or even overclock.

They are the ones who will see the most benefit from all this.

The thing is, ever since DX9.0c, MS has been stuck on its butt for nearly a decade, using drip-fed DX improvements as an excuse for you to buy a new version of Windows.

Look at how they managed to fend off OpenGL.

Don't people think it is rather convenient that within a year of SteamOS and Mantle, MS starts talking about DX12??

It's time they felt the pressure; otherwise they will use their market position to do eff all and push everyone to tablets.

Why??

It costs less for them and means you buy more Windows licenses over time. Tablets are disposable items, and people don't tend to transfer the license over.
 
I'm so glad I stuck with Nvidia... the performance of the 780 Ti has totally blown me away & the in-game real-world performance is just jaw-dropping. The 780 cards run very, very cool & are oh so quiet.

Once again NV saves the day, long live DX :p Can't wait for this new balls-to-the-wall driver.
 

To be honest, if I were an AMD owner I'd be devastated by this news and would be looking to go green :D. Enjoy your lovely 780 Ti, dude ;)
 
@ Cat, I am not quoting all that, but if you look at what I said, I said Star Swarm is the most inconsistent bench there is. Humbug's "390X" is beating a 290X from DamnedLife, so how can that be compared in real-world performance?

I was kinda making a mockery of it but I can see how it would get missed.
 

I'm not sure I'd go that far, as a lot of AMD buyers buy on price, but it certainly curtails the "haha, we've got Mantle and will ownz you" talk that has been de rigueur around here for the last couple of months... If I had based a purchasing decision on Mantle I might be a bit upset ;)
 
I still don't see people's logic in this...

Did people pay a price premium for their cards (for Mantle)? No. People who bought Nvidia cards did.

Were the AMD cards lower in price? Yes.

Does Mantle mean AMD cards can no longer use DirectX? No.


So I don't see the reasoning in why AMD owners should be "devastated". The only thing AMD owners should be "devastated" or "worried" about is their cards' performance being nerfed in some game titles.
 
In fairness, Marine, nVidia owners have watched AMD get Mantle, and I even admitted to being slightly jealous. Now we have improvements coming, which are hopefully across the whole of the DX11 range and give Mantle-level performance gains, and with the news of DX12 being backward compatible for the important bits, nVidia owners are all a little excited (if we are honest).

We have not had much to cheer about but now we do :D
 

I was thinking of the DX11 boost with Nvidia drivers, so it would appear AMD get no DX11 boost, and if a good number of games benefit with Mantle-like performance then yes, I would be a little peeved, but I'm only saying how I would feel.
 
Alatar on OCN did some testing:

So with the DX11 efficiency talk I decided that I would do some small scale investigating.

Now since I do not have a 290 or a 290X I was forced to use my 280X. I OC'd the 280X and I downclocked my Titan so that they both offered about the same amount of GPU performance:

1100MHz 280X: http://www.3dmark.com/3dm/2734474
820MHz Titan: http://www.3dmark.com/3dm/2734934
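
As a rough cross-check on those clocks, assuming the usual shader counts for these cards (2048 shaders for the 280X, 2688 for the original Titan), the theoretical FP32 throughput works out almost identical:

\[ 2048 \times 2 \times 1.10\,\text{GHz} \approx 4.51\ \text{TFLOPS}, \qquad 2688 \times 2 \times 0.82\,\text{GHz} \approx 4.41\ \text{TFLOPS} \]

a gap of about 2%, which lines up with the "within a couple of per cent" figure quoted further down.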

Now, since Star Swarm at stock settings is really inconsistent and doesn't offer similar results between different runs, I used a custom scenario with a custom camera angle. This means everything happens exactly the same between runs and is viewed from the same angle, so Star Swarm behaves like any other normal benchmark and the custom-scenario results are comparable with each other.

Here's the scenario if anyone else wants to test:

Attachment: ScenarioCustom.csv (3k .csv file)
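
One way to check that the custom scenario really is repeatable is to run it a few times per card and look at the relative spread of the per-run averages. A rough sketch of that check (the FPS values in the list are placeholders, not real measurements):

```cpp
// Quantify run-to-run consistency: feed in the average FPS from several runs
// of the same scenario and report the relative spread. Placeholder numbers.
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    std::vector<double> fps = {31.2, 32.0, 31.7, 31.9};  // per-run averages (illustrative)

    double mean = std::accumulate(fps.begin(), fps.end(), 0.0) / fps.size();
    double var  = 0.0;
    for (double f : fps) var += (f - mean) * (f - mean);
    double sd = std::sqrt(var / fps.size());

    // A small relative spread (a few per cent or less) means the scenario is
    // repeatable enough for card-to-card and API-to-API comparisons.
    std::printf("mean %.2f fps, spread %.1f%%\n", mean, 100.0 * sd / mean);
    return 0;
}
```

If the spread stays that tight, differences of the size reported below can be put down to the card/API rather than run-to-run noise.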

Then I started testing both of the cards in the Star Swarm bench.

[Screenshots of the individual runs (280X DX11, 280X Mantle, Titan DX11) were attached here.]

Results (avg fps):

280X DX11: 31.69 fps
280X Mantle: 42.52 fps
Titan DX11: 39.3 fps

Now remember that with these clocks the GPUs have approximately the same amount of GPU power (within a couple of per cent). And despite this:

The Titan is 24% faster than the 280X in DX11.
The 280X is 8% faster than the Titan when the 280X is running Mantle and the Titan is on DX11.
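
Those two percentages follow directly from the averages above:

\[ \frac{39.3}{31.69} \approx 1.24, \qquad \frac{42.52}{39.3} \approx 1.08 \]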

Again, the cards are clocked so that GPU performance is comparable.


Conclusion:

Even in Windows 7 there's a very clear advantage in favor of Nvidia when it comes to DX11 CPU overhead.

RE the new drivers. It's hardly a stretch that NV and MS have made similar overhead improvements given the push that has been given. Seems the playing field may equalise sooner than expected.
 

We tried to do the exact same test here - someone worked out a custom file that was very repeatable. However, some AMD users couldn't follow simple instructions and got uppity at Greg for not including their results, so we had to switch back to the default profile, which is why the Star Swarm bench thread is dog turd.
 
I was thinking of the DX11 boost with Nvidia drivers, so it would appear AMD get no DX11 boost, and if a good number of games benefit then yes, I would be a little peeved, but I'm only saying how I would feel.
If Nvidia can use DX11 to improve their performance, there's no reason why AMD wouldn't be able to do the same.

Until we actually see it implemented in games, I still think this is more of a PR move than anything else, like AMD kept saying about Bulldozer, or Nvidia saying Fermi was just around the corner...

If DX11 really had that much room for improvement, as others have already pointed out, why the hell has it been left in such a poor performance state for so many years? Typical M$ sitting on their fat behind, ignoring the PC gaming community?
 

To be fair, the same could be said about Mantle. People seem to have had it drilled into them that it has magically rewritten the rule book. There is equally no reason NV couldn't make the same overhead improvements through the existing API, which testing will tell us in time.


Hi Greg :cool:
 
AMD could do the same, and they might well be doing so right now.
But I would imagine they're putting more R&D into Mantle, and it's hard to do both.
 

I see what you're saying, but I think it would be unlikely for AMD to improve DX11 performance as their focus is now on Mantle, though I could be wrong.

Indeed, if the improvements are as good as we're hoping, then I will be a little bewildered as to why it hadn't been done before now, but on the other hand I don't think I'd dwell on it for very long either; better late than never.
 