
AMD R9 Fury X Leaps Ahead Of Nvidia GTX 980 Ti With The Latest Windows 10 Drivers

But surely that means at one point they were worse than they are now? (Which was my original point)

Out of interest, what were the last game patches that offered huge performance jumps like some of AMD's big performance drivers?

Omega.

To be frank it's a somewhat strange question to ask, considering you know more about what you're talking about than your average Joe.
Software, and by that I mean game engines, drivers, etc., is always evolving; someone always finds a better, more efficient way of doing something, so it gets better, and often that results in lower latency, which means higher FPS.

I'm getting to grips with Voxel GI. It's adding about 4ms to my GPU render time, which results in lower FPS, so I'm trying to find ways of improving efficiency, as are Crytek. Once it's found, performance will go up again. Evolution. It takes time.
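
For anyone curious how a few extra milliseconds of GPU time translate into FPS, here is a minimal sketch of the arithmetic. The 16.7 ms (~60 FPS) baseline is purely an assumed starting point; only the ~4 ms Voxel GI cost comes from the post above.

```python
# Rough sketch: converting per-frame GPU time to FPS.
# The 16.7 ms baseline is an illustrative assumption; the 4 ms cost is the
# figure mentioned for Voxel GI above.

def fps_from_frame_time(frame_time_ms: float) -> float:
    """Frames per second for a given per-frame time in milliseconds."""
    return 1000.0 / frame_time_ms

baseline_ms = 16.7       # ~60 FPS before the effect (assumed)
voxel_gi_cost_ms = 4.0   # extra GPU render time the effect adds

before = fps_from_frame_time(baseline_ms)                    # ~60 FPS
after = fps_from_frame_time(baseline_ms + voxel_gi_cost_ms)  # ~48 FPS

print(f"{before:.1f} FPS -> {after:.1f} FPS "
      f"(the extra {voxel_gi_cost_ms} ms costs about {before - after:.1f} FPS)")
```

The same 4 ms hurts proportionally more the higher the starting frame rate, which is why shaving even a millisecond or two off an effect is worth the effort.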
 
But surely that means at one point they were worse than they are now? (Which was my original point)

Out of interest, what were the last game patches that offered huge performance jumps like some of AMD's big performance drivers?

Games for me recently: MK10, Batman AK, FIFA 16, H-Hour, just to name some.

Optimization is the norm. It will always happen whether we like it or not.
 
What's so difficult to understand about finding better ways of doing things as time goes on?

Shankly's example of consoles improving over time is valid. Look at early 360 games compared to ones released later in its life cycle. It's the same principle.
The more time you have with the hardware, the more likely you are to find ways of writing software that can take better advantage of its capabilities.
 
What's so difficult to understand about finding better ways of doing things as time goes on?

Shankly's example of consoles improving over time is valid. Look at early 360 games compared to ones released later in its life cycle. It's the same principle.
The more time you have with the hardware, the more likely you are to find ways of writing software that can take better advantage of its capabilities.


This ^^^^ it's par for the course.

Omega.

To be frank it's a somewhat strange question to ask, considering you know more about what you're talking about than your average Joe.
Software, and by that I mean game engines, drivers, etc., is always evolving; someone always finds a better, more efficient way of doing something, so it gets better, and often that results in lower latency, which means higher FPS.

I'm getting to grips with Voxel GI. It's adding about 4ms to my GPU render time, which results in lower FPS, so I'm trying to find ways of improving efficiency, as are Crytek. Once it's found, performance will go up again. Evolution. It takes time.
 
What's so difficult to understand about finding better ways of doing things as time goes on?

Shankly's example of consoles improving over time is valid. Look at early 360 games compared to ones released later in its life cycle. It's the same principle.
The more time you have with the hardware, the more likely you are to find ways of writing software that can take better advantage of its capabilities.

It's fairly common in software development to discover unexpected ways of utilising something that bring significant performance improvements, especially as the developers become more familiar with the software/hardware over time. Often, due to the sheer scale of things, developers will have to use "safe mode" routines until certain features have been out in the wild long enough to get a good cross-section of how they are best configured, etc., something that couldn't possibly be done in a normal, reasonable development timeframe.
 
I'm very much looking forward to a DX11 environment similar to that of Nvidia, and I think we are actually going to get it, possibly over the next few months too. I don't expect Crimson to do this all at once but to get there over a few updates; from the looks of it we get the public release around the 24th, I think it was. I hope we get good things for Linux also, as I'm thinking of putting a distro on my current machine when I upgrade it, SteamOS maybe.

Are there any 3Dmark comparisons I can look over yet? Can't test myself as both of my 1200p monitors are dead with bad caps :(
 
All AMD GPUs Get A Sizable Performance Boost

Worthy of note is that these drivers aren't part of the upcoming Crimson Edition Radeon Software, which is a completely redesigned affair that replaces Catalyst Control Center with Radeon Settings and promises to deliver a wide array of user interface improvements as well as performance optimizations. So it's going to be really interesting to see how much of an additional improvement these upcoming drivers will deliver.
We didn't really expect the competitive landscape of graphics to change as drastically as it has, and we certainly did not expect it to come via driver updates, but it's happening. This development may also incentivize Radeon gamers who are currently on Windows 8.1 and 7 to upgrade to Windows 10. What's perhaps the most exciting aspect of all is that, because these performance improvements are achieved through the drivers, they're free and immediately accessible to everyone. So Radeon users will be able to enjoy the benefits simply by updating their software. Getting a free performance boost without having to fork out a decent sum for a hardware upgrade is never unpopular. So it seems the ultimate goal of having one's cake and eating it too may not be far-fetched.

Read more: http://wccftech.com/amd-r9-fury-x-performance-ahead-nvidia-980-ti-latest-drivers/#ixzz3quY2QOSX

Whenever I see an R9 290(X) tested, they don't mention whether it's the stock model, which actually cannot hold its maximum speed at all times, or a third-party one. If they're using the stock one, then boy, the R9 290/X is even faster than that! :D

To see an R9 290 being equal to a GTX 970 or better, that's a BIG thing for AMD. They may be "power hungry", but those oldies are some mean beasts, and I'd say that's probably what most of us are interested in / should look at (and not some way too expensive cards where the 980 Ti wins anyway due to overclocking and vRAM). :)
 
What's so difficult to understand about finding better ways of doing things as time goes on?

Shankly's example of consoles improving over time is valid. Look at early 360 games compared to ones released later in its life cycle. It's the same principle.
The more time you have with the hardware, the more likely you are to find ways of writing software that can take better advantage of its capabilities.

It's just another GM moment; half the time I find he just posts for the fun of it even though he knows the correct answer.
 
I'm very much looking forward to a DX11 environment similar to that of Nvidia, and I think we are actually going to get it, possibly over the next few months too. I don't expect Crimson to do this all at once but to get there over a few updates; from the looks of it we get the public release around the 24th, I think it was. I hope we get good things for Linux also, as I'm thinking of putting a distro on my current machine when I upgrade it, SteamOS maybe.

Are there any 3Dmark comparisons I can look over yet? Can't test myself as both of my 1200p monitors are dead with bad caps :(

Yes, if AMD can get their DX11 overhead down like Nvidia have, then their GPU performance could actually match or beat Nvidia's. Older games would get a decent boost.
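
As a rough, hypothetical sketch of why that driver overhead matters: if the driver burns a fixed amount of CPU time submitting each draw call, the CPU side can cap the frame rate no matter how fast the GPU is. All of the numbers below (draw-call count, per-call cost, GPU frame time) are made-up assumptions purely for illustration.

```python
# Toy model: the frame rate is limited by whichever side takes longer per
# frame, the CPU submitting draw calls through the driver or the GPU
# rendering them. All numbers are illustrative assumptions.

def max_fps(draw_calls: int, cpu_us_per_draw: float, gpu_frame_ms: float) -> float:
    """Upper bound on FPS given per-draw-call CPU cost and GPU render time."""
    cpu_frame_ms = draw_calls * cpu_us_per_draw / 1000.0  # CPU time spent in the driver
    frame_ms = max(cpu_frame_ms, gpu_frame_ms)            # the slower side sets the pace
    return 1000.0 / frame_ms

gpu_ms = 12.0  # assume the GPU alone could render the frame in 12 ms (~83 FPS)

# Same GPU, same scene; only the driver's per-draw-call CPU cost changes.
print(max_fps(5000, cpu_us_per_draw=4.0, gpu_frame_ms=gpu_ms))  # ~50 FPS, CPU/driver-bound
print(max_fps(5000, cpu_us_per_draw=2.0, gpu_frame_ms=gpu_ms))  # ~83 FPS, GPU-bound
```

Halving the per-call cost in the second case lets the GPU set the frame rate instead of the driver, which is roughly what "getting the DX11 overhead down" means in practice.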
 
So are we saying that Nvidia could make their drivers better but they're purposely not because they hate the end user?

Otherwise we'd have to say that Nvidia got closer to the max performance out of the driver much earlier, but we've just been saying that's impossible because AMD didn't.

The last big increase I remember from Nvidia was when they basically rewrote bits of Microsoft's DirectX, which is more like improving DirectX than improving their driver.


EDIT:
And as for the console example, I think it's only comparable in the case where the people that made the hardware make the improvements. For everyone else it's probably the experience of working with someone else's hardware.

In fairness, the whole thing was a bit tongue-in-cheek anyway (my first post (reply to humbug?) had a smilie in it to indicate as much). After that I was just playing devil's advocate on how we often see these things as a boost now rather than as performance being held back at the start.
 
So are we saying that Nvidia could make their drivers better but they're purposely not because they hate the end user?

Otherwise we'd have to say that Nvidia got closer to the max performance out of the driver much earlier, but we've just been saying that's impossible because AMD didn't.

The last big increase I remember from Nvidia was when they basically rewrote bits of Microsoft's DirectX, which is more like improving DirectX than improving their driver.

IMO Nvidia are better at getting drivers right at the right time than AMD.

Quite a few times I have seen AMD come from behind to take the lead. I can't say I have ever seen drivers from Nvidia that really make them much better than they were before.

Fury X was probably the most hasty and rushed release I have ever seen. I actually expected this to happen, which was part of the reason I went with a Fury X.
 
So are we saying that Nvidia could make their drivers better but they're purposely not because they hate the end user?

Otherwise we'd have to say that Nvidia got closer to the max performance out of the driver much earlier, but we've just been saying that's impossible because AMD didn't.

The last big increase I remember from Nvidia was when they basically rewrote bits of Microsoft's DirectX, which is more like improving DirectX than improving their driver.

Nvidia have done it in the past; have you forgotten the time when they released the so-called Mantle killer driver? Again, they found a way to lower DX11 overhead, resulting in better optimisation for their line-up.

Again it happens.
 
IMO Nvidia are better at getting drivers right at the right time than AMD.

Quite a few times I have seen AMD come from behind to take the lead. I can't say I have ever seen drivers from Nvidia that really make them much better than they were before.

Fury X was probably the most hasty and rushed release I have ever seen. I actually expected this to happen, which was part of the reason I went with a Fury X.

I think we will find that, over time, once AMD and devs get to work more with HBM, we will see another increase in performance.
 
Fury X was probably the most hasty and rushed release I have ever seen. I actually expected this to happen, which was part of the reason I went with a Fury X.

Which is nuts when you think about it; it's really only since mid-October that stock has been good, even though it launched on 23 June. Makes me wonder when they would have launched it if they had delayed further to sort out the pump and supply issues.
 
I think we will find that, over time, once AMD and devs get to work more with HBM, we will see another increase in performance.

Yes there's that too.

One thing I still don't think is going to happen, though, is overclocking. I'm convinced that the core voltage is tied to the memory voltage, and thus if you touch it things go pop.

Mind you I have never overclocked a GPU meaningfully and I don't really intend to start now.
 
Yes there's that too.

One thing I still don't think is going to happen, though, is overclocking. I'm convinced that the core voltage is tied to the memory voltage, and thus if you touch it things go pop.

Mind you I have never overclocked a GPU meaningfully and I don't really intend to start now.

The dev behind MSI Afterburner has already hinted that voltage control will happen.
 
Nvidia have done it in the past; have you forgotten the time when they released the so-called Mantle killer driver? Again, they found a way to lower DX11 overhead, resulting in better optimisation for their line-up.

Again it happens.

That was actually quite impressive on Nvidia's part; it's not a fraction of Mantle, but it is pretty fantastic DX11.

Anyway... Googlay, Nvidia don't hate the end user, and for anyone to say it's impossible to increase performance on anything is just trolling, if they are serious.

I would say in the past Nvidia absolutely had the upper hand in software development, but more recently I would say AMD have their act together to such an extent that they are more often putting Nvidia on the defensive, and that's good for all. :)
 