AMD or Nvidia: Between a Rock and a Hard Place?

I don't see the issue with CrossFire support for FreeSync. SLI with G-Sync has stuttering issues, and most people only use one of their GPUs because of this, so I don't blame AMD for holding back FreeSync CrossFire support: if it's going to have the exact same problem as SLI and G-Sync, it will just annoy even more CrossFire users.

Are they now? I use G-Sync with SLI and it's smooth as butter. At 1440p/144Hz you really want/need FreeSync or G-Sync. Lack of CrossFire FreeSync support is a big deal.

FreeSync also has quite limiting refresh rate ranges, and doesn't work with some quite recent GPUs (here's looking at you, R9 280X). I would have had to replace my 7970/7990 tri-fire system anyway to use FreeSync, so I may as well have gone G-Sync.
 
I agree with all of them tbh, all valid. I guess for me The Witcher 3 has got me thinking: when my £890 Titan X can only just manage 60fps at ultra and a £250 PS4 console really doesn't look much worse, I'm wondering whether I've just wasted a load of money lol.

Another way of putting it: a £100-120 GPU doesn't look much worse than the £890 Titan X, since a £100-120 GPU on PC plays The Witcher 3 at the same level as the PS4.
 
FreeSync also has quite limiting refresh rate ranges, and doesn't work with some quite recent GPUs (here's looking at you, R9 280X).

To be fair, that GPU was originally launched on Jan 9, 2012. The fact it got renamed and relaunched in the next generation doesn't give it super powers.
 
Ok so because Greg had some issues, everyone else in the world must have them too? Right.

The last time I saw stutter in SLI was on 460s, and no one else I know has any issues. I'm sure it exists in certain situations, but I'm also guessing a fair amount is unrelated to SLI and more PEBKAC.

You are just typing trash for the sake of it

Yeah, I pretty much feel it was something I was doing wrong, as so many others aren't having the same problems as me.
 
AMDMATT, I haven't seen those results before, but it's looking like a good thing that I recently went for a 5820K, as it will give a nice boost in DX12. Last time I saw benchmarks of that they were running 4-core processors; I didn't realise jumping to an actual 6 cores (no HT involved) would make a difference like that. Seems to be the sweet spot too.

Looking forward to seeing benchmarks and prices of the 390 cards for sure.
 
It's certainly looking that way; our DirectX 12 driver is producing 33% more draw calls.

[Charts: 3DMark API Overhead feature test, draw calls per second under DX11 vs DX12/Mantle]

Source
http://www.pcper.com/reviews/Graphics-Cards/3DMark-API-Overhead-Feature-Test-Early-DX12-Performance

AMDMATT, I haven't seen those results before, but it's looking like a good thing that I recently went for a 5820K, as it will give a nice boost in DX12. Last time I saw benchmarks of that they were running 4-core processors; I didn't realise jumping to an actual 6 cores (no HT involved) would make a difference like that. Seems to be the sweet spot too.

Looking forward to seeing benchmarks and prices of the 390 cards for sure.

As you can see from the results I posted, we're extremely excited about these low-level APIs; they seem very similar to Mantle. :)
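For context on the benchmark linked above: the 3DMark API overhead test essentially piles on draw calls until the frame rate drops below a threshold. A toy model of why lower per-call driver overhead translates into more sustainable draw calls (the `max_draw_calls` helper and per-call costs are made up for illustration, not the real benchmark):

```python
# Toy model of an API-overhead test: find how many draw calls fit in a
# 30 fps frame budget given a fixed per-call driver cost. Numbers are
# hypothetical, chosen only to illustrate the mechanism.

def max_draw_calls(cost_per_call_us, budget_us=33_333, step=10_000):
    """Return the largest draw-call count whose total cost fits the frame budget."""
    calls = 0
    while (calls + step) * cost_per_call_us <= budget_us:
        calls += step
    return calls

# A driver with lower per-call overhead sustains proportionally more calls:
dx11 = max_draw_calls(cost_per_call_us=0.04)  # pretend 40 ns per call
dx12 = max_draw_calls(cost_per_call_us=0.03)  # pretend 30 ns per call
```

The ratio between the two results tracks the ratio of per-call costs, which is the shape of the gains the charts above show.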
 
Ok so because Greg had some issues, everyone else in the world must have them too? Right.

The last time I saw stutter in SLI was on 460s, and no one else I know has any issues. I'm sure it exists in certain situations, but I'm also guessing a fair amount is unrelated to SLI and more PEBKAC.

You are just typing trash for the sake of it

Or maybe it's because Greg isn't so "loyal" to Nvidia/his Swift as to deny the problems?

There are more than a few people on this forum saying they have issues, and there are also plenty of threads on other forums about it.

Perhaps you and the others don't notice the micro stutter as you just aren't sensitive to it.

You are just typing trash for the sake of it

Is someone getting a little touchy?

Are they now? I use G-Sync with SLI and it's smooth as butter. At 1440p/144Hz you really want/need FreeSync or G-Sync. Lack of CrossFire FreeSync support is a big deal.

FreeSync also has quite limiting refresh rate ranges, and doesn't work with some quite recent GPUs (here's looking at you, R9 280X). I would have had to replace my 7970/7990 tri-fire system anyway to use FreeSync, so I may as well have gone G-Sync.

Yes, of course having no CrossFire support for FreeSync is a big letdown, but if AMD release it and it has stuttering issues, is there much point in having it?

Yup, it is frustrating that the older GPUs can't make use of it, no denying that.

The FreeSync range is more down to the monitor manufacturers and their scaler choice: the LG widescreens operate between 48-75Hz, the 27" TN panels between 40-144Hz, and the upcoming Asus screen between 35-90Hz (Asus had the choice of going with that or 40-144). Remember FreeSync has only been out for about a month, so give it time and hopefully we will see better ranges.
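The practical meaning of those ranges can be sketched like this (the `effective_refresh` function is hypothetical; the frame-repeating trick below the minimum is the kind of low-framerate compensation some variable-refresh implementations use, not something every scaler here does):

```python
# Toy model of an adaptive-sync range, using the 40-144Hz TN panel
# quoted above. Inside the range the panel refresh tracks the game;
# below the minimum, a scaler can repeat each frame so the effective
# refresh lands back inside the range.

def effective_refresh(fps, lo, hi):
    """Map a game frame rate to the refresh rate the panel would run at."""
    if fps > hi:
        return hi            # capped at the panel maximum
    if fps >= lo:
        return fps           # in range: refresh tracks the game exactly
    mult = 1
    while fps * (mult + 1) <= hi:
        mult += 1            # repeat each frame until we're back in range
    return fps * mult

effective_refresh(100, 40, 144)  # in range, panel runs at 100Hz
effective_refresh(30, 40, 144)   # below range, each frame shown 4x at 120Hz
```

A narrow range like 48-75Hz leaves much less room for that frame-repeating trick, which is why the wider ranges matter.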
 
So now, someone who doesn't have problems is either loyal/blind or both? You keep getting better and better.

Right now I do prefer Nvidia, and I will probably stick with them unless AMD sort themselves out and release something massively game-changing. As for loyalty, I'm loyal to Gigabyte and no one else; it comes down to whatever suits my needs. Right now that's my 780 SLI and the Swift.

It doesn't take a genius to see which provider is failing to deliver right now, something that is potentially bad for the market.
 
So now, someone who doesn't have problems is either loyal/blind or both? You keep getting better and better.

Right now I do prefer Nvidia, and I will probably stick with them unless AMD sort themselves out and release something massively game-changing. As for loyalty, I'm loyal to Gigabyte and no one else; it comes down to whatever suits my needs. Right now that's my 780 SLI and the Swift.

It doesn't take a genius to see which provider is failing to deliver right now, something that is potentially bad for the market.

meh I give up.

Yes, you are completely right and I am 100% wrong. I fully agree with you and your posts.
 
I think a possible issue for me was running the streamer service. I always have ShadowPlay running, and I have seen people have no problems after getting rid of GeForce Experience.
 
I don't see an issue with either company. Both offer good cards, each fighting to be unique enough to attract customers.
 
AMD or Nvidia... it is an interesting question. In the laboratory I always use Nvidia, basically because many of the applications I need are at least partly coded in CUDA. Nvidia is also quite committed to the scientific community and will provide cards for free if you need them. You have to write a proposal, but they are very generous in this sense, and it can save quite a lot of money.

In my PCs at home I use Nvidia as well, probably because I had a Riva TNT when I was a teenager and the GeForce series seemed like a natural evolution. It is also better to have Nvidia at home because I often need to use the same applications there as in the lab, and CUDA is a must. However, I am a bit upset with the stability of the latest drivers. This computer has a GTX 980 and I have to use 347.52, the last WHQL driver for Windows 7, because the newer ones are not stable enough. That is not acceptable for a computer that works 24/7. On the bright side, Nvidia's Linux support is not bad, and that is really important, essential in many cases.

However, there is nothing wrong with AMD if CUDA is not absolutely required. The performance is there and the price is somewhat better. I guess it is a question of personal taste. The machine I use for gaming 99% of the time is a PS4, and I am very happy with it :D :D
 
With AMD's recent outburst I think Nvidia clearly take the moral high ground at the moment so the choice is quite easy.
 
AMD struggling to pay Microsoft the five grand for WHQL certification of new drivers means no WHQL driver for many months now.
Next chance I get I'm going back to Nvidia; AMD have too many driver issues.
 
AMD struggling to pay Microsoft the five grand for WHQL certification of new drivers means no WHQL driver for many months now.
Next chance I get I'm going back to Nvidia; AMD have too many driver issues.

The problem is that Nvidia drivers are also far from perfect; the last WHQL-certified driver is from December 2014. That one is rock solid in my hands, but the newer ones struggle to be really reliable. This is even worse when you have many cards in a single workstation, especially if they are different.
 
The problem is that Nvidia drivers are also far from perfect; the last WHQL-certified driver is from December 2014. That one is rock solid in my hands, but the newer ones struggle to be really reliable. This is even worse when you have many cards in a single workstation, especially if they are different.

Sorry bud, but Nvidia have released the last few drivers as WHQL.

 
Are drivers only important when you have SLI?

I ask because I've run a single card for years and every game I've tried just works. Drivers might improve framerates a little on occasion, but I've never had a game-breaking issue.

People seem to get pretty wound up by AMD drivers, but I've not had a problem with any game.

Must be multi-GPU people, or people who are obsessed with benchmarking or something.
 
Are drivers only important when you have SLI?

I ask because I've run a single card for years and every game I've tried just works. Drivers might improve framerates a little on occasion, but I've never had a game-breaking issue.

People seem to get pretty wound up by AMD drivers, but I've not had a problem with any game.

Must be multi-GPU people, or people who are obsessed with benchmarking or something.

No bud, there are performance gains and fixes that often happen with just single cards too. For those with SLI and CrossFire, newer games need SLI and CrossFire profiles to be able to use both cards; without one, the 2nd (or 3rd or 4th) card is basically redundant.
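The profile point above can be sketched in a toy model (the `assign_frames` function is hypothetical, not real driver code; it just illustrates why a missing profile idles the second card under a simple alternate-frame-rendering scheme):

```python
# Toy model of alternate-frame rendering (AFR): with a profile, the
# driver hands even frames to GPU 0 and odd frames to GPU 1; without
# one, every frame falls back to a single GPU and the rest sit idle.

def assign_frames(n_frames, n_gpus, has_profile):
    """Return which GPU renders each frame under a simple AFR scheme."""
    if not has_profile:
        return [0] * n_frames               # extra cards sit idle
    return [i % n_gpus for i in range(n_frames)]

assign_frames(6, 2, has_profile=True)   # frames alternate across both GPUs
assign_frames(6, 2, has_profile=False)  # everything lands on GPU 0
```

Until a game gets a profile in a driver update, the second card contributes nothing, which is why multi-GPU owners care so much about driver release cadence.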
 