** ASUS DO IT AGAIN: IPS, 144Hz & FREESYNC!!! Asus MG279Q thread **

In what way do AMD cards "suck"? The 290x/290 are still performing very well compared to the 980/970, which afaic is pretty good going considering they are over a year older.

The TDP of the 980 Ti is just as high as a 290X now, and there is only a difference of something like 70-90W between a 970 and a 290/290X, which works out to roughly a £50 difference in electricity if you were to game every day for 4 hours over 2 years.
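A rough sketch of where that figure comes from; the 80W midpoint, the ~17p/kWh electricity price and the usage pattern are my own assumptions, not measured numbers:

[code]
# Rough electricity cost of a ~80W power draw gap (all inputs are assumptions)
extra_watts = 80          # midpoint of the quoted 70-90W difference
hours_per_day = 4         # gaming every day for 4 hours
days = 2 * 365            # spread across 2 years
price_per_kwh = 0.17      # assumed UK electricity price, GBP per kWh

extra_kwh = extra_watts * hours_per_day * days / 1000
cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.0f} kWh extra -> about £{cost:.0f} over 2 years")
# ~234 kWh -> roughly £40, same ballpark as the £50 quoted above
[/code]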

Yes, nvidia have a better overall package when it comes to their drivers/software.

The crossfire support has sucked and general communication has been poor for the last 6 months, however:

- single GPU users have had little to no problems with games in the last 6 months, aside from a couple of titles (?) e.g. Project CARS
- CrossFire support has been lacking for a handful of games: Project CARS, The Witcher 3 (both of which are fixed now?), Dying Light (think this was fixed not long after release?), Elite, anything else?
- lack of CrossFire FreeSync support; no excuse for this really, other than AMD having issues getting it working "well"

Nvidia are far from perfect:

- quite a few users are still getting micro stutter with G-Sync and SLI
- the last 2 drivers have caused stability issues and crashing
- Kepler GPUs not performing as they should in several (?) titles (which is apparently fixed now?)
- with the latest update, apparently 780 owners have noticed an improvement in The Witcher 3 but Titan X owners have noticed worse performance

And I'm sure that there must be some games that users have issues with, be it with a single GPU or SLI.

I think once AMD get the 3xx series out things will return to normal or at least speed up; to me it just seems like the 3xx launch and DX12/Windows 10 have been taking up most of their time and money.
 
I think once AMD get the 3xx series out things will return to normal or at least speed up; to me it just seems like the 3xx launch and DX12/Windows 10 have been taking up most of their time and money.

I hope so... But currently, unless the Fury and Fury X are very good price/performance, there is really no choice other than Nvidia. I think adaptive sync would make a massive difference in all games, so I want to use it, and I am not paying £200 for G-Sync, so I don't have much choice. Maybe I will get Nvidia this year and hope AMD's 14nm cards will be good, but I really want FreeSync now.

I was just looking at Project CARS... the performance on AMD is terrible, no other word for it. Also, all these Nvidia GameWorks titles coming out probably won't run properly on AMD. You can blame Nvidia for not making it open, but then you can blame AMD for not making any effort to release competing features.

The TDP difference between an OC 980 and an OC 290X is pretty massive... about 200W!

[image: power consumption comparison chart]


Most of those Nvidia problems are not major compared to the AMD problems... and I don't use SLI anyway. I want to buy an AMD card but they are not making it easy for me. I play Project CARS and, according to the benchmarks, it is almost unplayable on anything except medium settings, plus there are all the other games with GameWorks features coming out, plus I like cards with a good TDP/performance ratio.

Maybe the AMD cards will improve with DX12 + Win10... Is there any evidence for that? I am trying to justify buying a Fury or Fury X.
 
I have no idea about Project CARS since I don't have the game, but I'm pretty sure the 15.5 driver has fixed the performance, or there is a simple tweak you can do to fix it; best to jump into the AMD driver thread and ask Matt.

And yes, I won't disagree: GameWorks titles don't run well for me. I get good "FPS" but they all have some form of micro stutter. Then again, there are people who say they have no stutters at all with GameWorks titles, but perhaps those people just aren't as sensitive to stutter, so I don't know what to think...

Again, only CrossFire users are really suffering; single GPUs are fine for 95% of games.

Well, despite what a large majority of the Nvidia "community" will tell you, DX12 is very much the same as Mantle.

And here is a video comparing Windows 8.1 and Windows 10 in Project CARS:


I imagine that AMD will have a head start with Windows 10/DX12 games and optimisation due to their experience with the Mantle API, though it's impossible to say for definite.

The last 2 drivers are causing people quite a few problems even just in desktop usage; just have a read of the last 1-2 pages of the driver thread. I'd rather have slow updates and stable drivers than fast, unstable updates :p

AMD have attended the E3 gaming event and partnered up with Microsoft, so who knows what could happen; it could just be a big marketing stunt or it could mean a better & brighter future for AMD GPUs.

That is certainly a big difference, although I would look at other review sites, as I'm pretty sure they don't find that much of a difference in power draw. Either way, apparently the Fury cards will use less power due to HBM, so I reckon we will see similar power draw and performance to the 980 Ti.

Overall I prefer AMD because they are usually much better for bang per buck; however, if need be, I will happily pay more for something. I am wanting to upgrade my monitor to a 34" 1440 G-Sync/FreeSync screen, so my next GPU brand will be dictated by 2 things:

- how Batman runs (if it runs badly on my 290 then I will definitely be upgrading my GPU)
- which 34" 1440 G-Sync/FreeSync screen I can get for no more than £700

So I am in a similar position to you.
 
Yes, as a corporation I prefer AMD to Nvidia, and I have usually purchased their cards over the last 5 years because of "bang for buck"... But the 970 I have at the moment is a great card for the money at 1080p, just not good enough for 1440p. The Fury cards really need to be good, fast and cheaper than the 980 Ti. They are rated at 300W TDP with 2 x 8-pin connectors... so I don't think they will be the same as the 980 Ti; probably more like the 290X TDP + 10-20%, but with higher performance obviously (a rough sanity check on those numbers is below). They also seriously need to sort out the dual screen and video power consumption.
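As a rough sanity check on those figures (the slot and connector limits are the official PCIe numbers; the 290X TDP and the +10-20% range are just the guess above):

[code]
# Sanity check on the Fury power guess (figures assumed from the post above)
pcie_slot = 75                           # W, max draw from the PCIe slot
eight_pin = 150                          # W, max draw per 8-pin connector
board_limit = pcie_slot + 2 * eight_pin  # official ceiling for a 2x8-pin card

r9_290x_tdp = 290                        # W, commonly quoted 290X figure
guess = (r9_290x_tdp * 1.10, r9_290x_tdp * 1.20)

print(f"2 x 8-pin board limit: {board_limit} W")             # 375 W
print(f"290X TDP +10-20%: {guess[0]:.0f}-{guess[1]:.0f} W")  # 319-348 W
# Both sit well above the 980 Ti's 250W rating, and a 300W TDP rumour
# fits comfortably inside what a 2 x 8-pin card can officially supply.
[/code]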
 
Yes, hopefully it is actually released for reviews this month. I have trouble justifying £600 on a 980 Ti when in 1 year Pascal will come out, which will probably be a lot better. I know this is always the case with any hardware, but high end GPUs are getting ridiculous with pricing etc. For example, my £300 GTX 970 is faster than a £1000 Titan from 2 years ago. I can remember when about £300-£400 was the max price for the best GPUs.
 
Yes, hopefully it is actually released for reviews this month. I have trouble justifying £600 on a 980 Ti when in 1 year Pascal will come out, which will probably be a lot better. I know this is always the case with any hardware, but high end GPUs are getting ridiculous with pricing etc. For example, my £300 GTX 970 is faster than a £1000 Titan from 2 years ago. I can remember when about £300-£400 was the max price for the best GPUs.

There's no way any fast Pascal chips will be out a year from now. Maybe some pipe cleaners, like 750/750ti were for Maxwell. Performance and high end will be later. On the other hand, I fully expect Arctic Islands (AMD 4xx) to launch at exactly this time next year, if not sooner ...
 
There's no way any fast Pascal chips will be out a year from now. Maybe some pipe cleaners, like 750/750ti were for Maxwell. Performance and high end will be later. On the other hand, I fully expect Arctic Islands (AMD 4xx) to launch at exactly this time next year, if not sooner ...

Why do you expect AMD to launch on time but not Nvidia?
 
Why do you expect AMD to launch on time but not Nvidia?

NVIDIA are many years behind AMD with research into stacked memory, and NVIDIA never even intended to have an HBM product (HBM was developed jointly by SK Hynix & AMD). Volta was pushed back 2+ years. It was, and still is, based on HMC, the competing standard. Pascal was then announced over a year ago to make up for this, and most of the stuff that's been announced about it has been entirely enterprise and supercomputer related, as was the case with Volta ... NVIDIA appear to have expected that AMD would continue to delay their HBM products, or at least consumer products, so didn't put any priority on expediting consumer products featuring either HMC or HBM. They've been caught with their pants down now.

There's also the fact that there's no way NVIDIA are going to launch their big and biggest chips first. That's the opposite of what they've done in recent times, for very good reason: to avoid the appalling yields of years past. Certainly not on a new, smaller node that also happens to be FinFET rather than bulk planar, with a new type of memory (bad history for them), a completely new memory controller (bad history for them), and a significantly revised architecture.

So their low to mid end cards will launch first. They're going to be on TSMC's 16nmFF+ process. This and the non-"+" variant have been delayed again and again. For big GPUs (compared to tiny SoCs), you're not going to see the first chips roll off the production line in any volume before the beginning of Q2. That would be the best case for a '1050 Ti' class product. Small SoCs probably beginning of Q1, or the very end of Q4 if lucky. You can expect at least 4-5 months after that until a '1080' class product, and certainly not until year end / Q1 2017 for a new Titan SKU.

Samsung on the other hand have been pumping out 14nmFF LP chips (Arctic Islands will use 14nmFF LP+) in the many millions for 3 months already. GF's copy-exact process is spitting out a smaller number of the same Exynos chips. They effectively have a year's lead on TSMC with the process in terms of product cycles. LP+ should be available before the end of 2015. AMD are doing the move to stacked memory, interposers, and a new memory controller with Fiji on an extremely familiar process, so Arctic Islands won't be a radical change for them. Unless AMD hit roadblocks with their chip design, unexpectedly low yields or HBM2 issues, launching at the same time next year (Computex / E3) would be perfect for them. It also means they can effectively deal a temporary death blow to NVIDIA in the high end, until the latter can respond some months later.

That's why. I don't expect to be wrong. Particularly about Pascal timing.
 
Guys, let's try not to derail a monitor thread too much with talk about graphics cards, please. Take that discussion to the graphics card forum :)
 

Perhaps you can answer one thing ... I assume they intend to release them again to retail once reviews are completed (I assume others will have a similar timeline to you)?

Depending on performance, I'm now considering this as an alternative to the XL2730Z ... previously I was expecting there to be lots announced at Computex, but there's been practically nothing. Maybe DP1.3 will be ready for IFA and we'll see a wave of new stuff then ...
 
Perhaps you can answer one thing ... I assume they intend to release them again to retail once reviews are completed (I assume others will have a similar timeline to you)?

I heard they will be back at retail towards the end of this month.
 
NVIDIA are many years behind AMD with research into stacked memory, and NVIDIA never even intended to have an HBM product (HBM was developed jointly by SK Hynix & AMD) ... They've been caught with their pants down now.

I wouldn't say that Nvidia have been "caught with their pants down"; currently they are doing a lot better than AMD in every area... Anyway, I hope AMD do well and that the Fury cards are worth buying. I need adaptive sync and I am not paying £700+ for an IPS screen with G-Sync. They need to release a nice product at a lower price than the 980 Ti and it will probably sell OK.
 
Baddass, can you please tell me, is there some notification on the box that the firmware has been updated? Because some places are still selling monitors with the old firmware.

Not that I know of, but the one I have is, I believe, an original version which was updated by Asus before it was sent to me. All forthcoming stock should be the new version, as they mostly caught it before anywhere had it in stock. I think some places in Europe have an early batch, so I'd maybe avoid those or check with them whether it's the old or new version. I don't think the new version has reached anywhere yet, though, so if anyone is selling it now it's probably the old firmware.
 
Perhaps you can answer one thing ... I assume they intend to release them again to retail once reviews are completed (I assume others will have a similar timeline to you)?

Depending on performance, I'm now considering this as an alternative to the XL2730Z ... previously I was expecting there to be lots announced at Computex, but there's been practically nothing. Maybe DP1.3 will be ready for IFA and we'll see a wave of new stuff then ...

Pretty sure the intention is for reviews to coincide with availability. Certainly mine is to tie in with UK availability. Not sure how many other sites have a sample to test yet; I've not seen any reports of that. We did get ours early so we could have the review ready for launch instead of it being a couple of weeks after :)
 