
Only AMD has true Async Compute - Doom Devs

AMD don't care what silent scone thinks - flopper thinks he is a thread capper and a troll. :D

I'd also like to add: a game developer states it and AMD states it, but Nvidia says nothing?

I tell it like it is; if that's not to your liking I'd suggest playing something that actively uses async shaders to the advantage of GCN. You will, however, struggle to do so, as examples are few and far between.

I'll go back to 'capping' in Doom. ;)
 
What a load of rubbish.

Maxwell does do async compute. You can make some test code and see the results for yourself. People have done this and shown that async works with current drivers on Maxwell.

Do you ever research anything before going on an anti-nvidia rampage?

What games are running better with it? Genuine question.
 
I don't think anyone "fell for it" in the way you are suggesting. There are several people of a typically AMD favouring variety that are trying to ignore the fact of what was said and make out that they "fell for it" to suit their agenda, or just to get a rise out of "the other side".

He just said it was twice the performance of Titan X ??

4 times.

It was pointed out that it was in VR

Right...we're still on vr :(

Hmmm, but I thought the bit on VR was done afterwards, after announcing it was twice as fast as a Titan X? Oh well :D

Looks like I was not the only one to notice.

lol at people believing the double performance hype :D

I sold my TX yesterday for a very good price so I'm one happy man anyway :)
 
What a load of rubbish.

Maxwell does do async compute. You can make some test code and see the results for yourself. People have done this and shown that async works with current drivers on Maxwell.

Do you ever research anything before going on an anti-nvidia rampage?

Mmmmm, link? Because last I heard, testing async compute is a very tricky task, not just something you can put together in a couple of hours' work.

See how fast 3DMark added a draw call test, so why haven't they added an async compute test? Because it's not quite that simple.
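For what it's worth, the kind of test being talked about boils down to a timing experiment: submit independent jobs on separate queues and see whether the total wall-clock time stays flat (overlap) or grows linearly (serialisation). Below is a minimal sketch of that idea using CUDA streams as a stand-in for the D3D12 compute queues the actual tests used, so treat it as an analogy rather than the real graphics-plus-compute case; the `spin` kernel, the stream count and the cycle counts are all made up for illustration.

```cuda
// Minimal overlap-timing sketch (CUDA streams standing in for async queues).
// If the N-stream run takes roughly the same time as the 1-stream run, the
// jobs overlapped; if it takes ~N times longer, they were serialised.
#include <chrono>
#include <cstdio>
#include <cuda_runtime.h>

// Busy-wait kernel: keeps one small block occupied for ~`cycles` GPU clocks.
__global__ void spin(long long cycles) {
    long long start = clock64();
    while (clock64() - start < cycles) { /* spin */ }
}

// Launch n copies of the kernel on n different streams and time the whole batch.
static double timeLaunches(int n, cudaStream_t* streams, long long cycles) {
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < n; ++i)
        spin<<<1, 32, 0, streams[i]>>>(cycles);  // tiny grids, so there is idle hardware to overlap on
    cudaDeviceSynchronize();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    const int kStreams = 8;                   // illustrative job count
    const long long kCycles = 100000000LL;    // a few tens of ms per job
    cudaStream_t streams[kStreams];
    for (int i = 0; i < kStreams; ++i) cudaStreamCreate(&streams[i]);

    timeLaunches(1, streams, 1000);           // warm-up so context creation isn't timed

    double one  = timeLaunches(1, streams, kCycles);
    double many = timeLaunches(kStreams, streams, kCycles);
    printf("1 job: %.1f ms, %d jobs: %.1f ms (ratio %.2f)\n",
           one, kStreams, many, many / one);

    for (int i = 0; i < kStreams; ++i) cudaStreamDestroy(streams[i]);
    return 0;
}
```

Even something this simple shows why a fair, general-purpose async benchmark is hard: the result swings with job size and with how busy the GPU already is.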
 
What games are running better with it? Genuine question.

Fable Legends showed good results, pity MS killed the game.

It is also important to bear in mind that a big performance gain from using async is not necessarily a good thing, as it indicates the GPU otherwise has poor utilization. It is wrong to expect Nvidia cards to have the same gains as AMD cards, because Nvidia simply don't have the front-end weaknesses and can better leverage the available resources. Which is why, despite the AMD GPUs having far higher theoretical FP32 compute performance, they are slower than equivalent Nvidia GPUs.

Similarly, if you look at the AotS benchmarks with async on or off, under certain resolutions and settings even some AMD cards get slower with async switched on, because if poorly implemented async will restrain performance, even if you have a hardware-based scheduler like GCN.
 
Mmmmm, link? Because last I heard, testing async compute is a very tricky task, not just something you can put together in a couple of hours' work.

See how fast 3DMark added a draw call test, so why haven't they added an async compute test? Because it's not quite that simple.


There is a Visual Studio tool available, but it did nothing to 'prove' Nvidia had async compute in their hardware.

Let's get this right though: even if Nvidia's Pascal does have it, their GPUs are already at very nearly full utilization, so the effect on performance is/would be minimal.
AMD GPUs are not at 100%, so async fills the void considerably, hence the nice performance boost.
 
Multiple people in the chat during the event, on the reddit thread, and I believe on the thread here made various statements, from "zomg, it's twice as fast as Titan X" to "wait, now it's twice as fast" and everything in between. A LOT of people were confused, because he spent most of the time after introducing that slide leaving out the 'in VR' qualifier when making that statement, and it clearly misled people: plenty were saying "cool, twice as fast as a Titan X".

Yer, and even in the respected wccftech comments section they said the same. Even people here who I credited with some intelligence seem to feel that the 1080 was literally going to be twice as fast as a Titan X, and even 10x faster than Maxwell. Not much to say to those people really, as they are either being purposefully obtuse or genuinely that gullible. Funny though, as when you quote something said by certain industry pros, some here choose to dismiss it totally, yet something else can get taken completely out of context and then used to try and look big and clever... Fickle bunch, us hardware enthusiasts.
 
Mmmmm, link? Because last I heard, testing async compute is a very tricky task, not just something you can put together in a couple of hours' work.

See how fast 3DMark added a draw call test, so why haven't they added an async compute test? Because it's not quite that simple.

Various developer forums and websites; I will dig out links when I'm back on my PC. Beyond3D had a good analysis and clearly showed how increasing the number of compute jobs led to step changes in increments of 31, as expected from the Maxwell 2 architecture.

An async compute benchmark makes absolutely no sense, because it depends entirely on what your graphics pipeline is like and on the other compute tasks. What works well for AMD cards can be really bad for Nvidia cards, and vice versa. Even with GCN, the different versions react very differently to different loads, and simply changing resolution or game settings changes things drastically. The AotS benchmarks show a performance loss on AMD cards under some circumstances.

You seem to misunderstand what async is and does. You seem to think it is a feature like Tessellation that has to exist to render the graphics appropriately. Async compute simply facilitates better utilization of the GPU, which can make game rendering faster. The only thing that matters is end-user performance. It doesn't matter how that performance is achieved.
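To illustrate what a 'step change' measurement looks like, here is a rough sketch of that sweep idea, again using CUDA streams rather than the D3D12 queues Beyond3D actually tested, so the plateau you see will reflect your own GPU's limits rather than the 31-queue figure; the job count and job sizes are made up for illustration.

```cuda
// Sweep sketch: time 1..kMax concurrent busy-wait jobs and print the totals.
// On hardware with a fixed number of concurrent queues/slots you would expect
// the time to stay roughly flat and then jump in steps once a limit is hit.
#include <chrono>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void spin(long long cycles) {
    long long start = clock64();
    while (clock64() - start < cycles) { /* busy-wait */ }
}

int main() {
    const int kMax = 64;                    // illustrative upper bound on job count
    const long long kCycles = 20000000LL;   // short busy-wait per job
    std::vector<cudaStream_t> streams(kMax);
    for (auto& s : streams) cudaStreamCreate(&s);

    spin<<<1, 32>>>(kCycles);               // warm-up so context creation isn't timed
    cudaDeviceSynchronize();

    for (int n = 1; n <= kMax; ++n) {
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < n; ++i)
            spin<<<1, 32, 0, streams[i]>>>(kCycles);
        cudaDeviceSynchronize();
        auto t1 = std::chrono::steady_clock::now();
        printf("%2d jobs: %7.2f ms\n", n,
               std::chrono::duration<double, std::milli>(t1 - t0).count());
    }

    for (auto& s : streams) cudaStreamDestroy(s);
    return 0;
}
```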
 
It is also important to bear in mind that a big performance gain from using async is not necessarily a good thing, as it indicates the GPU otherwise has poor utilization. It is wrong to expect Nvidia cards to have the same gains as AMD cards, because Nvidia simply don't have the front-end weaknesses and can better leverage the available resources. Which is why, despite the AMD GPUs having far higher theoretical FP32 compute performance, they are slower than equivalent Nvidia GPUs.

Similarly, if you look at the AotS benchmarks with async on or off, under certain resolutions and settings even some AMD cards get slower with async switched on, because if poorly implemented async will restrain performance, even if you have a hardware-based scheduler like GCN.

Yeah, I'm very much aware of that. Nvidia cards barely need async; they are already running very efficiently.
 
Yeah :D, I think Nvidia handle things perfectly. If this async were really important, they would be on top of it like AMD are.

All IMO of course

It is actually just a method of ironing out inefficiencies in the architecture, preventing idling stream processors. Once AMD have their command processor inside the new GPUs, I wouldn't be surprised if they stop going on about async so much. Of course it will still add something, but nothing like before.
 
Respek Lol. :p
I think DP is working overtime. Damage control is hurting his fingers or his keyboard..

:D LOL ... Yes D.P ... starting to feel like Nvidia is in the house ... It has been non-stop Nvidia hype and damage control. Way past fanboy behaviour. Sorry D.P, but is there something we should all know here?


I am not a fanboy of any company; I just buy the best that is sold at the time and what I can afford, without making any unrealistic statements. Read back over the statements I made about the 1080 & 1070 before they came out and when they came out, and about Intel Broadwell before it came out and when it came out. My comments will highlight the advantages and disadvantages (basically being fair and unbiased, without hiding the issues that some may miss or not understand 100%).


Sorry D.P, but you are way too much in favour of Nvidia, to the point that you are hiding clear facts about some of the disadvantages and the hidden costs that have been added on for no good reason recently. Technology is meant to get cheaper for better performance; we are now paying by the frame. Check it out when you compare the frame rate of the low-end GPUs of a series to the high end. Work it out from series to series across past generations: the price per frame is not going down much anymore, and is actually going up in some cases.

Nvidia and Intel are becoming a monopoly in the high-end market... if you don't see that, there is something very wrong.
 
I don't think anyone "fell for it" in the way you are suggesting. There are several people of a typically AMD favouring variety that are trying to ignore the fact of what was said and make out that they "fell for it" to suit their agenda, or just to get a rise out of "the other side".

That's not true, and the comment about the 'AMD favouring variety' is pure rubbish.

If you had been in the Pascal thread after the presentation, you'd have seen the number of people saying "2x the performance of the Titan X, wow, AMD are doomed", etc. It took several pages before it was accepted that it meant in VR.

People are still popping up in various forums wondering whether the 1080 is 2x the performance of the Titan X.

And it created a huge buzz about the card.
 
Hmm, I'm looking at the slide for Ashes, and a 1080 can do 58 fps but 2x 480s at 200 each can do 62 fps. So just one can only do 31 fps? Seems a bit crap on their own, doesn't it? Why the need to show 2x (dual) cards in the slide? Can't one card beat the 1080?

Also, I was LMAO when he said "premium cooler"; all that was shown was a very plain solid piece of shroud in black with a blower on. WOW, that's high tech... NOT! Haha. Silly AMD, saying that was good cooling when it's probably crap as it's so basic. If it's so advanced, why hide it behind the cooler's box and not show the insides or a blueprint-style slide showing its construction? Did he even mention temps with that cooler in demanding games? Nope.
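To be fair, the dual-card slide is presumably a price-per-frame argument rather than a single-GPU one. Taking those numbers at face value, two 480s at 200 each is 400 dollars for 62 fps, roughly 6.5 dollars per frame, while a 1080 at its 700 dollar launch price and 58 fps works out at about 12 dollars per frame - with the obvious caveat that multi-GPU scaling in one cherry-picked benchmark says nothing about how a single 480 compares.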
 
I think AMD and Id have been a bit mean. Nvidia had a press conference with Id developers, IIRC, singing Doom's praises on a 1080. Now they've switched sides to AMD and said you don't need a 700 dollar video card to play with great fps. What a slap in the face and a betrayal I reckon that is to Nvidia from Id.
 
I think AMD and Id have been a bit mean. Nvidia had a press conference with Id developers, IIRC, singing Doom's praises on a 1080. Now they've switched sides to AMD and said you don't need a 700 dollar video card to play with great fps. What a slap in the face and a betrayal I reckon that is to Nvidia from Id.

It's lol.

It just goes to show that money talks.
 