
7970 vs GTX 580 in nVidia’s Endless City tessellation demo

Indeed, I had no doubt that you couldn't back that one up.

Looks like NV takes a HIT to me.
http://www.ctrlaltkill.org/2011/06/27/crysis-2-dx9-vs-dx11-benchmarks-time-to-upgrade-again/

Yes, from the post-processing, as I said. But you can't find any competent reviewers who know how to benchmark the stuff everyone really wants to see, so no, I can't back that up, other than my own system, which you wouldn't take as proof anyway. So I'm not going to waste more time on that. There are far more interesting issues mentioned, which you are ignoring completely.
 

No one would take what you say as proof.
 

The interesting stuff I have no issues with, as I can see Roff's point.
It's the rubbish you inject into the matter that I have issues with, because it's not needed for the discussion or the interesting points.
 

Oh really? Well, DISCUSS then.
WHAT is rubbish, and what arguments do you present to prove that they are rubbish? I see NONE. Just a big mouth.
 
Actually, they do. My blog even got linked in Tomshardware's GTX680 review. So lulz@u

Again you're generalising and taking it out of context; it was about Crysis 2 and that it plays the same in DX9 and DX11 on the average NV card. I was not generalising my comment.
 

Crysis 2 plays the same in DX9 vs DX11.
And your reply to me about Unigine Heaven.
 
Just ignore him Scali - Final8y often posts incoherent waffle. You're arguing with someone who frankly doesn't make any sense a lot of the time.
 

And as it shows, NV cards do not perform pretty much the same in Crysis 2 in DX9 vs DX11.

The context was you using your rig as proof that the performance was the same in DX9 vs DX11, when there are reviews all over the net showing otherwise, so in that regard:
No one would take what you say as proof.
So you post a link showing that someone listened to you about something else.
 

It does not matter what you think; I have proved my point, and plenty would disagree with you.

Crysis 2 in DX11 mode is about as fast as in DX9 mode on nVidia hardware.

No it is not, and even more so on the cards of the time.
http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-9.html
 
The rendering technique Pixar uses is not the "end game" because it is woefully inefficient; the visual payoff will never justify the demands on the hardware for gaming purposes, so you can forget that.

Arguing that AMD's GPUs are holding back gaming visuals due to a lack of tessellation power is a bit rich considering the origins of it in the first place, and besides, everyone knows the extended console generation bears the most responsibility in that regard.

I'd love to see the Xbox 720 and PS4 get hugely powerful tessellation units but I doubt that's going to happen.
 

Are you joking?
Pixar's rendering technique is VERY efficient.
The issue is that without tessellation, it is impossible to reach anywhere near that kind of geometric detail. You just don't have enough memory to store the geometry and not enough bandwidth to process it.
It is also far more efficient than that other 'infinite geometry' solution: raytracing. Which is why Pixar's RenderMan has been the industry leader for many years.

Clearly this is exactly where we are going. Tessellation is introduced because games are already reaching the limits of video memory size and bandwidth. Tessellation takes geometry detail and efficiency a big step forward even on today's hardware.
Future GPUs will likely have optimized rasterizers which can handle micropolygons more efficiently. Eventually, we will no longer need larger polygons at all, just like Pixar.
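The memory argument above can be sketched with a quick back-of-envelope calculation. This is just an illustrative sketch, not a measurement: the bytes-per-vertex figure, the triangle counts, and the indexing assumptions are all rough assumptions, but they show the orders of magnitude involved in storing micropolygon-level geometry versus tessellating a coarse mesh at runtime.

```python
# Back-of-envelope: why you can't just store micropolygon-level geometry.
# All numbers below are illustrative assumptions, not measurements.

BYTES_PER_VERTEX = 32          # position + normal + UV, roughly
PIXELS_1080P = 1920 * 1080

def mesh_size_mb(triangles, bytes_per_vertex=BYTES_PER_VERTEX):
    """Approximate storage for an indexed triangle mesh: roughly one
    vertex per triangle (amortized over a well-shared closed mesh),
    plus three 32-bit indices per triangle."""
    vertices = triangles            # rough amortization for closed meshes
    indices = triangles * 3 * 4     # 32-bit indices
    return (vertices * bytes_per_vertex + indices) / (1024 * 1024)

# Micropolygon target: roughly one triangle per pixel on screen.
micropoly_tris = PIXELS_1080P
# A typical coarse game mesh that tessellation would refine at runtime.
coarse_tris = 20_000

print(f"Stored micropolygon mesh:   ~{mesh_size_mb(micropoly_tris):.0f} MB")
print(f"Coarse mesh + tessellation: ~{mesh_size_mb(coarse_tris):.2f} MB")
```

Under these assumptions the pre-stored micropolygon mesh comes out around two orders of magnitude larger than the coarse mesh, for a single screen's worth of detail, and every byte of it would also have to cross the memory bus each frame. That is the gap tessellation closes by generating the fine geometry on the GPU.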

Or can you build a stronger case?
 