
GeForce GTX 460 might launch June 1st - specs

Parallax occlusion mapping will still make texels inside the surface appear to have parallax (and occlude) compared to other texels within that surface, but nothing is actually extruded above the plane, so compared to anything else in the world the silhouette still looks flat - if that makes sense without a diagram heh. Whereas plain old normals-based bumpmapping won't occlude other texels at all.

Yeah I get what you mean, they simply don't alter the geometry of the object itself.
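To make the distinction concrete, here's a toy 1D sketch in Python of the per-fragment ray march that parallax occlusion mapping boils down to. All the names are made up for illustration and it's heavily simplified - real implementations do this in a pixel shader against a depth texture:

```python
def parallax_occlusion_offset(depth_at, u0, view_tangent,
                              num_steps=32, height_scale=0.1):
    """depth_at(u): how far the surface sits below the plane at texcoord u (0..1).
    view_tangent: (t, z) view direction in tangent space, z > 0 looking down.
    Returns the shifted texcoord where the view ray first dips below the surface."""
    t, z = view_tangent
    max_shift = height_scale * t / z    # total texcoord drift at full depth
    u, layer_depth = u0, 0.0
    du, dd = max_shift / num_steps, 1.0 / num_steps
    for _ in range(num_steps):
        if layer_depth >= depth_at(u):  # ray is now "inside" the surface: a hit
            return u                    # texels sampled here occlude each other
        u += du
        layer_depth += dd
    return u                            # missed everything; clamp to last sample
```

Note the mesh geometry is never touched - only the texture lookup coordinate moves, which is exactly why texels occlude each other within the surface but the silhouette stays flat.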
 
ati owners before the fermi release: the 5xxx have dx 11 and tessellation! buy them now! nvidia gtx 2xx is ancient technology now!
ati owners after fermi release: meh dx 11 and tessellation is not important who cares...
 
No one is knocking tessellation, its usage is being knocked...

What do you expect when new games are constantly coming out?

Metro 2033 came out before fermi and people bashed its tessellation implementation.

AVP came out before fermi and people bashed its tessellation implementation.

I don't know why I'm bothering to reply with a rational response though when you're one of the worst nVidia trolls on these forums.
 
ati owners before the fermi release: the 5xxx have dx 11 and tessellation! buy them now! nvidia gtx 2xx is ancient technology now!
ati owners after fermi release: meh dx 11 and tessellation is not important who cares...

Sorry but I don't remember a single review or AMD buyer saying "I'm getting it just for tessellation".

Go ahead, go find a post in any of the threads where someone said tessellation was the be-all and end-all.

People got the cards because they are 70-100% faster than a 4890 depending on the game; that's generally the reason for upgrading your graphics card: to get a faster one.

People go on and on. With EVERY DX generation there's the group "I'm not buying it for the DX version, it's just the best card for the previous DX version", and the other group "it's not very good at the new DX, this card sucks, I'll buy a next gen one, maybe it will be good enough".

Both groups are full of morons. DX versions bring very little; they bring standards and some accelerations, and cards that support them can be assured of being capable of certain things - that doesn't mean old versions weren't capable of doing them. DX versions and new features are a good thing, it's a general move forward, and the sooner they are supported the sooner they will be used effectively.

However, for 10 years the introduction of most new features has taken a couple of years to be used to a high level. HDR, for instance, was available for a good couple of generations before games started to use it effectively without killing performance; that's how life is.


New generations are faster and offer higher IQ due to being able to run higher settings.......... BECAUSE THEY ARE FASTER. There is not, and has NEVER been, any other reason for a newer card being better: they are faster. The more power there is, the more developers can add in terms of detail and the higher the detail levels you can set in newer games.

Anyone who bought a card for any reason other than "it's faster than the last one" is rather daft. Be it buying for PhysX, CUDA, tessellation, the pretty sticker, etc, etc, you get a new card because it's faster.

Before 5xxx buyers got cards, they said it's because... they're faster than their last card. After they got their cards, they were great because they are faster than their last card. After Fermi they were great because they'd had them for 6 months, they cost less and offer more bang for buck than Fermis, and a £450 480gtx wouldn't yield a noticeable difference in performance in most games.

DX8, DX9, DX10, DX10.1, DX11 were ALL underutilised in the first gen or two of cards that supported them, HDR was underutilised when it was first introduced.

Tessellation is good, DX11 is good, but there isn't a game that uses either of them effectively. Unigine shows effective use of tessellation; show me a picture of a game that shows the same level of tessellation and I'll buy that game.

There simply won't be games that offer Unigine levels of tessellation until the next gen or two of cards are out, so it's a moot point who's faster this gen.

Though again I'll point out to you, in all but Nvidia's launch game, Nvidia loses ground when tessellation is enabled; most people would take that as an indication that its tessellation performance is NOT good.
 
There simply won't be games that offer Unigine levels of tessellation until the next gen or two of cards are out, so it's a moot point who's faster this gen.

Though again I'll point out to you, in all but Nvidia's launch game, Nvidia loses ground when tessellation is enabled; most people would take that as an indication that its tessellation performance is NOT good.

Tessellation is one of the easiest "DX11" effects to slap onto an existing game (badly, but there it is), so we may see developers porting engines over to slap this on so they can jump on the DX11 hype bandwagon.

In every game/benchmark so far where anything other than very light use of tessellation is present, the GF100 has surged ahead. Both of the major benchmarks and a number of test apps have all shown 60-100% better performance on the GF100 under shader + tessellation loads and 400+% higher performance in pure tessellation. The tessellation performance on the GF100 isn't that great; it just seems ATI never expected anyone to take it seriously and didn't bother updating their tessellation unit.
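As a rough illustration of why heavy tessellation gets expensive so fast: each uniform subdivision level quadruples the triangle count, so even modest tessellation factors multiply the geometry load enormously. A toy Python sketch (made-up names, nothing like the actual hardware hull/domain shader stages):

```python
def subdivide(tri):
    """One uniform subdivision step: split a triangle into 4 via edge midpoints."""
    a, b, c = tri
    mid = lambda p, q: tuple((pi + qi) / 2 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Triangle count grows 4x per level - 3 levels turns 1 triangle into 64."""
    for _ in range(levels):
        tris = [small for tri in tris for small in subdivide(tri)]
    return tris
```

That geometric growth is why a benchmark can afford to crank the factor up while a full game, which has to render everything else too, cannot.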
 
Yeah me too, and only if it doesn't use more power and produce more heat than the previous one so that I don't have to upgrade my PSU and cooling system as well.
 
Tessellation is good, DX11 is good, but there isn't a game that uses either of them effectively. Unigine shows effective use of tessellation; show me a picture of a game that shows the same level of tessellation and I'll buy that game.

http://en.wikipedia.org/wiki/Primal_Carnage
http://syndicatesofarkon.com
http://en.wikipedia.org/wiki/Afterfall_Universe

And it wouldn't surprise me if Crytek implement more aggressive tessellation in their upcoming engine (Crysis 2 will showcase this).

I can also think of many (not rather daft) consumers who buy a gfx card for reasons other than "it's just faster". IQ matters to a lot of people. I personally would be gutted to buy a new gfx card that couldn't perform very well at high AA levels, such as the 5870; performance dips way too much for my liking. I like my games smooth as silk and looking stunning - otherwise I would not spend hundreds on a gfx card and would get a console instead!!!
 
Tessellation is one of the easiest "DX11" effects to slap onto an existing game (badly, but there it is), so we may see developers porting engines over to slap this on so they can jump on the DX11 hype bandwagon.

In every game/benchmark so far where anything other than very light use of tessellation is present, the GF100 has surged ahead. Both of the major benchmarks and a number of test apps have all shown 60-100% better performance on the GF100 under shader + tessellation loads and 400+% higher performance in pure tessellation. The tessellation performance on the GF100 isn't that great; it just seems ATI never expected anyone to take it seriously and didn't bother updating their tessellation unit.

Unfortunately, no one has taken it seriously. While it's easy to slap on, taking Metro 2033 as an example, you'd have to be quite mad to enable it and take a 30-35% performance hit on either a 5870 or a 480gtx to get slightly rounder sleeves. Honestly, barmy.

A number of test apps can be boiled down to.... Unigine?

AMD rightly expected no one to take it seriously, and Nvidia have dedicated a far larger portion of their already massive core to something that hasn't been used in games to ANY useful effect so far.

Personally I prefer the £200 core with less tessellation performance for something I can't use, over £450 for marginally more performance, no availability, 6 months later, for higher tessellation performance......... I still can't use.

I have still yet to actually run Unigine because, for me, seeing a number in a benchmark I can't "play" does nothing. I'm sure I downloaded it somewhere, but I haven't installed it; I've seen a video, and seeing it on my screen rather than in a video doesn't change the outcome for me.

As said though, for MONTHS before release lots of sources, not just Charlie, all said the gaming performance would not reflect the benchmark performance of tessellation. Yet again I'll point out that half the games out there seem to take a BIGGER hit on performance when tessellation is enabled on the 480gtx, which to me does not suggest improved tessellation performance.

Personally, not being able to run it on a 480gtx myself, I can't even verify that it's tessellation alone making Unigine run faster. I'll admit I haven't read the benchies carefully (as I fully ignore 3dmark/non-game benchmarks COMPLETELY), so I don't know if the massive performance increase was at a detail level/res/AA setting/tessellation setting that simply pushed the 5870 past the 1GB memory barrier or not.

As said though, I really don't expect a "decent" use of tessellation in any game within the next year; the likely "Unigine" level of tessellation involvement in any real game is unlikely to be seen for a couple of years because the current cards simply don't have the power. If some games can implement some level of tessellation that actually improves IQ, NOTICEABLY, that will be an interesting comparison between the current architectures. Until that game arrives, we're comparing at stupid extremes: benchmarks that you can't play, and games that use it in a way which hurts performance and doesn't increase IQ.

The only way I noticed the IQ increase in Metro was when [H] paused, took a screenshot and massively zoomed in on a few sleeves; nothing else was changed. Minor character tessellation on areas you can't see in 99.9% of the game is simply not worth a 2% performance drop, let alone a 20-30% one. DOF is another issue in that game, something that drops FPS on BOTH brands of cards by 30-40% and decreases IQ significantly. Madness.
 
http://en.wikipedia.org/wiki/Primal_Carnage
http://syndicatesofarkon.com
http://en.wikipedia.org/wiki/Afterfall_Universe

And it wouldn't surprise me if Crytek implement more aggressive tessellation in their upcoming engine (Crysis 2 will showcase this).

I can also think of many (not rather daft) consumers who buy a gfx card for reasons other than "it's just faster". IQ matters to a lot of people. I personally would be gutted to buy a new gfx card that couldn't perform very well at high AA levels, such as the 5870; performance dips way too much for my liking. I like my games smooth as silk and looking stunning - otherwise I would not spend hundreds on a gfx card and would get a console instead!!!


Errm, what you just said was that for you the performance of a 5870 isn't high enough......... YOU WANT TO BUY A FASTER CARD. While also claiming it's not just about wanting a faster card.

Higher IQ means more power, means you need a faster card.

Higher AA levels mean more power, means you need a faster card.

Huge tessellation usage means more power, means you need a faster card.

This was in response to claims that AMD users only bought 58xx's for tessellation and DX11.

I said that wasn't the case: you buy a new card because with AA/high quality effects enabled in new games, be they dx7 or dx 39, you'll need a faster card to be able to run the latest games. It would appear you agree with me - you didn't claim you wanted a card for a new FEATURE, like DX11, or tessellation, or unified shaders, or anything else; you wanted more power.


As for the games you list: for the first, the video on the homepage is actually embarrassingly bad, and for something due for release this year that vid should look HUGELY better. It seems like a classic tech demo game - a small multiplayer game by an indie studio, with crap grass, AWFUL dinosaur models (I mean they are embarrassing by 10 year old standards), decent lighting (ain't hard to do), and mountains that look half decent WAY in the distance.

Ageia previously made incredibly small tech demos based on their hardware and specific features; they are widely regarded as utter crap and no one plays them, and the same features have yet to appear in "real games" years and years later. What you can do in a couple of levels with a lot of time and effort does not reflect what you can include and run well in a real full-sized game made for gameplay and story, rather than one effect.

The second game looks even worse; the picture on its homepage looks closer to the City of Heroes MMO than to a heavily tessellated brand spanking new AAA title.

The third game you listed - well, their site isn't working, the game was due out, and GameSpot has three fairly awful-looking screenshots of it and NO info on the game at all. Another indie studio; that suggests to me the game isn't coming out at all.

EDIT:- Yes, I'd also expect Crytek to add it to their latest engine. Of course, they are known for doing bleeding edge engines that simply aren't playable on normal setups at higher detail levels when released, and which only become more playable at those settings a year or two later when you get........... not a card with more features but......... should we say it together...... a more powerful card.

In fact, since Crysis's release there have been NO new features; the only new thing cards have had between its release and today that makes the game more playable.......... is more power. It's not using DX11, or tessellation, or new GPGPU instructions; it uses the same features as before with 2 or 4x the power of a couple of years ago.

It's highly likely they'll include it in the latest game, and the heavy tessellation option, along with the other high end options, won't offer a hugely playable game till another couple of generations down the line. So I do ask: what good is the 480gtx's supposed tessellation power if you'd have to wait for a 680gtx in 18 months to use it at a playable framerate?
 
When I buy a new GPU it's generally because it's a lot faster than my last, anything else is secondary, so +1 Drunken.

when i bought a gpu, it was to replace a fast but power hungry card with something that was a little slower, but far more efficient and also had full HD video decoding.

so sorry, but no. people who want a faster card will buy a faster card... well duh. but that's not what he said, is it lol
 
He's referring to the high-end enthusiast cards; in that section 99% of the time you're upgrading because you want a faster card - you're not going to pay 300 notes for a slower card that uses less power lol.
 
He's referring to the high-end enthusiast cards; in that section 99% of the time you're upgrading because you want a faster card - you're not going to pay 300 notes for a slower card that uses less power lol.

funny, that isn't mentioned anywhere. as a blanket statement it is completely and utterly wrong. if he's *only* referring to the 480s, well fair enough; however i still think that is wrong, for the reasons i've explained previously.
 
There's more than just the Heaven benchmark; there's the Stone Giant one and a couple of other tech demos for up-and-coming games/engines. Then you have a number of test apps - some nVidia, some ATI, some 3rd party.
 
Drunkenmaster, you sound like the kind of guy who prefers a Corvette to an Elise. Faster faster faster!!!!

More posts up today saying the 460 may be released in June. Some speculation suggests a reworking of the 480 also, but speculation on Internet sites or forums isn't worth the paper it's written on ;)
 