
AMD demonstrating DX11 hardware at QuakeCon (video)

I think there will be greater uptake of DX11 this time than there ever was for DX10, for a couple of reasons. First, of course, is the complete mess DX10 became because of external interference: one minute a feature was in, the next it was gone, and by the time a concrete feature set was finally settled on most people had lost interest.

Secondly, the bad press Vista got also played a part, and that is not the case for Windows 7. Many XP diehards who would never have accepted Vista have expressed a willingness to move to 7, so DX11 will be on more PCs than DX10 ever was. DX11 is far smoother in terms of acceptance and feature set than 10 was, and gives developers an easier way to implement things, so I don't think we will be waiting as long for DX11 games as we did for DX10 games.
 
The big question is: are prototype next-generation consoles being released to the big studios yet? (Are they even in the design stage?) It's all well and good having these features, but if they continue to be bolt-ons to console ports there's not much point, is there?

The 360 already does tessellation, since it uses an ATI graphics chip.
 

I think the bad press Vista got is by far the biggest reason DX10 never really took off...
 
It's not all about games either. DX is used in most 3D CAD applications; 3ds Max is the most popular and most widely used across the sub-industries (architecture, gaming, product viz etc.). DX10 has never worked properly in any of these apps, even though they officially support it: turning it on produced graphical bugs/glitches and massive viewport slowdowns. There are a great many studios and individuals in the DCC (Digital Content Creation) market who use gaming cards to keep costs down (myself included) rather than Quadros or FireGLs, and we have all had to manage with DX9 for years now. It's worked out fine, but when Vista hit and everyone ran crying from it, we were all essentially running away from DX10 too. That's two huge markets that, for one reason or another, didn't adopt DX10.

With the enormously positive buzz surrounding Win7 the move to DX11 is almost certainly going to be smooth and total. I really wouldn't be surprised if, within a year, devs (both in gaming and DCC) stopped catering for DX9/10 users and moved entirely to DX11.

Remember that the DCC market is to a very large degree responsible for some of the cool new tech that goes into graphics cards. Nvidia bought Mental Images, there's the whole CUDA thing going on quietly (for now) and a lot of these new features in DX11 will be fantastically useful for DCC folk like myself.

For once I'm actually looking forward to a new Windows *and* a new DirectX. Been years since I last did (2k > XP).

Cheers,
 
I dunno... from the stats I have to hand (some as recent as July 2009, all from 2009, though some are from January)...

Slightly less than 25% of gamers have DX10-compatible systems (OS and DX10 GPU), but more than 50% of gamers do have a DX10-compatible GPU. The G92-based cards are by far the most common gaming cards, with just over 20% penetration of the market; less than 15% of the gaming market has a 200 or 48xx series card...

So guess what developers are going to be targeting for their games...
 
Once DX11 kicks in, threading and compute shaders etc. will definitely take off, as they bring huge benefits to the table, letting you implement features game developers actually want to include at a decent performance level...
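As a rough mental model of what a compute shader dispatch does (illustrative Python only, not any real API): `Dispatch` launches a grid of thread groups, each group runs a fixed number of threads, and every thread derives a unique global id it uses to pick its element of the workload.

```python
def dispatch(group_counts, group_size, kernel):
    """Call kernel(global_id) once per logical thread, like a GPU dispatch."""
    gx, gy, gz = group_counts
    tx, ty, tz = group_size
    for g_flat in range(gx * gy * gz):        # each thread group
        group = (g_flat % gx, (g_flat // gx) % gy, g_flat // (gx * gy))
        for t_flat in range(tx * ty * tz):    # each thread in the group
            local = (t_flat % tx, (t_flat // tx) % ty, t_flat // (tx * ty))
            # global id = group id * group size + local thread id, per axis
            global_id = tuple(g * s + l for g, s, l in zip(group, group_size, local))
            kernel(global_id)

# Example "shader": double every element of a 16-float buffer, using
# 2 groups of 8 threads (like Dispatch(2,1,1) with numthreads(8,1,1)).
data = [float(i) for i in range(16)]

def double_kernel(gid):
    data[gid[0]] *= 2.0

dispatch((2, 1, 1), (8, 1, 1), double_kernel)
# data is now [0.0, 2.0, 4.0, ..., 30.0]
```

On real hardware the groups and threads run in parallel, which is where the performance win comes from; the loop here only shows how the work is divided up.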
 
I think the mixture of both reasons I gave pretty much guaranteed the almost instant death of DX10, and this time round we don't have the ignorant slating Windows 7 as they did Vista. Even now I can remember so many self-professed Windows experts who slammed Vista and then quietly admitted they had hardly bothered with it.

Rumours for Win 7 have been positive this time, so it is a whole new situation and the best time and environment to introduce a new DX standard. MS not allowing anyone to dictate the standard has also helped a lot, as it shows a level of confidence that wasn't there with DX10. All these things together bode well for DX11 and the future of gaming.
 
PhysX is capable of a lot more than the token implementations used in games so far, though, and hardware physics is something game developers have a real need for. As someone with industry experience in video game development I'm aware of just what it can bring to the table, which is why I hype it up... though really I'm hyping hardware physics, not PhysX, especially as nVidia have now shafted PhysX.

Tessellation, on the other hand, is almost the opposite of how the majority of game developers work and doesn't really solve any pressing needs they have. It might be used a bit in cutscenes to increase visual quality a tad, but I don't see a widespread need or use for it in the current climate. My main problem isn't really with tessellation as such; it's a great feature and has its place. My problem is with ATI typically concentrating on stuff there isn't a major need for while ignoring things that would be far more useful. If you don't think this is true, you're probably a gamer and not a game developer.
 
I would have thought that being able to rely on the GPU to add polygonal data to a 3D model, or a range of 3D models, will:

1) free up time for the modellers, so they don't need to spend ages building complex characters (think ZBrush: start simple and let the app/GPU add the extra level of detail/tessellation)

2) allow devs to get to their desired level of quality more quickly/easily, thus freeing up time to actually improve gameplay and story etc.

3) allow devs to get to a certain level of detail/quality more quickly, so they can call it a day and save some cash that would otherwise have been spent paying employees to detail the models.

4) allow devs to move away from fudges such as bump and normal mapping and rely on "real" detail, again saving time and money

Those are just a few off the top of my head.

I wouldn't underestimate just how big a deal tessellation will probably be for devs.
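To put a rough number on the "let the GPU add the detail" idea: uniform subdivision splits every triangle into four, so triangle count grows by 4 per level. A tiny illustrative helper (my own, not from any API):

```python
def subdivided_triangle_count(base_tris, levels):
    """Uniform 1-to-4 subdivision: each level quadruples the triangle count."""
    return base_tris * 4 ** levels

# A 5,000-triangle in-game character taken to subdivision level 3 on the
# GPU carries the polygon budget of a 320,000-triangle offline model:
print(subdivided_triangle_count(5000, 3))  # 320000
```

Which is the point of items 1-4 above: the artist authors the cheap base mesh, and the expensive polygon count only ever exists on the GPU.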
 
Upscaling a low/medium-detail mesh, especially character models, never produces results as pleasing or predictable as creating high-res meshes and crunching them down to your desired LOD stages.

Anyhow, tessellation isn't bad as such; it has its time and place. My main concern, as I said, is the attention and focus it's getting over much more needed features.
 

The thing is, Rroff, for me any improvements are improvements and take us forward, which surely is not a bad thing and shouldn't be ruled out and turned into a huge "ATI are talking rubbish" post. Now that you have said it's hardware physics in general you support, not PhysX, you come across much better, because it doesn't sound like you are hyping up nVidia, which seems to be why most people have a problem with you. It always seems like, in your opinion, ATI can do nothing right, when their new features could make games better.
 
Surely it's irrelevant whether tessellation is good or not, or whether ATI are making a small thing big, as it's part of the DX11 spec, so whatever ATI does, Nvidia has to do as well? PhysX has potential to be big, but so far hasn't lived up to expectations.

I guess we'll see what happens in the next 12 months!
 

It's not upscaling; that's a very different thing. This is simply adding detail (a la LOD, as you mentioned), which is precisely what most character modellers do these days. You build a simple mesh, add a certain level of detail, refine, and then bake the added detail into normal and bump maps (and maybe some others too). You don't model a high-res character and then decrease the detail for the game; that would be an enormous waste of time. You build your models based on their appropriate use: a foreground object/character demands higher detail and will be modelled as such, and the opposite case is obvious.

GPU/app-controlled tessellation isn't about "detail" as such: tessellation doesn't create definition or added description on a character or model, it basically just smooths edges, so a well-modelled character/object will look more realistic/convincing. Having that step performed by the GPU will save artists a lot of time and hassle, because they will be able to model at lower resolution (or stick with the res they currently model to) and rely, to a point, on the GPU adding the extra tessellation to smooth the edges. There will come a point where normal and bump maps won't be required any more, because the tessellation will be so fine (or so efficient) that they become redundant.

Have a look at ZBrush or Mudbox: if you've ever seen those apps in action, they're a good example of what GPU-based tessellation will do down the line.
 

To put it simply, it's just hardware-accelerated subdivision.

I would welcome hardware-assisted subdivision in the CAD/modelling software I use.

If SketchUp had tessellation and DX11 support, it would be perfect for me.
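The subdivision being described can be sketched on the CPU in a few lines: one midpoint-subdivision step splits a triangle into four by inserting a vertex at each edge midpoint. (A minimal illustrative sketch; real hardware tessellators also displace or smooth the new vertices rather than leaving them on the flat face.)

```python
def midpoint(a, b):
    """Midpoint of two xyz vertices."""
    return tuple((p + q) / 2.0 for p, q in zip(a, b))

def subdivide(tri):
    """Split one triangle (3 xyz vertices) into 4 via its edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    # Three corner triangles plus the centre triangle.
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

tri = ((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0))
tris = subdivide(tri)
print(len(tris))  # 4
```

Repeating `subdivide` over every triangle each level is what quadruples the polygon count; the GPU does this on the fly so the dense mesh never has to be stored or shipped.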
 

Yeah, tessellation in SketchUp would indeed be useful. My pipeline is still SketchUp (from clients/architects) > 3ds Max (via the new Connection Extension) > render. It works well, but sub-d in SketchUp would be very useful! Thankfully sub-d surfaces in Max work well enough, and they're even better in XSI (just press + or -!).
 