
NVIDIA Medusa technology demo

Designed to show off the new 200 series cards, but it works on my GT, so other cards should work fine with it. I averaged about ten fps at 1680x1050 with no AA and vsync off.

http://downloads.guru3d.com/NVIDIA-Medusa-technology-demo-download-1950.html

[Screenshots: rrrrgr5.jpg, rrredh1.jpg, rrr2ol5.jpg]
 
Doesn't work for me; it says I need D3D10.dll.

Isn't it a DX10 app?

Edit: bah, Windows Vista (32/64-bit) only -.-
 
DX10 *yawn*

Personally, I wouldn't say it's hugely better visually than Mass Effect, which runs fine under DX9...
 
None of those screenshots look better than any of the DX9 demos we saw years back.

Where exactly are the phenomenal improvements? Still blurry textures on the human character.
 
None of those screenshots look better than any of the DX9 demos we saw years back.

Where exactly are the phenomenal improvements? Still blurry textures on the human character.


They've just concentrated on the face, but the clothes and bodies look like they were designed in Paint or something.
 
Doesn't really look any better than Crysis, Mass Effect, etc.

Some of the textures look like they belong in a PS2 game.
 
None of those screenshots look better than any of the DX9 demos we saw years back.

Where exactly are the phenomenal improvements? Still blurry textures on the human character.

And you'll continue to get blurry textures in anything until 2GB+ cards are mainstream. It doesn't matter if it was DX20; they can't do magic with limited RAM and processing power.

DX10 can do way better than this, but the hardware still isn't there yet to take advantage of it fully, and as with DX9 it won't be for at least 2.5+ years after release... so I guess I'll just have to put up with posts like this until then :rolleyes:

But then it will happen all over again with DX11...
 
And you'll continue to get blurry textures in anything until 2GB+ cards are mainstream. It doesn't matter if it was DX20; they can't do magic with limited RAM and processing power.

DX10 can do way better than this, but the hardware still isn't there yet to take advantage of it fully, and as with DX9 it won't be for at least 2.5+ years after release... so I guess I'll just have to put up with posts like this until then :rolleyes:

But then it will happen all over again with DX11...


Excuse me, but I do know what I'm talking about. I know perfectly well the limitations of textures in games, but this is not a full game; you don't have to worry about loading dozens of character textures. This is a tech demo with what looks like just a few characters. You would assume they would try harder to get better textures on the characters. Crysis has far better texture work on its characters and environments.

You cannot tell me that they cannot do better than that.
 
1024x1024 textures are less than 4MB each and would be fairly decent quality. Throw in some DDS compression and on modern GPUs they'd only take up around 900KB each... even on a 256MB card that's plenty of texture possibilities in a tech demo.
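
For anyone who wants to check the numbers, here's a rough sketch of the arithmetic, assuming 32-bit RGBA for the uncompressed case and the usual DXT1/DXT5 block-compression rates (mipmaps would add roughly a third on top, ignored here):

```python
# Back-of-the-envelope texture memory arithmetic (assumed formats, no mipmaps).
MB = 1024 * 1024
KB = 1024

def texture_size(width, height, bytes_per_texel):
    """Approximate VRAM footprint of a single texture, ignoring mipmaps."""
    return width * height * bytes_per_texel

uncompressed = texture_size(1024, 1024, 4)    # 32-bit RGBA
dxt5 = texture_size(1024, 1024, 1)            # DXT5/BC3: 1 byte per texel
dxt1 = texture_size(1024, 1024, 0.5)          # DXT1/BC1: 0.5 bytes per texel

print(f"1024x1024 RGBA: {uncompressed / MB:.1f} MB")   # 4.0 MB
print(f"1024x1024 DXT5: {dxt5 / KB:.0f} KB")           # 1024 KB
print(f"1024x1024 DXT1: {dxt1 / KB:.0f} KB")           # 512 KB
print(f"DXT5 textures that fit in 256 MB: {int(256 * MB / dxt5)}")  # 256
```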
 
Excuse me, but I do know what I'm talking about. I know perfectly well the limitations of textures in games, but this is not a full game; you don't have to worry about loading dozens of character textures. This is a tech demo with what looks like just a few characters. You would assume they would try harder to get better textures on the characters. Crysis has far better texture work on its characters and environments.

You cannot tell me that they cannot do better than that.

If you knew what you were talking about, you would know that what matters is which textures are on screen, currently displayed or in the surrounding area, as those are the ones using VRAM. Parts of a level or characters that are not displayed or in the surrounding area will not be using graphics VRAM.

They could have done a better job on his armor, though. But it's not that bad considering how close the camera is to it, and it does look better in nearly every other scene in the demo.
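
A toy sketch of that point (the names and sizes below are entirely made up for illustration; real engines stream textures in and out with far more sophisticated heuristics):

```python
# Toy illustration: only textures for objects on screen or in the surrounding
# area need to be resident in VRAM; distant parts of the level are streamed out.
MB = 1024 * 1024

def resident_texture_bytes(objects, camera_pos, radius):
    """Sum the texture footprints of objects within `radius` of the camera."""
    total = 0
    for obj in objects:
        dx = obj["pos"][0] - camera_pos[0]
        dz = obj["pos"][1] - camera_pos[1]
        if dx * dx + dz * dz <= radius * radius:
            total += obj["texture_bytes"]
    return total

# Made-up scene: a couple of nearby characters and some distant scenery.
scene = [
    {"name": "medusa",    "pos": (2, 5),     "texture_bytes": 24 * MB},
    {"name": "soldier",   "pos": (4, 3),     "texture_bytes": 16 * MB},
    {"name": "far_ruins", "pos": (400, 250), "texture_bytes": 64 * MB},  # not resident
]

print(resident_texture_bytes(scene, camera_pos=(0, 0), radius=50) / MB, "MB resident")  # 40.0
```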

1024x1024 textures are less than 4MB each and would be fairly decent quality. Throw in some DDS compression and on modern GPUs they'd only take up around 900KB each... even on a 256MB card that's plenty of texture possibilities in a tech demo.

256MB is a joke and so is that comment. Even my 9800GX2 cannot run this demo smoothly at 1680x1050, not because it does not have the power, but because it does not have enough memory for the textures. I'm always running into this texture memory problem with actual games at 1920x1200 and higher resolutions. 512MB of usable VRAM does not cut it these days for high-res gaming, especially if you have AA/AF enabled.

There's even an NVIDIA demo for the 8800GTX/Ultra that makes use of the 768MB on those cards, so my 9800GX2 cannot run it smoothly (and neither can my mate's 8800GTS 512MB), yet my 8800GTX could run it perfectly.
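
As a rough sketch of where that VRAM goes before textures even enter the picture, assuming 32-bit colour and depth/stencil targets and a simple multiply by the MSAA sample count (real drivers and GPUs differ in the details):

```python
# Approximate render-target memory at a given resolution and MSAA level.
MB = 1024 * 1024

def framebuffer_mb(width, height, msaa_samples=1, colour_buffers=2):
    """Colour (front/back) plus depth/stencil, all at 4 bytes per pixel per sample."""
    colour = width * height * 4 * colour_buffers * msaa_samples
    depth_stencil = width * height * 4 * msaa_samples
    return (colour + depth_stencil) / MB

for samples in (1, 4, 8):
    print(f"1920x1200 @ {samples}x MSAA: ~{framebuffer_mb(1920, 1200, samples):.0f} MB")
# Roughly 26 MB, 105 MB and 211 MB respectively, leaving that much less of a 512 MB card for textures.
```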
 