DirectX 11 Benchmark "Heaven"

Unigine
Heaven Demo v1.0
FPS: 29.3
Scores: 738

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM)2 Quad CPU Q6700 @ 2.66GHz
CPU flags: 3200MHz MMX SSE SSE2 SSE3 SSSE3 HTT
GPU model: NVIDIA GeForce GTX 260 8.16.11.9062 896Mb

Settings
Render: direct3d11
Mode: 1920x1080 2xAA fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 16x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled

I don't think it actually ran in DX11, obviously, since my GPU doesn't support it. I did find it odd that it let me select DX11 as the renderer, though...
 
Bear in mind the res:

[Screenshots 00000.jpg–00024.jpg, hosted at http://i469.photobucket.com/albums/rr52/Martian333/]
 
Unigine
Heaven Demo v1.0
FPS: 20.9
Scores: 527

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM)2 Duo CPU E8400 @ 3.00GHz
CPU flags: 3000MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 HTT
GPU model: ATI Radeon HD 5800 Series 8.661.0.0 1024Mb

Settings
Render: direct3d11
Mode: 1920x1080 8xAA fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 16x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled

Not great. Not sure if the score is affected by it, but I'm running the new Catalyst 9.10 drivers with AA set to edge-detect and the new anti-aliasing mode set to supersampling (quality).
 
Is this similar to the tessellation technology that used to be hardware-controlled on the ATI Radeon 9700s, which had an option for it in the advanced settings of UT2003/4?
 
Remember that DX10 cards can use DX11 features, just not all of them. There's very little in DX11 that strictly requires DX11-only hardware, so yes, DX10 cards can and do run DX11 stuff, just not the whole thing.

Well AFAIK anyhow.
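
FWIW that's pretty much how the D3D11 API is built: the runtime will happily create a device on DX10-class hardware at a lower "feature level", which is why the benchmark lets you pick the direct3d11 renderer on a GTX 260; you just don't get the hardware-only bits like the tessellation stages. A minimal sketch of what an app does (my own C++ example, not taken from Heaven):

[code]
// Minimal sketch (not from the Heaven benchmark): ask for D3D11 with a list of
// acceptable feature levels. On a DX10-class card the runtime creates the device
// at D3D_FEATURE_LEVEL_10_0, so the DX11 API runs but hardware-only features
// such as the hull/domain (tessellation) stages are unavailable.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11 hardware: tessellation, SM 5.0
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,  // DX10-class hardware: DX11 API, reduced features
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL obtained;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &obtained, nullptr);
    if (SUCCEEDED(hr)) {
        std::printf("Feature level: 0x%X\n", (unsigned)obtained);
        device->Release();
    }
    return 0;
}
[/code]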

Hmm, I thought tessellation worked on DX10 cards. Shame. :(

Unigine
Heaven Demo v1.0

FPS: 27.0
Scores: 680

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM)2 Quad CPU Q6600 @ 2.40GHz
CPU flags: 3599MHz MMX SSE SSE2 SSE3 SSSE3 HTT
GPU model: NVIDIA GeForce GTX 260 8.16.11.9107 896Mb

Settings
Render: direct3d11
Mode: 1600x1200 2xAA fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 16x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled
Unigine Corp. © 2005-2009
 
Unigine
Heaven Demo v1.0
FPS: 38.8
Scores: 977

Hardware
Binary: Windows 32bit Visual C++ 1500 Release Oct 22 2009
Operating system: Windows 7 (build 7600) 64bit
CPU model: AMD Phenom(tm) 9850 Quad-Core Processor
CPU flags: 2500MHz MMX+ 3DNow!+ SSE SSE2 SSE3 SSE4A HTT
GPU model: ATI Radeon HD 4800 Series 8.661.0.0 1024Mb

Settings
Render: direct3d11
Mode: 1680x1050 fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 4x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled

Unigine Corp. © 2005-2009
 
Did anyone see the AMD Froblins demo? A load of frog things wandering around collecting rocks... that demoed tessellation, and even then (it's probably getting on for a year old) it looked awesome.

This just blows me away! Can't wait to get a 5870X2 now :D
 
Wow! Tessellation looks like the biggest leap forward in PC gaming graphics for years!

This is by far the best use of tessellation I've seen yet, though there are still some design issues... I didn't really expect anyone to use it to this extent for another 18 months or so.

You're still better off as a game developer, though, using offline tools to create two copies of each section of world geometry (one for close-up with high detail, one for distance with reduced LOD). It gives far cleaner, more predictable results with better performance... and the extra memory and bus bandwidth used is pretty negligible on a modern PC, with savings there being one of the claims made for tessellation. Roughly what I mean, as a sketch below.
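
Here's the kind of thing I'm describing (C++ with placeholder types and a made-up switch distance, not from any real engine): the renderer just picks one of the two pre-built copies per frame instead of amplifying geometry on the GPU.

[code]
// Rough sketch of discrete LOD selection between two pre-authored meshes.
// Mesh, the switch distance and the maths helpers are placeholders, not from
// any real engine.
#include <cmath>

struct Vec3 { float x, y, z; };

struct Mesh {
    int vertexCount;
    // vertex/index buffers would live here in a real renderer
};

struct LodPair {
    Mesh highDetail;       // authored offline for close-up viewing
    Mesh lowDetail;        // reduced-poly version for distant viewing
    float switchDistance;  // beyond this, draw the low-detail copy
};

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Pick which pre-built copy to submit for drawing this frame.
const Mesh& SelectLod(const LodPair& lod, const Vec3& cameraPos, const Vec3& objectPos) {
    return (Distance(cameraPos, objectPos) < lod.switchDistance)
               ? lod.highDetail
               : lod.lowDetail;
}
[/code]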
 
As with anything, benchmarks tend to overuse new features purely to show them off to the max. It does and will save power for higher detail, and it's not just memory and bus bandwidth: you still have to draw the extra detail either way, and higher-quality textures mean more memory used, less spare bandwidth and more power used.
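
Rough back-of-the-envelope numbers for the memory side (completely made up, not measured from Heaven), just to show where the saving comes from:

[code]
// Back-of-the-envelope comparison, with made-up numbers, of vertex-buffer
// memory for a pre-authored dense mesh vs. a coarse mesh the GPU tessellates
// at draw time. Either way the amplified triangles still have to be shaded
// and rasterised, which is the "you still have to draw the extra detail" point.
#include <cstdio>

int main() {
    const int bytesPerVertex = 32;        // e.g. position + normal + UV
    const long denseVertices = 1000000;   // pre-tessellated copy stored in memory
    const long coarseVertices = 60000;    // control mesh, amplified ~16x on the GPU

    std::printf("dense mesh : %.1f MB\n", denseVertices * bytesPerVertex / 1e6);
    std::printf("coarse mesh: %.1f MB\n", coarseVertices * bytesPerVertex / 1e6);
    // Storage and bus traffic differ a lot; triangles drawn per frame can end
    // up roughly the same.
    return 0;
}
[/code]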

The problem being that this tessellation unit has been in ATi hardware for, well, three-ish years. It's a HUGE gamble to build a massive tessellation unit capable of doing everything without breaking a sweat, when to date Nvidia have done nothing but screw ATi at every chance and have prevented tessellation from taking off for three years already. Dedicating three times the transistors to a more powerful "tessellator" on die, if no games use it, is a huge waste of cash: every core is more expensive, and a bigger core means inherently lower yields on top of fewer dies per wafer.

Once it takes off, once games use it more often, and once ATi know that Nvidia also support and use the feature in games (meaning more and more games use it), the next generation from them can dedicate much more space to a far more powerful tessellator on die, which can do that work at a much faster rate than normal shaders as well as saving a load of memory.

However, this is where Nvidia might be faltering and tessellation might ultimately fail to make an appearance in many games. If Nvidia only do tessellation in software rather than hardware, it might be so slow that developers can't use it, so lots of games won't support it. Which is again why ATi aren't risking extra wasted transistors... yet.

Tessellation is awesome, but it might simply go unused thanks to a certain bunch of gits :(
 