Crysis - DX9 vs DX10

At least you can see the difference in this game; Gears of War had no difference.

You can't though, not when you use the mod to get Very High on XP. :p

There's no excuse (at this stage). Crytek have lied big time about Quad Core and Dx10 in Crysis: not only does it run slower, it's also the same graphics-wise... and the game runs better in XP.
 
Crytek have lied big time about Quad Core and Dx10

Yeah, whatever happened to optimising it for multi-core...

As far as I can see, it doesn't max out most CPUs, so what's the point in having the extra horsepower if it doesn't make any reasonable difference? I'd love to see a 1/2/4 core comparison, with overclocking, using the same GFX card (an 8800GT would be good, or a 3870; midrange price range).
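
If anyone wants to try that comparison at home without swapping CPUs, you can roughly fake it by pinning the game to fewer cores: Task Manager > right-click the Crysis process > Set Affinity, or from a command prompt on newer versions of Windows something like the lines below (the /affinity switch takes a hex bitmask of allowed cores, and I'm assuming the exe is called Crysis.exe and is in the current folder, so treat this as a rough sketch rather than gospel):

start /affinity 1 Crysis.exe
start /affinity 3 Crysis.exe
start /affinity F Crysis.exe

(1 = one core, 3 = two cores, F = all four.) It's not a perfect substitute for a real single/dual/quad chip, but it gives a rough idea of how much, or how little, the extra cores are actually doing.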
 
OK, I must be mad, I can't see any difference between Vista and XP Very High :s and no offense to anyone, but only power noobs take notice of things like this. I mean seriously, when looking at those Vista and XP pictures on Very High, we're supposed to see a difference? ... :/

I can play any game on very low, I really don't mind. When playing online it's so much harder when you have a good PC :/
 
OK, I must be mad, I can't see any difference between Vista and XP Very High :s and no offense, but only power noobs take notice of things like this. I can play any game on low, I really don't mind. When playing online it's so much harder when you have a good PC :/

It's a con; they hide the ability to get Very High on XP and pretend DirectX 10 is required for it.

They should have hidden it better; they look like morons now.
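
For anyone wondering, the 'mod' is nothing clever: it just overrides the spec cvars in a config file (e.g. an autoexec.cfg in the Crysis folder, or via the console) so the DX9 engine uses the Very High values. Roughly something like the following (cvar names from memory, so treat it as a sketch rather than gospel):

con_restricted = 0
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_Shadows = 4
sys_spec_Water = 4
sys_spec_VolumetricEffects = 4
sys_spec_PostProcessing = 4
sys_spec_Particles = 4
sys_spec_Physics = 4
sys_spec_GameEffects = 4
sys_spec_Texture = 4

Anything that genuinely needs a DX10 code path just doesn't kick in under DX9, but the rest of the Very High settings come through fine.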
 
The DX10 'hack' for XP only utilizes roughly 75-80% of the 'Very High' settings, and those aren't DX10 settings (obviously DX10 isn't backwards compatible); rather they're forced through the DX9 engine, which is not how it was intended.

People fail to realise DirectX is not about visual quality; it's about how the visuals are rendered, and THAT determines whether or not the INTENDED level of visual quality can be achieved in a stable environment.

The hand-picked screenshots we all see really have nothing to do with the difference between DX9 and 10. I can assure you, though, that DX10 Very High throughout the entire game looks and plays much differently than with the 'hack'.

It's more to do with visual IQ and physics.


As for the CPU, dual vs quad: no, it doesn't seem that there's a huge difference, but I'm not sure where people got the idea that Crysis was built around quad cores. The game was coded to run on dual cores and SUPPORTS quad cores to the extent that they can give a small improvement. This shouldn't be shocking; quad cores are not especially better than dual cores in real-world 3D applications.


This is all really just bloated nonsense used by people who either A) Can't afford .. or B) Don't want to use Vista for whatever various reasons.

The bottom line is: if you want the full experience, then you need Vista/Dx10; if not, then keep your mouth shut.

There's no point in perpetuating vile and inane hogwash.
 
The DX10 'hack' for XP only utilizes roughly 75-80% of the 'Very High' settings, and those aren't DX10 settings (obviously DX10 isn't backwards compatible); rather they're forced through the DX9 engine, which is not how it was intended.

People fail to realise DirectX is not about visual quality; it's about how the visuals are rendered, and THAT determines whether or not the INTENDED level of visual quality can be achieved in a stable environment.

The hand-picked screenshots we all see really have nothing to do with the difference between DX9 and 10. I can assure you, though, that DX10 Very High throughout the entire game looks and plays much differently than with the 'hack'.

Well, I have seen other screenshots and tried it out for myself, and the difference is nothing; also, Vista/Dx10 Very High runs worse than XP/Dx9 Very High.

It's simply not worth it; they hyped it up so much and it's hardly any better. As for the gameplay, I noticed no difference except prettier visuals.

I see someone is a Dx10 Hype/Vista fan. ;)
 
Both the same; it only changes because they aren't the exact same shots. For example, the water's moving, the trees are swaying, etc., so they haven't captured the exact same placement of everything. In the XP shot the water's not the same as in the Vista shot, as the water constantly moves, and the trees aren't in the same positions because they're swaying all the time. It's totally impossible for them to get every bit of water, every bush and tree as an exact mirror image between the two, as everything's moving. It's just like when you hover your mouse over the XP image from the Vista image: it moves a frame, as the water's moved and the trees have swayed. And I don't see the problem; that is Dx10, it's what it is supposed to look like. :confused:
 
I must admit (and I don't know a lot on the subject)

There doesn't seem to be a lot of difference going from DX9 to DX10 compared to what I remember there being from DX8.1 to DX9 (notably SoF2 to CSS).

I'm probably talking poo so I'll shut up now :)
 
Yeah, whatever happened to optimising it for multi-core...

As far as I can see, it doesn't max out most CPUs, so what's the point in having the extra horsepower if it doesn't make any reasonable difference? I'd love to see a 1/2/4 core comparison, with overclocking, using the same GFX card (an 8800GT would be good, or a 3870; midrange price range).

It does appear a little peculiar of Crytek not to use the extra cores a CPU can provide; instead they rely significantly on pure GPU power. I don't know how things compare to UT3, but I believe Epic are very good at using every ounce of available tech.
 
It would be better if there were, say, 10-second clips of them instead of pictures, as I'm assuming you would notice once moving. (And it would be better if they were the EXACT same pictures! There's obviously going to be different reflections etc. when one is even a fraction of a second off from the other.)
 
There's obviously going to be different reflections etc. when one is even a fraction of a second off from the other.

Exactly. :p

That's the only reason they look different: the water's moved, the trees have moved, etc. They aren't in the same positions, as everything moves. How can they capture an exact mirror image of the water? They can't. :p
 
It does appear a little peculiar of Crytek not to use the extra cores a CPU can provide; instead they rely significantly on pure GPU power. I don't know how things compare to UT3, but I believe Epic are very good at using every ounce of available tech.

Wasn't part of the DX10 spec to offload more work from the CPU?
 
The DX10 'hack' for XP only utilizes roughly 75-80% of the 'Very High' settings, and those aren't DX10 settings (obviously DX10 isn't backwards compatible); rather they're forced through the DX9 engine, which is not how it was intended.

People fail to realise DirectX is not about visual quality; it's about how the visuals are rendered, and THAT determines whether or not the INTENDED level of visual quality can be achieved in a stable environment.

DX9 + hack works just as stably as DX10. And there is no proof that DX10 runs things faster than DX9; rather the opposite, you may find.

And as for physics, I don't notice any effects missing in DX9 at medium/high settings compared to DX10, and I generally am quite "inventive" with my use of environmental objects (throwing a slide-top bin through a machine gun nest, for example).



The hand-picked screenshots we all see really have nothing to do with the difference between DX9 and 10. I can assure you, though, that DX10 Very High throughout the entire game looks and plays much differently than with the 'hack'.

It's more to do with visual IQ and physics.

"It has nothing to do with the visuals"...... "its more to do with the image quality" ? WTF mate, the visuals DISPLAY the image quality. Think about what you are saying......

As for the CPU, dual vs quad: no, it doesn't seem that there's a huge difference, but I'm not sure where people got the idea that Crysis was built around quad cores. The game was coded to run on dual cores and SUPPORTS quad cores to the extent that they can give a small improvement. This shouldn't be shocking; quad cores are not especially better than dual cores in real-world 3D applications.

So you are telling me that image programs don't use quads more than duals? I've seen different... And a lot of the hype with Crysis was that it would be one of the first games coded to take advantage of quad cores; I heard a lot of "quads are useless now, but when properly threaded games like Crysis are out they will be more useful" type talk. ALL games support quads... they don't crash with them, and you can offload some of the other background apps onto them, but that doesn't mean they are threaded for them.
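
To illustrate the difference (just a toy sketch in modern C++, nothing to do with Crytek's actual code): a game designed around dual core is basically the first function below, one main thread plus one worker, and a quad only helps if something extra happens to land on the spare cores; a properly threaded engine would look more like the second, splitting the frame's work across however many cores it finds.

#include <thread>
#include <vector>
#include <cstdio>

// Hypothetical per-frame tasks, purely for illustration.
void updatePhysics() { /* ... */ }
void updateAI()      { /* ... */ }
void render()        { /* ... */ }
void doChunkOfWork(unsigned chunk, unsigned numChunks) { /* ... */ }

// "Dual-core" design: one worker alongside the main thread.
// On a quad the two extra cores mostly sit idle.
void frameDualCoreStyle() {
    std::thread worker(updatePhysics);   // runs on a second core
    updateAI();                          // main thread
    render();                            // main thread
    worker.join();
}

// Threaded-for-multi-core design: split the frame's work across
// however many hardware threads the machine actually reports.
void frameMultiCoreStyle() {
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 2;                   // fall back if unknown
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < n; ++i)
        workers.emplace_back(doChunkOfWork, i, n);
    for (auto& t : workers) t.join();
    render();
}

int main() {
    std::printf("cores reported: %u\n", std::thread::hardware_concurrency());
    frameDualCoreStyle();
    frameMultiCoreStyle();
    return 0;
}

The first style is what "supports quad core" usually amounts to; the second is what the marketing implied.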

This is all really just bloated nonsense used by people who either A) Can't afford .. or B) Don't want to use Vista for whatever various reasons.

The bottom line is: if you want the full experience, then you need Vista/Dx10; if not, then keep your mouth shut.

There's no point in perpetuating vile and inane hogwash.

If I saw the benefit, I really would upgrade to Vista; I have mates that have upgraded and are totally happy, and some of the stuff looks quite neat... However, every time I look at screenshot comparisons I am totally disappointed by DX10, and seeing as XP does pretty well as far as I am concerned, I don't feel any pressure to upgrade.

Emphasis should be on encouraging people to upgrade their systems by proving the benefits, not coercing them with false promises. Likewise, before you have a go at all the people complaining about the very noticeable DX10 faults as "perpetuating vile and inane hogwash", maybe you should look at the hogwash you yourself are perpetuating, and the inconsistencies in your own arguments.

TBH, I kinda wonder whether EA took over Crysis and didn't let Crytek do any proper optimisation for quads, or make life easier on the GFX cards, in order to force people to upgrade (they are a major Nvidia TWIMTBP partner), and covered it with the "Oh, Far Cry never ran very well to start with either" type comments.
 
This is all really just bloated nonsense used by people who either A) Can't afford .. or B) Don't want to use Vista for whatever various reasons.

The bottom line is: if you want the full experience, then you need Vista/Dx10; if not, then keep your mouth shut.
I've got both XP and Vista installed (that would make me unbiased, wouldn't it?) and Very High looks very similar on both to me. No noticeable difference, I would say. It's also unplayable on both, FWIW :D High runs quite a lot better on XP, especially the minimum framerate. The performance dips under Vista are far more noticeable.
 
Crytek are taking the **** with the quad core support in Crysis. I see it all as just a marketing ploy, with Intel backhanding Crytek to make out that quads will give a significant boost in the game, when in reality they are no better than dual core CPUs. The same can be said for multi-GPU support: all hyped up and nowhere in sight. FFS, TimeShift and UT3 make real use of quads; Crytek should maybe get some tips off those game developers. :rolleyes:
 
TBH, people should be expecting this from Crytek by now. Remember the pixel shader 3.0 patch for Far Cry around the time the 6800 series launched? Remember the 64-bit version when the A64 launched? Crytek have no scruples about pimping whatever software/hardware they are asked to if the money is right. This time round they've done it with quad core and with either Vista or DX10 hardware, depending on which way you look at it.

Just wait for the Crysis DX10.1 support patch.
 