Infiltrator: GTX 680-Powered Unreal Engine 4 Tech Demo Unveiled

Well, nVidia have a habit of struggling with the truth, so most people tend to wait until it's out there for people to test rather than lending weight to nVidia's claims.

PhysX 3.0 isn't "CPU/GPU". What nVidia had been refusing to do until PhysX 3.0 was to support modern instruction sets such as SSE, as well as to improve multithreaded performance.

When PhysX was first running on nVidia GPUs, the CPU path was strictly restricted to a single thread, which really hurt performance, with the apparent intention of exaggerating GeForce GPU performance with PhysX.

This is the real reason why PhysX runs badly on the CPU: nVidia had no interest in making it run well on the CPU when GPUs were involved.
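
To make that concrete, here's a rough sketch (my own illustration, nothing to do with actual PhysX source) of the difference between the old scalar, single-threaded style of CPU code and the SSE-style vectorised code PhysX only gained with 3.0:

Code:
// Hypothetical example, not PhysX source code. Integrating particle
// positions on the CPU: the scalar loop processes one float at a time,
// the SSE loop processes four. PhysX 3.0 added this kind of
// vectorisation plus proper multithreading on top.
#include <xmmintrin.h>  // SSE intrinsics (host-side code)

// Scalar, single-threaded: one float per iteration.
void integrate_scalar(float* pos, const float* vel, int n, float dt) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE: four floats per iteration. Assumes n is a multiple of 4 and the
// arrays are 16-byte aligned. Splitting this index range across several
// threads on top is the multithreading part.
void integrate_sse(float* pos, const float* vel, int n, float dt) {
    const __m128 vdt = _mm_set1_ps(dt);
    for (int i = 0; i < n; i += 4) {
        __m128 p = _mm_load_ps(pos + i);
        __m128 v = _mm_load_ps(vel + i);
        _mm_store_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
}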

What they are basically saying there is that most of PhysX runs on the CPU, but particle effects (things flowing: water, smoke, sparks, that sort of stuff) are simulated on the GPU.

In effect, they're indirectly admitting that a lot of the effects don't actually require massive amounts of GPU performance to run. That is generally the basis of my argument about PhysX, rather than the one you claim I make, that people shouldn't like PhysX and the effects are crap.
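
To show why the particle work is such a natural fit for the GPU, here's a minimal sketch (my own, not nVidia's code) of what that sort of simulation boils down to; one thread per particle, and the per-particle maths is tiny:

Code:
// Minimal GPU particle step: each CUDA thread advances one particle.
// Embarrassingly parallel, which is why spark/smoke/debris counts
// scale so well on a GPU; each particle's maths is trivial.
__global__ void step_particles(float3* pos, float3* vel,
                               int n, float dt, float3 gravity) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    vel[i].x += gravity.x * dt;
    vel[i].y += gravity.y * dt;
    vel[i].z += gravity.z * dt;
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}
// Launch with e.g.:
// step_particles<<<(n + 255) / 256, 256>>>(pos, vel, n, dt, g);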
 
Was anyone else trying to wipe their screen when watching this?

Those effects are a little weird. I don't get why they are so prevalent in games, like lens flares as well, which are only visible through, well, a lens.

I think the effects are nice, like in Crysis 3 where you see marks, scuffs and fingerprints on the nanosuit's visor, but when they're used on characters that don't have visors on, it comes across as the developers not really understanding why things like lens flares and lens smudges even happen.
 

APEX Turbulence provides an interactive particle system based on Eulerian fluid simulation. Since the APEX Turbulence technology is PhysX-independent, it works very well with CryEngine.
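
For anyone wondering what "Eulerian" means there: the fluid is simulated on a fixed grid rather than as free-flying particles. As a rough sketch (my own illustration, not APEX code), the core advection step of such a solver looks something like this:

Code:
// Semi-Lagrangian advection on a fixed 2D grid (the "Eulerian" part):
// each cell traces backwards along the velocity field and samples the
// old quantity (e.g. smoke density) there. Illustrative only.
__global__ void advect(const float* q_old, float* q_new,
                       const float* vx, const float* vy,
                       int w, int h, float dt) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int i = y * w + x;
    // Backtrace the cell centre, clamped to the grid interior.
    float sx = fminf(fmaxf(x - vx[i] * dt, 0.0f), w - 1.001f);
    float sy = fminf(fmaxf(y - vy[i] * dt, 0.0f), h - 1.001f);
    int x0 = (int)sx, y0 = (int)sy;
    float fx = sx - x0, fy = sy - y0;
    // Bilinear interpolation of the advected quantity.
    float a = q_old[y0 * w + x0] * (1 - fx) + q_old[y0 * w + x0 + 1] * fx;
    float b = q_old[(y0 + 1) * w + x0] * (1 - fx)
            + q_old[(y0 + 1) * w + x0 + 1] * fx;
    q_new[i] = a * (1 - fy) + b * fy;
}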



 
Those effects are a little weird. I don't get why they are so prevalent in games, like lens flares as well, which are only visible through, well, a lens.

As always, it's an attempt to make the game feel "more cinematic". Cinema is where audiences are used to seeing epic battles unfold, and developers attempt to cash in on this association.


IMO the demo looks very promising, and is a nice indication of where games will be headed with the new generation of consoles (and associated PC ports).
 
Did Crytek not drop Nvidia PhysX for CryEngine 3 and develop an in-house physics engine? I'm sure I read that somewhere.

Somebody smarter than me on the subject will have to answer. I have been trying to find out if the PhysX 3.0 engine is still limited to Nvidia GPUs but can't find anything solid.
 
Somebody smarter than me on the subject will have to answer. I have been trying to find out if the PhysX 3.0 engine is still limited to Nvidia GPUs but can't find anything solid.

The hardware on both sides is capable of implementing any set of floating-point computations. It's purely a software issue.

Nvidia implement PhysX through CUDA, at least for the PC implementation, but there is no reason why the same computations couldn't be expressed in OpenCL (for example) and run on any compatible hardware. GPUs will execute whatever instructions they are given.

The fact that PhysX is being licensed for use on the next-gen consoles (which run on AMD hardware) is evidence that Nvidia has the will and capability to do this. Holding back PhysX from AMD hardware on the PC is therefore a marketing strategy, rather than a technical issue.
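
To illustrate how little the maths cares about the API, here's a sketch (my own, not anything from PhysX) of the same trivial kernel written for both; any OpenCL-capable AMD GPU could run the second one:

Code:
// The same computation under two APIs. CUDA version (Nvidia only):
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}
/* Identical maths as an OpenCL kernel, runnable on AMD, Intel or
   Nvidia hardware:

   __kernel void saxpy(int n, float a, __global const float* x,
                       __global float* y) {
       int i = get_global_id(0);
       if (i < n) y[i] = a * x[i] + y[i];
   }
*/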
 
1. thanks moogleys, demo is bloody incredible! but also it's either running at 30fps (maybe with motion blur) or someone is outright lying about a "single off the shelf gtx 680".

2. of course nvidia was going to licence physx to consoles and not to pc amd hardware. nvidia is securing both a higher interest in future nvidia gpu sales and physx' future development. to finish it off they're going around declaring the end of consoles, and how weak the next gen consoles are compared to current pcs, while amd are left with the apparently smaller profit margins per silicon die on consoles. i've always thought this but nvidia clearly have a better business/marketing mind. it doesn't matter if physx is artificially locked by nvidia or not, we're never going to get it on amd gpus, so there's no point crying about it. amd just need to come up with their own physics.

the way they could challenge physx is to (like tressfx) create a technology that is accessible by both amd and nvidia hardware, but that performs astronomically better on amd's gpu architectures than it does on nvidia's. that way they don't have to try as hard as nvidia to keep their physics api relevant and worth the trouble for developers. but without meaning to sound negative, amd have much more important things to work on atm, and they're probably stretched thin - a new memory manager for gcn, single and multi-card frame latency fixes (which they plan to release by july), basic missing driver options (they should really consider buying radeonpro), and any number of bugs and missing features either specific or not to gcn. i just kind of feel like talking about gpu physics and amd in the same sentence is pointless at this stage.
 
demos of the unreal 4 engine have been shown before on a single gtx 680 and at 30 fps.

this is why a lot thought the ps4 would get a card based around the 680.

so it doesn't seem far-fetched.
 
1. thanks moogleys, demo is bloody incredible! but also it's either running at 30fps (maybe with motion blur) or someone is outright lying about a "single off the shelf gtx 680".

2. of course nvidia was going to licence physx to consoles and not to pc amd hardware. nvidia is securing both a higher interest in future nvidia gpu sales and physx' future development. to finish it off they're going around declaring the end of consoles, and how weak the next gen consoles are compared to current pcs, while amd are left with the apparently smaller profit margins per silicon die on consoles. i've always thought this but nvidia clearly have a better business/marketing mind. it doesn't matter if physx is artificially locked by nvidia or not, we're never going to get it on amd gpus, so there's no point crying about it. amd just need to come up with their own physics.

the way they could challenge physx is to (like tressfx) create a technology that is accessible by both amd and nvidia hardware, but that performs astronomically better on amd's gpu architectures than it does on nvidia's. that way they don't have to try as hard as nvidia to keep their physics api relevant and worth the trouble for developers. but without meaning to sound negative, amd have much more important things to work on atm, and they're probably stretched thin - a new memory manager for gcn, single and multi-card frame latency fixes (which they plan to release by july), basic missing driver options (they should really consider buying radeonpro), and any number of bugs and missing features either specific or not to gcn. i just kind of feel like talking about gpu physics and amd in the same sentence is pointless at this stage.

AMD have Havok for their physics, which is OpenCL-based and cross-platform, unlike PhysX, which is currently Nvidia only. No reason PhysX can't run on an AMD GPU, other than Nvidia saying no to it.
 
I can't help but think this is the video equivalent of a bullshot.

A bullshot is a screenshot claiming to be taken "in-engine" or "in-game", but which has either been passed through an unnatural amount of post-processing, been photoshopped, or had the game scene recreated and tarted up with LuxRender or something.

I'm sure there are things you can do for video as well. Gonna have to ask Rroff or someone to come up with theories on how though.
 
I can't help but think this is the video equivalent of a bullshot.

A bullshot is a screenshot claiming to be taken "in-engine" or "in-game", but which has either been passed through an unnatural amount of post-processing, been photoshopped, or had the game scene recreated and tarted up with LuxRender or something.

I'm sure there are things you can do for video as well. Gonna have to ask Rroff or someone to come up with theories on how though.

It doesn't look quite good enough to suggest that it's actually been rendered using an unbiased rendering engine.

It certainly looks very nice, but it will be very optimised for the PC it's running on. However, it simply looks like the accumulation of the advanced graphical effects we've been seeing in games like Crysis 3 over the years. The demo looks great, but that level of quality, I would say, is to be expected considering we're at the start of a new generation of real-time 3D graphics.
 
AMD have Havok for their physics, which is OpenCL-based and cross-platform, unlike PhysX, which is currently Nvidia only. No reason PhysX can't run on an AMD GPU, other than Nvidia saying no to it.

AMD don't have Havok (it's owned by Intel); developers, however, have Havok there to use, and it has been demoed on the PS4, which I think was hardware-accelerated as well.
 
AMD don't have Havok (it's owned by Intel); developers, however, have Havok there to use, and it has been demoed on the PS4, which I think was hardware-accelerated as well.

Fair play. I have been reading up so much on PhysX and Havok, things are becoming a blur :)

PhysX - Havok - Bullet - Vortex, they are all becoming one now in my head :(
 
No reason it wouldn't be; it's only a cinematic demo. I think we watch film projection at around 24fps.

film is frame blended! if you've ever paused a film, you know what that looks like, and how heavy it is. not to mention frametimes are 100% even, adding to the smoothness. a gpu rendering something at 24fps wouldn't have a chance in hell of looking as good as film. though i know what you mean, without interactivity things often look smoother than if you could look around yourself.
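
to put the frame-blending point in code terms, here's a rough sketch (my own, assuming a simple accumulation-buffer approach, which is one common way to approximate a film camera's open shutter):

Code:
// Accumulation-style motion blur: render k subframes within one
// 1/24 s "shutter" interval and average them, approximating the light
// a film camera integrates while its shutter is open. Illustrative only.
__global__ void blend_subframes(const float* subframes, float* out,
                                int pixels, int k) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= pixels) return;
    float sum = 0.0f;
    for (int s = 0; s < k; ++s)
        sum += subframes[s * pixels + i];  // subframe s, pixel i
    out[i] = sum / k;
}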

AMD have Havok for their physics, which is OpenCL-based and cross-platform, unlike PhysX, which is currently Nvidia only. No reason PhysX can't run on an AMD GPU, other than Nvidia saying no to it.

i know, it just doesn't look like nvidia want to share, which is fair enough given how much work they've put into it and how much of a leveraging tool it is for them. havok is intel owned, plus i know havok is supposed to be widespread but for some reason i don't feel like i've ever noticed it. it's just not as in your face as physx apparently(?)

I can't help but think this is the video equivalent of a bullshot.

A bullshot is a screenshot claiming to be taken "in-engine" or "in-game", but which has either been passed through an unnatural amount of post-processing, been photoshopped, or had the game scene recreated and tarted up with LuxRender or something.

I'm sure there are things you can do for video as well. Gonna have to ask Rroff or someone to come up with theories on how though.

which is why i was skeptical about a gtx 680. how can the demo not miss a beat with all these effects... i just don't feel like it would be particularly hard for someone to lie about that kind of thing. but whatever, benefit of the doubt i suppose
 
IIRC, the Samaritan demo was on a single 680 and also looked good, but this just looks so much better. I guess with a fully optimised engine, there is no reason why we can't have those effects and all on a single high-end GPU.

I look forward to seeing the Unreal Engine 4 in a game :)
 