
Project CARS benchmarks

Can't speak too much for AMD as I swapped my pair of 7970s for a pair of 780s early in the development of this title (I was an early backer), but it runs great for me. I run triple screens at 5900x1080, and with everything set to high bar detailed grass and motion blur, but with 4xMSAA enabled, I can get well over 60 fps. It is normally around 100 fps once the start/first lap are out of the way, but I prefer adaptive v-sync and a silky smooth 60 fps locked at my monitors' refresh rate. This is with large fields of 30+ cars and includes wet conditions.
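To put that resolution in perspective, a quick back-of-envelope sketch (pure arithmetic, no game-specific figures):

```python
# Triple-screen 5900x1080 pushes roughly three times the pixels of a single
# 1080p panel every frame, which is why a locked 60 fps there is a solid result.
triple = 5900 * 1080   # total pixels per frame across three screens
single = 1920 * 1080   # pixels per frame on one 1080p screen
ratio = triple / single

print(triple)            # 6372000
print(round(ratio, 2))   # 3.07
```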

Worst case scenario in my standard test: a 36-car mixed GT3 field, wet conditions, with me starting in the middle and letting the field drive by. It will drop to 50 fps briefly before recovering. Ultra settings in this title really do mean ultra. There are some very computationally heavy effects going on, especially in the wet, where the nearest thirty-two cars to you all generate spray! That is a lot of particle effects for any system to handle. On the high setting that drops to a more manageable eight.
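As a rough sketch of why that cars-generating-spray count matters (the per-car particle figure below is a made-up assumption for illustration, not SMS's actual number):

```python
PARTICLES_PER_CAR = 2000  # hypothetical spray particles per car, for illustration

def spray_particle_budget(cars_in_range: int, setting: str) -> int:
    """Particles simulated per frame, given how many nearby cars each tier tracks."""
    tier_limit = {"ultra": 32, "high": 8}[setting]  # cars with spray per quality tier
    return min(cars_in_range, tier_limit) * PARTICLES_PER_CAR

# A full 36-car field in the wet:
print(spray_particle_budget(36, "ultra"))  # 64000
print(spray_particle_budget(36, "high"))   # 16000
```

Ultra ends up simulating four times the spray of high, which lines up with the frame-rate dips being worst in wet, full-grid starts.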

There are also some crazy AA options if you want them. The top option, DSX9, does a custom 3x3 downsample. Better have those Titan Xs at the ready.

Back to AMD v Nvidia: this game does run better on the equivalent Nvidia card due to, I believe, their drivers being a little more efficient in terms of CPU resources consumed. Remember that this title has a very complex physics simulator running at its heart. There are some command line options you can use to enable multi-threading in the renderer and change the number of threads used for the physics engine. I don't know if the multi-threaded renderer is now the default, but most people have found it gives better results with AMD cards. Until recently single-threaded gave the best performance on my system (CPU is an i7 4930K @ 4.3GHz), but in the last couple of months multi-threaded has taken the lead.
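The idea behind a configurable physics thread count can be sketched like this. This is a toy integrator, not the game's engine, and note that CPython threads won't actually speed up pure-Python maths; the point is just the work-splitting structure:

```python
from concurrent.futures import ThreadPoolExecutor

def step_instance(state, dt=1 / 600):
    """Advance one independent physics instance by one 600 Hz sub-step."""
    pos, vel = state
    return (pos + vel * dt, vel - 9.81 * dt)  # drift position, fall under gravity

def step_world(instances, physics_threads=4):
    """Split the per-instance updates across a configurable thread count."""
    with ThreadPoolExecutor(max_workers=physics_threads) as pool:
        return list(pool.map(step_instance, instances))

world = [(0.0, 10.0)] * 1000               # 1000 instances, all launched upwards
world = step_world(world, physics_threads=4)
```

Because each instance is independent, the thread count is a free tuning knob, which is exactly why exposing it as a command line option makes sense when driver CPU overhead varies between vendors.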

Edit: BTW, the suggestion that this title has been deliberately gimped to run badly on AMD hardware could not be further from the truth, as anyone who has backed the title and followed its development from the start can attest. Slightly Mad Studios are a relatively small independent developer; do you really think they want to cut off a large proportion of their target market who are using AMD CPUs/GPUs?

There are a lot of different ways of creating the same physics effects, and not all of them are performance hogs. With CPUs that are capable of calculating physics instances on very efficient instruction sets across multiple threads, there is no reason why physics should be such a performance hog.

I have 190,000 instances here: no Nvidia PhysX GPU, no £800 Intel CPU, and the performance is pretty good.


What you are looking at there is something created using CryEngine's Bullet physics backend. It's similar to HairWorks/TressFX (actually much closer to TressFX than HairWorks), and it reacts to player actions as well as the environment.
 
Vehicle dynamics simulation strikes me as somewhat different to a perhaps less complex simulation of a repeating object such as grass or hair. Not to mention the tyre simulation, which is a whole other order of complexity. GPUs are good for calculating/simulating many recurring, less complex objects, while CPUs are better for calculating/simulating one very complex one?

None of the current leading simulations (rFactor 2, iRacing, Assetto Corsa, Project CARS, RaceRoom Racing Experience, Game Stock Car Extreme) use the GPU for physics simulation, that I know of.

You can find more on Project CARS' tyre model: http://www.wmdportal.com/projectnews/inside-project-cars-seta-tire-model/#more-6739
 
I can run it at 4K on a mix of high/ultra and get 25-35 fps in a wet, 19-car AI race at Barcelona with Win 8.1.

I tried it on a fresh install of the Win 10 tech preview on a spare drive and get a 35 fps minimum, up to 65, in the exact same race and settings, and that's a bare install with the bundled drivers.

Running on a 4770K @ 4.1 and an AMD R9 295X2 at stock (good bang-for-buck card, only cost me £450 new).

Very playable on Win 10 considering, and it should be even better sorted with some new drivers.
 
Vehicle dynamics simulation strikes me as somewhat different to a perhaps less complex simulation of a repeating object such as grass or hair. Not to mention the tyre simulation, which is a whole other order of complexity. GPUs are good for calculating/simulating many recurring, less complex objects, while CPUs are better for calculating/simulating one very complex one?

None of the current leading simulations (rFactor 2, iRacing, Assetto Corsa, Project CARS, RaceRoom Racing Experience, Game Stock Car Extreme) use the GPU for physics simulation, that I know of.

You can find more on Project CARS' tyre model: http://www.wmdportal.com/projectnews/inside-project-cars-seta-tire-model/#more-6739

It's nothing new; I'm sure Project CARS does it very well, but it has actually been done before in racing games as far back as 15 years ago.
There is nothing strenuous about it: the game calculates the physical state of the tyre based on predefined parameters and feeds that back to the player. A CPU can make millions of such calculations a second, which is massively more than it needs for such a task. This type of physics started out on the single-core original PlayStation, and it was quite good, all things considered.
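As a concrete illustration of "physical state from predefined parameters", here is a simplified Pacejka-style ("magic formula") tyre curve. The coefficients are illustrative, and it's far cruder than Project CARS' SETA model, but it shows the scale of the per-step arithmetic: a handful of trig operations per tyre.

```python
import math

def lateral_force(slip_angle_rad, B=10.0, C=1.9, D=1.0, E=0.97):
    """Normalised lateral tyre force as a function of slip angle (Pacejka-like).

    B, C, D, E are the usual stiffness/shape/peak/curvature coefficients;
    the values here are illustrative, not from any real tyre.
    """
    Bx = B * slip_angle_rad
    return D * math.sin(C * math.atan(Bx - E * (Bx - math.atan(Bx))))

# Force rises with slip angle, peaks near the grip limit, then tails off:
for deg in (1, 4, 8, 16):
    print(f"{deg:2d} deg -> {lateral_force(math.radians(deg)):.3f}")
```

Evaluating this for four tyres at a few hundred hertz is trivial work for any modern CPU, which is the poster's point.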

Anything that is an interactive object has physical properties; the engine knows where it is in real time and space. Something as simple as a rotating wheel reacting to user input is physics.
That cylinder I'm shooting at is physics. It even knows where it's being hit and reacts accordingly: hit it on the left and it flicks to the right; hit it at the bottom and it stops spinning or spins backwards. It even knows how hard it's being hit.
When you make these things you give these objects mass, density, gravity, air resistance... properties within their world.
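A minimal sketch of exactly that: an object given mass, gravity and a crude air-resistance term, updated with a few multiply-adds per step. Nothing here is from any particular engine.

```python
from dataclasses import dataclass

@dataclass
class Body:
    mass: float
    pos: float   # height in metres
    vel: float   # vertical velocity in m/s
    drag: float  # crude linear air-resistance coefficient

GRAVITY = -9.81

def step(body: Body, dt: float = 1 / 120) -> None:
    """Semi-implicit Euler: a few multiply-adds per object per step."""
    force = body.mass * GRAVITY - body.drag * body.vel
    body.vel += (force / body.mass) * dt
    body.pos += body.vel * dt

ball = Body(mass=1.0, pos=100.0, vel=0.0, drag=0.05)
for _ in range(120):        # one simulated second at 120 Hz
    step(ball)
print(round(ball.pos, 1))   # the ball has fallen roughly 4.9 m
```

Even a mid-range CPU can run millions of updates like this per second, which is why rigid-body work alone doesn't justify a GPU.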

It does not require massive calculations for all that to work. In 1998 it may have been a little hard going, but with the compute power both Intel and AMD CPUs have today, even in the lower mid-range, it's easy.
It's only if you use methods and APIs from the 1990s that you might need a GPU, and if that GPU is one of your proprietary products then you're onto a winner.

Here is something you see in one vendor's brand of games, and that vendor says you need the compute power of their GPUs to do it.

Erm? No, you don't.

 
Solid body physics is about as simple an example as you can get, which is why it is about the only type used on a purely CPU-bound system. Accurately simulating a car is a far more complex proposition; F1 teams have racks of servers dedicated to such tasks.
 
Solid body physics is about as simple an example as you can get, which is why it is about the only type used on a purely CPU-bound system. Accurately simulating a car is a far more complex proposition; F1 teams have racks of servers dedicated to such tasks.
A car in a game does not have a mind of its own; it does what you or a piece of programming tells it to do. Not only is it completely predictable, it's not even that: it's just taking instructions, that's all it does.

You can't compare real life with something someone created that is then played back on your screen based on the parameters he gave it; they are a million miles apart.
 
A car in a game does not have a mind of its own; it does what you or a piece of programming tells it to do. Not only is it completely predictable, it's not even that: it's just taking instructions, that's all it does.

You can't compare real life with something someone created that is then played back on your screen based on the parameters he gave it; they are a million miles apart.

wut?
 
If AMD/Nvidia want to state/imply it's free advertising, then they are good liars. Advertising is not free in this world; money/goods/services always change hands. IMO it's naive to think otherwise.

Fed up with the politics (never got into GW's debates since BA), I just want to game. As I said, whoever is providing the most game time will get my next purchase (regardless of where the blame lies), which looks like it'll be Nvidia.

It's a strange one to describe. I wouldn't call it 'free' advertising per se, but it's certainly not a "here is a wedge of cash, advertise us" kind of situation. More: these are the people who supported us and helped us create this game with hours of input and lots of hardware, so we'll advertise them, without it ever having been a stipulation of their involvement :p

I wouldn't say dropping AMD is the answer; more that they just need to stop playing the hard-done-by card and either get things fixed or expose things for how they are. At the moment all they seem to do is point the finger at devs, which for one will make devs not want to work with AMD, and secondly just makes Nvidia look superior.

As others have said, the game ran fine in beta; they must have done something to it on release that's messed up AMD's performance.

If it's a GW title then Nvidia had first shout at optimising; AMD have said they are working with the dev now.

The point is, whether or not it has changed from beta to retail, AMD haven't been in any sort of contact with SMS since October 2014 (circa 6 months). You can't honestly tell me anyone wouldn't expect some changes to the game, the engine and how it runs in the 6 months leading up to launch? Of course there would be, but they decided not to do anything about it until after the game launched.

Ian Bell said:
What can I say but he should take better stock of what's happening in his company. We're reaching out to AMD with all of our efforts. We've provided them 20 keys as I say. They were invited to work with us for years.

Looking through company mails the last I can see they (AMD) talked to us was October of last year.

AMD can cry foul all they like; the point is that, this time around at least, they were negligent. This is a well-documented AAA title that they haven't seen through to release.

I really do want AMD to succeed. They were making brilliant headway with drivers just as I left to get 780s, so how am I supposed to contemplate a move back with support like this? It's like they've taken one step forward, three back, then shot themselves in the face.

/drunken rant over :p
 
Given it works fine on the consoles, which run lower-powered AMD graphics, is it PhysX related, perhaps?
 
The point is, whether or not it has changed from beta to retail, AMD haven't been in any sort of contact with SMS since October 2014 (circa 6 months). You can't honestly tell me anyone wouldn't expect some changes to the game, the engine and how it runs in the 6 months leading up to launch? Of course there would be, but they decided not to do anything about it until after the game launched.

Has it been that long?
 
Given it works fine on the consoles, which run lower-powered AMD graphics, is it PhysX related, perhaps?
Or the AMD driver team are now spending most of their time sorting out their console drivers... ;)
Which could be one of the main reasons why their CrossFire driver support takes so long to come out these days.

(I'm guessing the Xbox One and PS4 console GPU drivers also get updated.)
 
Can anyone recommend GPU overlay monitoring software for this title? Fraps does not work for me, and hasn't in a long time. I think Afterburner is supposed to (correct me if I'm wrong), but I've only managed to get the temperature to display, not the FPS. I'm running a 290X and it's quite choppy, so I need something in-game to help me tweak. Thanks.
 
So I have been reading from many sources, and it seems it's the CPU PhysX that is bringing AMD to its knees: NV asked the developers to add more PhysX at a later date, which the console versions don't have. And just to show it can't all be down to AMD's drivers:

2 x GTX 980 SLI, 25fps and usage GPU only 40%???
4K, everything ultra
no AA

MY PC
i5 4.8, 16GB RAM, SSD

kacperflak [has Project CARS] 11 hours ago
problem solved
Nvidia Panel -> PhysX -> CPU = 25fps, 40% GPU
Nvidia Panel -> PhysX -> Default = 60fps, 100% GPU
http://steamcommunity.com/app/234630/discussions/0/613957600537550716/

So even with the highest realistically affordable NV setup, the fps will tank if the PhysX runs on the CPU, which is the only option AMD users have.
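The arithmetic behind that quoted 25 fps / 40% GPU result is simple pipeline maths; the millisecond figures below are illustrative, not measured:

```python
def frame_stats(gpu_ms: float, cpu_physics_ms: float):
    """Pipelined frame: fps is capped by the slower stage, and GPU utilisation
    is the fraction of each frame the GPU actually spends working."""
    frame_ms = max(gpu_ms, cpu_physics_ms)
    return 1000.0 / frame_ms, 100.0 * gpu_ms / frame_ms

# Physics cost hidden inside the GPU's 16 ms of work: GPU stays saturated.
print(frame_stats(gpu_ms=16.0, cpu_physics_ms=10.0))  # (62.5, 100.0)
# Physics forced onto the CPU at 40 ms per frame: the GPU sits idle waiting.
print(frame_stats(gpu_ms=16.0, cpu_physics_ms=40.0))  # (25.0, 40.0)
```

With a hypothetical 40 ms CPU physics stage, the model reproduces exactly the symptoms in the quoted post: 25 fps with the GPU at 40% usage.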

Also, there's a PhysX poll running here with some good info.
http://forums.anandtech.com/showthread.php?t=2430693&page=12
 
So I have been reading from many sources, and it seems it's the CPU PhysX that is bringing AMD to its knees: NV asked the developers to add more PhysX at a later date, which the console versions don't have. And just to show it can't all be down to AMD's drivers:




http://steamcommunity.com/app/234630/discussions/0/613957600537550716/

So even with the highest realistically affordable NV setup, the fps will tank if the PhysX runs on the CPU, which is the only option AMD users have.

Also, there's a PhysX poll running here with some good info.
http://forums.anandtech.com/showthread.php?t=2430693&page=12

So that's why I'm getting 100% CPU load and 50% GPU usage. What a **** take.
 
As others have said, the game ran fine in beta; they must have done something to it on release that's messed up AMD's performance.

If it's a GW title then Nvidia had first shout at optimising; AMD have said they are working with the dev now.

So you expect the developer not to make any significant changes between beta and final so as not to break the optimisations AMD made a year or so ago? According to the developer, AMD were provided with keys and asked to sit at the table, so they have no excuse really.

The quotes from the developer are pretty damning really:

I asked the big boss Ian Bell for permission to post what he said about AMD putting the blame on SMS for the performance problems:
We've provided AMD with 20 keys for game testing as they work on the driver side.


But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips.


What can I say but he should take better stock of what's happening in his company. We're reaching out to AMD with all of our efforts. We've provided them 20 keys as I say. They were invited to work with us for years.


Looking through company mails the last I can see they (AMD) talked to us was October of last year.


Categorically, Nvidia have not paid us a penny. They have though been very forthcoming with support and co-marketing work at their instigation.


We've had emails back and forth with them yesterday also. I reiterate that this is mainly a driver issue but we'll obviously do anything we can from our side.
 
Déjà vu

Need for Speed: Shift Patch 2 accelerates Radeon graphics cards

Same developers.

Need for Speed: Shift Patch 2 - Background
When Need for Speed: Shift was released, we criticized the surprisingly low performance of AMD's Radeon cards. Especially in scenes with many vehicles the framerate was bad, no matter if the resolution was set to 800 x 600, 1920 x 1200 or any other resolution. The first time the racing game was updated, the problem was not solved, but the second patch for Need for Speed: Shift delivers more frames per second; according to the readme, because of "Improved ATI graphics card support". The reason for the hitherto poor performance of Radeon cards has not been unveiled, although rumors say that certain shader routines had not been optimized. Honi soit qui mal y pense: Shift is part of Nvidia's TWIMTBP program.

Need for Speed: Shift - Benchmarks
In order to show the huge performance benefit for Radeon cards delivered by the patch, we race against 15 computer opponents on "Brands Hatch GP" in broad daylight and record the framerate for 30 seconds. Without the patch a Radeon HD 5850 wasn't able to exceed 45 fps on average at 1280 x 1024 or 1680 x 1050 (each with 4x MSAA and 16:1 AF), but with the update to version 1.02 performance is increased by 60 and 44 percent respectively. At 2560 x 1600 the performance advantage of "only" 19 percent is, as expected, a little smaller. The Radeon HD 4000 cards also benefit from the patch; in our benchmarks a Radeon HD 4870 runs between 11 and 34 percent faster (depending on the settings). In terms of performance, Geforce owners on the other hand don't necessarily need the patch, since the game runs "only" 2 to 4 percent faster.
http://www.pcgameshardware.com/aid,...-2-accelerates-Radeon-graphics-cards/Reviews/
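For reference, the percentage figures in benchmark write-ups like this are plain frame-rate ratios:

```python
def gain_pct(before_fps: float, after_fps: float) -> float:
    """Percentage speedup from before/after average frame rates."""
    return (after_fps / before_fps - 1.0) * 100.0

# e.g. a card averaging 45 fps before the patch and 72 fps after
# is the quoted 60 percent gain:
print(round(gain_pct(45, 72)))  # 60
```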
 