2080 to 3070 Ti = less Rust FPS

Hey,

So I upgraded from a regular 2080 (non-Super, non-Ti) to a 3070 Ti.

I was expecting some sort of FPS gain in Rust, but it's actually gone down, which has left me baffled.

Before, with my 2080 at the same 1440p settings, I was getting around 130-136 fps.
Now with the 3070 Ti it's 98-100 fps, roughly a 25% drop.

Any idea what the issue could be? I wasn't expecting a huge FPS increase, but I certainly wasn't expecting a decrease lol.

Specs -
11700K at stock (boosts to 5.0/5.2GHz)
32GB 3600MHz CL16 RAM
850W be quiet! power supply

Cheers
 
I would start with a motherboard BIOS update and go from there, and make sure to enable Resizable BAR in the BIOS too.

Try that, and do a full driver uninstall and reinstall as well, using the clean install option so it removes any settings that may still be there from the previous card.

Also, if that fails to improve things, try DDU: https://www.wagnardsoft.com/content/Display-Driver-Uninstaller-DDU-V18060-Released
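If you want to double-check that the old driver packages really are gone afterwards, something like this quick Python sketch (my own, nothing official) lists any NVIDIA driver packages Windows still has staged, via the built-in pnputil tool:

    # Lists leftover NVIDIA driver packages still staged in the Windows driver
    # store (run from an elevated prompt). Purely a diagnostic - DDU does the
    # actual cleanup.
    import subprocess

    output = subprocess.run(
        ["pnputil", "/enum-drivers"],
        capture_output=True, text=True, check=True
    ).stdout

    # pnputil prints one "Key: Value" block per driver package, separated by
    # blank lines; show only the blocks that mention NVIDIA.
    for block in output.split("\n\n"):
        if "NVIDIA" in block:
            print(block, "\n")

If that prints nothing, the old packages are gone.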


Also, have you tried uninstalling and reinstalling the game, or forcing it to recompile its shaders? Sometimes changing cards can affect installed games, and they may need reinstalling to detect the new hardware correctly again, or to recompile their shaders.

You could also delete the Nvidia compiled shader cache, which will force installed games to recompile their shaders too:

C:\Users\YOURUSER\AppData\Local\NVIDIA (there are DX and GL cache folders)
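If you'd rather not dig through Explorer, a small Python sketch along these lines clears both caches; the folder names (DXCache/GLCache) are what recent drivers use and may differ on yours:

    # Deletes Nvidia's compiled shader caches so games rebuild them on next
    # launch. Folder names are as seen on recent drivers - adjust if yours differ.
    import os
    import shutil

    base = os.path.join(os.environ["LOCALAPPDATA"], "NVIDIA")
    for cache in ("DXCache", "GLCache"):
        path = os.path.join(base, cache)
        if os.path.isdir(path):
            shutil.rmtree(path, ignore_errors=True)  # driver recreates it on demand
            print("Cleared", path)

Expect the first launch of each game afterwards to stutter a bit while the cache rebuilds.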



 
Obvious question, I know, but you're sure it didn't automatically detect the change and up your settings/detail levels? Some games will do this.
 
As for Rust itself, I'm pretty sure it's more CPU-bound than GPU-bound, so I would've expected hardly any FPS difference either way. Another thing to note: I'm pretty sure Rust has weird optimisation issues. Perhaps it's not as well optimised for Ampere cards as it was for Turing. Have you tested other games to see if they regressed as well?
 
Thanks for the replies all,
I won't quote the posts as it'd make for quite a long post, but purg, it's a fresh install of Windows with the latest drivers all round, so there's nothing left behind from the previous card :)

Tetras, I did check: on a fresh install of Rust it always defaults to detail level 4, and I always bump it to 6 with quality maxed out etc. I can very clearly remember what settings I was running before and they're the same now; I'm just getting less FPS. Nvidia DLSS doesn't do a thing other than bump it up by like 3-5 fps lol

Orc, possibly yeah, because I've not changed CPU, but I'd still have thought there would be at least some performance increase with it being a stronger GPU, even if the game isn't as well optimised for that particular generation.

Freddie, haha, I watch camomo frequently, and I can guarantee I will never be on one of his vids; I detest cheaters. With that said, I do know the game is full of them just from my own admin experience, they're dumb as..
 
One thing I did do differently was enable Resizable BAR in the BIOS (it was previously disabled), as I read somewhere this benefits the 3000 series cards. I'll try disabling it and test again.
 
Not much help here, and people seem to be trying to justify the lack of FPS. You should definitely be seeing an FPS bump. Something isn't right, but I can't help you as I don't know...
 
Not always. In Squad I went from 60ish to 70ish going from an RX 580 to a 6600 XT.

The game engine is butchered and massively CPU-bound.

Going from an R5 3600 to an R5 5600 I got up to 110fps.

If you're CPU-bound, going from a 1660 to a 4090 would not give you many more FPS; no matter the GPU, the frame rate is capped if you're CPU-bound.
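To put rough numbers on that, here's a toy model (made-up figures, just to illustrate the point): a frame is only done when both the CPU and GPU work for it is done, so the effective frame time is the slower of the two, and once the CPU is the slower side a faster GPU changes nothing:

    # Toy frame-time model: effective frame time is the slower of CPU and GPU.
    # All numbers are invented for illustration, not measurements.

    def fps(cpu_ms: float, gpu_ms: float) -> float:
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms = 10.0  # CPU needs 10 ms per frame -> hard cap of 100 fps
    for name, gpu_ms in [("GTX 1660", 16.0), ("RTX 3070 Ti", 8.0), ("RTX 4090", 3.0)]:
        print(f"{name}: {fps(cpu_ms, gpu_ms):.0f} fps")

    # GTX 1660: 62 fps      (GPU-bound: the GPU is the slower side)
    # RTX 3070 Ti: 100 fps  (now CPU-bound)
    # RTX 4090: 100 fps     (still CPU-bound - the faster GPU changes nothing)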
 
You could try loading up the GPU to take some load off the CPU and see if you get better FPS.

Crank up the settings.
 
My 4080 and 7600X were badly underperforming for a week until I updated my BIOS, which the guy at Memory Express said they'd done during the half hour I had to wait while they dicked around in the back. After the latest BIOS it went from far below average in Time Spy to far above average, and FPS in 1440p Doom Eternal maxed out went from like 220 to like 500.

Something isn't right for such a dramatic loss after an upgrade. Bit of an odd move though, isn't it? While the 4000 series has maybe been a bit underwhelming, a 4070 Ti would still be a massive upgrade, and frame gen is a game-changer in the right situation: it absolutely fills out my 280Hz monitor in some games, and it can turn 70 fps into a high-refresh-rate situation.

I know they're overpriced, but I'm thinking they can still stretch out a future upgrade.
 
Bizarre.
On cranking the settings: I think it's pretty much cranked already, 1440p with maxed-out details. The only thing I turn down is anti-aliasing, to FXAA, as I could never really see the difference personally, although maybe I should max this out to push the GPU harder.
On the BIOS: I'll have to check whether there are any updated BIOSes for my motherboard. I ran 3DMark Time Spy and did get a bump in score compared to my old 2080.

What surprised me is that my 3070 Ti apparently boosted to 1905MHz at one point, according to the 3DMark graph. I don't know much about how boosting works on the 3000 series, but I thought it only went to whatever the board partner set it to (in my case, the MSI model I have is set to boost to 1770MHz). I'm assuming it just had the thermal headroom, so it boosted further; temps during this benchmark never really went over 65°C.

(screenshot: 3DMark graph showing the boost clock)
 
The fact that it shows an improvement in 3DMark does suggest the GPU is working properly.

As for Resizable BAR, it works differently on Nvidia: they have a whitelist of which games can use it, and it's very small and hasn't been updated for nearly a year: https://www.nvidia.com/en-gb/geforce/news/geforce-rtx-30-series-resizable-bar-support/

For Rust it's not enabled, but you can use Nvidia Profile Inspector to force it on. Whether it will help in Rust, I've no idea.

As for boost: the stated boost clock tends to be the minimum it will boost to. The way it works, if there's power/thermal headroom then the GPU will boost higher. You can use MSI Afterburner's curve editor to see what the true peak clock is.
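If you just want the peak number, another rough option is polling nvidia-smi from a small Python loop while a benchmark runs (this assumes nvidia-smi is on your PATH, which the Nvidia driver installer normally sets up, and a single GPU):

    # Polls the current graphics clock once a second via nvidia-smi and tracks
    # the peak - run alongside a benchmark, stop with Ctrl+C.
    import subprocess
    import time

    peak = 0
    try:
        while True:
            # Single-GPU assumption: output is one bare number like "1905".
            mhz = int(subprocess.run(
                ["nvidia-smi", "--query-gpu=clocks.gr",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True
            ).stdout.strip())
            peak = max(peak, mhz)
            print(f"current: {mhz} MHz, peak so far: {peak} MHz")
            time.sleep(1)
    except KeyboardInterrupt:
        print("Peak graphics clock seen:", peak, "MHz")

Afterburner's curve editor shows the same thing graphically, so this is only for the terminal-inclined.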

Just to make sure: are you using any PCIe riser cables? Does GPU-Z, for example, see the GPU running at PCIe 4.0 x16 while in game?

And apart from 3DMark, do you have any other games to test with?
 