If you run the test again using Exclusive full screen mode instead of borderless mode, you might get a better score.
What CPU are you using?
Can you check the memory frequency? It's different to Radox's submission.
Scoreboard updated.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Cheers mate, I will fix it shortly. I thought I had buggered it up; bit rusty, it's been years since I ran a bench thread.

Quoting: "His memory would be half to get it into the same format as mine (or double mine to get into the same format as his, I suppose!). So it would be 4876MHz if written the same way as mine. I have a +1000MHz offset for that saved profile, so adding 500MHz (half of 1000) onto the stock speed of 4876MHz gets to the 5376MHz I posted. That said, I think you made a small copy and paste error and mish-mashed his and my numbers in the results table."
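A minimal sketch of the conversion described in that post, using only the numbers quoted there (the variable names are just illustrative):

```python
# Illustrative only: reproduces the arithmetic quoted in the post above.
stock_mem_mhz = 4876        # stock memory speed in the poster's reporting format
offset_mhz = 1000           # offset entered for the saved profile in the tuning tool
# In this reporting format the tool's offset counts at half scale,
# so halve it before adding to the stock figure.
reported_mem_mhz = stock_mem_mhz + offset_mhz // 2
print(reported_mem_mhz)     # 5376, matching the speed posted in the thread
```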
8.5% difference for me, just tested it with my 24/7 gaming clocks. More than I was expecting. Minimums were better with it off, though.

Quoting: "In this game it's rather more than that according to HW Unboxed. However, in many games the difference is negligible. Regardless, the new AMD cards are much faster in this game than Nvidia anyway."
I will test that next.

Quoting: "Interesting, thanks. I think the difference is even bigger at lower resolutions going by HW Unboxed."
Good point, thanks mate, will update it now.

Quoting: "Sorry, I tried to set the game settings from memory and was wondering why it was not a great score, but I do set the voltage curve lower than default so it doesn't hammer 500W on the PSU. The CPU should be in the screenshot provided (Ryzen 5 3600)."
10.9% at 1080p. Didn't bother testing 1440p.

Quoting: "Interesting, thanks. I think the difference is even bigger at lower resolutions going by HW Unboxed."
She's a keeper mate, lol.

Quoting: "My story isn't much different. Looked at a 3080 on release, didn't get one. Waited for the 6800 XT release, didn't get one. Then the wife sourced a 3090 FE in stock and gave the green light (I was surprised she sanctioned it); I wasn't going to argue. I reckon if I was still trying to get a 3080, I would likely be without one still..."
Have you seen how well the 6000 series undervolt? The 1st place score at 4K in this bench was achieved with a 6900 XT undervolted to 1.1V and drawing 250W. You'll actually gain performance undervolting Big Navi, never mind the temperature, power draw and noise benefits.

Quoting: "TBF to Nvidia, you can undervolt the 3080 and 3090 massively. My tweaked 3080 is delivering better perf than an untouched 3080 and doesn't go above 300W now. Been very impressed with the 3080 FE in terms of temps and noise levels."
Nice score, will update the thread shortly. I found the same at 1440p also, but I've not tried pushing the GPU yet.

Quoting: "Update for 1080p
SAM on
5600X @ 4750MHz
GPU 2760MHz / 2130MHz mem
Driver 2.2.21
Tried but unable to get more than 120fps at 1440p. The AMD frequency limit pulls the 6800 XT back all the time."
Can you provide the GPU clock settings at stock, as shown in the Nvidia Control Panel?

Quoting: "Nvidia RTX 3080 FE @ stock
AMD Ryzen 9 3900X @ stock
32GB Crucial Ballistix 3600MHz CAS 16 (4x8GB) RAM
1440p"
Scoreboard updated.

Quoting: "Update for 1080p
SAM on
5600X @ 4750MHz
GPU 2760MHz / 2130MHz mem
Driver 2.2.21
Tried but unable to get more than 120fps at 1440p. The AMD frequency limit pulls the 6800 XT back all the time."
Boost clock is 1710MHz, Memory clock is 1188MHz (19000MHz effective), not sure that's very accurate in practice though.
I provide the clock speeds as I set them in Radeon Software, but in game it's usually 50MHz lower. What do you see in game?

Quoting: "You will likely be boosting in the 1900MHz - 2000MHz range out of the box at stock."
Hmm, something can't be right there, the memory clock when doubled is only 2376?

Quoting: "Boost clock is 1710MHz, Memory clock is 1188MHz (19000MHz effective), not sure that's very accurate in practice though."
Furry muff. Jeez, why can't it just be the number without all this multiplying, lol. All these different formats.

Quoting: "Stock, the boost behaviour can vary when not using a particular profile, as it picks whatever voltage point it likes. However, my posting of 2175MHz is with one of my saved profiles with an undervolt, and all steps after that are set to 2175MHz, so the card sticks to that clock speed for the entire benchmark run."
You will want to double it to get the same format as the others, so in the same format as mine it would be 4752MHz. Depending on the number a person posts, it will basically be a multiple of 2.
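A quick sketch of the doubling being described, purely illustrative and using only the figures already quoted in this exchange:

```python
# Illustrative only: the 1188MHz figure is the readout quoted for the 3080 FE above.
readout_mhz = 1188                   # memory clock as shown by the monitoring tool
doubled_mhz = readout_mhz * 2        # 2376, the figure questioned a few posts up
thread_format_mhz = doubled_mhz * 2  # 4752, the same clock in the format used in this thread
print(readout_mhz, doubled_mhz, thread_format_mhz)
```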
1440p
91FPS average / 40FPS low
3090 FE @ 2175MHz / 5376MHz
9900K @ stock
Driver: 461.09
Ampere does not scale well as we drop resolution. AMD defo wrecks Ampere in this game. Will be curious to see if it makes any difference when Resizable BAR support comes out for Nvidia; decent gains, it seems.
You are welcome to start your own, mate. I picked this game as I actually own it.

Quoting: "Sir @LtMatt, I recently got Odyssey; is there a thread to necro for this, or is it worth setting one up? Would be good to see if in the last game the wedge between vendors grew wider."
I don't own the game, but if I can get a freebie or pick it up cheap I'll contribute, even if it is not favourable towards Radeon.

Quoting: "It looks like that one has too. I've not checked to see if there was an old thread covering it, but it might be worth it; if there is interest then I will take it forward. I've not completed Valhalla yet (92% though), and with the river raids round the corner I may delay my Odyssey beginning."
Nope, I believe you hinted at such earlier, no?

Quoting: "Was this a Freudian slip for the Valhalla thread?"
I was considering bench threads for those games as well, but I need to check whether they have a built-in benchmark and whether the results screen shows the settings used.

Quoting: "Well, both are AMD sponsored, although Odyssey tends to favour Nvidia, as the older Assassin's Creed games do. However, Valhalla uses much more modern and advanced graphical techniques and in my opinion looks far superior to the previous games in the series. It would be interesting to know the precise technical reason why AMD is doing so much better in AC Valhalla, Dirt 5 and Godfall, but it's a big win for AMD in my book that they are."