
Battlefield V performance


Joker tested Battlefield V at 1080p with DXR from Low to Ultra on a 9900K and an RTX 2070.

Surprisingly, the RTX 2070 can run Battlefield V at 1080p DXR Ultra very playably.

Seems the RTX 2070 is a decent minimum for ray tracing at less than £500. :)
 

Yeah, but 1080p though...

Who buys a 9900K and spends £500 on a graphics card to play at sub-60 fps at 1080p? Got to be having a laugh. Lol
 

Rocking 45 fps at 1080p... the way it's meant to be played!

So much cost for so little performance and such a low resolution. :o
 

If it were a full (even if hybrid) lighting solution including bounced light etc., I'd happily go down to 1080p as long as it wasn't too far below 60 fps, and TBH almost anyone would if they saw a game that did it properly. (At least for single-player and maybe MMOs, but not for twitch FPS, where I usually use "low-pro" type visual settings anyhow.)

BF V merely has RTX reflections and that is it, so no thank you.
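For anyone wondering what "hybrid" means in practice, here is a minimal conceptual sketch: the frame is still rasterised as usual and rays are only dispatched for selected effects. BF V's DXR mode stops at the reflection pass; a fuller hybrid solution would also trace shadow and bounced-light rays. All function names below are illustrative stand-ins, not BF V's or Frostbite's actual code.

```cpp
#include <cstdio>

// Illustrative stand-ins for the real passes; each just logs what it would do.
static void RasteriseGBuffer()       { std::puts("raster: g-buffer + direct lighting"); }
static void TraceReflectionRays()    { std::puts("rays:   reflections"); }
static void TraceShadowRays()        { std::puts("rays:   shadows"); }
static void TraceIndirectLightRays() { std::puts("rays:   bounced light (GI)"); }
static void CompositeAndTonemap()    { std::puts("post:   composite + tonemap"); }

int main()
{
    const bool reflections  = true;   // roughly what BF V ships with
    const bool shadows      = false;  // what a fuller hybrid would add
    const bool bouncedLight = false;  // the expensive part

    RasteriseGBuffer();                          // rasterise the bulk of the frame
    if (reflections)  TraceReflectionRays();     // ray trace only the chosen effects
    if (shadows)      TraceShadowRays();
    if (bouncedLight) TraceIndirectLightRays();
    CompositeAndTonemap();
    return 0;
}
```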
 

I am not sure why you are laughing.

I spent £450 two years ago on a GTX 1070 to play games with graphics options maxed out at 1080p on an 8700K, and I will do the same with an RTX 2080 at 1080p, so nothing wrong with that. Just about everybody plays at 1080p according to the Steam hardware survey: 64.29% of Steam users play at 1920x1080, which is the clear majority; the second largest group, 12.62%, plays at 1366x768, which is usually found on cheap tablets and low-cost laptops around £200. 2560x1440 is third with 3.52% of Steam users, 0.41% play on super-wide 3440x1440 monitors and 1.31% play on 4K HDTVs or 4K monitors.

1080p is still the most popular resolution and still has plenty of life in it; I have used it for 8 years now. I have played a few games at 4K, but the biggest issue is that in-game text is really very small and sometimes unreadable. I wish every game came with a UI scaling option like Star Trek Online, which seems to be the only game with UI scaling support that I have played at 4K with large icons and text. I remember playing Half-Life a very long time ago at 640x480 on a 3dfx Voodoo Banshee and the text was easy to read, but ten years on, at 1280x1024 on an LCD screen, the text in Half-Life was awful, so bloody tiny to read. On a 1080p HDTV the text is finally readable at the maximum UI scaling added in a patch years ago, but at 4K and 8K DSR the text is just totally impossible to read, and Valve will not increase UI scaling further for 4K and beyond. No mod or hack will fix the UI scaling.

I think you would definitely not be laughing if Battlefield V had DXR fallback support and you could see your Titan XP struggling at a 3.5 fps slideshow at 1080p.

The upcoming ray tracing 3DMark demo should have DXR fallback support that will show how badly Pascal runs it, a 3.5 fps slideshow at 1080p, just like the old days when 3DMark 06 back in 2006 ran like a 3.5 fps slideshow at 1024x768 on a dual-core Athlon 64 X2 4400+ and a GeForce 7900 GTX.

It is much better to run ray-traced games at sub-60 fps at 1080p than at a 3.5 fps slideshow at 1080p. Actually, the first and last time I experienced a game running that badly, a 3.5 fps slideshow at 1280x1024, was GTA IV ten years ago, back in 2008, on an Athlon 64 X2 4400+ and a brand-new GeForce 8800 GTX. I ran GTA IV for the first time, was shocked at how unplayable it was, stopped playing after 15 minutes, then ordered a new motherboard, new memory and a Phenom X4 9950 quad-core CPU. I installed the new hardware, ran GTA IV again and BOOM, it was finally playable at 45 fps at 1280x1024 with all graphics settings maxed out. Not surprisingly, it turned out that the very expensive dual-core Athlon 64 X2 4400+ I bought for £400 back in 2005 was the bottleneck.
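For context on the "DXR fallback" point: hardware DXR support is a per-device D3D12 feature, and a game or benchmark can query it before deciding whether to use the GPU's ray tracing hardware, offer a compute-based fallback, or hide the option entirely. A minimal sketch of that query, using the standard D3D12 feature-support API from the Windows 10 SDK (device creation, error handling and any fallback-layer specifics omitted):

```cpp
#include <windows.h>
#include <d3d12.h>

// Returns true if this D3D12 device reports hardware DXR (Tier 1.0 or higher).
// A title with a fallback path could still offer ray tracing when this is
// false, just at slideshow speeds on older GPUs, as described above.
bool SupportsHardwareDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```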
 

You are right, 1080p is the most popular resolution. But to be fair that has nothing to do with how good it is, and out of that 64.29% of people I am willing to bet the majority do not have hardware anywhere close to a 9900K and a 2070. Mainstream is mainstream. This is Overclockers UK bro :p

By the way, nothing wrong with 1080p, but I personally can see the pixels. Games look night-and-day different to me on my 4K monitor. I have not encountered the small-text issue in a long time personally. The last game I had that with was Dragon Age Origins when I installed it a few years ago, and as I recall there was no easy fix. I have not had a game where I could not fix the issue since; these days it just works. Windows 10 scaling is so much better lately: just whack Windows scaling up to 200% and everything is big again, but now looks so much sharper and clearer. The difference is like putting on a pair of glasses.

As for my Titan XP struggling with DXR: by the time that tech has anything to offer me personally, I will be on a 3080 Ti, which will likely, IMO, have a lot more capability than what is on offer today. At least with 4K today, and over the past 4 years that I have been using it, there has been a very nice jump in overall image quality. I recently played both Witcher 3 and Final Fantasy 15 and did a comparison to 1080p, and boy, there is a huge difference in image quality in those two games in particular.

Anyway, as usual, horses for courses. I can't go back to 1080p; I tried it before so I wouldn't have to spend so much on graphics cards, but it killed my enjoyment.
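On the 200% Windows scaling point, the arithmetic behind why text gets its size back while looking sharper is simple: the desktop lays the UI out as if the screen were a quarter of the size, but draws each element with four times the pixels. A tiny sketch of that calculation (the 4K panel and 200% figure are just example numbers, not anyone's actual setup):

```cpp
#include <cstdio>

int main()
{
    const int nativeW = 3840, nativeH = 2160;  // a 4K panel
    const int scalePercent = 200;              // Windows display scaling

    // Effective layout size: the resolution the UI is arranged at.
    const int layoutW = nativeW * 100 / scalePercent;   // 1920
    const int layoutH = nativeH * 100 / scalePercent;   // 1080

    // Pixels per UI element compared to the same layout on a native 1080p screen.
    const int pixelFactor = (scalePercent / 100) * (scalePercent / 100);  // 4x

    std::printf("UI laid out as %dx%d, each element drawn with %dx the pixels\n",
                layoutW, layoutH, pixelFactor);
    return 0;
}
```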
 
Some people don't seem to understand the fundamental problem with RT (at the moment).

It's like going into a fancy restaurant and paying a lot of money for a two-course meal: instead of getting a good main course and a good dessert, you have to settle for a sub-par main course if you want a good dessert, or settle for a sub-par dessert if you want a good main course.
 
Have you tried it yourself? Why not? :)
RT is more like going to a restaurant and trying something that is going to change the world of cuisine. It may be a little expensive, and there might not be much to taste yet either, but you know it's worth trying :). While many won't give a damn about trying it, won't want to pay the £ to do so and will of course respond negatively to those that do, others will want a taste and not care what anyone else thinks.
Watched a video last night where they were getting near 100 fps at 1080p from the Ti (ultra everything, DXR low). A far cry from the 60 fps everyone has been going on about.
I've found it completely workable at 1440p using a 2080 FE and G-Sync. Should I need more fps I can turn it off, but at least I can make that choice. Of course it's going to take a while to develop; in 2+ generations it's going to be awesome.
There is no fundamental problem with RT. It's extremely complex technology that is in its early days of development. End of story. If people don't want to try it because they deem the GPUs too expensive, fine :).
I hope NV and devs get SLI working, because I'm sure there are plenty who would buy another card to get better performance. Would be great if FE buyers could get a discount on a second card too (the more you buy, the more you save :)). I also hope NV release a £1500+ Titan T.
I doubt I'll buy further into the current gen, but I will be looking forward to what's next after trying it already. I bought the last two Titans, but for a first taste of RT, for me personally, I think the 2080 FE is enough. I will be tempted if the Titan T has a lot more RT cores than even the Ti, so I will have to resist.
 
You clearly missed the part (or chose to ignore it) where I said "at the moment".

I am not dismissing RT altogether or its future potential. I was saying that RT in this game, at the moment, is far from impressive: it mostly just makes some subtle improvements to reflections (that people really have to look for to notice) at the cost of a huge performance hit. Rather than enabling RT and being blown away by the visuals, on the current RTX cards in BFV it has been reduced to "can I tick the enable-RT box", regardless of how big or little a visual improvement it actually brings. I suspect DICE has probably gone easy / held back on the RT implementation rather than going all-out pushing it visually, as otherwise we would have another "Crysis situation" on our hands; had they gone all out, the difference RT makes to the game would hit users instantly, even with one eye open, rather than requiring two pictures or two videos side by side and a lot of squinting to find the subtle differences.

I honestly think Nvidia has rushed the launch of RT prematurely (though they have their reasons), as the current 12nm process density does not allow enough Tensor cores and RT cores to be packed in alongside the stream processors. The RT aspect of the RTX 2000 series is more like a beta, or a showpiece for investors and enthusiasts, than a fully fledged product capable of doing proper RT. When 7nm lands, that is probably where the journey of RT truly begins and we see what it is capable of visually in games.

I have said this before, but considering there are most likely not going to be enough games that use RT in a truly meaningful or visually impressive way until the arrival of 7nm anyway, the 2000 series would have been better off as a "GTX", using the extra silicon for more stream processors, texture units and ROPs, rather than as an "RTX" spending that space on Tensor and RT cores that 99.9% of games won't make use of. Buying a 2000 series RTX card for RT is a bit like buying a first-gen electric car with a top speed of 30-40 mph while the charging infrastructure is not yet readily available across the country. It is not about people being too poor or salty just because of the high price; it is more that, even at the high price, the product does not yet have the performance or practical real-world use. First-gen SSDs are an example of this: huge potential, but the small capacity and high price meant they were not yet ready for the mainstream market.
 
Nvidia had nothing that offered me better fps and visuals, so I bought a Vega 56, and it runs BFV great, so I hear.
RTX is a gimmick ATM.
Wait a few weeks and no one will be talking about it with BFV.

If an API halves your fps, you then need to double the horsepower to get back to usable frame rates.
That is really hard to do without die shrinks, and those are starting to slow down and will stop soon.
7nm may be here for years to come, as 5nm and beyond are still up in the air.
 

Most people are at 1080p because it's cheap. It offers high frame rates of course, but given how well even a 1070 does at 1440p, most people will make that step up as and when they can afford to... and it is an extremely noticeable upgrade when you do! The costs of doing so are a barrier for many though, and it's why in the same Steam survey you see most people are gaming with 1060 equivalent GPUs or below. Even an RTX 2070 is going to be out of reach for many gamers at 1080p, assuming it could even manage ray tracing at respectable frame rates in the future.

I agree the current RTX cards don't really seem fit for very much other than more of a beta trial run and to satisfy the 4K gamers who do at least finally have a GPU that offers a solid 60FPS+ in pretty much every game... albeit at a price. As for ray tracing though, forget it. That said, I do think developers will leverage the RTX features far better in future titles, but we're always going to see a large performance hit regardless.
 
Nope, I saw the "at the moment" part. I didn't think there was a fundamental problem ATM, but now you've expanded on what you meant, I understand the point.
The game overall is quite buggy too; even DX12 without RT is a bit rubbish.
I'm in agreement on some aspects, hence why I said the non-Ti was my choice this round, just to get an initial taster. £1,500 Tis aren't for me, that's for sure.
NV need to get this out there and used, though, so I think it is the right step at the right time. For it to develop properly it needs to be used by customers (us) but also by their other customers (game devs), who need to start working with it; otherwise we'll end up in a situation where the hardware is far more advanced than what the game developers are able to do with it (another disappointing situation).
Part of the problem IMO is the 4K monitors many have bought into. Moving from 1200p to 1440p back in the day and experiencing the downsides is exactly the reason I have no interest in 4K.
Oh, and yep, it's obvious DICE have scaled back some aspects, but it's still a pretty decent "early" implementation IMO. You only have to look at how far the "faking it" techniques have come to see how good RT will get.
 

I disagree; the current RTX cards do fit many people like me. It is like back in 1996, when 3D was very much more of a beta run: I saw a 3dfx Voodoo poster next to a PC in PC World with a 3dfx Voodoo 1 card running the 3dfx tech demo, and I was blown away by how stunning 3D looked; it put my 2D card to shame. The Voodoo 1 came at a very high price, £249 for a 3D card, compared to around £50 for cheap 2D cards. On Christmas Day 1996 I played the very first 3dfx-supported game, Tomb Raider, and was absolutely blown away by how stunning the 3D graphics looked compared to an S3 2D card emulating 3D with horrible ugly square pixels, which I hated. I made a very smart decision to go 3D very early with the Voodoo 1, and 3D became very successful a year later. The same thing that happened with 3D will happen with ray tracing on Turing. We are always, ALWAYS, going to see a large performance hit, regardless of whether we use ray tracing, 4x MSAA, 8x MSAA or 4x SSAA. It really does not matter.

I use 4x or 8x MSAA every time in games at 1080p; it gives much better image quality with no jagged edges than poor-quality FXAA, which still leaves some jaggies, but MSAA carries a large performance hit at every resolution, and at 4K the hit is absolutely massive compared to 1080p. THAT is why Nvidia launched Turing with Tensor cores, to give us DLSS, which should give around a 40% performance increase at any resolution.

The big issue I noticed with 4K in games is the power consumption on the power meter: it consumed 100W more than 1080p, and 1440p consumed 50W more than 1080p. 4K and 1440p are not worth it for now; probably in 2 or 3 generations Nvidia GPUs will be energy efficient for 4K gaming. I will stick with 1080p at around 250W for now. Many people still use 1080p and also use the RTSS frame capper to cap to 60 fps, which reduces power consumption hugely further.

Also, I prefer to play games with graphics settings maxed out, like Ultra, where I can notice a big difference in graphics; however, some people don't notice a difference between High or Very High and Ultra and turn settings down to High, as Ultra settings can carry a large performance hit on frame rates too.
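On the frame-cap point, this is roughly what any frame limiter does under the hood: after each frame's work, it sleeps out the remainder of the frame budget so the GPU isn't asked to render frames nobody sees, which is where the power saving comes from. A minimal sketch of the idea; RenderFrame() is a stand-in for the game's real work, and this is not RTSS's actual implementation:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

static void RenderFrame() { /* the game's update + render work would go here */ }

int main()
{
    const int capFps = 60;
    const auto frameBudget =
        std::chrono::duration_cast<std::chrono::steady_clock::duration>(
            std::chrono::duration<double>(1.0 / capFps));

    auto deadline = std::chrono::steady_clock::now();
    for (int frame = 0; frame < 300; ++frame)      // ~5 seconds for the demo
    {
        RenderFrame();
        deadline += frameBudget;                   // next frame's deadline
        std::this_thread::sleep_until(deadline);   // idle instead of racing ahead
    }
    std::puts("Rendered 300 frames at a ~60 fps cap.");
    return 0;
}
```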
 
Excellent video; it shows just how bad ray tracing can look at times.

Very misleading!
 
Check out the horrible pop-in issues here.


Especially when they show the barrel reflections: the pop-in is so close to the player. This isn't unique to BF5; most modern games I play have pop-in really close, and it's so frustrating, like they designed it for a 10-year-old GPU or something.

GTA 5 and FF15 draw shadows about 2 metres in front of you and are both really bad at it.
 


All very alien to me. I've never met anyone who is SO worried about power consumption. Of course it's not something to be ignored, but I'd say you're very much in the minority as a 1080p gamer who is only there because of power consumption. That is quite unusual. And limiting frame rate just to save power!? WOW!! Just WOW! :eek::eek::eek:
 

Frostbite has always been lauded as a "really well optimized" engine, but pop-in has always been an issue in it. BF1 had the same problems, and it's not even done subtly: it's literally a load of stuff appearing right in front of your eyes, instead of the traditional approach of having several versions of each object at varying quality levels for different ranges.
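For what it's worth, that "traditional approach" is just distance-based LOD selection: each mesh ships several versions and the renderer picks one by camera distance, so detail degrades gradually instead of popping in right next to the player. A minimal sketch of the idea; the struct and function names are illustrative, not Frostbite's actual API:

```cpp
#include <cstdio>
#include <vector>

// One detail level of a mesh: used while the object is closer than maxDistance.
struct LodLevel {
    float maxDistance;
    int   meshId;       // handle to the mesh data at this detail level
};

// Pick a LOD for the current camera distance. Levels must be sorted by
// ascending maxDistance; anything beyond the last threshold gets the lowest
// detail version rather than disappearing or popping.
int SelectLod(const std::vector<LodLevel>& lods, float distanceToCamera)
{
    for (const LodLevel& lod : lods)
        if (distanceToCamera <= lod.maxDistance)
            return lod.meshId;
    return lods.back().meshId;
}

int main()
{
    const std::vector<LodLevel> barrel = { {10.0f, 0}, {40.0f, 1}, {150.0f, 2} };
    std::printf("at 5m  -> LOD %d\n", SelectLod(barrel, 5.0f));    // highest detail
    std::printf("at 60m -> LOD %d\n", SelectLod(barrel, 60.0f));   // lowest detail
    return 0;
}
```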

Runs well on my card at 4K, all Ultra with the five gimmicky options (film grain, motion blur, etc.) turned off. Generally above 60 fps on most maps, sometimes in the 80-ish fps range.
 
People typically want more for their money; lower frame rates in exchange for some fancy reflections isn't a great selling point.
 