Battlefield V performance

"We have been" spoiled by the latest Gen on value to performance, specially the 1080ti.

DX12 is still a pain, and it shouldn't be after this many years. (Many reasons, many directions to point fingers in.)

No real new tech has been added to this field in years; most kids don't remember any of that because they were still sucking on the tit.

People like to complain.
____

I'm an old fart as far as this field goes; I had the Voodoos, Rivas, etc., so this is not something new to me. Personally, I purchased a 2080 Ti to guarantee 100 fps (or close) on ultrawide 1440p in most games. Can't complain. And I value Nvidia's decision to not sit still and to keep pushing tech.

People are being objective - the tech is not quite there yet, and the performance increase is more or less matched by the price increase. All the people who got overexcited by the cards are getting annoyed, and it was the same even 15 years ago. I am an old fart too, and it boggles my mind how people can fall for the marketing bumf every time. It was the same with tessellation, AA, etc. - it took years for them to run acceptably, and all the people arguing with those who said the tech would take years to be effective ended up ditching the first-gen cards anyway and buying the newer cards to run the effects better, hence validating what people were saying. Rinse and repeat.

Tessellation was there in 2001, and people got overexcited about that too. It took years for it to become viable. The same with AA, etc.

If you think the GTX 1080 Ti was some price/performance king - LOL. I can remember, back years ago, when a midrange card would match or get close to the high-end ones from the previous generation - things like the 8800GT, for example. Even adjusted for inflation, exchange rates, etc., it was excellent value for money back then.
 
Indeed. I still remember the vast performance hit for enabling 32-bit color. It wasn't until the original GeForce 256 that we were really able to use it without penalty.
 
People are being objective - the tech is not quite there yet, and the performance increase is more or less matched by the price increase. All the people who got overexcited by the cards are getting annoyed, and it was the same even 15 years ago. I am an old fart too, and it boggles my mind how people can fall for the marketing bumf every time. It was the same with tessellation, AA, etc. - it took years for them to run acceptably, and all the people arguing with those who said the tech would take years to be effective ended up ditching the first-gen cards anyway and buying the newer cards to run the effects better, hence validating what people were saying. Rinse and repeat.

If you think the GTX 1080 Ti was some price/performance king - LOL. I can remember, back years ago, when a midrange card would match or get close to the high-end ones from the previous generation - things like the 8800GT, for example.

And who will want a used 2080 Ti once the 3000 series hits? No doubt everyone will drop the 2080 Ti like a stone if a card gets released that can push 100 fps with RT on at 3440x1440.
 
Indeed. I still remember the vast performance hit for enabling 32-bit color. It wasn't until the original GeForce 256 that we were really able to use it without penalty.

Yeah, it's the way these things go - companies will push these things out to sell the next best thing, even if it's not quite there. It's the history of the tech industry.

And who will want a used 2080 Ti once the 3000 series hits? No doubt everyone will drop the 2080 Ti like a stone if a card gets released that can push 100 fps with RT on at 3440x1440.

They will buy the GTX 3080 Ti, OFC, and JHH gets a new outhouse for his leather jackets... made of leather?? :p

Nvidia knows its market well. It has worked before.
 
If you think the GTX 1080 Ti was some price/performance king - LOL. I can remember, back years ago, when a midrange card would match or get close to the high-end ones from the previous generation - things like the 8800GT, for example. Even adjusted for inflation, exchange rates, etc., it was excellent value for money back then.

No, I don't think that at all - I didn't include myself in that lot. Price/performance kings are hardly ever top-of-the-line models.

I had an 8800GT.

But I do think the 1080 Ti was (is) a great performer for the price.
 
No, I don't think that at all - I didn't include myself in that lot. Price/performance kings are hardly ever top-of-the-line models.

I had an 8800GT.

But I do think the 1080 Ti was (is) a great performer for the price.

I suppose compared to the newest cards, it makes it look even better, LOL.

:p
 
Sure does, and were it not for the mining craze it could have been even better.

Well, not only mining but the RAM cartel having a good time too.

Also, another thing is that Turing is more of an Nvidia play to enter VFX (a massive market), and they have essentially merged their consumer and pro lines into one now. They are using gaming sales to prop that play up initially, methinks. It's a bit like Fermi - that was a play into commercial markets propped up by gaming sales. I can understand why they are doing this: 60% of their revenue is still gaming, and they want to diversify even more.
 
And who will want a used 2080 Ti once the 3000 series hits? No doubt everyone will drop the 2080 Ti like a stone if a card gets released that can push 100 fps with RT on at 3440x1440.
That I do agree with, and it's one of the reasons I ended up going for the 2080. I wanted a £1k+ card to last me a good few years and felt the 2080 gave me more of a middle ground, letting me upgrade again next gen without feeling too bad about it. But it's all unknown - if the 20 series gets hit hard, so will the 10 series once the 30 series is released. At least someone with a 20 Ti gets to experience the best on offer atm. Again, it's a bit of an "each to their own" case.
I bought the previous two Titans, but despite my positivity on the 20 series, I decided the 2080 was enough for now, to help keep options open for next gen and to see how NV develop RT and DLSS usage.
 
That I do agree with, and it's one of the reasons I ended up going for the 2080. I wanted a £1k+ card to last me a good few years and felt the 2080 gave me more of a middle ground, letting me upgrade again next gen without feeling too bad about it. But it's all unknown - if the 20 series gets hit hard, so will the 10 series once the 30 series is released. At least someone with a 20 Ti gets to experience the best on offer atm. Again, it's a bit of an "each to their own" case.
I bought the previous two Titans, but despite my positivity on the 20 series, I decided the 2080 was enough for now, to help keep options open for next gen.


You're right that once the 3000 series hits, the 1080 Ti will get hit hard... but I didn't pay £1,300 for it. :)
 
Seems to me it may have been better to load the chip up with RT cores instead of the tensor cores used for DLSS if they really wanted to push RTX. There is probably a good reason why they didn't.
 
No, I don't think that at all - I didn't include myself in that lot. Price/performance kings are hardly ever top-of-the-line models.

I had an 8800GT.

But I do think the 1080 Ti was (is) a great performer for the price.
Agreed - the 1080 Ti is a fantastic performer, but not at the prices charged. I love mine, but I paid absolute peanuts for it.

I'm old enough to remember performance bargains (the AGP 6800GT and Ti 4200 spring to mind). The 20xx series reminds me of the GeForce FX range, which was a poor jump from its predecessor and which people at the time thought overpriced for the small performance gain; the 20xx is just daylight robbery by comparison, and equally if not more flawed, imo.
 
I had a Gigabyte 1080 Ti WaterForce WB SLI setup that I managed to get at robbery prices, to the point that I sold them both last month and made money on them.

That scratch was spent on the 2080 Ti, which I also got a very good deal on. I initially had the Trio X, sold it to a friend in need, and got this deal on a Strix OC. Paid 1,160 EUR. Can't complain much.
 
As I said above, the earlier Titan P owners who kept their GPUs are kind of laughing. It was a £1,099 GPU, but if they're not bothered by the 20 series they can easily sit it out, knowing their GPU offers 2080 performance where DLSS is not used, some 2 years and 3 months after release. So early Titan buyers have had plenty of top-end gaming already. It gives them more options, i.e. upgrade or wait, with still-good performance.
I'm fairly sure NV will be along soon, however, with a tempting Titan T :p.
Anyway, let's hope DICE implement DLSS soon to give a boost in performance, even if RT might still be a challenge.
 
And who will want a used 2080 Ti once the 3000 series hits? No doubt everyone will drop the 2080 Ti like a stone if a card gets released that can push 100 fps with RT on at 3440x1440.


I wouldn't be surprised if Nvidia are making a new card with full RT at 4K res, and it will cost a bomb, lol... I'll not be upgrading for a while now anyway...
 
As I said above, the earlier Titan P owners who kept their GPUs are kind of laughing. It was a £1,099 GPU, but if they're not bothered by the 20 series they can easily sit it out, knowing their GPU offers 2080 performance where DLSS is not used, some 2 years and 3 months after release. So early Titan buyers have had plenty of top-end gaming already. It gives them more options, i.e. upgrade or wait, with still-good performance.
I'm fairly sure NV will be along soon, however, with a tempting Titan T :p.
Anyway, let's hope DICE implement DLSS soon to give a boost in performance, even if RT might still be a challenge.

I bought mine for £675 in June 2017. It was a few weeks old.

I have absolutely no plans of replacing it whatsoever - not until RTX is a big thing that truly makes "normal" games look bad. I guess it will be at least two more generations of RTX before it's noticeable enough for me to take the jump, 'cause right now I'm just not seeing it. Well, I am, but only after considerable staring. I want it to make the game itself better, not be something you need to stop and run around to notice.

I also game at 4K right now, though I understand I will probably need to drop that to 1440p when game engines get heavier, which means at least BFV would look worse.
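
For a rough sense of what that resolution drop buys, the pixel-count arithmetic is simple - a quick back-of-envelope sketch, nothing more:

Code:
# Back-of-envelope pixels-per-frame comparison: dropping from 4K to
# 1440p cuts the shaded pixels by ~2.25x; ultrawide 1440p sits between.
resolutions = {"4K": (3840, 2160), "ultrawide 1440p": (3440, 1440), "1440p": (2560, 1440)}
base = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({base / pixels:.2f}x fewer than 4K)")

All else being equal, that pixel cut is most of the reason the same card that struggles at 4K can hold a playable frame rate at 1440p.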

I think it's kinda naughty of Nvidia to make it look so good in that demo with the enormous puddle full of fire and reflections. They must have been running four Titan Vs for that. It's also out of order that the 2070 right now is theoretically useless.

Shame on you Nvidia, shame on you :(
 
It's the first implementation, and DICE has already acknowledged a few bugs in it (RTX-related, DX12-related): https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list

It will get better. And we have yet to see other games and other dev environments (Vulkan) at work.

Keep calm.


DirectX
  • Micro-stuttering
    • If you're running on DirectX 12 and seeing micro-stuttering in-game, go into the Video settings in-game and toggle DirectX 12 off. Ignore the pop-up, exit back to the options screen, then restart your game.
    • Micro-stuttering can also be down to a driver issue, whether it's a driver conflict or the latest driver for your graphics card not working well with Battlefield V. If you've already tried a clean reinstall of your graphics drivers, you may just need to wait for a new driver to fix it.

DXR
  • DirectX Raytracing - the BFV 11-14-2018 update, which introduces the first release of DXR ray tracing, shipped with a couple of known bugs. We recommend in this first release of DXR ray tracing that "DXR Raytraced Reflections Quality" be set to "Low" for the best experience. We will continue to work with Nvidia to further enhance your ray tracing experience.
  • "Medium" DXR quality preset setting not applied correctly
    • May cause performance and quality in the "Medium" preset to vary depending on which preset you selected last
    • Status: fix coming in an upcoming update.
  • DXR performance degraded in maps which feature a lot of foliage
    • This particularly affects the War Stories "Liberte" and "Tirailleur"
    • Status: currently investigating
  • DXR is not automatically enabled for users who can use the tech
    • This is known. We've provided a brief guide on enabling it, which includes downloading the latest BFV update, Windows update, and GPU drivers.
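
On the micro-stutter workaround above: if the in-game DX12 toggle won't stick, the same switch can apparently be flipped by editing the Frostbite profile file before launch. A minimal sketch - the file location and the GstRender.Dx12Enabled key are assumptions carried over from earlier Frostbite titles, so treat it as a guess and back the file up first:

Code:
# Sketch: force BFV to DX11 by rewriting the profile before launch.
# Assumptions: the profile path and the "GstRender.Dx12Enabled" key,
# named as in earlier Frostbite titles. Back the file up first.
from pathlib import Path

profile = Path.home() / "Documents" / "Battlefield V" / "settings" / "PROFSAVE_profile"
lines = profile.read_text().splitlines()
for i, line in enumerate(lines):
    if line.startswith("GstRender.Dx12Enabled"):
        lines[i] = "GstRender.Dx12Enabled 0"  # 0 = DX11, 1 = DX12
profile.write_text("\n".join(lines) + "\n")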
 
It's the first implementation, and DICE has already acknowledged a few bugs in it (RTX-related, DX12-related): https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list

It will get better. And we have yet to see other games and other dev environments (Vulkan) at work.

Keep calm.

It will get better on future hardware that could be four times as powerful as what we have right now, with four times the number of tensor cores. Nvidia ditched the kitchen sink after Fermi and went for a much lighter, cut-down design. Now they have to start putting the kitchen sink back in, as it's finally time to use it.

However, as I said, that means the cards right now are a joke. It's a token offering, rather than RT utilised fully and properly.

Normally? I would have been silly and played "Chase the Dragon", but I did that with 4K and it cost me a bloody fortune. Like, seriously, I threw about £4k (lmao, ironic huh?) at 4K: two Titan Blacks (new), two Fury Xs, a Titan XM and a Titan XP.

Not doing that again.
 
It will get better on future hardware that could be four times as powerful as what we have right now, with four times the number of tensor cores. Nvidia ditched the kitchen sink after Fermi and went for a much lighter, cut-down design. Now they have to start putting the kitchen sink back in, as it's finally time to use it.

However, as I said, that means the cards right now are a joke. It's a token offering, rather than RT utilised fully and properly.

Normally? I would have been silly and played "Chase the Dragon", but I did that with 4K and it cost me a bloody fortune. Like, seriously, I threw about £4k (lmao, ironic huh?) at 4K: two Titan Blacks (new), two Fury Xs, a Titan XM and a Titan XP.

Not doing that again.

But but but... now is the time for 4K, haven't you heard? The 2080 Ti is the 4K king. :)

Now, seriously,

RTX can be used for many things, some more noticeable than others. It's up to the developers to look at the current hardware's capability and tweak until it is usable in their software... Will it be noticeable? Worthwhile? Dunno.

I see it as if it were developed for a console: they can optimise heavily because the hardware is unique at this point... This is why consoles can do much, much more... with less.

I agree with you that the hardware has to improve, but this is the only hardware we have - no variation - and it's also an opportunity.

Let's just remember that the demo and some of this implementation were developed without any RTX-specific hardware, on the Titan V... There is only so much they can do in this time frame, and with a game release on their hands, they do have priorities - and it's not RTX! It's a very complex game, too...
 
Tessellation was in consumer GPUs from 2001, yet it took years for it to become viable. So I'm not sure why people are getting so excited about some hybrid RT effects which won't run very well, and then getting annoyed when it's pointed out that they don't run very well and the cards cost too much. Nowadays people have no patience - just like with AA. As many people said back then: wait a few years before jumping in. And they were right. The people arguing with them got too excited and wanted to be early adopters, and the benefits were not there due to the terrible performance. So the detractors were correct.

The same with tessellation, and the same with this. Buy the cards on their normal performance. If it works out well for you, then fine. Anything like RTX would be icing on the cake, but also expect the 2nd generation to do it much better.

The major difference between RTX and tessellation is that the tessellation implementation didn't use roughly 25% of the die (as far as I know).

When you compare prices per mm² of die, the 1080 Ti and the 2080 Ti are pretty similar. So if the 2080 Ti didn't have the RTX cores, it could have been up to 25% cheaper.
So owners are paying a lot for something they may not use.
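
To put rough numbers on that - a back-of-envelope sketch using the commonly cited die sizes (GP102 ~471 mm², TU102 ~754 mm²) and US launch prices:

Code:
# Back-of-envelope $/mm^2 comparison using commonly cited die sizes
# (GP102 ~471 mm^2, TU102 ~754 mm^2) and US launch prices.
cards = {
    "GTX 1080 Ti ($699)": (699, 471),
    "RTX 2080 Ti ($999 MSRP)": (999, 754),
    "RTX 2080 Ti ($1199 FE)": (1199, 754),
}
for name, (price, die_mm2) in cards.items():
    print(f"{name}: {price / die_mm2:.2f} $/mm^2")

# If ~25% of TU102 really is RT/tensor hardware, a hypothetical RT-less
# part at the same $/mm^2 would land around 0.75 * $999 ~= $749.
print(f"hypothetical RT-less 2080 Ti: ~${0.75 * 999:.0f}")

On those figures the $/mm² really is in the same ballpark, which is the point: roughly a quarter of what 2080 Ti owners paid went on silicon they may never use.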
 