Another RTX GPU Moan Thread

  • Thread starter: Guest
I don't recall a time in recent history where rendering techniques have been wholly implemented at the time of a GPU launch; it's just not a feasible notion. RTX is for the long game, albeit marketing isn't going to tell you that. The prospects are exciting, but anyone buying into Turing purely for these features should strongly reconsider their understanding of the industry.

Microsoft's VRS has more lasting implications than either of the other two.
 
The only place I ran into memory issues was playing Forza Horizon 4 at 1440p UW ultra on a GTX 1080. That said, having 8GB on the 2080 does seem a bit skimpy.

I run FH4 at 4K Ultra on a Vega 64, which only has 8GB of VRAM, and had no problems. Locked at 60fps, so I'm not sure how a GTX 1080 at 1440p ultrawide would run into memory problems.
 
Agreed - but how is that any different to Battlefield V's built in reflections (some of which I prefer over RT)? Or any other decent 3D engine for that matter?

I thought RT was going to be this holy grail of lighting, so either we've seen a really poor version of it first time around or it's just not all that.

It is a simplified version of it; RT was never really meant for real-time rendering. Taking it just one notch up in detail requires DOUBLE the processing power.

It's not going to be anything more than a gimmick for a long time yet. Maybe one day it will replace traditional lighting we see in games, but we're probably looking at at least a decade for that.
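The "double the processing power" point can be sanity-checked with some back-of-envelope arithmetic. A quick sketch (illustrative numbers only, not a benchmark; real renderers add bounces, shadows and denoising on top):

```python
# Rough ray budget for real-time ray tracing at 4K.
# One primary ray per pixel; each extra sample per pixel
# ("one notch" of quality) roughly doubles the ray count.
width, height = 3840, 2160          # 4K resolution
fps = 60
samples_per_pixel = 1

rays_per_frame = width * height * samples_per_pixel
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame:,} rays/frame")        # 8,294,400 rays/frame
print(f"{rays_per_second / 1e9:.1f} Grays/s")  # 0.5 Grays/s
# Doubling samples_per_pixel doubles both figures -
# hence the "DOUBLE the processing power" per notch.
```

Half a billion rays per second just for one sample per pixel at 60fps shows why first-generation hardware leans on hybrid rendering and denoising.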

I run FH4 at 4K Ultra on a Vega 64, which only has 8GB of VRAM, and had no problems. Locked at 60fps, so I'm not sure how a GTX 1080 at 1440p ultrawide would run into memory problems.

Vega can shift stuff onto system RAM (instead of Windows virtual memory) if it runs out. But yeah, not many things will fill 8GB.
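For a rough sense of why 8GB is hard to fill with render targets alone, here's a quick sketch (the buffer counts are hypothetical; real engines vary, and textures and geometry usually dominate the budget):

```python
# Approximate VRAM cost of a single 32-bit-per-pixel render target.
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one colour buffer in MiB."""
    return width * height * bytes_per_pixel / 1024**2

uw1440 = buffer_mb(3440, 1440)   # 1440p ultrawide target
uhd    = buffer_mb(3840, 2160)   # 4K target

print(f"1440p UW target: {uw1440:.0f} MB")  # ~19 MB
print(f"4K target:       {uhd:.0f} MB")     # ~32 MB
# Even a dozen such targets totals well under 1 GB, so most of an
# 8 GB card's memory goes on textures, meshes and driver caches.
```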
 
I run FH4 at 4K Ultra on a Vega 64, which only has 8GB of VRAM, and had no problems. Locked at 60fps, so I'm not sure how a GTX 1080 at 1440p ultrawide would run into memory problems.

Did you go into the advanced settings and change all the additional settings to Extreme? The default Ultra setting with dynamic optimisation isn't really Ultra in FH4.
 
The prices for these cards are bouncing up and down more than a cheap hooker on freebie night. One place has the Duke in stock; last month they had it in for £1260, earlier today it was £1365, now it's down to £1300. Just looks like people are fart-assing about with prices on a whim. :mad:
 
The prices for these cards are bouncing up and down more than a cheap hooker on freebie night. One place has the Duke in stock; last month they had it in for £1260, earlier today it was £1365, now it's down to £1300. Just looks like people are fart-assing about with prices on a whim. :mad:

Just seen the news... 85% of Black Friday deals nationwide are cheaper at other times of the year.
 
Just seen the news... 85% of Black Friday deals nationwide are cheaper at other times of the year.
Lol. Not surprised one bit. Black Friday was only good the first couple of years; like Steam sales, it's just meh for the most part now. Just another marketing opportunity for companies, really.
 
Did you go into the advanced settings and change all the additional settings to Extreme? The default Ultra setting with dynamic optimisation isn't really Ultra in FH4.

Well, you are only just above 1440p and I am at 4K, so your extra settings are more than likely cancelled out by me running way more pixels. I will run it at max IQ just to see, but obviously won't get anywhere near playable fps.
 
Maxed every slider to Ultra with HDR on at 8x MSAA. Left my second monitor on as well and got 46fps average, 37 low, with a high of 58.1. It's not playable, but it's certainly not running out of memory. The card was at 1600/1100 and the 2700X at 4.2GHz. I see no signs of 8GB not being enough at 4K, never mind 1440p ultrawide.
 
Just seen the news... 85% of Black Friday deals nationwide are cheaper at other times of the year.
It's not surprising, really. I mostly only buy stuff I need, even when the sales come around. Most of the time there's a good reason why stuff is discounted, and the bargains are not what they seem.
Last big discount I got was buying a 6700K system, two years ago. Today we have 32-core processors :D and have had the 7700K, 8700K and now the 9700K & 9900K in the mainstream, all in two years :)
 
Maxed every slider to Ultra with HDR on at 8x MSAA. Left my second monitor on as well and got 46fps average, 37 low, with a high of 58.1. It's not playable, but it's certainly not running out of memory. The card was at 1600/1100 and the 2700X at 4.2GHz. I see no signs of 8GB not being enough at 4K, never mind 1440p ultrawide.
Try Final Fantasy 15 and see what happens :p

8x MSAA would be silly at 4K. These days, if I can, I avoid turning on AA at all at 4K. Some games really need AA, like Final Fantasy 15, but in a Tomb Raider game you can get away with no AA at all at 4K, and it looks better overall without it too. TAA/SMAA remove jaggies but can end up making the image quality look much worse.
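The "less AA needed at 4K" argument largely comes down to pixel density: the finer the pixels, the smaller the jaggies. A quick sketch, assuming a 27" panel for both resolutions (the panel size is an assumption, not something stated in the thread):

```python
import math

# Pixels per inch for a given resolution and panel diagonal.
def ppi(width, height, diagonal_inches):
    return math.hypot(width, height) / diagonal_inches

print(f'{ppi(2560, 1440, 27):.0f} PPI at 1440p')  # ~109 PPI
print(f'{ppi(3840, 2160, 27):.0f} PPI at 4K')     # ~163 PPI
# ~50% more pixels per inch means edge stair-stepping is
# physically smaller, so skipping AA is far less noticeable at 4K.
```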
 
Try Final Fantasy 15 and see what happens :p

8x MSAA would be silly at 4K. These days, if I can, I avoid turning on AA at all at 4K. Some games really need AA, like Final Fantasy 15, but in a Tomb Raider game you can get away with no AA at all at 4K, and it looks better overall without it too. TAA/SMAA remove jaggies but can end up making the image quality look much worse.

I 100% agree. I was just trying to show him that a GTX 1080 with 8GB of VRAM should have been more than enough at 1440p ultrawide. Vega 64 does a decent job at 4K if you manage the settings. There are plenty of options that make the image look worse to me; at 4K the first thing I do is turn them off, along with AA, and performance is usually just about where I need it at 60Hz. After that, if needed, I lower a few other things that I might have had on Ultra with a bit more in the tank. Freesync takes care of the rest. Honestly, I've not yet played a game that doesn't look really good to my eye or isn't playable. Assassin's Creed Odyssey is my latest game and hurt the most, though.

As for Final Fantasy, I completed that with a Platinum on PS4 long before it came to PC lol. Those games I will not wait for :D:D:D:D:D. RDR2 was my latest. It looks every bit as good as any PC game, and the story was among the best I have ever played. PS4 Pro 4K ain't bad at all with 30fps, as the action/game is not fast-paced. 60 or above would definitely have been better, but not enough to make me wait.

I suppose it could be down to Vega's High Bandwidth Cache, but so many have rubbished it around here, so surely not :D:D:D:D:D
 
Thanks.

With SLI these days the hard bit is finding games that support it.

Games like ROTTR and SOTTR run well and scale well.

I am hoping that BFV gets a good SLI profile so I can give RTX a go at 2160p at decent fps. :)

Yet you still bought 2x 20xx cards on a promise, and in good faith from Nvidia.
You still hang on to SLI in the hope it will be supported in games and by Nvidia.
People only use the words wish, hope, faith, belief and trust as a last desperate resort, when they have nothing else left.
You are in a tiny, niche market. I hope you get what you wish for.
 
Lol, Kapp has to show off and buy 2, but mainly 4 :D:D:D:D:D. Seriously though, I think he gets a load of enjoyment out of testing stuff, so for him it's probably money well spent. Without Kapp and the like-minded, most of us wouldn't have first-hand experience of how bad 2-, 3- and 4-card setups are, so in my mind I'm glad guys like him test them.

It's a dirty job but somebody has to do it.
 
Yet you still bought 2x 20xx cards on a promise, and in good faith from Nvidia.
You still hang on to SLI in the hope it will be supported in games and by Nvidia.
People only use the words wish, hope, faith, belief and trust as a last desperate resort, when they have nothing else left.
You are in a tiny, niche market. I hope you get what you wish for.
I am not sure, but I think he buys them mainly for benchmarking, which he enjoys.

I 100% agree. I was just trying to show him that a GTX 1080 with 8GB of VRAM should have been more than enough at 1440p ultrawide. Vega 64 does a decent job at 4K if you manage the settings. There are plenty of options that make the image look worse to me; at 4K the first thing I do is turn them off, along with AA, and performance is usually just about where I need it at 60Hz. After that, if needed, I lower a few other things that I might have had on Ultra with a bit more in the tank. Freesync takes care of the rest. Honestly, I've not yet played a game that doesn't look really good to my eye or isn't playable. Assassin's Creed Odyssey is my latest game and hurt the most, though.

As for Final Fantasy, I completed that with a Platinum on PS4 long before it came to PC lol. Those games I will not wait for :D:D:D:D:D. RDR2 was my latest. It looks every bit as good as any PC game, and the story was among the best I have ever played. PS4 Pro 4K ain't bad at all with 30fps, as the action/game is not fast-paced. 60 or above would definitely have been better, but not enough to make me wait.

I suppose it could be down to Vega's High Bandwidth Cache, but so many have rubbished it around here, so surely not :D:D:D:D:D

I do the same, mate. Many things get turned off as imo the game looks better without them. DoF and motion blur are the first things to go. Why make the image look worse and lose fps? :p
 
I do the same, mate. Many things get turned off as imo the game looks better without them. DoF and motion blur are the first things to go. Why make the image look worse and lose fps? :p

Yeah, we think alike, as do many of my mates. Ultra everything makes no sense when certain effects make the image worse to the eye or when AA adds nothing.
 
Try Final Fantasy 15 and see what happens :p

8x MSAA would be silly at 4K. These days, if I can, I avoid turning on AA at all at 4K. Some games really need AA, like Final Fantasy 15, but in a Tomb Raider game you can get away with no AA at all at 4K, and it looks better overall without it too. TAA/SMAA remove jaggies but can end up making the image quality look much worse.

Indeed, haha, 8 gig isn't enough for FF15 at 720p ;)

It's dangerous to make claims like "VRAM isn't needed" based on the very few games reviewers cherry-pick.
 