"That screenshot is without RT enabled"
Wow, those reflections are bad even without RT then.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
At first glance I thought I agreed with you, as it's sharper, but the top picture has the same issue I posted a couple of days back, where the reflection is overpowering and looks like it extends below the floor. Are both of these RT enabled? Is there a screenshot of that scene without RT?
This game just seems badly made to me.
"Not a screenshot nor exact same angle:"
What have you got your in-game HDR settings set to, if you don't mind, please?
SSR can only do so much; in some parts it looks good, but in others it's awful, e.g. SSR artefacts and reflections disappearing with even the slightest camera-angle adjustment.
RT reflections are broken and glitchy and break immersion even more than SSR in this game. Hopefully RT improves, though, as the game is in desperate need of "good" RT.
"What have you got your in-game HDR settings set to, if you don't mind, please?"
Peak/white brightness set to 932; I can't set exactly 1000 to match my HDR1000 monitor, but it should match whatever your display's peak brightness is.
Black point = 0
HDR colour = 25
UI = lowest value
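For anyone wondering why the advice is to match the display's peak: HDR10 is encoded on the absolute ST 2084 (PQ) curve, so anything the game encodes above the panel's real peak either clips or has to be tone-mapped away. Here's a rough sketch of the arithmetic in Python, using the published PQ constants; the 932/1000 figures are from the post above, and whether a given game clips or rolls off above peak varies by title.

# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0,1] signal.
# M1/M2/C1/C2/C3 are the published ST 2084 constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

display_peak = 932   # what the panel can actually show
game_peak = 1000     # what an over-set brightness slider would encode up to
lost_codes = (pq_encode(game_peak) - pq_encode(display_peak)) * 1023
print(round(lost_codes))  # ~8 of the 1023 10-bit code values sit above the panel's peak

Eight codes out of 1023 is why 932 vs 1000 is only a minor mismatch; run the same numbers for a 600-nit panel fed a 1000-nit signal and you lose dozens of codes of highlight detail instead.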
The bus width was different because, as I've said, you can't just add 2GB of RAM. The 3080 10GB couldn't have 12GB; it would need to have 20. Or increase the bus, which further drives up the cost.
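To put rough numbers on that: each GDDR6X chip hangs off its own 32-bit slice of the bus, and around the 3080's launch the chips only came in 1GB densities (2GB parts arrived later), so capacity is locked to bus width. A quick sketch in Python; the bus widths are the known GA102 configurations, the rest is just arithmetic.

# VRAM capacities reachable from a given bus width with GDDR6X:
# each chip has a 32-bit interface, so chip count = bus width / 32,
# and capacity = chip count x per-chip density (1GB or 2GB).
def vram_options(bus_width_bits: int) -> dict:
    chips = bus_width_bits // 32
    return {f"{density}GB chips": chips * density for density in (1, 2)}

print("320-bit (3080 10GB):", vram_options(320))  # {'1GB chips': 10, '2GB chips': 20}
print("384-bit (3080 12GB):", vram_options(384))  # {'1GB chips': 12, '2GB chips': 24}

So on the 3080's 320-bit bus the only straight density jump is 10GB to 20GB; landing on 12GB means enabling the full 384-bit bus, i.e. more memory controllers active on the die, which is exactly what the 3080 12GB did.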
"They only released that card (the 12GB) because of what was happening to it at launch at 4K. Exactly the same time this thread began."
Pretty sure every reviewer was raving about the 3080 at launch, as it crushed the 2080 Ti for almost half the money while being around 70-80% faster than the 2080. It's probably the best high-end card Nvidia have released in the last 5 years in terms of price to performance.
How do you know that the bus could not be increased using the 3080 10GB core? I mean, obviously it was, wasn't it? Because the 3080 12GB exists. For which they charged an extra £100. And considering how utterly greedy they are? Put it this way: if you spent the extra £100, you did the right thing.
They only released that card (the 12GB) because of what was happening to it at launch at 4K. Exactly the same time this thread began.
As bitter a pill as it may be to swallow, this is completely their idea and how they limit cards from lasting people too long. They can't bake in a self-destruct module, as they would probably get caught. They can't stop making drivers for it too soon, because again people would notice. So they prey on people's behaviour, and this is the safest and cheapest way to make sure they keep coming back for more.
The 20 series was pretty much shunned by reviewers. The prices were insane, and anyone with a 1080 or 1080 Ti had all they needed. So they made sure with the 30 series that people would come back for more once they released the 40 series. And oh look, it's worked. Who'da thunk it, eh?
Also dude, unless you are Jensen's accountant you really have no idea how much it costs to increase the bus. I have a pretty good idea of how much VRAM costs, because you can find out if you look. But believe me when I say it is nowhere near as precious or expensive as Nvidia want you to believe. Like I said, with things being a bit bland and at a standstill in a gaming sense, Nvidia need a way to keep you buying. 1070: 8GB. 1080: 8GB. 2070: 8GB. 2080: 8GB. 3070: 8GB. So for three generations they did not change it, even though they knew for a fact RT would guzzle VRAM.
"The 3080 12GB didn't even come with an MSRP from Nvidia. It was being sold from the start for £999/$999 and up, and the 3080 Ti was $1,200 MSRP for the FE."
They effectively costed 2GB of VRAM at £350 (£999 versus the 10GB's £649 launch MSRP), crazy stuff.
"How do you know that the bus could not be increased using the 3080 10GB core? I mean, obviously it was, wasn't it? Because the 3080 12GB exists. For which they charged an extra £100. And considering how utterly greedy they are? Put it this way: if you spent the extra £100, you did the right thing."
I never said you can't increase the bus width as well. My point is that shipping the card with 12GB of RAM instead of 10 isn't just the cost of a single 2GB RAM module. It's more complicated and costly.
"16GB VRAM enough for the 4080? Discuss.."
No, Hogwarts used 21GB on my 4090. So every single card is obsolete.
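One caveat with figures like that 21GB: most overlays report what the driver has allocated, which includes caching headroom, not what the game strictly needs to run well. If you want to watch it yourself, Nvidia's stock nvidia-smi tool can log it; the one-second polling loop below is just an example wrapper.

import subprocess
import time

# Poll the GPU once a second (Ctrl+C to stop). "memory.used" is the
# driver's allocated VRAM, which for games includes cached/reserved
# data, not the minimum the title actually needs.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)  # e.g. "21034 MiB, 24576 MiB"
    time.sleep(1)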
"How do you know that the bus could not be increased using the 3080 10GB core? I mean, obviously it was, wasn't it? Because the 3080 12GB exists."
Two choices: a split memory bus, and look how well that went down with the 970, or more memory = bigger bus. Bigger bus = more memory controllers on the die. All of this = more cost. The 3080 was aggressively priced; it wouldn't have been if it had been released as a 12GB card.
"Like I said, with things being a bit bland and at a standstill in a gaming sense, Nvidia need a way to keep you buying. 1070: 8GB. 1080: 8GB. 2070: 8GB. 2080: 8GB. 3070: 8GB."
While the 3070 could have done better with 10GB, the 2080 and older are fine 99.99% of the time with 8GB. Even the almighty consoles with their 16GB (actually less than that, since the OS needs its own RAM) will fall flat due to lack of power, not VRAM.
It's more saturated/vibrant in real life than photos.
Back in the early days, Intel started selling GPUs that used your system RAM as VRAM.
Imagine if we had that now, and people could give their GPUs as much VRAM as they want, and we wouldn't have to moan all the time.