
AMD RDNA 2 potential longevity concerns?

From the Grim YouTube comparison above, I thought the 6900 image was better in that first area. The fps with DLSS is obviously the playable frame rate people want.

From the feedback given above, it sounds more like the devs didn't bother catering much for the other settings, rather than one GPU's features being poorer than the other's.

As I mentioned in the other thread, there is no doubt Techland didn't put as much time/effort into the rasterization/non-RT settings, but even if they had, you would still be very hard pressed to get results as good as what we are seeing currently. Not to mention, how long would it take them to get good results?

4A shared their workflow for this with DF, which is a good watch:

https://www.youtube.com/watch?v=fiO38cqBlWU&t=315s

This, but then again, many did and still do argue that Metro Exodus (the non-enhanced version) has some of the best shadowing, lighting etc., yet it still pales in comparison to the Enhanced Edition. There is only so much that can be done with rasterization methods, and what we have seen with RT is still just the tip of the iceberg.

The only game where I would say lighting, reflections, shadows etc. could almost be considered on par with RT implementations such as those found in Metro EE, CP 2077 and DL 2 is RDR 2, but again, we are essentially comparing the very best that rasterization can offer to what is still the early days of RT.

Another game where RT shadows and AO look considerably better than rasterization is Deathloop, although no doubt we'll get the same reason again of "the developers didn't spend time making the non-RT settings look good". That's a perfectly valid point to make, but as per the YouTube video above, there is a good reason for that. Of course, we could also say that Nvidia are paying good money to have developers gimp their games :p ;) but I doubt that, given the games still have to run and look good for the much bigger market, i.e. consoles.

I think this is going to become more common going forward, as developers are seeing just how much quicker and easier RT is to work with than rasterization, hence why they are trying to use RT where possible on consoles.
 
I never claimed 8GB is not breakable though ;)
That's the answer I was hoping to hear.:)

Which reinforces my point!

Since I'm actually running Nvidia and not AMD, I've got a bigger concern: when the 40 series lands (and you still might not be able to get one), they'll double up RT performance > they'll pay for more effects-heavy games > it WILL cripple this gen's GPUs.

Add that on top of the VRAM-light 70/80 cards' tight waistband and it's a bigger concern for me personally, as I've got a limit on how much cash I'm throwing at these two companies.

Nvidia Ampere Potential Longevity Concerns?

Should I start a thread?
 

Might as well, let me get the popcorn first :p ;) :D

I do agree big time on that highlighted point, but if it gives us a further leap in visuals, bring it on I say! As mentioned before, that is going to be the deciding factor for my next upgrade.
 
I know what you mean, man. I am no fan of the price hikes, but I also feel people are taking what I am saying out of context to push their own agendas. Would I like more VRAM? Who wouldn't? But it has to make sense. These companies are here to make as much money out of us as they can. Had Nvidia given us more VRAM, they would have charged us more for those cards. I would rather they make two versions and give people the choice from the get-go.

I would rather pay less now with the lower VRAM amount, as for my personal needs grunt is a lot more important. You know me: I will sell this card, add a little and upgrade to next gen, which will no doubt have more VRAM anyway.
 

Might as well, let me get the popcorn first :p ;) :D

Bringing it.
 
Really no care for ray tracing at the moment. The 6800 runs everything very well, and I turn off ray tracing on both PS5 and Series X as I would rather have the performance. Let's see where it's at in 2-3 years.
 
Yeah. I am hoping AMD's next-gen cards surpass what we have now and are close to Nvidia's 4000 series offering. That way we can move past this "I can't see the difference" stuff :p:D

I would like to see at least a doubling in RT performance from 3000 to 4000 series.
 
I have no doubt AMD will improve their RT capabilities; the question is, will it be enough? They have a considerable way to go to match Ampere, let alone whatever the 40xx series will bring us...
 
Yeah, I feel there will be a gulf that won't be closed in the titles that have worked with Nvidia. That ship has sailed, but I look forward to seeing whether the Intel architecture has something close that moulds the direction away from it, rather like the sync technology where, years later, it's more standardised.
 
From what I've seen, AMD's first RT attempt is on par with Turing cards, and that is without even using dedicated Tensor cores. RDNA 3 is rumoured to include more dedicated RT cores, so no doubt it will catch up, but it remains to be seen whether it will match or beat Nvidia's next-gen or even current-gen cards. The future for now seems to be a DLSS-like feature that alleviates the heavy RT processing, but more needs to be done to get it working better on consoles. Simply rendering at a lower resolution is not enough when heavy RT effects are used.
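To put rough numbers on that last point: most heavy RT effects are paid per rendered pixel, so an upscaler that lowers the internal resolution shrinks the ray budget roughly with the square of the per-axis scale. A rough back-of-the-envelope sketch (the mode scale factors are approximate, and it only counts primary rays, ignoring bounces, denoising and fixed per-frame costs):

```python
# Rough sketch: how much the primary-ray workload shrinks when a
# DLSS/FSR-style upscaler lowers the internal render resolution.
# Assumes one primary ray per rendered pixel; secondary rays, denoising
# and fixed per-frame costs are ignored.

NATIVE = (2560, 1440)  # assumed 1440p output target

# Approximate per-axis render scales for common upscaler quality modes
MODES = {
    "Native":      1.00,
    "Quality":     0.67,
    "Balanced":    0.58,
    "Performance": 0.50,
}

native_pixels = NATIVE[0] * NATIVE[1]

for name, scale in MODES.items():
    w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
    share = (w * h) / native_pixels
    print(f"{name:>12}: {w}x{h} -> ~{share:.0%} of native primary rays")
```

Even Performance mode only cuts the primary-ray count to roughly a quarter of native, so if the RT effects themselves are heavy enough, a lower internal resolution alone doesn't get you all the way there, which is the point about consoles above.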
 
I tend to play everything at 1440p@144hz native and I'm absolutely over the moon with my 6900XT. It's probably the best purchase I've made since the 980Ti all those years ago, and I imagine it's going to last me years if I keep it in good condition. I'm definitely in the "save up, splurge out" every few years camp when it comes to upgrading, as I feel my money goes further if the performance gap is greater.

We'll see which GPU vendor has the most upgraders when the next generation arrives. I suspect price will determine 90% of the decision, with the remaining factors being monitor resolution and general system performance.

Based on the Steam survey I don't see many buyers caring too much about ray tracing just yet. It's nice but there are so many more important factors that determine if a game looks and feels good.
 
My wife is extremely happy with her 6600. At MSRP my budget won't go over a 6700 XT, although I definitely wouldn't mind at least a 6800...
 

Same here with my 6800; it's hands down the best GPU purchase I have made in years. Equally, it's hard to be concerned about longevity, as I know that new and shiny tech will most likely have me buying the next gen or the one after, and I imagine that will be the same for a large number of members of this forum.
 
Every current card will be sub-par when RT is in full effect, but that can't really happen in major games until the next-gen consoles are out, so another 7-odd years. Until then we'll just see progressive improvements in graphics, with cards with lower RT capability holding back graphics features for users more and more. If you upgrade every gen it's not an issue; if you skip 2-3 gens, you just won't be able to play games in all their glory with any current-gen card.

Dying Light 2 is kinda interesting; they are probably thinking of their engine and features for the next 5 years. They still release regular updates for DL1 even though it came out 6 years ago.
 

Yet here I am enjoying RT maxed out using a 3080 and a decade old 3770k :)
 
I'm not too sure how much longevity cards from either side are going to have. The 3080 and below don't have enough VRAM for my liking, but at least they have DLSS, which can probably help extend their useful lifespan.
The 6000 series is lacking in RT performance and hasn't got DLSS to back it up; FSR isn't close enough to DLSS.
 