
What do you think of the 4070Ti?

Status
Not open for further replies.
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth
Funnily enough, I detest Apple, their OS and their industry methods (although I do use a Mac, as it's much better suited to development than Windows). The problem with Apple is they just don't innovate or lead the market any more; instead they follow the trend/copy what Android does, but take years to do it. Funnily enough, it's just like AMD lagging behind the market leader - maybe Apple should buy AMD :D It would actually probably be for the best, as it means they would at least be able to deliver quality hardware and good drivers from the get go, a match made in heaven :D :cry:



Eh? Poor troll attempt?

Ampere and CP 2077 were, at the time, the pinnacle of RT and have been for the past 2 years..... technology and optimisation evolve, so we now have the 40xx series and SER plus game optimisations, which together are the new pinnacle instead of Ampere/CP 2077 v1.

You were going on about how brilliant it is, and suddenly it isn't. This is why, when you try to talk about stuff being "future proof", it really isn't. Every generation there is some new proprietary feature Nvidia releases, which means the older generation won't get optimisations and will tank in performance. It happened before with some of the other features too.

So people go on about AMD cards being terribad at RT, but that also means that without SER, Ampere performance will go south too in newer games. You basically admitted Nvidia won't optimise for Ampere now, i.e. it was only good for two years FROM LAUNCH. It's even worse when you consider people were buying RTX3080 cards last year, so many of us (me included) won't get even two years of reasonable RT performance in games. By extension, that means when the RTX5000 series comes out in two years, it will be the same for the RTX4000.

Also, the people I know who are really into Apple are the same when it comes to the launches. Suddenly, because of FOMO, anything which isn't the latest is a paperweight. Then they start spouting all the marketing buzzwords which Apple seems to like pushing out.

I find it kind of amusing how PCMR and Apple fans seem to be converging.
What fear of missing out would someone feel in order to buy these? :D
You can get all the cards you want, they're not printing money anymore, just eating yours! :cry:
Fear of not having the latest E-PEEN, or saying their card supports the latest buzzword.
 
Caporegime
Joined
4 Jun 2009
Posts
31,569
You were going on about how brilliant it is, and suddenly it isn't. This is why, when you try to talk about stuff being "future proof", it really isn't. Every generation there is some new proprietary feature Nvidia releases, which means the older generation won't get optimisations and will tank in performance. It happened before with some of the other features too.

So people go on about AMD cards being terribad at RT, but that also means that without SER, Ampere performance will go south too in newer games. You basically admitted Nvidia won't optimise for Ampere now, i.e. it will only be good for two years FROM LAUNCH. By extension, that means when the RTX5000 series comes out in two years, it will be the same.

All the people I know who are really into Apple are the same when it comes to the launches. Suddenly, because of FOMO, anything which isn't the latest is a paperweight. Then they start spouting all the marketing buzzwords which Apple seems to like pushing out.

I find it kind of amusing how PCMR and Apple fans seem to be converging. I think Apple buying Nvidia would be a peak convergence event, then we can have at least 5X throughput of b

Fear of not having the latest E-PEEN, or saying their card supports the latest buzzword.

Eh? Where have I said Cyberpunk/Ampere RT was never/no longer good? I've always said it's the best showcase to date, designed for hardware from 2 years ago..... The new update will make it even better and is now aimed at the new and better hardware. It's as simple as that.

SER does not harm Ampere nor AMD performance; if it did, then you would have a point.

Also, I've said RDNA 3 RT performance is actually good now; well, it's still poor given it is matching 2 year old tech and costing far more.... But it is at least now usable, which is the important bit.
 
Soldato
Joined
14 Aug 2009
Posts
2,948
But, but, but - didn't you know developers love having no support or guidance on best practices etc.?

:cry:

I think AMD was in luck with the consoles; at least it had some form of optimisations. If Nvidia had been in the most popular console... yeah, AMD could have closed shop.
You were going on about how brilliant it is, and suddenly it isn't. This is why, when you try to talk about stuff being "future proof", it really isn't. Every generation there is some new proprietary feature Nvidia releases, which means the older generation won't get optimisations and will tank in performance. It happened before with some of the other features too.

So people go on about AMD cards being terribad at RT, but that also means that without SER, Ampere performance will go south too in newer games. You basically admitted Nvidia won't optimise for Ampere now, i.e. it will only be good for two years FROM LAUNCH. By extension, that means when the RTX5000 series comes out in two years, it will be the same.

All the people I know who are really into Apple are the same when it comes to the launches. Suddenly, because of FOMO, anything which isn't the latest is a paperweight. Then they start spouting all the marketing buzzwords which Apple seems to like pushing out.

I find it kind of amusing how PCMR and Apple fans seem to be converging. I think Apple buying Nvidia would be a peak convergence event, then we can have at least 5X throughput of b

Fear of not having the latest E-PEEN, or saying their card supports the latest buzzword.
Ampere is OK, but it is last gen. Why would I pay current gen money for last gen performance? :) If AMD had priced these around $450-600... maybe. :)
 
Soldato
Joined
21 Jul 2005
Posts
20,244
Location
Officially least sunny location -Ronskistats
You were going on about how brilliant it is, and suddenly it isn't... Every generation there is some new proprietary feature Nvidia releases, which means the older generation won't get optimisations and will tank in performance. It happened before with some of the other features too.

Nail, head mate. Planned obsolescence at its finest! The more you buy, the more you save. Unreal.
 
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth
Eh? Where have I said Cyberpunk was never good? I've always said it's the best showcase to date, designed for hardware from 2 years ago..... The new update will make it even better and is now aimed at the new and better hardware. It's as simple as that.

SER does not harm Ampere nor AMD performance; if it did, then you would have a point.

Also, I've said RDNA 3 RT performance is actually good now; well, it's still poor given it is matching 2 year old tech and costing far more.... But it is at least now usable, which is the important bit.

In the end it sort of does, sadly. Because Nvidia will now only optimise any RT in newer games if that feature is active in hardware, relative performance in future games on Ampere will drop. They won't be bothered optimising for Ampere and Turing now. Even the tweet says "more efficient usage of shaders", meaning Ampere will need more driver work on a generation Nvidia won't be making money on. If it were simply a case of Ampere pushing ahead due to more RT resources it would be different, as the work could scale backwards.

For example, the optimisation work for Ampere most likely scaled back to Turing, because AFAIK the feature set was pretty much the same.

But reading those tweets, it won't. This is my concern with RT: it won't just be the general increase in RT resources each generation, but also these new features popping up, which will make the performance gap grow beyond the actual RT resources. It's quite clear you really need to buy at the start of each generation if you want to get the most out of RT performance.

Nail, head mate. Planned obsolescence at its finest! The more you buy, the more you save. Unreal.


I think AMD was in luck with the consoles; at least it had some form of optimisations. If Nvidia had been in the most popular console... yeah, AMD could have closed shop.

Ampere is OK, but it is last gen. Why would I pay current gen money for last gen performance? :) If AMD had priced these around $450-600... maybe. :)

I am talking about future proofing of RT performance. It really looks like you need to buy at the start of each generation. It's one thing for RT performance to scale with additional hardware resources; it's another when the newer generation introduces new hardware features which make software optimisation easier on the newer generation. The issue with that is anyone who, say, buys a new card 12 months into a generation is only going to have 12 months of good optimisations, until the next generation changes something in the design, which does the same.
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
28,520
Location
Greater London
Nail, head mate. Planned obsolescence at its finest! The more you buy, the more you save. Unreal.

If that's the case then RDNA2 was obsolete from day one, as it could not do RT to save its life.

Not trying to defend Nvidia as they are filth, but the bias shown here is what is unreal.

Both companies suck as far as I am concerned. Don't get why one would continue to defend AMD. If they are like this with 10% market share, God knows what they would be like if they had 90%. Unreal.....
 
Soldato
Joined
21 Jul 2005
Posts
20,244
Location
Officially least sunny location -Ronskistats
I am talking about future proofing of RT performance. It really looks like you need to buy at the start of each generation. It's one thing for RT performance to scale with additional hardware resources; it's another when the newer generation introduces new hardware features which make software optimisation easier on the newer generation. The issue with that is anyone who, say, buys a new card 12 months into a generation is only going to have 12 months of good optimisations, until the next generation changes something in the design, which does the same.

And proprietary lockouts, like "DLSS 3 only works with our latest gen". Even though they are still churning out Ampere cards and not discounting them at this very moment.
 
Caporegime
Joined
4 Jun 2009
Posts
31,569
I think AMD was in luck with the consoles; at least it had some form of optimisations. If Nvidia had been in the most popular console... yeah, AMD could have closed shop.

Without a doubt; some won't want to accept that though.

Had AMD not got into consoles, I dread to imagine what desktop GPU performance would have been like for AMD, given that all the console ports from Sony perform great on Nvidia and in many cases a good chunk better.

Also, this is where Nvidia have to ensure they remain leaders in this space, as they only have PC gaming going for them, whereas AMD have the consoles as well as their fingers in many other pies.

It's actually a very tough situation for Nvidia, as they are basically having to compete with themselves now.

In the end it sort of does, sadly. Because Nvidia will now only optimise any RT in newer games if that feature is active in hardware, relative performance in future games on Ampere will drop. They won't be bothered optimising for Ampere and Turing now. Even the tweet says "more efficient usage of shaders", meaning Ampere will need more driver work on a generation Nvidia won't be making money on. If it were simply a case of Ampere pushing ahead due to more RT resources it would be different, as the work could scale backwards.

For example, the optimisation work for Ampere most likely scaled back to Turing, because AFAIK the feature set was pretty much the same.

But reading those tweets, it won't. This is my concern with RT: it won't just be the general increase in RT resources each generation, but also these new features popping up, which will make the performance gap grow beyond the actual RT resources. It's quite clear you really need to buy at the start of each generation if you want to get the most out of RT performance.






I am talking about future proofing of RT performance. It really looks like you need to buy at the start of each generation. It's one thing for RT performance to scale with additional hardware resources; it's another when the newer generation introduces new hardware features which make software optimisation easier on the newer generation. The issue with that is anyone who, say, buys a new card 12 months into a generation is only going to have 12 months of good optimisations, until the next generation changes something in the design, which does the same.

But how's that any different to anything not RT related? If you want the best visuals and performance, you will always have to buy the latest and greatest....
 
Soldato
Joined
19 Jan 2022
Posts
2,753
Location
Devilarium
You were going on about how brilliant it is, and suddenly it isn't. This is why, when you try to talk about stuff being "future proof", it really isn't. Every generation there is some new proprietary feature Nvidia releases, which means the older generation won't get optimisations and will tank in performance. It happened before with some of the other features too.

So people go on about AMD cards being terribad at RT, but that also means that without SER, Ampere performance will go south too in newer games. You basically admitted Nvidia won't optimise for Ampere now, i.e. it was only good for two years FROM LAUNCH. It's even worse when you consider people were buying RTX3080 cards last year, so many of us (me included) won't get even two years of reasonable RT performance in games. By extension, that means when the RTX5000 series comes out in two years, it will be the same for the RTX4000.
I really don't get the argument you are making. The 3080 was the best (or near the best, since there were the 3080 Ti / 3090 etc) RT card you could buy in 2020, 2021 and 2022. It still is a great card and handles everything, RT included, just fine, so if you have one you shouldn't be looking to upgrade unless you go for something like a 4090.
 
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth
If that's the case then RDNA2 was obsolete from day one, as it could not do RT to save its life.

Not trying to defend Nvidia as they are filth, but the bias shown here is what is unreal.

Both companies suck as far as I am concerned. Don't get why one would continue to defend AMD. If they are like this with 10% market share, God knows what they would be like if they had 90%. Unreal.....
But, but the RTX4070TI is not too expensive, according to the PCMR cultist excuse makers. The tech companies need our monies because they are poor, and the tax breaks and US government funds are not enough!

They are both **** poor and the best thing is to leave all of it on the shelf. I don't understand why FOMO is so strong that people have to find some way to salvage any release from tech companies?

Gamers and PCMR cultists on tech forums have become some of the biggest whales I have ever seen.

Literally none of my gaming mates, including those who tend to buy Nvidia, thinks this generation is anything but overpriced. All I can hear them saying is how much Nvidia and AMD are taking the mickey, including those who have RTX3080 and RTX3090 cards.

Go on gaming websites, other PC websites, Twitter, Reddit or even HUKD, etc: people are just fed up with the prices of this current generation. It was the same with the Zen4 CPUs too, apart from some defenders on forums. There were people defending the useless pricing of those too! :rolleyes:


In the end AMD had to drop prices, but Zen4 is nowhere near as bad as the pricing of the RTX4000 series, or even the RX7900 series.

It just shows you how much of a bubble some people are in. They are on a PC hardware forum, which is itself a niche part of the internet, and yet they are trying their best to spin stuff despite most people not agreeing with them. These were the same people defending Turing, etc, until Nvidia admitted they were wrong and had to refresh the whole range to be better value for money. The excuse makers should have eaten humble pie, but OFC they went silent when it was proven even Nvidia realised Turing MK1 was overpriced.

Remember, in August Nvidia had to write off over a billion USD! They are trying their best to keep margins high as sales overall are not great. Not sure why there are people on here acting like unpaid marketing for Nvidia and AMD. This whole set of releases, from the RTX4000 series to the RX7900XT and Zen4, has been a cash grab.

A Hobson's choice isn't a choice; it's more an unofficial cartel in this case.

The difference is people go on about RT performance in terms of lifespan, because everyone apparently only plays games with RT. Right? Settled.

Remember, people were just going on about how overpriced the RTX4070TI was, and the same one or two characters started trying to make it an AMD thing - even a mod tried to step in and gave up. The same AMD card many here said is a cash grab. So who has the bias then? Both are essentially price fixing this generation.

Most people don't buy at the start of a generation. I knew people who bought into Ampere cards last summer, well over a year into its lifespan, because of mates saying Nvidia is better at RT.

Also, let's assume AMD RDNA2/RDNA3 buyers don't care for RT because the cards suck at it. Interestingly enough, most of the buyers I know who got AMD cards seem to run games which are rasterised, i.e. maybe more performance in Rocket League or something.

People here harp on about how AMD sucks at RT, even if AMD is faster in rasterised performance, and that AMD cards won't last as long because of poor RT performance. Most gamers I know keep cards for a few years, like 3 to 5 years. My last card, a GTX1080, lasted me nearly 5 years.

But then it was admitted in that tweet: Nvidia has introduced a feature only on Ada Lovelace, which makes the programming model easier on Ada Lovelace. Anyone can see what is going to happen with Ampere (and by extension Turing) performance once that feature is enabled. It's going to get progressively further and further away from Ada Lovelace. So the difference in RT performance is going to get worse.

That means if you are really into using RT and you buy at the start of a generation, that is two good years. But most people don't, so the effective lifespan is much shorter. As time progresses, you will be falling back on rasterised performance, because the RT effects won't run so great on anything but the latest generation Nvidia card. Let's again assume you never buy an AMD card for anything but rasterised performance.

In fact, I saw this first hand with the tessellation wars. People here kept on saying ATI/AMD sucked arse because their tessellation was worse, so the cards would have a shorter lifespan. At high tessellation levels it was objectively worse.

Nvidia pushed a ton of heavy tessellation-based effects, and ATI/AMD wasn't so hot at those. The moment Nvidia introduced newer generations with more tessellation power, they pushed the usage up further, until it came to the point where the previous generation ended up being as bad as the ATI/AMD cards. You have been here long enough to know this.

W3 was an example, where Kepler (which was better than the 7000 series in tessellation) was essentially unplayable with those effects. In the end both the GTX600 and HD7000 series needed it dialled down to a similar level. I had both an HD7850 and a GTX660 in two machines and saw what happened over time, and they ended up much of a sameness.

This is the problem with many here - they make arguments about lifespan, but change very quickly when a new generation drops. They make the same excuses for VRAM, etc, yet upgrade very quickly. Not one of these characters will be keeping these cards.

It's the same with all these whales who defend Nvidia and AMD upselling smaller and smaller chips. There were even people defending the original Zen4 pricing. I argued with the same people over Zen3 pricing too.

The same characters are defending this upselling. It screws mainstream gamers like me massively. Unless we keep spending more and more, we get less and less.
 
Associate
Joined
21 Oct 2013
Posts
2,080
Location
Ild
If that's the case then RDNA2 was obsolete from day one, as it could not do RT to save its life.

Not trying to defend Nvidia as they are filth, but the bias shown here is what is unreal.

Both companies suck as far as I am concerned. Don't get why one would continue to defend AMD. If they are like this with 10% market share, God knows what they would be like if they had 90%. Unreal.....
Well, if that's the case, then anything below the RTX3080 was also obsolete.
 
Caporegime
Joined
4 Jun 2009
Posts
31,569
At the end of the day, if you are a big fan of RT, Nvidia is still the only option to go with; it is simply better "overall" based on all the data we have available to us right now, be that hardware RT performance, DLSS choice/availability and/or frame generation. If you're one of these people who defaults to one or all of the following any time RT performance gets brought up:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by nvidia, not fair
- hardly any games have RT so who cares
- who on earth enables RT

Then, just as was the case when it came to Ampere vs RDNA 2..... buy an AMD GPU and enjoy COD and FC 6.

Simples.

Well, if that's the case, then anything below the RTX3080 was also obsolete.

Biggest problem for RDNA 2 was the fact they didn't have any upscaling tech for a long time; then they got FSR 1, which was inferior in every way possible (although tbf, AMD did state it wasn't a competitor to DLSS), and it took several more months until the real DLSS competitor came along, i.e. FSR 2, but its first iteration was poor, not to mention the lack of uptake in games. It is only recently, with 2.2/2.3+, that quality is almost on par with DLSS and it's seeing more uptake now.

This is where RDNA 3 is in a better place now, i.e. it has usable levels of RT perf from a hardware POV and FSR 2+ is better established. Obviously frame generation is "around the corner" for AMD too (given AMD's timing, god knows when that will be though...) but given AMD's previous technology releases, it could be several months or years until it is on par with Nvidia's solution, and it may potentially face similar slow uptake as FSR 2+ did.....
 
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth
At the end of the day, if you are a big fan of RT, Nvidia is still the only option to go with; it is simply better "overall" based on all the data we have available to us right now, be that hardware RT performance, DLSS choice/availability and/or frame generation. If you're one of these people who defaults to one or all of the following any time RT performance gets brought up:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by nvidia, not fair
- hardly any games have RT so who cares
- who on earth enables RT

Then, just as was the case when it came to Ampere vs RDNA 2..... buy an AMD GPU and enjoy COD and FC 6.

Simples.



Biggest problem for RDNA 2 was the fact they didn't have any upscaling tech for a long time; then they got FSR 1, which was inferior in every way possible (although tbf, AMD did state it wasn't a competitor to DLSS), and it took several more months until the real DLSS competitor came along, i.e. FSR 2, but its first iteration was poor, not to mention the lack of uptake in games. It is only recently, with 2.2/2.3+, that quality is almost on par with DLSS and it's seeing more uptake now.

This is where RDNA 3 is in a better place now, i.e. it has usable levels of RT perf from a hardware POV and FSR 2+ is better established. Obviously frame generation is "around the corner" for AMD too (given AMD's timing, god knows when that will be though...) but given AMD's previous technology releases, it could be several months or years until it is on par with Nvidia's solution, and it may potentially face similar slow uptake as FSR 2+ did.....

So do you promise to keep your RTX3080 for another three years, or are you upgrading this year? I'd just like to know how long you think the lifespan of your RTX3080 is for RT, in years.

Because it was you and a few others who brought up the lifespan of cards based on RT performance.

It was Bencher on here and you who started talking about AMD cards when everyone was talking about how overpriced the RTX4070TI is, saying what great value it was. I'd like to know why you are defending the terrible pricing on the Nvidia side by invoking AMD?

It is you and Bencher who keep saying the RTX4070TI will last longer than any AMD card, so they are not worth buying. I'd like to know how much longer, in years, this would be?

Again, BOTH of you made the claim - can we have some numbers please? One year? Two years? I am just interested to know what calculations you made, because I assume you crunched the numbers?
 
Caporegime
Joined
4 Jun 2009
Posts
31,569
So do you promise to keep your RTX3080 for another three years?

Well based on what I've said here:

But how's that any different to anything not RT related? If you want the best visuals and performance, you will always have to buy the latest and greatest....
At the end of the day, if you are a big fan of RT, Nvidia is still the only option to go with

What do you think?

It depends entirely on how well "future" games will run and whether they will require a new GPU in order to meet my requirements, especially for any RT games which will "potentially" get most of my play time, e.g.

- atomic heart
- avatar
- starfield
- hogwarts
- ark 2
- redfall
- day before (probably going to be **** though)
- dune (not confirmed to be RT but nvidia sponsored...)
- stalker

Some other titles but they won't get as much time most likely:

- deliver us mars
- black myth wukong
- splinter cell

Games which aren't confirmed to have RT yet but possibly will:

- suicide squad
- dead island 2
- arc raiders

And probably a good few others I've missed, including any older games which get remastered with RT, and of course another replay of CP when the overdrive mode arrives.

I got no problem turning down a setting or 2 and using lower presets of DLSS to meet my FPS target if needs be (ideal = 100+ fps but happy with 70/80+).

So far the 3080 has been, and still is, serving my needs/wants very well; not even Portal RTX was an issue given how good DLSS performance and even ultra performance mode (which are going to further improve with an update soon) look for that game..... not bad for 2 year old tech. If I can get 3 more years from it whilst meeting my requirements, fantastic; if not, and I "want the best visuals and performance", then there is no choice but to buy a 4080/4090 GPU.

EDIT:

Had I gone for RDNA 2, well, I would have missed out on a lot of RT goodness over the past 2 years, and given the future titles expected, I most definitely would be looking to upgrade now, be that to a 7900xtx or a 4080/4090 GPU. But again, this is where it comes down to this:

At the end of the day, if you are a big fan of RT, Nvidia is still the only option to go with; it is simply better "overall" based on all the data we have available to us right now, be that hardware RT performance, DLSS choice/availability and/or frame generation. If you're one of these people who defaults to one or all of the following any time RT performance gets brought up:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by nvidia, not fair
- hardly any games have RT so who cares
- who on earth enables RT

Then, just as was the case when it came to Ampere vs RDNA 2..... buy an AMD GPU and enjoy COD and FC 6.

Simples.
 
Soldato
Joined
19 Jan 2022
Posts
2,753
Location
Devilarium
So do you promise to keep your RTX3080 for another three years, or are you upgrading this year? I'd just like to know how long you think the lifespan of your RTX3080 is for RT, in years.

Because it was you and a few others who brought up the lifespan of cards based on RT performance.

It was Bencher on here and you who started talking about AMD cards when everyone was talking about how overpriced the RTX4070TI is, saying what great value it was. I'd like to know why you are defending the terrible pricing on the Nvidia side by invoking AMD?

It is you and Bencher who keep saying the RTX4070TI will last longer than any AMD card, so they are not worth buying. I'd like to know how much longer, in years, this would be?

Again, BOTH of you made the claim - can we have some numbers please?
Your argument doesn't make sense. Let's say, for the sake of argument, the RTX 3080 dies in 2 months because new RT games are too heavy for it to handle. That means it lasted for 2 and a half years. The XTX and the XT, which have similar RT performance, will also die, having had a lifespan of a couple of months. So what the heck are you talking about?

Other than that, I'm not defending anyone. I just stated my opinion, that the 4070ti is a superior card to its competitor, which is the XT.
 
Soldato
Joined
9 Nov 2009
Posts
24,982
Location
Planet Earth
Well based on what I've said here:




What do you think?

It depends entirely on how well "future" games will run and whether they will require a new GPU in order to meet my requirements, especially for any RT games which will "potentially" get most of my play time, e.g.

- atomic heart
- avatar
- starfield
- hogwarts
- ark 2
- redfall
- day before (probably going to be **** though)
- dune (not confirmed to be RT but nvidia sponsored...)
- stalker

Some other titles but they won't get as much time most likely:

- deliver us mars
- black myth wukong
- splinter cell

Games which aren't confirmed to have RT yet but possibly will:

- suicide squad
- dead island 2
- arc raiders

And probably a good few others I've missed, including any older games which get remastered with RT, and of course another replay of CP when the overdrive mode arrives.

I got no problem turning down a setting or 2 and using lower presets of DLSS to meet my FPS target if needs be (ideal = 100+ fps but happy with 70/80+).

So far the 3080 has been, and still is, serving my needs/wants very well; not even Portal RTX was an issue given how good DLSS performance and even ultra performance mode (which are going to further improve with an update soon) look for that game..... not bad for 2 year old tech. If I can get 3 more years from it whilst meeting my requirements, fantastic; if not, and I "want the best visuals and performance", then there is no choice but to buy a 4080/4090 GPU.


Your argument doesn't make sense. Let's say, for the sake of argument, the RTX 3080 dies in 2 months because new RT games are too heavy for it to handle. That means it lasted for 2 and a half years. The XTX and the XT, which have similar RT performance, will also die, having had a lifespan of a couple of months. So what the heck are you talking about?

No, because you have basically flooded this thread saying the RTX4070TI is 1000X better because it has more RT performance and will last longer. YOU are the one who keeps saying the faster rasterised performance is not important.

So, again, tell all of us how much longer this means in months and years. YOU made the claim. Tell us how long.
 
Soldato
Joined
19 Jan 2022
Posts
2,753
Location
Devilarium
No, because you have basically flooded this thread saying the RTX4070TI is 1000X better because it has more RT performance and will last longer.

So, again, tell all of us how much longer this means in months and years. YOU made the claim. Tell us how long.
In fact, I never said the 70ti is better because it has more RT and will last longer. I said it's better because it offers the same raster per dollar, 35 to 50% better RT, much better power draw, no issues with multi-monitor and video playback power draw, and it's 10 to 15% cheaper. If that doesn't make it a better product, I don't know what does anymore.

I don't know how long the 3080 will last in terms of RT, but I know it will last as long as the just-released AMD offerings, so yeah, that's pretty freaking impressive if you think about it.
 
Caporegime
Joined
4 Jun 2009
Posts
31,569
It is you and Bencher who keep saying the RTX4070TI will last longer than any AMD card, so they are not worth buying. I'd like to know how much longer, in years, this would be?

Again, BOTH of you made the claim - can we have some numbers please? One year? Two years? I am just interested to know what calculations you made, because I assume you crunched the numbers?

Just look at the history of RDNA 2 and Ampere to see this.... Buying on day 1 with RDNA 2, forget about RT gaming; meanwhile Ampere owners have been enjoying 7900xt(x)-level RT performance for the past 2 years, and for cheaper *strokes £650 3080* ;) :)

Seems you missed this part of my post too when it comes to rdna 3:

This is where RDNA 3 is in a better place now, i.e. it has usable levels of RT perf from a hardware POV and FSR 2+ is better established. Obviously frame generation is "around the corner" for AMD too (given AMD's timing, god knows when that will be though...) but given AMD's previous technology releases, it could be several months or years until it is on par with Nvidia's solution, and it may potentially face similar slow uptake as FSR 2+ did.....

Also, I'm not defending any of the GPUs and their pricing; I've stated many times how people are all being taken for a mug with pricing, with the exception of maybe the 4090, given how far ahead it is. Here is my most recent quote of the many stating this:

Either way I still wouldn't touch the 7900xt or 70ti with a barge pole, stump up and go 7900xtx or 4080 or better yet 4090

So seems you're just making a mountain out of a molehill for the sake of it? :D




Funny thing is, RT performance/longevity is the least of RDNA 3/AMD's concerns; there are so many other reasons not to buy AMD over Nvidia unless, again, you only play COD and FC 6 :cry:
 
Soldato
Joined
17 Jun 2004
Posts
7,616
Location
Eastbourne , East Sussex.
Funny thing is, RT performance/longevity is the least of RDNA 3/AMD's concerns; there are so many other reasons not to buy AMD over Nvidia unless, again, you only play COD and FC 6 :cry:

Such as? Please mention the single German shop in 1 town, the only place which has any RDNA cards with issues...... or please mention the MBA XTX vapour cooler, as the AIB cards are all working as intended....
 