
Just what is NVIDIA up to?

With the AMD driver I can play games running on my PC through my TV and use my tablet as a live system info display, no streaming hardware needed... how ####### cool is that?

I thought you had a 2070S? What AMD card do you have, or which was the last AMD card you had?

I also have a 5700 XT Nitro+ in another system.
 
The amount of championing for higher prices from ordinary buyers (assuming they really are just buyers and nothing else) is unreal. No wonder prices keep going up! :))

Coming back to the thread: Nvidia is pushing AI and its ecosystem hard. RT is part of that, as it helps with visualization. Naturally, adding this to the gaming market (since that is the holy grail there as well) is the smart thing to do. That way your R&D costs get spread across all products, which is the way to go. They also probably knew AMD wouldn't bother to change its architecture dramatically, including dedicated hardware for RT and upscaling, until the next generation of consoles is out. So it's just 1+1 for Nvidia to score a win (after win). AMD's (in)action just confirms that.

Yeah, I get it, RT and upscalers are not worth it for some. I bet console players would praise their machines to high heaven if they got 4K@60fps with path tracing in Cyberpunk (even if it's DLSS 3) and in the other games that would have followed. At this point, going with AMD on the graphics side just holds back progress. But alas, that's secondary to profit.

Ah, and if the situation were reversed, with AMD being ahead in RT, we would see AMD fans going nuts about how Nvidia is crippling progress by not keeping up and (perhaps) limiting what other tech gets into their sponsored games.
 
If you have only ever bought one brand, humbug, you can't know what you're missing, right? :)

Right...
I thought you had a 2070S? What AMD card do you have, or which was the last AMD card you had?

I also have a 5700 XT Nitro+ in another system.

Between the 1070 and the 2070S, which yes is what I have right now, I had a 5700 XT for a few weeks. It was an ASRock Challenger D; it was cheap, so I bought it. It turns out the cooler was, and I'm not joking, like a cheap cooler from 2008. It was very noisy, and not even in the way that at least quality fans can be at high RPM; it was a higher-pitched scream, and yet it still couldn't keep the 200 watt GPU cool.

As a GPU the 5700 XT was great: it ran perfectly, with good, smooth performance and awesome drivers, but a bad cooler, a really, really bad cooler. I'll say this: Nvidia would never allow anyone to put one of their GPUs onto the market with a cooler like that.
 
20+ years ago, problems with games not always running properly were part of the PCMR experience. With both ATI/AMD and Nvidia there was often something odd going on; you accepted it because you were used to it, that's how it was.

To begin with, Nvidia put a lot of driver work in to try and make that a thing of the past, with success.

However, since then and for some years now, AMD have also put that work in. I couldn't tell the difference between the 5700 XT and the 2070S in terms of smoothness, responsiveness, dependability and all of that... they are exactly the same.

Apart from one thing: at the time my mate @pete910 had a GTX 1080, and we started playing Insurgency.

My mate, with his 1080: "I've got purple crates, why do I have purple crates?"
Me, RX 5700 XT: "The #### are you talking about?"
My mate: "All the crates in the game are purple."
Me: "No, they are wood."
My mate: "Here's a screenshot."
Me: "Hm? Purple crates."

One RTX 2070S later.
Me: "ah... there's my purple crates"

It got fixed, with a driver, eventually.
 
Those are US prices without state taxes, so £700 in the UK, which is about the same once VAT is added.

For people like myself who were 100% going to spend £60 on Starfield, it effectively made the 7900 XT £640.

Having said that, I was going to go for the 6800 XT with the same code, and that would have been £440. Though the only 6800 XT at that price (at least then) would not fit in my case.
 
The amount of championing for higher prices from ordinary buyers (assuming they really are just buyers and nothing else) is unreal. No wonder prices keep going up! :))

Coming back to the thread: Nvidia is pushing AI and its ecosystem hard. RT is part of that, as it helps with visualization. Naturally, adding this to the gaming market (since that is the holy grail there as well) is the smart thing to do. That way your R&D costs get spread across all products, which is the way to go. They also probably knew AMD wouldn't bother to change its architecture dramatically, including dedicated hardware for RT and upscaling, until the next generation of consoles is out. So it's just 1+1 for Nvidia to score a win (after win). AMD's (in)action just confirms that.

Yeah, I get it, RT and upscalers are not worth it for some. I bet console players would praise their machines to high heaven if they got 4K@60fps with path tracing in Cyberpunk (even if it's DLSS 3) and in the other games that would have followed. At this point, going with AMD on the graphics side just holds back progress. But alas, that's secondary to profit.

Ah, and if the situation were reversed, with AMD being ahead in RT, we would see AMD fans going nuts about how Nvidia is crippling progress by not keeping up and (perhaps) limiting what other tech gets into their sponsored games.

Having used the latest FSR and DLSS, you need to pixel peep to see any real differences, and most AAA games give both options. RT is a good bit better on Nvidia, but you need to go to a 4090 to get to the point where it matters in games.

By that I mean any GPU lower than a 4090 100% needs to use upscaling in demanding RT games, even at 1440p, for playable FPS. Even the 4090 needs it at 4K sometimes.

For example (made-up numbers):

At 4K in an extreme RT game:
4080 gets 30 avg FPS
7900 XT gets 20 avg FPS

Now both of these are totally unplayable, so you must turn on upscaling, and say it adds 40 FPS to your final FPS on both GPUs, making both playable. Yet on paper the 4080 has 50% better RT performance in this game. How many reviews simply show the non-upscaled scores, so people come away thinking the Nvidia GPU does much better RT and is worth the premium?
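To put numbers on how much that compresses the gap, here is a quick back-of-the-envelope sketch in Python using the made-up figures above, assuming the upscaler adds a flat 40 FPS to both cards:

    # Quick sketch with the made-up numbers above: a flat FPS uplift from upscaling
    # shrinks the *relative* RT gap between the two cards.
    native = {"4080": 30, "7900 XT": 20}   # avg FPS at 4K with heavy RT (made up)
    uplift = 40                            # assumed flat gain from upscaling

    for card, fps in native.items():
        print(f"{card}: {fps} FPS native -> {fps + uplift} FPS upscaled")

    lead_native = native["4080"] / native["7900 XT"] - 1
    lead_upscaled = (native["4080"] + uplift) / (native["7900 XT"] + uplift) - 1
    print(f"4080 RT lead: {lead_native:.0%} native, {lead_upscaled:.0%} as actually played")

On those assumptions, a 50% lead on paper turns into roughly 17% in the FPS you actually play at.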

It’s all part of Nvidia’s review guide I bet.
 
I have an amusing personal anecdote about DLSS. Tim from HUB loves to compare telegraph lines between FSR and DLSS.

Almost every game I have with DLSS has telegraph lines in it, and I have several such games. From a distance, yeah, sure, those lines look clean, not jagged with gaps... go close and they look like the original lines have been painted over with black crayon, ridiculously thick black lines, like 6" thick. It's the sort of crap where one cannot help but wonder about Tim's constant emphasis on comparing these ####### telegraph lines.
 
I don't use Game Pass and have no interest, so value is relative.

Sure... a year of monthly Game Pass subs comes to about £100 and the game is £60, but you get a lot of other games besides.

The value of Starfield with these GPUs is inarguable: you can get it with a £189.99 RX 6600, a £60 game with a GPU that costs less than £190. You can get it with CPUs too.

They clearly aren't short of Starfield keys if they are bundling it with pretty much anything and everything AMD make; that will have cost AMD millions, many millions, in sponsorship deals.
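For anyone weighing it up, here is a rough sketch of that value maths in Python, using the figures quoted above (£189.99 RX 6600 bundle, £60 game, roughly £100 for a year of Game Pass); treat the numbers as illustrative:

    # Rough value maths using the figures quoted in this thread (illustrative only).
    gpu_price = 189.99        # RX 6600 with the Starfield bundle
    game_value = 60.00        # Starfield bought outright
    gamepass_year = 100.00    # roughly a year of monthly Game Pass subs

    effective_gpu_cost = gpu_price - game_value   # if you were buying the game anyway
    print(f"Effective GPU cost with the bundle: £{effective_gpu_cost:.2f}")
    print(f"A year of Game Pass instead: £{gamepass_year:.2f}, "
          f"£{gamepass_year - game_value:.2f} more than the game alone but with a whole library")

On those numbers the bundle effectively knocks the card down to about £130 if you were going to buy the game anyway.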
 
I hasten to add, I still think £600 for not even a top-tier GPU is crazy money really, but it's much better than anything Nvidia are offering right now in terms of value for money/bang for buck.

80-class GPUs have been ~$800 when adjusted for inflation for the last 20 years.

The 7900 XT is in the ballpark now at £700, as it's close enough to the 4080 in raster.
Selling or keeping the free game sweetens the deal, and it's probably as cheap as it's likely to get until next gen.
Perhaps a slightly better deal will appear periodically, but I don't expect anything worth waiting for if you're in the market, as you're trading the benefit of usage against a potential small saving at some future point.

At the low end the 7600 is a decent card at £230.
It's more than capable of 1080p in recent games unless you max the settings, and it's even good enough for 1440p in some current and most older titles.
The much-touted GTX 1060 6GB's launch benchmarks typically showed similar and often worse frame rates in the 'new titles' of its day, and from what I can see the visual quality settings were lower than HUB's obsession with 'Ultra'.
And the 1060's launch price was $250-300, or more like $325 to $400 today when corrected for inflation.
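A small sketch of that inflation adjustment, assuming a cumulative inflation factor of roughly 1.3 between the GTX 1060's 2016 launch and today (the exact factor is my assumption; substitute the real CPI figure if you want precision):

    # Inflation adjustment sketch; the ~1.3 factor for 2016 -> today is an assumption.
    inflation_factor = 1.3
    low, high = 250, 300   # GTX 1060 6GB launch price range in USD (2016)

    print(f"GTX 1060 6GB: ${low}-{high} at launch, "
          f"roughly ${low * inflation_factor:.0f}-{high * inflation_factor:.0f} in today's money")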

£200 to £1000+ is the range of GPU pricing and no amount of convoluted debate in a forum is going to change that.
 
I'm imagining what becoming an Nvidia AIB must be like.

New AIB: "Well, looks like a tough market, but we can start by competing on price."

Nvidia: "No, you can't."

New AIB: "What about remediating product issues, like increasing the amount of VRAM?"

Nvidia: "Absolutely not!"

New AIB: "Well, what abo-"

Nvidia: "You will compete on cooler design, the sticker on the card and up to a 5% overclock, and you will like it."

No wonder EVGA sacked it off.
 
Hypothetically, if you were doing a completely new build targeting 1440p, what GPU would you look at? There's lots of discussion about price in this thread that is really relevant, as I think more people are starting to consider their next builds.

Keeping ‘budget’ in mind to some degree…

If you go Nvidia, do you just look at the 4070 now as the main option? Or consider last generation with a used 3070/3080?

Or for AMD, do you stick with the last generation and grab a 6750 XT / 6700 XT?
 
I'm imagining what becoming an Nvidia AIB must be like.

New AIB: "Well, looks like a tough market, but we can start by competing on price."

Nvidia: "No, you can't."

New AIB: "What about remediating product issues, like increasing the amount of VRAM?"

Nvidia: "Absolutely not!"

New AIB: "Well, what abo-"

Nvidia: "You will compete on cooler design, the sticker on the card and up to a 5% overclock, and you will like it."

No wonder EVGA sacked it off.

Unless Nvidia are actively trying to take AIBs out of the equation, to cut out that middleman, I also don't understand this mentality; surely actual competition between AIBs makes better products and helps sell them.

Here is an idea for AMD: have a flat-rate IP charge for each CU in the architecture and make AIBs source their own manufacturing, but with that they can choose not only how much VRAM they have but also how many CUs and what clock rates go in each tier.

My GPU has 96 CUs. Ah, but my GPU has 104 CUs. Yeah, but while my GPU only has 88 CUs, it's cheaper than all of yours...
They don't even have to call them AMD GPUs; Sony doesn't call the PS5 the AMD PlayStation. It could just be the Asus ROG Strix 104 CU, and they could skin and brand AMD's drivers however they like...

Or is that a stupid idea?
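For concreteness, a tiny Python sketch of how that flat per-CU IP fee might price the cards in my example; every figure here is made up:

    # Made-up illustration of a flat per-CU licensing fee; none of these figures are real.
    ip_fee_per_cu = 3.50     # hypothetical flat IP charge AMD collects per CU
    board_cost = 250.00      # hypothetical AIB spend on PCB, VRAM, cooler and margin

    for cu_count in (88, 96, 104):
        price = board_cost + cu_count * ip_fee_per_cu
        print(f"{cu_count} CU card: ~£{price:.2f} before retail markup")

The AIB picks the CU count and cooler; AMD just collects the same fee per CU regardless.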
 
Hypothetically, if you were doing a completely new build targeting 1440p, what GPU would you look at? There's lots of discussion about price in this thread that is really relevant, as I think more people are starting to consider their next builds.

Keeping ‘budget’ in mind to some degree…

If you go Nvidia, do you just look at the 4070 now as the main option? Or consider last generation with a used 3070/3080?

Or for AMD, do you stick with the last generation and grab a 6750 XT / 6700 XT?
I would go for a used 3080 for under £400; one went for £360 the other day on MM. If you're spending near £600 on a 4070, then you might as well save the extra for a 7900 XT, which is over 30% faster.
 
Unless Nvidia are actively trying to take AIBs out of the equation, to cut out that middleman, I also don't understand this mentality; surely actual competition between AIBs makes better products and helps sell them.

Here is an idea for AMD: have a flat-rate IP charge for each CU in the architecture and make AIBs source their own manufacturing, but with that they can choose not only how much VRAM they have but also how many CUs and what clock rates go in each tier.

My GPU has 96 CUs. Ah, but my GPU has 104 CUs. Yeah, but while my GPU only has 88 CUs, it's cheaper than all of yours...
They don't even have to call them AMD GPUs; Sony doesn't call the PS5 the AMD PlayStation. It could just be the Asus ROG Strix 104 CU, and they could skin and brand AMD's drivers however they like...

Or is that a stupid idea?

In my mind they could do something kind of similar. I think if the AIBs all individually sourced manufacturing of chips they'd lose out on economies of scale and it wouldn't work out too well. But potentially it could work like this: AMD manufactures and bins all of the chips, and the AIBs can buy the chips at prices based on how they've been binned. They then use that flexibility to come up with whatever product they want to make.


I think the main issue would be the confusion around marketing in that case though.
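Roughly what that bin-based purchasing might look like, sketched in Python with entirely made-up bins and prices:

    # Entirely made-up bins and prices, just to illustrate the binned-chip purchasing idea.
    bin_prices = {
        "A (104 CU, high clocks)": 420.00,
        "B (96 CU, mid clocks)": 350.00,
        "C (88 CU, low clocks)": 290.00,
    }

    def card_cost(chip_bin, board_and_cooler=180.00):
        # Hypothetical AIB cost: the binned chip price plus the AIB's own board/cooler budget.
        return bin_prices[chip_bin] + board_and_cooler

    for b, chip_price in bin_prices.items():
        print(f"Bin {b}: chip £{chip_price:.2f}, finished card ~£{card_cost(b):.2f}")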
 
Having used the latest FSR and DLSS, you need to pixel peep to see any real differences, and most AAA games give both options. RT is a good bit better on Nvidia, but you need to go to a 4090 to get to the point where it matters in games.

By that I mean any GPU lower than a 4090 100% needs to use upscaling in demanding RT games, even at 1440p, for playable FPS. Even the 4090 needs it at 4K sometimes.

For example (made-up numbers):

At 4K in an extreme RT game:
4080 gets 30 avg FPS
7900 XT gets 20 avg FPS

Now both of these are totally unplayable, so you must turn on upscaling, and say it adds 40 FPS to your final FPS on both GPUs, making both playable. Yet on paper the 4080 has 50% better RT performance in this game. How many reviews simply show the non-upscaled scores, so people come away thinking the Nvidia GPU does much better RT and is worth the premium?

It's all part of Nvidia's review guide, I bet.
Indeed, if you go higher in resolution and don't go too far into the performance modes for FSR, it can get quite similar to DLSS (I've only tested it in Cyberpunk, though). However, at lower resolutions and faster performance modes it falls behind. I was quite surprised how good both solutions look when using ultra performance on an image downscaled from 4K to 1080p (not sure whether the native 1080p display plays any role in that versus a native 4K display). Also, FSR handles some issues better with path tracing. It remains to be seen whether anything changes with the next update.

In your example, doubling the performance with upscaling means you end up with 60fps vs 40fps, and that 60 is better; you do feel the difference. And of course, no FSR 3 means it's less playable in some scenarios on the AMD side.

Yes, reviewers should test native vs upscaling at the usual resolutions, see how performance and image quality hold up, find any issues, etc. A video of the testing scenario should also be a must!

Also, a big plus for DLSS is that you can manually change versions just by copying DLLs.
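As a sketch of what that swap looks like in practice, assuming the game keeps nvngx_dlss.dll next to its executable; the paths here are purely illustrative:

    # Illustrative DLSS DLL swap; the paths are made up, and the original DLL is backed up first.
    import shutil
    from pathlib import Path

    game_dir = Path(r"C:\Games\SomeGame")           # hypothetical game install folder
    new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # the newer DLSS DLL you want to try

    target = game_dir / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    shutil.copy2(target, backup)    # keep the version the game shipped with
    shutil.copy2(new_dll, target)   # drop in the replacement
    print(f"Replaced {target}, backup saved as {backup.name}")

Keep the backup around in case the newer DLL misbehaves in that particular game.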
 
Unless Nvidia are actively trying to take AIBs out of the equation, to cut out that middleman, I also don't understand this mentality; surely actual competition between AIBs makes better products and helps sell them.

Here is an idea for AMD: have a flat-rate IP charge for each CU in the architecture and make AIBs source their own manufacturing, but with that they can choose not only how much VRAM they have but also how many CUs and what clock rates go in each tier.

My GPU has 96 CUs. Ah, but my GPU has 104 CUs. Yeah, but while my GPU only has 88 CUs, it's cheaper than all of yours...
They don't even have to call them AMD GPUs; Sony doesn't call the PS5 the AMD PlayStation. It could just be the Asus ROG Strix 104 CU, and they could skin and brand AMD's drivers however they like...

Or is that a stupid idea?


Sounds like a recipe for a pick 'n' mix of badly validated products and a poor customer experience as driver updates push one or another 'brand' over the edge.
I do think we could have more variation in the optimisation allowed as shipped from the factory, to match the often oversized 'marketing' coolers, instead of the "2%" OC models for £££ extra.
AMD seem to have a habit of running the voltage too high for 'extra stability', so a vendor could better optimise the out-of-box performance, with voltage tuning and a power limit boost to suit the cooler.

All we need is a simple product stack at perceived 'value' price points instead of the usual micro-segmentation we get through each generation.

AMD's stack is not unreasonable at the current price points, even if it's a mix of generations.
At least they fixed the $100 7900XT/XTX gap that killed the XT launch reviews.
Clearly AMD has not achieved its performance goals for this generation and is late on features it promised at launch, so it needs to offer value.

Nvidia, on the other hand, made a huge leap in architecture but chose to push up prices and cut down cards, resulting in a meh generation for most segments.

Right now, I need a GPU for a budget box so the kids' mates have something better than a Lenovo Tiny to play on when they come round.
Looking at a 6600/7600 as it will be paired with an i7 4970; older cards might be OK, say a 5600 XT, but Starfield sweetens the deal somewhat.

On the other side, I'm tempted to grab a 7900 XT and shove my current GPU in the budget box.

Nothing from Nvidia is tempting me this gen, though I'd take the 4080 at £800.
 