If you have only ever bought one brand humbug, you can't know what you're missing, right?


With the AMD driver I can play games running on my PC through my TV and use my tablet as a live system info display. No streaming hardware needed... how ####### cool is that?
Those are US prices without state taxes. So £700 in the UK, which is about right once VAT is added.

Also HUB keep saying no more than $750 for the 7900 XT, which is £588 converted.
Surely that's unrealistic to achieve as a price in the UK/Europe (unless we're well into next gen!?)?
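For what it's worth, a rough sketch of the maths behind those figures; only the $750 / £588 / £700 numbers come from the posts above, the exchange rate and 20% VAT are my assumptions:

```python
# Rough sketch of the US-vs-UK price argument above.
# The exchange rate and VAT figure are assumptions, not from the thread.
USD_TO_GBP = 0.784   # assumed rate that makes $750 ~= £588
UK_VAT = 0.20        # standard UK VAT

def us_msrp_to_uk_price(usd_msrp: float) -> float:
    """US MSRPs exclude sales tax; UK shelf prices include 20% VAT."""
    return usd_msrp * USD_TO_GBP * (1 + UK_VAT)

print(f"$750 ex-tax -> ~£{us_msrp_to_uk_price(750):.0f} inc. VAT")
# $750 -> ~£706, which is roughly the "£700 in the UK" figure above
```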
If you have only ever bought one brand humbug, you can't know what you're missing, right?
I thought you had 2070s? What AMD card do you have, or which was the last AMD card?
I also have a 5700 XT Nitro+ in another system.
Those are US prices without state taxes. So £700 in the UK, which is about right once VAT is added.
The amount of championing for higher prices from simple buyers (if they are simple buyers and nothing else) is unreal. No wonder the prices keep going up!
Coming back to the thread: Nvidia is pushing AI and its ecosystem hard. RT is part of that, as it helps with visualisation. Naturally, adding this to the gaming market (since that is the holy grail there as well) is the smart thing to do. That way your R&D costs get spread across all products, which is the way to go. They also probably knew AMD wouldn't bother to change its architecture dramatically, including dedicated hardware for RT and upscaling, until the next gen of consoles is out. So it's just 1+1 for Nvidia to score a win (after win). AMD's (in)action just confirms that.
Yeah, I get it, RT and upscalers are not worth it for some. I bet console players would praise their machine to high Heaven if they got 4K@60fps with path tracing in Cyberpunk (even if it's DLSS 3) and in the other games that would follow. At this point, going with AMD on the graphics side just holds back progress. But alas, that's secondary to profit.
Ah, and if the situation were reversed, with AMD being ahead in RT, we would see AMD fans going nuts about how Nvidia is crippling progress by not keeping up and (perhaps) limiting what other tech gets into their sponsored games.
I've got Starfield on Game Pass.
I don't use Game Pass and have no interest, so value is relative.
I hasten to add, I still think £600 for not even a top-tier GPU is crazy money really, but it's much better than anything Nvidia are offering right now in terms of value for money/bang for buck.
I'm imagining what becoming an Nvidia AIB must be like.
New AIB: "Well looks like a tough market but we can start by competing on price"
Nvidia: "no you can't"
New AIB : "What about remediating product issues like increasing the amount of VRAM?"
Nvidia: "absolutely not!"
New AIB: "well what abo-"
Nvidia: "you will compete on cooler design, the sticker on the card and up to a 5% overclock and you will like it"
No wonder EVGA sacked it off.
I would go for a used 3080 for under £400; one went for £360 the other day on MM. If you're spending near £600 on a 4070 then you might as well save the extra for a 7900 XT, which is over 30% faster.

Hypothetically, if you were doing a completely new build targeting 1440p, what GPU would you look at? There's lots of discussion about price in this thread that is really relevant, as I think more people are starting to consider their next builds.
Keeping ‘budget’ in mind to some degree…
If you go the Nvidia route, do you just look at the 4070 now as the main option? Or consider last generation with a used 3070/3080?
Or for AMD, do you stick with the last generation and grab a 6750xt / 6700xt?
Unless Nvidia are actively trying to take AIBs out of the equation to cut out that middleman, I also don't understand this mentality; surely actual competition between AIBs makes better products and helps sell them.
Here is an idea for AMD: have a flat-rate IP charge for each CU in the architecture and make AIBs source their own manufacturing, but with that they can choose not only how much VRAM they have but also how many CUs and what clock rates in each tier (a rough sketch of the numbers follows below).
My GPU has 96 CUs... ah, but my GPU has 104 CUs... yeah, but while my GPU only has 88 CUs, it's cheaper than all of yours.....
They don't even have to call them AMD GPUs; Sony doesn't call the PS5 the AMD PlayStation. It could just be the Asus ROG Strix 104 CU, and they could skin and brand AMD's drivers however they like....
Or is that a stupid idea?
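Purely to illustrate the idea, here is a toy sketch of how a flat per-CU IP charge might feed into AIB pricing. The £ fee, BOM figures, margin and card names are all made up; only the CU counts echo the post above.

```python
# Toy model of the "flat IP charge per CU" idea above.
# Every number here is hypothetical; only the CU counts come from the post.
IP_FEE_PER_CU = 3.0   # assumed flat AMD licence fee per CU, in £

def card_price(cus: int, vram_gb: int, margin: float = 0.12) -> float:
    """Hypothetical AIB shelf price: IP fee + rough BOM + AIB margin."""
    ip_cost = cus * IP_FEE_PER_CU
    bom = 100 + vram_gb * 7 + cus * 2   # made-up board/memory/silicon costs
    return (ip_cost + bom) * (1 + margin)

for name, cus, vram in [("Budget 88 CU", 88, 12),
                        ("Mainstream 96 CU", 96, 16),
                        ("ROG Strix 104 CU", 104, 20)]:
    print(f"{name}: ~£{card_price(cus, vram):.0f}")
```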
Indeed, if you go higher in resolution and don't go too far into the performance modes for FSR, it can get quite similar to DLSS (I only tested it in Cyberpunk, though). However, at lower resolutions and faster performance modes it falls behind. I was quite surprised how well both solutions look when using ultra performance on an image downscaled from 4K to 1080p (not sure whether the native 1080p display plays any role in that vs a 4K native display or not). Also, FSR handles some issues better with path tracing. It remains to be seen whether anything changes with the next update.

Having used the latest FSR and DLSS, you need to pixel peep to see any real differences, and most AAA games give both options. RT is a good bit better on Nvidia, but you need to go to a 4090 to get to the point where it matters in games.
By that I mean any GPU lower than a 4090 100% needs to use upscaling in demanding RT games, even at 1440p, for playable FPS. Even the 4090 needs it at 4K sometimes.
For example (made up numbers).
At 4K in an extreme RT game
4080 gets 30 avg FPS
7900 XT gets 20 avg FPS
Now both of these are totally unplayable. So then you turn on upscaling and it adds 40 FPS to your final FPS on both GPUs, making both playable. Yet on paper the 4080 has 50% better RT performance in this game, while the upscaled gap is closer to 17%. How many reviews simply show the non-upscaled scores, so people come away thinking the Nvidia GPU does much better RT and is worth the premium?
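A quick sketch of that arithmetic using the made-up numbers above; the flat +40 FPS gain from upscaling is the assumption the example rests on:

```python
# Using the made-up numbers above: a fixed FPS gain from upscaling
# narrows the relative gap between the two cards.
native = {"4080": 30, "7900 XT": 20}   # avg FPS, native 4K, extreme RT
UPSCALING_GAIN = 40                    # assumed flat FPS added by DLSS/FSR

upscaled = {gpu: fps + UPSCALING_GAIN for gpu, fps in native.items()}

lead_native = native["4080"] / native["7900 XT"] - 1
lead_upscaled = upscaled["4080"] / upscaled["7900 XT"] - 1

print(f"Native lead:   {lead_native:.0%}")    # 50%
print(f"Upscaled lead: {lead_upscaled:.0%}")  # ~17%, and both are now playable
```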
It’s all part of Nvidia’s review guide I bet.