When will GPU prices go down?

Surely Nvidia won't have to cut back on Ada chips from TSMC due to AI demand? Of course not; they can't sell what's out there right now. I can guarantee you Nvidia's next gen will be another Ampere moment, just without the crypto and pandemic inflation.
I think there is reason to be cautiously optimistic here as well. I mean there's definitely a bit of a pattern forming.

10 series (good)
20 series (bad)
30 series (good but crypto/pandemic scalp-fest)
40 series (bad)

So surely we're up for a good generation next. :D
 
I think there is reason to be cautiously optimistic here as well. I mean there's definitely a bit of a pattern forming.

10 series (good)
20 series (bad)
30 series (good but crypto/pandemic scalp-fest)
40 series (bad)

So surely we're up for a good generation next. :D

I agree on 10 and 20 as you say; 30 series... some of them maybe, but the 3070 imo is a lemon.


...and that's coming from someone who owns one.
 
The 3060 Ti was an excellent card (still is). At FE money I think it was one of the best cards in years. To get that close to 2080 Ti performance for the price was amazing.

It would be like the 5060 Ti offering 4090 performance for £400. Unthinkable now and will never happen… (I dare you, Nvidia).
 
They were sought-after cards for gamers, but crypto messed it all up. The 3060 Ti is 10% slower than the 2080 Ti for under £400; they got rave reviews. 8GB was not a factor back then.
For most buyers, maybe. But plenty of people did figure it into their decisions, as 8GB was always going to age badly and be poor for mods. AMD having no FE equivalent - no reference models sold in the UK - made a big difference: just ask CAT whether they would have preferred a 12GB 6700 XT vs the 3060 Ti FE they eventually got. Nvidia selling reference models at MSRP was the sole reason.

EDIT: says me with a 3050... I missed the FEs but took the forum-only deal of a 3050 at MSRP. I always knew it would age badly - however, until the last few months it has held up its value stubbornly.
 
The 3060 Ti was an excellent card (still is). At FE money I think it was one of the best cards in years. To get that close to 2080 Ti performance for the price was amazing.

It would be like the 5060 Ti offering 4090 performance for £400. Unthinkable now and will never happen… (I dare you, Nvidia).
For a 5060 Ti to be as fast as a 4090, it would need to be +200% faster than a 4060 Ti, and considering even in a good year Nvidia dishes out +50%, it'll be many years before Nvidia gives out 4090 performance for £400; by then a 60 Ti will probably be pushing £600 anyway.

If Nvidia keeps the same pricing for next gen then I doubt even a 5070 Ti will get close to a 4090, as that would need around a 70% uplift over the 4070 Ti, which will be impossible with the jump from 5nm to 3nm unless Nvidia moves back to a 400mm² die or can hit 4GHz (rough sums below).
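To put rough numbers on that, here's a minimal sketch; the 3x ("+200% faster"), +50% and ~70% figures are the post's own assumptions, not measured data:

```python
import math

# Assumed figures from the post above, not benchmarks:
# the 4090 is taken as 3x a 4060 Ti ("+200% faster"),
# and even a good generation is assumed to bring only +50%.
target_ratio = 3.0      # 4090 relative to 4060 Ti
uplift_per_gen = 1.5    # optimistic +50% per generation

# Compounding +50% per gen: how many gens until we reach 3x?
gens_needed = math.log(target_ratio) / math.log(uplift_per_gen)
print(f"generations of +50% needed for 3x: {gens_needed:.1f}")  # ~2.7, i.e. three gens

# Same logic for the 5070 Ti claim: matching a 4090 is taken as a ~70%
# single-generation uplift over the 4070 Ti, well above the assumed +50%.
print(f"required single-gen uplift: 1.70x vs assumed {uplift_per_gen:.2f}x")
```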
 
AMD pushes open source; sometimes it works, most of the time it doesn't.

Without AMD's Mantle (which arguably got even more pushback because it was AMD), there wouldn't be RT'ing.

Imo the 'hate' for RT'ing is because of the buy-in/performance cost, as shown by plenty of NV/AMD users.

Until RT'ing is available to mainstream users who can turn it on without the MASSIVE performance cost, then and only then will the hate/pushback cease.

It also doesn't help when we are force-fed upscaling>native.

Different needs for different users.
AMD didn't push enough. BF4 was the biggest thing (done right) with Mantle; other than that... not much. Even crap implementations of stuff such as Eyefinity/multi-display in the Far Cry series (although FC2 worked great).
They could have done more to make use of that compute inside their cards, but chose not to... just as they're choosing not to really compete now with a basically free win.

As for upscaling, I'll take that any day of the week for a huge performance uplift (even in raster), compared to the few percentage points we might gain in raster if all that die space was not dedicated to all sorts of other cores.
 
AMD didn't push enough. BF4 was the biggest thing (done right) with Mantle; other than that... not much.
AMD didn't have a penny to push Mantle as they nearly went bust. Then, as they promised from the start, they handed it over to MS and Khronos and we got DX12 and Vulkan. Nv users didn't want to touch DX12 in the first few years as AMD was on top, until Nv caught up because they needed DX12 for RT'ing - and surprise surprise, Nv users love their DX12 RT'ing and DLSS, don't they?


Even crap implementations of stuff such as Eyefinity/multi-display in the Far Cry series (although FC2 worked great).
They could have done more to make use of that compute inside their cards, but chose not to...

Eyefinity - I don't know about the FC series, but Eyefinity whupped Nv rotten. It was highly configurable and streets ahead of Nv; you could use a centre display with two smaller displays in portrait mode for side windows and mirrors.

And don't mention plugging 3 displays into Nv GPUs in this house! :p

Worst case scenario, it can be head-banging when it craps out and you have to restart/unplug HDMI/DP cables on a 3-screen setup, or plug in a fourth display and get flickering. I've noticed Steve @HUB isn't a fan of Nv multi-display either!
just as they're choosing not to really compete now with a basically free win.
AMD are not Nv. AMD is a package (CPUs, consoles, GPUs, Steam Decks/handheld PCs); where GPUs sit in AMD's priorities, idk, whereas Nv just make GPUs.

Everyone's forgotten that last gen AMD came from so far behind, nowhere to be seen, to at least equalling the 3080 Ti, and got close to the 3090 too!

AMD aren't Nvidia. While Nv's sales page is advertising 3070/Tis sold for more than 4070s, AMD have dropped the 68s/6950s to compete with lower-to-higher 40-series prices; they're shifting stock without releasing newer GPUs at the same performance points.
As for upscaling, I'll take that any day of the week for a huge performance uplift (even in raster), compared to the few percentage points we might gain in raster if all that die space was not dedicated to all sorts of other cores.
I have zero issues with your/others' highly positive experience with upscaling, I really don't; I respect that, that's what PC gaming is all about. As I said earlier: different needs for different users.

My comment was solely in regard to the over-the-top backlash when I and other NV/AMD users do not want to use DLSS/FSR. I have no doubt many probably keep quiet due to the backlash ridiculing anyone who prefers native. :o

For example, when I put forward my experience running TLOU on a 65" QD-OLED, DLSS/FSR throws up a huge halo artifact that totally breaks immersion; it needs native on my display.

But almost every time I/others put our points across, it's followed by DLSS>native with walls and walls of text and vids. Now, I get that, but although I've said I don't know how many times that DLSS/FSR is amazing tech, I do not definitively shout NATIVE>DLSS/FSR, because this user respects 'different needs for different users'. :)
 
EDIT: says me with a 3050... I missed the FEs but took the forum-only deal of a 3050 at MSRP. I always knew it would age badly - however, until the last few months it has held up its value stubbornly.

Yup, same here. Got my 3070 on a forum deal and literally couldn't pass it up, because my 1070 was flagging pretty badly for the high-FPS PUBG I was playing at the time.

I guess I could sell it, although I would probably donate it to a good friend who has had some bad luck in life recently.

But to replace it I'd be after a 7900 XT to make it worth it and get longer-term use, and £750 is too much to stomach for a GPU.
 
Eyefinity - I don't know about the FC series, but Eyefinity whupped Nv rotten. It was highly configurable and streets ahead of Nv; you could use a centre display with two smaller displays in portrait mode for side windows and mirrors.

And don't mention plugging 3 displays into Nv GPUs in this house! :p

Speaking of multi-display on AMD - a mate asked me what GPU to get for his 3-monitor setup. Anyway, I found this interesting from HU; it tallies up with my 290X and Vega experience, so I wonder why the negative vibes? If anything it looks like Nvidia need to put some effort into this:

 
The 3060 Ti was an excellent card (still is). At FE money I think it was one of the best cards in years. To get that close to 2080 Ti performance for the price was amazing.

It would be like the 5060 Ti offering 4090 performance for £400. Unthinkable now and will never happen… (I dare you, Nvidia).

Yeah, not gonna happen, we all know it lol. Be lucky to get a 5070 Ti close to a 4090, but then they'll charge you £900 and say it's a bargain.
 
Speaking of multi-display on AMD - a mate asked me what GPU to get for his 3-monitor setup. Anyway, I found this interesting from HU; it tallies up with my 290X and Vega experience, so I wonder why the negative vibes? If anything it looks like Nvidia need to put some effort into this:


Having used Nvidia multi-monitor for 2 years now, I still can't get my GPU to idle below 100W when running multiple screens.
 
Speaking of multi-display on AMD - a mate asked me what GPU to get for his 3-monitor setup. Anyway, I found this interesting from HU; it tallies up with my 290X and Vega experience, so I wonder why the negative vibes? If anything it looks like Nvidia need to put some effort into this:

AMD's 79s are heavy on the power too, though; doesn't read like there is a fix inbound either.
Having used Nvidia multi-monitor for 2 years now, I still can't get my GPU to idle below 100W when running multiple screens.
You running Wallpaper Engine animated desktops?

Fantastic looking but ups the clocks/power usage.
 
Is there any concrete news about Battlemage, a.k.a. the Savior?
It's too early at this point I think; there are just the existing rumours about it being about twice the A770 in terms of shaders etc., plus higher clock speeds at the high end, to be released at some point in 2024... We hope early in 2024, so it doesn't get immediately overshadowed by Blackwell and RDNA4.
 
It's too early at this point I think; there are just the existing rumours about it being about twice the A770 in terms of shaders etc., plus higher clock speeds at the high end, to be released at some point in 2024... We hope early in 2024, so it doesn't get immediately overshadowed by Blackwell and RDNA4.
If Nvidia and AMD continue the trend of raising prices and neglecting the mid-range and lower segments with their new series, there will be more than enough room in that segment for Battlemage, and Intel will also be able to make some profit from it.
 
AMD's 79s are heavy on the power too, though; doesn't read like there is a fix inbound either.

You running Wallpaper Engine animated desktops?

Fantastic looking but ups the clocks/power usage.

Nope, not running anything in the background, just the basic Windows wallpaper. It's never sat lower than 100W when connected to multiple monitors. Works fine on a single monitor though.
 
Having used Nvidia multi-monitor for 2 years now, I still can't get my GPU to idle below 100W when running multiple screens.
I have 4x 144Hz monitors (one of them is actually 165Hz) running off a 3060 Ti, and the GPU downclocks to 210/101 MHz when idle as long as they are all set to the same refresh rate.

I used to get high idle clocks, and after I did a clean install of the driver it was still the same, so I reinstalled over the top again, and after the third time it fixed itself.

The key thing is to have all monitors running at the same refresh rate to keep the clocks down at idle. I have to set all of mine to 120Hz: when streaming with OBS Studio at 60fps, the monitors need to be at a multiple of 60 to keep the stuttering down. Some users get away with it, but it's definitely an issue for many (quick check below).

Edit: 3 displays use DP cables and the other uses an HDMI cable.
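For anyone chasing the same issue, here's a minimal sketch of how you might sanity-check idle power/clocks and the refresh-rate rule above. It assumes Python with nvidia-smi on PATH; the monitor refresh values are illustrative, not read from the system:

```python
import subprocess

# Quick check of whether the card is actually downclocking at idle.
# power.draw, clocks.gr and clocks.mem are standard nvidia-smi query fields.
idle = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=power.draw,clocks.gr,clocks.mem",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"power draw, core clock, memory clock: {idle}")

# The rule of thumb from the post: all monitors at the same refresh rate,
# and (for 60fps OBS streaming) each a multiple of 60. Example values only.
refresh_rates = [120, 120, 120, 120]
stream_fps = 60
ok = len(set(refresh_rates)) == 1 and all(r % stream_fps == 0 for r in refresh_rates)
print("setup should idle low and stream smoothly:", ok)
```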
 