Why GPU prices are NOT likely to drop significantly EVER!

Indeed, companies milk fanboys/girls to the max, knowing they will come back for more. Companies love these sorts of people; Apple is a prime example, and over the last few generations Nvidia has gone down that route. Can't blame them: if folk want to throw their money at a company, that company isn't going to say no thanks. These are the folk that bought into RTX at the start, when there were hardly any games for it, and the ones that existed tanked your FPS for a few effects that were gimmicks IMO. But to these folk it's the best thing since sliced bread :D Watch how they fall over themselves to defend a gimmick now :cry:

It's like shooting fish in a barrel, they can't help themselves :cry::cry:
 
Except that 25fps is delivered interlaced, and once deinterlaced by the TV it turns into 50fps, which is displayed at 50Hz.
And let’s not forget that “most” people have frame interpolation enabled on their TVs.

Correct me if I'm wrong here, but 50i is 25fps in the camera world, so I presume TV is the same.
 
Correct me if I'm wrong here, but 50i is 25fps in the camera world, so I presume TV is the same.
Honestly, I'm no expert; I'm describing my observations of how an interlaced signal works.
But yes, it's the same in camcorders as it is in televisions.
A 1080 50i signal can be converted into 1920x1080@25fps or into 1920x540@50fps.
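To make those two conversions concrete, here's a minimal sketch in Python/NumPy (my own illustration, not from any broadcast toolchain). "Weave" pairs the two 540-line fields of each interlaced frame into one full 1920x1080 frame, halving the rate to 25fps; "bob" treats each field as its own frame, keeping 50fps at half the vertical resolution:

```python
import numpy as np

def weave(fields):
    """Weave deinterlacing: pair consecutive 1920x540 fields into
    full 1920x1080 frames (50 fields/s -> 25 frames/s)."""
    frames = []
    for top, bottom in zip(fields[0::2], fields[1::2]):
        frame = np.empty((1080, 1920), dtype=top.dtype)
        frame[0::2] = top      # even scanlines from the top field
        frame[1::2] = bottom   # odd scanlines from the bottom field
        frames.append(frame)
    return frames

def bob(fields):
    """Bob deinterlacing: display each 1920x540 field on its own
    (keeps 50 frames/s, at half the vertical resolution)."""
    return list(fields)

# One second of simulated 1080i50: 50 half-height fields.
fields = [np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
          for _ in range(50)]
print(len(weave(fields)), "frames at 1920x1080 (25fps)")  # -> 25
print(len(bob(fields)), "frames at 1920x540 (50fps)")     # -> 50
```

(Real TVs typically use motion-adaptive blends of these two approaches rather than either one alone.)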
 
Sorry, but that makes no sense. 35fps is 35fps regardless of whether it's a GT 710 or an RTX 3090. G-Sync or FreeSync may smooth it out, and while it is playable, the fact that you find it smooth does not make it smooth. I would say the vast majority of people on these forums will agree 35fps is not smooth and 60 is a lot different.

Yeah, agreed, 60 is a bit smoother and it's noticeable, but I'm still going to play at 4K at 35+ fps rather than get 60fps+ by dropping down to 1440p.
Movie-like image quality is much more important to me than 1440p, where many pixels are missing from the rendered textures compared to a 4K screen.
 
The 6700 XT saw another $100 off the in-store price today and is $829.
The 3070 Ti is $799 (can't remember the previous price now).

Mind you... A friend posted 'container prices' from Shanghai. They are absolutely insane now. (I can only assume they are high from Taiwan as well).

You people in the US are getting some heavy discounts. Nothing like this in the UK, just checked the latest prices.
 
Playable, yes, but not ideal.

I played Doom Eternal on the Switch - 30fps was doable, but painful compared to PC - I'd much rather play at 80+ to be honest :)

Some games are fine at lower FPS, like MSFS I would say, but not shooters or fast-paced games.

Yep, agreed, for FPS and fast-paced games you get better timing and accuracy, but I'm sticking with super image quality (that's why I got a 4K monitor) and don't mind losing a bit of edge against AI.
It's different when I'm playing against people online, though: there I'd ideally want about 200+ fps, and the image doesn't have to be top quality.
 
I've been saying this for ten years; can you wait another ten? This is the new normal.

Yes, but we're getting to the end of it now. The green shoots of real inflation are starting to appear in the reserve currencies, and that will accelerate going forward.
Another 10 years? Three words: *NOT A CHANCE*.
The 2008 breakdown will look like a tea party compared to what's ahead.
 
TV in the UK is broadcast at 25 FPS. It looked smooth as butter when England beat Denmark.

Yet a mouse pointer looks jittery at 60 FPS.

It's not as clear-cut as saying 35 FPS is a slideshow and 80 FPS is smooth. It depends on the content. Some games do look smooth at 35 FPS, while others, like a fast-paced shooter, need around 80 FPS before you can perceive the same level of smoothness.

There's "natural" motion blur in film which makes it look smooth, and we've got so used to it that 50fps or more looks... unnatural.

Also, movies take zero input from the viewer, and that makes the difference.
A locked 60fps or more is where games start (for most people) to become an ideal experience.
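To put rough numbers on that, here's a quick frame-time calculation (my own illustration; the thresholds themselves are of course subjective). The time between frames shrinks non-linearly as FPS rises, which is one reason 35 to 60 feels like a bigger jump than 60 to 80:

```python
# Milliseconds between frames at the rates discussed in this thread.
for fps in (25, 30, 35, 60, 80, 200):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")

# Output:
#  25 fps ->  40.00 ms per frame
#  30 fps ->  33.33 ms per frame
#  35 fps ->  28.57 ms per frame
#  60 fps ->  16.67 ms per frame
#  80 fps ->  12.50 ms per frame
# 200 fps ->   5.00 ms per frame
```

Going from 35 to 60fps cuts roughly 12ms per frame; going from 60 to 80 cuts only about 4ms more, and 200fps (the online-play figure mentioned earlier) gets you down to 5ms.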

PS: without mining, I wouldn't put much weight on high prices being sustainable in the long run. Plenty of people live outside first-world countries.
 
I disagree. A bit of competition in the GPU world will do the market a lot of good. AMD is just not competition right now, but there is real hope that Intel will be.

AMD is the only competition for Nvidia in the discrete GPU market and will be for years to come. Intel has nothing in the discrete market able to compete with AMD or Nvidia; don't believe all the hype around their upcoming discrete cards, they will be terrible compared to Nvidia and AMD. Maybe in half a decade Intel may be able to compete on discrete cards. Intel can't even manage to compete well in the CPU market, a market they created and had a monopoly on for decades.

Anyone who thinks Intel's DG2 Xe discrete cards will influence the discrete GPU market in the coming months needs to understand that the GPU market is more complex than the CPU market. The ex-AMD GPU lead that Intel collected, "Raja M. Koduri", is a bad joke really: he's good at marketing and sales but can't produce the goods, as we saw when he was at AMD. He was the one sinking AMD's GPU division, and once he left, as we can see, AMD's GPU division bounced back to actually competing with Nvidia and keeping them honest again.

Raja M. Koduri will sink Intel's GPU division too, and now he's working for a company that is basically selling snake oil with most of its current hardware, alongside the never-ending BS marketing and sales we have seen over the years from Intel. Intel has sadly become a meme for how to destroy a respected company's reputation with arrogance, lies and misleading benchmarks.
 
Intel can also pay their vendors to push their product. Not like they haven't been caught doing that already... :p

Oh, I'm sure they will, and they'll give them rebates for making sure they don't stock AMD or Nvidia GPUs or build them into prebuilts. Intel's behaviour in the PC market is famous for all the tricks they've pulled and how they control companies into supplying their hardware, even if it costs them money.

EVGA, as a recent example, has gone back to making AMD motherboards; when was the last time they did that, a decade and a half ago? Guess why: because Intel paid them to stay away. The same goes for Dell, which was famous for it, and many other companies that took rebates and bribes to keep AMD out of their systems.
 
AMD is the only competition for Nvidia in the discrete GPU market and will be for years to come. Intel has nothing in the discrete market able to compete with AMD or Nvidia; don't believe all the hype around their upcoming discrete cards, they will be terrible compared to Nvidia and AMD. Maybe in half a decade Intel may be able to compete on discrete cards. Intel can't even manage to compete well in the CPU market, a market they created and had a monopoly on for decades.

Anyone who thinks Intel's DG2 Xe discrete cards will influence the discrete GPU market in the coming months needs to understand that the GPU market is more complex than the CPU market. The ex-AMD GPU lead that Intel collected, "Raja M. Koduri", is a bad joke really: he's good at marketing and sales but can't produce the goods, as we saw when he was at AMD. He was the one sinking AMD's GPU division, and once he left, as we can see, AMD's GPU division bounced back to actually competing with Nvidia and keeping them honest again.

Raja M. Koduri will sink Intel's GPU division too, and now he's working for a company that is basically selling snake oil with most of its current hardware, alongside the never-ending BS marketing and sales we have seen over the years from Intel. Intel has sadly become a meme for how to destroy a respected company's reputation with arrogance, lies and misleading benchmarks.

No, it really isn't. AMD has always had only a small percentage of the GPU market. And you are just assuming that Intel will be rubbish. No disrespect, but I would rather wait and see than listen to guesswork (no matter which way it leans).
 
I disagree. A bit of competition in the GPU world will do the market a lot of good. AMD is just not competition right now, but there is real hope that Intel will be.

Also, expecting Intel to be your saviour in the GPU market will not happen, as we saw with AMD this generation in GPUs and CPUs. They didn't make CPUs cheaper, for example: the minute they had a slight lead, they raised prices. AMD's GPU side is also overpriced in my book, as they are missing many features yet still price against Nvidia's offerings. Look at their pricing: a 6800 XT is only $50 cheaper than a 3080, a card that is much better specs-wise no matter how you look at it. The only thing AMD have on the 6800 XT is more VRAM, but it's slower, and next-gen features like RT are neither here nor there with them because RT tanks their performance too much to be worth using.

AMD will not be your saviour either, as many thought they would be this gen. The only thing they did was keep Nvidia a bit more honest, but the reality is that street prices have been a lot higher than MSRP for both companies. At least Nvidia is actually selling GPUs at MSRP on their store to us in the UK, while AMD refuses to sell us GPUs and CPUs on their store at MSRP. So as you see, business is business with these companies, and they are not your guardian angels who will protect you from the competition and the prices. AMD has actually shown its true colours this time: the "we're here for the gamers" pretence they sold us before was only because they couldn't compete well with Nvidia or Intel. Now the tables have turned, and well, we see what happened.
 
Also, expecting Intel to be your saviour in the GPU market will not happen, as we saw with AMD this generation in GPUs and CPUs. They didn't make CPUs cheaper, for example: the minute they had a slight lead, they raised prices. AMD's GPU side is also overpriced in my book, as they are missing many features yet still price against Nvidia's offerings. Look at their pricing: a 6800 XT is only $50 cheaper than a 3080, a card that is much better specs-wise no matter how you look at it. The only thing AMD have on the 6800 XT is more VRAM, but it's slower, and next-gen features like RT are neither here nor there with them because RT tanks their performance too much to be worth using.

AMD will not be your saviour either, as many thought they would be this gen. The only thing they did was keep Nvidia a bit more honest, but the reality is that street prices have been a lot higher than MSRP for both companies. At least Nvidia is actually selling GPUs at MSRP on their store to us in the UK, while AMD refuses to sell us GPUs and CPUs on their store at MSRP. So as you see, business is business with these companies, and they are not your guardian angels who will protect you from the competition and the prices. AMD has actually shown its true colours this time: the "we're here for the gamers" pretence they sold us before was only because they couldn't compete well with Nvidia or Intel. Now the tables have turned, and well, we see what happened.
While we may not have seen the benefit in prices, we have in performance, at least with the 3080 being on a 102 die. Had AMD not been competitive, I'd have fully expected the 3080 to be on a 104, with similar performance to what a 3070 offers.
 
No, it really isn't. AMD has always had only a small percentage of the GPU market. And you are just assuming that Intel will be rubbish. No disrespect, but I would rather wait and see than listen to guesswork (no matter which way it leans).

Their market is not small in GPU sales; in percentage share, maybe, but they sell millions of GPUs every generation, both iGPUs (in PCs and consoles) and discrete. Remember AMD took over ATI, and ATI was the only real competition to Nvidia. Matrox and the other GPU makers left the gamer and general consumer market years ago; some still exist but make specialised GPUs for niche uses, as Matrox still does, and they will not be coming back to the general gaming GPU market as it's not profitable for them to do so.


Well, be ready to be disappointed when Intel launch DG2, because it will not be anything more than a basic mid-range card at Intel prices.
 
While we may not have seen the benefit in prices, we have in performance, at least with the 3080 being on a 102 die. Had AMD not been competitive, I'd have fully expected the 3080 to be on a 104, with similar performance to what a 3070 offers.

Yup, it would have been a 104 for the 3080, and the 3080 Ti we have now would have been a 3080 spec on a 102.

Nvidia lost their bottle this time, with the new consoles coming and not being 100% sure what AMD was going to do. I believe Nvidia knew what AMD had, but not how they were going to play their cards or at what prices. The backlash from the 20xx series was ringing in their ears too, so they thought they'd better price the 3080 at $700 (a price they really didn't want to sell at; I'm sure they really wanted around $1000 for it on the 102, but on a 104 that price would have been right for them).


I also think the 3090 was meant to be their Titan card, but the power issues and the usual Titan prices might have caused another backlash, as this card again has 24GB like the previous RTX Titan, and some users would not have upgraded, or would have said it was too expensive for a 24GB card again.

Then again, the power issues may not have been the whole problem, as they have recently even released Quadro-class cards with 300W+ power use, which in the past would have been 250-280W. Also, their A100 PCIe and server GPUs are well over 300W and can even draw 500W each, and servers with 8 x A100 can use over 5kW now.
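A quick sanity check on that 5kW figure (my own back-of-the-envelope arithmetic; the non-GPU overhead is an assumption, not a quoted number):

```python
# Rough server power budget using the figures quoted above (illustrative only).
gpus = 8
watts_per_gpu = 500       # upper A100 draw mentioned in the post
rest_of_system = 1200     # assumed: CPUs, RAM, storage, fans, PSU losses

print(f"GPUs alone:   {gpus * watts_per_gpu} W")                    # 4000 W
print(f"Whole server: ~{gpus * watts_per_gpu + rest_of_system} W")  # ~5200 W
```

Eight GPUs at 500W already account for 4kW on their own, so a total over 5kW for the whole box is entirely plausible.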


As can be seen here, a 5kW server from Nvidia:

 