Why are GPUs so expensive?

Soldato
Joined
10 Oct 2012
Posts
4,424
Location
Denmark
At least console gaming is fair and equal. PC gaming has people with 60Hz, 120Hz, 144Hz and 240Hz screens playing online together, which gives some people an unfair advantage.

So because a few people can afford 2080 Tis, 9900KSs and 4K 144Hz screens, you think PC online gaming isn't fair? I'd rather play on my own mediocre machine against a player with top-of-the-line equipment than be dragged down to 30fps hell. If he wins, it won't be because of his equipment but because he was better than me. The only big issue with PC online gaming is the darn aimbots and wallhacks.
 
Soldato
Joined
19 Dec 2010
Posts
12,031
At least console gaming is fair and equal. PC gaming has people with 60Hz, 120Hz, 144Hz and 240Hz screens playing online together, which gives some people an unfair advantage.

Eh, no, the problem with PC online gaming is the hackers. A crap player on a 60Hz monitor will still be a crap player on a 240Hz monitor.
 
Soldato
Joined
19 Dec 2010
Posts
12,031
They can be called "virtual" refresh rates and effectively have the same result for the viewer.

Oh, I just realised who I responded to. I forgot you don't understand how TVs work. In the context of this conversation about higher refresh rates giving a competitive advantage, no, the result isn't the same for the viewer. BMR, CMR, PMI, whatever name the company puts on it, they are all motion interpolation technologies. If you want to be competitive in console gaming you will want to turn these off. I would go so far as to say that in any FPS on a console, even single player, you should turn these technologies off.
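The reason is mostly latency: to synthesize an in-between frame the TV has to hold back at least one real frame, so interpolation adds input lag on top of the panel's own. A minimal sketch of the arithmetic in Python, assuming a single buffered frame (real sets often hold more):

```python
# Rough illustration: motion interpolation must buffer at least one
# future frame before it can synthesize the in-between ones, so it
# adds input lag on top of the display's normal latency.

def interpolation_lag_ms(source_fps: float, buffered_frames: int = 1) -> float:
    """Minimum extra input lag from buffering `buffered_frames` real
    frames at a given source frame rate."""
    frame_time_ms = 1000.0 / source_fps
    return buffered_frames * frame_time_ms

# A 60fps console game pays at least ~16.7ms; a 30fps one ~33.3ms.
for fps in (30, 60):
    print(f"{fps}fps source: >= {interpolation_lag_ms(fps):.1f}ms extra lag")
```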
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Oh, I just realised who I responded to. I forgot you don't understand how TVs work. In the context of this conversation about higher refresh rates giving a competitive advantage, no, the result isn't the same for the viewer. BMR, CMR, PMI, whatever name the company puts on it, they are all motion interpolation technologies. If you want to be competitive in console gaming you will want to turn these off. I would go so far as to say that in any FPS on a console, even single player, you should turn these technologies off.
I see you quoted him and thought "Why you do dis??" LMAO

My old Sony had a 1000Hz refresh rate, so that translates to 1000 screens per second, right? :D
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
Oh, I just realised who I responded to. I forgot you don't understand how TV's work. In the context of the conversation about higher refresh rates giving a competitive advantage then no, the result isn't the same for the viewer. BMR, CMR, PMI, whatever name the company puts on it, are all motion interpolation technologies. If you want to be competitive in console gaming you will want to turn these off. I would go so far as to say that in any FPS shooter on a console, even single player, you should turn these technologies off.

There are no options to turn the effective refresh rate off! lol

Side note: you still can't accept that Nvidia gives far worse image quality, can you?
 
Soldato
Joined
19 Dec 2010
Posts
12,031
There are no options to turn the effective refresh rate off! lol

Side note: you still can't accept that Nvidia gives far worse image quality, can you?

What? Those figures you quoted aren't real refresh rates. They are the quoted refresh rates of the motion-smoothing technology in the TV. It's marketing to sell more TVs, because big numbers fool people, just like they fooled you. Basically, since they are defined by the TV manufacturer, they can be anything the manufacturer wants.

I believe the setting on a Panasonic TV is IFC (Intelligent Frame Creation); turn this off to disable motion smoothing. I also think that enabling Game Mode will turn it off as well.

And your question about Nvidia image quality is daft. You have shown so little knowledge or understanding of any topic or thread you have posted in that I doubt you are right about image quality; if you say Nvidia has worse image quality, the opposite is probably true.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
What? Those figures you quoted aren't real refresh rates. They are the quoted refresh rates of the motion-smoothing technology in the TV. It's marketing to sell more TVs, because big numbers fool people, just like they fooled you. Basically, since they are defined by the TV manufacturer, they can be anything the manufacturer wants.

The frame rate on your gaming monitor is also not real. You have 0.1% lows and 1% lows, micro-stutters, etc.

TVs need it because the most common TV signal is 576p25, and they need something to improve the visual perception for the viewer!
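For context, the "1% lows" invoked here are percentile figures that benchmarking tools compute from logged frame times; they aren't something the monitor itself produces. A minimal sketch of the calculation, assuming frame times captured in milliseconds (tools differ on the exact definition; a plain percentile is one common choice):

```python
import numpy as np

def percentile_low_fps(frame_times_ms, pct):
    """FPS at the worst `pct` percent of frames: take the (100 - pct)th
    percentile of frame times and convert back to frames per second."""
    worst_ms = np.percentile(frame_times_ms, 100 - pct)
    return 1000.0 / worst_ms

# Hypothetical capture: mostly ~10ms frames (~100fps) with some stutters.
times = [10.0] * 980 + [25.0] * 15 + [50.0] * 5
print(f"average:  {1000.0 / np.mean(times):.0f} fps")
print(f"1% low:   {percentile_low_fps(times, 1):.0f} fps")
print(f"0.1% low: {percentile_low_fps(times, 0.1):.0f} fps")
```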
 
Soldato
Joined
19 Dec 2010
Posts
12,031
The frame rate on your gaming monitor is also not real. You have 0.1% lows and 1% lows, micro-stutters, etc.

TVs need it because the most common TV signal is 576p25, and they need something to improve the visual perception for the viewer!

What rubbish are you spouting now? Seriously? We aren't talking about frame rates; we are talking about refresh rates.
 
Soldato
Joined
26 Aug 2004
Posts
5,032
Location
South Wales
Are we expecting a proper generational jump this time, unlike the 2000 series from NV? A 3070 should pretty much equal a 2080 Ti, and with next-gen consoles launching some months later, I guess that would be a good time for that kind of performance.
 
Associate
Joined
1 Aug 2017
Posts
686
Something will need to be done about GPU prices when the next-gen consoles get released. I mean, the Series X GPU is apparently around 12TF; if true, that just makes the overpricing in today's market even more laughable.

Plus, with all the optimisations consoles have, I do think Nvidia/AMD will have a tougher time shifting mediocre £400/£500 cards.

Next year, Nvidia/AMD need to start offering incredible cards at very good prices; it's the only way they'll stop some people migrating en masse to consoles.

Personally, consoles aren't an option for me, so I'm hoping they both get their **** together with regard to pricing.
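For what it's worth, that 12TF figure is just theoretical FP32 throughput: ALU count times clock times two operations per clock (a fused multiply-add). A minimal sketch with the publicly reported Series X numbers plugged in as assumptions:

```python
def fp32_tflops(shader_alus: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: each shader ALU retires one fused
    multiply-add (2 floating-point ops) per clock."""
    return 2 * shader_alus * clock_ghz / 1000.0

# Reported Series X GPU: 52 CUs x 64 ALUs per CU at 1.825 GHz.
print(f"Series X: ~{fp32_tflops(52 * 64, 1.825):.2f} TFLOPS")  # ~12.15
```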
 
Soldato
Joined
16 Jan 2006
Posts
3,020
The frame rate on your gaming monitor is also not real. You have 0.1% lows and 1% lows, micro-stutters, etc.

TVs need it because the most common TV signal is 576p25, and they need something to improve the visual perception for the viewer!

go and have a lie down
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
In fairness, he is completely correct: Nvidia did intentionally lower the image quality of their drivers to get an FPS advantage over ATi. The bit he's missing, however, is that it isn't 2006 anymore :D

These examples are much fresher, though :D

I just upgraded to an MSI GTX 760 OC and was quite giddy to install and test it out. However, before I uninstalled my old Radeon 6870 drivers I was running a beautiful 1920x1080 picture on my 37" LCD TV. Once I uninstalled my driver and then installed the 326.80 driver for the 760, I noticed two huge issues: 1. Overscanning: the sides and top of my screen no longer fit the given image. 2. Very poor quality: all text was very blurry, and all lines were no longer crisp but slightly pixelated and blurry, just like the text. Overall it hurt my eyes to look at for a long period of time.

It's not pixelated, it's just dull... like there's no contrast and the screen is washed out.
https://www.nvidia.com/en-us/geforc...ow-quality-image-after-switched-from-old-ati/


I have an Nvidia Asus ROG GeForce GTX 1080 Ti with all of my settings maxed out, and I've noticed that colors look washed out and a lot of detailed texture is missing in many games I've played. I also have an XFX Radeon Fury X, but when playing the same games on that, the graphics quality looks so much better on the Radeon than on my GeForce. I have switched monitors around, I have switched cards around in both of my rigs, and I used DDU for a clean install of the latest GeForce drivers, and my GeForce graphics still look like crap. Is something wrong with my card?
https://forums.tomshardware.com/threads/nvidia-geforce-1080ti-poor-graphics-quality.3293637/

go and have a lie down

What is "lie down"?
 