
Opinion - Why are we sold stuff we don't need?

I think we are reaching a plateau in PC GPU and CPU power from a gamer's perspective. Due to software production costs most games are going to be multi-platform, which to a large extent means coding games with the least powerful platform in mind, with only limited time spent on performance enhancements for the other platforms.

Take your mind back to 2005/2006, when the 360 and PS3 came out. Both shipped with a derivative of the previous generation's top GPU (the 7800 and X1900). If I remember correctly, shortly after the consoles' release the 8800 GTX came out, which was about twice as powerful as the 7800. In reality, any gamer with an 8800 GTX back then could easily play any console port on a PC if they were willing to use similar settings, i.e. 720p, medium textures, poor draw distance, etc. In fact, I bet an 8800 GTX would easily outperform both the Xbox 360 and PS3 in Skyrim today.

If the rumours are true, the GPUs in the next-gen consoles are going to be based around the AMD 6750, which is probably at least five or six times less powerful than the top-end GPU available now. The consoles are due out late next year/early 2014; by then we will probably have a single GPU twice as fast as the 680/7970, meaning PC graphics cards will be roughly ten times faster than what is available in the consoles.
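Just to show where the "ten times" figure comes from, here is a rough back-of-the-envelope sketch using the post's own assumptions (the five-to-six-times deficit and the 2x generational jump are guesses, not benchmarks):

[code]
# Rough sketch of the ratios quoted above; all inputs are assumptions from the post.
console_vs_todays_flagship = 1 / 5.5   # next-gen console GPU assumed ~5-6x slower than a 680/7970
flagship_growth_by_2014 = 2.0          # assume the 2013/2014 single-GPU flagship is ~2x a 680/7970

pc_vs_console = flagship_growth_by_2014 / console_vs_todays_flagship
print(f"Projected PC flagship vs next-gen console GPU: ~{pc_vs_console:.0f}x")  # ~11x
[/code]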

I cannot really see any console ports of the future pushing a modern GPU to its limits, meaning there is no real need for faster "gaming" PCs. The only need I can see for these uber GPUs is ultra-high-resolution multi-display setups, or if (heaven forbid) some developers start making PC-only titles which make the most of the power available.

I think both AMD and Nvidia know what's coming, and they are artificially slowing the performance increase each new generation brings to try and maximise profit.
 
they want us to have.

Doesn't work like that!

Oh, but it does. Nobody ever wanted any i-device. We were sold i-devices so hard that we convinced ourselves we desperately wanted them.

Coca-Cola vs Pepsi: back in the old days, blind taste tests were done and something like 95% of America's population preferred Pepsi to Coke, yet because of Coke's 100-year-long marketing, when the results were revealed some people still bought the Coke, even though they actually preferred the Pepsi.

Coke then brought out "New Coke" and again people loved it in blind taste tests, but once the names were revealed, nobody liked it, simply because it wasn't the Coke they had been told was the best drink around.

My point is, we are told what to buy, we listen, and then we justify our purchases.

Let's move on to the "pointless crap".

There's no point releasing a product aimed at a reasonably large market and then leaving out the features needed to satisfy every demographic of your target market.

Let's take cars as an analogy. Lamborghini want to sell us some new [insert spazzy Italian name here] car, and they want all the wealthy in the world to buy it. It's a small market as a percentage of everyone who buys or will ever own a car. Suppose they decide to keep the driver's seat on the left, with the touchscreen control pod thingy and all the writing in Italian. They reason that because most Italians think the English options in the car are added crap they'll never use, they can get rid of them, cutting off a huge segment of their market, and use the time saved to chase another 2 mph of top speed.

So how does this relate to graphics cards? Well, if Nvidia removed CUDA cores, what would happen to the people who use the cards for video editing? They'd suffer a great deal, as they may have stuck with Nvidia over AMD simply because of that feature. What one person deems stupid, another will value greatly. You said that the 2600K is all you need for gaming, which is true. But what if you're like me? I love flight sims and my 2500K simply couldn't cope. My 3930K pwns flight sims, and I use it for more mainstream gaming like BF as well. There is a segment of the market who don't want to pay £9,000,000 for a Xeon but would also like more power than a 2600K, so adding features to products that already exist means you cater for another market segment.

For example, someone who enjoys gaming but also needs to do video editing will ideally have a GTX card (for CUDA) and an Intel six-core. Both CPU and GPU serve the main purpose, gaming, but they also have the capability to serve the second purpose, video editing.

Finally, hand-me-down culture. It exists in everything!
 
tl;dr

We need server markets and higher tier environments to progress the consumer kit.

The same happens in many, many fields. Cars, for example: lots of tech driven by the motorsport industry, mass transport and so on.

There is no demand from the mass of tech consumers outside the enthusiast groups to drive tech forward. I would hazard that a very large percentage of desktop owners could do 100% of their tasks on five-year-old tech and not care about, or even notice, the difference compared to a new system.

Many things come from the server market. They start life as very expensive technologies, mature there, the cost comes down as they do, they cross the bridge into enthusiast circles, and then they become far more mainstream.

Gaming and enthusiast markets are tiny compared to the mainstream. Very, very highly focused products exist aimed squarely and ONLY at us. They also carry extreme price premiums.

We are lucky there is an enterprise market to prop us up. Without it we would be dead in the water.
 

I agree with the OP here. I believe that Nvidia and AMD use clever marketing to convince us to get the next big thing. Even though the games are perfectly playable, a lot of people seem to chase FPS: even if they are getting good FPS in game, if there is a card that offers 50% more, that's enough for people to buy it, whether they need it or not.
I also think games purposely ship with ultra-demanding settings that coincide with, and help push, these expensive cards out of the door.
Some of the effects look good if you look for them, but some would make little difference to the gaming experience if they were not there.
 
It's a business thing: it's cheaper for them to knock out a jack-of-all-trades part and sell it at a premium up front than to design discrete parts for everyone.

TBH, monitor tech holds us back almost as much as the consoles do. Sure, you can play games at low settings at 720p, but buy a big monitor, turn everything up to max and you get a better experience. I'd love a card that could push a 1440p/1600p monitor at 120Hz, but no one makes one. If content distribution for TV moved fast enough, the consoles would have to keep up, which would push PC games along as well.
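For a sense of the scale involved, here is a quick sketch of the raw pixel data rate (just width x height x refresh x bits per pixel; it ignores blanking intervals, link overheads and which connector you use): 1440p/1600p at 120Hz needs roughly three to four times the raw bandwidth of 1080p60, which goes some way to explaining why nobody builds for it.

[code]
# Raw uncompressed pixel data rate; ignores blanking and interface overheads.
def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"1080p @  60Hz: {raw_gbps(1920, 1080, 60):.1f} Gbit/s")    # ~3.0
print(f"1440p @ 120Hz: {raw_gbps(2560, 1440, 120):.1f} Gbit/s")   # ~10.6
print(f"1600p @ 120Hz: {raw_gbps(2560, 1600, 120):.1f} Gbit/s")   # ~11.8
[/code]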
 

Forgive me if I'm wrong, but surely 3840x2160 would be the next logical step to take in terms of resolution, as you simply double everything? That was my understanding of why the new iPad had that res, anyway.
 

LOOOOL :D

The iPad does not have a 4K/Quad Full HD resolution... :p

It has a 2048x1536 res, which is 1024x768 doubled in each dimension.

There are already monitors which support 4K resolutions but it is not going to be mainstream for a very very long time. It's really not needed in computing for the consumer.

There's no point stuffing that res into a desk-sized package you view from one to two feet away. Even 720p content is stupidly sharp on a 1080p panel. Imagine 4K content on a 30" monitor... it's just silly really (as far as the regular consumer goes). Awesome pixel density, sure, but a very expensive way of going about it for what real benefit?
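To put rough numbers on the pixel-density point (a quick sketch; the panel sizes are just the ones mentioned in the thread):

[code]
# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f"9.7-inch iPad, 2048x1536: {ppi(2048, 1536, 9.7):.0f} ppi")   # ~264
print(f"30-inch 2560x1600       : {ppi(2560, 1600, 30):.0f} ppi")    # ~101
print(f"30-inch 3840x2160 (4K)  : {ppi(3840, 2160, 30):.0f} ppi")    # ~147
[/code]

So even a 30" 4K panel would sit around 147 ppi, still well below the iPad's density, and at desktop viewing distances the question of real benefit stands.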
 
So many GTX 580s are up for sale at the moment. There are also a good number of 7970s and 7950s. Why? Because people are stupid. It's their stupidity that is holding us back in so many ways it's not funny.


And you can now buy a 570, a really good card, for a snip over £200.

Is a 680/7970 worth more than double that?


No way.
 

What do you expect when people looking at £200 cards are being pointed towards £400 ones to play games at 1920x1080?
 
Pointless thread really... you could ask why people buy a Ferrari when they could invest in a Mercedes-based Smart car. Both do the same job ultimately, getting you from A to B, and both inherit technology featured in F1 at some point...
Answer: because people want to get from A to B faster, and to upgrade on the feature set of their previous product.
The gaming market surely isn't the only market that AMD/Nvidia aim products at, after all, and technologies such as DirectCompute benefit people building a home theatre just as much as faster FPS benefits gamers...
 
There are already monitors which support 4K resolutions but it is not going to be mainstream for a very very long time. It's really not needed in computing for the consumer.

In the last two years they've already gone from $70K to around $20K, and more manufacturers are talking about releasing them this year.

I think "very very long time" is over exagerating a bit, that suggests 10 year timescale at least, I can see these being a lot more affordable in the next 2-3 years

It depends to an extent what you consider "mainstream"... 3D TVs are plentiful and relatively cheap, but not a high percentage of households have one or actually watch 3D content (because there isn't a great deal of it).
 
Andy, I believe you're wrong about PhysX and your statement that we don't want or need it.

I concede that support for it in games today is poor, and I can therefore understand the position that maybe we don't need it today. However, real physics implementation in games (in real time) requires SERIOUS multi-threaded computation power that simply cannot be delivered by today's CPUs.

The "physics" that we see in today's games (ala BF3) is pretty basic stuff, but there's nothing really happening there; tiny bit of destruction, but all falling into pre-defined pieces the same way every time and then a couple of bits of junk that have no mass and your character can kick about as if they were made of air.

Real physics computation would deliver far more realistic destruction effects, calculated on the fly, taking into account the mass of the projectile striking a surface or structure, the point and angle of impact, and the mass/strength of the surrounding structures. This would produce a different (and unrepeatable) result every time a building is destroyed.

Another example is the implementation of soft fabrics (e.g. cloaks) and how they move around a character and interact with the environment. What you see in games today is pretty much all pre-programmed.

Take another example: long grass. As your character moves through long grass, to create truly realistic visuals the motion of every blade of grass, and how the character's movement affects it, needs to be calculated.

Yet another example would be calculating ricochets of projectiles off vehicle armour or buildings. Currently those projectiles simply disappear from the game environment, but in reality they don't.
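To give a feel for the scale being argued here, a crude sketch (the fragment counts are invented for illustration, and real engines use broad-phase culling to avoid the full pairwise cost):

[code]
# Naive pairwise collision checking between debris fragments scales as O(n^2).
FPS = 60

for fragments in (100, 1_000, 10_000):
    pair_checks = fragments * (fragments - 1) // 2
    print(f"{fragments:>6} fragments: {pair_checks:>11,} pair checks/frame, "
          f"{pair_checks * FPS:>14,} per second")
# 10,000 freely simulated fragments is already ~3 billion checks per second,
# before the maths of each individual contact is even touched.
[/code]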

Sure, you can say we don't need it, but from the same standpoint you could argue that we don't need DirectX 11 (or 10, 9, 8, etc. for that matter!). Games developers are in pursuit of photorealistic (or video-realistic) gameplay environments, and truly achieving this requires a proper implementation of physics.

This is why PhysX is NOT a lost cause. Nvidia are just a bit ahead of the curve, and implementing this level of detail requires more and more investment from the games studios, but it will happen. Gaming is now a bigger industry than the movies, and this will ultimately result in bigger and bigger budgets for new game development.

We WILL see proper physics implementations, and we WILL need Nvidia's (and AMD's) physics engines to deliver these effects.
 
Well, with regard to the OP, you said in it "let's first look at CPUs" and then spent about four lines talking about them. Then you "move on" to graphics cards and spend paragraphs and paragraphs on the subject. I think you should have kept this just about graphics cards, seeing as you are talking about gaming.

I was going to make a thread on this issue myself, and may still do so; a lot of people on these forums think that we all buy computer hardware with the sole intention of gaming.
I think in reality this is not the case. I see so many threads/discussions where people "spec" a build for someone, or give advice, and they default to ideas based on the assumption that the machine will "obviously" be used to play the latest games.

The most common thing I see is someone asking what spec of CPU to get, and right now it's often a choice between going all out on an i7, a little less for an i5, or less again for an i3. A lot of responses will come in like:

"i5 is where it's at, and all you need for gaming".

That's nice. I would put gaming as about the least important thing I do on a computer. I do game, yes, but I evaluate purchase decisions based on ALL the usage that component is likely to get.

Yes, this is Overclockers UK, but the forum has gained an audience and membership that goes way beyond niche overclocking users (a point you discussed repeatedly in the OP), and it's not always about gaming. Granted, it's not always about maxing a CPU out, or about whether you use 16GB of RAM or 2GB. My point is, different strokes for different folks. The OP is very gaming-specific.

As a general argument I somewhat agree with many of the points, though. Current business models that bring "new features" we as consumers do not need are becoming more and more out of touch. Getting back to basics and producing the best bang-per-buck product that does the essential things really well is a big ask, and not one that many manufacturers seem to be attempting.

TV panels, for example, these days have compromised picture quality, with uneven backlighting, often because of the need to cram in marketing features such as 3D (requiring filters) and networking functions (that are in their infancy and poorly implemented). It is frustrating as a consumer that we never seem to get what we really want, especially in electronics/hardware.


So yes, I agree with the OP to an extent, but some people DO want some of these features for their usage. It's not really a simple discussion unless you pick a particular product. You seem to focus on graphics cards, and in this respect I mostly agree with what has been said.

:)
 