
AMD To Launch RV770 On June 18th

Oh come on, surely in 2 years they can muster more than that increase, and IMO they can. Nvidia's just milking everyone because AMD can't compete any more. Do you think that if AMD were able to compete, Nvidia wouldn't have much faster cards than their 8800s out? Of course they would, and everyone knows it. But with it only being a one-horse race now, it's stagnated to the point of death. I hope Intel can start to compete when they step into it, or it's just going to go on and on like the past 2 years, utter ******* crap. These GTX 260s/280s had better be a ******* good step up.
 
Oh come on, surely in 2 years they can muster more than that increase.

Wait until their actual high-end product is released (r700) before making judgements on performance relative to other high-end products.

And anyway, that's a reasonable opinion to have. But point-blank assuming that progress *must* continue at the same rate as previously, without understanding any of the complexities involved in the task, and assuming that if it does not then everyone involved has failed at their jobs? Well - that's pretty ignorant, don't you think? What qualifies you to make such judgements?
 
Wait until their actual high-end product is released (r700) before making judgements on performance relative to other high-end products.

And anyway, that's a reasonable opinion to have. But point-blank assuming that progress *must* continue at the same rate as previously, without understanding any of the complexities involved in the task, and assuming that if it does not then everyone involved has failed at their jobs? Well - that's pretty ignorant, don't you think? What qualifies you to make such judgements?

You're right, I don't understand; I just find it frustrating that in nearly 2 years it hasn't moved on one bit. :(
 
But what you don't tell people is that you still have an 8800GTS 320 in your rig :p

Well yeah, as I still had it when I had my GT; then I got shot of my GT, so I bunged it back in to play the odd game. But that's very rare these days, as even gaming's on a bit of a downer. :(

I just want a card that's worth an upgrade; I don't want a card that's only a few frames here or there faster than my previous one, or just the same card renamed and hiked in price.
 
Well yeah, as I still had it when I had my GT; then I got shot of my GT, so I bunged it back in to play the odd game. But that's very rare these days, as even gaming's on a bit of a downer. :(

I just want a card that's worth an upgrade; I don't want a card that's only a few frames here or there faster than my previous one, or just the same card renamed and hiked in price.

You'll be dead before you get a new card.
 
Oh come on, surely in 2 years they can muster more than that increase, and IMO they can. Nvidia's just milking everyone because AMD can't compete any more. Do you think that if AMD were able to compete, Nvidia wouldn't have much faster cards than their 8800s out? Of course they would, and everyone knows it. But with it only being a one-horse race now, it's stagnated to the point of death. I hope Intel can start to compete when they step into it, or it's just going to go on and on like the past 2 years, utter ******* crap. These GTX 260s/280s had better be a ******* good step up.

As much as he repeats it, Loadsa's got a point... That's why, IMO, the 9800GTX is a flop...

ATi I can understand: after the less-than-stellar success of R600, they are clawing back... nVidia, though, seems to be sitting around on their ***** doing what seems to be sweet nothing... 8800GS, anyone?
 
You're right, I don't understand; I just find it frustrating that in nearly 2 years it hasn't moved on one bit. :(

Neither do I, if it makes you feel any better :p


And yeah, it is annoying when progress starts to slow down, but in a way it's inevitable:

CPU development started to hit a wall 4 or 5 years ago, due to various practical limitations, so Intel and AMD decided to go for multi-core approaches instead. This is nice, as it allows them to keep up with Moore's law, but it certainly isn't as practical as having ever-doubling CPU speeds.

Similarly, GPU development is running up against problems with heat generation and power consumption (the two go hand in hand). It's a tough pill to swallow, but when you're performing massive numbers of calculations you run up against entropy issues: the faster you go, the harder you have to fight against the inevitability of heat generation. It's like wading through ever-deeper treacle.
 

Parallel vs. serial.

It is massively more complex to write a software application to run on multiple CPU cores. Why? Because communication between the different cores can have a huge cost in terms of latency. In simple terms, one CPU core is always going to be waiting for the other(s) to finish, so that it can update the global data sets and move on.

Some applications (like folding, for example) are very easy to parallelise, so a dual-core CPU is just as good as a single core at twice the speed. But these applications are rare. Game logic, for example, is extremely difficult to write to use more than one CPU core effectively. You can utilise multiple cores easily enough, but it might well be slower than using just one. Strategy games (where the main cost is lots of semi-independent AI units) are much easier to parallelise than (say) FPS games, where almost everything relies on everything else at all times.
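The trade-off above can be sketched with a little toy model (entirely my own numbers, purely illustrative, not measurements from any real game or folding client): Amdahl's-law-style scaling plus a fixed synchronisation cost per extra core.

```python
# Toy model of multi-core speedup: a fraction of the work parallelises
# perfectly, the rest stays serial, and every extra core adds a fixed
# synchronisation overhead (cores waiting on each other).

def speedup(parallel_fraction, cores, sync_cost_per_core=0.0):
    """Speedup over one core. Serial runtime is normalised to 1.0."""
    parallel_time = ((1.0 - parallel_fraction)        # serial part
                     + parallel_fraction / cores      # split part
                     + sync_cost_per_core * (cores - 1))  # waiting
    return 1.0 / parallel_time

# Folding-style workload: almost fully parallel, negligible syncing.
print(round(speedup(0.99, 2), 2))  # ≈ 1.98: nearly double on two cores

# Game-logic-style workload: half the work is serial and the cores
# spend a lot of time waiting on each other.
print(round(speedup(0.5, 2, sync_cost_per_core=0.4), 2))  # ≈ 0.87: slower than one core!
```

With heavy enough inter-core communication the "speedup" drops below 1, which is exactly the "might well be slower than using just one" case.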
 
So ATi have it harder making high-performance chips compared to Nvidia?

It depends how you look at that. In the opinion of the gamers who don't have a clue, they have a mountain to climb, but I actually think ATi has done themselves some favours (albeit unintentionally). They moved onto 55nm a while ago now; Nvidia is, in a word, plagued by 65nm, and they're trying desperately to get away from it.


I have said this a few times, but IMO it would seem that while ATi was down in the dirt, they did the groundwork for a comeback, and that R700 COULD be what finally shuts up those damn fanboys. I certainly hope so. R700 taking the performance crown is what will spark some rivalry again, and hence move the industry on (or, more importantly, appease loadsamoney xD).

Martyn
 
Not sure he means practical, actually; I think he means straightforward? Apologies duffman if I'm misquoting, but basically it's a bit of a cheat to double the transistor count by doubling the number OF CPUs rather than doubling the amount PER CPU.

Basically, the absolute best increase in speed you can get from doubling the number of CPU cores is a factor of 2, but in most practical cases it is almost impossible to get near this (and it may in fact be less than 1, i.e. slower). It also takes a lot more programming effort.

If you double the speed of a CPU *in serial*, then you get double the speed automatically, in every application.
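The two upgrade paths can be compared in a few lines (my own illustrative numbers again, assuming a task that is 60% parallelisable): doubling the clock halves the runtime of everything, while doubling the cores only helps the parallel part.

```python
# Compare doubling clock speed vs doubling core count for one task.

def time_on(cores, clock, parallel_fraction, work=1.0):
    """Runtime: the serial part runs on one core, the parallel part
    is split evenly across all cores; clock speed scales everything."""
    serial = (1.0 - parallel_fraction) * work
    parallel = parallel_fraction * work / cores
    return (serial + parallel) / clock

base = time_on(cores=1, clock=1.0, parallel_fraction=0.6)
faster_clock = time_on(cores=1, clock=2.0, parallel_fraction=0.6)
more_cores = time_on(cores=2, clock=1.0, parallel_fraction=0.6)

print(base / faster_clock)  # ≈ 2.0: every workload doubles
print(base / more_cores)    # ≈ 1.43: capped well below 2x
```

The clock-doubled chip speeds up any application by exactly 2x with zero programming effort; the dual-core one tops out lower and only reaches even that if the software is written for it.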


But yeah, Moore's law is like the Bible in chip development. They will do ANYTHING to keep up with it, for as long as possible :p
 
Well, I don't know about design or any of that stuff, but it seems to me you can only keep moving things ahead at a certain pace for so long before you hit the limit; maybe we're just closer to that than any of us think. Either way, I game with an old Nvidia 7-series card and plan to upgrade to a 4870, so even if you guys with later cards aren't impressed, I will be, so I'll be happy :D.
 