4870 and 4850 info

I think AMD were pretty honest in admitting with the R600 that it wasn't able to compete against the 8800 GTX, and all the stuff coming out around then was all bad news. This time around it's pretty much the opposite, with some articles going as far as to say it's Nvidia that are worried, not AMD, so fingers crossed.
 
And the other worry now, from that linked article, is that the price of the 4870 keeps going up.

Doing the usual conversion, you might now be looking at £250-£300 for the 4870, so you'd really want that 40% performance lead over the 9800 GTX.
 
Yeah, you are being pessimistic, as those are not the top-of-the-range AMD cards.

The 4870 is the top-of-the-range "single GPU" card though, isn't it?

You can't count the 4870 X2, as you might as well buy two 4870 cards in CrossFire.

Plus, with the 4870 looking more like £250-£300, you can expect the 4870 X2 to be in the £400-£500 range as well.
 
Since when did $349 convert to £250-£300? By my calculations, based on previous prices, that probably comes to around the £200-£230 mark. I just don't think AMD will do that though, as it may pull it too close to the GTX 260 price range. Unless the performance of the 4870 is pretty close to the GTX 260, I can't see them shooting themselves in the foot by having the prices so close.
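As a rough sketch of that conversion (the mid-2008 exchange rate of about $1.97 to the pound and the 17.5% UK VAT rate are assumptions, not figures from the thread):

```python
# Rough USD-to-GBP launch-price estimate. The exchange rate (~$1.97/GBP
# in mid-2008) and the 17.5% VAT rate are assumptions for illustration.
def estimate_uk_price(usd_msrp, usd_per_gbp=1.97, vat=0.175):
    base = usd_msrp / usd_per_gbp        # straight currency conversion
    return base, base * (1 + vat)        # (ex-VAT, inc-VAT) in pounds

ex_vat, inc_vat = estimate_uk_price(349)
# lands around the low £200s inc VAT -- nearer £200-£230 than £250-£300
```

On those assumptions, a $349 MSRP comes out well under the £250-£300 being floated above.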
 
The 4870 is the top-of-the-range "single GPU" card though, isn't it?

You can't count the 4870 X2, as you might as well buy two 4870 cards in CrossFire.

Plus, with the 4870 looking more like £250-£300, you can expect the 4870 X2 to be in the £400-£500 range as well.

I highly doubt the 4870 is going to be anywhere near that price.

My guess would be £200 tops.

I have it on good authority that the 4850 will be ~£130 on launch, so I can't see the 4870 being double the price...
 
The 4870 is the top-of-the-range "single GPU" card though, isn't it?

You can't count the 4870 X2, as you might as well buy two 4870 cards in CrossFire.

Plus, with the 4870 looking more like £250-£300, you can expect the 4870 X2 to be in the £400-£500 range as well.


If you read the article about AMD changing the way their X2 cards work to get rid of micro-stuttering, then no, two cards won't be better than the X2, as they're not sure the issue is fixed when using separate cards.
 
No, I'm saying that people won't get excited over a new card if it can't run the previous best-selling game and graphics showcase properly, that's all. Personally, I don't rate Crysis... but it's not just Crysis...

The fact is, when the 8800 series came out, there wasn't a game that wouldn't run at the highest resolutions... so everyone KNEW they were getting a massive leap in performance. This time around, to be told we're getting a 40% increase in games with the new cards, to me, isn't enough... which is why I bought another 8800 GT to tide me over for now.

However, to get a 100% increase in Crysis and still be getting 16fps is pathetic, simple as that. People want to know that when spending upwards of £500, they're going to get that 500% increase, as then they'll know that ATI/Nvidia have actually looked at a game and thought, hold on, we need something powerful enough to run it... not just "we'll hold back on the tech and churn out a load of new cards that aren't as fast as they should be".

Bear in mind, COD4 doesn't run tip-top at 1920x1200 with 4xAA and 8xAF on a single card... yes, playable and smooth, but not silky. What happens with COD5 and 6 when they come out, if the cards are already playing catch-up? DiRT is another example, Hellgate: London is another... I could go on, but these games HAVE TO RUN significantly faster!

To assume MOST people don't care about Crysis is blinkered, because Crysis IS the graphics benchmark, and I'm sure there are a lot of people out there who love the game, OTHERWISE IT WOULDN'T HAVE BEEN A NO.1 BEST SELLER, WOULD IT?

:confused: It wasn't a number-one best seller though; Crytek have been moaning for a while about how little Crysis sold due to piracy. Graphics cards aren't made for one game, and as people keep saying, Crysis is badly optimised, so what do you expect? As for Hellgate: London, there's no real reason that doesn't run fast; it looks like crap. HL2: Episode Two looks better than Hellgate, and runs faster too.
 
Yeah, but on that basis a lot of other games are badly optimised too, which would explain their poor framerates. And what makes you think future games are going to be any better coded? Some are (COD4), but a lot aren't, because it's cheaper to get us to buy better hardware than for the developers to spend months improving the code.

For example.

The Witcher - great game, great(ish) graphics.

At 1680x1050 on my PC (as per sig), with everything on the highest settings, I get 60fps with 16xAF and 2xAA.

If I change to 16xAA, I get 8-15 fps.

So a 40% boost would get me roughly 11-21 fps - still not playable with everything on max.
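A minimal sketch of that framerate arithmetic (the 8-15 fps figures are from the post above; the 40% uplift is the rumoured generational gain, not a measured one):

```python
# Apply a flat percentage uplift to a baseline framerate.
# The point: a 40% boost on an unplayable baseline is still unplayable.
def scaled_fps(fps, uplift=0.40):
    return fps * (1 + uplift)

low, high = scaled_fps(8), scaled_fps(15)   # the 16xAA range from the post
# roughly 11-21 fps -- still well short of a steady 30 fps
```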

A lot of people game at even higher resolutions.

I could go on and on with a list of games which won't run at playable framerates with 16xAA/AF at 1680x1050 (Supreme Commander turns into a slideshow).

Yes, "turn down the AA", I hear you say, but why should we? It's there, so is it too much to expect the next-gen cards to play all the games of the last six months at max quality and 30+ fps?

Apparently so.

Yes, any increase is welcomed but 40% is not enough.
 
People have too-high expectations... How long did it take for Far Cry to run at 1920x1200 with 16xAF and 4xAA? The first two generations of DX9 cards couldn't handle it, and the third generation is debatable (I don't know ~_^).

Hell, my overclocked 3850 512MB doesn't always stay above 60fps, and that's in Far Cry! A years-old DX9 game!
 
I knew somebody would post that!

Yes, I do but it's there so why can't we have a card which can use it?

You can use it, just not on the very latest games.

And it's mainly there due to ATi/nVidia one-upmanship.

If ATi had it and nVidia didn't, they'd forever flaunt it as proof their card was far superior, etc.

Anyhow, 16x can be used; you can quite easily play most Source-based games using 16xAA on top-end cards without it coming to a halt.
 
The best-coded game I've seen in the last 12 months would have to be UT3... simply stunning to look at, running at 1920x1200 with all filtering on, although I doubt even I could run 16xAA and 16xAF... I think I run 4xAA and 8xAF at the moment... I'll have a look tonight, methinks...

Cheers Pug
 
Everybody seems so quick to say that a 40% improvement over a GT is not enough. When you consider that the 3870 is 10%(ish) behind the GT, for ATI to improve the 3850's successor (the 4850) by 50% over their previous best single-core solution (the 3870) seems pretty admirable to me.

To put this into context: for 3850 money you can have a 4850 which beats the 3870 by 50%, give or take; I think it's a step in the right direction. So it's not like back in the day when Nvidia had their GeForce 4 Ti series and ATI came along and spanked them with the 9700 Pro, but those kinds of breakthroughs don't happen very often or overnight. If the video card companies go back to their six-month development schedule and carry on releasing steady improvements for the same money, then I think most will be happy.
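Putting numbers on that comparison (a back-of-the-envelope sketch using the rough percentages quoted above, not measured benchmarks):

```python
# Relative performance, normalising the 8800 GT to 1.00.
# The percentages are the rough figures from the post, not benchmark data.
gt = 1.00
hd3870 = gt * 0.90        # the 3870 sits ~10% behind the GT
hd4850 = hd3870 * 1.50    # the 4850 claimed ~50% faster than the 3870
# hd4850 works out to 1.35, i.e. ~35% ahead of the GT -- close to the
# "40% over a GT" figure being argued about in the thread
```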

Finally, Crysis... it makes me laugh how quick so many people are to use it as the be-all and end-all of PC gaming. Personally I thought it had a poor storyline and poor gameplay, and was boring. I much prefer games such as COD4, which is far less graphically intense but a better game all round. Crysis lasted all of about four hours of gameplay before I binned it; GRID has been minimised for four days, with me playing whenever I get the chance. A different genre of game, but also a different class of game.
 
I was playing CS:S on the OkUK server last night, at 1920x1200, 16xAA and 16xAF, and was getting 140fps minimum... I love it, lol... although, yeah, I know it's years old, but I tell you what, it still looks nice, especially Dust with HDR enabled... and runs like a dream... oh, if only everything looked as pretty :-) I'm sure in four years' time we'll have Crysis running like that... maybe?

Cheers Pug
 
You can use it, just not on the very latest games.

And it's mainly there due to ATi/nVidia one-upmanship.

If ATi had it and nVidia didn't, they'd forever flaunt it as proof their card was far superior, etc.

Anyhow, 16x can be used; you can quite easily play most Source-based games using 16xAA on top-end cards without it coming to a halt.

Agreed, and I wasn't saying ALL games. Source and UT3 are both graphically pretty and well coded.

However, there are lots of top games which aren't. My point is that maybe half of games are well coded and done properly; the rest rely on brute-force graphics cards.
 
To put this into context: for 3850 money you can have a 4850 which beats the 3870 by 50%, give or take; I think it's a step in the right direction. So it's not like back in the day when Nvidia had their GeForce 4 Ti series and ATI came along and spanked them with the 9700 Pro, but those kinds of breakthroughs don't happen very often or overnight. If the video card companies go back to their six-month development schedule and carry on releasing steady improvements for the same money, then I think most will be happy.
BUT, neither ATi nor nVidia have released new technology since the 8800 series, and how long ago was that released? 18 months, is it, maybe more in actual dev time? 18 months' work for 40%... I'm still not convinced.

I agree with what you say about Crysis, it's your opinion, but you can't deny that it IS THE graphical benchmark for all games, and THUS it's obviously going to be pointed to as a good benchmark of GPU performance... if this new generation still can't run it, what else are they going to struggle with in the coming six months?

Right, better get doing some work (as I'm at work)... otherwise I won't be able to afford an nVidia 5200, rofl

Cheers Pug
 