
5770 = 4770?

Well, I got to thinking: 40nm? Isn't that the same process as the 4770? And the similarities don't end there:

Performance similar to the 4870 (with overclock and more VRAM)
Very Similar Length
GDDR5

Apart from the obvious differences of DX11 and native HDMI it's pretty much the same chip... what I want to know is what makes it worth twice the price?
 
But the problem is that performance isn't that much better than the 4770's, and what gain there is comes down to a higher stock clock and more memory. Essentially it's the same card.
 
It ranges from neck and neck to 20% faster. There isn't much in it, no. But then cards are always closer than some people like to admit - a 5850 is a smidge under twice the price. Is it twice the performance? Nope, not close.
 
But the problem is that performance isn't that much better than the 4770's, and what gain there is comes down to a higher stock clock and more memory. Essentially it's the same card.

4770: 640 shaders, 16 ROPs, DX10.1, mostly 512MB versions around, limited overclocking on the early 40nm process.

5770: 800 shaders, 24 ROPs, DX11, 1GB, much higher-clocking memory, much faster, HUGELY higher minimum framerates; it's a hugely better card. While it often has a lower max/average framerate, it's always got a higher minimum framerate than the 4770/4850, often very, very close to the 4890 and the 275/285GTX, and almost always higher than the 260GTX, and that's what's important. Especially since most LCD users run vsync: getting 80 or 100fps doesn't matter beyond dragging the average framerate up in benchmarks, but when the minimum is 30fps on a 5770 or a 4890, the two will feel the same.

If you've got £100 or more to spend, anything but a 5XXX series is money wasted.

http://www.bit-tech.net/hardware/graphics/2009/10/13/amd-ati-radeon-hd-5770-review/6

Look at its minimum framerate in Stalker: ahead of or on par with a 275/285/4870. While its max fps is 30% or so off the 4890's, its min fps is much, much closer. Average framerates have never been a good indication of how a card feels compared to another, but minimum framerates are key to how good a card is, and the 5770 frequently shows up much bigger and more expensive cards. A 4890 is some 20% more expensive, and that might be a fair estimate of the performance gap. Depending on how high the memory clocks, it might close that gap a lot too. Then factor in that it's new (no final drivers yet), and that it will only gain performance and/or IQ in DX11 games, and you have two cards you'd be very hard pushed to tell apart in most games, but one is more future-proof, cheaper, and much more likely to make gains with better drivers.
 
:confused:

The 4770s are just under 4850 performance; the 5770 is around the same performance as the 4870. As the 4870's next-gen replacement, and cheaper than it too, the 5770s are not only worth buying over those (4870s), they're also worth buying over the slower 4770s too.
 
[Attached image: 4770vs57702.png]



That's better lol.



So, what's your definition of 'massively faster' then? 45% is a big percentage, but is 34fps a lot faster than 24fps? Because that's a 41% increase... I say no.

Do any of those look like HUGELY higher minimum framerates? Nope, except Crysis at 1920x1200 4xAA... a whopping 79% faster equating to... 16fps... :o
 
That's better lol.



So, what's your definition of 'massively faster' then? 45% is a big percentage... but is 34fps a lot faster than 24fps? Because that's a 41% increase... I say no. Do any of those look like HUGELY higher minimum framerates?

24fps to 34fps is a big jump, because you will notice it, unlike a jump from 50 to 70. However most people, well, people that aren't braindead, get a card to play at higher details/res and not to increase framerates.
 
Going from 50-70 will enable you to run vsync, which combats tearing.

Buttery smooth 60fps with no tearing is noticeable, let me tell you!
 
24fps to 34fps is a big jump, because you will notice it, unlike a jump from 50 to 70. However most people, well, people that aren't braindead, get a card to play at higher details/res and not to increase framerates.

True, and if you want to look at it like that, neither card is really fast enough at 1920x1200 if you are buying a card for that reason, unless you don't want to run AA... in which case the 4770 still isn't quick enough for those games.


The differences aren't as large as people like to make out.
 
Code:
card    price    price vs 4770    perf. increase    increase per price %
4770    £65      100.00%          --.--%            -.--
5770    £110     169.23%          39.15%            0.23
5850    £200     307.69%          109.64%           0.36

Interesting... did a few sums, and performance-wise the 5850 is better value for money than the 5770. But we knew the 5770s were on the expensive side anyway. Either way though, that's a long way from a 1:1 return on money spent :p It's only based on a few games, so it's to be taken with a pinch of salt naturally :)
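A quick sketch of those sums in Python, using the prices and performance-increase figures from the table above (the value column is performance gained per point of relative price, which is how the 0.23 and 0.36 figures fall out):

```python
# Value-for-money sums from the table above: price relative to the 4770,
# and performance increase per point of relative price.
cards = {
    # name: (price in GBP, % performance increase over the 4770)
    "4770": (65, 0.0),
    "5770": (110, 39.15),
    "5850": (200, 109.64),
}

base_price = cards["4770"][0]
for name, (price, perf_gain) in cards.items():
    rel_price = price / base_price * 100   # price as a % of the 4770's
    value = perf_gain / rel_price          # perf gained per price %
    print(f"{name}: {rel_price:.2f}% of 4770 price, "
          f"+{perf_gain:.2f}% perf, value {value:.2f}")
```

Which reproduces the table: the 5770 lands at 0.23 and the 5850 at 0.36, so by this metric the 5850 is indeed the better buy per pound.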
 
That's better lol.



So, what's your definition of 'massively faster' then? 45% is a big percentage, but is 34fps a lot faster than 24fps? Because that's a 41% increase... I say no.

Do any of those look like HUGELY higher minimum framerates? Nope, except Crysis at 1920x1200 4xAA... a whopping 79% faster equating to... 16fps... :o


Errm, actually yes, those do look like hugely higher minimum framerates. Insanely higher, for an OP that said the cards perform the same. What's the average, 43%, at 1680x1050, probably the most-used res for this kind of card? That's god damned massive.

People go on about liking a 60fps average, which is all well and fine, but people like that number because the minimum might be 30fps, whereas a 50fps average on a lesser card might come with only a 22fps minimum.

The 50/60fps won't feel very different at all; the 22/30fps is the difference people notice, and yes, that difference is entirely massive, especially for cards the OP claimed were "almost the same". That 8fps difference could easily be the difference between pretty smooth and completely stuttery and horrible, while the difference between 60fps and 100fps might be completely invisible.
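One way to put numbers on why the low-end gap is the one you feel: frame time in milliseconds is just 1000/fps, so the same-looking fps gaps translate into very different per-frame delays (a rough sketch of the arithmetic, not from any benchmark in the thread):

```python
# Frame times: the same fps gap is a much bigger time gap at low framerates,
# which is why minimum-fps differences are felt and average ones often aren't.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given framerate."""
    return 1000.0 / fps

for low, high in [(22, 30), (50, 60), (60, 100)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low}fps vs {high}fps: {frame_time_ms(low):.1f}ms vs "
          f"{frame_time_ms(high):.1f}ms per frame ({gap:.1f}ms difference)")
```

The 22fps vs 30fps case is a gap of roughly 12ms per frame, while 50fps vs 60fps is only about 3ms - which matches the point that the 8fps difference at the bottom end is the one you actually notice.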


I find it funny that someone could even ask, in a seemingly serious way, whether 45% is really a big increase, then claim they don't think it is.

The 285GTX is routinely claimed to spank, say, a 4870 even though it might only beat it in a game by 10% fps in total, yet a 45% difference is not that big?

Yes, sometimes you'll get bizarre numbers and still have an unplayable experience; 16fps is, in fact, a million times more playable than 9fps, but both still suck hard either way you look at it.

But still, remember this is a DX11 card, which will only increase that lead in the future, and even without DX11, better drivers could further extend it.
 
I find it funny that someone could even ask, in a seemingly serious way, whether 45% is really a big increase, then claim they don't think it is.

You must have an awful sense of humour then. What's 50% of 8? 4. Huge increase? No. 50% of 20? 10... nope. 40? 20... nope. 80? 40 - yes, getting there. Percentages mean diddly squat; it's what you end up with that's important, and it could be 500% faster than 8 frames a second... it'd still be a slideshow.
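The percentages-versus-absolute-frames point can be sketched in a few lines (the "+40fps gained" cutoff below just mirrors the judgments in the post above, purely as an illustration):

```python
# A percentage gain means little on its own; what matters is the absolute
# framerate you end up with.
def apply_gain(fps: float, pct: float) -> float:
    """Return the framerate after a pct% uplift."""
    return fps * (1 + pct / 100)

for base in (8, 20, 40, 80):
    result = apply_gain(base, 50)      # the same flat 50% uplift everywhere
    gained = result - base
    verdict = "getting there" if gained >= 40 else "nope"
    print(f"{base}fps +50% -> {result:.0f}fps (+{gained:.0f}fps: {verdict})")

# Even a 500% uplift on 8fps only reaches 48fps:
print(f"8fps +500% -> {apply_gain(8, 500):.0f}fps")
```

The same 50% multiplier adds anywhere from 4fps to 40fps depending on the starting point, which is exactly why quoting the percentage alone says so little.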

The 285GTX is routinely claimed to spank, say, a 4870 even though it might only beat it in a game by 10% fps in total, yet a 45% difference is not that big?
Not by me, I know better lol.
Yes, sometimes you'll get bizarre numbers and still have an unplayable experience; 16fps is, in fact, a million times more playable than 9fps, but both still suck hard either way you look at it.

Ah, so you do understand the concept then. Just another of the usual awkward-for-the-sake-of-it drunkenmaster replies! :p

Anyway, minimum framerates are an entirely subjective thing; you could argue all night long about it. I wouldn't buy either of the cards knowing I'd be greeted with those sorts of framerates, and that's what's put me off the 5770 - it doesn't cut it. It might, if I had a smaller monitor, or if I was happy gaming with details knocked back a notch here and there. But that's the sort of thing I'd do on my laptop, not my PC. If I'm going to upgrade from the card I just sold (a 4830), it's going to have to be better than both the 4770 and the 5770. Yes, the 5770 is 40% faster, but it's not enough, so it really makes little difference.
 
Umm, I think you're losing it to make a point. 500% faster than 8fps is 48fps and that isn't a slideshow...

Yes, if that's a minimum and not an average lol

If the 5770s were, say, £80 then they would make the 4770s obsolete. As it stands, you're paying 70% extra for a 40% increase in performance (roughly, of course), which IMO doesn't make it clear cut. But two of them at £80... that'd be great if I had a decent CrossFire motherboard (and not an IP35 Pro :p)
 