
2 Different HD4870X2's....

I thought a PCI-E 2.0 slot can supply more power to the graphics card?

The slot itself is rated at 75w whether the board is PCI-E 1.x or 2.0 — the extra headroom comes from the 6-pin connectors, so the budgets look like this:

HD4850: 75w PCI-E slot + 75w PCI-E 6-pin = 150w

HD4870: 75w PCI-E slot + 75w PCI-E 6-pin + 75w PCI-E 6-pin = 225w
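The arithmetic above can be sketched as a couple of lines of Python, assuming the PCIe CEM spec figures (75W from the x16 slot, 75W per 6-pin auxiliary connector); the function name is just illustrative:

```python
# Spec-rated power limits (watts): x16 slot and 6-pin PCIe connector.
SLOT_W = 75
SIX_PIN_W = 75

def board_power_budget(six_pin_connectors: int) -> int:
    """Maximum spec-rated power available to a card, in watts."""
    return SLOT_W + six_pin_connectors * SIX_PIN_W

budget_4850 = board_power_budget(1)  # HD 4850: one 6-pin connector
budget_4870 = board_power_budget(2)  # HD 4870: two 6-pin connectors
```

These are spec ratings, not hard electrical limits — as noted below, the card will draw what it needs and the cables can deliver more than their rating.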

However, as said before, the card will draw whatever current it needs. The cables are only rated for 75w of draw each but can supply much more.

Also, I don't see a 48xx series using 225w+ anyway really...

The problem with the 4850 is the power regulation and stability on the card itself, not the actual power supply to the card.
 
That's incorrect. A 256-bit bus is a joke, and even GDDR5 can hardly save it.

GTX 280 memory bandwidth = 141 GB/s

4870 memory bandwidth = 115 GB/s


ATI should have given the 4870 a 384-bit bus like the 8800GTX had.

As for the 4850 with its 256-bit bus and GDDR3, it only has 63 GB/s. With that very low bandwidth and only 512MB, the card is definitely not a good idea for high-res monitors.
My GX2 had similar memory specs but a little more bandwidth (70 GB/s), and high-res gaming on that was not a nice experience. My 280 is such a big improvement.

GDDR5 makes up for the 256-bit bus: some 4870s are hitting 4400+ effective memory speeds for over 141 GB/s, all from a card currently available for £176.
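The bandwidth figures being thrown around all follow from bus width times effective data rate. A quick sketch, assuming the stock 4870 runs its GDDR5 at 3600 MT/s effective (the function name is just illustrative):

```python
# Peak memory bandwidth = bus width (in bytes) * effective data rate.
def mem_bandwidth_gbs(bus_bits: int, effective_mtps: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * effective_mtps / 1000

stock_4870 = mem_bandwidth_gbs(256, 3600)  # ~115 GB/s at stock
oc_4870    = mem_bandwidth_gbs(256, 4400)  # ~141 GB/s at 4400+ effective
```

This is why a 4400+ overclock on a 256-bit GDDR5 bus lands in the same bandwidth territory as a stock GTX 280's wider but slower 512-bit GDDR3 setup.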

What I really want to see now are some proper benchmarks between an o/c 4870, a stock 280 and an o/c 280:

http://enthusiast.hardocp.com/article.html?art=MTUyNCw4LCxoZW50aHVzaWFzdA==

http://enthusiast.hardocp.com/article.html?art=MTUyNCw5LCxoZW50aHVzaWFzdA==
 
I always like to game at native 2560x1600 res, and any 512MB Nvidia card just doesn't cut it

Fixed.

As shown above, the 4870 with 512MB of RAM is within 90% of the GTX 280 at 2560x1600 with AA and AF.

[edit]

Since Grid was mentioned, let's look at the performance:

[chart: GRID benchmark results (grid2sc7.png)]


Yep, the extra cash for the GTX 280 is so worth it... Though you can see the 256-bit bus 512MB Nvidia cards drop hard once AA+AF is applied at that res, while the 4870 carries on just fine.
 
Why would it cripple a 4870X2? I thought the only problem would be if you were going to use CrossFire on a P35 board, as it only supports 16x/4x?


Anyone?

P35 motherboards not only run CrossFire at 16x/4x, they are also limited to PCIe 1.1, which is half the speed of the PCIe 2.0 featured on X** series motherboards. So cards which load the PCIe interface heavily, like the X2 cards, take a performance hit. (This is what I have been told, as I do not have the cards myself to test.)
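The gap between those link configurations is easy to put in numbers, assuming the usual approximate spec throughput per lane per direction (about 250 MB/s for PCIe 1.1 and 500 MB/s for PCIe 2.0, after 8b/10b encoding overhead); the function name is illustrative:

```python
# Approximate usable throughput per lane, per direction, in MB/s,
# after 8b/10b encoding overhead (spec figures, rounded).
LANE_MBS = {"1.1": 250, "2.0": 500}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Approximate one-way link bandwidth in GB/s."""
    return LANE_MBS[gen] * lanes / 1000

second_slot_p35 = link_bandwidth_gbs("1.1", 4)   # P35 CrossFire second slot
full_slot_pcie2 = link_bandwidth_gbs("2.0", 16)  # x16 slot on a PCIe 2.0 board
```

On those figures the second P35 slot offers around 1 GB/s against roughly 8 GB/s for a full PCIe 2.0 x16 slot, which is why bandwidth-hungry X2 cards are the ones expected to suffer.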
 

I wonder what detail setting they were running GRID at, because I get single digits on a single 3870, and 2560x1600 with 4xAA? You have got to be kidding; it will take at least 15 minutes to get to the menu screen.
I'm going to do one test: disable tri-CrossFire, start the game in safe mode, and see how low the detail settings have to go to make it playable... but they all fail on the chart, as I'm a 60fps man.
 
Some interesting reading for you guys! :)

PCI Express 2.0 Graphics Cards Tested : How Does PCI Express 2.0 Scale?
Conclusion: Is PCIe 2.0 really necessary yet? As long as a graphics solution can operate with data that is stored within its local video frame buffer memory, both the reasonably mainstream Radeon HD 3850 and the hardcore GeForce 9800 GX2 will operate close to their maximum performance, even if the PCI Express link width is limited to x8 or x4. Once larger textures need to be accessed, as is the case in Crysis or Microsoft’s Flight Simulator, interface bandwidth becomes a crucial element. Any link width below x16 will noticeably limit these games’ playability.

The answer, thus, has to be "yes": you want maximum bandwidth, and PCI Express 2.0, for all sorts of sophisticated 3D applications. Benchmarks such as Futuremark’s 3DMark06, PCMark Vantage, Prey or Quake provide proof from the other end of the spectrum, though: they can fit all the graphics data into the 512 MB (Radeon HD 3850) or 2x 512 MB (GeForce 9800 GX2) frame buffers.


Crossfire Meets PCI Express 2.0 : Crossfire Up To 20 Percent Faster
Conclusion - Switching To PCI Express 2.0 Yields No Improvement
 
Just wish they would update the test with 4870's and a GTX280 now.

Big difference with CrossFire already, but who knows how much the hit is now between PCI-E 1.0a and 2.0 with the new, more powerful single cards.
 
Awesome find with that PCIe 2.0 stuff. It has swayed me into believing I might not upgrade my motherboard yet, and just get a good graphics card despite the mild performance hit in some games, then swap my motherboard out when I get my next CPU in a year's time (summer upgrades!).
 
I think the RV770 Pro will be the end of Nvidia having the most powerful card, considering that the normal 512MB 4870 is already just a tiny bit slower than the GTX 280.
Unless Nvidia somehow magically comes up with an extra 30-40% performance on what they've got, or drops their prices by 30-40%, they are so going to die this season.
 

I have been oohing and aahing over this also. I've decided to go for a new board, but my requirement is a board with PCI Express 2.0, DDR2 memory and a PCI-X slot for my SCSI card. Asus have conveniently announced the P5Q WS!
 