Discussion On The ATI Radeon 5*** Series Before They Have Been Released Thread

Apparently

"According to reports, AMD will unleash its next wave of high-end graphics products in under two weeks. Wednesday September 23rd is the date being bandied about, with AMD expected to launch two new products in the form of the ATI Radeon HD 5870 and ATI Radeon HD 5850."

Taken from *****.
 
Not interested in a game that needs 1 - 2 years of hardware advancements just to play it at a tidy speed.

By your logic they should've just withheld the game for two years, which ultimately helps nobody apart from the very (very) few whose egos were somewhat bruised that their rig couldn't play a cutting-edge game with everything turned on at god-knows-what resolution. Realistically, what games even rival Crysis in terms of scene complexity and graphics quality? Sure, a lot of games are catching up on the latter, but you have to understand that Crysis is rendering a lot more than most contemporary games (particularly its foliage combined with a massive draw distance). That doesn't make it poorly coded (if you've studied Crytek's source code and can suggest some optimisations that maintain a similar quality, feel free to prove me wrong), only a bit overambitious.
 
Wanna see what 24.5 million pixels looks like?

eyefinity.jpg


That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft, and the man crouched in front of the setup is Carrell Killebrew; his name may sound familiar.

Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters: there's a single GPU driving all of these panels. The actual resolution being rendered is 7680 x 3200; WoW got over 80 fps with the details maxed. This is the successor to the RV770. We can't talk specs, but at today's AMD press conference two details were made public: over 2 billion transistors and over 2 TFLOPs of performance. As expected, but nice to know regardless.
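
Just to put that figure in context, a quick back-of-the-envelope check in Python (panel size and layout taken from the article above):

# Six Dell 30" panels, each 2560 x 1600, arranged 3 wide by 2 high.
panel_w, panel_h = 2560, 1600
cols, rows = 3, 2

surface_w = panel_w * cols  # 7680
surface_h = panel_h * rows  # 3200
pixels = surface_w * surface_h

print(f"{surface_w} x {surface_h} = {pixels:,} pixels")
# 7680 x 3200 = 24,576,000 pixels, i.e. the ~24.5 million quoted above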

The technology being demonstrated here is called Eyefinity and it actually all started in notebooks.

Not Multi-Monitor, but Single Large Surface

DisplayPort is gaining popularity. It's a very simple interface and you can expect to see mini-DisplayPort on notebooks and desktops alike in the very near future. Apple was the first to embrace it but others will follow.

The OEMs asked AMD for six possible DisplayPort outputs from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook, and up to two for use via a docking station. In order to fulfill these needs AMD had to build six lanes of DisplayPort output into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today.

Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.
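
If I'm reading that right, the counting is simply engines multiplied by outputs per engine. A tiny sketch of that reading (my own interpretation of the article, not anything from AMD documentation):

def max_displays(display_engines, outputs_per_engine=2):
    # per the article, one display engine can drive any two outputs
    return display_engines * outputs_per_engine

print(max_displays(1))  # 2 - today's cards, a single engine
print(max_displays(3))  # 6 - what a six-output DX11 part would imply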

eyefinitylogo.jpg


At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.

configuration.jpg


AMD's software makes the displays appear as one. This will work in Vista and Windows 7 as well as Linux.

The software layer makes it all seamless. The displays appear independent until you turn on SLS mode (Single Large Surface). When it's on, they appear to Windows and its applications as one large, high resolution display. There's no multi-monitor mess to deal with; it just works. This is the way to do multi-monitor, both for work and for games.
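
To illustrate what a single large surface amounts to, here's a toy sketch (purely my own illustration of the idea, nothing to do with AMD's actual drivers): the OS sees one big resolution, and some layer underneath works out which physical panel each pixel lands on.

def sls_resolution(panel_w, panel_h, cols, rows):
    # the one big surface the OS and games see
    return panel_w * cols, panel_h * rows

def panel_for_pixel(x, y, panel_w, panel_h, cols):
    # which physical panel a surface pixel belongs to, and where on that panel
    col, row = x // panel_w, y // panel_h
    return row * cols + col, (x % panel_w, y % panel_h)

print(sls_resolution(2560, 1600, 3, 2))            # (7680, 3200)
print(panel_for_pixel(4000, 2000, 2560, 1600, 3))  # (4, (1440, 400))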

displayproperties.jpg


I played Dirt 2, a DX11 title, at 7680 x 3200 and saw definitely playable frame rates. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.

left4deads.jpg


Left 4 Dead in a 3 monitor configuration, 7680 x 1600

dirt2resolution.jpg


With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to provide gamers (and others in need of a single, high resolution display) the ability to piece together a display that offered more resolution and was more immersive than anything on the market today. The idea isn't to pick up six 30" displays but perhaps add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (ideal for first person shooters apparently, since you get a nice wrap-around effect) still nets you an 8400 x 1050 display. If you want more vertical real estate, switch over to a 3x2 setup and then you're at 5040 x 2100. That's more resolution for less money than most high end 30" panels.
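
Those configuration numbers check out; here's the same arithmetic in Python, plus a pixel-count comparison against a single 30" panel (panel sizes as given above):

def surface(panel_w, panel_h, cols, rows):
    return panel_w * cols, panel_h * rows

print(surface(1680, 1050, 5, 1))  # (8400, 1050) - 5x1 wrap-around for shooters
print(surface(1680, 1050, 3, 2))  # (5040, 2100) - 3x2 for more vertical space

# Pixel count of the 3x2 budget setup vs a single 2560 x 1600 30" panel:
print(5040 * 2100)  # 10,584,000 pixels
print(2560 * 1600)  #  4,096,000 pixels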

triconfiguration.jpg


Any configuration is supported, you can even group displays together. So you could turn a set of six displays into a group of 4 and a group of 2.

It all just seems to work, which is arguably the most impressive part of it all. AMD has partnered up with at least one display manufacturer to sell displays with thinner bezels and without distracting LEDs on the front:

eyefinity2.jpg


A render of what the Samsung Eyefinity optimized displays will look like

We can expect brackets and support from more monitor makers in the future. Building a wall of displays isn't exactly easy.
 
Not interested in a game that needs 1 - 2 years of hardware advancements just to play it at a tidy speed.
You can play it at a tidy speed, just not at the 'Very High' quality setting. In other words Crytek have made the game scale well to future hardware, while still taking advantage of current hardware.

The best of both worlds, isn't it?
 
30fps minimum on Crysis, and the game has been out for ages. Dropping to 30 at the lowest would be acceptable, but it shows how badly this game was made.

Then play it at high settings; it still looks better than 99% of games out there and runs pretty damn smoothly on recent hardware (runs at 55fps @ 1680x1050 on my 4890).

Very High settings still look better than any other game, especially when you consider the overall complexity of the scenes, so you can't expect it to run at a constant 60fps. Think about what 60fps entails: that's 60 frames every second, with each frame made up of literally millions of polygons and relatively complex shader effects.
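
To put that in concrete terms (just simple arithmetic, nothing clever), the per-frame time budget shrinks fast as the target frame rate goes up:

for fps in (30, 60):
    budget_ms = 1000 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame

Halving the time available to render each frame, while the scene stays as complex as Crysis at Very High, is exactly why a locked 60fps is such a tall order.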

Also, the amount of computing power required to render a scene does not scale linearly with the perceived graphical quality/complexity of that scene. In other words, if a particular game scene looks twice as good as another, it does not necessarily require twice as much computing power to render; the perception of graphical quality/complexity is far too subjective. This is why I cringe when people complain that certain games look almost as good as Crysis, yet Crysis runs a lot worse.

Finally, Crysis at Very High running at 30fps looks a lot smoother than your average game, since it arguably has the best motion blur of any game.
 
Will there be a 2GB version of the 5870? Because otherwise I can't see this card magically running the above games at 7680 x 3200.
Even with the 1GB cards I've had in the past, ATI and NV alike, they have all run out of memory at 2560x1600 with some AA on quite a few newer games.
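
A rough back-of-the-envelope for why that happens (my own simplistic estimate: colour plus depth/stencil render targets only, 4 bytes per pixel each, multiplied by the MSAA sample count; real games need textures, geometry and extra buffers on top of this):

def render_target_mb(width, height, msaa=1, buffers=2):  # colour + depth/stencil
    return width * height * 4 * msaa * buffers / (1024 ** 2)

print(round(render_target_mb(2560, 1600, msaa=4)))  # ~125 MB at 2560x1600, 4x AA
print(round(render_target_mb(7680, 3200, msaa=4)))  # ~750 MB at 7680x3200, 4x AA

At the six-panel resolution with AA you'd already be using a big chunk of a 1GB card before textures even enter the picture.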
 