
Graphics Cards, for Noobs

Associate · Joined 29 Aug 2009 · Posts: 64
I'm not a complete noob with computer hardware, but graphics cards have always eluded me. I generally know which the best cards are and why, but when it comes down to actually justifying each spec I'm dumbstruck.

For the benefit of myself (thank you!) and other nooblets, what do the following specs mean, and why should I care?

- Core Clock: 600MHz
- Memory: 1GB GDDR3
- Memory Clock: 1800MHz (Effective)
- Memory Interface: 256-Bit
- Processing Cores: 112
- Shader Clock: 1500MHz

Also, why are so many new graphics cards coming out with just two of these obscure connections? What happened to the VGA that every one of my monitors has solely supported? (I've been out of the loop for around four years.)

- Display Connectors: 2 Dual-Link DVI-I
 
Core clock * number of texturing units (texture fillrate) = potentially larger, better-filtered textures
Core clock * number of ROPs (pixel fillrate) = faster antialiasing and high-resolution rendering
Memory amount = the card can store more data locally, which makes higher resolutions and better filtering actually plausible. With too little memory on the card, video data gets stored in system memory, which is very slow for the card to access (and so causes a large drop in framerates)
Memory clock * bus width (bandwidth) = can transfer more data between the memory and the card, which helps a lot with higher resolutions and antialiasing
Shader clock * shader units (processing power) = shinier effects, more or less.

That's an extremely dumbed-down version of what those all mean to you as an end user. Several specs affect the same thing, in which case the slowest part of the system becomes a bottleneck (memory bandwidth is a common culprit).
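To make those multiplications concrete, here's a rough back-of-envelope calculation using the specs from the first post. Note the TMU (texturing unit) and ROP counts aren't listed in that spec sheet, so the figures below (56 TMUs, 16 ROPs, plausible for a card of that era) are assumptions purely for illustration:

```python
# Back-of-envelope numbers from the specs listed in the first post.
core_clock_hz  = 600e6    # 600 MHz core clock
mem_clock_hz   = 1800e6   # 1800 MHz effective memory clock
bus_width_bits = 256      # 256-bit memory interface

# TMU and ROP counts are NOT in the original spec list; these are
# assumed figures, typical for a card of that era.
tmus = 56   # assumed texturing units
rops = 16   # assumed ROPs

texture_fillrate = core_clock_hz * tmus               # texels per second
pixel_fillrate   = core_clock_hz * rops               # pixels per second
bandwidth_bytes  = mem_clock_hz * bus_width_bits / 8  # bytes per second

print(f"Texture fillrate: {texture_fillrate / 1e9:.1f} GTexel/s")  # 33.6
print(f"Pixel fillrate:   {pixel_fillrate / 1e9:.1f} GPixel/s")    # 9.6
print(f"Memory bandwidth: {bandwidth_bytes / 1e9:.1f} GB/s")       # 57.6

# Why the memory amount matters: a single 32-bit 1920x1200 framebuffer
framebuffer_mb = 1920 * 1200 * 4 / 1024**2
print(f"Framebuffer size: {framebuffer_mb:.1f} MB")  # ~8.8 MB per buffer
```

The last line shows why 1GB of memory goes further than it sounds like it should: the framebuffer itself is small, but textures, render targets and antialiasing samples multiply that figure up quickly at high resolutions.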
 

Yeah, and the actual amount of work done per clock cycle can vary too... So, OP, the best approach is always to wait for the benchmarks (remember that drivers are also a very important factor) and make your judgement from there.
 
DVI isn't obscure, it's the prevalent standard for connecting monitors.

VGA = Analog
DVI = Digital

Therefore it's better.

You can get VGA Female->DVI Male adaptors, I'm using one now.
 
Hah, thanks Billy. I was definitely aware I was being a bit grumpy with my use of the word 'obscure', haha.

Thanks for the feedback on all my questions. I also use benchmarks and reviews when choosing, but I was curious about the specifics, as cards tend to have a few variants.

Do Monitors ship with DVI connections yet or is it just on higher end monitors atm?
I've bought 2 monitors this year and neither had DVI, just good old VGA :D


Thanks again
 

It's really only budget monitors that are VGA-only now. Most decent monitors will have both, and higher-end ones usually have a load more.

My screen has got 2 DVI, 1 HDMI, 1 DisplayPort, 1 VGA, 1 component, 1 S-Video and 1 composite.

HDMI and DVI carry the same video signal; they just use different connectors, and HDMI can also do audio.

DisplayPort is the successor to DVI. It can carry a lot more data across the connection, and I'm pretty sure it's supposed to be able to handle peripheral data too, so if your screen has USB ports or something, that can be carried over DisplayPort.

I'd expect you know the rest. Component is basically VGA; it uses the same sort of connection, except that the colour cables are all separate. It's the same kind of difference that DVI and HDMI have, minus the audio.

There aren't many cards with DisplayPort at the moment, but it's going to become standard fairly shortly, with ATi's newest cards having at least one on them.

EDIT:

Some wiki info on display port if you're interested
Wikipedia said:
Advantages over DVI


  1. Based on micro-packet protocol.
    • Allows easy expansion of the standard
    • Allows multiple video streams over single physical connection (in a future version)
  2. Designed to support internal chip-to-chip communication
    • Can drive display panels directly, eliminating scaling and control circuits and allowing for cheaper and slimmer displays
    • Aimed to replace internal LVDS links in notebook panels with a unified link interface
    • Compatible with low-voltage signalling used with 45 nm CMOS fabrication
  3. Supports both RGB and YCbCr encoding formats
  4. Auxiliary channel can be used for touch-panel data, USB links, camera, microphone, etc.
  5. Fewer lanes with embedded clock reduce RFI.
  6. Slimmer cables and a much smaller connector that doesn't require thumbscrews. Connector pins don't run the risk of bending if improperly handled.
  7. In low-light conditions or awkward under-desk connections, the DisplayPort connector is easier to connect when guided only by touch.
 
Most monitors that aren't bargain basement budget models come with a DVI connection now.

Also, most decent graphics cards in the £75+ range that only have DVI connectors come with one or two DVI-to-VGA adaptors. (It's a good idea to check this before you buy.)
 
A DisplayPort connector is small too, not much bigger than USB, so I hope we'll be able to pack more of them onto the end of a card.

Indeed, I'm really hoping the new ATi cards come with more than one. I've got two DisplayPort cables that came with my monitors that I want to put to good use.
 
DVI isn't obscure, it's the prevalent standard for connecting monitors.

VGA = Analog
DVI = Digital

Therefore it's better.

You can get VGA Female->DVI Male adaptors, I'm using one now.

I always love the 'digital is better' argument, especially the 'therefore' part; it's a killer.

As things stand, DVI as a digital interface will produce better-quality images than a VGA analogue connection, like for like.

Analogue computing is some way off, maybe a looong way off, but when it arrives the leap will be exponential. Maybe your grandkids will scoff at grandad's 'digital' MP4 player, the way many people flame guys who still use vinyl...
 