• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

The first "proper" Kepler news Fri 17th Feb?

Not being funny or owt, but I wouldn't use 8x FSAA even if I could. Just turns into a blurry mess. I tried it once in Grid and it made me feel rather ill.

wow, just... wow

every comparison I've ever seen shows 8x FSAA (aka SSAA) gives better image quality than any other method - all the other methods are just ways of trying to enable AA that isn't so crushingly GPU intensive

that's an impressive misrepresentation in favour of your card, which you love/hate (depending on which thread you're in, apparently)
 
[Attached image: k5.png]


It looks like the die comes to around 319mm² in area, not the published 294mm². It seems similar in size to the GF104 and GF114.
 
wow, just... wow

every comparison I've ever seen shows 8x FSAA (aka SSAA) gives better image quality than any other method - all the other methods are just ways of trying to enable AA that isn't so crushingly GPU intensive

that's an impressive misrepresentation in favour of your card, which you love/hate (depending on which thread you're in, apparently)

Then it was probably one of the other methods I used @ 8x. I know Grid has a lot of them.
 
The maximum VRAM consumption we have seen around here in that is 1.7GB. Normal usage seems to be around the 1.4GB mark.

That doesn't mean performance would have dropped with less VRAM than that; once data is loaded into VRAM, the card won't garbage-collect it unless it needs to, even if it is no longer used in the scene.
 
No it isn't, because I can pay the same price for a 50'' plasma. This card and all the others are way too expensive, and that's especially bad because we're in a recession right now; it's not just about buying the card, it's the electricity, council tax, petrol etc.

I've built a top-end rig for only 450 quid... I only need the card now, but there's no way I'm going above 350 quid for this. You just need to compare prices online, and I don't think I'm the only one around here that's careful nowadays.

It's the wrong time to be releasing a card costing 500 quid; people just don't have the money nowadays.

Well said sir, finally some common sense. Makes sense to me: cheaper price, sell lots of cards; silly price, sell lots less. I've just bought a new 3D monitor for £129.99 and the site that had them sold 300-400 in a matter of hours. I love the monitor, and for the money I think it's terrific value. I cannot for the life of me see where £500 on a small graphics card goes; yes, R&D blah blah, but don't TV manufacturers have that expense too? Jesus, I can remember my mother paying £800 for a TV way back in 1972 for the Olympics. Tech is really cheap now, so I don't see why graphics cards should be different. :)
 
I think you've got confused between FSAA (Full Scene) and FXAA (Fast Approximate).

FSAA is supersampling: running the game at four or eight times the resolution and then downsampling to the correct one, hence why it's so GPU intensive and needs such large amounts of VRAM. 8x FSAA at 1080p means shading eight samples for every output pixel - roughly eight times the work of plain 1080p - before downsampling back down.

FXAA is basically a glorified blur filter that removes detail as well as jaggies, which is why I personally prefer FXAA medium plus 4x MSAA in BF3.

MSAA finds the edges of geometry and only anti-aliases those, so it doesn't work on textures (hence adding FXAA as well, for transparent textures that have edges within them that MSAA won't catch).
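
(If anyone wants to see what "render big, then average down" actually means, here's a rough sketch of the idea in Python/NumPy. It's purely illustrative - the function and the simple box-filter downsample are my own simplification for the sake of the example, not how a driver actually implements FSAA - but the arithmetic shows why the shading and VRAM cost blows up.)

Code:
import numpy as np

def supersample_downsample(hi_res_frame, factor):
    # hi_res_frame: (H*factor, W*factor, 3) image rendered at a higher
    # internal resolution. Average each factor x factor block of samples
    # down to one output pixel (a simple box filter).
    h, w, c = hi_res_frame.shape
    blocks = hi_res_frame.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Rough cost arithmetic: 4x supersampling at 1080p shades and stores
# (2*1920) x (2*1080) pixels, i.e. four times the work of plain 1080p.
print(((2 * 1920) * (2 * 1080)) / (1920 * 1080))  # -> 4.0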
 
Metro 2033 was an excellent game. It was linear but it had a good narrative, excellent graphics, great combat and was highly engaging - it took me a while to get around to playing it but once I did I was hooked. I'm certainly looking forward to Metro: Last Light later this year. Highly recommended.
 
All we can go on, mush, is Battlefield 3 and Frostbite II.

The maximum VRAM consumption we have seen around here in that is 1.7GB. Normal usage seems to be around the 1.4GB mark.

This is what I meant, though, when I brought up resolution. Those usage levels are not for 1600p, 1440p or any kind of multi-monitor or triple-monitor setup.

As texture sizes jump, so do the VRAM usage levels.

Yeah, but below 3GB you only have 1.5GB, so that's the same as you quote, and that's what they mean... a 1.5GB card is right on the limit now, but it won't be enough for next year.
 
Well said sir, finally some common sense. Makes sense to me: cheaper price, sell lots of cards; silly price, sell lots less. I've just bought a new 3D monitor for £129.99 and the site that had them sold 300-400 in a matter of hours. I love the monitor, and for the money I think it's terrific value. I cannot for the life of me see where £500 on a small graphics card goes; yes, R&D blah blah, but don't TV manufacturers have that expense too? Jesus, I can remember my mother paying £800 for a TV way back in 1972 for the Olympics. Tech is really cheap now, so I don't see why graphics cards should be different. :)

It wasn't well said at all. Plasma TVs have been around for bloody years; the first one out cost US$14,999.

Since then hardly anything has changed, so thanks to mass production they are now cheaper.

GPUs, on the other hand, change all the time. For example, Kepler is a completely new technology from Nvidia, like nothing before it, so they need to make back all of the money they have put into inventing, researching and developing it.

Thus, as I pointed out before, comparing various inanimate objects to something completely different is not well said at all.

Let's say a TV came out next week that runs on fart gas. Then you would have a point comparing it to a GPU running new technology.

Plasma is far from new technology, and like many things in the world of TV it changes very infrequently, as it needs to become a household standard.
 
Can somebody give me a rundown of the card?

I have a headache, I've just got up after a night shift and I can't be arsed to analyse it all.

I was going to get two 7970s.

Wait until the full reviews are out next week? On the face of it the GTX 680 is faster in the games tested when compared to the HD 7970. However, when the full reviews are out, more games and more settings will be tested, so you will get a better view of the relative positions of each card.

Metro 2033 was an excellent game. It was linear but it had a good narrative, excellent graphics, great combat and was highly engaging - it took me a while to get around to playing it but once I did I was hooked. I'm certainly looking forward to Metro: Last Light later this year. Highly recommended.

Same here.
 
http://www.hkepc.com/7672/page/4#view


New Adaptive VSync technology



NVIDIA, for its part, has focused on how smoothly games actually run, and is promoting a new VSync setting for gamers called Adaptive VSync. Its main purpose is to deal with the card's output frame rate swinging between too fast and too slow: with ordinary VSync the output is locked to the 60fps of mainstream displays, but because the frame rate the card can actually deliver still varies from scene to scene, whenever it falls below 60fps the output drops straight to 30fps, and below 30fps it drops to 20fps.

In fact such cases are quite common, and a drop from 60fps to 30fps produces a very noticeable stutter that badly hurts the smoothness of the game. The new Adaptive VSync solves exactly this problem: when it is enabled, the driver automatically detects the actual frame rate; if the game is running above 60fps, VSync is switched on and the output is locked to 60fps, and if it falls below 60fps, VSync is automatically switched off and the card outputs whatever frame rate it can manage, so you avoid the sudden stutter of dropping straight from 60fps to 30fps. It can be very useful.
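
(The decision itself is trivial - something like the toy loop below, which is just my own illustration of the idea rather than anything from NVIDIA's driver. With ordinary VSync the displayed frame rate can only take values of refresh/1, refresh/2, refresh/3... i.e. 60, 30, 20fps on a 60Hz panel, which is where the sudden drops come from; Adaptive VSync simply switches VSync off whenever the renderer can't keep up with the refresh rate.)

Code:
REFRESH_HZ = 60.0

def vsync_locked_fps(render_fps):
    # Ordinary VSync: the displayed rate snaps to refresh/1, refresh/2, refresh/3...
    # e.g. rendering at 55fps shows as 30fps on screen. Assumes render_fps > 0.
    divisor = 1
    while REFRESH_HZ / divisor > render_fps:
        divisor += 1
    return REFRESH_HZ / divisor

def adaptive_vsync_fps(render_fps):
    # Adaptive VSync: keep VSync on only when the game can hit the refresh
    # rate; otherwise turn it off and show whatever the card can manage.
    return REFRESH_HZ if render_fps >= REFRESH_HZ else render_fps

for fps in (72.0, 55.0, 28.0):
    print(fps, "->", vsync_locked_fps(fps), "vs", adaptive_vsync_fps(fps))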

New anti-aliasing technology: TXAA



Beyond raw graphics performance, NVIDIA has also added a number of new and useful features to the new GeForce GTX 680 for gamers, releasing a new anti-aliasing technique called TXAA. It is a new generation of hardware-rendered anti-aliasing, essentially an enhanced version of the old MSAA, offering better anti-aliasing while reducing the drop in smoothness when anti-aliasing is turned on. For the time being it is exclusive to the GeForce GTX 680, but NVIDIA has said it will be opened up to the GeForce GTX 500 series or other models at a later date.



TXAA currently comes in two levels, TXAA 1 and TXAA 2. The former has a cost similar to 2x MSAA but gives better anti-aliasing than 8x MSAA; TXAA 2 looks even better, yet costs only about the same as 4x MSAA. TXAA has to be supported by the game itself before it can be used, and a variety of games supporting it are due over the course of the year. Although we were not able to test its real performance yet, judging from NVIDIA's official screenshots the anti-aliasing does look good, and it is worth looking forward to.



In addition to the new TXAA anti-aliasing, NVIDIA is also bringing good news to existing graphics card users: FXAA, which previously only a few games supported natively, has been added to the NVIDIA Control Panel, so users can easily turn it on to get better anti-aliasing in different games and improve image quality.
 