
The first "proper" Kepler news Fri 17th Feb?

well so far it's pretty efficient.

I get a solid 60fps in Mass Effect 3 with my 5850 running at 400MHz in UVD mode.
 
Yup yup!

I need one myself, and these new engine improvements are really amazing looking!

I can only wonder what the Unreal Engine 4 they're working on is going to be like. :D

From what I read they demoed UE4 running on Kepler to a select group but they are under NDA and it's only going to be shown to the peasantry later this year.
 
Yes I was initially confusing UE3 with UE4. That's what I meant when I said AFAIK there are no games using it.
 
I'm not sure they will call the GK104 card the 680. I think it will be incredibly efficient, with performance just exceeding the GTX 580, and it would not surprise me if they already have twin-chip cards in development; it has to be the way forward for real power and efficiency gains. They are doing Hyper-Threading, so dual core must be the next step ;) Whether they would use some sort of on-demand AFR with two-chip cards or something far more clever, I would not like to guess.
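For context, plain AFR (alternate frame rendering) on a two-chip card just hands whole frames to the GPUs in turn. A toy sketch of the frame assignment; the GPU count and names here are my own illustration, nothing confirmed about Kepler:

```python
# Toy sketch of alternate-frame rendering (AFR): each GPU renders
# every other frame in turn. NUM_GPUS and afr_assign are hypothetical
# names for illustration only.
NUM_GPUS = 2  # assumed dual-chip card

def afr_assign(frame_index, num_gpus=NUM_GPUS):
    """Return the index of the GPU that renders a given frame under plain AFR."""
    return frame_index % num_gpus

# Frames 0..5 alternate between the two chips:
print([afr_assign(f) for f in range(6)])  # [0, 1, 0, 1, 0, 1]
```

The catch with plain AFR is that it adds a frame of latency and both chips must hold the full scene, which is why "something far more clever" would be welcome.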

The GK110, I think, will be in parallel development and will be a monster single-chip card, but it could be six months away, by which time AMD will have their next generation.

However it pans out, I think we will see some interesting developments in GFX cards this year, but I really do think Nvidia has plenty of options up its sleeve, as I'm sure they could have just shrunk the Fermi core if they wanted a fast 28nm card in a hurry.

It would not surprise me if the GK104 was developed with one eye on the next generation of game consoles, which must be up for a refresh soon ;)
 

If UE4 is as efficient as UE3 was while looking this good, it's going to be an amazing engine!

I remember running Unreal Tournament on a 2900GT, 3870, 8600GT and 8800GTX, and it ran brilliantly on all of them. :D
 

I am not overly clued up on things, but that real-time Samaritan demo is running on Unreal Engine 3?

This was what I was basing my thoughts on and maybe this is just a ruse from the Nvidia camp.
 

Yes it is. They've made remarkable improvements to the engine since its release, and it looks amazing.
 
The biggest improvement from the tech demo is.... not using it in a game.

What you can optimise and use in a tech demo, with few other parts to consider, is VASTLY different to what you can get in a real game.

Unigine version 1, years ago, vs ANY game ever made with tessellation yet? A tech demo is just that; games are an entirely different matter. Tech demos give a truly misleading idea of how good a game could look running the same engine.
 

Yep, I hear what you are saying. It would be great to see that kind of detail in an actual playable example.

Maybe not too far away...
 
I wish you lot would stop thinking that things like die size will affect the price. It's all about performance vs the AMD range.
Silicon (specifically die size) is the most expensive component in graphics card production. To say that a small die and a large die don't affect end-user cost differently is like saying small gold bars cost the same as larger gold bars. Sure, Nvidia can price its products to match AMD's almost funny prices, but they are less likely to do so if production costs are lower to begin with.

We will know for sure in two weeks or so.
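To put some back-of-the-envelope numbers on the die-size point above, here's a rough sketch of how dies per wafer and yield change with die area. All figures (wafer cost, defect density, die areas) are illustrative assumptions, not real GK104/GK110 data:

```python
import math

# Rough sketch of why die size drives cost: dies per 300mm wafer fall
# with area, and yield falls roughly exponentially with area.
# All constants are assumed for illustration.
WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_CM2 = 0.4  # assumed, for a young 28nm process
WAFER_COST = 5000.0           # assumed dollars per processed wafer

def dies_per_wafer(die_area_mm2):
    # Common approximation: gross dies = wafer area / die area,
    # minus an edge-loss correction term.
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2):
    # Simple Poisson yield model: exp(-defect_density * area).
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100.0)

def cost_per_good_die(die_area_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction(die_area_mm2)
    return WAFER_COST / good_dies

# A mid-size (~300 mm^2) die vs a monster (~550 mm^2) die: the big die
# costs several times more per good chip, not just ~1.8x more.
print(round(cost_per_good_die(300), 2))
print(round(cost_per_good_die(550), 2))
```

Under these assumptions the big die is well over twice as expensive per good chip as the smaller one, which is the gold-bar point in compounding form: fewer candidates per wafer and a lower fraction of them working.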
 
They may launch at AMD-ish prices and then drop them as they go along. That's what die size tells us. Of course, they may also decide to launch at cost plus a good profit margin and thus undercut AMD. They'd do that if they wanted to start a price war and try to eat into AMD's profits. If they do do this I'd certainly be buying two of 'em, and I think quite a few others would jump at it too.
 

 