Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I can only conclude from your intense hatred of nVidia you must have been well and truly sodomized by them at some point.
You do realize that the forum's users are actual real people, and that Nvidia is a faceless corporation, right?
Thing is, now that they've started making ATI cards, will Nvidia be happy letting them continue making their cards as well when they do surface to market?
I think the best thing for now is to stop waiting for Fermi; upgrade if need be, and if not, enjoy your games. Well, it's what I'm going to do.
Oh wait, what good games? >_>
Who cares if COD4 runs at 120fps and your old GPU only manages 100? It's still going to be smooth and still does what you want; it's more of a brag factor if you ask me.
I've seen a lot of people mention that moving from 40 to 28nm would make the G100 suddenly viable. What is it about the 40nm process which is a problem, and why will this issue evaporate with 28nm?
Is it simply a case of lower power consumption and die sizes leading to better yields, or is 40nm just uniquely awful?
The point I was working my way towards is that with severely overdue products coming to a market with fewer partners to push them at more varied prices, it's going to be a bit of an uphill struggle for Nvidia, no?
Instead of Fermi being the product to get them back in the game, I think it's going to be its successor that does it. I'm not suggesting for one second that a massive corporation like Nvidia would be crippled in the market by all of this, but it has to be hurting them a lot more than they had planned?
Good point. And Crysis 2 is going to be an XB360 game? WTF?
The physical die size of the G100 is pretty much borderline for what is viable to produce at substantial yields on a 40nm process.
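To put rough numbers on why die size is so brutal for yield, here's a minimal back-of-the-envelope sketch using the standard Poisson defect model (yield ≈ e^(−D·A)). The defect density and die areas below are illustrative guesses, not TSMC's actual figures:

```python
import math

def poisson_yield(die_area_cm2: float, defect_density: float) -> float:
    """Classic Poisson yield model: yield = exp(-D * A)."""
    return math.exp(-defect_density * die_area_cm2)

# Illustrative numbers only: assume ~0.5 defects/cm^2 and compare a
# ~330 mm^2 mid-size die against a ~530 mm^2 GF100-class die.
for name, area_mm2 in [("mid-size die", 330), ("GF100-class die", 530)]:
    y = poisson_yield(area_mm2 / 100.0, defect_density=0.5)
    print(f"{name} ({area_mm2} mm^2): ~{y:.0%} yield")
```

Same defect density, but the bigger die loses disproportionately more of each wafer; that's the "borderline viable" problem in a nutshell.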
lol, I made a harmless joke about how you were all big keyboard warriors and you're still holding it against me?
it better not!
That's why we PC gamers can laugh in the face of console fanbois saying their platform is inferior (it is).
...That's PC, 360 and PS3...
Get a grip, Rroff.
Nothing at all. People stupidly, very stupidly, think that a CURRENT GF100 on 28nm would be great. The problem is, AMD will go wherever they go on 28nm. Let's say they double shaders again; well, Nvidia will need a doubling in shaders as well, plus the other bits, which will mean it will be a VERY similar size on 28nm and have ALL the problems the 40nm process has.
Yes, if Nvidia made an exact GF100 on 28nm, it would be fine, but it would basically be a midrange core, not a high-end one; it would be half the size it is now and still be very difficult to make.
So the main issue is that people completely ignore that whatever is in the works for 28nm, let's call it a GF200, will be double the transistors of the GF100, and will therefore see no real improvement on 28nm, just all the same issues. Even with a better design, it's likely to sacrifice shaders and power, or become EVEN bigger.

The 5870 is bigger than it was planned to be due to design sacrifices made to accommodate TSMC's crappy process. Nvidia didn't make those sacrifices and can barely make their cores as it is, so they will need to incorporate them into their 28nm design, because the biggest problems in TSMC's 40nm process are still there in its 28nm process. Nvidia literally have no choice but to sacrifice some die space to the bus/shaders/rendering cells, or simply add die size with the modifications. The design, large and inefficient, is simply not suited to large-scale production these days; it's why everyone else on earth has gone for smaller dies and more efficiency rather than brute force.
One day Nvidia will learn that lesson.
Yes, it is. The problem is that their next core will obviously, like every one before it, be designed with roughly double the shader power, meaning double the transistors, give or take, as I said. The plan will be a 6-billion-transistor core on 28nm, not a high-end 3-billion-transistor core. GF100 will never be made at 28nm, ever.
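The scaling arithmetic behind that claim, as a minimal sketch: die area scales roughly with the square of the feature size under an ideal shrink, so a straight GF100 shrink halves the die, but a double-transistor successor lands right back at the original size. The die area here is a round illustrative figure, not an official number:

```python
# Ideal optical shrink: area scales ~ (new_node / old_node)^2.
# Real shrinks do worse than this, so these figures flatter the 28nm part.
gf100_area_40nm = 530.0        # mm^2, rough GF100-class die (illustrative)
shrink = (28.0 / 40.0) ** 2    # ~0.49: a straight shrink roughly halves the area

print(f"GF100 shrunk to 28nm: ~{gf100_area_40nm * shrink:.0f} mm^2 (midrange-sized)")

# A successor with double the transistors on 28nm ends up right back
# where GF100 started, inheriting the same big-die yield problems.
print(f"Double-transistor 28nm successor: ~{gf100_area_40nm * shrink * 2:.0f} mm^2")
```

Which is exactly why a straight GF100 shrink would be "fine, but midrange", while the real 28nm high-end part would be just as hard to make as GF100 is now.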
With separate dev teams as well, so the PC version should truly live up to its platform!
Sounds like they've either had some issues that have prevented them from doing much at all, or they're just being plain lazy and resting on their reputation for sales?
I hope the Crysis 2 engine does live up to some of the hype. Sadly, I don't think it will.