First Nvidia GT300 Fermi pic.

What a total waste of time, how long did that take you, a couple of hours???

I notice a lot of guys on this forum are always having a dig at Nvidia, but when someone starts bashing ATI cards they get ripped apart..

If people don't like nvidia products, then they should just stay away from any threads related to them, and same goes for ATI haters...

all this fanboi crap really bugs the hell out of me, there really isn't any need for it at all.

Most of the issues with NV ATM have nothing to do with the hardware; the focus is on NV themselves, as NV's practices keep bringing them bad attention as opposed to good attention.
 

nVidia's approach is as if they were a monolithic organisation with a dominating product. News is that they're not, hence people have the choice of not working with them.

However, it is the only way that nVidia can stay in the position they are in. Unfortunately the lower ranks have not delivered the supporting structure of sales and profitability.

I see nVidia attempting to copy Apple. The CEO is attempting to emulate Jobs - however I see a couple of flaws:
a) the nVidia CEO doesn't have the attention to detail within his product set that Steve Jobs is renowned for.
b) the nVidia CEO cannot harness the bigger picture (Pixar and now Disney in the case of Jobs) to utilise and drive his technology to a new mass market.
c) there's no iPod/iTunes content-delivery product space available to nVidia; that's the games consoles. They don't own any market segment they can sell access to in order to make money for anyone.
d) nVidia doesn't radiate (through PR or otherwise) the "can do no wrong" aura.
e) compared to Apple they have three products: (a) a graphics chip/architecture, (b) a chipset architecture, (c) a non-existent CPU chip. This increases the risk.

Those points above mean that businesses will not bend over backwards to work with them. This is the wrong model for them.

They are an innovative hardware vendor. They have good IPR, and as such should look to organise their product set to licence their design IPR, allowing licensees to attack additional market segments. Partners suck in knowledge to access new product areas. This allows you to concentrate on building your own experience in the new market area you want to deliver to.

Intel are already attacking the mobile segment through their increased stake in Imagination Technologies' PowerVR (the same chip used for graphics in the iPhone). This low-power knowledge will be used for Intel's new mobile laptop chipsets.
AMD has many partners they can openly work with to do the same and achieve the same technology capture.

They need two strings - one to focus on getting the money from anywhere in the market they can, and the other to drive the company into the area they want.

In short, nVidia haven't woken up to the change in business strategies going on. They're sticking with the same model as 4-5 years ago. A bit like Gollum in Lord of the Rings - nVidia now have the ring of performance leadership; what they've not seen is that they've just run off the cliff and there's a pit of lava below.

The reason for all the negativity is simple - other companies face similar issues; however, the risk of failure is increased with nVidia due to the personality of the leadership and the lack of product-market spread to reduce the financial impact of competitors stealing revenue-generating market share.
 
OK, this is a simple state of affairs. Oddly, I've been researching this market for jobs..

nVidia want to move into the CPU market as the GPU market is massively shrinking. The mobile market has leaders such as ARM's Mali and Imagination Technologies' PowerVR, which power the majority of power-sensitive mobile devices (phones, video cameras etc). nVidia don't have the skill to produce hardware that's power-efficient enough for that market. Note that Intel and Apple have large increased holdings in Imagination Technologies; ARM are the unprecedented 800lb gorilla in this area - even making Intel look like a kicked spaniel.

You need to do some more research. Nvidia DO NOT want to move into the CPU market and the GPU market is expanding not shrinking!

You are confusing CPU and GPU with what is known in the mobile sector as SOC (System on a Chip). An SOC will contain a number of components in a single IC package. These will be a microcontroller and/or a DSP, some memory, IO, and these days a GPU and audio processor. The idea being that a vendor can drop one of these into his device, replacing lots of separate components with a smaller and cheaper alternative. That's the theory, anyhow. That's what Nvidia want to supply. And they already have a solution. It's called Tegra. It uses a multicore ARM11 CPU, an HD video processor and a GeForce GPU. The obvious target is an application processor for the smartphone market. However, it is currently used in the new Zune HD.

Rumour has it that a number of mobile handsets are in the making using Tegra. HTC will probably release one shortly, and there are a couple using the Google Android OS as well.

I wouldn't rule out Nvidia in the SOC market. Mobile handset manufacturers are interested in SOC cost, power consumption, performance and ease of integration, both within the system and for application programming. The cost for the Nvidia SOC is competitive at $10-$15 each, power consumption is low and performance is up there with the best.
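The consolidation argument above can be sketched numerically. A toy illustration only: all the discrete part prices below are hypothetical numbers invented for the example; the only figure taken from the post is the $10-$15 SOC price.

```python
# Toy bill-of-materials comparison: several discrete parts vs one
# integrated SOC. All discrete prices are hypothetical; the SOC price
# is the midpoint of the $10-$15 range quoted above.

DISCRETE_BOM = {            # hypothetical per-unit prices (USD)
    "microcontroller": 6.00,
    "dsp": 5.00,
    "memory": 3.00,
    "io_controller": 2.00,
    "gpu": 8.00,
    "audio_processor": 2.50,
}

SOC_PRICE = 12.50           # midpoint of the quoted $10-$15 range


def discrete_cost(bom):
    """Total cost of building the system from separate components."""
    return sum(bom.values())


def saving_per_unit(bom, soc_price):
    """Per-device saving from replacing the discrete parts with one SOC."""
    return discrete_cost(bom) - soc_price


print(discrete_cost(DISCRETE_BOM))               # 26.5
print(saving_per_unit(DISCRETE_BOM, SOC_PRICE))  # 14.0
```

Even with made-up part prices, the shape of the argument holds: the SOC wins before board space, assembly and integration effort are even counted.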
 
I should have clarified that the CPU and GPU market in the first line referred to the desktop market. Writing it at about 4-5am with few edits afterwards probably didn't help readability. I understand SOC (and have programmed, including embedded: ARM 2, 3, 610, 710, 6/7 Thumb... and AD's Blackfin) :D
nVidia do have a CPU programme, unless they've shelved/mothballed it on the quiet.

When Intel/Apple started with PowerVR in the iPhone, it was a sure thing that Intel would be pushing for the iPhone to use an Intel CPU. I know Intel actually licence ARM cores, but currently Apple use Samsung-built ARM 6/7.

At the time I last had a look, Tegra had just been released and was being demonstrated in notebooks. ARM's Mali seems to have stopped short (perhaps delays in integrating the Norwegian company), so there is a spot if they've not managed to pull their finger out. PowerVR were surpassing them in feature levels.

nVidia also have chipset experience, so ease of integration could be a strength too, I suppose. I would assume their Nexus toolkit also supports mobile development for Tegra. It wouldn't surprise me if nVidia introduce a CPU into this space so they own the entire SOC environment in the device. Long way off for that..

Btw, not that I would suspect you work for nVidia GmbH :p
 
Speak for yourself. I feel sorry for the people who are waiting for this card when they could be spending money on a 5800 and enjoying the same performance, TODAY.

They have been enjoying the same performance as a 5*** for months :rolleyes:
Oh and with Physx ;)
 
Bit sad to notice that at the end of Anand's conclusion it said:
"We will learn more NEXT YEAR"...
 

I don't work for nVidia GmbH. If you look at my profile it says "3G/LTE R&D, protocol stack software".
 
There's a difference between "they are" and "how much". :p

If I link the Physx library - I am using Physx..

Exactly - some cloth, floating paper and breaking small parts off the floor is the most I have seen in the 'PhysX' titles; hardly what I call groundbreaking or a move in the right direction.

Heck, Red Faction: Guerrilla and Bad Company 2 are more exciting in terms of visual physics effects :)

People need to wake up and realize it is lame: open supported standard = win, closed unsupported rubbish = fail. I play games, not fanboy wars.
 
Yes, open standards will win the day, particularly if they are supported by both parties. When AMD gets physics on their GPUs, I'm guessing they won't lock it off from Nvidia users.
 

Trouble is, "Open Standards" is another term for attempting to get one over on your rivals. When both companies support it, they go to war over attempting to influence the standards bodies themselves or, worse still, implement the standard in their own unique interpretation that's compliant but incompatible with the other company's products.. :D (Part of the CTO office role within companies, usually! Bonus points for managing to get your own IPR into the standard and charge for it.)

Chances are Khronos will be smart enough, having had this happen with OpenGL in the past; MS too will be smart enough..
 
Speak for yourself. I feel sorry for the people who are waiting for this card when they could be spending money on a 5800 and enjoying the same performance, TODAY

But do they need the same performance, TODAY?

I'm very interested in this new card but have absolutely no desire to instead buy the 5800. There are currently no games on sale which show up my GTX280 @ 285 speeds, so why not simply wait for the new cards? By which time we'll be closer to the games that actually need them?
 

I doubt any graphically groundbreaking games will be released in the next 2 months, but I still agree with waiting - either for a chance to buy the top-end card, or for prices of the others to go down if you are fine for now.

No point in upgrading now; it isn't the best time to buy unless the 5870 is an 8800GTX, which it isn't. :)
 
Why isn't it like the 8800 GTX?

Cause at the time of launch the 8800GTX was so much faster than the competition, which is why people are still running them in rigs today, some 3-4 years after launch.

He's saying that the 5870 is not a big enough leap, especially over a GTX285, for it to last that long, I guess.

And I suppose there is Nvidia's alternative on the horizon too
 
I would love to see the GPU going into the next generation of Xbox. Somehow I think that AMD will do what they did last time: stick with the highest return, but then produce a really hardcore GPU for the Xbox that then appears two years later on the PC.
 