Some fool on another forum is positive that graphics cards make a noticeable difference in the 2D rendering of standard apps.
I agree there is some difference, but for non-graphics-intensive applications it would be unnoticeable.
We used MS Word as our reference app.
I've tested on my work machine with its integrated Intel graphics, and the load time of the app (over the network, no less, as our profiles are web based) was 1.5 seconds.
He is certain that when he underclocked his 7900GT by around 100 MHz, there was a multi-second increase in the load time.
Utter tripe.
The amount of GPU power needed to render the 2D layout of an MS Word document is almost nothing, as my own work machine proves.
As such, if you had two systems identical in every respect, but one had, say, an FX 5200 and the other that uber Nvidia card, the most powerful AGP one on the market, there would be no noticeable difference in the draw time when loading MS Word, correct?
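For anyone who wants to settle this with numbers instead of forum arguments, here's a rough sketch of how you could time the launch. It just measures wall-clock time from spawning a process until it exits, using a throwaway stand-in command; for a real test you'd point it at the Word executable (and note that process exit isn't the same as "window fully drawn", so this only approximates the load time):

```python
import subprocess
import sys
import time

def time_launch(cmd):
    """Measure wall-clock seconds for a command to start and finish.

    This is a crude proxy for app load time: it captures process
    startup, not GUI paint time, so treat the result as a floor.
    """
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# Stand-in workload; on a real test box this would be the path to
# the Word executable instead of a do-nothing Python process.
elapsed = time_launch([sys.executable, "-c", "pass"])
print(f"launch took {elapsed:.3f}s")
```

Run it a few times on each card at stock and underclocked speeds, and if the 2D claim held up you'd expect to see the gap he describes. I'd bet the runs are within noise of each other.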