
Where did DX10.1 from Vantage go? Nvidia knows.

A true quad core allows the cores to talk to each other better, and you can also clock each core separately.
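For what it's worth, per-core clocking is easy to see from software too. Below is a minimal Python sketch, assuming a Linux box that exposes the standard cpufreq sysfs files; the paths, the need for root to write, and the 1.8 GHz cap are illustrative assumptions, nothing AOD-specific:

```python
# List each core's current frequency via the Linux cpufreq sysfs interface,
# then cap core 0 while leaving the other cores alone. Assumes the cpufreq
# files exist on this system and that the script runs with root rights.
import glob

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq")):
    core = path.split("/")[-2]                      # e.g. "cpu0"
    with open(f"{path}/scaling_cur_freq") as f:
        print(core, int(f.read()) // 1000, "MHz")   # sysfs reports kHz

# Illustrative value: cap core 0 at 1.8 GHz (written in kHz).
with open("/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq", "w") as f:
    f.write("1800000")
```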

Yes, that is great, but it doesn't do anything I need right now. I'd rather have a CPU that is faster than one that is slower but more efficient. There will come a time when CPUs have the speed advantage of the Intels along with the advanced logic and power saving that native quad core brings. When that time comes, I'll get that CPU, whether it's AMD or Intel, platform permitting.

I would like to say again that I said the best for your money. It isn't always about speed.

The only thing I worry about when it comes to power is whether the PSU will hold up; I have tripped the last three.

So what's all this about throwing money at a problem then? Do you think it's OK for you to do that, but not OK for Nvidia to have thrown money at it and come up with a very good single-GPU card? That's a bit odd, isn't it?
 
We can all 'make up as much stuff as we like' about Nvidia, but at the end of the day, if they weren't investing millions of dollars into game development then the number of quality games on the market would be severely reduced and we'd all be complaining about that.

ATI isn't spending enough on helping game developers produce better products and that, to me, isn't good enough.
 
I gave AMD OverDrive a go and was impressed by what can be done with it, but I still like to do it the BIOS way. In your circumstances, though, I can see why you use it.
Yeah, when I first got AOD installed I was like wow! I could never do overclocking by myself via the BIOS before. TBH, AMD did a really good job.
 
Yes, that is great, but it doesn't do anything I need right now. I'd rather have a CPU that is faster than one that is slower but more efficient. There will come a time when CPUs have the speed advantage of the Intels along with the advanced logic and power saving that native quad core brings. When that time comes, I'll get that CPU, whether it's AMD or Intel, platform permitting.

I would like to say again that I said the best for your money. It isn't always about speed.



So what's all this about throwing money at a problem then? Do you think it's OK for you to do that, but not OK for Nvidia to have thrown money at it and come up with a very good single-GPU card? That's a bit odd, isn't it?

LOL

Not my fault the PSU was not powerful enough as my PCs expanded over the years, but my Gigabyte ODIN GT 800W is doing the job nicely, though I feel I will need about 1200W in the near future.
And there is nothing wrong with Nvidia throwing money at the problem; I never said there was, I just said that they have the money to do so.
 
Yeah, when I first got AOD installed I was like wow! I could never do overclocking by myself via the BIOS before. TBH, AMD did a really good job.

The main reason I like the BIOS way is that you find out faster if it's unstable, with the OS crashing as it loads :)
 
LOL

Not my fault the PSU was not powerful enough as my PCs expanded over the years, but my Gigabyte ODIN GT 800W is doing the job nicely, though I feel I will need about 1200W in the near future.

You probably will. I don't know what you're laughing at, though.

And there is nothing wrong with Nvidia throwing money at the problem; I never said there was, I just said that they have the money to do so.

The way you worded it, it certainly sounded like it. But hey, if I'm wrong then so be it. You made some pretty wild assumptions about me ;)

Some of us like to buy with a bit more thought than just getting the fastest, and should be allowed to do so without the 'buy the fastest make' willy-wavers.

Wait, back up. You bought a GTX Ultra and a 30" LCD, is that right? What's that about not getting the fastest? Or the biggest, for that matter? I paid about £520 for my GTX and a 24" screen (still a LOT of money, no mistake). What did you pay for your Ultra and 30"? I'm not having a dig here, and people are free to buy what they like, but the smart money is hardly on the Ultra, is it? lol. What I bought gave me far more bang per buck than an Ultra and a 30" would have; that's why I bought it. I thought long and hard about it :)
 
The main reason I like the BIOS way is that you find out faster if it's unstable, with the OS crashing as it loads :)
I usually just set the voltage and clocks to what I feel is right, then use the stability test for 30 minutes to an hour, then slowly clock up till it locks, then just add a little more voltage. :)
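To spell out that trial-and-error loop, here's a minimal dry-run sketch in Python; apply_clock, apply_voltage and run_stability_test are hypothetical stand-ins (the real changes happen in AOD or the BIOS, and the stress test is whatever tool you prefer), and the step sizes and 1.45 V ceiling are illustrative assumptions, not recommendations:

```python
# Dry-run sketch of the manual routine above: apply settings, run a stability
# test, raise the clock while it passes, add a little voltage when it fails.

def apply_clock(mhz):
    print(f"-> set CPU clock to {mhz} MHz (done by hand in AOD or the BIOS)")

def apply_voltage(volts):
    print(f"-> set vcore to {volts:.3f} V (done by hand in AOD or the BIOS)")

def run_stability_test(minutes):
    answer = input(f"Did a {minutes}-minute stability test pass? (y/n) ")
    return answer.strip().lower().startswith("y")

def find_stable_clock(start_mhz, start_volts, step_mhz=50, volt_step=0.025,
                      max_volts=1.45, test_minutes=30):
    clock, volts = start_mhz, start_volts
    while True:
        apply_clock(clock)
        apply_voltage(volts)
        if run_stability_test(test_minutes):
            clock += step_mhz                  # stable, so clock up a little more
        elif volts + volt_step <= max_volts:
            volts += volt_step                 # it locked, so add a little more voltage
        else:
            return clock - step_mhz, volts     # hit the voltage ceiling: back off and stop

if __name__ == "__main__":
    print(find_stable_clock(start_mhz=2600, start_volts=1.30))
```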
 
But it does say something about the market, and about the mistake they've made in focusing only on the fastest cards.

There's a large proportion of non-DX10 cards out there (both nVidia and AMD), and then add the number of XP owners versus the number of Vista owners.

Now, if DX10 worked on XP we'd not have an issue. MS really did shoot themselves in the foot by attempting to force everyone onto Vista for DX10.

To be honest, I don't see anything better for the majority of gamers (i.e. those without high-end gfx cards) from Vista or DX10, as the rest of the platform can't cope with the high settings needed to get the benefit.

I think nVidia are just seeing that too. Most people, I think, will wait to see what the next version of Windows offers. I know from an MS person that, internally within MS, Vista is a joke.

I don't really agree. If you're referring to Futuremark, they had to have a DX10 test, and since there's only one OS capable of running it they didn't have much choice. I do agree, however, that Microsoft are basically forcing us to use Vista with DX10, as it really is all Vista has going for it.

I'm sure there will eventually be some great titles to showcase DX10, but right now, yeah, Vista and DX10 don't really offer much; I still dual-boot with Windows XP x64 as my primary OS. But this doesn't really have anything to do with the topic or what I posted.

I know it's not actual confirmation,

No, it's not, and that should be the end of the discussion. I have no interest in your conspiracy theories based on rumours.
 
You probably will. I don't know what you're laughing at, though.



The way you worded it, it certainly sounded like it. But hey, if I'm wrong then so be it. You made some pretty wild assumptions about me ;)



Wait, back up. You bought a GTX Ultra and a 30" LCD, is that right? What's that about not getting the fastest? Or the biggest, for that matter? I paid about £520 for my GTX and a 24" screen (still a LOT of money, no mistake). What did you pay for your Ultra and 30"? I'm not having a dig here, and people are free to buy what they like, but the smart money is hardly on the Ultra, is it? lol. What I bought gave me far more bang per buck than an Ultra and a 30" would have; that's why I bought it. I thought long and hard about it :)
The 8800 cards were the first from Nvidia to have ATI's IQ and AA+HDR, so no, I did not buy it purely for its speed and sacrifice other features for it, or so I thought.
The secondhand Ultra was £300; he needed the money.
That's where 3dfx went wrong, going for speed at 16-bit colour while others moved on to 24/32-bit.
I also didn't say anything about not having the biggest monitor! £1200 for the 30" 2560x1600 Sam; absolutely stunning detail, and I like having lots of things on the screen at once. I may need to bin the 2 CRTs and get 2 more LCDs on the sides in landscape; I just need to work out what size they need to be to line up, but I may hold off until the 120Hz screens turn up.
 
Yes, it's funny, isn't it!

I can't believe you don't see what I do. Hell, I've seen you have run-ins in the past with the same nVidiots I'm referring to.

I will admit, though, that the situation isn't as bad as it used to be; a few of them seem to have stopped posting (thank god). :)
 
There's no proof Nvidia were directly involved in the DX10.1 support being removed from AC. From screenshots at Rage3D and [H] I can see a drop in IQ: contrast looks messed up and textures look blurry on the ATI cards tested. The developers themselves even said the performance increase was due to a bug. It's pure paranoia on everyone's part to believe Nvidia paid them to remove it.

Of course we will see DX10.1 games when Nvidia support it; developers will have no reason not to support it then. Right now most users with high-end cards do not have DX10.1-capable hardware, and SP1 has only just been released; in fact, it's still not available to the public via Windows Update.

From that article:

There should be no doubt that Assassin's Creed contains DirectX 10.1 technology, and that that technology provides AMD's video cards with a solid performance boost. However, it is not to last. A few days ago, we received this statement from Andy Swanson of Ubisoft via NVIDIA:

Ubisoft plans to release a patch for the PC version of Assassin’s Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.

I agree that the IQ is better with the Nvidia card in that [H] article. It makes the image stand out more, and it's easier to see in still images, but I don't think it will make much of a difference when you're playing, as it's not terrible by any means. Also, with no DX10.1-capable Nvidia cards to compare against, it's hard to determine whether Nvidia would be affected by this too.

It's not that I'm saying Nvidia paid for this to be removed, but who's to say that future money wasn't threatened, or something similar? (Speculation, but do you think this stuff would be common knowledge? We don't get to find out everything through the Internet.) It is a very strange move to remove this feature already, but it's down to whether the user patches their game or not :). I'm also not saying that Nvidia paid Futuremark for it not to be included in Vantage; I just think that technology which can boost your performance should be an addition to this benchmark. It wouldn't be fair on Nvidia as they are behind, but in all honesty they have had six months over ATI to implement DX10.1, and a year and a half overall in the lead. How the hell could ATI go from the 2900 to the 3870, and then nearly have the 4870 land on our doorstep within a year, with better technology at each step?

Yes, this thread is about DX10.1, but why ignore Nvidia's lack of progress when it clearly affects the consumer? So, because of Nvidia, all the Assassin's Creed players who own ATI DX10.1 cards should do without, because Nvidia don't have the technology? As James.Miller said, it doesn't surpass the Nvidia cards' performance in this game, so what is the harm in leaving it in?

What about the date of the latest 8800GTX drivers and the date of the latest 9800Pro drivers?

(Just a few:) 9800Pro, X800, X1800, X1900, 2900XT, 3800, 4xxx, yet only the 9-series Nvidia cards have had any WHQL drivers this year. How can the ancient 9800Pro have April 2008 drivers while the 8800GTX is still sitting on last year's drivers?

Assassin's Creed, from what we know so far, has shown DX10.1 to be pretty decent for its first showing in a game, so why are Nvidia trailing the company they've been leading for the last year and a half when it comes to DX10.1 support?

Microsoft changed the DX10 requirements so Nvidia could be Vista compliant. If a company of that size can bend over for Nvidia, then you have to grant some plausibility to the idea that Futuremark might have done the same. I think I remember them saying that DX10.1 came along when they were too far into development to implement it, which is believable, but we'll never know what happens behind closed doors. When you have the fact that Microsoft did Nvidia a favour for DX10, and a report that Assassin's Creed is to have a feature removed that grants the opposing card, with the newest technology, better AA performance, that gives you food for thought (speculation, of course, regarding Futuremark, but given past experience it seems possible).

I've done most of my gaming with ATI. Since I started getting decent cards I've had a 9800Pro, X800XT PE, 7600GT OC, X1900XT, 2900Pro and now the 8800GTX. I'm no fanboy, though, as otherwise I wouldn't have bought this card or the 7600GT. I still agree that the GTX (and Ultra) are the best single-GPU cards for high-res, high-IQ gaming. The X2 3870 was tempting, but for the small increase I thought I'd rather wait, as the G80 is the fastest card I've had the pleasure of owning. I'm just mentioning this before anyone thinks I hate Nvidia; I don't. I did think it was poor that the 7 series couldn't do AA and HDR together and had bad IQ, but I've had nothing but praise for the G80/G92. When the G92 arrived as the 9800GTX, and then I see the Assassin's Creed report, as well as knowing about Microsoft changing Vista for Nvidia and the 9800Pro being better supported than my GTX, I have every right to question the lacklustre advances from the company that is comfortably in the lead in the high-end GPU market.

I advise just about everyone to get the GT, GTS or G80 GTX. I know where it's at between the 3870XT and the GTX from owning the less efficient 3870XT (a 2900Pro overclocked). So for this round I have a good indication of ATI's and Nvidia's high-end single-GPU performance, and know that the GT, GTS and GTX all beat the 3870XT. I just admire ATI's advances and resilience whilst being disappointed in Nvidia's support, as well as in their not having DX10.1 already given the time they've had.
 
Regardless of why Ubisoft made this move with AC, the strange fact remains that this is the only patch I can think of for a game that actually removes a feature of this type. If it is purely down to IQ, why not let the user decide whether they want to turn it on or off, like so many other rendering options that affect IQ?

Why actively make this decision for the user?
 
Very nice, but that's not proof; at best it's circumstantial, and even that's pushing it.

No, it's not proof. It's a basis for why it could be true, and by that I mean the whole article.

If the blurring is just down to no AF being present, then that shouldn't have anything to do with DX10.1. The X2 3870's performance when AF was forced on through the control panel wasn't so great, though. Once a driver is released that can enable the AF without forcing it on, it will be better to talk about the textures, as performance could change.

Also, the link below points to another DX10.1 issue concerning Nvidia (and others), which again strengthens the probability that Assassin's Creed's DX10.1 support was pulled because of pressure from Nvidia.

http://forums.overclockers.co.uk/showthread.php?p=11602205&posted=1#post11602205
 
It's laughable really. Nvidia may have the faster cards at the minute, but they are clearly pretty dated in terms of features. Apparently speed is all that matters.

Quite right you are. It's usually ATI that brings new features to the market, forcing NVIDIA to upgrade their feature set. ATI are the technological leaders IMO; they have always been better than NVIDIA in that department.
 
So of all the new features of the ATI card, which ones are killer features? Are they used often enough to warrant the lower performance?

All the hullabaloo around DX10.1 in 3DMark does sound silly. Are all the users with DX10.1 really that desperate to find out how their card will fare in this new version of DX? Will the gap between DX9 and DX10 at release be the same as between DX10.1 and DX10?
 