
Nvidia doesn't just want to dominate the graphics card market; it wants to own it.

AMD are far from squeaky clean, hence another lawsuit against them for mis-selling the Bulldozer chip.

You mean like the guy who said it wasn't an 8 core CPU, yet it has 8 integer cores, and this was known from launch?? If AMD were mis-selling that, then so were Intel, Cyrix and many other companies, which in the past sold CPUs without functional FPUs or even had external ones.

Edit!!

Even Anandtech says it has 8 integer cores:

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested
 
You mean like the guy who said it wasn't an 8 core CPU, yet has 8 integer cores?? If AMD were mis-selling that, then so were Intel, Cyrix and many other companies, which in the past sold CPUs without functional FPUs or even had external ones.

But if you use all 8 they don't "work as well" as if you only use 4, or have I got that wrong?
 
But if you use all 8 they don't "work as well" as if you only use 4, or have I got that wrong?

They are 8 separate integer cores - it's the floating point units which are shared, but if you go that way with FPUs it means Intel, VIA and loads of other companies are open to litigation.

Lots of parts of modern CPUs are also shared (caches and the like).
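For anyone who wants to see how the OS actually reports this, here's a rough Linux-only sketch of my own (not from the article below) that reads each logical CPU's core_id out of sysfs. On an FX-8150 all 8 integer cores show up as distinct cores, even though pairs of them share an FPU and an L2 cache:

```python
# Linux-only sketch: list the core_id the kernel assigns to each logical CPU.
# On an FX-8150 each of the 8 integer cores gets its own core_id, even
# though pairs of cores share one FPU and one L2 cache.
import glob

paths = sorted(
    glob.glob("/sys/devices/system/cpu/cpu[0-9]*/topology/core_id"),
    key=lambda p: int(p.split("/")[5][3:]),   # numeric sort: cpu0, cpu1, cpu2...
)
for path in paths:
    cpu = path.split("/")[5]                  # e.g. "cpu0"
    with open(path) as f:
        print(f"{cpu}: core_id={f.read().strip()}")
```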

Anandtech went through the uarch:

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested

The main issue with BD was that it had very "narrow" cores, which means each core was slimmed down compared to the Phenom II.

But the world+dog knew that, since there was so much information out there; only a total idiot or moron would have missed it.

It just comes across as some bloke trying to use the US legal system to get some easy money.

You only have to look at the ridiculous lawsuits in the US which would never see the light of day in Europe.
 
They are 8 separate integer cores - it's the floating point units which are shared, but if you go that way with FPUs it means Intel, VIA and loads of other companies are open to litigation.

Anandtech went through the uarch:

http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested

The main issue with BD was that it had very "narrow" cores, which means each core was slimmed down compared to the Phenom II.

But the world+dog knew that, since there was so much information out there; only a total idiot or moron would have missed it.

Didn't AMD claim that the Intel Core 2 Quad weren't proper quad cores?
Are you saying that on a Sandy Bridge 2500K, if you run 4 cores at 4.0GHz they don't, individually, run as well as a single core at 4.0GHz? Genuine question.
 
Didn't AMD claim that the Intel Core 2 Quad weren't proper quad cores?
Are you saying that on a Sandy Bridge 2500K, if you run 4 cores at 4.0GHz they don't, individually, run as well as a single core at 4.0GHz? Genuine question.

Dude, seriously, the lawsuit will not hold any water. The uarch has been well understood for ages. If you really want it to succeed, there are whole generations of Intel, AMD and Cyrix CPUs going back to the 80s and 90s which had no on-die FPUs or disabled FPUs.

Don't believe me? Look back at 30 years of CPU development and come back.

8 integer workloads can work independently of each other, so it's an 8 core CPU by all traditional metrics.
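If you want to check the independence claim yourself, here's a rough sketch of the kind of test I mean (my own illustration with made-up loop counts; CPython overhead blurs the integer/FP split, so treat the numbers as indicative only). Each worker does a fixed amount of work, so with perfect scaling the 8-worker time stays close to the 4-worker time:

```python
# Sketch: compare 4-worker vs 8-worker scaling for integer-only work and
# FPU-heavy work. Fixed work per worker, so ideal scaling keeps the
# 8-worker wall time close to the 4-worker wall time.
import time
from multiprocessing import Pool

def int_work(_):
    x = 0
    for i in range(5_000_000):
        x += i * 3 // 7            # integer ALU work only
    return x

def fp_work(_):
    x = 0.0
    for i in range(5_000_000):
        x += i * 1.0000001         # floating point work, exercises the shared FPU
    return x

def timed(fn, workers):
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(fn, range(workers))
    return time.perf_counter() - start

if __name__ == "__main__":
    for fn in (int_work, fp_work):
        t4, t8 = timed(fn, 4), timed(fn, 8)
        print(f"{fn.__name__}: 4 workers {t4:.2f}s, 8 workers {t8:.2f}s")
```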

These are stupid arguments which were fought over like 4 years ago - go back to all the threads and articles from then.

I am off to play FO4 - see you lot tomorrow.
 
Dude, seriously, the lawsuit will not hold any water. The uarch has been well understood for ages. If you really want it to succeed, there are whole generations of Intel, AMD and Cyrix CPUs going back to the 80s and 90s which had no on-die FPUs or disabled FPUs.

Don't believe me? Look back at 30 years of CPU development and come back.

Not really asking about the lawsuit now...
 
Dude, seriously, the lawsuit will not hold any water. The uarch has been well understood for ages. If you really want it to succeed, there are whole generations of Intel, AMD and Cyrix CPUs going back to the 80s and 90s which had no on-die FPUs or disabled FPUs.

Don't believe me? Look back at 30 years of CPU development and come back.

8 integer workloads can work independently of each other, so it's an 8 core CPU by all traditional metrics.

These are stupid arguments which were fought over like 4 years ago - go back to all the threads and articles from then.

I am off to play FO4 - see you lot tomorrow.

While I think it is a tough case to win, what you say is largely irrelevant. CPU designs of the 1980s don't apply to the 2010s market. The case will revolve around the details of AMD's claims and marketing material, and what the lawyers decide terms like "core" mean. In modern marketing a core has always referred to full FPU and integer capability in a single module; AMD advertised 8 cores, but the cores were not at all what people would expect.
 
Dude, seriously, the lawsuit will not hold any water. The uarch has been well understood for ages. If you really want it to succeed, there are whole generations of Intel, AMD and Cyrix CPUs going back to the 80s and 90s which had no on-die FPUs or disabled FPUs.

Don't believe me? Look back at 30 years of CPU development and come back.

8 integer workloads can work independently of each other, so it's an 8 core CPU by all traditional metrics.

These are stupid arguments which were fought over like 4 years ago - go back to all the threads and articles from then.

I am off to play FO4 - see you lot tomorrow.

Agreed.

AMD lawsuit over false Bulldozer chip marketing is bogus

the lawsuit is utterly without technical merit...

www.extremetech.com/extreme/217672-...lse-bulldozer-chip-marketing-is-without-merit
 
+1.

I read on some random website that NVIDIA's market share dropped to an estimated 50%! I don't have the source of course, but everyone will take me at my word, right? :)

It helps when you're joking not to pick the correct number :) Around 50% of gamers use NVIDIA cards for games, around 26% use AMD, and more than 20% use APUs rather than discrete cards, which I find the most interesting, as the number of gamers doing that has been steadily increasing. The numbers move around a bit month to month, but that's roughly where they've been for the past year.
 
Source?
If it's the ubiquitous Steam survey then you need to take into account that about 50% of Steam survey results are from laptops.

Steam has around 125 million users, but there are 40 million+ AIBs (add-in boards) sold every year, so what do we think the installed base of AIBs is? How often do they upgrade?
You really have to be aware of what figures you are talking about and where those figures come from. You can't just pluck figures out of the air or off the Steam survey and say that they apply to "PC gaming" - are we talking AAA gaming, or are you happily including all of the mobile/tablet-like games that are on Steam?
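To put rough numbers on that, the installed base is roughly annual sales times the average replacement cycle. A throwaway sketch (the 40 million/year figure is from above; the cycle lengths are pure guesses):

```python
# Back-of-the-envelope: installed base ~= annual sales * replacement cycle.
# 40 million/year is the figure quoted above; the cycle lengths are guesses.
annual_aib_sales = 40_000_000
for cycle_years in (2, 3, 4, 5):
    installed = annual_aib_sales * cycle_years
    print(f"{cycle_years}-year cycle -> ~{installed / 1e6:.0f}M cards in use")
```

Depending on the cycle you assume, that lands anywhere from about 80M to 200M cards - below or well above Steam's 125 million users - which is exactly why you have to be careful whose figures you quote.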
 
But if you use all 8 they don't "work as well" as if you only use 4, or have I got that wrong?

It's going to fall flat on its face because not only is his description of what makes a core wrong, but his fundamental argument that it doesn't scale from 4 to 8 cores is also wrong; in fact it does, just not in the same way as Intel's Hyper-Threading, and it was never designed to. HT has 4 cores and 8 FPUs while Bulldozer has 8 cores and 4 FPUs; Intel scale floating point calc while Bulldozer scale integer calc.

By his description an i7 4790K is an 8 core while Bulldozer is a 4 core. We and the rest of the world know it's the other way round; he has no idea what he's talking about and he's going to get humiliated.
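If he actually wanted to test the scaling claim, this is the sort of thing you'd run: pin two integer workers either to the two cores of one Bulldozer module or to cores in separate modules, and compare. A rough Linux-only sketch of my own; it assumes logical CPUs 0 and 1 share a module while 0 and 2 don't, which you'd have to verify against /sys/devices/system/cpu/*/topology on your own box:

```python
# Sketch: time two pinned integer workers that either share a Bulldozer
# module or sit in separate modules. Linux-only; the CPU numbering for
# module pairs is an assumption - check sysfs topology on your machine.
import os
import time
from multiprocessing import Process

def work(cpu):
    os.sched_setaffinity(0, {cpu})     # pin this process to one logical CPU
    x = 0
    for i in range(20_000_000):
        x += i * 3 // 7                # integer-only load

def timed(cpus):
    procs = [Process(target=work, args=(c,)) for c in cpus]
    start = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"same module (0,1):      {timed((0, 1)):.2f}s")
    print(f"separate modules (0,2): {timed((0, 2)):.2f}s")
```

For integer work the two cases should come out close, which is the whole point about independent integer cores; swap in a floating point loop and the shared-FPU case should fall behind.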
 
I don't believe Nvidia can do adaptive sync without changes to the architecture, so no, Nvidia can't do adaptive sync and make it as good as AMD can, because AMD had the scaler on their cards already.

So Nvidia made G-Sync so it works with their hardware. No one had to buy G-Sync, you know; I haven't, but many people like it, so those people bought it.

That's surprising? I mean, Nvidia are managing to use G-Sync without the module in the monitor with laptops in testing, so I would have thought software would offer a way around an issue like that. But even if that's not the case with current offerings, the tech is available for them to use with Pascal if they choose to, unlike G-Sync for AMD.
That was my point: you can't call them the bad guy for wanting a similar tech to use, i.e. you can't say they're the ones fragmenting the industry when the first version of a new-to-market tech is locked down by one provider.

+1.

I read on some random website that NVIDIA's market share dropped to an estimated 50%! I don't have the source of course, but everyone will take me at my word, right? :)

Quoted from the Gospel of Saint Dave2150
 
Same as Intel CPUs then, as they'll turbo higher when only using one core. :p

I've overclocked my Intel CPUs so they don't turbo, whether using 1 core or 4.
I've also overclocked my Piledriver so all cores run at the same clock speed.
So if I load one core, is it running any differently to each of the cores if I load all 4? If you have a Hyper-Threading CPU, does it affect the performance of the 'cores' when a Hyper-Threading thread is running?
If I load one core on my Piledriver CPU, does that 1 core run any differently than the 8 cores if I load all 8?
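One way to actually answer that on Linux is to read the clock the kernel reports for each logical CPU out of /proc/cpuinfo, once while loading 1 core and once while loading all of them, and compare. A quick sketch (Python 3.8+, Linux only):

```python
# Sketch: print the current clock reported for each logical CPU.
# Run it while 1 core is loaded, then while all cores are loaded, and compare.
import re

with open("/proc/cpuinfo") as f:
    clocks = [float(m.group(1))
              for line in f
              if (m := re.match(r"cpu MHz\s*:\s*([\d.]+)", line))]

for i, mhz in enumerate(clocks):
    print(f"cpu{i}: {mhz:.0f} MHz")
```

With fixed all-core overclocks like those described above, every core should report the same figure whatever the load, which is the point of the question.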

That's surprising? I mean, Nvidia are managing to use G-Sync without the module in the monitor with laptops in testing, so I would have thought software would offer a way around an issue like that. But even if that's not the case with current offerings, the tech is available for them to use with Pascal if they choose to, unlike G-Sync for AMD.
That was my point: you can't call them the bad guy for wanting a similar tech to use, i.e. you can't say they're the ones fragmenting the industry when the first version of a new-to-market tech is locked down by one provider.



Quoted from the Gospel of Saint Dave2150

Do we know G-Sync wasn't available to AMD, or did they just turn it down?
Nvidia seem to get plenty of blame for us having 2 solutions that cause lock-in, but they did it first. How is it their fault someone made an alternative? Now Nvidia have to abandon their solution?
 
That's surprising? I mean, Nvidia are managing to use G-Sync without the module in the monitor with laptops in testing, so I would have thought software would offer a way around an issue like that. But even if that's not the case with current offerings, the tech is available for them to use with Pascal if they choose to, unlike G-Sync for AMD.

That's because it's not true G-Sync in laptops; it's actually adaptive sync. The way laptop monitors are designed allows this to be used: it doesn't need a module because the hardware required is already there. I believe adaptive sync is the way forward and would be a "standard" to adopt, but the tech for desktop monitors is not available yet, whether through the DP/HDMI specification or the monitor hardware, something like that.
 