• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

NVIDIA 4000 Series

I'm not as convinced of this. Ray tracing in its current state is, in a lot of cases, only feasible with upscaling. No problem with that as far as I'm concerned. But Ampere at high resolutions with full ray tracing is far from bulletproof, and with the rate of improvements expected in ray tracing coming up, I don't think Ampere will age that well.

Point taken with RDNA 2, but if both are struggling to run RT in a few years, the VRAM may be of more benefit for longer. It'll depend on how the long-promised DirectStorage and upscaling tech comes through.
Those RT options will be scalable though, i.e. low to medium to high to ultra. So if RT effects do get dialled up even further to where only the 40xx/RDNA 3 is capable of high/ultra, Ampere will be capable of medium whereas RDNA 2 won't even be capable of low. We have already seen RDNA 2 having to sacrifice the RT settings even when upscaling is used, in the same way any potentially problematic VRAM-intensive games can be solved by knocking the texture setting down one notch.

What DF etc. have found is that when RT is more complex and/or more effects are used, RDNA 2 completely buckles whereas Ampere performs even better:


The more complex the ray tracing in space, the better Ampere performs.

Alex from DF did a very good video showing how Ampere and RDNA 2 perform when you add more RT effects into the mix too.
 
Same old faces with the same old useless posts that add nothing of their own thoughts to the thread :cry:



As per usual, nothing of substance to come back with, I see, much like how this thread went down:


:cry:

 
I'm not as convinced of this. Ray tracing in its current state is, in a lot of cases, only feasible with upscaling. No problem with that as far as I'm concerned. But Ampere at high resolutions with full ray tracing is far from bulletproof, and with the rate of improvements expected in ray tracing coming up, I don't think Ampere will age that well.

Point taken with RDNA 2, but if both are struggling to run RT in a few years, the VRAM may be of more benefit for longer. It'll depend on how the long-promised DirectStorage and upscaling tech comes through.

Be like Bill.
 
What DF etc. have found is that when RT is more complex and/or more effects are used, RDNA 2 completely buckles whereas Ampere performs even better:
It'll be interesting to see if this is the case moving forward. If Nvidia are pushing ray tracing, it would make sense for them to really push it with a whole new architecture. Ampere marketing seemed to push 4K 60 fps much harder. I'm not saying Ampere is poor at ray tracing as things stand, but I think it could fall behind fairly quickly in a rapidly maturing technology, more so than general raster, as that seems to be slowing down (opinion, not fact).
 
There is no doubt Ampere will fall behind, as you can guarantee Nvidia will be pushing their sponsored titles hard with the RTX feature set if the 40xx dominates the RT space. But as mentioned, Ampere will still be far more capable of dialling up the RT effects than RDNA 2 will be, as we have seen time and time again with all the RT titles out now, the same way any potentially VRAM-heavy games will be able to have the texture setting dialled up 1-2 notches higher come the time. The difference is there are likely to be far more RT titles than VRAM-heavy titles imo. Maybe in 3-4 years' time there will be more VRAM-heavy titles, but chances are the vast majority will be on new GPUs with both far more VRAM and RT perf, even in the lower-end tier GPUs.

As always it comes down to how much one values RT though.

Whilst we're on the topic of RT and how Ampere and RDNA 2 perform in RT, the Spider-Man requirements are out now: the 3070 is in the same league as a 6900 XT for the second-highest RT option, and the 3080 is in the same league as the 6950 XT for the max RT preset at 4K:

[Image: Spider-Man PC requirements chart]

Meanwhile VRAM heavy/problematic titles?

[Lost John Travolta GIF]


:p
 
Oh... almost forgot, the 3090 also has 2GB IC's, but that's a £1,500 GPU, spend that to get plenty :D

The 3090 actually has 1GB ICs, 24 of them; the 3090 Ti has 2GB ICs, 12 of them. This is why I said there should have been 24GB/48GB versions of both these cards for the pro users too, as really that's who they were aimed at, and why I called the 3080 a disgrace of a card with 10GB VRAM. Some keep defending this, but the reality is it should have come with a minimum of 12GB from the start, and 16GB for the 3080 Ti.
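To put quick numbers on those memory configurations, here's a minimal Python sketch; the 48GB row is the hypothetical pro-oriented variant I'm arguing for, not a card that shipped.

Code:
# VRAM = number of memory ICs x capacity per IC (GB). Figures as discussed above;
# the 24 x 2GB row is a hypothetical pro-oriented variant, not an actual SKU.
configs = {
    "3090 (24 x 1GB)":             (24, 1),
    "3090 Ti (12 x 2GB)":          (12, 2),
    "hypothetical pro (24 x 2GB)": (24, 2),
}
for name, (ics, gb_per_ic) in configs.items():
    print(f"{name}: {ics * gb_per_ic} GB total")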

What they did with the 10GB 3080 was clear planned obsolescence, and anyone that doesn't see that has clearly not been a computer enthusiast long enough to recognise the VRAM games being played: such a powerful card was basically crippled by design on day one of release, to make sure you upgrade next gen when they bring out software that makes it look like the 8GB cards of the previous gen.

Anyone remember this :-


Clearly they used the VRAM limit to make the previous 2080 look like a failure and the 3080 a massive upgrade, but the reality, as we know, is that it was a VRAM limitation that Nvidia used to make their new gen look like a "huge upgrade"... Then they use the media to shill this information for them while they play innocent, handing out sponsored media review guides that show the limitations of the previous gen using newer software, or older software/hardware they know will fail compared to the new gen they are selling us at the time.


When I saw the 3080 released with 10GB, I was never going to buy a card with 10GB after coming from a 980 Ti 6GB over many years of owning that. I'm a person that keeps their cards for many years before updating the whole system again; normally now the GPU I buy remains in the system until I update the whole machine. Before, it was worth adding one GPU update before updating the full machine, but it hasn't really been worth doing that for me in a while. I used to buy every year's new GPU back when it was worth updating for real gains; now the gains are not worth it, either because no software takes advantage of them or because the resolution used shows no benefit from the upgrade due to CPU or other system bottlenecks. So this time I purchased 2 x 3090: I can pool the VRAM for 48GB for my work and SLI for some of the games that can use them both, knowing that 24GB is not going to limit games for a good few years to come. I also updated my full system and the screen to 5120 x 1440 with a higher refresh rate to make use of the cards without creating a CPU bottleneck, making sure the GPUs are the bottleneck at that resolution, so I get the max out of such an investment for the years I will use it.
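For anyone curious how that "pooled" 48GB adds up, here's a minimal sketch (assuming PyTorch with CUDA is installed) that just enumerates the installed cards and sums their VRAM. It's an upper bound only: games/SLI won't see one big pool, and whether a work application can actually use memory across NVLink depends entirely on that application.

Code:
import torch  # assumes a PyTorch build with CUDA support

total_bytes = 0
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 2**30:.0f} GiB")
    total_bytes += props.total_memory

# Upper bound only: whether this is usable as one pool depends on the workload
# (e.g. NVLink-aware renderers), not on the cards simply being installed.
print(f"Combined VRAM across cards: {total_bytes / 2**30:.0f} GiB")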

The 3080 will end up a 1440p card soon because of the 10GB VRAM, when really it would be a powerful enough card for 4K in future games if it had more VRAM. But they know that the 10GB will make people turn settings down at 4K sooner, and by then most people that have such a card will have 5120x1440 or 4K screens. Anyway, this argument has been going on since day one, with massive threads about it. The reality is 10GB is enough for now, but it will show its limits next gen for sure, and Nvidia will use that limit as they did with the 2080 vs 3080 video above; the same trick will be used again and they will make sure more games use over 10GB soon. AMD played the same game with Far Cry 6, as we know, and made sure cards needed 16GB at first, but then they trimmed it down to, was it 11GB? So the 3080 10GB still suffers, while the 3080 12GB, 3080 Ti, 3060 and 3090 were the only cards that could use the high-res texture pack, as all of those had 12GB or 24GB. The 3060 they didn't care about, as that card has GPU performance issues that stop it making real use of the texture pack at 4K, for example, where the texture pack shines. So really we have seen VRAM limits used as a weapon against the competitor and against the customer, to make them update sooner (planned obsolescence).

The 3080 10GB was the unicorn, sensibly priced card on release and was great value for this generation, but it had a dark side too that will rear its ugly head soon, sadly. They used the £650/$700 MSRP as a weapon against AMD as well; this is why AMD could not sell the 6800 XT at the fake MSRP they had and only allowed that MSRP in their own store at the time. Of course, the UK AMD store didn't exist for UK customers, because of Brexit they said, but in reality Nvidia managed it; AMD just didn't want to sell cards to the UK at a tiny profit margin or maybe even a slight loss in this market. They had to sell them somewhere at that MSRP or it would land them foul of the law as a fake MSRP, so they sold them on their store to countries where they would make a small profit or sell at cost, and didn't even sell many of them.

Looking forward to the 4000/7000 series cards but not looking forward to the silly games from these companies and the scalpers that will be back for sure this time.
 
It's getting late so I have skimmed the above post, but "crippling the 2080 to make the 3080 look better than it really is"........ Can I have some of what you're smoking please :cry: A 3080 destroys a 2080 and beats an extreme OC 2080 Ti too.... Do you not remember the footage of Doom Eternal (another "supposedly" VRAM-problematic game...)?

Heck, look at the 3070 vs 2080 Ti...... it's matching and beating it.

Oh and remind me again of the price difference of said gpus? :cry:

Again, of course if you have titles that push VRAM heavily, a 2080 Ti will probably be able to have the texture setting up one notch, but guess what...... its fps is still going to suck compared to the 3080 due to the weaker grunt.

Also, to reiterate, it is not really "planned obsolescence"; that would mean something can no longer be used at all, in effect saying the GPU can't play said game at all. The reality is: lower 1-2 settings and the problem is solved, the same way RDNA 2 users have to lower/turn off RT, or 3090 (Ti) users have to use DLSS balanced/performance, to get around 60 fps in CP2077 maxed out at 4K. It's only planned obsolescence if said users refuse to adjust settings appropriate to their hardware. Reminds me of that guy who tried maxing out CP2077 on his 3070 and then complained about how poorly optimised the game was :cry:

People have been saying the 3080 was doomed from day 1 with its VRAM, yet here we are, still waiting.... FC6 also plays fine on my end, as shown in my YouTube footage, except when I went and forced rebar on, where I was getting similar fps to what 6800 XTs were getting in tech press benchmarks.

Of course there will come a time where vram will "potentially" become an issue, see this point:

At one time 4 cores was more than enough and even overkill, the same way 2 cores was once more than enough and at one point also overkill. Technology and the requirements advance as time goes along, shock horror.....

And people still haven't answered this point yet either:

Who do you think is going to be better off in the end with all things considered (when it comes to just gaming experience on the whole and money spent)?

Option 1:

Someone who bought a 3070/3080 for their MSRP and is going to upgrade to a 4070/4080 for their MSRP (assuming MSRP is between £500-700)

Option 2:

Someone who bought a 3090 for their MSRP and isn't going to upgrade to a 4070/4080

Option 3:

Someone who bought a 3090 for their MSRP and is going to upgrade to a 4070/4080 for their MSRP (assuming MSRP is between £500-700)

The number of VRAM-heavy games this gen has been a far cry from what was expected…

But but but, the people who don't even own one kept insisting that there would be loads of games by now! :cry:
 

Yup. Too funny.

I kept saying there would be a handful of games this gen that may need more than 10GB and they were not having it. Turned out it was not even a handful, and of those not a single one I was personally interested in :cry:

Roll on next gen cards which will have the extra vram anyways :D
 
Option 4 - Bought 3080 at msrp and not intending to upgrade.

Could also be option 3, if that person is sufficiently rich to not worry.

Option 2 isn't ridiculous depending on use case - I know you stipulated gaming only, but in the real world GPUs often pull double duty.

Option 1 is going to be enthusiasts regardless of cost.

Who is better off? Damn tricky to answer.
 
Well yeah, obviously there could be loads of options :p But to fit the "is it worth more £ for the extra vram" theme, those options made the most sense. The same could even apply to the 3080 12GB models too.

From my POV, assuming old GPUs are sold on.... it's pretty obvious the option 1 person will be better off in both the short term and the long term, especially given how much value the 3090s/3080 12GB models are losing now and the 40xx hasn't even been announced yet. Of course a 3080 10GB model will also lose value, but not as much as the £1k+ GPUs... If money is a complete non-issue then obviously option 3 is good from that person's POV.

Option 2 would only make sense if one doesn't care about having the latest and greatest, wants said GPU to last as long as possible, and is content with having the new gen mid-tier GPU match/beat theirs for a much cheaper price.
 
Resale on the 3000 series will probably be good... but not whilst 4000 series cards are in stock at the beginning of the cycle. So say a 3080 bought for £649, a 4080 bought for.... £800? That's a total spend of £1,449. It's also dependent on availability: if you sell your 3080 cheap and then can't get a 4080... You might laugh, but some people won't learn/be aware of the lesson from last gen.

Recoup say £450 on the 3080 sale and that's an outlay of £1,000. Not a million miles off the MSRP of the 3090, and reliant on having been able to pick up a 3000 card at launch and being able to do the same with a 4000 series. £1,399 (I think) for a 3090 at launch is more, but that person will also have had two years out of it, probably more VRAM than either the 3080 or 4080, and greater performance for rendering too, so I don't think it's as black and white as you'd think.
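To make that back-of-the-envelope maths explicit, a quick sketch using the figures quoted in this post (estimates, not confirmed prices):

Code:
# Rough outlay per path, using the estimates from this post (not confirmed prices).
buy_3080, buy_4080, resale_3080 = 649, 800, 450
buy_3090 = 1399

upgrade_path = buy_3080 + buy_4080 - resale_3080  # buy 3080, sell it, buy 4080
print(f"3080 then 4080 (after resale): £{upgrade_path}")              # £999
print(f"3090 kept across both gens:    £{buy_3090}")                  # £1399
print(f"Difference:                    £{buy_3090 - upgrade_path}")   # £400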

I am not yet seeing a compelling case for upgrading gen on gen, but I'm clearly not a proper enthusiast :D
 
At one time 4 cores was more than enough and even overkill, the same way 2 cores was once more than enough and at one point also overkill. Technology and the requirements advance as time goes along, shock horror.....


You have got to be kidding, 4 cores was enough? :cry: The 4-core CPUs we had for a long, long time were because Intel didn't want the general consumer to get more, as they could not then sell their higher-core-count CPUs to businesses at the silly prices they charged, or the HEDT systems that real enthusiasts were buying, with CPUs at silly prices that in some cases had only 2 more cores than the consumer 4-core parts. Thanks to AMD, that stupidity from Intel was stopped.

You are now defending a company that deliberately held back technology from the general public and stagnated the whole industry because of their profit margins.

Wow, really... You clearly have not been around computer technology long enough to make such a statement as "4 cores were enough". Intel stagnated the whole PC industry with their 4-core CPUs, their monopoly, and the dirty tricks they pulled with OEMs like Dell etc. The 4-core CPU mess that never ended caused huge issues for real computer users, and huge price increases if you wanted more than 4 cores, whether HEDT consumer CPUs or Xeons. You clearly never owned a HEDT system then and never saw the real cost compared to the 4-core consumer systems.

Google is a good place for you to learn about Intel. Search "Intel monopoly" and understand the reality of what Intel did to the CPU market and how they deliberately drip-fed us minor updates for over a decade. :rolleyes:


Sorry Nexus18, that is your worst argument so far mate, defending the never-ending 4-core nightmare that happened then. Remember some of us use computers for more than playing games, and even game makers got tired of the 4 cores that limited them in some games. Intel would say "well, use hyper-threading and you have 8 threads" lol. They created hyper-threading to make it look like the CPU had double the cores, but in reality it doesn't; HT is a good feature that uses idle cycles on a core so the core isn't wasted, but it is not a magic bullet for games, and games require real cores. Some applications made very good use of HT, some it made slower, and the same with some games. Remember the days people were turning off HT for some games, because HT would make them slower in some cases?

Anyway... mate, don't defend these companies that are deliberately fleecing you through deliberate planned obsolescence, or stagnating the industry to protect their other product lines with higher margins. Nvidia and Intel are the prime examples of this behaviour because they are monopolies. AMD is no saint either, but as the company that is normally behind those two, they are forced to make industry changes to stand out, and thanks to them they keep Intel and Nvidia a little more honest. As we have seen, the minute AMD is ahead the same game plays out, and then we need Intel or Nvidia to keep them honest. That is why competition is very important to the consumer, and why I am happy to see Intel in the dGPU market now, even without amazing products yet, as it will keep the industry a little more honest and competitive.
 
Ampere marketing seemed to push 4K 60 fps much harder.

This. Although cries of "lacks horsepower" would seem to conflict with it. I have noticed liked posts with the same information some of us were posting, so it seems some have warmed to the situation, albeit through gritted teeth.
 
...why I called the 3080 a disgrace of a card with 10GB VRAM. Some keep defending this, but the reality is it should have come with a minimum of 12GB from the start, and 16GB for the 3080 Ti.

What they did with the 10GB 3080 was clear planned obsolescence, and anyone that doesn't see that has clearly not been a computer enthusiast long enough to recognise the VRAM games being played: such a powerful card was basically crippled by design on day one of release, to make sure you upgrade next gen when they bring out software that makes it look like the 8GB cards of the previous gen.
...
The 3080 will end up a 1440p card soon because of the 10GB VRAM, when really it would be a powerful enough card for 4K in future games if it had more VRAM. But they know that the 10GB will make people turn settings down at 4K sooner, and by then most people that have such a card will have 5120x1440 or 4K screens. Anyway, this argument has been going on since day one, with massive threads about it. The reality is 10GB is enough for now, but it will show its limits next gen for sure.

Mate, it comes out of your mouth and there's little resistance. When I said this months ago it was pitchforks and torches from the usual suspects getting antsy. Interested to see the reaction all the same, though!
 
Resale on the 3000 series will probably be good... but not whilst 4000 series cards are in stock at the beginning of the cycle. So say a 3080 bought for £649, a 4080 bought for.... £800? That's a total spend of £1,449. It's also dependent on availability: if you sell your 3080 cheap and then can't get a 4080... You might laugh, but some people won't learn/be aware of the lesson from last gen.

Recoup say £450 on the 3080 sale and that's an outlay of £1,000. Not a million miles off the MSRP of the 3090, and reliant on having been able to pick up a 3000 card at launch and being able to do the same with a 4000 series. £1,399 (I think) for a 3090 at launch is more, but that person will also have had two years out of it, probably more VRAM than either the 3080 or 4080, and greater performance for rendering too, so I don't think it's as black and white as you'd think.

I am not yet seeing a compelling case for upgrading gen on gen, but I'm clearly not a proper enthusiast :D

I know people use their PC for more than gaming, hence why I stated in the question:

Who do you think is going to be better off in the end with all things considered (when it comes to just gaming experience on the whole and money spent)?

:p

Most people on this forum are just gamers though. I would say the likes of purgatory are very rare, where they need all the VRAM they can get for their workloads (and obviously they bought the 3090 with this in mind, so the question doesn't really apply to that kind of "power" user).


The 3090 MSRP was/is £1,400.

I suspect a second-hand 3080 will go for £300-400 (as this will be the price range of the 4060 imo) and a 4080 will be £750-800.

A 3090 is not going to go for any more than £600 at the very most come the 4070 announcement (second-hand 3090s are already going for very low prices on the bay and at high street retailers, circa £800-900, not to mention if they have been mined on....). See what happened in the MM when people tried selling their 2080 Tis when the 3070 got announced :cry: I am looking forward to putting some cheeky offers in, along with a half-eaten ham sandwich ;) :D :cry:

Of course, we will have to see what the demand, market and stock are like though. But you would be silly not to jump on new tech whilst you can, as the 30xx cards are just going to lose even more value as time goes on.

It is going to work out great for me regardless, as I sold my 290 for £90, Vega 56 for £225 and COD for £15 to put towards my £650 3080 ;) (also sold the PS4 Pro as well for £200+ but I don't really count that).
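For what it's worth, the sum on that one (figures from this post, PS4 Pro left out as said):

Code:
# Net cost of the £650 3080 after selling on the old kit (PS4 Pro not counted).
price_3080 = 650
sold = {"290": 90, "Vega 56": 225, "COD": 15}

net = price_3080 - sum(sold.values())
print(f"Effective 3080 cost: £{net}")  # £320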

You have got to be kidding, 4 cores was enough? :cry: The 4-core CPUs we had for a long, long time were because Intel didn't want the general consumer to get more, as they could not then sell their higher-core-count CPUs to businesses at the silly prices they charged, or the HEDT systems that real enthusiasts were buying, with CPUs at silly prices that in some cases had only 2 more cores than the consumer 4-core parts. Thanks to AMD, that stupidity from Intel was stopped.

You are now defending a company that deliberately held back technology from the general public and stagnated the whole industry because of their profit margins.

Wow, really... You clearly have not been around computer technology long enough to make such a statement as "4 cores were enough". Intel stagnated the whole PC industry with their 4-core CPUs, their monopoly, and the dirty tricks they pulled with OEMs like Dell etc. The 4-core CPU mess that never ended caused huge issues for real computer users, and huge price increases if you wanted more than 4 cores, whether HEDT consumer CPUs or Xeons. You clearly never owned a HEDT system then and never saw the real cost compared to the 4-core consumer systems.

Google is a good place for you to learn about Intel. Search "Intel monopoly" and understand the reality of what Intel did to the CPU market and how they deliberately drip-fed us minor updates for over a decade. :rolleyes:


Sorry Nexus18, that is your worst argument so far mate, defending the never-ending 4-core nightmare that happened then. Remember some of us use computers for more than playing games, and even game makers got tired of the 4 cores that limited them in some games. Intel would say "well, use hyper-threading and you have 8 threads" lol. They created hyper-threading to make it look like the CPU had double the cores, but in reality it doesn't; HT is a good feature that uses idle cycles on a core so the core isn't wasted, but it is not a magic bullet for games, and games require real cores. Some applications made very good use of HT, some it made slower, and the same with some games. Remember the days people were turning off HT for some games, because HT would make them slower in some cases?

Anyway... mate, don't defend these companies that are deliberately fleecing you through deliberate planned obsolescence, or stagnating the industry to protect their other product lines with higher margins. Nvidia and Intel are the prime examples of this behaviour because they are monopolies. AMD is no saint either, but as the company that is normally behind those two, they are forced to make industry changes to stand out, and thanks to them they keep Intel and Nvidia a little more honest. As we have seen, the minute AMD is ahead the same game plays out, and then we need Intel or Nvidia to keep them honest. That is why competition is very important to the consumer, and why I am happy to see Intel in the dGPU market now, even without amazing products yet, as it will keep the industry a little more honest and competitive.
So you're saying there was "never" a point where 4 cores was overkill? Sounds like you haven't been around PC tech long enough :cry: I didn't even mention Intel, and I'm not disputing any of the points about Intel either, so I'm not sure why you went on a spiel about that thinking I'm defending them? :confused: You sound like humbug :cry:

And you missed the key point I highlighted:

Technology and the requirements advance as time goes along, shock horror.....

I'm not defending either company; as said, I buy what's best and makes the most sense at the time, hence why I have owned all brands, funnily far more AMD-based hardware than Nvidia and Intel combined....

PS. what about the rest of the points?

It's getting late so I have skimmed the above post, but "crippling the 2080 to make the 3080 look better than it really is"........ Can I have some of what you're smoking please :cry: A 3080 destroys a 2080 and beats an extreme OC 2080 Ti too.... Do you not remember the footage of Doom Eternal (another "supposedly" VRAM-problematic game...)?

Heck, look at the 3070 vs 2080 Ti...... it's matching and beating it.

Oh and remind me again of the price difference of said gpus? :cry:

Again, of course if you have titles that push VRAM heavily, a 2080 Ti will probably be able to have the texture setting up one notch, but guess what...... its fps is still going to suck compared to the 3080 due to the weaker grunt.

Also, to reiterate, it is not really "planned obsolescence"; that would mean something can no longer be used at all, in effect saying the GPU can't play said game at all. The reality is: lower 1-2 settings and the problem is solved, the same way RDNA 2 users have to lower/turn off RT, or 3090 (Ti) users have to use DLSS balanced/performance, to get around 60 fps in CP2077 maxed out at 4K. It's only planned obsolescence if said users refuse to adjust settings appropriate to their hardware. Reminds me of that guy who tried maxing out CP2077 on his 3070 and then complained about how poorly optimised the game was :cry:

People have been saying the 3080 was doomed from day 1 with its VRAM, yet here we are, still waiting.... FC6 also plays fine on my end, as shown in my YouTube footage, except when I went and forced rebar on, where I was getting similar fps to what 6800 XTs were getting in tech press benchmarks.

Of course there will come a time where vram will "potentially" become an issue, see this point:


And people still haven't answered this point yet either:

Who do you think is going to be better off in the end with all things considered (when it comes to just gaming experience on the whole and money spent)?

Option 1:

Someone who bought a 3070/3080 for their MSRP and is going to upgrade to a 4070/4080 for their MSRP (assuming MSRP is between £500-700)

Option 2:

Someone who bought a 3090 for their MSRP and isn't going to upgrade to a 4070/4080

Option 3:

Someone who bought a 3090 for their MSRP and is going to upgrade to a 4070/4080 for their MSRP (assuming MSRP is between £500-700)

Very curious on the planned obsolescence bit being explained.



PS.

On Ampere being pushed more as a 4K60 architecture, that is true; after all, Ampere generally tops 4K across the "majority" of games across most tiers of GPUs, and that's not even factoring in RT or DLSS availability in games (see TPU and Hardware Unboxed for their comparisons [SAM/rebar is on too]).
 
So you're saying there was "never" a point where 4 cores was overkill? Sounds like you haven't been around PC tech long enough :cry: I didn't even mention Intel, and I'm not disputing any of the points about Intel either, so I'm not sure why you went on a spiel about that thinking I'm defending them? :confused: You sound like humbug :cry:

And you missed the key point I highlighted:



I'm not defending either company; as said, I buy what's best and makes the most sense at the time, hence why I have owned all brands, funnily far more AMD-based hardware than Nvidia and Intel combined....
No, there was never a point where 4 cores was overkill for me. Any person that knew how to use a computer, and power users especially, always wanted more, because Intel only gave you what they thought you needed and what they wanted to give you at a set price; they clearly dictated and stagnated the development of better computers.

As a power user, even then the CPUs were never powerful enough for some of the tasks I and the company did, unless we purchased very expensive hardware that was clearly overpriced for what it gave too. Gaming even then required better CPUs than the 4-core parts; flight simulator users always knew this, as we always had a CPU bottleneck due to how FS worked and wanted more CPU power. There are many more examples of other software at the time that needed more, and faster, cores.


Intel dictated the CPU market in that 4-core period and before it, so as a long-time enthusiast you would have known that; there is no one else you could have been talking about, as AMD didn't really have any CPUs that could beat Intel during that time, and any that did have a slight advantage were frowned upon by OEMs because of the monopoly Intel had at the time, the deals they ran with OEMs, and the kickbacks (aka rebates) Intel was giving any OEM not to supply AMD products.

Also, the general consumer only ever saw Intel adverts and assumed any good computer would have an Intel CPU in it, thanks to their marketing tactics at the time.

Intel ..

Intel's Disgraceful Marketing Gets Worse

AMD vs Intel - The Past, The Present and Near Future

Nvidia..

[embedded videos]

Ohh, Intel even knew they were a monopoly and joked about it in an advert ...

WRAP EU fines Intel 1.06 billion euros for monopoly abuse, Intel reax, file, rival reax

Intel Sued by U.S. for Illegal Monopoly Practices: Video

Intel Loses Court Appeal Against $1.4bn EU Fine

EU Intel Antitrust case

CEO of Intel different philosophies behind antitrust laws in EU vs. US
 
4 cores was overkill for gaming in 2008, I remember those days. I owned one of the first quad core CPUs and people told me I should have bought a dual core because they overclocked better
 