i3 6100 not good enough for BF1

Wow, you really are a troll.

I also linked you to IEEE Spectrum, which I believe stated in the article that they were working with Intel on the problem.

There are many sources that say transistors wear out with usage, and a microprocessor is simply millions of transistors on a silicon die. As transistors die, error correction means calculations take longer.

There are even books on it co-written by IBM researchers:

https://books.google.co.uk/books?id...AEImQEwFQ#v=onepage&q=transistor wear&f=false


I can find a post or an article that states the opposite in five seconds...but I just can't be bothered...

You're spouting rubbish about being able to detect speed decreases that are beyond human perception and could only be measured with an oscilloscope and whatnot...

It's ludicrous!
 
Sure you can find posts about it. There are lots of forum arguments for and against it. However, they're just speculation by amateurs.

That's why I chose to post articles that claim to be reporting Intel research, and a book co-written by IBM researchers.

At the end of the day, you can believe what you believe. However, I've experienced system slowdowns over time that cannot be relieved by system cleaning (hardware and software) or by reinstalling Windows, and when experts in the field write about CPUs slowing down over time due to transistor wear, I tend to believe them: I'm an amateur, they're the experts in their field, and so they should know what they're talking about.
 

Dude

Please stop... you are experiencing a placebo effect. To suggest an i7 degrades less than an i5 when you use the i5 far less in your laptop...

is, well, madness...
 
Test one's five-year-old CPU: does it score the same in benchmarks now as it did when new? Yes.

Is CPU degradation blown out of proportion? Yes.
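Something like this quick Python sketch is all a test like that needs, assuming you kept a figure from when the chip was new (the workload and run count here are arbitrary choices, not any standard benchmark):

```python
# Quick repeatable CPU check: time a fixed integer workload and keep the
# best of several runs. On a genuinely degraded chip, the best time on
# identical hardware and software would creep upward versus the old result.
import time

def workload(n=2_000_000):
    # A fixed amount of integer work; the same input means the same work every run.
    total = 0
    for i in range(n):
        total += (i * i) % 97
    return total

best = float("inf")
for _ in range(5):  # keep the fastest of five runs to reduce OS scheduling noise
    start = time.perf_counter()
    workload()
    best = min(best, time.perf_counter() - start)

print(f"best of 5 runs: {best:.3f} s")  # compare with the figure recorded when new
```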
 
Yeah, since it's about 60 quid cheaper than the AMD 8370.

Still a bottleneck though. People pay £1,100 for less than a 50 fps jump from a new graphics card (i.e. the Titan XP), so a few hundred for a new CPU to do the same is a bargain, tbh.
 
Well yeah, looking at the same result with the Fury X it's around a 20 fps difference, but that's still the sort of gain people upgrade from a 970 to a 980 Ti for. Still playable at 1080p, obviously.
 
An i3 6100T vs an i7-6700K, DX11, with a GTX 1080:

A 50-60 fps difference, which is a lot no matter how you look at it.

i3: 94 fps

i7: 148 fps


http://www.techspot.com/review/1267-battlefield-1-benchmarks/page4.html

It's a bit frustrating they used the i3 6100T, as the difference between 3.2 GHz and 3.7 GHz is fairly substantial (approximately 15%). The i7 costs approximately 3x as much as the i3 6100, so the i3 would seem to offer amazing value for money. It would also have been interesting to show the 1440p or even 4K benchmarks, as I imagine the gap would have been even smaller when the game becomes more GPU-bound.
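As a quick sanity check on those numbers (clocks from this post, fps figures from the TechSpot page; a throwaway sketch, nothing rigorous):

```python
# Clock-speed gap between the i3 6100T (3.2 GHz) and the i3 6100 (3.7 GHz)
clock_gap = (3.7 - 3.2) / 3.2
print(f"clock gap: {clock_gap:.1%}")  # ~15.6%, the "approximately 15%" above

# FPS gap between the i3 (94 fps) and the i7 (148 fps) quoted earlier
fps_gap = (148 - 94) / 94
print(f"i7 over i3: {fps_gap:.1%}")   # the i7 averages ~57% higher fps
```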
 

Like all of these reviews, they are also benching single player, and a part where there is nothing going on; there is only a light to medium load on one or two threads.

So i3 / i5 / i7 doesn't make the blindest bit of difference.

Take it to multiplayer, where there is a lot going on, and with the extra threads on the i5 and i7 the performance runs away from the i3, sometimes by huge amounts.

Reviews like this on CPUs are utterly pointless, as they bear no resemblance whatsoever to reality. If reviewers don't know this they are stupid; these reviews are just easy page-click magnets for people who like to link them to forums like this to make a statement.
 

I agree that real-world mileage vs benchmarks can be very different. It's a shame that they didn't include a CPU benchmark in the game to at least give an idea of CPU performance.

Another great example of benchmark vs reality is ROTTR: the benchmark in that gives me amazing results... a far cry from the reality of the Geothermal Valley.
 

You are phrasing it oddly, which suggests you aren't quite grasping what they mean, particularly your comments on i7s wearing out more slowly (plus the subsequent "PC slowing down due to a worn-out CPU" comment) and transistors dying causing calculations to take longer.

In that book they are talking about the switching speed of individual transistors slowing down as they age (degrade). Each switch has to complete within the clock cycle of a circuit, otherwise the data/input is lost and the circuit is not functioning as required. So long as the transistors switch fast enough for the circuit, all is fine. If they do not, then for proper operation the clock speed would need to be reduced, as the chip/circuit can no longer reliably operate at its former clock speed. As the book you linked to states, that is a failure, not a "slowing down" of the circuit/microprocessor as you put it. The circuit cannot operate at clock speeds that require the transistors to switch faster than they are capable of.

Error correction would make calculations appear to take longer, as they would need to be repeated once an error is caught. The Intel work/tech you mentioned to tackle the problem of transistor aging leading to errors aims to reduce the clock speed so that the circuit/microprocessor no longer produces errors. Under that tech the microprocessor underclocks itself, or a domain of itself. This way a couple of dodgy transistors cause only a minor overall drop in throughput, as a particular subsystem's clock is reduced by, for example, 10%, rather than having 30% of the operations using that circuit throw up errors and need repeating.
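A toy model makes that trade-off concrete, using the illustrative 10% and 30% figures from the paragraph above and assuming each failed operation is caught and retried exactly once:

```python
# Toy throughput model: underclocking a faulty domain vs. retrying errors.
# Baseline throughput is normalised to 1.0 at the full clock speed.
baseline = 1.0

# Option A: drop the affected domain's clock by 10% so no errors occur.
underclocked = baseline * 0.90          # 0.90x the original throughput

# Option B: keep the full clock, but 30% of operations through the dodgy
# circuit error out, get caught, and must be re-executed once.
error_rate = 0.30
retry = baseline / (1 + error_rate)     # ~0.77x the original throughput

print(f"underclock by 10%:   {underclocked:.2f}x")
print(f"retry 30% of errors: {retry:.2f}x")
```

So even this crude model favours the self-underclocking approach over letting the errors happen and repeating the work.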
 
I'm upgrading from a 2500K and have a 980 Ti. By the sounds of it, most of you are recommending a 6600K or 6700K? And what is the cheapest motherboard you would pair it with? I've only got a couple of SSDs, so I don't need any fancy mobo.
 
Is an upgrade away from an i3 6100 the right move?

IMO, only if/when you have to drop the settings to, say, medium on everything to play most games at 60 fps, and even then, if you still enjoy playing on medium, why upgrade? I can play everything at over 60 fps on ultra at the moment with my setup, except the new Deus Ex. For me there's no point at all in upgrading; people can get stuck in the crowd of constantly upgrading when they don't really have to, IMO :)
 
I guess it would take a long time for a chip to decline. I've benchmarked three- to four-year-old CPUs in the past and they always scored the same as the initial reviews from their release.

By the time a CPU is declining in performance (if at all), it would probably be long past its proper use anyway.
 

I do feel, though, given most people are striving for a smooth 60 fps experience, even the i3 is complete overkill. The minority who want 144 fps would benefit from an i7, though. It's a case of: yes, the i7 is a lot quicker, but the i3 is more than you need anyway!

Also, the i3 'T' model is a low-power one. Choose a full-power 3.8 GHz one and the gap will close, especially in the minimums.
 
Really? What do you blame then, Windows? Because it was upgraded to Win 10 Pro 64-bit with a fresh installation and is still slow.

In terms of wear, my desktop is in use 12-18 hours per day, 365 days per year. My laptop can go a week without even being turned on, and then it gets an hour's usage.

Even when new, the i5 didn't even come close to the i7 in performance. I know others with i5s, and their desktops pale in comparison to the speed I have on mine. You get what you pay for.

You couldn't give me an i5 or i3.

As for gaming, the biggest bottleneck is nearly always the graphics card, not the CPU, if you have anything half-decent fitted. Personally I wouldn't expect to game on an i3.

Wow, do you realise that in that very last sentence you've only gone and contradicted yourself?

I have an i3 and an i7; both score the same in the Shadow of Mordor benchmark at 1440p. I said it before and I'll say it again: it all depends on the game, always has and always will.

Spending £100 on a CPU and £400 on a 1070 would be better than spending £400 on a CPU and £100 on a GPU as far as gaming goes, but feel free to prove me wrong.

An i3 is perfectly capable of gaming. Some games will be more demanding than others, and may require lowering or tweaking the video settings, but play they will.

It won't meet everybody's needs, but it's great value and a very capable CPU for the money, and it's wrong to dismiss it based on your needs. There are many games I could run and sit you down to play, and you would not be able to tell which was running an i3 or an i7.
 
This is still going on?!?!? lol

An i3 is NOT a capable gaming CPU, just like a Pentium or Celeron is not. AMD will tell you that, Intel will tell you that, Nvidia will tell you that, and any game developer will tell you that.

It's not hard, no matter if you want to bolt on a £600 GPU to prove otherwise.
 