
GTX 970 running @ x8 (1.1) not x16 (2.0)

You could also use the Snipping Tool to capture selected parts of the screen.
Start > All Programs > Windows Accessories > Snipping Tool
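If you'd rather script the capture instead, here's a minimal sketch using Python's Pillow library (my assumption, not something mentioned in the thread; ImageGrab only works on Windows and macOS):

```python
# A minimal sketch (assumes Pillow is installed: pip install Pillow).
from PIL import ImageGrab

# bbox = (left, top, right, bottom) in screen pixels; adjust to the region you want.
region = ImageGrab.grab(bbox=(0, 0, 800, 600))
region.save("gpuz_snip.png")
```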

Snip - still running at x8...

[screenshot: IeU2qzS.jpg]
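(For anyone who wants to check the link without GPU-Z, a rough sketch using nvidia-smi's query flags, assuming the NVIDIA driver is installed and nvidia-smi is on the PATH:)

```python
# Sketch: read the current vs maximum PCIe link via nvidia-smi (ships with the NVIDIA driver).
import subprocess

fields = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max"
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=" + fields, "--format=csv,noheader"],
    text=True,
)
gen_cur, gen_max, width_cur, width_max = [v.strip() for v in out.split(",")]
print("Link: gen %s of %s, width x%s of x%s" % (gen_cur, gen_max, width_cur, width_max))
# Note: like GPU-Z, the card may idle at gen 1.1 and only ramp up under load.
```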
 
I'm gonna go into Safe Mode, uninstall all the drivers, reinstall the latest, and see if that works.

Then re-seat the card (although I'm pretty sure my last card also only ran at x8, so it might be something wrong with the slot?).

THEN I'm gonna do a BIOS update, if I can figure out how.
 
Hey all. Thought I'd come back here and share my success. I went into the BIOS and disabled something called "USB 3.0 Turbo".

I had heard other people talking of "trying" to disable it on ASRock boards and wondered if it could be using some of the PCI Express bandwidth in some way. What a horrible feature to have enabled on Auto!!!

Losing 1-10% of gaming performance in exchange for slightly quicker USB 3.0 speeds that I never use anyway? WOW. Cheers, Gigabyte.

FPS has gone up slightly, it seems, which may indicate that a GTX 970 will be bottlenecked by a PCIe 2.0 x8 link... Or it could be the placebo effect... Still, mystery solved!!! :)
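For a rough sense of the bandwidth gap involved, here are back-of-the-envelope numbers for the link configurations mentioned in this thread (theoretical per-direction figures only, ignoring protocol overhead):

```python
# Back-of-the-envelope PCIe bandwidth per direction, ignoring protocol overhead.
# PCIe 1.x is roughly 250 MB/s per lane, PCIe 2.0 roughly 500 MB/s per lane.
MB_PER_LANE = {"1.1": 250, "2.0": 500}

def bandwidth_gb_s(gen, lanes):
    """Approximate one-direction bandwidth in GB/s for a generation and lane count."""
    return MB_PER_LANE[gen] * lanes / 1000.0

for gen, lanes in [("1.1", 8), ("2.0", 8), ("2.0", 16)]:
    print("PCIe %s x%d: ~%.1f GB/s" % (gen, lanes, bandwidth_gb_s(gen, lanes)))
# -> roughly 2.0, 4.0 and 8.0 GB/s respectively
```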

Thanks guys

***EDIT*** Have just gone back to an area in Tomb Raider which I know was jumping like hell from 40-60 FPS (very juddery), and the same area now runs at a CONSTANT 60 FPS... No placebo effect, a huge increase. Anyone who says there is no difference between x8 and x16 on today's cards needs to run this test.
It's the area just after your team mate (forget his name) climbs up the ladder in Shantytown. After that battle, the area Lara is in is very glitchy when looking back across the gap and up to the bloom-effect sky.
 
https://tpucdn.com/reviews/AMD/R9_Fury_X_PCI-Express_Scaling/images/perfrel_2560_1440.png


By the way, don't use Furmark any more (and if you're so concerned with performance, overclock that 2500K; 4 GHz shouldn't be a problem).
 
Well done OP.


Are you wondering why the graphs only show a 2% difference between 2.0 x8 and 2.0 x16, but the OP found Tomb Raider throttled by 0-20 FPS? I am too. I'm sure you CBA, slim01, but if you found the time to record two play-throughs of that area I'd be very interested to watch them.

And my bad with Furmark, just the first thing that came to mind.
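If captures ever do happen, numbers might settle it as well as video would. A minimal sketch of comparing two frametime logs, purely illustrative (the file names and the one-value-per-line format are assumptions on my part; convert whatever your logging tool outputs into that shape first):

```python
# Sketch: compare two frametime logs, one frametime in milliseconds per line.
def load_frametimes(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def summarize(times_ms):
    fps = sorted(1000.0 / t for t in times_ms)            # instantaneous FPS per frame
    avg_fps = len(times_ms) * 1000.0 / sum(times_ms)      # average FPS over the run
    low_1pct = fps[max(0, int(len(fps) * 0.01) - 1)]      # rough "1% low" figure
    return avg_fps, low_1pct

for label, path in [("x8 run", "run_x8.txt"), ("x16 run", "run_x16.txt")]:
    avg_fps, low_1pct = summarize(load_frametimes(path))
    print("%s: avg %.1f FPS, 1%% low %.1f FPS" % (label, avg_fps, low_1pct))
```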
 
In that GPU-Z window, hit the question mark next to the PCIe speed and start the render test. It should raise the link to the maximum speed it can do under load.

Also, if you can't get x16 from your motherboard, you could have the same problem I had with my 7970. One pin on the card was broken, so it couldn't make contact with the PCIe slot.

Edit: oh, the 2500K only has 16 PCIe lanes, so it could be that when you're populating a x1 PCIe slot with a sound card, you only have 15 lanes left, so x8 is your maximum?
 

I'd love to make a vid but yeah, CBA to download and figure out how to use all the software, editing, posting etc!
Graphs and benchmarks are great, but there's nothing like real-world tests... I may overclock the i5 again and see if there are any noticeable differences, despite graphs and the like showing a GTX 970 won't be bottlenecked by a stock i5, only gaining 1-3% in some games.
Windows Task Manager, however, never really shows the CPU at 100% on all cores, so I guess it's very unlikely it'll make a difference at all.
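One caveat: the overall Task Manager figure can hide a single maxed-out core. A rough sketch using Python's psutil (an extra install, just for illustration) to watch per-core load while the game is running:

```python
# Sketch: watch per-core CPU load for ~30 seconds while the game is running.
# Assumes psutil is installed: pip install psutil
import psutil

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)   # blocks for 1 second
    print(" ".join("%5.1f" % p for p in per_core), "| hottest core: %.1f%%" % max(per_core))
```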

I still find it amazing how and why Gigabyte would default to USB 3.0 Turbo mode and sacrifice GPU bandwidth on a board that is clearly designed to house full-size graphics cards... Surely they must know? :confused:
There were no warning messages or anything, so I am sure there are thousands of others out there with the same setting.
 