NVIDIA 4000 Series

It could be used for good or for bad; the system is very open to abuse without a stated policy.

Good, IMHO, would look like: GFE sees an Nvidia GPU and can see the user has run games regularly over the last 12 months, i.e. they are a PC gamer.
Bad would look like: GFE sees a 3090 Ti and they target those mugs first, or they don't take into account a user who doesn't play games, i.e. a professional who should be buying Quadros.

The terms of the deal say that in order to qualify you must be a gamer and a current owner of a 10-series or above card. The system checks via GFE before it will generate you a link.

My only issue with that is that a lot of people sell well and may well have sold their 1080 Ti/2080 Ti/3080/3090 a month before launch to get max price, leaving just a low-end temporary card in their system, so they won't be able to check out.

But the system can't be perfect. At least they have set it so only a current user of a 10-series and above card can buy a 4090 FE, and a pro wouldn't have GFE installed (in fact a lot of gamers won't touch GFE with a 10-foot barge pole!!!!)
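
Purely to make that rule concrete, here's a minimal sketch of the check as described above, with hypothetical names; it's not NVIDIA's actual GFE logic or telemetry.

[CODE]
# Minimal sketch of the eligibility rule described above, using hypothetical
# names -- this is NOT NVIDIA's actual GFE logic or telemetry schema.
MINIMUM_SERIES = 10  # "10 series or above"

def eligible_for_4090_fe_link(current_gpu_series: int, games_regularly: bool) -> bool:
    """Qualify only current owners of a 10-series-or-newer card who actually game."""
    return current_gpu_series >= MINIMUM_SERIES and games_regularly

# The edge case above: someone who sold their 3090 before launch and is running
# a cheap stopgap card fails the hardware check even though they are a gamer.
print(eligible_for_4090_fe_link(current_gpu_series=7, games_regularly=True))   # False
print(eligible_for_4090_fe_link(current_gpu_series=30, games_regularly=True))  # True
[/CODE]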
 
how the hell did you even get the chance to order one? I never even saw them come available on the day...

+bonus points for living near Norwich, pleased we got some in our region :)

The GFE scheme. They are sending out random invites, with a direct link valid for 48 hours to buy a 4090 FE, to people who have GFE installed and at least a 10-series Nvidia card or higher.
 
I had a couple of games on BF2042 last night but the game crashed twice after about 15 minutes of playing. Any ideas what it could be? I hardly play it, which is a shame as there is such a good game in there, but DICE really have kicked us PC users in the nuts, which is shameful considering we are the ones who made this franchise what it is. No joystick support is frankly an insult.

For joypad support, I'm sure Windows natively supports the Xbox pad. Personally I use a PS5 pad, and to get that working I use either Steam's controller support for games on Steam, or DS4Windows, which works in every single game I've tried :)
 
how the hell did you even get the chance to order one? I never even saw them come available on the day...

+bonus points for living near Norwich, pleased we got some in our region :)

On release day I just pressed F5 once at 14:00 and the Buy button was there; I had my card and address details already saved in my browser and was checked out in about 20 seconds.

A few of us in the Norwich area have been on OCUK for many years :)
Shout out to Battlenet above Iceland on St Stephens Street!
 
Have you checked the Windows event logs and looked at the warning descriptions with the red exclamation mark?
Thanks, just had a look and there are a lot of errors saying "The AORUS LCD Panel service terminated unexpectedly. It has done this X times..." - there's an entry for this every second, though I'm not sure it would be linked?
 
Just had a quick stint overclocking my MSI Gaming Trio; didn't get results as good as some of you on here (stable at +125 core, +950 memory). I did try upping the power and +250 core / +1200 memory and the PC rebooted :eek:
 

NVIDIA GeForce RTX 4090 PCI-Express Scaling

Conclusion

The GeForce RTX 4090 "Ada" is a monstrous graphics card that delivers massive generational performance uplifts. When we set out to do this feature article, our main curiosity was how the RTX 4090 performs with half the bandwidth of its native PCI-Express 4.0 x16 interface. Forcing the motherboard to limit the processor's PEG interface to Gen 3 (i.e. PCI-Express 3.0 x16) accomplishes this, and Gen 3 x16 provides the same bandwidth as Gen 4 x8. This PCIe mode will be most relevant to those planning to build 13th Gen Intel Core "Raptor Lake" + RTX 4090 machines who also plan to use next-gen PCIe Gen 5 NVMe SSDs. We spoke with various motherboard manufacturers, and they report that most of their premium Intel Z790 chipset products come with Gen 5 NVMe slots that subtract 8 PCIe Gen 5 lanes from the main PEG interface when the M.2 slot is in use. If you have no M.2 SSD installed in the Gen 5 slot, the graphics card runs with the full x16 lane configuration, but when you install an SSD, the GPU slot is limited to PCI-Express 4.0 x8 (with bandwidth identical to PCI-Express 3.0 x16).

Just to clarify, if you have an Intel motherboard with M.2 Gen 5 and M.2 Gen 4 and choose to install your SSD into the M.2 Gen 4 slot, you will not lose any PCIe bandwidth for the graphics card.
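
As a back-of-the-envelope sketch (illustrative arithmetic only, not benchmark data), the lane maths behind these equivalences can be worked out from the standard published per-lane PCIe rates:

[CODE]
# Approximate one-directional PCIe throughput per lane, in GB/s, after
# 8b/10b (Gen 1/2) or 128b/130b (Gen 3+) encoding overhead.
LANE_GBPS = {
    "1.1": 0.25,   # 2.5 GT/s
    "2.0": 0.50,   # 5.0 GT/s
    "3.0": 0.985,  # 8.0 GT/s
    "4.0": 1.969,  # 16 GT/s
    "5.0": 3.938,  # 32 GT/s
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Rough link bandwidth in GB/s for a given PCIe generation and lane count."""
    return LANE_GBPS[gen] * lanes

for gen, lanes in [("4.0", 16), ("4.0", 8), ("3.0", 16), ("2.0", 16), ("1.1", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# Prints the pairings used throughout this article:
#   PCIe 4.0 x16 ~31.5 GB/s  (the card's native link)
#   PCIe 4.0 x8  ~15.8 GB/s  == PCIe 3.0 x16 (the Gen 5 M.2 scenario)
#   PCIe 2.0 x16 ~8.0 GB/s   ~= PCIe 4.0 x4 (80 Gbps eGPU class)
#   PCIe 1.1 x16 ~4.0 GB/s   ~= PCIe 3.0 x4 (40 Gbps Thunderbolt 3 eGPU class)
[/CODE]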

"Raptor Lake" PC builders can breathe a huge sigh of relief—we're happy to report that the GeForce RTX 4090 loses a negligible, inconsequential amount of performance in PCI-Express 3.0 x16 (Gen 4 x8-comparable) mode. Averaged across all tests, at the 4K Ultra HD resolution, the RTX 4090 loses 2% performance with Gen 3 x16. Even in lower, CPU-limited resolutions, the performance loss is barely 2-3 percent. When looking at individual game tests, there is only one test that we can put our finger on, where the performance loss is significant, and that's "Metro: Exodus," which sees its framerate drop by a significant 15% at 4K UHD, with similar performance losses seen at lower resolutions.

While in the past the rule has always been "lower resolution = higher FPS = higher performance loss from lower PCIe bandwidth", in today's dataset we find a few games that behave differently. For example, Elden Ring, Far Cry 6 and Guardians of the Galaxy clearly show a bigger loss in performance at higher resolution. It seems that these games, which are all fairly new and use modern DX12 engines, do something differently. I suspect that while the other games transfer a relatively constant amount of data for each frame, and thus get more limited at higher FPS, these titles transfer data that scales with the native render resolution. So if you go from 1080p (8 MB per frame) to 1440p (14 MB per frame) and 4K (32 MB per frame), the increase in per-frame traffic outgrows the traffic reduction due to the lower FPS.
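
As a rough illustration of that hypothesis (the frame rates below are made up, not measured), this is what resolution-sized per-frame transfers do to total bus traffic:

[CODE]
# Illustration only: per-frame size of a 32-bit framebuffer at each resolution,
# multiplied by a MADE-UP frame rate, to show how total traffic can keep rising
# at 4K even though the FPS falls.
RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
EXAMPLE_FPS = {"1080p": 240, "1440p": 190, "4K": 120}  # hypothetical, not measured

for name, (w, h) in RESOLUTIONS.items():
    frame_mb = w * h * 4 / 1024**2            # 4 bytes per pixel
    traffic_gbs = frame_mb * EXAMPLE_FPS[name] / 1024
    print(f"{name}: ~{frame_mb:.0f} MB/frame x {EXAMPLE_FPS[name]} fps = ~{traffic_gbs:.1f} GB/s")

# ~8 MB/frame at 1080p, ~14 MB at 1440p, ~32 MB at 4K: the 4x growth in per-frame
# data outpaces the 2x drop in frame rate, so the bus sees MORE traffic at 4K.
[/CODE]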

We also tested the RTX 4090 in PCI-Express 2.0 x16 mode. This is the kind of bandwidth you get if you try to use the card on older machines with Gen 3 x8 bandwidth for whatever reason (think AMD Ryzen 3000G APUs), or if you accidentally plug the card into one of the electrical Gen 4 x4 PCIe slots of your motherboard. Doing so won't really saturate your chipset bus, as Intel has increased that to DMI 4.0 x8 with the Z690 and Z790 chipsets. This is also comparable to the bandwidth you get from eGPU boxes that convert an 80 Gbps Thunderbolt or USB4 interface to a PCI-Express 4.0 x4 link. Here, the performance loss is a little more pronounced, averaging 8% at the 4K UHD resolution and going as high as 18%, seen with "Metro: Exodus" at 4K.

We also tested the academically-relevant PCI-Express 1.1 x16 bus, or bandwidth that was available to graphics cards some 16 years ago. This is comparable bandwidth to using some of the older eGPU boxes that wire out a 40 Gbps Thunderbolt 3 to a PCI-Express 3.0 x4 slot. Though surprisingly well-contained in some games, the overall performance loss is pronounced, averaging 19% across all tests at the 4K UHD resolution. This can get as bad as 30%—nearly a third of your performance lost, or performance levels comparable to an RTX 3090 Ti @ Gen 4 x16.

If you're happy with what Ryzen 7000 "Zen 4" offers you, go ahead and build a machine. Regardless of the motherboard and chipset, you'll get a Gen 4 x16 slot for your graphics card and a Gen 5 x4 NVMe slot for your future SSD, which won't eat into the x16 slot's bandwidth. If, however, the 13th Gen Core "Raptor Lake" impresses you and you only have a Gen 4 NVMe SSD, use an M.2 slot that's wired to the chipset or the M.2 slot wired to the processor's Gen 4 x4 NVMe interface, and avoid the slot that's capable of "Gen 5" bandwidth, as that will cut into the x16 PEG slot's bandwidth; although inconsequential, you still lose 0-2 percent in frame rates (why lose even that much?). If you get your hands on a Gen 5 NVMe SSD a few months from now and want to use it with a 13th Gen Core platform, go ahead and use that Gen 5-capable slot; the ~2% graphics performance loss is an acceptable tradeoff for storage performance in excess of 10 GB/s, which could come in handy for those in media production, etc.

If you're into external GPU enclosures, pay attention to the kind of bandwidth those boxes offer. The ones with 40 Gbps upstream bandwidth (Thunderbolt 3) offer downstream bandwidth to the GPU comparable to PCI-Express 1.1 x16, which means you stand to lose a third of the performance. Some of the newer eGPU enclosures in development are being designed for 80 Gbps interfaces such as USB4, with bandwidth comparable to PCI-Express 2.0 x16 or 3.0 x8. Here, your performance loss drops to within 15-20%, which is still somewhat acceptable.
 
Just had a quick stint overclocking my MSI Gaming Trio; didn't get results as good as some of you on here (stable at +125 core, +950 memory). I did try upping the power and +250 core / +1200 memory and the PC rebooted :eek:
Go up in small jumps (+10/20 at a time on core and around +50 on memory) and only tweak one at a time, otherwise you won't know whether it was the memory or the core that caused the reboot :)
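
If it helps, here's a throwaway sketch of that "one variable at a time" ladder, using the offsets from the quoted post as the start/end points; apply each step with Afterburner or whatever tool you normally use and run your stability test at each one.

[CODE]
# Throwaway sketch of a one-variable-at-a-time OC test plan (offsets in MHz).
def step_up(start, target, step):
    """Yield offsets from start+step up to target, clamped at target."""
    value = start
    while value < target:
        value = min(value + step, target)
        yield value

stable_core, stable_mem = 125, 950   # known-stable offsets from the quoted post

# Step the core first, with memory held at its stable value...
for core in step_up(stable_core, 250, 20):
    print(f"test: core +{core}, memory +{stable_mem}")

# ...then step the memory, with the core back at its stable value.
for mem in step_up(stable_mem, 1200, 50):
    print(f"test: core +{stable_core}, memory +{mem}")
[/CODE]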
 
For joypad support, I'm sure Windows natively supports the Xbox pad. Personally I use a PS5 pad, and to get that working I use either Steam's controller support for games on Steam, or DS4Windows, which works in every single game I've tried :)

Yeah but I want to use my joystick. I've been flying in BF since BF1942 and Desert Combat and it is just far more immersive for me. I will never understand why they dropped that support :mad:
 
I want MORE :D
Waiting to be able to flash a new bios.
Is that what she said :eek:;)

But yeah, don't we all want higher core clocks and a touch more power. A waterblock with some liquid metal should certainly help it out; if the clocks don't go any higher, at least they should remain stable and not drop off due to temp :D
 