Linking 2 PCs over 10G query

Yeah, I can't be bothered with the 2.5/5GbE stuff; might as well just go all-in on 10GbE.

Currently, other than my desktops, nothing else here needs that sort of throughput, and I've got a 24-port 1GbE switch which I bought many years ago and is still rock solid. So I'm hoping to get Cat6a run round the house and set up a rack, out of the way, then think about an 8-port 10GbE switch for future expansion. Hence I won't mind too much if it has a fan, but I don't really want it under my desk :)
 
Now that we're at gigabit FTTP, I DO think it's time the cheap home kit moved on a bit.

It doesn't have to be anything more than cheap RJ45 (no SFP/SFP+ excess needed) that'll do more throughput. The 2.5GbE trend seems a bit of a cop-out imo.
The problem is 10Gb over copper isn't even that widespread in the data centre, so there's been no trickle-down to home use. 10Gb SFP+ switches are dead cheap, as are the direct attach cables or optics and pre-made fibre leads.

2.5Gb is just a "let's do something faster over existing cabling", as most existing home and even data centre cabling won't do 10Gb.
 

Hmmm, fair. I guess my current role isn't QUITE so network-hardware linked, so not a connection I'd made (pun was accidental but I'll own it).

I was sort of thinking in the direction of what motherboards come equipped with, as that's kind of the end game from a "gamer" perspective. SFP modules still seem to be about £20+ each as best I'd seen, which sounds like a fair wedge for a motherboard port. I guess it's waiting on more copper 10GbE progress.

Good insight though; I hadn't quite considered that angle and, as said, I'm not quite a network hardware pro. But thinking on it, indeed, the kit I have seen is probably more in the "fibre-based patch lead" direction for how racks are cabled up.
 
Well, my cards have come, and for some reason the two PCs won't talk to each other. Or at least they did briefly and it worked perfectly, but then the connection disappeared when I restarted the PC and I don't seem to be able to get it back. There seems to be data being sent up and down from both PCs looking at the status.

I've set the IPs in the TCP/IPv4 settings as 10.0.0.1 and 10.0.0.2 with subnet masks of 255.255.255.0. Any thoughts?
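(For reference, a minimal PowerShell sketch of that static assignment; "Ethernet 2" is just a placeholder for whatever the 10G NIC is called, and it assumes no address has been set on it yet:)

# Run in an elevated PowerShell on the first PC (use 10.0.0.2 on the second)
# Check the real adapter name first with Get-NetAdapter
New-NetIPAddress -InterfaceAlias "Ethernet 2" -IPAddress 10.0.0.1 -PrefixLength 24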

Also, I've tried a different cable with no benefit. I've also connected the new NICs to the switch and they are working fine, talking to the internet etc.

EDIT:

I think I've worked out the problem: Windows seems to think it's a public network rather than private and has the connection labelled as "unidentified network"… I don't know how to change it?
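(For anyone finding this later: the profile can usually be flipped from an elevated PowerShell. A rough sketch, with the interface name as an assumption:)

# List connection profiles to find the "Unidentified network" entry and its alias
Get-NetConnectionProfile
# Force that connection onto the Private category ("Ethernet 2" is a placeholder)
Set-NetConnectionProfile -InterfaceAlias "Ethernet 2" -NetworkCategory Private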
 
I've set the IPs in the TCP/IPv4 settings as 10.0.0.1 and 10.0.0.2 with subnet masks of 255.255.255.0. Any thoughts?

Looks good. Now you need to set a route. You want something like ROUTE -p ADD 10.0.0.0 MASK 255.255.255.0 10.0.0.1 (or .2) METRIC <metric number> IF <interface number>, but you'll need someone more knowledgeable than me to give you chapter and verse.
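(A rough sketch of how that might look, assuming it's needed at all; the metric and interface number below are placeholders that need looking up first:)

# Find the 10G NIC's interface number (ifIndex)
Get-NetAdapter | Select-Object Name, ifIndex, LinkSpeed
# Persistent route for the 10.0.0.0/24 link (metric 5 and if 12 are example values)
route -p add 10.0.0.0 mask 255.255.255.0 10.0.0.1 metric 5 if 12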
 
TP-Link TL-SX105.

I use this and it works well. Using SMB multichannel with 2x 10Gb in both my machine and the NAS, this gives me a nice 20Gb link to my NAS, fairly cheaply. I did do this with the Zyxel, but I needed another 10G port so others could access my NAS at more than 1Gb too, so the 5-port was handy.
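(As an aside, a quick way to sanity-check that multichannel is actually in play, run on the client while a big copy is going; the cmdlet names are standard, the output obviously varies:)

# One row per TCP connection; with 2x 10Gb each end you'd hope to see both NICs listed
Get-SmbMultichannelConnection
# The NICs SMB considers usable, with link speed and RSS capability
Get-SmbClientNetworkInterface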

If a direct connection is problematic, then there are cheap desktop switches that'd help.

Other cheap fanless switches that'll do the job, which I have used, are from Zyxel:

XGS1010-12 (8x 1GbE, 2x 2.5GbE, 2x 10Gb SFP) £100, unmanaged
XGS1210-12 (8x 1GbE, 2x 2.5GbE, 2x 10Gb SFP) £120
XGS1250-12 (8x 1GbE, 3x multi-gig up to 10GbE, 1x 10Gb SFP) £160

 
You shouldn't need to set a route.

Have you pinged the IP address of the server from the PC?

You can try running a traceroute; it should go directly there.

The other thing that comes to mind is Windows firewall. You can try turning it off to see if it makes any difference.
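(A few one-liners for those checks, using the IPs from earlier in the thread; adjust as needed:)

# Basic reachability and the path taken
ping 10.0.0.2
tracert 10.0.0.2
# PowerShell equivalent that also reports which interface was used
Test-NetConnection 10.0.0.2 -TraceRoute
# Temporarily disable the firewall for the Private profile only, then re-enable it
Set-NetFirewallProfile -Profile Private -Enabled False
Set-NetFirewallProfile -Profile Private -Enabled True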
 
OK, so I think I have sorted the problem. I am using Win11 on both computers, FYI.

So I've had to add the other machine's IP address as both the gateway and the DNS server in the settings; this let me change the network to a private network. The first time, when it was working, the Windows tool came up asking me about network sharing and it worked, but it seems on restart it wouldn't keep the settings?!

I also had to manually adjust the network bindings so the 10G adapter has a lower (more preferred) value, or it wasn't being used to access the server. Not quite as straightforward as one might like...
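(For anyone else fighting the binding order, a rough sketch of checking and lowering the metric on the 10G adapter from an elevated PowerShell; the alias and value are assumptions:)

# Lower metric = preferred; see what Windows has auto-assigned
Get-NetIPInterface | Sort-Object InterfaceMetric | Select-Object InterfaceAlias, AddressFamily, InterfaceMetric
# Prefer the 10G link ("Ethernet 2" is a placeholder name)
Set-NetIPInterface -InterfaceAlias "Ethernet 2" -InterfaceMetric 5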
 
Faster isn't always better if you're running Plex or Jellyfin etc.

I'm only using 2.5Gb, so when I transfer a movie across it writes at 280MB/s. If somebody is watching something on Plex, this write speed to standard spinning HDDs will cause their stream to buffer. I either need to wait until nobody is watching anything, or slow the transfer down. Upgrading to NVMe is not an affordable option when you have 40TB of storage.

If you'll be using your server for something similar to this, 10Gb/s is overkill.
I can see why it would be fun to have, though.
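(If slowing the transfer down is the way to go, one option is robocopy's inter-packet gap; the paths here are made up and the /IPG value needs experimenting with:)

# /IPG:n pauses n milliseconds between copy blocks, capping the effective throughput
robocopy "D:\Media\NewFilm" "\\NAS\media\Movies\NewFilm" /E /IPG:10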
 
Seems you need to add a cache drive into the mix and use it for downloading, without writing straight to your array.
 
Surely you will run into problems either way? You saturate the network bandwidth at 1Gb, or the drives' read ability at 10Gb.
 