Best connection - S/PDIF or 6x audio jacks?

Hi all,

I am considering buying a new motherboard, which will have integrated onboard audio as I'm sure every board does nowadays. I'm just wondering how best to connect it to the AV receiver I currently have (a Denon AVR-1306 with 5.1 speakers).

Would it be best to connect using the single digital S/PDIF output, or using the three analogue audio jacks (centre/subwoofer, front left/right and rear left/right)?


Which should give better quality sound? Are there any limitations with either when playing things like Blu-ray discs, or with different audio encoding methods?

thanks
 
I would say Digital if you can dude...

Make sure you get a motherboard which does 'DTS Connect'. This way you will not be limited to the awful converters on board the motherboard. If you want good converters, you would opt for a standalone soundcard for precisely this reason.

I use DTS Connect for movies/games and take a TOSLINK (S/PDIF) optical out from my soundcard into my Denon AVR-1911 home cinema receiver with a full Monitor Audio BX surround system. I find this to be the best solution on my system, as it gives me full 5.1 surround sound in movies AND games. For music listening I do not wish to employ DTS Connect, and therefore set the output of foobar2000 (my media player of choice) to S/PDIF Out in the preferences panel. This way the computer is not 'colouring' the sound in any way or degrading the signal path (I've heard that DTS Connect is a lossy format).

My understanding is that using a single S/PDIF cable will give you better sonic quality, as there is no conversion on the motherboard (computers only deal in 1s and 0s). If you were to use analogue, the signal would have to be converted from digital to analogue at the source (and the source audio will be digital, as we are presumably playing Blu-rays, playing games, or listening to MP3s/WAVs/FLACs).

Forgive me if any of this is wrong (I'm sure someone will be along to help us here!) - I am still learning all the home audio technical stuff (I work in pro sound, but 'home' audio is surprisingly different!).

Will be building custom speakers with a friend soon though, so I hope to pick his brains over all of this. Slowly slowly, as they say!
 
You really want to be using HDMI, if you are going to be using the PC as a Blu-ray player.

S/PDIF is a better choice than the onboard analogue outputs, as your Denon AVR will process the sound, and it will sound better than the onboard DACs.

You won't be able to take advantage of Dolby/DTS HD audio though, if you use SPDIF.
 
A ha!

Marsman to the rescue! I've been wondering for a while what I need to do to get DTS-HD going (although I have yet to buy a Blu-ray player/drive). Sorry to steer the thread a little off topic, but am I right to assume that, because S/PDIF only supports a maximum of 44.1kHz/16-bit (iirc), HDMI HAS to be employed for DTS-HD?

Do motherboards these days even come with HDMI outputs? Surely the OP will need a dedicated soundcard for this like the Auzen X-Fi HomeTheater HD? Or simply run an HDMI from his GFX card?

Speaking of which - does this mean that on my system my GTX 580 would be a better audio output than the Auzentech X-Fi Prelude I currently have? Would going through the graphics card still be able to handle DTS Connect?
 
HDMI is required for these so-called HD audio formats (Dolby TrueHD and DTS-HD Master Audio).

There isn't much point in having a motherboard with HDMI unless it is to be used as an HTPC. Adding a graphics card will disable the onboard graphics, and thus the HDMI.

Much better to use a graphics card with HDMI audio capabilities, whether it be in a more powerful system or an HTPC.

DTS Connect and Dolby Digital Live are only needed if S/PDIF is used. S/PDIF can only carry stereo PCM, so Dolby or DTS encoding is needed to compress the sound before multi-channel audio can be sent.

HDMI can carry multi-channel PCM, so no encoding is needed.
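To put some rough numbers on this - a quick back-of-the-envelope sketch (plain Python; the figures are nominal raw payload rates and ignore S/PDIF framing overhead):

```python
def pcm_bitrate(channels, sample_rate_hz, bit_depth):
    """Raw PCM payload rate in bits per second (ignores framing overhead)."""
    return channels * sample_rate_hz * bit_depth

stereo_pcm = pcm_bitrate(2, 48_000, 16)    # 1,536,000 bps - fits S/PDIF's stereo payload
surround_pcm = pcm_bitrate(6, 48_000, 16)  # 4,608,000 bps - three times the stereo rate

# Dolby Digital Live / DTS Connect squeeze 5.1 into a stream no bigger than
# stereo PCM (Dolby Digital tops out at 640,000 bps):
DD_LIVE_MAX_BPS = 640_000

print(stereo_pcm, surround_pcm, DD_LIVE_MAX_BPS)
```

So uncompressed 5.1 PCM simply has too many channels for S/PDIF's two-channel payload, which is exactly why the real-time encoders exist.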
 
So generally speaking HDMI is far superior as it can carry a completely uncompressed digital signal stream without the need for DTS Connect/DD Live etc?

Thanks for the info Marsman - interesting stuff!
 
marsman said:
You really want to be using HDMI, if you are going to be using the PC as a Blu-ray player.

SPDIF is a better choice than using the onboard audio, as your Denon AVR will process the sound, and will sound better than the onboard audio.

You won't be able to take advantage of Dolby/DTS HD audio though, if you use SPDIF.

He's going to need a new AV amp as well, as that Denon doesn't have HDMI sockets :p

Marsman to the rescue! I've been wondering for a while what I need to do to get DTS HD going (although I have yet to buy a Blu-Ray Player/Drive) - Sorry to steer thread a little off topic but am I right to assume that due to SPDIF only supporting a maximum of 44.1kHz/16-bit (iirc) then HDMI HAS to be employed for DTS HD?

No, S/PDIF supports up to 24-bit/192kHz stereo PCM, up to 96kHz/24-bit 5.1 surround, and 48kHz/24-bit Dolby Digital, the latter two being limitations of DTS and Dolby Digital respectively, rather than limitations of S/PDIF.

marsman said:
There isn't much point in having a motherboard with HDMI unless it is to be used as an HTPC. Adding a graphics card will disable the onboard graphics, and thus the HDMI.

That depends very much on the chipset. Intel onboard GPUs, I believe, do shut down. Nvidia's might; ATI's most certainly don't.
 
No, S/PDIF supports up to 24-bit/192kHz stereo PCM, up to 96kHz/24-bit 5.1 surround, and 48kHz/24-bit Dolby Digital, the latter two being limitations of DTS and Dolby Digital respectively, rather than limitations of S/PDIF.

Nice - cheers for clarifying James.

EDIT: Question: presumably the video games industry will move towards these higher-resolution standards such as Dolby TrueHD and DTS-HD in the near future? Or do they already, and I'm limited to DTS Connect on my system?

I don't work in games...lol!
 
Maybe. There have been a few games that use high-res lossless audio - Metal Gear Solid 4 on the PS3, for example - but it takes up monstrous amounts of room (gigabytes) and, tbh, it's a bit unnecessary. I mean, that game was 30GB, mostly because of the lossless audio and all of the recorded dialogue. I can't see games using Dolby TrueHD or DTS-HD MA either, because of the overhead required when encoding them. They are compressed formats, remember, so the game would first have to generate the audio, then compress it, then send it to your amp to be decompressed again. Since you need HDMI to support these formats anyway, why bother at all when you can just use LPCM over HDMI instead?

I think 24bit/48khz is more than enough bandwidth for any game audio, imo, and I can see that being standard for a long time yet.
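A rough sketch of why lossless multi-channel audio eats disc space (plain arithmetic on nominal uncompressed PCM rates; real lossless codecs shave some of this off):

```python
def pcm_gib_per_hour(channels, sample_rate_hz, bit_depth):
    """Storage for one hour of uncompressed PCM, in GiB."""
    bytes_per_second = channels * sample_rate_hz * (bit_depth // 8)
    return bytes_per_second * 3600 / 2**30

# 5.1 channels at 48 kHz / 24-bit: roughly 2.9 GiB for every hour of audio
movie_grade = pcm_gib_per_hour(6, 48_000, 24)
print(round(movie_grade, 2))
```

A few hours of dialogue and music at those rates gets into the tens of gigabytes very quickly, which fits the MGS4 anecdote above.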
 
James said:
Maybe. There have been a few games that use high-res lossless audio - Metal Gear Solid 4 on the PS3, for example - but it takes up monstrous amounts of room (gigabytes) and, tbh, it's a bit unnecessary. I mean, that game was 30GB, mostly because of the lossless audio and all of the recorded dialogue. I can't see games using Dolby TrueHD or DTS-HD MA either, because of the overhead required when encoding them. They are compressed formats, remember, so the game would first have to generate the audio, then compress it, then send it to your amp to be decompressed again. Since you need HDMI to support these formats anyway, why bother at all when you can just use LPCM over HDMI instead?

I think 24bit/48khz is more than enough bandwidth for any game audio, imo, and I can see that being standard for a long time yet.

Thanks for explaining all of this, James - I speculated that storage space might be an issue at the higher sample rates and bit depths (this is pretty much the only reason a lot of producers in the pro music sector are still in the ~44.1kHz/24-bit region). I myself use 96kHz/24-bit for productions I work on, but invariably we have to master down to Red Book CD standard at the end for release anyway, so...

As for games, however (I don't work in games, although I have a few friends who do), I was unsure of the reasoning, but like you say, the storage and memory bandwidth requirements multiply quickly with higher sample rates and bit depths.
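To put rough numbers on that scaling (a quick sketch; the rates are uncompressed stereo PCM, which is the worst case):

```python
def stereo_mb_per_min(sample_rate_hz, bit_depth):
    """Megabytes per minute of uncompressed stereo PCM."""
    return 2 * sample_rate_hz * (bit_depth // 8) * 60 / 1_000_000

cd = stereo_mb_per_min(44_100, 16)      # ~10.6 MB/min (Red Book CD)
hires = stereo_mb_per_min(96_000, 24)   # ~34.6 MB/min
ratio = hires / cd                      # roughly 3.3x for the jump to 96kHz/24-bit
print(round(cd, 1), round(hires, 1), round(ratio, 2))
```

The growth is multiplicative rather than exponential - each doubling of sample rate doubles the footprint, and 16-bit to 24-bit adds another 50% on top - but it still adds up fast across a whole project.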

Presumably, with the new Xbox in development, we will see an increasing gravitation towards these high fidelity standards in mainstream gaming.

James, you said these formats are compressed at the source and then decompressed by the amp at the end. Quick question - is this compression lossy, or, because we are dealing with digital data streams, is it an exact lossless copy, i.e. no degradation of the signal path?

Thanks again for the information - finally starting to fill some of these gaps in my knowledge!
 
SPDIF will give better quality sound unless your amp has a useless DAC. :)

I had also wondered about this - in the pro music world, where the conversion actually takes place is of paramount importance to us, i.e. is the conversion done on a tasty analogue tube mixer, or do we use ADAT (like a multi-track, high-fidelity version of S/PDIF using TOSLINK connectors) and perform the conversion at the audio interface stage?

This is what I had always assumed was happening (hence why I recommended digital to the OP), as speakers don't understand 1s and 0s!
 