Digital Transport Quality (and why it *may* matter).
Apr 21, 2023 at 9:17 PM Post #121 of 135
No, no, nothing to do with Shannon. Shannon was wrong, Stereophile, Rob Watts and some guys from the Cables subforum proved he was wrong, and the USB protocol is wrong, and in fact all digital protocols! 👍

G
Yep, links about data transmission being either serial or parallel are all wrong: it's Shannon!! Current TCP/IP or USB protocols? It was all determined in Shannon's "Mathematical Theory of Communication"! Current computer bus architectures? No need to worry about any of their formats: Shannon! And my favourite: a video stream on a 4K player actually starts to buffer in 720p :smile::smile:
 
May 3, 2023 at 3:48 AM Post #122 of 135
Hi everybody, I'm new here and this is my first post (sorry, it will be a long one). I just finished reading this thread and what a trip it was. So much so that I decided to create an account (finally). Coming from an IT background, I tend to agree with @gregorio and @Davesrose (and others) that digital transport is perfect. Being neither an expert nor knowledgeable enough to either debunk or support all these claims, I turned to ChatGPT for help. It was a hard battle (since the poor thing learns from Google, and we all know what nonsense you can find there), but after at least half an hour I managed to come to the conclusion that as long as you use properly shielded, good-quality cables you're fine, i.e. the kind used in construction. In other words, no need to sell a kidney for wiring :))

But the reason I decided to create an account is this topic: do digital transfers matter? I'm afraid I'm opening Pandora's box again or playing devil's advocate, but... ChatGPT is absolutely certain that interference and noise do affect a digital transfer. I'm hoping someone here can prove it wrong :D. So, this is what GPT says:

On noise affecting digital signals in general, and how it will affect the analog result

Digital signals are more immune to noise interference than analog signals, but they are not completely immune. Noise can still impact digital signals, especially when it is introduced into the signal path before the signal is converted back to an analog audio signal.

The quality of the power supply and the components used in the digital signal transmission can affect the susceptibility of the signal to noise interference. For example, if the power supply used by the HiFi streamer is noisy, this can impact the quality of the digital signal and introduce noise. Similarly, poor quality components or cables used in the signal transmission path can also introduce noise and affect the quality of the digital signal.

When noise interferes with a digital signal, it typically doesn't alter the sound in the same way that it would for an analog signal. Instead, noise can cause errors in the digital signal that can result in jitter or loss of signal. Jitter refers to variations in the timing of the digital signal, which can cause distortion in the resulting analog audio signal. When a digital signal is transmitted from a HiFi streamer to an external DAC, any jitter in the signal can cause the DAC to have difficulty reconstructing the original analog waveform, leading to audible distortion.

So while noise interference in a digital signal doesn't necessarily alter the sound in the same way that it would for an analog signal, it can still have a negative impact on the quality of the resulting analog audio signal by introducing jitter or causing loss of signal. When a digital signal is distorted due to jitter, the resulting analog audio signal can exhibit a number of different artifacts that can be audible to the human ear. Some common examples of jitter-induced distortion include:

  1. Timing errors: Jitter can cause the timing of the digital signal to be slightly off, which can result in the analog audio signal sounding "off" or "disjointed." This can be particularly noticeable in music with complex rhythms or fast transients.
  2. Frequency modulation: Jitter can also cause the frequency of the digital signal to fluctuate slightly, which can result in audible pitch changes in the resulting analog audio signal. This can make the music sound out of tune or create a sense of pitch instability.
  3. Amplitude modulation: Jitter can also cause fluctuations in the amplitude of the digital signal, which can result in audible changes in volume in the resulting analog audio signal. This can make the music sound "punchy" or "muddy," depending on the type of modulation.
Overall, jitter-induced distortion can make the music sound less clear and less enjoyable to listen to. While some degree of jitter is inevitable in any digital signal transmission, high-quality HiFi streamers and external DACs are designed to minimize jitter as much as possible to ensure that the resulting analog audio signal is as faithful to the original recording as possible.
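
(Not part of the ChatGPT answer, just me trying to picture what a "timing error" does numerically: a toy Python simulation that samples a 10 kHz sine with and without random clock jitter and reports the size of the resulting error. The jitter values are only examples.)

```python
# Toy simulation: how big is the error when each sample of a sine wave is
# taken slightly early or late? Error is reported in dB relative to full scale.
import math, random

def jitter_error_db(f0_hz, fs_hz, jitter_rms_s, n=100_000, seed=1):
    """RMS difference between ideally timed and jittered samples of a sine."""
    rng = random.Random(seed)
    err_sq = 0.0
    for i in range(n):
        t = i / fs_hz
        ideal = math.sin(2 * math.pi * f0_hz * t)
        jittered = math.sin(2 * math.pi * f0_hz * (t + rng.gauss(0.0, jitter_rms_s)))
        err_sq += (jittered - ideal) ** 2
    return 20 * math.log10(math.sqrt(err_sq / n))

for jitter in (140e-12, 10e-9, 200e-9):  # 140 ps, 10 ns, 200 ns RMS
    print(f"{jitter * 1e9:8.3f} ns jitter -> error ~ {jitter_error_db(10_000, 44_100, jitter):6.1f} dBFS")
```

(With picosecond-level jitter the error comes out far below -100 dBFS; it only climbs toward levels anyone debates once the jitter is up in the hundreds of nanoseconds.)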

Why does noise happen in a digital signal?

Firstly, noise can be introduced into the cable due to electromagnetic interference (EMI) or radio frequency interference (RFI). This can cause voltage spikes or other irregularities that can distort the digital signal. Even small amounts of noise can result in bit errors or other signal degradation, which can be audible in the resulting analog audio signal.

Similarly, jitter can be caused by fluctuations in the clock signal that is used to synchronize the transmission of the digital signal. Jitter can be caused by many factors, including power supply noise, EMI/RFI, and interference from other digital devices in the same system. Jitter can cause timing errors and other distortions in the digital signal, which can result in distortion in the resulting analog audio signal.

It's worth noting that digital signals are not completely immune to noise and interference. While it's true that digital signals are made up of 1s and 0s, these signals still rely on precise timing and voltage levels to accurately represent the original analog waveform. Any interference that affects these timing or voltage levels can result in audible distortion in the resulting analog audio signal.


How does noise affect optical cables?

Optical cables are indeed immune to electromagnetic interference (EMI) and radio frequency interference (RFI), which are common sources of noise that can affect copper-based digital cables. This is because optical cables use light to transmit the digital signal, rather than electricity. As a result, they are not affected by the same types of interference that can affect copper-based cables.

However, optical cables can still be affected by other types of noise, such as environmental noise (e.g. vibrations or physical disturbance) or poor-quality components. Additionally, while optical cables are less prone to jitter than copper-based cables, they can still be affected by timing errors or other forms of digital signal distortion.

In general, optical cables are a good choice for minimizing noise and interference in a digital audio system, particularly over longer cable runs. However, it's still important to use high-quality optical cables and components to ensure that the digital signal is transmitted with minimal distortion. Additionally, other factors such as the quality of the external DAC and the clock circuitry in the HiFi streamer can also affect the overall sound quality.


Can a USB connection help?

A USB connection can help to address some of the clock/timing problems that can affect digital audio signals, particularly if the USB connection is asynchronous.

In asynchronous USB mode, the USB device (such as a HiFi streamer) controls the timing of the data transfer, rather than relying on the timing of the host device (such as a computer). This can help to reduce the amount of jitter and timing errors in the digital signal, resulting in a cleaner and more accurate audio signal.

However, it's worth noting that USB cables and ports can still be affected by noise and interference, particularly if low-quality components are used. Additionally, the quality of the clock circuitry in the HiFi streamer and external DAC can also affect the overall sound quality, regardless of the type of digital connection used.
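
(Again not ChatGPT's words, and not real USB code, just a toy model of what "asynchronous" means there: the DAC side consumes samples on its own clock and its feedback only steers how much the host sends next, so arrival-time jitter moves the buffer level around but never the output timing.)

```python
# Toy model of asynchronous USB audio pacing. Numbers are made up for illustration.
import random

NOMINAL = 48        # samples per 1 ms frame at 48 kHz
TARGET = 480        # desired buffer occupancy, in samples
buffer_level = TARGET
requested = NOMINAL

for frame in range(1, 2001):
    # Host delivers what the DAC asked for, give or take a little "jitter".
    delivered = requested + random.choice((-2, -1, 0, 0, 0, 1, 2))
    buffer_level += delivered
    # The DAC's own clock removes exactly 48 samples every millisecond, always.
    buffer_level -= NOMINAL
    # Feedback: nudge the next request so the buffer drifts back toward target.
    requested = NOMINAL + (1 if buffer_level < TARGET else -1 if buffer_level > TARGET else 0)
    if frame % 500 == 0:
        print(f"frame {frame}: buffer = {buffer_level} samples (target {TARGET})")
```

(The point of the toy model: the output side ticks along at exactly 48 samples per millisecond regardless of when the frames arrive; timing variation on the host/cable side only wiggles the buffer level.)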

Can you show proof of measurements that support your claim?

Here are some specific measurements and results related to the effects of noise and jitter on digital audio signals:

  • Jitter tolerance test: This test involves injecting jitter into a digital audio signal and measuring the ability of the receiving device to recover the original clock signal. A study conducted by the Audio Engineering Society (AES) in 2003 found that even relatively small amounts of jitter can have a noticeable impact on the sound quality of digital audio signals. The study found that jitter levels as low as 10 nanoseconds can result in audible changes to the sound quality of the signal.
  • Eye pattern test: This test involves measuring the shape and clarity of the signal waveform over time. A study conducted by the European Broadcasting Union (EBU) found that timing errors in the digital audio signal can result in waveform distortion, which can lead to audible clicks, pops, or other artifacts in the sound. The study found that waveform distortion was most pronounced when the timing errors were in the range of a few microseconds or less.
  • Signal-to-noise ratio (SNR): This measurement quantifies the level of noise in the signal relative to the level of the desired audio signal. Higher SNR values indicate a cleaner, more accurate signal with less noise. A study conducted by the AES found that increasing the SNR of a digital audio signal can result in improved sound quality and reduced listener fatigue.
  • Total harmonic distortion (THD): This measurement quantifies the amount of harmonic distortion present in the audio signal. Higher THD values indicate greater levels of distortion and less accurate reproduction of the original audio signal. A study conducted by the AES found that reducing THD in the digital audio signal can result in improved sound quality and a more natural, lifelike sound.
  • Intermodulation distortion (IMD): This measurement quantifies the amount of distortion created when two or more signals are mixed together in a nonlinear system. Higher IMD values indicate greater levels of distortion and less accurate reproduction of the original audio signal. A study conducted by the AES found that reducing IMD in the digital audio signal can result in improved sound quality and reduced listener fatigue.
Overall, these measurements and studies provide evidence to support the claim that noise and jitter can have a significant impact on the sound quality of digital audio signals, and that minimizing these effects can lead to improved sound quality in high-end HiFi systems.


I wrote this post so I can learn. Honestly, I can sense the marketing-level language from ChatGPT, and I'm sceptical about it. But I kept asking it in circles, from different angles and with reverse logic, and it still draws the same conclusions.

Would anyone care to comment on this?

EDIT: structure
 
May 3, 2023 at 3:57 AM Post #123 of 135
Would anyone care to comment on this?
i also asked ChatGPT a few audio questions out of curiosity, and overall it seems to just gather information from other sources and combine it, advising at the end that it might or might not matter most of the time. basically it leaves you as uncertain as reading any other source, because of how controversial some of this stuff is

tho i kinda like the fact it combines information quite well
 
May 3, 2023 at 4:03 AM Post #124 of 135
i also asked ChatGPT a few audio questions out of curiosity, and overall it seems to just gather information from other sources and combine it, advising at the end that it might or might not matter most of the time. basically it leaves you as uncertain as reading any other source, because of how controversial some of this stuff is

tho i kinda like the fact it combines information quite well

It's a very nice tool - useful, but to be taken with a grain of salt. Fortunately, there are ways to train it from specific data instead of letting it learn for itself.
 
May 3, 2023 at 4:53 AM Post #125 of 135
My experience with GPT is that when it offered a login with a Microsoft account, I used my super modern (still working, even though it's now a spam box) Hotmail address, and it refused it. I felt personally insulted, so now I hate all AIs.
 
May 3, 2023 at 5:20 AM Post #126 of 135
User:
Does the quality of the digital transport affect sound quality as long as the data is received without errors (the data is bit perfect)?

ChatGPT:
As long as the digital data is received without errors, the quality of the digital transport itself should not have a noticeable effect on the sound quality. This is because digital data is transmitted in a binary format, and as long as the data is received correctly, the resulting sound should be identical regardless of the quality of the digital transport.

However, it's worth noting that while the digital transport itself may not affect the sound quality, other factors such as the quality of the digital-to-analog converter (DAC) and the analog circuitry in the playback device can have a significant impact on the overall sound quality. So while the digital transport may not be the most important factor in determining sound quality, it's still important to consider the entire audio playback chain when aiming for the best possible sound quality.
I got this from OpenAI's free ChatGPT on my first prompt. I tried it again a couple of times, since I know ChatGPT can generate different responses to the same prompt, and I got responses similar to @alexj87 's as well. I suspect ChatGPT was trained on headfi's cable and tweaks subforum (or the equivalents) as well.

User:
Does the quality of the digital transport affect sound quality as long as the data is received without errors (the data is bit perfect)?

Bing chatbot:
The quality of the digital transport can affect sound quality. A poor transport could easily be making up large parts of the data if its tracking systems are not working well, forcing the error-correction system to give up and resort to concealing errors instead. Is there anything else you would like to know?

User:
What if the data is received without errors, as in the data is bit perfect? Note that that was already included in my original question.

Bing chatbot:
I apologize for misunderstanding your question. If the data is received without errors and is bit-perfect, then the quality of the digital transport should not affect the sound quality. Is there anything else you would like to know?
I got that from the Bing chatbot, which also provides sources for its claims - a good thing. Bing consistently ignored the second part of my question and always gave the same answers for me.
 
May 3, 2023 at 5:31 AM Post #127 of 135
I got this from OpenAI's free ChatGPT on my first prompt. I tried it again a couple of times, since I know ChatGPT can generate different responses to the same prompt, and I got responses similar to @alexj87 's as well. I suspect ChatGPT was trained on headfi's cable and tweaks subforum (or the equivalents) as well.


I got that from the Bing chatbot, which also provides sources for its claims - a good thing. Bing consistently ignored the second part of my question and always gave the same answers for me.

Thank you! I think you found the missing question :)) I get the same response - if the transfer is bit-perfect then the sound quality is not affected.
Furthermore:

Will regular cables allow bit-perfect transmission?

Yes, regular cables, such as standard HDMI, coaxial, or optical cables, can transmit digital signals bit-perfectly as long as they meet the necessary specifications for the signal being transmitted. Bit-perfect transmission means that the digital data sent from the source device to the receiving device arrives exactly as it was sent, without any errors or alterations.

In general, the quality of the cable materials and construction can impact factors such as signal strength, attenuation, and resistance, but as long as the cable meets the necessary specifications for the signal being transmitted, it should be able to transmit the signal bit-perfectly.


Examples of where you would need higher-quality cables: Higher-quality cables may be needed in situations where the cable is subjected to high levels of electrical interference or mechanical stress, or when longer cable runs are required. For example, in professional audio settings where long cable runs are common, high-quality cables with robust shielding may be necessary to prevent signal degradation over long distances. In home audio settings, higher-quality cables may be beneficial for longer cable runs or in situations where the equipment is located near sources of electrical interference, such as power lines or radio transmitters.

As you said, it always circles back to: "some high-end audio equipment may be more sensitive to the quality of the cables used, and may benefit from the use of higher-quality cables..." haha. Scary!
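
Incidentally, if anyone wants to check "bit perfect" for themselves rather than take the bot's word for it, a checksum comparison is enough, assuming you can capture the raw data at the receiving end (a digital loop-back recording, for instance). The file names below are just placeholders:

```python
# Hash the source data and the captured data; identical hashes mean the
# transport delivered every bit unchanged, whatever the cable cost.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

src = sha256_of("original_track.wav")       # placeholder: what the player sent
cap = sha256_of("captured_over_cable.wav")  # placeholder: what arrived at the other end
print("bit perfect" if src == cap else "bits were altered in transit")
```

(In practice you'd have to compare the raw PCM rather than whole files with different headers, but the principle is the same.)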
 
May 3, 2023 at 8:52 AM Post #128 of 135
ChatGPT is absolutely certain that interference and noise do affect a digital transfer.
It seems to have trawled audiophile sites for its information. This is hardly surprising, as audiophile sites represent by far the largest quantity of information on this and related subjects, because to engineers and scientists it's largely an irrelevant question. So the answers it has provided are based on audiophile marketing and all the audiophiles suckered by it. The answers are therefore either just false, or effectively wrong due to a very old audiophile marketing technique: the “lie of omission”.

Here are some examples:
Digital signals are more immune to noise interference than analog signals, but they are not completely immune. Noise can still impact digital signals, especially when it is introduced into the signal path before the signal is converted back to an analog audio signal.
The first statement is true but is again a “lie of omission”. Digital transfer is completely immune to low/typical consumer levels of noise/interference but is not immune to particularly high levels, which can cause bit errors. However, such bit errors are detected and, depending on the transfer protocol, the affected data is usually either resent, error-corrected or interpolated. So noise/interference can impact the digital signal but does not affect the digital transfer. If it did, the internet would never work!
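
If it helps to see the principle, here's a toy sketch of the “detect and resend” idea in Python. It isn't any real protocol (real ones use proper framing, sequence numbers, etc.), just the mechanism:

```python
# Toy sketch of "detect and resend": each packet is checked against a
# checksum at the receiving end; anything that fails the check is thrown
# away and sent again. What finally gets delivered is therefore identical
# to what was sent, even though the wire occasionally flips a bit.
import random, zlib

def noisy_wire(payload: bytes, error_rate: float = 0.05) -> bytes:
    """Occasionally flip one bit, standing in for interference."""
    data = bytearray(payload)
    if random.random() < error_rate:
        data[random.randrange(len(data))] ^= 0x01
    return bytes(data)

def transfer(packet: bytes) -> bytes:
    crc = zlib.crc32(packet)             # checksum travels with the packet
    attempts = 0
    while True:
        attempts += 1
        received = noisy_wire(packet)
        if zlib.crc32(received) == crc:  # receiver: checksum OK -> accept
            print(f"delivered intact after {attempts} attempt(s)")
            return received              # corrupted attempts never get through

print(transfer(b"16-bit PCM samples go here") == b"16-bit PCM samples go here")
```
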
While it's true that digital signals are made up of 1s and 0s, these signals still rely on precise timing and voltage levels to accurately represent the original analog waveform. Any interference that affects these timing or voltage levels can result in audible distortion in the resulting analog audio signal.
This is just false! “Any interference that affects these timing or voltage levels” does NOT result in audible distortion; it would with an analogue signal but not with a digital signal - that's why digital was invented! Each of the protocols for transferring digital audio specifies a RANGE of timing and voltage levels. For example, a voltage level could be specified as say 5 volts, which would represent a digital “1”, while 0 volts would represent a digital “0”. A voltage of say 3 or 4V would still register as a “1”, while a voltage of 0-2V would still register as a “0”. So even pretty huge voltage errors would have absolutely no effect whatsoever, let alone cause audible distortion! It's a similar sort of issue with timing/jitter: a RANGE of timings is specified for each protocol (the rise time of the “eye pattern”), and if a receiver cannot resolve the timing variation specified by the protocol, it is not compliant with the protocol it's claiming to support and is therefore faulty. In practice, the timing of the digital signal is not used directly by the DAC; it is either not used at all (in the case of asynchronous transfers) or used only to discipline the timing generated by the DAC's own internal clock (synchronous transfers). In either case, jitter in the digital signal is irrelevant unless it's so extreme that it's significantly outside the range specified by the protocol, in which case errors would result.
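
A toy illustration of that “range, not an exact value” point, using the example 0V/5V levels from above (just the example numbers, not any particular standard):

```python
# A receiver that treats anything above ~2.5 V as a "1" and anything below
# as a "0". Add a fair amount of noise to a clean 0 V / 5 V signal and the
# decoded bits don't change; only noise big enough to cross the threshold
# would ever cause a bit error (which would then be caught, as above).
import random

def decode(voltages, threshold=2.5):
    return [1 if v > threshold else 0 for v in voltages]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
clean = [5.0 if b else 0.0 for b in bits]
noisy = [v + random.uniform(-1.5, 1.5) for v in clean]  # up to 1.5 V of noise on every bit

print(decode(clean) == bits)  # True
print(decode(noisy) == bits)  # still True: +/-1.5 V never crosses the 2.5 V threshold
```
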
A study conducted by the Audio Engineering Society (AES) in 2003 found that even relatively small amounts of jitter can have a noticeable impact on the sound quality of digital audio signals.
Exactly how are they defining “relatively small amounts”? This is at best misrepresenting the findings and at worst an outright lie! The lowest amount of jitter ever detected was about 3 nanoseconds, with a test signal. With music it was 27ns, although that was one test subject with one specifically chosen (and very unusual) recording. In general, the threshold for jitter audibility is around 200-500ns. Is say 200 billionths of a second “relatively small”? It's relatively tiny compared to, say, the blink of an eye (around 200-500 milliseconds) but relative to the subject in question, jitter, it's the exact opposite: “relatively massive”! An AES paper in 1997 (Benjamin and Gannon) measured the jitter of 50 cheap/common consumer CD and DVD drives and digital TV tuners from the mid-1990s. On average they produced roughly 140 picoseconds of jitter, over a thousand times less than the lower 200ns typical threshold and roughly 200 times less than even the lowest jitter ever detected with music. So even “relatively average amounts of jitter” from ~30 years ago would have absolutely no noticeable impact, let alone “relatively small amounts” today. The bot is just quoting audiophile BS!
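
For a sense of scale, here's the back-of-the-envelope arithmetic in Python, using the usual first-order approximation that sinusoidal jitter of peak size dt on a full-scale tone of frequency f produces sidebands roughly 20*log10(pi*f*dt) below the tone. Treat the exact dB figures as approximations; the orders of magnitude are the point:

```python
# Rough jitter-sideband levels for a 10 kHz tone, plus the ratio quoted above.
import math

def sideband_db(f_hz: float, jitter_s: float) -> float:
    # First-order approximation for sinusoidal jitter on a full-scale sine.
    return 20 * math.log10(math.pi * f_hz * jitter_s)

tone = 10_000  # a fairly demanding 10 kHz tone
for label, dt in (("140 ps (typical mid-90s consumer player)", 140e-12),
                  ("27 ns (lowest ever detected with music)", 27e-9),
                  ("200 ns (typical audibility threshold)", 200e-9)):
    print(f"{label:40s} -> sidebands ~ {sideband_db(tone, dt):6.1f} dB below the tone")

print(f"200 ns / 140 ps = {200e-9 / 140e-12:.0f} times")
```

That works out to sidebands somewhere around -107dB for 140ps versus roughly -44dB for 200ns, and a ratio of about 1,400:1, which is where “over a thousand times less” comes from.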

The rest of what you quoted is much the same, i.e. either just false, or true but a “lie of omission”. The EBU showing microseconds of jitter causing pronounced distortion is true, but again it omits the amount of jitter actually produced by real equipment. Same with THD and noise (SNR): reducing them can indeed “improve sound quality”, “reduce listener fatigue”, etc., but the quote omits the fact that this is ONLY true when reducing audible levels of THD and noise. So we have the “lie of omission” again, because to be audible the level of THD has to be greater than about 1%, but what DACs have higher than 1% THD? The Apple dongle from 7 years ago, for example, cost about $8 and had a THD of 0.0011%!

G
 
May 3, 2023 at 9:52 AM Post #129 of 135
Higher-quality cables may be needed in situations where the cable is subjected to high levels of electrical interference or mechanical stress, or when longer cable runs are required. For example, in professional audio settings where long cable runs are common, high-quality cables with robust shielding may be necessary to prevent signal degradation over long distances.
Same again! Firstly, there is a falsehood here: depending on what signal we're talking about, there may be no need for robust shielding at all. Audio transfer via Ethernet is increasingly common in “professional audio settings”, for example Dante or AES67, which specify a 1Gb Ethernet protocol and Cat 5 UTP cable, for cable runs of up to 100m carrying up to 256 audio channels at 24/48. So that's completely unshielded (twisted pair) cable, let alone “robustly shielded”!
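
The raw payload arithmetic behind that figure, ignoring packet and header overhead (which adds a percentage on top):

```python
# 256 channels of 24-bit / 48 kHz PCM over a 1 Gb/s link.
channels, bits, rate = 256, 24, 48_000
payload_mbps = channels * bits * rate / 1e6
print(f"raw audio payload ~ {payload_mbps:.0f} Mb/s of a 1000 Mb/s link")  # ~ 295 Mb/s
```

So roughly 295Mb/s of audio on a gigabit link: plenty of headroom, over plain unshielded twisted pair.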

Secondly, similar to my previous post, how exactly does it define “high-quality”? It's true that in studios we typically have much longer cable runs than in consumer environments and far more EM interference, so for analogue signal transfer we do need “high-quality” cables with robust shielding, BUT we have to differentiate “high-quality” cables from “audiophile” cables. The high-quality cable we use in commercial studios typically costs around $2 per meter, for example Van Damme Classic Starquad, so around $10 including high-quality (Neutrik) connectors for a 2m cable. Commercial studios don't use audiophile cables; they're roughly 10-500 times more expensive, offer zero signal-transfer benefits, are often of an inappropriate gauge, and are commonly less robust, less well made and less reliable!

G
 
May 3, 2023 at 11:26 AM Post #130 of 135
Thanks for taking the time to read and respond! I can't claim to understand every detail of what you explain since I lack particular knowledge/experience on the topic, but logic says you're probably right. It's not the first time I've caught GPT delivering outright false information. However, it's easier for me to do so in my own field of expertise, and my colleagues and I have done it often since its popularity and usage grew. One has to be very careful while using it.

I'm looking to buy a streamer, btw, hence my recent interest in the topic. I was looking at the Node, but recently I'm leaning more toward the WiiM Pro. It will go into my Marantz PM6007, which has a DAC, via optical/coax. I keep reading that I'm going to need a better DAC (e.g. a Topping E50 or E70), and I'm constantly told to go for coax instead of optical because otherwise my sample rate would be capped (24/96 instead of 24/192). I'm not sure I (or anyone else) can hear better than 16/44.

On another topic, just this morning I saw a YouTube video about a new product (the EverSolo DMP-A6). It looks cool with the flashy display, if anyone is interested in checking it out.

Excuse any ignorance on my part; I only got into the hobby a year ago, so I have a lot of catching up to do.
 
May 3, 2023 at 12:10 PM Post #131 of 135
It's not the first time I've caught GPT delivering outright false information.
It would appear to be making judgements based on the quantity of information. For example, if most sources stated the Earth was flat, that's probably what it would deliver. This leaves it open to the argumentum ad populum fallacy and therefore to false/fallacious information. That's not an easy problem to overcome without access to material behind paywalls and some judgement of which sources are reliable.
I can't claim to understand every detail of what you explain since I lack particular knowledge/experience on the topic, but logic says you're probably right.
Unfortunately, you brought up a lot of points, so I couldn't really go into much individual detail or give in-depth simplified explanations. If you have a specific point, then ask away; we'll do our best to provide answers, which you can check against Wikipedia, or ask us for references.
I keep reading that I'm going to need a better DAC (e.g. a Topping E50 or E70), and I'm constantly told to go for coax instead of optical because otherwise my sample rate would be capped (24/96 instead of 24/192). I'm not sure I (or anyone else) can hear better than 16/44.
The audiophile world only exists because consumers are convinced to buy audiophile equipment. This is achieved by convincing them that newer, more expensive models/formats are better than older ones. Typically that is either not true or only true “on paper”, e.g. the higher performance is inaudible. Transducers (speakers, HPs, IEMs) are an exception to this general rule of thumb, but even then only under certain circumstances.

To answer your point: there is no reliable evidence of any audible difference between 16/44 and any of the higher data-rate formats for consumer playback at reasonable listening levels, while there is a considerable amount of reliable evidence demonstrating no audible difference. This issue was effectively done and dusted more than 15 years ago but is kept alive by the needs of audiophile marketing.
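
For reference, the standard textbook numbers behind 16/44 (simplified: full-scale sine reference, no dither or noise shaping):

```python
# Bandwidth and dynamic range of 16-bit / 44.1 kHz PCM.
import math

bandwidth_hz = 44_100 / 2                    # Nyquist: highest representable frequency
dynamic_range_db = 20 * math.log10(2 ** 16)  # roughly 6 dB per bit
print(f"bandwidth ~ {bandwidth_hz / 1000:.2f} kHz, dynamic range ~ {dynamic_range_db:.1f} dB")
# ~ 22.05 kHz and ~ 96.3 dB: beyond typical adult hearing bandwidth and far more
# range than any realistic domestic listening environment allows you to use.
```
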

G
 
May 3, 2023 at 12:28 PM Post #132 of 135
AI can totally make sh$& up. I asked Bard (Google's AI) how a basketball game went once and it got the players right but totally made up how the game went and what the score was. I told Bard it was incorrect and to check its answer on the web (Bard looks at the web in real time, unlike ChatGPT), and it acknowledged it was wrong, corrected itself, apologized and stated it was experimental and would try to do better in the future.

ChatGPT once made up a fake episode of Saturday Night Live from the 1970s for me, with Chevy Chase, that made fun of Richard Nixon eating spaghetti. No such episode ever existed. So these things definitely will make sh$& up. They often don't seem to know when they're hallucinating and when they're regurgitating, and their logic and fact checking are not very good right now for verbal replies (my understanding is that they are great at coding, though).

Things are moving so fast, I personally think they’ll get there pretty soon, just not right now. I really enjoy using them and find them extremely useful and entertaining but part of using them well is to understand the flaws and weaknesses, which right now are huge. IMHO & etc. 🙂
 
May 3, 2023 at 1:04 PM Post #133 of 135
The short answer is that jitter, even in the absolute worst examples, never comes remotely close to being audible. Anti-jitter features that manufacturers tout are solutions to problems that don't exist.
 
May 3, 2023 at 5:30 PM Post #134 of 135
AI can totally make sh$& up. I asked Bard (Google's AI) how a basketball game went once and it got the players right but totally made up how the game went and what the score was. I told Bard it was incorrect and to check its answer on the web (Bard looks at the web in real time, unlike ChatGPT), and it acknowledged it was wrong, corrected itself, apologized and stated it was experimental and would try to do better in the future.

ChatGPT once made up a fake episode of Saturday Night Live from the 1970s for me, with Chevy Chase, that made fun of Richard Nixon eating spaghetti. No such episode ever existed. So these things definitely will make sh$& up. They often don't seem to know when they're hallucinating and when they're regurgitating, and their logic and fact checking are not very good right now for verbal replies (my understanding is that they are great at coding, though).

Things are moving so fast, I personally think they’ll get there pretty soon, just not right now. I really enjoy using them and find them extremely useful and entertaining but part of using them well is to understand the flaws and weaknesses, which right now are huge. IMHO & etc. 🙂


Imagine if/when it gets to the point where it can be trusted as 100% reliable, so no one questions it, and then it starts getting things wrong.
 
May 3, 2023 at 5:34 PM Post #135 of 135
People just need to think for themselves... but tell that to an audiophile who trusts high end audio salesmen to think for them.
 
