Digital cable - recommendations wanted!

knutinh

Guest
ar-t wrote:
Ok, the reflection coefficient, called rho is:

(100-75)/(100+75) = 0.143
I think that you can safely assume that a large part of the readers are fairly familiar with reflections due to bad impedances.

What roffe was probably (?) referring to was Inter-Symbol Interference, where the decaying reflections cause issues for later channel symbols.
Briefly on "peer-reviewed" papers, such as one will find in the JAES.

JAES has an agenda.
I have never seen any documentation supporting your claims. I am guessing that you have none?

Sorry, but the fact that they have been "peer-reviewed" does not hold water with me. If all of those papers are correct, many of you will have to throw out a lot of your gear.
So we should not accept peer-reviewed papers, but we should accept your white papers, even though by your own account "I was not making a peer-reviewed study, with sufficient statistical validity to hold up to close scrutiny"?

I believe there are academics working in this field, with no ties to the industry, who still think that the AES is a fairly good source of insight. Are they wrong? Do they have an agenda? Do you have an agenda?

-k
 

Soundproof

Hi-Fi freak
Joined 04.07.2007 · Posts: 1,639 · Likes: 0
Soundproof's Theorem: Anecdotes that support ambitions tend to get repeated.

Soundproof's Corollary: The plural of anecdote is not data.

I find it necessary and essential that the international data transmission industry should be made aware of this absolutely critical failing, as it's definitely upsetting to think of all the data going wrong. Given the potential for error, it's a mystery how the world financial industry can rely on willy-nilly setups for the transmission of all that critical data - no wonder the market goes up and down.

BTW - I look forward to the day when analog sound bandwidths and digital transmission bandwidths are no longer confused, but fear that day will only come when people stop treating CD-players as if they were vinyl players.
At any rate, best of fortune with your product and your company.
 

muzak

Overeager enthusiast
Joined 13.07.2003 · Posts: 503 · Likes: 16
Soundproof wrote:
I find it necessary and essential that the international data transmission industry should be made aware of this absolutely critical failing, as it's definitely upsetting to think of all the data going wrong. Given the potential for error, it's a mystery how the world financial industry can rely on willy-nilly setups for the transmission of all that critical data - no wonder the market goes up and down.
Data transmission is a completely different matter. Data may go wrong, but you have plenty of time to correct it. A digital audio stream is more or less real-time, and therefore time and timing are essential. I think even you know this, but you never let a chance to quarrel pass you by :)

By the way: this is the best thread here in a long time. You may actually learn something :)

bjørn
 
kbwh

Guest
I can see no reason why we shouldn't accept a slight time delay for digital processing in hifi, as in other applications.

"Realtime" should be a non-issue.
 
knutinh

Guest
Isbjorn wrote:
Data transmission is a completely different matter. Data may go wrong, but you have plenty of time to correct it. A digital audio stream is more or less real-time, and therefore time and timing are essential. I think even you know this, but you never let a chance to quarrel pass you by :)
Actually, the "realtimeness" of data is a real limitation for data transmission as well. Users want their webpage to load in 50 ms, not 1500 ms. If higher delay were tolerated, radio networks could, for example, use large interleaving schemes that even out burst errors very nicely.

Of course, you are right that when latency demands are relaxed enough compared to the network delay, two-way communication can be used to combat errors very efficiently. On the other hand, we are talking about stuff like 100 Mb or 1 Gb Ethernet and/or large distances, low SNR, high ISI etc. One meter of 16-bit 44.1 kHz PCM audio is in another league in terms of information rate relative to the channel characteristics.

-k
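
knutinh's point about the scale difference can be checked with basic arithmetic. A small sketch in Python (the 64-bit S/PDIF frame per sample pair is from the consumer-interface format; the link speeds are the usual nominal figures):

```python
# Compare the raw payload rate of CD audio with common data-link speeds.

def pcm_rate(fs_hz: int, bits: int, channels: int) -> int:
    """Raw audio payload rate in bits per second."""
    return fs_hz * bits * channels

cd_audio = pcm_rate(44_100, 16, 2)   # 16-bit stereo at 44.1 kHz
spdif_frame_rate = 64 * 44_100       # S/PDIF carries 64-bit frames per sample pair

print(f"CD payload:        {cd_audio / 1e6:.4f} Mb/s")
print(f"S/PDIF frame rate: {spdif_frame_rate / 1e6:.4f} Mb/s")
print(f"Fast Ethernet headroom: {100e6 / cd_audio:.0f}x")
print(f"Gigabit headroom:       {1e9 / cd_audio:.0f}x")
```

Even before error correction enters the picture, a 100 Mb link has roughly 70 times the capacity of the audio payload.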
 
knutinh

Guest
Valentino wrote:
I can see no reason why we shouldn't accept a slight time delay for digital processing in hifi, as in other applications.

"Realtime" should be a non-issue.
Of course, the easiest solution (if this is considered to be a problem) would simply be to purchase a good CD-player with a good internal DAC using whatever proprietary internal clock-scheme ensuring good jitter-performance at analog outputs.

Anyone using room correction should be used to delay in the 100 ms range; I haven't heard of anyone having a problem with that. Of course, the limit is when the CD player's play/pause buttons are perceived as "sluggish", but as true audiophiles, we always listen to a complete record from start to end without touching the buttons, don't we? :)

-k
 
kbwh

Guest
From time to time I listen to both vinyl and digitally stored music on a stereo that is totally DSP-controlled. I think the delay is close to 200 ms. This really is a non-issue. If it isn't, I suggest therapy.
 
Slubbert

Guest
Soundproof wrote:
He makes a living propagating the jitter myth and would be out of a living should one manage to demonstrate that jitter is unimportant to digital audio transfer, in a well designed system.
This is easy to demonstrate. It's convincing people that's the hard part.

But hey, who are we to tell what others should spend money on? If someone can make a living out of it; good for them.
 

muzak

Overeager enthusiast
Joined 13.07.2003 · Posts: 503 · Likes: 16
A possible problem with this discussion is that while ar-t is arguing the technical aspects of the matter, some of the other contributors are attacking the politics and the methodology. I think ar-t doesn't care too much about that. He has done his listening tests, which have proved his point to himself and his company. It is sound to be sceptical, but I guess ar-t has done a lot of TDR measurements through the years to back up that there might be problems in the transmission (for different reasons).

To get anywhere in this discussion, I think we have to agree on the fact that there are reflections and other nonlinearities in the transmission. The discussion can then be aimed at what consequences this has for the following stages and the integrity of the signal, and, if it has any influence, how to get rid of these unwanted artifacts.

Here is one: phase noise is HF and RANDOM in nature. SPDIF receivers have good HF jitter attenuation, and zero attenuation at LF. Will the contributing reflections cause significant LF jitter?

knutinh wrote:
Of course, you are right that when latency-demands are low enough compared to the network delay, 2-way communication can be used to combat errors very efficiently. On the other hand, we are talking about stuff like 100Mb or 1Gb ethernet and/or large distances, low SNR, high ISI etc. 1 meter of 16bit 44.1kHz PCM audio is another league in terms of information compared to channel caracteristics.
-k
You are quite right. But these are not new issues for the engineers working with transmission. Look to the telephone industry: they have been dealing with these problems for decades, and that is real-time data transmission. Not surprisingly, they are the same people designing good solutions for digital audio transmission.

Cheers
Bjørn :eek:)
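
Bjørn's LF-versus-HF jitter question can be illustrated with the textbook first-order PLL jitter-transfer model: the recovered clock tracks input jitter below the loop corner (so LF jitter passes through) and attenuates it above. The loop corner frequency below is an arbitrary assumption for illustration, not a figure for any real SPDIF receiver chip:

```python
import math

def jitter_transfer(f_hz: float, fc_hz: float) -> float:
    """|H(f)| of a first-order PLL: jitter below the loop corner fc is
    tracked (gain ~1); jitter above fc is attenuated."""
    return 1.0 / math.sqrt(1.0 + (f_hz / fc_hz) ** 2)

fc = 10_000.0  # assumed loop bandwidth of 10 kHz, purely illustrative
for f in (100.0, 1_000.0, 100_000.0, 1_000_000.0):
    gain_db = 20 * math.log10(jitter_transfer(f, fc))
    print(f"jitter at {f:>9.0f} Hz passes with {gain_db:6.1f} dB")
```

So under this model, any mechanism that produces jitter well below the loop bandwidth would indeed pass straight through to the DAC clock, which is the worry behind the question.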
 
knutinh

Guest
Isbjorn wrote:
You are quite right. But these are not new issues for the engineers working with transmission. Look to the telephone industry: they have been dealing with these problems for decades, and that is real-time data transmission. Not surprisingly, they are the same people designing good solutions for digital audio transmission.
I believe that I know a good number of those people.

And every single one of them laughs at the theories and the approach to scientific method commonly displayed in the hifi industry. When you say "I don't believe in all the ABX/DBT stuff", I question your critical sense.

I believe you can make the same argument when it comes to diet pills: "Our pleased customers know for themselves that our products work, without any scientific mumbo-jumbo." They still make things very easy for sceptics. I think that is sad, because real, hard facts are welcome in any field, be it hifi, dieting or cosmetics.

-k
 

Soundproof

Hi-Fi freak
Joined 04.07.2007 · Posts: 1,639 · Likes: 0
Hi there, Bjørn -- I'm willing to accept Art's anecdotes, but there's absolutely no reason for him to cast aspersions on the JAES, on Toslink optical (well, in fact there is, since Toslink optical is far superior to his coax stretch, and since his cable theory is irrelevant for Toslink it wouldn't earn him money if Toslink's superiority were accepted by the hifi church), or on peer review.

And yes, Art, I'm still waiting for your comments as to toslink's inferiority, given the music recording industry's adoption of ADAT Toslink.
 

ar-t

Industry member
Joined 17.02.2008 · Posts: 18 · Likes: 0
Boy......a lot of stuff to go over.

First, a lot of folks in the industry find the ABX/DBT thing to be very tedious. I am not alone.

Some of you think that most of the readers are aware of what reflections are? REALLY!!!!!

Then you are all smarter than the AES/EBU working group that came up with the original specifications. After it became public in the JAES, I had a long phone conversation with the chairman of that group. He was (at that time) a highly-respected person in the field of acoustics. But he had absolutely no knowledge of why you can't stick 3 transmitters and 7 receivers on the same transmission line and expect it to work properly. Now, I am not going to say that they changed the spec a year or so later just because I explained how flawed it was to him. But I am fairly certain that enough other folks in the industry spoke up. But someone had to, because it was changed to one transmitter and one receiver.

So much for "industry experts". I am sure that the concept was "peer-reviewed". Yet it turned out to be all wrong. Of course, you could make the point that the whole concept of SPDIF and AES/EBU is flawed because of the faulty way the data is encoded.

Yes, in the telecom world, we can transmit gigabits of data without screwing up. I have no knowledge of how data is encoded; that is not my field. But I can tell you that everyone that I work with (outside of our company) agrees that SPDIF is the worst possible thing that has ever come along.

Briefly, the encoding scheme that they use creates a lot of data-correlated jitter. Which is far worse than random jitter. And that is why oddball ideas like long cables make a noticeable difference.

As someone has noted, why not just have an all-in-one player, and dispense with all of this jitter nonsense.

Precisely! Which is why we made one, and only one, SPDIF DAC box. We accurately predicted that as soon as the magazines figured out the jitter existed, and that it was a problem, the market would dry up. They did, and it did. We did not know if they would embrace the concept of secondary PLLs and other stuff that would be needed to clean things up. So, we just continued making amps and preamps.

(The magazines also figured out that single-mode fiber had problems, and that was the end of that fad. I could have told them that as well. Had they asked me. But, alas........they didn't. Yes, I will explain TOSLINK and now single-mode fiber as well. Be patient.)

Ok, some of you don't think that I should jump on the AES. Well, some of my best friends in the industry were AES insiders at one point. They were ostracised by the academics, and by their industry cohorts. Yes, I may have an axe to grind; I admit that. But try to put that aside. (I merely raise my gripes with them to segue to other matters. Maybe not the best approach, but it is my idiom.)

It all gets back to the peer-reviewed vs. lunatic-at-large-on-the-Internet approaches to assimilating information. My contention is that it is just as wrong to blindly accept every peer-reviewed paper as it would be to blindly dismiss every ranting on the 'Net. Yes, a lot of peer-reviewed stuff is probably ok. And a lot of nonsense can be found on the 'Net.

But all of you guys have brains to think with. And ears to hear with. Somehow, you are going to have to decide for yourselves what to believe and what sounds right. Do you believe every review in every magazine? Of course not! That would be silly. Reviewers have different tastes and beliefs than you do. And yes, some may have an agenda. So, how do you sort things out for yourself? Well, you have to put some time and effort into finding out what is right and what is nonsense.

Same holds for industry publications. Their motives are not all that different than high-end magazines. Just because the JAES is a good source of technical information is no reason to believe each and every single word that is published in it. Academics have an agenda: publish or perish. Industry advocates have theirs as well: our stuff is better because...........

Ok, using that assumption, I have the same agenda. Well, maybe. I can assure you that we are not going to get rich selling a 200USD digital cable. Maybe if it cost $2 to make, we could. But it doesn't. If we priced it using the industry norm, it would cost $600.

But we can not, in good conscience, sell a piece of wire with a connector attached to each end, at such an exorbitant price. Even though it is very expensive wire (it really is!) and the connectors are also expensive, as they are custom-made. It is still just wire at the end of the day.

Which leads to my "real agenda". How many of you know someone who has spent (or will spend) $1000 on a digital cable? Why? It is just wire. Same as ours. Except shorter. And maybe fancier looking. But we all know that people make cables like that, because people will buy cables like that.

My agenda is to challenge conventional notions. What makes that $1000 cable worth that price? Do they have any real scientific basis to justify that? I doubt it. When was the last time any of you can remember any "wire bandit" (as we call them in the USA) go to such lengths to explain what they do and why? Oh, sure, they will have some great ad copy, telling you about the high quality materials, and the lengthy design process, and a lot of other sophistry.

News flash: "wire bandits" don't have a team of designers, toiling in the back room with vector network analysers. Nope. They just call up a cable manufacturer and say: "Hey, how much would it cost per foot to take this many conductors (pick a random number), arrange it in this configuration (whatever configuration you can imagine), and make a cable with our name on it?" If you can afford 5000 feet, then congratulations!.........you are now in the cable business. That is how it works. Just make up some cool looking ads, take some fancy pictures, invent some nonsense to justify the $$$$$$ price tag, and that is all it takes.

So, some nut in Texas comes along and not only tells you that 1/2 meter digital cables are crap, but why they are crap. He probably isn't going to sell any to the readers here. But he may get some of you to open your minds, and realise that a lot of what is considered common knowledge is just plain wrong. Doesn't matter if it is the high-end magazines or professional journals.

At one time, common knowledge (backed by the church, which in effect was the government) held that the world was flat. Gee, if the AES had been around back then, I wonder which side they would have come down on. They got AES/EBU wrong the first time out. So we know that they are not infallible. The magazines thought that outboard DACs were hot. Until someone figured out that jitter was a problem. Ditto for their infallibility.

Hell, even I am not infallible. I got it wrong the first time. With 25 years (back then) of RF experience, I just stuck a 1/2 meter cable on a SPDIF DAC. It sucked, but it took less than one afternoon to get my head out of my butt and get it pointed in the right direction.

We all make mistakes. Some of us learn and grow from them. Others just pick themselves up and trip over the same obstacle again.

So..........here is your challenge. Someone here has to be able to make their own cables. Make some that are 1/2 meter long. Make some the same way that are 6 meters long. Listen to them. Make your friends listen to them. Then tell us what everyone thought. You can send me an e-mail thanking me. A post card from your country would be very welcome to add to our collection of satisfied customers the world over.

One word of caution: get ready for the onslaught of "Did you do an ABX/DBT, and get a statistically significant response? How did you qualify those responses?" I'll let you deal with them. I have had my fill of that since we started our company back in '86.

I know........TOSLINK and single-mode fiber.......my fingers need a break. Plus, I do have a job. Maybe not much of one..........

AR-T
 

ar-t

Industry member
Joined 17.02.2008 · Posts: 18 · Likes: 0
It just occurred to me that I did not give you all the links to my writings over on AC. This is where I started in on tearing apart SC transformers:

http://www.audiocircle.com/circles/index.php?topic=41593.0

Remember.........they had a paper in the AES! (Yep, I really do hate those guys. Sorry, but at least I am honest about it.)

AR-T
 
knutinh

Guest
ar-t wrote:
Then you are all smarter than the AES/EBU working group that came up with the original specifications.
After it became public in the JAES, I had a long phone conversation with the chairman of that group. He was (at that time) a highly-respected person in the field of acoustics.
I have no pretensions of being that clever.

But knowing the relevant theory for reflections at impedance boundaries (first-year university, or third year of "videregående", I guess) and connecting it to the analysis of a standard is probably easier than applying that theory yourself when speccing something. It is always easier to reiterate others' ideas than to come up with your own.

From your references, I gather that your conversation was heated. Drawing on my strictly non-technical understanding of how we humans operate, I guess that your views (as well as his) were polarized by the end, and that the exchange perhaps didn't serve to show off your ability to converse cool, calm and collected?
Of course, you could make the point that the whole concept of SPDIF and AES/EBU is flawed because of faulty way of encoding the data.
I have no knowledge of how data is encoded; that is not my field.
The two quotes above do not seem to add up. Are you an authority on data encoding, or not?
Briefly, the encoding scheme that they use creates a lot of data-correlated jitter. Which is far worse than random jitter. And that is why oddball ideas like long cables make a noticeable difference.
Floyd Toole did DBT/ABX tests around 1986, showing not only that loudspeakers sound different, but also obtaining significant correlation between measurable loudspeaker characteristics and subjective response.

Surely, in the 20+ years since then, with significant advances in technology and the stuff you industry guys are providing us with, proving that jitter in commercially available gear is at all noticeable is something someone like yourself can do in a couple of hours with both hands tied behind your back? ;-)
As someone has noted, why not just have an all-in-one player, and dispense with all of this jitter nonsense.
That would be me. As I also mentioned, that would be an obvious solution if interface-induced jitter were perceived as a problem. As I don't perceive it as a problem, I am using SPDIF for my own equipment.
Ok, some of you don't think that I should jump the AES. Well, some of my best friends in the industry were AES insiders at one point. They were ostracised by the academics, and their industry cohorts. Yes, I may have a axe to grind; I admit that. But try to put that aside. (I merely raise my gripes with them to segue to other matters. Maybe not the best approach, but it is my idiom.)
I do know quite a few publishing scientists, and there are personal as well as organisational issues there, as in every other place where people meet. But surely you recognize that the AES has a vested interest in doing strict quality assurance on every published manuscript, while hifi manufacturers may well sell loads of products even after being caught in a flat-out lie?
But all of you guys have brains to think with. And ears to hear with. Somehow, you are going to have to decide for yourselves what to believe and what sounds right. Do you believe every review in every magazine? Of course not! That would be silly. Reviewers have different tastes and beliefs than you do. And yes, some may have an agenda. So, how do you sort things out for yourself? Well, you have to put some time and effort into finding out what is right and what is nonsense.
The question is not what I hear; if one were only concerned with one's own experience, there would be no motivation for being on a discussion forum or reading hifi magazines. The question is where to draw the line between "Busted", "Plausible" and "Confirmed", to use Mythbusters terminology.
Same holds for industry publications. Their motives are not all that different than high-end magazines. Just because the JAES is a good source of technical information is no reason to believe each and every single word that is published in it. Academics have an agenda: publish or perish. Industry advocates have theirs as well: our stuff is better because...........
And the problem is visible: academics publish too many papers that carry no meaningful information, often just slight adjustments of earlier publications by the same author.

But, if you are aware of the Sudbø scandal, it should be obvious that your career as a scientist is on the line if you are caught in a lie or cheating with data.
My agenda is to challenge conventional notions. What makes that $1000 cable worth that price? Do they have any real scientific basis to justify that? I doubt it.
Finally something we can agree on.
News flash: "wire bandits" don't have a team of designers, toiling in the back room with vector network analysers. Nope. They just call up a cable manufacturer and say: "Hey, how much would it cost per foot to take this many conductors (pick a random number) and arrange it is this configuration (whatever configuration you can imagine) and make a cable with our name on it?" If you can afford 5000 feet, then congratulations!.........you are now in the cable business. That is how it works. Just make up some cool looking ads, take some fancy pictures, invent some nonsense to justify the $$$$$$ price tag, and that is all it takes.
In case you want to learn a Norwegian expression: "like honey to my ears"!

At one time, common knowledge (backed by the church, which in effect was the government) thought that the world was flat.
I believe that the flat-earth story has been thoroughly debunked as a myth created by 19th-century historians?

-k
 
Slubbert

Guest
@ knutinh: That the AES3 spec was changed is verifiable. It originally specified 250 Ω receiver impedance (and a 110 Ω source) to enable connecting several receivers in parallel to one line driver. This caused excessive ISI and made some multi-receiver systems lose lock, so it was rectified in the AES3-1992 revision, which specified one driver, one receiver and one characteristic impedance.

R. A. Finger, "AES3-1992: The Revised Two-Channel Digital Audio Interface", J. Audio Eng. Soc., vol. 40, no. 3, pp. 107-116, March 1992.

That is all.
 

ar-t

Industry member
Joined 17.02.2008 · Posts: 18 · Likes: 0
Guys:

I am really short on time right now, so let me briefly address the TOSLINK issue. I will go into further detail in a day or so, when I have more time.

Sure, TOSLINK works if you inject a master clock into everything. You could use just about anything, including a wet shoe lace, to transmit the data, if you have an external clock that feeds everything. (OK, a wet shoe lace is an exaggeration.) Once you feed an external clock, you no longer have to recover the clock from the encoded data. And that is the whole problem with both SPDIF and AES/EBU. (No, I am not an expert on data encoding. I do know that is the problem with SPDIF. So all the experts claim, and my experience says that they are right. Does that answer your question?)

It took the recording industry a while to figure out that a reference clock was needed to supply all the equipment in the chain. I don't know why it never caught on in the high-end world. Sure, some manufacturers sold gear that supplied a reference clock from one unit to another. But there has not been a move towards the same scheme as used in Pro Audio. Seems like a great idea. Think of all the $$$$$$ to be made selling "magic" clocks and stuff to all the tweaks. We all know how much audiophiles love to try out new gadgets, especially if they cost $$$.

The only drawback would be that the market for cables would take a big hit, as they would then all most likely sound the same. The only obvious tweak would be different clocks.

As a company, we would rather build and sell clocks than cables! We are an electronics manufacturer, not "wire bandits". So, I guess that is my secret agenda.

AR-T
 

Rockefoten

Overeager enthusiast
Joined 15.09.2003 · Posts: 721 · Likes: 31 · Marketplace ratings: 2
This might be off-topic, but I'll give it a try.

Some DAC manufacturers claim to be just as good as master-clock products. At least Bel Canto and Benchmark do, and these DACs have also received a lot of praise.

Does this make technical sense? I'm asking for an opinion on the explanation below:

The surpassing musical performance rests on the solid foundation of the Ultra-Clock™, a clock with performance found previously only in separate Master Clock products. The Ultra-Clock™ provides jitter performance 50x better than other clocks. Specifications of 1 picosecond RMS and frequency accuracy of 0.0001% define new benchmarks in clock stability. This Ultra-Clock™ allows the Dac3 to reproduce music with new levels of dynamic expression and accuracy of timbre. Your music communicates as never before, recalling the best of analog music reproduction, without the limitations of either analog or other digital technologies. The Ultra-Clock™ works in concert with every detail of the Dac3’s design to achieve a new level of musical realism.

ole
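
One way to sanity-check a "1 picosecond RMS" spec is the standard jitter-limited SNR formula for sampling a full-scale sine, SNR = -20·log10(2π·f·t_j). This is generic converter theory, not a claim about the Dac3 specifically; a quick sketch:

```python
import math

def jitter_limited_snr_db(f_signal_hz: float, t_jitter_rms_s: float) -> float:
    """SNR ceiling imposed by sampling-clock jitter for a full-scale sine."""
    return -20.0 * math.log10(2 * math.pi * f_signal_hz * t_jitter_rms_s)

# 20 kHz full-scale sine, 1 ps RMS of clock jitter:
snr = jitter_limited_snr_db(20_000, 1e-12)
quantization_16bit = 6.02 * 16 + 1.76  # ideal 16-bit quantization SNR, ~98 dB

print(f"jitter-limited SNR:        {snr:.1f} dB")
print(f"16-bit quantization floor: {quantization_16bit:.1f} dB")
```

By this measure a 1 ps clock puts the jitter floor roughly 40 dB below the 16-bit quantization floor, so technically the spec is more than sufficient; whether other clocks are really "50x" worse is a separate, empirical question.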
 

roffe

Hi-Fi freak
Joined 12.01.2005 · Posts: 3,677 · Likes: 6
ar-t wrote:
Let us say that the RX is 100 ohms. (A very popular D/A box did this. I asked the designer why, and his response was "Hey, 100 ohms is what we had that day. Most guys leave it unterminated. Give me a break; we are a step in the right direction.")

Ok, the reflection coefficient, called rho is:

(100-75)/(100+75) = 0.143

That translates into -16.9 dB. Not that great. Keep in mind that there will be some stray reactance to take into account, so in practice it will be worse.

Anyway, we now have around 14% of the original signal bouncing back to the source. Well, if we have a perfect 75 ohm cable and source, all of the reflection will be absorbed, and there will be no further reflection.

Of course, that is not reality. So, let us say that it also has the same rho of 0.143. 14% of 14% will then be reflected back to the RX. Delayed in time by the propagation time of twice the cable length.

If the cable is very short, you could end up with 2% of the original signal arriving a few ns later.

Ok, 2% is obviously not enough to affect whether it is a 1 or a 0. But it is enough to affect the timing of the decision point. And this is what matters.
That 2% reflection will arrive at some point regardless of cable length, and will affect the signal in the same way.
So, what do you do about it???

1.) Design a reclocking circuit to get rid of that problem. Guess how many do that. Hint: not many.
I have one foot in the stereo camp and one in home cinema, and I can tell you that buffering and reclocking is a necessity in every part of a home cinema processing unit. It is more common than you may think. But of course the non-oversampling purist DAC producers will cut down on circuitry where they can, to cater to their dogmatic minimalist approach. Such DACs will of course be easy prey to jitter problems. Like I said, my money is on reclocking DACs, preferably with some internal buffering. It is done all the time in portable CD players, MP3 players etc., and there is no reason not to buffer, other than idealistic minimalism.
OK, you asked about BW. Let us say that you agree that a very sharp rise time is needed to prevent those nasty reflections from mucking things up. (Faster rise time, sharper slope, less chance for mucking up timing.)

If you have a 7 ns rise time, then the -3 dB point (assuming a single-pole network) is 0.35 / (7 × 10^-9 s) = 50 MHz. Cut the rise time in half, and the BW is now 100 MHz. Yes, it will generate tons of EMI. But there are units around that are very fast. So 100 MHz is not that outlandish a claim.
But as I stated before, the reflections will arrive at some point no matter what, and screw up the timing. So 100 MHz will be of only academic interest, since imperfect impedance matching will cause jitter in the timing of these presumed steep edges. Even if the reflected edge does not arrive while the original edge is still in transition, the added reflection will cause a train of small rises and falls in the signal, shifting the level at which the following edges start.

You are stating that a lossy cable is actually preferable in SPDIF applications. I do not quite see how "cable loss" enters into it. The gradual decay of the "ping-pong" effect of the reflections occurs because of the impedance mismatches at each end of the cable, not because of the cable itself.
Can you tell me some basic properties of what you would classify as a lossy cable?
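
For reference, the reflection arithmetic quoted above is easy to reproduce. A sketch in Python; the velocity factor of 0.66 is my assumption for a typical coax and is not from the post:

```python
import math

def rho(z_load: float, z0: float) -> float:
    """Voltage reflection coefficient at an impedance discontinuity."""
    return (z_load - z0) / (z_load + z0)

def return_loss_db(r: float) -> float:
    return 20.0 * math.log10(abs(r))

def round_trip_ns(length_m: float, velocity_factor: float = 0.66) -> float:
    """Delay of a reflection: down the cable and back."""
    c = 3.0e8  # m/s
    return 2 * length_m / (velocity_factor * c) * 1e9

r = rho(100.0, 75.0)    # 100 ohm receiver on a 75 ohm line, as in the post
second_bounce = r * r   # re-reflected at an equally mismatched source
print(f"rho = {r:.3f}  ({return_loss_db(r):.1f} dB)")
print(f"second bounce back at RX: {second_bounce * 100:.1f}% of the signal")
print(f"round trip, 0.5 m cable: {round_trip_ns(0.5):.1f} ns")
print(f"round trip, 6 m cable:   {round_trip_ns(6.0):.1f} ns")

# Rise time <-> bandwidth rule of thumb from the quoted post:
def bw_mhz_from_risetime(tr_ns: float) -> float:
    return 0.35 / (tr_ns * 1e-9) / 1e6

print(f"7 ns rise time -> {bw_mhz_from_risetime(7.0):.0f} MHz")
```

The numbers match the quoted derivation: rho ≈ 0.143 (about -16.9 dB), a second bounce of about 2%, and 0.35/7 ns giving 50 MHz.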
 

roffe

Hi-Fi freak
Joined 12.01.2005 · Posts: 3,677 · Likes: 6
ar-t wrote:
1.) Error?? "Perceived error" was the same as with any other type of jitter-related problem. The bass sounds flabby, and the top end is nasty sounding.
Are you telling me that the timing of these edges should cause significant timing errors in the 20-80 Hz region?! We are talking about a part of the signal where a sampling frequency of 200-300 Hz would be enough for any practical purpose. What DACs did you say you tested this with? A 44.1 kHz sampling frequency is literally overkill where bass signals are concerned.
I challenge you to show me how a bass-only signal sampled at 44.1 kHz would be audibly affected by timing jitter in the DAC.
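
roffe's argument can be quantified with the standard worst-case jitter-error model for a sampled sine: the error is the signal's slew rate times the timing error, so it shrinks linearly with frequency. The 1 ns jitter figure below is a deliberately pessimistic assumption, chosen only to show the scale:

```python
import math

def worst_case_jitter_error(f_hz: float, amplitude: float, t_jitter_s: float) -> float:
    """Worst-case amplitude error when sampling a sine with a jittered clock:
    slew rate (maximal at the zero crossing) times the timing error."""
    return 2 * math.pi * f_hz * amplitude * t_jitter_s

t_j = 1e-9  # assumed 1 ns of timing jitter, pessimistic on purpose
for f in (50, 1_000, 20_000):
    e = worst_case_jitter_error(f, 1.0, t_j)
    print(f"{f:>6} Hz: worst-case error {20 * math.log10(e):7.1f} dBFS")
```

Even with a full nanosecond of jitter, the worst-case error on a 50 Hz tone sits around -130 dBFS, which supports roffe's point that bass is the last place a timing error should show up, at least as a direct amplitude error.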
 

ar-t

Industry member
Joined 17.02.2008 · Posts: 18 · Likes: 0
Still short on time, so I will try to answer without being too brief.

Yes, the reflection should arrive at the same time. But you have to allow for noise, and for jitter already present in the signal before it leaves.

The concept is simple: If the round trip for the reflection arrives after the decision point, it won't matter much how large the reflection is. Since most equipment has lots of reflection problems, SPDIF DACs without reclocking schemes are greatly helped.

Think of the cable as a delay line, if that helps.

Lossy cables attenuate the reflections. But usually with increased HF loss. It is a balancing act.

A fast waveform dictates a high BW. Slow it down by lowering the BW, and you increase the risk of reflections mucking things up.

How jitter does what, in the psychoacoustic world, I am not going to even begin to speculate. You will just have to take that on faith. 20 years of building digital audio products, coupled with similar observations with darn near everyone that I know in the business, says that jitter does that.

Speaking of psychoacoustics, I have no explanation why "odd things going on" at high frequencies make the bass sound wrong. Ask any other amp designer about this. The honest ones will tell you a lot of the time when the bass does not sound right, there is something going wrong at the higher frequencies. By higher, yes that means above the audible range. There are a lot of things that happen in audio that we have no explanation for. Because of this, a lot of folks will invent stuff to justify the price tag, but most of it is nonsense.

With a TDR, spectrum analyser, and your ears, my claims stand up to scrutiny. I don't make up stuff to sell our products. While I admit that attaching a precise value is damn near impossible, you can measure reduced levels of jitter with our cable.

Here is one interesting thing to try:

Those of you who have a SPDIF DAC, stick a battery-operated listening device of some sort on the part of the RX chip that is the PLL filter. Usually a cap, with one or two resistors. Listen to the noise on that, with the transport idle. Then listen to it with music playing!

When you can make out what song is playing by listening to the PLL filter, then maybe you will realise how bad SPDIF is.

Ok, I promised something on fiber. Here is a case where reflections don't always arrive at exactly the same time. Give me some time to do some other work, and I will try to start writing it.

AR-T
 

ar-t

Bransjeaktør
Ble medlem
17.02.2008
Innlegg
18
Antall liker
0
OK, fiber optics............

First off, this applies where there is no external clock input or reclocking circuit. Either of those fixes all the problems by itself.

You also have to agree that fast rise times are better for jitter performance than slow rise times. Most commercial gear has slow rise times, as that is necessary to get past EMI standards.

TOSLINK has a very s-l-o-w rise time. On top of that, the rise and fall times are different! Sorry, that is not the approach that is called for. Sure, it passes the bits without error. Jitter performance.......no way!

But let's examine single-mode glass fibre, which by any reasonable thinking should be better than a crude multi-mode plastic fibre.

Fibre has the great advantage that it provides galvanic isolation, without all the problems that building a transformer-based interface entails. It just has a set of problems that is unique to it.

If you were to look at the output of a single-mode laser diode on an optical spectrum analyser (yes, they really do make such things), you would see that there are many spectral lines present - not just one, as you might expect. One of those spectral outputs is clearly dominant. Under normal operating conditions it remains so, and everything works OK.

But just what constitutes an operating condition that causes that dominant mode to no longer be dominant? How is it that we can pass gigabits of data over tens of km without any problems, yet I contend that it doesn't work for, say.............1 metre..........with only S/PDIF data?

Because it operates over 10's of km, not 1 metre, that is why!

Fibre was designed to operate over distances where copper is no longer a viable option, and that is roughly around 1 km. What happens when you try to get one of those laser diodes to work with only 1 metre of cable is really interesting.

With only 1 metre of cable, too much light is reflected back from the RX into the laser diode. The diode is not designed to cope with the amount of light that is returned over such a short run; it wants to see a km or so of fibre to absorb the reflected light.

So, light bounces back. What is the big deal, you wonder.

When too much light enters the laser diode, it undergoes something known as mode hopping. This is an interesting phenomenon where the dominant mode does not remain the dominant mode. The dominant mode changes freely and unpredictably. All the time. So, if we were to look at the output now, you would see all the spectral lines going up and down. Seemingly all on their own. Pick up that 1 metre pigtail, and really watch them go wild!

Ok, so we have the spectral content changing all the time. So what, you ask.

Now you have to examine the properties of fibre itself. In this case, we will only concentrate on single-mode fibre.

As the output travels along the fibre, each of the modes has a slightly different wavelength, along with a slightly different path. This means that the arrival time for each is just slightly different. Maybe not by much, but by enough. Different arrival times lead to pulse dispersion, which means that the pulse is not as clean as it was when it left the transmit end. (You could also call it pulse spreading; the two terms seem to be used interchangeably.) As the dominant mode changes, the pulse dispersion becomes an issue: you no longer have the same arrival time. It shifts around as the modes shift. (Each mode contributes some to the final pulse. The dominant mode, under normal operating conditions, is strong enough to mask the arrival of all the other modes. The pulse is spread out some, but not enough to be a problem.)
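For a sense of scale, chromatic dispersion spreads a pulse by roughly D x L x delta-lambda. A sketch with assumed figures: 17 ps/(nm.km) is a typical coefficient for standard single-mode fibre at 1550 nm, and the 2 nm wavelength step stands in for a mode hop.

```python
# Back-of-envelope chromatic-dispersion spread.  Both D and the 2 nm
# wavelength step are assumed illustrative values.

def spread_ps(d_ps_per_nm_km, length_km, delta_lambda_nm):
    """Chromatic-dispersion pulse spread: D * L * d_lambda, in ps."""
    return d_ps_per_nm_km * length_km * delta_lambda_nm

D = 17.0   # ps/(nm*km), assumed for standard SMF at 1550 nm
for metres in (1, 1_000, 10_000):
    dt = spread_ps(D, metres / 1000.0, 2.0)   # assumed 2 nm mode hop
    print(f"{metres:>6} m: {dt:10.4f} ps")
```

Over a 1 m pigtail the static spread is tiny; the argument in the post is that mode hopping makes the arrival time wander, not that the spread itself is large.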

So, let us add all this together:

We have too much light, bouncing back to the source. This gives rise to mode-hopping. We now no longer have one single dominant mode. It changes all the time. This means that due to different modes becoming the dominant one, the arrival times will shift with it.

And that is what we call jitter.

Of course, the fix is easy: get rid of the reflected light. Too bad companies that used the AT&T transmitter never bothered to ask them what happens if you have too much light bouncing back. Of course, you can blame AT&T for not putting that in the data sheet. But then, they would probably say "If you don't know that much about fibre optics, then you shouldn't be designing with it."

All for now. I will check back later to see if there are any other questions.

Maybe I will go over to AC and see if I can make someone there mad at me!

AR-T
 

Bx

Bransjeaktør
Ble medlem
04.08.2005
Innlegg
8.875
Antall liker
4.315
Thank you for a very interesting contribution Ar-t,

I also took a long look at the spdif/transporter thread. It was time well spent.
 

roffe

Hi-Fi freak
Ble medlem
12.01.2005
Innlegg
3.677
Antall liker
6
ar-t skrev:
As the output travels along the fibre, each of the modes has a slightly different wavelength, along with a slightly different path. This means that the arrival time for each is just slightly different. Maybe not by much, but by enough. Different arrival times lead to pulse dispersion, which means that the pulse is not as clean as it was when it left the transmit end. (You could also call it pulse spreading; the two terms seem to be used interchangeably.) As the dominant mode changes, the pulse dispersion becomes an issue: you no longer have the same arrival time. It shifts around as the modes shift. (Each mode contributes some to the final pulse. The dominant mode, under normal operating conditions, is strong enough to mask the arrival of all the other modes. The pulse is spread out some, but not enough to be a problem.)
So what are we talking about in variance of the pulse arrival times? How does that compare to the ideal rise time of coax S/PDIF, which according to you requires a bandwidth of 50-100 MHz?
 

Soundproof

Hi-Fi freak
Ble medlem
04.07.2007
Innlegg
1.639
Antall liker
0
roffe skrev:
ar-t skrev:
As the output travels along the fibre, each of the modes has a slightly different wavelength, along with a slightly different path. This means that the arrival time for each is just slightly different. Maybe not by much, but by enough. Different arrival times lead to pulse dispersion, which means that the pulse is not as clean as it was when it left the transmit end. (You could also call it pulse spreading; the two terms seem to be used interchangeably.) As the dominant mode changes, the pulse dispersion becomes an issue: you no longer have the same arrival time. It shifts around as the modes shift. (Each mode contributes some to the final pulse. The dominant mode, under normal operating conditions, is strong enough to mask the arrival of all the other modes. The pulse is spread out some, but not enough to be a problem.)
So what are we talking about in variance of the pulse arrival times? How does that compare to the ideal rise time of coax S/PDIF, which according to you requires a bandwidth of 50-100 MHz?
Quantum physics is probably the answer here, if we are to get good sound in the living room! ;D
 

ar-t

Bransjeaktør
Ble medlem
17.02.2008
Innlegg
18
Antall liker
0
I don't have any data for the AT&T parts. I have the data sheets somewhere, but can not put my hand on them at this moment.

Let me extrapolate from my telecom experience. (OK, that has been quite a few years, but I think it will serve for our purposes.)

We were using Siecor single-mode fibre, at 1310 nm. (We did not characterise it at 1550 nm.) In the world of fibre, they do not think in terms of bandwidth the way us "analogue" types do; they use the term chromatic dispersion to arrive at an equivalent figure. Effectively, we measured the BW to be in the area of 3 GHz on that particular fibre. I doubt that much has changed, so let's say our pigtail from the transport to the DAC is in that vicinity.

Assuming that we are talking about a single-pole network, we would have a rise time of 0.35/(3×10⁹ Hz) ≈ 0.117 ns.

Now, I don't know about you, but I haven't run across a lot of logic in stereo gear that has a rise time that fast. It would have to be some sort of ECL type logic, as opposed to saturated (CMOS style) logic.

So, in lieu of the actual specs (I don't even remember the AT&T p/n's so I can not look them up on-line. If someone can, please!!!!!!!!), we will have to make a leap of faith and assume that the rise and fall time of the logic is going to be the limiting factor in the TX/RX pair in question.

I know, an incomplete answer. But if we are using logic with 3.5 ns rise and fall times, we are back at our 100 MHz BW. I don't think the fibre will be the limiting factor here.
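All the rise-time figures in this post come from the single-pole approximation t_r ≈ 0.35/BW, which is easy to check:

```python
# Single-pole rise-time / bandwidth relation used in the post.

def rise_time_ns(bw_hz):
    """10-90% rise time of a single-pole network, t_r = 0.35/BW, in ns."""
    return 0.35 / bw_hz * 1e9

print(f"3 GHz fibre BW : {rise_time_ns(3e9):.3f} ns")
print(f"100 MHz logic  : {rise_time_ns(100e6):.1f} ns")
```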

Now, some of you may have your thinking caps on, and realise that long cables can be the answer for coax and fibre, but for different reasons.

In fibre, we need that length to attenuate the light bouncing back to the laser from the RX. But, 1 or 2 metres in length can make the difference between that reflection arriving at the RX during the transition period or harmlessly at the top (or bottom) of the waveform. We are talking about 0.1 % difference in length being possibly the difference between benefit and hindrance. Clearly, any attempt to control reflections based on length in fibre is not a wise approach. But for attenuating the reflection, it is a clear winner.

The opposite is true in coax. Clearly, a 1 km coax will attenuate the signal and the reflection too much to be of any use. But, we can use a much shorter cable (with obviously much less attenuation) to control where the reflection can occur. By carefully controlling its length, we can "tune" when the reflection occurs.
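The length-"tuning" argument can be sketched numerically. The thresholds below are assumptions (a 25 ns edge and a 177 ns biphase cell at 44.1 kHz), chosen only to illustrate the mechanism:

```python
# Where does the echo land for a given coax length?
# RISE_NS and CELL_NS are assumed illustrative thresholds.

C = 299_792_458.0   # speed of light, m/s
VF = 0.66           # assumed velocity factor of the coax
RISE_NS = 25.0      # assumed edge rise time of a consumer S/PDIF driver
CELL_NS = 177.0     # biphase cell width at 44.1 kHz

def round_trip_ns(length_m):
    """Round-trip time of a reflection on the cable, in ns."""
    return 2 * length_m / (C * VF) * 1e9

for length in (0.5, 1.0, 3.0, 6.0, 12.0):
    rt = round_trip_ns(length)
    if rt < RISE_NS:
        verdict = "lands on the edge itself"
    elif rt < CELL_NS:
        verdict = "lands on the flat top"
    else:
        verdict = "spills into the next cell"
    print(f"{length:5.1f} m: {rt:6.1f} ns -> {verdict}")
```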

Now.......can anyone think of a device that we could use in either situation, that would do exactly the same thing, and come up with the same result? (Obviously, it would be physically different. But we would use the same term to denote it by.)

If you have the answer, don't blurt it out here! Just in case the competition is reading this.

AR-T
 

Dog

Hi-Fi freak
Ble medlem
22.11.2004
Innlegg
3.228
Antall liker
422
Sted
Sola
I'm taking the thread back! ;D

My DAC has a 47 ohm output impedance, and the ASR, I believe, has a 10 kohm input impedance... that should be a decent ratio?
(with the 10x rule in mind...). It plays with good resolution in most respects, but I imagine I am missing some detail and definition
at the bottom... that there is less weight in the bass compared to the Lector (RIP). Maybe it's the cable? ;D

O.
 

SAL

Hi-Fi freak
Ble medlem
05.10.2006
Innlegg
3.471
Antall liker
4.271
Torget vurderinger
10
We have swapped quite a few digital cables, and in OUR system there have been big differences - by big I mean a lift on a par with finding, say, a better CD player! ;)
 
N

nb

Gjest
From an old thread, a contribution from soon-to-be-Dr. Løkken regarding alleged differences in bass quantity and volume when using different digital cables:

------
For a digital cable to be able to change the bass response (or the volume, which was mentioned earlier), it would have to:

1) Decode the audio data from the S/PDIF stream, which also contains track information, parity information, biphase encoding, channel information and SCMS copy protection.
2) Convert the serial audio data into parallel PCM samples and generate its own sample clock to keep track of them.
3) Perform mathematical operations on the PCM audio data using digital signal processing, e.g. filtering or equalisation.
4) Re-encode the processed data back into S/PDIF.
5) In addition to the processed data, also send out a dummy data stream that fools recording or measurement equipment into believing that input and output are bit-for-bit identical.

I would guess that a circuit to do this would require somewhere between fifty thousand and five hundred thousand transistors. It is quite impressive that a cable manages the same thing.
-----

Not everyone will agree with this, but it may be worth keeping in the back of your mind when reading the most ecstatic reports of what happens when digital cables are swapped.
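The decode step in the quoted list can be illustrated with the channel code itself. Below is a minimal biphase-mark (BMC) encoder/decoder sketch - BMC is the line code S/PDIF uses - showing that the audio samples sit inside an encoded stream rather than lying on the wire as raw PCM:

```python
# Minimal biphase-mark (BMC) codec sketch.  Each bit cell starts with a
# transition; a '1' bit adds a second transition mid-cell.  The data is
# carried entirely by transitions, not by voltage levels.

def bmc_encode(bits, level=0):
    """Encode a bit list into (first-half, second-half) level pairs."""
    cells = []
    for b in bits:
        level ^= 1                 # transition at every cell boundary
        first = level
        if b:                      # a '1' adds a mid-cell transition
            level ^= 1
        cells.append((first, level))
    return cells

def bmc_decode(cells):
    """A cell whose two halves differ carried a '1'."""
    return [int(a != b) for a, b in cells]

data = [1, 0, 1, 1, 0]
assert bmc_decode(bmc_encode(data)) == data
print("round trip ok:", data)
```

Since every bit is defined by transitions rather than levels, a passive cable has no handle on "the bass" at all; the only thing it can affect is edge timing.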
 
K

knutinh

Gjest
nb skrev:
Not everyone will agree with this, but it may be worth keeping in the back of your mind when reading the most ecstatic reports of what happens when digital cables are swapped.
But if one assumes that listeners do detect differences, while their perceptual description does not match what is actually done to the signal, some other possibilities open up.

If you (for some unknown reason) have so enormously much jitter that it is audible, and the person under test has so little training that they perceive the change as "less bass", then you have an explanatory mechanism.

This of course presupposes a test methodology that actually tries to find out what the test subject hears, and not what she wants to hear. It also presupposes that there is a real physical phenomenon underneath.

-k
 

Soundproof

Hi-Fi freak
Ble medlem
04.07.2007
Innlegg
1.639
Antall liker
0
It is completely beyond reason to believe that digital cables have the effects on the sound that many imagine. That is to misunderstand what a digital signal consists of - something Løkken is very clear about here.

But it is entirely in line with an industry that wants to carry criteria over from vinyl and analogue signals to digital, which it sees makes up the mass market. And as long as the customers are willing to play along, this is of course fine for the "manufacturers" and for those who believe they have obtained a substantial improvement in sound.

Ultra-critical data streams run in finance, civilian, military, telecom and other contexts. If that signal path were altered by the choice of cable, the entire integrity of these transmission systems would be threatened, and no one would have any guarantee whatsoever that the data packets are intact and transparent.

Simply put, I would demand to be paid in hard currency and be deeply worried about my pension if data signals were as fragile as is claimed in hi-fi circles.

But hi-fi is a discipline of its own when it comes to the acceptance of technical criteria.
 

Gammeln

Overivrig entusiast
Ble medlem
03.09.2007
Innlegg
742
Antall liker
43
If S/PDIF transmitters and receivers are implemented as appallingly badly as ar-t has described in his excellent posts, it is not at all strange that digital cables affect the sound.
And I would not be surprised if he is right.

What happens on ordinary data links is completely irrelevant. There you can have as much jitter as you like, as long as the information can still be decoded. We are easily talking about jitter at the percent level in those cases.

With S/PDIF, the clock information has to be recovered and used as the clock for a DAC. That imposes entirely different requirements.
K

knutinh

Gjest
Gammeln skrev:
If S/PDIF transmitters and receivers are implemented as appallingly badly as ar-t has described in his excellent posts, it is not at all strange that digital cables affect the sound.
And I would not be surprised if he is right.

What happens on ordinary data links is completely irrelevant. There you can have as much jitter as you like, as long as the information can still be decoded. We are easily talking about jitter at the percent level in those cases.

With S/PDIF, the clock information has to be recovered and used as the clock for a DAC. That imposes entirely different requirements.
If you measure THD+N for the chain in question (transport, cable, DAC, plus any noise influences), you get a figure that is sensitive to jitter. It is possible to build a setup that measures "badly" on THD+N without having jitter, but I do not believe it is possible to build a setup that measures "well" on THD+N while having significant jitter.

Since the ideal weighting of such errors is, as far as I know, unknown, it is not possible to construct a measured scale that says "how good A sounds relative to B" for two different setups with different amounts of noise, jitter, nonlinear distortion, etc.

If you do not have access to THD+N or jitter measurements for your particular setup, you can argue:
1. For CD players, THD+N is specified.
2. Many audiophiles prefer a CD transport with an external DAC over a one-box CD player.
3. If the choice of an external DAC is well founded in listening experience, is there any reason to believe that the external DAC has more jitter than 1)?

What I have seen of blind tests of human sensitivity to jitter suggests to me that the problem is exaggerated by an industry that lives off FUD. It would nevertheless be very reassuring to have more thorough tests of different types of jitter under different conditions to lean on.

-k
 

Tweedjakke

Hi-Fi freak
Ble medlem
29.01.2008
Innlegg
3.487
Antall liker
3.608
Sted
Sunnmøre
Soundproof skrev:
Ultra-critical data streams run in finance, civilian, military, telecom and other contexts. If that signal path were altered by the choice of cable, the entire integrity of these transmission systems would be threatened, and no one would have any guarantee whatsoever that the data packets are intact and transparent.
First, I am convinced that most digital transmission within hi-fi is error-free and that the cables play a minor role.

But: of course physical errors occur in the transmission all the time, and that usually means that entire packets of information must be resent. That is why all digital transmission (e.g. on a network) has protocols for checking whether you have received valid data or "corrupted" data. I have not studied S/PDIF very closely, but frames with errors are discarded, are they not?

(Or, rephrased: it is not possible for the DAC to ask the CD transport to resend a frame that failed, is it?)
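On the parity question: an S/PDIF subframe carries a single even-parity bit (covering time slots 4-31), so a receiver can detect, but not correct, an odd number of bit errors - and since there is no back channel, a bad frame can only be dropped or interpolated, never resent. A toy sketch of that parity check, with an assumed 4-bit payload standing in for the real 27 covered slots:

```python
# Toy even-parity check of the kind an S/PDIF receiver performs.
# The 4-bit payload is an assumed stand-in for the real subframe slots.

def even_parity(bits):
    """Parity bit that makes the total number of ones even."""
    return sum(bits) % 2

def frame_ok(payload_bits, parity_bit):
    """True if payload plus parity has an even number of ones."""
    return (sum(payload_bits) + parity_bit) % 2 == 0

payload = [1, 0, 1, 1]
p = even_parity(payload)
assert frame_ok(payload, p)

payload[0] ^= 1                   # a single bit error on the wire...
assert not frame_ok(payload, p)   # ...is detected, but cannot be corrected
print("single-bit error detected; no retransmission possible in S/PDIF")
```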
 

Soundproof

Hi-Fi freak
Ble medlem
04.07.2007
Innlegg
1.639
Antall liker
0
Jitter is the industry's answer to wow/flutter/tracking error in vinyl, and has been pushed forward as the rationale for maintaining unnecessarily high prices on signal cables.

But whereas with vinyl you immediately hear differences if something is wrong, with digital transmission you actually have to destroy the signal deliberately to achieve the same.
 

Soundproof

Hi-Fi freak
Ble medlem
04.07.2007
Innlegg
1.639
Antall liker
0
vagstol skrev:
Soundproof skrev:
Ultra-critical data streams run in finance, civilian, military, telecom and other contexts. If that signal path were altered by the choice of cable, the entire integrity of these transmission systems would be threatened, and no one would have any guarantee whatsoever that the data packets are intact and transparent.
First, I am convinced that most digital transmission within hi-fi is error-free and that the cables play a minor role.

But: of course physical errors occur in the transmission all the time, and that usually means that entire packets of information must be resent. That is why all digital transmission (e.g. on a network) has protocols for checking whether you have received valid data or "corrupted" data. I have not studied S/PDIF very closely, but frames with errors are discarded, are they not?

(Or, rephrased: it is not possible for the DAC to ask the CD transport to resend a frame that failed, is it?)
This is correct - and the ideal would be sign-off on correctly transferred data packets, with buffering of the received information. This is indeed built into a number of the components in use, but it requires a two-way signal flow, and some of the equipment in use makes no allowance for this.

This is also why hard-disk-based playback is advantageous: precisely in order to have error correction there. With optical laser reading from a CD there is potentially a problem, but that problem is also considerably exaggerated. And in any case it is not something a choice of cable will improve.

The point, of course, is what then happens to the data stream from the moment it enters the signal chain until it is processed. Extensive tests have been carried out here, among others by John Atkinson in Stereophile, to assess whether there is bit transparency between transmitted and received data. He has found it, even in units as cheap as Apple's Airport Express wireless receiver (which has Toslink optical S/PDIF out - and which buffers received data packets, two-way, to preserve playback integrity).

That leaves us with potential clock error, and the question of whether it can audibly distort and ruin the music. Benchmark has created its UltraLock system, which is not two-way, but which is supposed to guard against this. (Steve Nugent believes he can improve on it, to Benchmark's irritation.)

There are plenty of articles and statements from critical musicians and producers who find digital transmission, from CD through even standard systems, far preferable to vinyl, and free of the distortions that industry players keep finding. There is seriously large bandwidth for data transfer in S/PDIF, and a vanishingly small potential for transmission errors between components that are designed for each other and that sit close together with a simple, stable connection between them.

In traditional data transmission, protocols were adopted because thousands of differently configured and manufactured components may handle the data stream on its way from sender to receiver - and it still works incredibly well there.

The transmission challenges in a hi-fi system are factors below traditional multi-component data transmission in complexity.
 

Dog

Hi-Fi freak
Ble medlem
22.11.2004
Innlegg
3.228
Antall liker
422
Sted
Sola
I don't really believe a digital cable does big things either, if anything at all. There is constant talk of "sensibly constructed cables"; presumably it is not just about impedance but also about choice of materials - so compared with my metre of Clas Ohlson digital cable(!!), I would think better things exist.

What is more interesting, though perhaps OT in my own thread, is whether the Squeezebox 3 holds up in the transport role compared with more expensive alternatives (be they network players or CD players). I will find that out over time, as I get a chance to test various things into the MF DAC.

Anyone with experience here?

O.
 

SAL

Hi-Fi freak
Ble medlem
05.10.2006
Innlegg
3.471
Antall liker
4.271
Torget vurderinger
10
Anyone feel like coming over to hear the difference between digital cables? Soundproof?
 
N

nb

Gjest
SAL skrev:
Anyone feel like coming over to hear the difference between digital cables? Soundproof?
I have been to demos at hi-fi shops where the salesman was mildly ecstatic while demonstrating such cables in various price ranges, without myself being particularly impressed. I have also tested a bit at home without noticing anything either way. Perhaps because the engineers who built my gear have enough between the ears to have designed their way past such trivial(?) problems ;)

Edit: It also belongs to this story that the person in question was exactly as ecstatic even when he got somewhat confused about what we were actually listening to. His credibility was accordingly a bit so-so, I think.
 