An introduction to computer audio
An undeniable truth: if your digital audio chain is bit perfect, it is perfect.
One bit is not more beautiful, or better polished, than any other.
This argument is true but incomplete, because it leaves the other half of digital audio, timing, out of the equation.
Digital audio on a computer is PCM (Pulse Code Modulation) audio.
It consists of samples (the bits) taken at a fixed interval, the sample rate.
Playback means converting the bits back to analogue, and this must be done at the right sample rate.
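As a sketch of that idea (the tone frequency, rate, and function name are my own illustrative choices, not from the text): the code below samples a 440 Hz tone at the CD rate and quantises each sample to a 16-bit integer code, showing that PCM is just values taken on a fixed time grid.

```python
import math

SAMPLE_RATE = 44100  # samples per second (the CD rate)
FREQ = 440.0         # test tone in Hz
BITS = 16            # resolution of each sample

def sample_tone(n_samples):
    """Take PCM samples of a sine wave at a fixed interval (1 / SAMPLE_RATE)."""
    max_code = 2 ** (BITS - 1) - 1      # largest positive 16-bit code: 32767
    samples = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE             # every sample sits on the fixed grid
        value = math.sin(2 * math.pi * FREQ * t)
        samples.append(round(value * max_code))  # quantise to an integer code
    return samples

print(sample_tone(4))
```

Converting back to analogue is the reverse: each integer code becomes a voltage, released at exactly 1 / SAMPLE_RATE second intervals.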
To generate the sample rate you need a clock.
A clock is an analogue device; its precision can be very high, but never perfect.
These tiny variations in the sample rate are called jitter.
If jitter becomes too high, it becomes audible.
The bits might be perfect; the timing never will be.
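A rough way to see what jitter does (the 1 ns figure and the 10 kHz tone are illustrative assumptions, not from the text): perturb each sample instant with a small random timing error and measure how far the sampled value lands from the ideal one. The error grows with the signal's slope, which is why high frequencies suffer most.

```python
import math
import random

SAMPLE_RATE = 44100
FREQ = 10000.0      # a high tone has a steep slope: the worst case for jitter
JITTER_RMS = 1e-9   # 1 ns RMS of clock wobble per sample (illustrative)

def worst_jitter_error(n_samples, seed=0):
    """Largest amplitude error caused by sampling at slightly wrong instants."""
    rng = random.Random(seed)
    worst = 0.0
    for n in range(n_samples):
        t_ideal = n / SAMPLE_RATE
        t_real = t_ideal + rng.gauss(0.0, JITTER_RMS)   # the imperfect clock
        ideal = math.sin(2 * math.pi * FREQ * t_ideal)
        real = math.sin(2 * math.pi * FREQ * t_real)
        worst = max(worst, abs(real - ideal))
    return worst

# The error is at most roughly slope * timing error, i.e. 2*pi*FREQ*JITTER_RMS.
print(worst_jitter_error(1000))
```

Note that every sample value is still "bit perfect" here; only the moment it is converted is wrong.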
If we feed the right bits to the DAC, do we hear them all in the right way?
The answer is no.
DACs differ in their ability to resolve the bits.
The cheap ones probably get 12 bits right; the very good ones, up to 22.
This is called the linearity of a DAC.
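To put numbers on linearity (a simplified model of my own, not from the text): each extra bit of real resolution halves the smallest analogue step the converter can reproduce, so a DAC that is only linear to 12 bits takes steps about a thousand times coarser than one good for 22.

```python
def smallest_step(linear_bits, full_scale=2.0):
    """Smallest analogue step (here over an assumed 2 V full-scale range)
    that a DAC with `linear_bits` of real linearity can resolve."""
    return full_scale / (2 ** linear_bits)

print(smallest_step(12))   # a cheap DAC: steps of roughly 0.5 mV
print(smallest_step(22))   # a very good DAC: steps of roughly 0.5 uV
```

Below the linearity limit, the extra bits in the data still arrive, but the analogue output no longer tracks them accurately.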
dCS tries to visualize it in both the digital and the analog domain.
The first picture, I think, is clear: if samples are mangled (not on purpose; deliberate changes are called DSP), e.g. a bit is flipped, the result is a distorted sound.
The third picture is also clear, the right samples arrive at the right time.
It is the middle one that I find unclear.
Here there are problems with the timing, but you can't blame the samples: their content is right. The right sample at the wrong time is still the right sample; only the timing is in error.
Most of the time the problem is not a slight change in the sample rate, as in this picture, but slight differences in the length of each time step.
This is called jitter.
As PCM audio consists of samples at a fixed sample rate, perfect playback is bit perfect and time-step perfect.
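That definition can be stated as a small check (the function and its tolerance are illustrative, not an established API): playback is perfect only if the received samples match the sent ones bit for bit, and every interval between samples equals 1 / rate.

```python
def playback_is_perfect(sent, received, timestamps, rate, tol=1e-12):
    """Perfect playback = bit perfect (same samples) AND time-step perfect
    (every interval between samples is 1 / rate, within tol seconds)."""
    if sent != received:                      # bit perfect?
        return False
    step = 1.0 / rate
    return all(abs((t1 - t0) - step) <= tol   # time-step perfect?
               for t0, t1 in zip(timestamps, timestamps[1:]))

samples = [0, 2052, 4097, 6126]
even = [n / 48000 for n in range(4)]          # a perfectly steady clock
jittery = [t + (1e-9 if n == 2 else 0.0) for n, t in enumerate(even)]

print(playback_is_perfect(samples, samples, even, 48000))     # True
print(playback_is_perfect(samples, samples, jittery, 48000))  # False
```

The second call fails on a single nanosecond of timing error even though the samples are identical, which is exactly the point of the article.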
More about bit perfect and jitter.