But enough of the theory: how do these codecs actually differ when it comes to real-world listening? Spend enough time Googling the phrase "sound quality" coupled with the names mentioned on the previous pages and you'll find endless debate over the merits and demerits of each codec. After a couple of hours you're likely to be more confused than when you started, for while some of it is focussed on proper research and balanced, scientific analysis, most of it is misinformed and biased. The simple fact is that at a given file size, specifically in the medium-compression range typified by 128kbps CBR encoding, most people can hear very little difference between the codecs.
It's impossible to say which is better without some sort of mass study involving hundreds or perhaps thousands of people, because one person's ears will differ from another's. Where I might hear a difference, you may not; where I can put up with one type of audio artefact, it might grate on your nerves.
However, I'm never one to take other people's opinions and research at face value. I wanted to find out for myself which codecs sounded better. Here are the results of my research:
High bit rates
Listening tests have been repeated ad infinitum on the web, and the conclusion is that, above bit rates of 192kbps, even very experienced listeners who know what they're listening for have difficulty telling the different codecs apart, or distinguishing lossily encoded files from the uncompressed WAV original.
And I can confirm from my extensive, though I must stress informal, listening tests with a pair of high-end headphones (Grado SR325is, in case you're interested) that this is indeed the case. In the interests of keeping the quality of the source high, the headphones were hooked up to an iTube Fatman valve amp, fed in turn by my own home-brewed DAC, which decoded the digital S/PDIF signal output by my PC.
Having used Foobar's ABX facility to carry out double-blind tests on various clips of music, the only codec where I could reliably tell the difference between the original and the high bit rate file was MP3 (encoded using the -V 2 --vbr-new option, roughly equivalent to 190kbps).
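For reference, that MP3 setting corresponds to a LAME command line along these lines (the filenames here are placeholders, not from my actual test setup):

```shell
# LAME VBR preset 2 with the newer VBR routine; averages roughly 190kbps.
# input.wav and output.mp3 are placeholder filenames.
lame -V 2 --vbr-new input.wav output.mp3
```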
With the same tests run on Ogg Vorbis (q6), AAC (iTunes VBR, equivalent to 196kbps) and WMA (196kbps), I couldn't tell the difference with any kind of statistical reliability.
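For those wondering what "statistical reliability" means here: an ABX tool reports how many trials you answered correctly, and the usual yardstick is the one-sided binomial probability of scoring at least that well by pure guessing. A minimal sketch of that calculation (the function name and the example trial counts are my own illustration, not output from Foobar):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers right in `trials` ABX trials by guessing alone
    (each guess has a 1/2 chance of being right)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Illustrative scores (hypothetical, not my actual results):
print(abx_p_value(12, 16))  # ~0.038 - unlikely to be guessing; a real audible difference
print(abx_p_value(9, 16))   # ~0.40  - indistinguishable from guessing
```

A p-value below roughly 0.05 is the conventional threshold for claiming you genuinely heard a difference rather than got lucky.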