AptX HD: The future of high-res music streaming

We take a look at Qualcomm's new aptX HD and what it means for mobile audio.

As we've seen from the recent announcement of the LG G5 and its plug-in DAC module, as well as speculation over the iPhone 7's audio setup, people care about mobile audio.

One of the big standards you might have encountered of late is aptX HD. So what exactly is it, and why should you care?

Before we can discuss aptX HD, we should lay the groundwork by explaining aptX - the standard it stems from.

Read more: Best headphones 2016


History of aptX

AptX is an audio codec developed in the 1980s by Dr. Stephen Smyth of Queen's University Belfast. It was initially released as an integrated circuit for broadcasters looking to store CD-quality audio on a computer hard disk drive.

This explains why aptX has gone on to become a popular professional audio standard that has been adopted by 30,000 radio stations and 20,000 cinemas across the world.

In 2009, aptX started to appear in consumer electronics, and it has since been adopted by some 320 leading audio brands. In this context - the one we're mainly concerned with here - the aptX codec has become a key means of streaming near-CD-quality sound from various media-playing devices to a variety of Bluetooth speakers, headphones, soundbars and the like.

It's widely enabled in mobile devices from the likes of Samsung, Sony, HTC, Motorola and LG, and it's also supported by Windows 10 and Mac OS X.

One final point worth making here is that in July 2010, Cambridge-based semiconductor company CSR acquired the aptX technology, and in August 2015 the world's biggest mobile chip maker, Qualcomm, completed its acquisition of CSR. As such, aptX is now a Qualcomm property.

What is aptX HD?

With aptX explained, you can probably take a good guess at what aptX HD involves. It's an enhancement of the core aptX codec aimed at providing higher quality wireless sound streaming.

Announced at CES 2016 back in January, aptX HD works in much the same way as aptX, but adds support for high-resolution audio. Where standard aptX streams 16-bit audio (roughly CD quality), aptX HD goes beyond that and can carry 24-bit audio over a Bluetooth connection.
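To put some rough numbers on that, here's a back-of-the-envelope sketch in Python. The helper function is purely illustrative, and the roughly 4:1 compression ratio is the figure commonly quoted for aptX and aptX HD, so treat the results as ballpark bitrates rather than official specifications.

# Rough data-rate arithmetic for raw PCM audio versus an aptX-style ~4:1
# compression. Figures are approximations for illustration, not official specs.

def pcm_kbps(sample_rate_hz, bit_depth, channels=2):
    """Raw (uncompressed) PCM bitrate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

cd_quality = pcm_kbps(44_100, 16)   # what classic aptX takes in
hi_res = pcm_kbps(48_000, 24)       # what aptX HD takes in

print(f"16-bit/44.1kHz raw: {cd_quality:.0f} kbps, ~{cd_quality / 4:.0f} kbps after ~4:1 compression")
print(f"24-bit/48kHz raw:   {hi_res:.0f} kbps, ~{hi_res / 4:.0f} kbps after ~4:1 compression")

Either way, the compressed stream is slim enough to travel over an ordinary Bluetooth audio link, which is the whole point of the codec.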

As well as accepting a 24-bit signal, aptX HD preserves that fidelity by using an extra two bits in each of its four processing sub-bands, which improves the signal-to-noise ratio and leads to less distortion in high-resolution audio files.
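For a sense of what extra bits buy in theory, the standard textbook rule of thumb for an ideal linear quantiser is roughly 6.02 dB of signal-to-noise ratio per bit. The sketch below simply applies that formula; it is not a measurement of aptX HD, and a real sub-band codec won't hit these ideal numbers.

# Theoretical dynamic range of an ideal linear PCM quantiser:
# SNR is roughly 6.02 * bits + 1.76 dB. Real codecs fall short of this,
# but it shows why extra bits push the noise floor down.

def ideal_snr_db(bits):
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{ideal_snr_db(bits):.0f} dB")
# 16-bit: ~98 dB, 24-bit: ~146 dB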

Given its heritage, it's perhaps unsurprising to learn that aptX HD retains backwards compatibility with Bluetooth devices running classic aptX.

Related: Samsung Galaxy S7 vs LG G5

The LG G5 has full aptX HD support

AptX HD support

You might think that, given its Qualcomm ownership, a whole heap of Android devices launched in 2016 would come with aptX HD support by default. After all, the Qualcomm Snapdragon 820 looks set to be the go-to chip for most high-end phone manufacturers.

However, aptX HD support has to be licensed and enabled by the handset manufacturers themselves.

For example, the recently announced LG G5 comes with aptX HD support in tow. The LG Tone Platinum, meanwhile, is the world's first Bluetooth headset equipped with the aptX HD codec. The idea is presumably to pair the two up for wire-free high-resolution audio nirvana.

However, the Samsung Galaxy S7, which was announced on the same day as the LG G5 and is powered by the same Snapdragon 820 chip in some territories, only has regular 16-bit aptX support.


Samsung's Galaxy S7 does not come with aptX HD support

The moral of the story, then: if you're looking to stream high-resolution audio wirelessly using aptX HD, check for support before you buy anything. Whether it's a smartphone, a set of Bluetooth headphones or any other part of your audio setup, support for Qualcomm's latest standard is not guaranteed.

Of course, there are plenty of people who will tell you that you'll never match the quality of a decent set of wired headphones with any Bluetooth set, regardless of the audio codec used. But that's an argument for another article.

Are you looking forward to aptX HD? Let us know in the comments below.

pimlicosound

February 24, 2016, 3:04 pm

I assume everyone involved in making this, like all high-res audio products, just doesn't care that it's beyond the theoretical maximum hearing capability of human ears. There's a reason CD-quality is 16-bit/44.1KHz - it offers twice the fidelity that people can resolve, meaning that even with aliasing of the waveform people will still resolve the original audio with 100% accuracy. It's mathematically impossible to improve on this. Higher fidelity is only necessary in sound studios where audio will be relentlessly processed and suffer down-sampling as a result. It's a waste of time in consumer audio. Anyone who says otherwise is suffering from the placebo effect.

Dcol

February 24, 2016, 9:15 pm

There's a minor factual error in this article. Qualcomm acquired CSR in August 2015, not in 2011 as your article states.

Gokhan Tunc

February 24, 2016, 9:52 pm

I hope the iPhone 7 will support it, and I am also looking for Bluetooth headphones with both aptX HD and Lightning cable support.

mode11

February 25, 2016, 6:37 pm

Pimlico didn't say anything about frequency response, he was talking about frequency range. This is completely linked to sampling rate, which needs to be over double the max. frequency you wish to capture (i.e. 22KHz x 2 + a little bit (0.1KHz) = 44.1KHz). 22KHz is already beyond the human ear, especially for adults.

Bit depth is how fine the steps are in the captured waveform, and in turn defines the dynamic range (loudest vs. quietest sounds that can be captured). 16-bit resolution already allows for a range that goes from inaudible to literally deafening (if the mastering chose to use it, and your hi-fi had the power). In HDR TV terms, like going from pitch black to staring into a laser beam.

The CD format only bothered to go as high as 16-bit in the first place because oversampling hadn't been invented. Oversampling was created to enable early-80s 14-bit DACs to be good enough for CDs; applying the same upsampling technique to 16-bit audio pushes it even further beyond the capabilities of the human ear.

mode11

February 25, 2016, 6:44 pm

Does this sort out the latency of Bluetooth? I was going to buy some BT headphones for late night TV watching, but after research there seems to be an inherent 0.2 sec lag that's noticeable with video. Apt-X Low Latency does exist, but approximately 2 (expensive) headsets use it.

I opted for a couple of pairs of Sennheiser RS160's in the end, which use Kleer lossless (and sound superb).

Bugblatter

February 26, 2016, 2:16 pm

To sample a 20KHz waveform you need to sample at at least 40KHz (waves go up and down and you need to capture both the up and the down). In fact you need to be slightly above in order to avoid always sampling the same level on the waveform (which might be the zero bit for example).

Even then, with 44.1KHz you're getting an incredibly rough approximation of that 20KHz waveform; it should be a sine wave but you're capturing a saw tooth wave. If you do a Fourier analysis on a saw tooth you find you have the base frequency, i.e. the one you want, plus a whole load of harmonics on top. These harmonics are distortion. Most will be inaudible, but not all (especially on the slightly lower frequencies, e.g. 16KHz and above).

For 16 bit vs 24 bit it seems like it shouldn't make much of a difference (65K levels vs 16 million, but 65K seems like it ought to be enough), but the 24 bit stuff I've listened to does seem crisper and more solid.

pimlicosound

February 26, 2016, 2:20 pm

"it should be a sine wave but you're capturing a saw tooth wave"

You're actually not, because the ADC is not capturing a wave at all. It's recording a series of values at regular intervals and not recording anything in between those intervals - in other words, it's recording a series of points. A saw-tooth wave is just a really bad visual representation of this series of points. If you join up the points with a smooth line (which is essentially what a DAC does), you get a smooth sine wave, which is a perfect facsimile of the original audio wave.

pimlicosound

February 26, 2016, 2:22 pm

"16-bit resolution already allows for a range that goes from inaudible to literally deafening"

Exactly right. High bit depth doesn't increase quality (because dithering already removes any quantization errors from an insufficient bit depth); it only lowers the noise floor, which means you can get increased dynamic range. Since 16-bit already provides enough dynamic range to deafen you, 24-bit provides enough to literally kill you.

Bugblatter

February 26, 2016, 2:29 pm

If you only have 2 points then joining them up will give you a saw tooth.

Obviously we're talking as if the ADC samples each frequency individually, which it doesn't. There's no 'smooth line' for the DAC to draw between the two points; it just samples the overall amplitude of the music and that will include all of its constituent frequencies.

pimlicosound

February 26, 2016, 3:32 pm

TrustedReviews or Disqus keeps blocking my attempt to link to a YouTube video, so instead I'll recommend you search YouTube for this: "D/A and A/D | Digital Show and Tell". It explains with great clarity how ADC and DAC really work. It's 25 minutes long, but so worth it. If you just want to see the explanation of the fallacy of a digital "wave" and how an ADC really records discrete values that a DAC can convert perfectly into the original waveform, skip to 5:55. But you should really watch the whole thing - it's fantastic.
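For anyone who would rather see the same point in code than in the video, the following minimal NumPy sketch (added here purely for illustration, with an arbitrary 15kHz test tone) samples a tone below the Nyquist frequency and rebuilds it with ideal sinc interpolation, which is what a well-designed DAC's reconstruction filter approximates.

# Sample a 15kHz sine at 44.1kHz, then reconstruct it between the samples
# using Whittaker-Shannon (sinc) interpolation. The rebuilt curve matches the
# original sine closely - no sawtooth appears - because any signal below the
# Nyquist frequency is fully determined by its samples.
import numpy as np

fs, f0 = 44_100, 15_000                      # sample rate and test tone, in Hz
n = np.arange(256)                           # 256 sample instants
samples = np.sin(2 * np.pi * f0 * n / fs)    # the discrete values an ADC would record

# Evaluate the reconstruction on a 10x finer time grid.
t = np.linspace(0, (len(n) - 1) / fs, 10 * len(n))
recon = np.array([np.sum(samples * np.sinc(fs * tk - n)) for tk in t])

original = np.sin(2 * np.pi * f0 * t)
middle = slice(len(t) // 4, 3 * len(t) // 4)  # skip the edges, where the finite sum is truncated
print("max error:", np.max(np.abs(recon[middle] - original[middle])))
# Small compared with the unit signal amplitude; it shrinks as more samples are used.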

Pete

March 5, 2016, 4:43 am

Some disinformation here. Bugblatter, no, sampling and analog-to-digital conversion don't work like that; sines don't become sawtooth, they come out as sines. On the other hand, pimlicosound, a square wave at 22k WILL become a sine... Yes, you could say you don't hear those harmonics, so why bother with them. It's true that you can portray any waveshape as a sum of sines, but there is NO proof that when the eardrum receives information it can't hear, it filters it exactly the same way the DAC filters it. Bottom line is, frequencies that we can't hear still have an influence on the waveshape of a signal, and high sample rates just keep more of that integrity all the way until it gets to your ears. As for bit depth, well, it's directly linked to the signal-to-noise ratio, and yes, most trained ears will hear a difference between a 24-bit and a 16-bit audio signal. No placebo effect here.

ford4life69

March 16, 2016, 2:11 pm

So why does my S7 not work with my LG HBS750 then? It should support traditional aptX at the very least, correct? It works fine on calls, but playing Pandora, YouTube or any other audio/video source it is awful. This headset worked beautifully with my S5. I almost wish I hadn't traded it in. :-\

RiotSloth

April 16, 2016, 2:05 am

HTC 10 has aptX, but is it HD? Not that it makes much difference to me, aptX is way good enough for my tired ears!

Paul M

May 1, 2016, 5:43 pm

I doubt it will; no iPhone to my knowledge has supported even aptX. Apple have their own streaming system over Wi-Fi.

woswas denni

May 14, 2016, 10:27 pm

It doesn't make a difference; regular aptX is already the maximum that's usable on headphones, so don't worry about it. The HD is just a marketing thing.
The important thing is that aptX finally gets broad and easy-to-use support.

woswas denni

May 14, 2016, 10:37 pm

While I mostly agree with you, 99% of the usual consumer hardware won't profit from more than 16-bit. The more important thing is that aptX finally gets the distribution it deserves; 16-bit, 24-bit, fuck it, it's better than csp any time, and almost no more lag.

Pete

May 14, 2016, 11:27 pm

Guys, sorry, but we really shouldn't picture a digital conversion as a series of points; it's more complicated than that, even though it seems the logical way. It's necessary to understand what a convolution is, the Fourier transform, the use of a sinc function as a reconstruction filter due to its inverse Fourier transform, the step function, etc... Then we can understand that any audio under half the sample rate CAN be EXACTLY reproduced - not triangle-like, not choppy, not discrete... CAN is the key word; not all DACs are well conceived. Me, it's the frequencies ABOVE Nyquist that get lost in the process that I'm interested in keeping.

Michael F

September 20, 2016, 1:37 am

To avoid those distortions it is necessary to record and process above 44 kHz (96 kHz is plenty), but it is not necessary to play back above 44 kHz. However, there is nothing to be gained by using 24-bit audio over 16-bit except on very high-end equipment that can reproduce both very loud and very quiet sounds without massive distortion - movie theaters and concert venues, for example.

Paul Smith

September 24, 2016, 7:28 am

LDAC does a better job in blind listening tests.

SoNic67

November 27, 2016, 2:35 pm

Really does it? Can you provide a link?

SoNic67

November 27, 2016, 2:37 pm

On xda developers forum, somebody created a mod (for rooted devices) that supposedly enables aptX-HD codec for any Qualcomm-based phone that already supports the regular aptX within the stock firmware.
