What is a ToF camera? Understanding the P30 Pro’s 3D camera trick

What is a ToF camera?

A ToF camera is a depth mapping tool. “ToF” stands for time of flight, and refers to the basic concept behind its operation.

The camera measures the time it takes for light to travel to and bounce off objects, using the delay to judge distance. Put one on a phone, as many companies now do, and it can see the world in 3D.

The second-generation Kinect sensor for the Xbox One was the first major commercial implementation of ToF. Relatively few people even acknowledged the use of the technology when the sensor bar was released in 2013.

ToF is finally in the spotlight, thanks to phones. But how does it work? And what can one do?

What is a ToF camera for?

A ToF camera creates a depth map of the scene in its field of view. Photography and augmented reality are the two obvious uses for one.


As of early-mid 2019, though, the future for ToF hardware is not that clear yet. Google is likely to bring ToF depth support to its ARCore augmented reality platform as part of Android 10 “Q”. This has not been detailed or uncovered in the software’s early betas, though.

Whether a ToF camera is used to create depth maps for shallow depth of field photography, its most obvious current use, is up to the manufacturers. Huawei, Samsung, Apple and Google all have their own algorithms that process depth in their phones’ cameras. Huawei’s P30 Pro has some of the best shallow depth of field effects seen in a phone, and does not appear to need its ToF camera at all for “Aperture” blur photos.

How can you tell? Block the ToF camera and there’s no big difference in the background blur effect. The argument that ToF is inherently better than using two cameras and parallax for this kind of photography is flawed anyway.

Why? Resolution is a rarely mentioned technical aspect of ToF cameras. They do have a specific resolution, and it determines the fidelity of the depth mapping. While the latest phones may have greater resolution than the Kinect v2’s 512 x 424 ToF sensor, they almost certainly don’t match that of the primary camera. And that means an imperfect depth map for photography.

This is why augmented reality is a more cogent reason for the sudden explosion of ToF hardware in phones. It can make better use of these cameras’ specific traits.

For a clearer picture of what ToF cameras are likely to be used for in 12 months, look back to the Kinect v2. Tech demo videos show the sensor bar tracking a person’s movement, breaking them down into the articulations of the skeleton’s major joints and letting apps and games turn real people into game characters in real time.

This is not all down to the ToF camera, but the (presumably) relatively low processing overhead of its fine-grained tracking is a key component.

ToF should prove a golden opportunity for augmented reality. While Google and Apple have devised very clever ways to map environments in 3D using conventional hardware, it’s one of the surest ways to make your phone hot and drain its battery quickly. ToF should ease this strain, and be able to track depth at a greater rate, making it easier to track the motion of people in real time, rather than just surfaces.

How does a ToF sensor work?

There are two core parts to a ToF camera, an illuminator and the camera itself. The illuminator floods the field of view of the camera with light of a specific wavelength.

This light is always outside the range of human vision, and tends to be in the near infrared. As such, a ToF system is not reliant on ambient light like a two-camera depth system.

The light travels from the illuminator, bounces off the various surfaces in a scene, and some of it travels back to the sensor. By analysing the phase of the light wave, the ToF camera’s paired processor can determine how long the light took to bounce back.

As the speed of light is functionally constant, calculating the distance of objects becomes relatively simple. A greater phase difference indicates the surface is further away.
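For continuous-wave ToF sensors of the kind used in phones, the relationship described above can be sketched as a simple formula: distance is the speed of light times the phase shift, divided by 4π times the modulation frequency (the light covers the distance twice, out and back). The 20MHz modulation frequency below is an illustrative assumption, not the spec of any particular phone:

```python
import math

C = 299_792_458.0  # speed of light in m/s


def phase_to_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Convert a measured phase shift to a distance in metres.

    The light travels out to the surface and back, so the round trip
    covers twice the distance; hence the factor of 4*pi rather than
    2*pi in the denominator.
    """
    return (C * phase_shift_rad) / (4 * math.pi * mod_freq_hz)


# Assumed 20 MHz modulation frequency; a phase shift of pi radians
# lands the surface halfway through the unambiguous range.
d = phase_to_distance(math.pi, 20e6)
print(round(d, 3))  # ~3.747 metres
```

Note the trade-off this implies: a greater phase difference means a more distant surface, but once the shift wraps past 2π the reading becomes ambiguous, so the modulation frequency sets a maximum unambiguous range.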

The role of a ToF camera is narrower than that of a normal camera, which has to interpret the entire spectrum of visible light. However, it is also more advanced than the “laser” focus aids based on the same concept, seen in phones and cameras for years.

Each pixel of the ToF camera sensor offers a depth reading, rather than just the single point of a laser focus aid. This allows for a relatively high fidelity depth “heat” map that updates at, or beyond, the rate we consciously perceive.
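To illustrate what “a depth reading per pixel” means in practice, here is a toy sketch that turns a small grid of per-pixel phase readings into a depth map, using the standard continuous-wave phase-to-distance relationship. The grid size and modulation frequency are illustrative assumptions:

```python
import math

C = 299_792_458.0   # speed of light in m/s
MOD_FREQ_HZ = 20e6  # assumed illustrative modulation frequency


def depth_map(phase_grid):
    """Convert a 2D grid of per-pixel phase shifts (radians)
    into a 2D grid of distances in metres."""
    return [
        [(C * phase) / (4 * math.pi * MOD_FREQ_HZ) for phase in row]
        for row in phase_grid
    ]


# A tiny 2x2 "sensor": each pixel reports its own phase shift,
# so each pixel gets its own depth value.
phases = [
    [0.5, 1.0],
    [1.5, 2.0],
]
for row in depth_map(phases):
    print([round(d, 2) for d in row])
```

A real sensor does this for every pixel of, say, a 512 x 424 array, many times a second, which is what makes the live depth “heat” map possible.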

If we consider technological development a companion piece to evolution, ToF cameras see phone depth mapping deviate from the way we humans evolved. Our eyes use parallax to judge distance, and our brains are pretty good at making these calculations even with one working eye. ToF takes a different approach, using “invisible” light.

Fact of the day: goldfish can see both infrared light and ultraviolet light, wavelengths longer and shorter, respectively, than those visible to humans.

When will we see proper ToF AR?

As of early-mid 2019 it’s a little hard to tell when we’ll see the full power of ToF phone cameras in action. However, it will likely begin with the launch of Android 10 Q.

That software will get its usual unveil at Google I/O, which takes place on 7-9 May. We will have to wait a little longer to try the final version ourselves.

Big new versions of Android tend to launch inside new Pixel phones. And the Pixel 2 series of phones had ARCore features months before the first wave of third-party models.

Assuming Google keeps to its traditional timescales, and releases Android 10 and the Pixel 4 in October, today’s phones with ToF hardware may not be able to use it fully until 2020.

Which phones have a ToF camera?

Oppo RX17 Pro

The Oppo RX17 Pro was the first phone we saw with a ToF camera. It was important for another reason too: it was among the first Oppo phones given an official release in the UK. While it didn’t quite get top marks, it is fairly good value at £479 nowadays if you don’t mind its use of a lower-end CPU.

Honor View 20

There are no half measures in the £499 Honor View 20. This excellent-value phone has the same CPU as some phones twice the price, the Kirin 980, plus a 48-megapixel main camera and a ToF camera. Honor called it a “3D camera” in its View 20 adverts.

Huawei P30 Pro

This is Huawei’s top phone for the first half of 2019, and it is all about the camera. As well as a ToF sensor, the Huawei P30 Pro has a triple rear camera array: a wide-angle camera, a standard camera and a 5x zoom. Using hybrid zoom you can shoot 10x, or even 50x, images.

LG G8 ThinQ

LG takes a different approach to the ToF camera here. Most manufacturers put it on the back; the LG G8 ThinQ has one on the front. And this makes a lot of sense. While the most ambitious uses for augmented reality rely on the rear camera, the AR features people actually play with the most are largely selfie-based.

Samsung Galaxy S10 5G

Only the ultra-expensive 5G version of the Galaxy S10 has a time-of-flight camera. However, as it’s a higher-end alternative to the S10+, you can expect the price to be very high. Relatively few are likely to buy this phone. But it does prove Samsung considers ToF and 5G hardware important, if not quite essential yet.

Is ToF the future? Tweet us @trustedreviews