What is the Photonic Engine?

Apple has introduced a new iPhone 14 camera feature called the Photonic Engine, which promises better images when light is less than ideal.
The computational photography tech with a very ‘Apple’ name combines the hardware and software elements of the iPhone 14 and iPhone 14 Pro ranges, and Apple says it delivers a ‘giant leap’ in performance for shots taken in mid-to-low light.
Apple says performance is up to 2x better on the main, telephoto and TrueDepth cameras, while the Ultra Wide camera benefits from up to 3x better performance in lower light. The feature exists separately from Night Mode, which iPhone users can already turn on manually to combat poor lighting conditions.
How does the Photonic Engine work?
Apple says this is essentially an expansion of its Deep Fusion image processing technology, which launched with the iPhone 11 in 2019 and has continued to underpin the handset range’s computational photography nous.
Essentially, it uses machine learning to process photos pixel by pixel, which the company says optimises texture, detail and noise throughout the image.
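Apple has not published how Deep Fusion or the Photonic Engine actually work, but the general idea of per-pixel multi-frame processing can be sketched: several aligned exposures are merged one pixel at a time, with each frame’s contribution weighted by how trustworthy that pixel looks. The toy below (names and the weighting scheme are our own illustration, not Apple’s method) weights each pixel by its closeness to the per-pixel median, so noisy outliers contribute less:

```python
import numpy as np

def fuse_frames(frames, sigma=8.0):
    """Toy pixel-wise fusion of several aligned exposures.

    Each pixel in each frame is weighted by how close it sits to the
    per-pixel median across frames, so outliers (sensor noise, slight
    motion) contribute less to the merge. This illustrates the general
    idea of per-pixel multi-frame merging only; it is not Apple's
    actual Deep Fusion / Photonic Engine algorithm.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])  # (n, H, W)
    median = np.median(stack, axis=0)                          # (H, W)
    # Gaussian weight: pixels near the median dominate the merge.
    weights = np.exp(-((stack - median) ** 2) / (2 * sigma ** 2))
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# Example: three noisy 4x4 "frames" of the same flat grey scene.
rng = np.random.default_rng(0)
frames = [100 + rng.normal(0, 5, (4, 4)) for _ in range(3)]
fused = fuse_frames(frames)
```

Merging several frames this way suppresses random noise while keeping detail that is consistent across frames, which is the basic trade computational photography makes in low light.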
What the Photonic Engine does is apply this technology earlier in the imaging process than before, which Apple says ensures a “dramatic increase in quality”: subtle details are preserved and colours rendered better. It also works across all of the cameras on the iPhone 14, iPhone 14 Plus, iPhone 14 Pro and iPhone 14 Pro Max.
On the iPhone 14 Pro, Apple combines this tech with a 48-megapixel main camera, a big step up from the 12-megapixel sensors its iPhones have used until now. In most cases, though, the company uses the larger sensor chiefly to gather more light: standard photos come out at 12 megapixels, with full 48-megapixel shots reserved for ProRAW mode.
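That 4-to-1 reduction is pixel binning: each 2x2 block of sensor pixels is combined into one output pixel, so a 48-megapixel readout becomes a 12-megapixel image whose effective pixels collect roughly four times the light. A minimal sketch of the arithmetic (an illustration of binning in general, not Apple’s sensor pipeline):

```python
import numpy as np

def bin_2x2(sensor):
    """Average each 2x2 block of pixels into one output pixel.

    A 48 MP readout binned this way yields a 12 MP image whose
    effective pixels gather roughly 4x the light, trading resolution
    for signal-to-noise. Illustrative sketch only, not Apple's
    implementation.
    """
    h, w = sensor.shape
    # Reshape so each 2x2 block becomes its own pair of axes, then
    # average over those axes.
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# An 8x8 "sensor" bins down to 4x4 -- the same 4:1 ratio as 48 MP -> 12 MP.
sensor = np.arange(64, dtype=np.float64).reshape(8, 8)
binned = bin_2x2(sensor)
```

Averaging (or summing) the four readings boosts the signal relative to per-pixel read noise, which is why the binned 12-megapixel output performs better in dim conditions than a straight 48-megapixel capture would.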