The video feed is always overexposed when using ARKit, and enabling HDR for the ARSession doesn't seem to work. Setting videoHDRAllowed to true on ARWorldTrackingConfiguration does not change the video rendering. Also, when accessing the AVCaptureDevice via ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera, activeFormat.isVideoHDRSupported returns false (on an iPhone 12 Pro Max), so I cannot set captureDevice.isVideoHDREnabled to true. Furthermore, when using setExposureModeCustom with the ISO set to activeFormat.minISO, the image rendered by ARKit still has a much higher exposure than the same feed running under an AVCaptureSession.

The use case is ARKit in a basketball arena: with ARKit the court always appears completely white, so we cannot see any players, while with an AVCaptureSession (or just the iOS Camera app) the court and players appear clearly thanks to HDR.
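For reference, here is a minimal sketch of the attempts described above (iOS 16+ APIs; it assumes the session is already running with this configuration, since capture-device changes only take effect on a live session):

```swift
import ARKit
import AVFoundation

let configuration = ARWorldTrackingConfiguration()
configuration.videoHDRAllowed = true  // no visible effect on the rendered feed

if let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera {
    try? device.lockForConfiguration()

    // Returns false on iPhone 12 Pro Max, so HDR cannot be enabled this way:
    if device.activeFormat.isVideoHDRSupported {
        device.automaticallyAdjustsVideoHDREnabled = false
        device.isVideoHDREnabled = true
    }

    // Even at the minimum ISO, ARKit renders the frame far brighter than
    // the same device does under an AVCaptureSession:
    device.setExposureModeCustom(duration: AVCaptureDevice.currentExposureDuration,
                                 iso: device.activeFormat.minISO,
                                 completionHandler: nil)

    device.unlockForConfiguration()
}
```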

Setting videoHDRAllowed to true only means that HDR will be enabled on video formats that support it, and not all video formats do.

In iOS 16, ARVideoFormat has a new property, isVideoHDRSupported. You can filter the configuration's supportedVideoFormats list to find a format where isVideoHDRSupported is true, and set that format as the configuration's videoFormat before running the session.
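In code, that selection might look like this (a minimal sketch; `session` is assumed to be your existing ARSession, e.g. arView.session):

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()

// Pick a video format that can actually render HDR; not every format can.
if let hdrFormat = ARWorldTrackingConfiguration.supportedVideoFormats
    .first(where: { $0.isVideoHDRSupported }) {
    configuration.videoFormat = hdrFormat
    configuration.videoHDRAllowed = true
}

session.run(configuration)
```

If no supported format has isVideoHDRSupported set to true on a given device, the configuration simply falls back to its default videoFormat.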
