Is taking the output MTLTexture from RealityKit 2's `postProcessing` pipeline suitable for writing to an AVAssetWriter, streaming via RTMP, etc?

“Maybe” 🙂

So you can certainly take MTLTextures and convert them (if they’re configured correctly) into CVPixelBuffers for AVFoundation to consume.
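For reference, a minimal sketch of that conversion is below. It assumes a BGRA8, CPU-readable texture (if the texture is private or framebuffer-only you'd first blit it into a shared staging texture) and that the GPU work producing it has already completed. The helper name `makePixelBuffer(from:)` and the `adaptor`/`presentationTime` names in the usage comment are illustrative, not part of RealityKit's or AVFoundation's API.

```swift
import Metal
import CoreVideo
import AVFoundation

// Sketch: copy an MTLTexture (assumed bgra8Unorm and CPU-readable) into a
// newly created CVPixelBuffer that AVFoundation can consume.
func makePixelBuffer(from texture: MTLTexture) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferMetalCompatibilityKey: true,
        kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     texture.width,
                                     texture.height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary,
                                     &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    guard let baseAddress = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)

    // Copy the texture contents into the pixel buffer on the CPU.
    // Only valid once the command buffer that wrote the texture has finished.
    texture.getBytes(baseAddress,
                     bytesPerRow: bytesPerRow,
                     from: MTLRegionMake2D(0, 0, texture.width, texture.height),
                     mipmapLevel: 0)
    return buffer
}

// Usage sketch with an AVAssetWriterInputPixelBufferAdaptor you've set up
// elsewhere (`adaptor` and `presentationTime` are assumed to exist):
//
// if let buffer = makePixelBuffer(from: someOutputTexture),
//    adaptor.assetWriterInput.isReadyForMoreMediaData {
//     adaptor.append(buffer, withPresentationTime: presentationTime)
// }
```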

That said, it’s really not the intended use case of RealityKit's post-processing functionality, and I wouldn’t be surprised if it either doesn’t work as you’d expect or breaks in a future release.

Sounds like a great feature request though - please file one via Bug Reporting.
