How to capture depth data from the camera in iOS 11 and Swift 4?

I am trying to get depth data from the camera in iOS 11 using AVDepthData. When I set up a photoOutput with AVCapturePhotoCaptureDelegate, photo.depthData is nil.

So I tried setting up an AVCaptureDepthDataOutput with an AVCaptureDepthDataOutputDelegate, but I don't know how to capture a depth photo from it.

Has anyone managed to get an image out of AVDepthData?

Here is the code I tried:

// delegates: AVCapturePhotoCaptureDelegate & AVCaptureDepthDataOutputDelegate

@IBOutlet var image_view: UIImageView!
@IBOutlet var capture_button: UIButton!

var captureSession: AVCaptureSession?
var sessionOutput: AVCapturePhotoOutput?
var depthOutput: AVCaptureDepthDataOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

@IBAction func capture(_ sender: Any) {
    self.sessionOutput?.capturePhoto(with: AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg]), delegate: self)
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    self.previewLayer?.removeFromSuperlayer()
    self.image_view.image = UIImage(data: photo.fileDataRepresentation()!)
    let depth_map = photo.depthData?.depthDataMap
    print("depth_map:", depth_map) // is nil
}

func depthDataOutput(_ output: AVCaptureDepthDataOutput, didOutput depthData: AVDepthData, timestamp: CMTime, connection: AVCaptureConnection) {
    print("depth data") // never called
}

override func viewDidLoad() {
    super.viewDidLoad()

    self.captureSession = AVCaptureSession()
    self.captureSession?.sessionPreset = .photo

    self.sessionOutput = AVCapturePhotoOutput()
    self.depthOutput = AVCaptureDepthDataOutput()
    self.depthOutput?.setDelegate(self, callbackQueue: DispatchQueue(label: "depth queue"))

    do {
        let device = AVCaptureDevice.default(for: .video)
        let input = try AVCaptureDeviceInput(device: device!)
        if (self.captureSession?.canAddInput(input))! {
            self.captureSession?.addInput(input)
            if (self.captureSession?.canAddOutput(self.sessionOutput!))! {
                self.captureSession?.addOutput(self.sessionOutput!)
                if (self.captureSession?.canAddOutput(self.depthOutput!))! {
                    self.captureSession?.addOutput(self.depthOutput!)
                    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession!)
                    self.previewLayer?.frame = self.image_view.bounds
                    self.previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                    self.previewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
                    self.image_view.layer.addSublayer(self.previewLayer!)
                }
            }
        }
    } catch {}

    self.captureSession?.startRunning()
}

I am trying two approaches here: in the first, the depth data is nil; in the second, the depth delegate method is never called.

Does anyone know what I'm missing?

Answer:

First of all, you need to use the dual camera; otherwise you won't get any depth data.

let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)

And keep a reference to your queue:

let dataOutputQueue = DispatchQueue(label: "data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)

You will probably also want to synchronize the video and depth data:

var outputSynchronizer: AVCaptureDataOutputSynchronizer?

Then you can synchronize the two outputs in your viewDidLoad() method like this:

if sessionOutput?.isDepthDataDeliverySupported {
    sessionOutput?.isDepthDataDeliveryEnabled = true
    depthDataOutput?.connection(with: .depthData)!.isEnabled = true
    depthDataOutput?.isFilteringEnabled = true
    outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [sessionOutput!, depthDataOutput!])
    outputSynchronizer!.setDelegate(self, queue: self.dataOutputQueue)
}
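To actually receive the synchronized frames, the view controller also needs to conform to AVCaptureDataOutputSynchronizerDelegate and implement its callback. A minimal sketch (the `ViewController` type name and the `depthDataOutput` property are assumptions matching the setup above):

```swift
import AVFoundation

extension ViewController: AVCaptureDataOutputSynchronizerDelegate {
    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        // Look up the entry belonging to the depth output; it may be absent
        // or dropped for any given frame.
        guard let depthOutput = self.depthDataOutput,
              let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthOutput)
                  as? AVCaptureSynchronizedDepthData,
              !syncedDepth.depthDataWasDropped else { return }

        let depthData = syncedDepth.depthData
        // depthDataMap is a CVPixelBuffer of per-pixel disparity/depth values.
        print("depth map:", depthData.depthDataMap)
    }
}
```

Note that this callback fires on the `dataOutputQueue` passed to `setDelegate(_:queue:)`, so any UI work must be dispatched back to the main queue.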

I recommend watching WWDC session 507; Apple also provides a sample app that does exactly what you want.

https://developer.apple.com/videos/play/wwdc2017/507/
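Once you have an AVDepthData, turning its depthDataMap into something you can display takes a little extra work, because the pixel buffer holds 32-bit floats rather than RGB. A rough sketch of one way to normalize it into a grayscale UIImage (the helper name is mine, not an API):

```swift
import AVFoundation
import UIKit

// Hypothetical helper: render a depth/disparity map as a grayscale UIImage.
func depthMapImage(from depthData: AVDepthData) -> UIImage? {
    // Convert to 32-bit disparity so every pixel is a Float32.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    let buffer = converted.depthDataMap

    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let width = CVPixelBufferGetWidth(buffer)
    let height = CVPixelBufferGetHeight(buffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }

    // First pass: find the finite min/max so we can normalize.
    var minV = Float.greatestFiniteMagnitude
    var maxV = -Float.greatestFiniteMagnitude
    for row in 0..<height {
        let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for col in 0..<width where rowPtr[col].isFinite {
            minV = min(minV, rowPtr[col])
            maxV = max(maxV, rowPtr[col])
        }
    }
    let range = max(maxV - minV, .leastNormalMagnitude)

    // Second pass: map each float to an 8-bit gray value.
    var pixels = [UInt8](repeating: 0, count: width * height)
    for row in 0..<height {
        let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for col in 0..<width {
            let v = rowPtr[col].isFinite ? (rowPtr[col] - minV) / range : 0
            pixels[row * width + col] = UInt8(max(0, min(255, v * 255)))
        }
    }

    guard let context = CGContext(data: &pixels, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: width,
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue),
          let cgImage = context.makeImage() else { return nil }
    return UIImage(cgImage: cgImage)
}
```

Using disparity (rather than depth) means nearer objects get larger values; swap in `kCVPixelFormatType_DepthFloat32` if you prefer the opposite convention.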
