How to use AVCapturePhotoOutput

I have been using a custom camera, and I recently upgraded to Xcode 8 beta along with Swift 3. I originally had this:

    var stillImageOutput: AVCaptureStillImageOutput?

However, I now get the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

Since this is fairly new, I have not seen much information on it. Here is my current code:

    var captureSession: AVCaptureSession?
    var stillImageOutput: AVCaptureStillImageOutput?
    var previewLayer: AVCaptureVideoPreviewLayer?

    func clickPicture() {
        if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {
            videoConnection.videoOrientation = .portrait
            stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProvider(data: imageData!)
                    let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                    let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
                }
            })
        }
    }

I tried looking at AVCapturePhotoCaptureDelegate, but I am not sure how to use it. Does anyone know how to use this? Thanks.

Answer:

AVCapturePhotoOutput is really easy to use.

You need an AVCapturePhotoCaptureDelegate, which returns the captured photo as a CMSampleBuffer.

If you give the AVCapturePhotoSettings a previewPhotoFormat, you can also get a preview image.

    import AVFoundation
    import UIKit

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print("image: \(String(describing: UIImage(data: dataImage)?.size))") // Your image
            }
        }
    }
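On iOS 11 and later, the sample-buffer-based delegate callback above is itself deprecated in favor of an AVCapturePhoto-based one. A minimal sketch of the newer callback, assuming it replaces the method above inside the same delegate class:

```swift
// Sketch: iOS 11+ delegate callback. AVCapturePhoto carries the image
// data directly, so no JPEG sample-buffer conversion is needed.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
        return
    }
    // fileDataRepresentation() returns the processed photo as Data.
    if let data = photo.fileDataRepresentation(), let image = UIImage(data: data) {
        print("image: \(image.size)")
    }
}
```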

For more information, visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: you must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like `session.addOutput(output)`, and then `output.capturePhoto(with: settings, delegate: self)`. Thanks @BigHeadCreations.
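The session setup that note describes could be sketched as follows. This is an illustrative sketch, not part of the original answer: the property names and the `setupSession()` helper are assumptions, and it must run on a real device with camera permission granted.

```swift
import AVFoundation

// Illustrative wiring of input, output, and session (names are assumptions).
let session = AVCaptureSession()
let cameraOutput = AVCapturePhotoOutput()

func setupSession() {
    session.beginConfiguration()
    // Add the back camera as the video input.
    if let device = AVCaptureDevice.default(for: .video),
       let input = try? AVCaptureDeviceInput(device: device),
       session.canAddInput(input) {
        session.addInput(input)
    }
    // Add the photo output *before* calling capturePhoto(with:delegate:).
    if session.canAddOutput(cameraOutput) {
        session.addOutput(cameraOutput)
    }
    session.commitConfiguration()
    session.startRunning()
}
```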
