【Question Title】: Sample Buffer Delegate Swift 2 For Real Time Video Filter
【Posted】: 2016-03-30 15:55:27
【Problem Description】:

I'm trying to build a light-intensity reader in Swift using the iPhone's camera. The idea is that it takes the intensity component of every pixel and averages them to give me a single value. I don't need a preview from the camera. I've been piecing together several tutorials to try to get this working, and so far have come up with the code below. camDeviceSetup() runs in viewDidLoad, and cameraSetup() runs on a button press.

I'm getting an error on the line starting with "videoDeviceOutput!.setSampleBufferDelegate", saying that it cannot convert a value of type FirstViewController (the view controller) to the expected argument type.

let captureSession = AVCaptureSession()
// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?
var videoDeviceOutput: AVCaptureVideoDataOutput?
// AVCaptureVideoPreviewLayer is a subclass of CALayer that you use to display video as it is being captured by an input device.
var previewLayer = AVCaptureVideoPreviewLayer()

func camDeviceSetup() {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera
            if(device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if captureDevice != nil {
        // Handle the throwing initializer with do/catch instead of checking a never-assigned NSError.
        do {
            captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
        } catch let error as NSError {
            print("error: \(error.localizedDescription)")
        }
    }
}

func cameraSetup() {
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    videoDeviceOutput = AVCaptureVideoDataOutput()
    videoDeviceOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
    videoDeviceOutput!.alwaysDiscardsLateVideoFrames = true

//This is the line that produces the error, and I'm not sure why
    videoDeviceOutput!.setSampleBufferDelegate(self, queue: dispatch_queue_create("VideoBuffer", DISPATCH_QUEUE_SERIAL))

    if captureSession.canAddOutput(videoDeviceOutput) {
        captureSession.addOutput(videoDeviceOutput)
    }

    captureSession.startRunning() 
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Think once the delegate is correctly set my algorithm for finding light intensity goes here

}
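
For reference, a rough sketch of how that averaging step could look once the delegate is firing, using a hypothetical averageLuminance(_:) helper and assuming the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format configured above, where plane 0 holds one 8-bit luma (intensity) sample per pixel (Swift 2 syntax to match the question):

// Illustrative sketch only: averages the luma (Y) plane of a bi-planar 4:2:0 frame.
func averageLuminance(sampleBuffer: CMSampleBuffer) -> Double? {
    // Get the pixel buffer backing this video frame.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, 0) }

    // Plane 0 of 420YpCbCr8BiPlanarFullRange is the luma (Y) plane.
    let base = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0))
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)

    // Sum every luma sample, skipping any padding bytes at the end of each row.
    var total: UInt64 = 0
    for row in 0..<height {
        for col in 0..<width {
            total += UInt64(base[row * bytesPerRow + col])
        }
    }
    return Double(total) / Double(width * height)
}

Inside captureOutput this could be called as let intensity = averageLuminance(sampleBuffer), with the result dispatched back to the main queue if it needs to update the UI.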

【Question Discussion】:

  • The problem with that line was that I hadn't declared AVCaptureVideoDataOutputSampleBufferDelegate in the class declaration at the top of the view controller.

Tags: ios swift camera avfoundation avcapturesession


【Solution 1】:

The problem with that line was that I hadn't declared AVCaptureVideoDataOutputSampleBufferDelegate in the class declaration at the top of the view controller.
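
In other words, the view controller has to adopt the protocol in its class declaration so that self matches the delegate parameter type expected by setSampleBufferDelegate(_:queue:). A minimal sketch of the fix, using the FirstViewController name from the question (Swift 2 syntax):

import UIKit
import AVFoundation

// Adopting AVCaptureVideoDataOutputSampleBufferDelegate lets `self` be passed to setSampleBufferDelegate.
class FirstViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // ... the properties, camDeviceSetup(), cameraSetup(), and captureOutput(...) from the question go here unchanged ...
}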

【Discussion】:

  • Did you assign AVCaptureVideoDataOutputSampleBufferDelegate to the VC?