【Question Title】: AVCapture Session To Capture Image SWIFT
【Posted】: 2015-07-07 23:16:13
【Question】:

I created an AVCaptureSession that captures video output and displays it to the user through a UIView. Now I want to be able to tap a button (the takePhoto method) and show a still image from the session in a UIImageView. I tried iterating over each device connection and saving the output, but that didn't work. My code is below:

let captureSession = AVCaptureSession()
var stillImageOutput: AVCaptureStillImageOutput!

@IBOutlet var imageView: UIImageView!
@IBOutlet var cameraView: UIView!


// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?

override func viewDidLoad() {
    // Do any additional setup after loading the view, typically from a nib.
    super.viewDidLoad()
    println("I AM AT THE CAMERA")
    captureSession.sessionPreset = AVCaptureSessionPresetLow
    self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if(captureDevice != nil){
        beginSession()
    }
}
func beginSession() {

    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.addOutput(self.stillImageOutput)
    var err : NSError? = nil
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))

    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

@IBAction func takePhoto(sender: UIButton) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer:CMSampleBuffer!, error:NSError!) -> Void in
        var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        var data_image = UIImage(data: image)
        self.imageView.image = data_image
    }
}
}

【Discussion】:

  • Now I'm getting this error: Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** Cannot add output <AVCaptureStillImageOutput: 0x1742221c0> to capture session <AVCaptureSession: 0x17000ae70 [AVCaptureSessionPresetLow]> <AVCaptureDeviceInput: 0x174221840 [Back Camera]> -> <AVCaptureVideoPreviewLayer: 0x1742217e0> <AVCaptureDeviceInput: 0x174221840 [Back Camera]> -> <AVCaptureStillImageOutput: 0x174221da0> because more than one output of the same type is unsupported.'
  • First take out the following line: captureSession.addOutput(self.stillImageOutput) and check where it is being added. That should fix the error.
  • @user3353890 I fixed the error (I updated the code above), but now the preview only shows a static image instead of live video from the camera...?
  • What do you mean by a static image?
  • @user3353890 The cameraView doesn't show the video the camera is capturing; it just stays on whatever still frame the camera first captured when the captureSession started running.
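The exception in the first comment means the same output was added to the session more than once (for example, if beginSession() runs twice). A minimal defensive sketch, using the same Swift 1.x-era AVFoundation API as the question: `AVCaptureSession` exposes `canAddOutput(_:)` and `canAddInput(_:)`, so checking before adding avoids the crash. (The structure below is illustrative, not the asker's exact fix.)

```swift
// Guarded setup: only add an input/output the session will actually accept.
// This prevents the "more than one output of the same type" exception
// if beginSession() is ever called more than once.
if self.captureSession.canAddOutput(self.stillImageOutput) {
    self.captureSession.addOutput(self.stillImageOutput)
}

var err: NSError? = nil
let input = AVCaptureDeviceInput(device: self.captureDevice, error: &err)
if err == nil && self.captureSession.canAddInput(input) {
    self.captureSession.addInput(input)
}
```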

Tags: ios swift avcapturesession


【Solution 1】:

You should try adding the inputs and outputs to the session on a separate thread, before it starts running. Apple's documentation states:

Important: The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for the canonical implementation example.

Try using dispatch in your session-creation method, like this:

var err: NSError? = nil
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), {
    // 1: configure the session off the main queue
    self.captureSession.addOutput(self.stillImageOutput)
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }
    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    previewLayer?.frame = self.cameraView.layer.bounds
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
    dispatch_async(dispatch_get_main_queue(), {
        // 2: UI work (adding the preview layer) belongs on the main queue
        self.cameraView.layer.addSublayer(previewLayer)
        self.captureSession.startRunning()
    })
})
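Note that the snippet above dispatches to a concurrent global queue, while the Apple note it quotes recommends a serial queue. A minimal sketch of the serial-queue variant, in the same Swift 1.x-era GCD API (the queue label "com.example.sessionQueue" is a placeholder, not from the original answer):

```swift
// A dedicated serial queue serializes session configuration off the main
// thread, matching Apple's recommendation quoted above.
let sessionQueue = dispatch_queue_create("com.example.sessionQueue", DISPATCH_QUEUE_SERIAL)

dispatch_async(sessionQueue, {
    // beginConfiguration/commitConfiguration batch the changes atomically.
    self.captureSession.beginConfiguration()
    if self.captureSession.canAddOutput(self.stillImageOutput) {
        self.captureSession.addOutput(self.stillImageOutput)
    }
    self.captureSession.commitConfiguration()
    self.captureSession.startRunning() // blocking call, safe off the main queue
    dispatch_async(dispatch_get_main_queue(), {
        // Only UI work (e.g. attaching the preview layer) returns to main.
    })
})
```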

【Discussion】:
