【Question Title】: Save image from bar code scanner iOS 7
【Posted】: 2014-08-29 15:40:52
【Question Description】:

I have a barcode scanner written with some of the new AVCapture APIs in iOS 7. Everything works well, but after I get the metadata from the capture output I would also like to grab an image. The method below is the delegate where I do the lookup on the SKU and so on, and where I would like to grab the image as well. Is that possible from this method?

 - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
    {
        ...
    }
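
For context, this metadata delegate only delivers decoded metadata objects (such as the barcode payload), not pixel data. A minimal sketch of reading the scanned value inside that callback, using the standard `AVMetadataMachineReadableCodeObject` cast (the logging is illustrative only; no image buffer is available here):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputMetadataObjects:(NSArray *)metadataObjects
           fromConnection:(AVCaptureConnection *)connection
    {
        for (AVMetadataObject *metadata in metadataObjects) {
            if ([metadata isKindOfClass:[AVMetadataMachineReadableCodeObject class]]) {
                // The decoded barcode payload -- e.g. the SKU -- but no pixel data.
                NSString *sku = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
                NSLog(@"Scanned: %@", sku);
            }
        }
    }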

【Question Discussion】:

    Tags: ios ios7 avcapturesession


    【Solution 1】:

    To answer your question specifically: no, there is no way to save an image from an AVCaptureMetadataOutput instance.

    However, as codingVoldemort's excellent example shows, you can create an AVCaptureStillImageOutput instance and add it to your AVCaptureSession's outputs. As soon as your app detects some metadata, you can trigger a capture on that AVCaptureStillImageOutput instance.

    Here is a more explicit solution, using codingVoldemort's initial code as a starting point.

    First, wherever you set up your AVCaptureSession, add an AVCaptureStillImageOutput to it:

    _session = [[AVCaptureSession alloc] init];
    
    _output = [[AVCaptureMetadataOutput alloc] init];
    [_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:_output];
    
    _stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [_session addOutput:_stillImageOutput];
    

    Now, in - captureOutput:didOutputMetadataObjects:fromConnection:, you can capture a still image when that method fires:

    AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [stillImageConnection setVideoScaleAndCropFactor:1.0f];
    _stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG,
                                         AVVideoQualityKey: @1};
    
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                                                      if (error) {
                                                          NSLog(@"error: %@", error);
                                                      }
                                                      else {
                                                          NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                                          UIImage *image =[UIImage imageWithData:jpegData];
                                                          //Grabbing the image here
                                                          dispatch_async(dispatch_get_main_queue(), ^(void) {
    
                                                            //Update UI if necessary.
    
    
                                                          });
    
    
                                                      }
                                                  }
    
     ];
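
    Since the original goal was to save the image, one way to persist the captured frame from the completion handler above -- a sketch, assuming the photo library is the destination (the file name `scan.jpg` is illustrative; alternatively keep only the in-memory UIImage):

        // Inside the completionHandler, after creating `image`:
        UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);
    
        // Or persist the raw JPEG bytes to the app's Documents directory:
        NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                              NSUserDomainMask, YES) firstObject];
        NSString *path = [docs stringByAppendingPathComponent:@"scan.jpg"];
        [jpegData writeToFile:path atomically:YES];

    Note that saving to the photo library prompts the user for permission the first time.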
    

    【Discussion】:

      【Solution 2】:

      Try this method:

      - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
      {
          // Find out the current orientation and tell the still image output.
          AVCaptureConnection *stillImageConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
          UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
          AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
          [stillImageConnection setVideoOrientation:avcaptureOrientation];
          [stillImageConnection setVideoScaleAndCropFactor:1.0f];
          stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG,
                                              AVVideoQualityKey: @1};
      
          [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                                                            if (error) {
                                                                [self displayErrorOnMainQueue:error withMessage:@"Take picture failed"];
                                                            }
                                                            else {
                                                                NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                                                UIImage *image =[UIImage imageWithData:jpegData];
                                                                //Grabbing the image here
                                                                dispatch_async(dispatch_get_main_queue(), ^(void) {
      
                                                                  //Update UI if necessary.
      
      
                                                                });
      
      
                                                            }
                                                        }
      
           ];
      
      }
      

      【Discussion】:

      • Do I call this method from the method where I capture the metadata? I don't see how to get the buffer.
      • It is a delegate method from the AVCaptureVideoDataOutputSampleBufferDelegate protocol. You can read all about it here: developer.apple.com/librarY/mac/documentation/AVFoundation/…
      • Thanks, but is it called after I receive the metadata? I would like to know whether there is any image buffered in the captureOutput object from the first method. Thanks for your help.
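
      To connect the comments above with the code: the sample buffer only arrives through the AVCaptureVideoDataOutputSampleBufferDelegate callback, not through the metadata delegate, so the metadata callback itself carries no image. A hedged sketch of converting the delivered buffer into a UIImage via Core Image (details such as pixel format handling may vary):

          - (void)captureOutput:(AVCaptureOutput *)captureOutput
          didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
                 fromConnection:(AVCaptureConnection *)connection
          {
              CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
              if (!pixelBuffer) return;
              CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
              UIImage *frame = [UIImage imageWithCIImage:ciImage];
              // `frame` now holds the current video frame; keep the latest one
              // around and use it when the metadata delegate reports a scan.
          }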
      【Solution 3】:

      I wanted to translate Tim's answer into Swift =) Here is the first part:

      let session = AVCaptureSession()
      let metadataOutput = AVCaptureMetadataOutput()
      let stillCameraOutput = AVCaptureStillImageOutput()
      let sessionQueue = dispatch_get_main_queue()
      
      metadataOutput.setMetadataObjectsDelegate(self, queue: sessionQueue)
      if session.canAddOutput(metadataOutput) {
          session.addOutput(metadataOutput)
      }
      if session.canAddOutput(stillCameraOutput) {
          session.addOutput(stillCameraOutput)
      }
      

      And here is the second:

      var image = UIImage()
      let stillImageConnection = stillCameraOutput.connectionWithMediaType(AVMediaTypeVideo)
      stillImageConnection.videoOrientation = .Portrait
      stillImageConnection.videoScaleAndCropFactor = 1.0
      stillCameraOutput.captureStillImageAsynchronouslyFromConnection(stillImageConnection, completionHandler: { (imageDataSampleBuffer, error) in
                          if (error != nil) {
                              print("There are some error in capturing image")
                          } else {
                              let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                              image = UIImage(data: jpegData)!
                          }
                      })
      

      【Discussion】:
