To answer your question directly: no, there is no way to save an image from an AVCaptureMetadataOutput instance.
However, as codingVoldemort's excellent example shows, you can create an AVCaptureStillImageOutput instance and add it to your AVCaptureSession's outputs. As soon as your app detects some metadata, you can trigger a capture on that AVCaptureStillImageOutput instance.
Here is a more explicit solution, using codingVoldemort's initial code as a starting point.
First, wherever you set up your AVCaptureSession, add an AVCaptureStillImageOutput to it:
_session = [[AVCaptureSession alloc] init];
_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[_session addOutput:_stillImageOutput];
Now, in -captureOutput:didOutputMetadataObjects:fromConnection:, you can capture a still image whenever that method fires:
AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[stillImageConnection setVideoScaleAndCropFactor:1.0f];
// AVCaptureStillImageOutput only supports AVVideoCodecKey (and the pixel-format
// key) in its outputSettings; keys such as AVVideoQualityKey are not accepted.
[_stillImageOutput setOutputSettings:@{AVVideoCodecKey: AVVideoCodecJPEG}];
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (error) {
        NSLog(@"error: %@", error);
    }
    else {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Grab the image here.
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            // Update UI if necessary.
        });
    }
}];
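For context, here is a minimal sketch of the full delegate method the capture call above lives in. The `_didCapture` guard flag is a hypothetical addition (not in the original code) to keep the still-image capture from being triggered on every metadata callback, since the delegate can fire many times per second while a code stays in frame:

```objectivec
// Sketch only: assumes _stillImageOutput is the ivar created at session setup,
// and a hypothetical BOOL ivar _didCapture to fire the capture exactly once.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection {
    if (metadataObjects.count == 0 || _didCapture) {
        return; // Nothing detected, or we already captured.
    }
    _didCapture = YES;

    AVCaptureConnection *stillImageConnection =
        [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (!error) {
                NSData *jpegData = [AVCaptureStillImageOutput
                    jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [UIImage imageWithData:jpegData];
                // Use the captured image here.
            }
        }];
}
```

Since you registered the delegate with dispatch_get_main_queue(), this method already runs on the main thread, but the capture completion handler may not, hence the dispatch_async back to the main queue before touching UI.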