【Question Title】: Cannot capture dual photo from dual camera on iPhone X
【Posted】: 2019-03-28 22:24:19
【Question】:

I am trying to capture from the telephoto and wide-angle cameras on an iPhone X simultaneously. This is how I initialize the device:

let captureDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)

and I request dual photo delivery from the AVCapturePhotoOutput:

let photoSettings = AVCapturePhotoSettings()

photoSettings.isDualCameraDualPhotoDeliveryEnabled = true

capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)

However, I hit this error:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCapturePhotoOutput setDualCameraDualPhotoDeliveryEnabled:] Dual Camera dual photo delivery is not supported in this configuration'

Do I need to enable or disable some other setting?

【Discussion】:

  • Did you ever find a solution to this?
  • No. Did you run into the same problem?
  • Maybe you need to change some other settings, e.g. set the isAutoDualCameraFusionEnabled and isAutoStillImageStabilizationEnabled properties to false and the flashMode property to off, and possibly change the requested photo dimensions.
  • Same problem :( I really don't know what to do... Any ideas?
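The settings suggested in the comment above could be applied like this (a sketch in Swift matching the asker's code; these property names are the iOS 11/12 API, and whether these tweaks are actually required is unconfirmed):

```swift
import AVFoundation

// Sketch only: applies the settings suggested in the comment above.
// Requires a dual-camera iOS device; cannot run outside an app.
func makeDualPhotoSettings() -> AVCapturePhotoSettings {
    let photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
    photoSettings.isDualCameraDualPhotoDeliveryEnabled = true
    photoSettings.isAutoDualCameraFusionEnabled = false        // per the suggestion: no lens fusion
    photoSettings.isAutoStillImageStabilizationEnabled = false // per the suggestion: no auto SIS
    photoSettings.flashMode = .off                             // per the suggestion: flash off
    return photoSettings
}
```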

Tags: ios swift ios11


【Solution 1】:

You have to make sure the capture device, capture session, and capture output are all configured correctly:

  1. Get the capture device with the following settings (already correct in your code): AVCaptureDeviceTypeBuiltInDualCamera, AVMediaTypeVideo, AVCaptureDevicePositionBack

  2. Create an AVCaptureDeviceInput with the device you retrieved in step 1.

  3. Create an AVCaptureSession and set its sessionPreset to AVCaptureSessionPresetPhoto
  4. Create an AVCapturePhotoOutput
  5. Add the AVCaptureDeviceInput and AVCapturePhotoOutput you created to the AVCaptureSession
  6. Set the AVCapturePhotoOutput's dualCameraDualPhotoDeliveryEnabled to YES
  7. Start the capture session

The corresponding code (Objective-C):

// Create capture device discovery session to retrieve devices matching our
// needs
// -------------------------------------------------------------------------------
AVCaptureDeviceDiscoverySession*    captureDeviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInDualCamera]
                                                                                                                           mediaType:AVMediaTypeVideo
                                                                                                                            position:AVCaptureDevicePositionBack];

// Loop through the retrieved devices and select correct one
// -------------------------------------------------------------------------------
for(AVCaptureDevice* device in [captureDeviceDiscoverySession devices])
{
    if(device.position == AVCaptureDevicePositionBack)
    {
        self.captureDevice = device;
        break;
    }
}

// Get camera input
// -------------------------------------------------------------------------------
NSError*                error               = nil;
AVCaptureDeviceInput*   videoDeviceInput    = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];

if(!videoDeviceInput)
{
    NSLog(@"Could not retrieve camera input, error: %@", error);
    return;
}

// Initialize capture session
// -------------------------------------------------------------------------------   
self.captureSession                 = [[AVCaptureSession alloc] init];
self.captureSession.sessionPreset   = AVCaptureSessionPresetPhoto;

// Add video device input and photo data output to our capture session
// -------------------------------------------------------------------------------
self.captureOutput  = [AVCapturePhotoOutput new];
[self.captureSession beginConfiguration];
if(![self.captureSession canAddOutput:self.captureOutput])
{
    NSLog(@"Cannot add photo output!");
    [self.captureSession commitConfiguration]; // balance beginConfiguration before bailing out
    return;
}

[self.captureSession addInput:videoDeviceInput];
[self.captureSession addOutput:self.captureOutput];
[self.captureSession commitConfiguration];

// Configure output settings AFTER input & output have been added to the session
// -------------------------------------------------------------------------------
if(self.captureOutput.isDualCameraDualPhotoDeliverySupported)
    self.captureOutput.dualCameraDualPhotoDeliveryEnabled = YES;

// Create video preview layer for this session, and add it to our preview UIView
// -------------------------------------------------------------------------------
AVCaptureVideoPreviewLayer* videoPreviewLayer   = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
videoPreviewLayer.videoGravity                  = AVLayerVideoGravityResizeAspect;
videoPreviewLayer.frame                         = self.view.bounds;
[self.view.layer addSublayer:videoPreviewLayer];

// Start capturing session
// -------------------------------------------------------------------------------
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
    [self.captureSession startRunning];
});

  8. Later, for each photo you request from the AVCapturePhotoOutput, use an AVCapturePhotoSettings object with dualCameraDualPhotoDeliveryEnabled set to YES

Code (Objective-C):

AVCapturePhotoSettings* settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey: AVVideoCodecTypeJPEG}];
settings.dualCameraDualPhotoDeliveryEnabled = YES;
[self.captureOutput capturePhotoWithSettings:settings delegate:self];
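Since the question is in Swift, the same configuration can be sketched in Swift as well (a rough, untested equivalent of the Objective-C above; it needs a dual-camera device and camera permission, so it cannot run outside an iOS app, and the class/queue names are illustrative):

```swift
import AVFoundation

// Sketch: Swift equivalent of the Objective-C configuration above.
final class DualCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: device)

        session.beginConfiguration()
        session.sessionPreset = .photo
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()

        // The key point of the answer: enable dual delivery on the OUTPUT
        // only after input and output have been added to the session.
        if photoOutput.isDualCameraDualPhotoDeliverySupported {
            photoOutput.isDualCameraDualPhotoDeliveryEnabled = true
        }

        DispatchQueue.global(qos: .userInitiated).async { self.session.startRunning() }
    }

    func capture() {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        settings.isDualCameraDualPhotoDeliveryEnabled = true
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // With dual delivery active, this is called once per lens.
    }
}
```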

【Discussion】:
