【Question Title】: Start recording video with AVCaptureSession
【Posted】: 2018-10-15 20:04:37
【Question】:

I have created an AVCaptureSession and a preview layer, but how do I start recording video?

Here is my code:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];

if ([session canAddInput:deviceInput]) {
    [session addInput:deviceInput];
}

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

CALayer *rootLayer = [[self view] layer];
[rootLayer setMasksToBounds:YES];
CGRect frame = self.view.frame;
[previewLayer setFrame:frame];
[rootLayer insertSublayer:previewLayer atIndex:0];

AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = nil;
[session addOutput:videoOutput];
[session startRunning];

Now I want it to start recording when I press a button, and once I'm done recording, to save the video to the camera roll. How can I do that?

Thanks.

【Comments】:

  • Call startRecordingToOutputFileURL:. Apple provides excellent sample code for this.

Tags: ios objective-c iphone ios7 avfoundation


【Solution 1】:

If you want to record to a file, you can use AVCaptureMovieFileOutput and add it to your AVCaptureSession like this:

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

if ([session canAddOutput:movieFileOutput]) {
    [session addOutput:movieFileOutput];
}

Then start recording to a URL with the AVCaptureMovieFileOutput method startRecordingToOutputFileURL:recordingDelegate:.

Hope that helps,

Daniel
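For example, a button action could toggle recording with that method (a minimal Objective-C sketch, not from the original answer; it assumes `movieFileOutput` is a property holding the output added above, and that the view controller adopts AVCaptureFileOutputRecordingDelegate):

```objective-c
// Hypothetical button action: start recording if idle, stop if already recording.
- (IBAction)recordButtonTapped:(id)sender {
    if (!self.movieFileOutput.isRecording) {
        NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.mov"];
        NSURL *outputURL = [NSURL fileURLWithPath:path];
        // AVCaptureMovieFileOutput will not overwrite an existing file,
        // so remove any stale recording first.
        [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
        [self.movieFileOutput startRecordingToOutputFileURL:outputURL
                                          recordingDelegate:self];
    } else {
        [self.movieFileOutput stopRecording];
    }
}
```

When recording stops, the delegate's captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: callback fires with the file URL.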

【Discussion】:

  • It doesn't work :( Should I override the delegate method, or is it optional? The app crashes when I call the startRecordingToOutputFileURL:recordingDelegate: method.
  • What does the crash say?
  • I'm not using Xcode, so I can't see the log. What could the cause be, though? Should I remove the AVCaptureVideoDataOutput, or leave it there?
【Solution 2】:

On the button press, initialize an AVCaptureMovieFileOutput, then call startRecordingToOutputFileURL with an output file path:

AVCaptureMovieFileOutput *movieFileOutput;

// Initialization
// Start recording to the given path
[movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];


// Implement the delegate callback
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // Use ALAssetsLibrary to save the recorded video to the photo album
    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {

    }];
}
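Note that ALAssetsLibrary was deprecated in iOS 9; on newer systems the Photos framework does the same job. A hedged sketch of the equivalent call inside that delegate callback (not part of the original answer) might look like:

```objective-c
// Requires: @import Photos; and an NSPhotoLibraryAddUsageDescription
// entry in Info.plist. `outputFileURL` is the URL handed to the
// didFinishRecordingToOutputFileAtURL: delegate callback above.
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Create a new photo-library asset from the recorded movie file.
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputFileURL];
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Could not save video to camera roll: %@", error);
    }
}];
```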

【Discussion】:

【Solution 3】:

Try this Swift version:

var session: AVCaptureSession?
var userreponsevideoData = NSData()
var userreponsethumbimageData = NSData()

override func viewDidLoad() {
    super.viewDidLoad()
    createSession()
}

func stopRecording() {
    session?.stopRunning()
}

func createSession() {
    var input: AVCaptureDeviceInput?
    let movieFileOutput = AVCaptureMovieFileOutput()
    session = AVCaptureSession()
    do {
        input = try AVCaptureDeviceInput(device: self.cameraWithPosition(position: .back)!)
    } catch {
        print("camera input error: \(error)")
        return
    }
    session?.addInput(input!)

    videosPreviewLayer = AVCaptureVideoPreviewLayer(session: session!)
    videosPreviewLayer?.frame.size = self.photoPreviewImageView.frame.size
    videosPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
    videosPreviewLayer?.connection?.videoOrientation = .portrait
    photoPreviewImageView.layer.sublayers?.forEach { $0.removeFromSuperlayer() }
    photoPreviewImageView.layer.addSublayer(videosPreviewLayer!)

    switchCameraButton.isHidden = true
    flashButton.isHidden = true
    msgLabel.isHidden = true
    galleryCollectionView.isHidden = true
    timerLabel.isHidden = false

    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    fileURL = documentsURL.appendingPathComponent("temp").appendingPathExtension("mov")
    print("*****fileurl \(String(describing: fileURL))")

    // Limit recordings to 60 seconds (600/10) and require 1 MB of free disk space.
    let maxDuration: CMTime = CMTimeMake(600, 10)
    movieFileOutput.maxRecordedDuration = maxDuration
    movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024
    if self.session!.canAddOutput(movieFileOutput) {
        self.session!.addOutput(movieFileOutput)
    }
    session?.startRunning()
    movieFileOutput.startRecording(to: fileURL!, recordingDelegate: self)
}

func cameraWithPosition(position: AVCaptureDevice.Position) -> AVCaptureDevice? {
    let devices = AVCaptureDevice.devices(for: AVMediaType.video)
    for device in devices {
        if device.position == position {
            return device
        }
    }
    return nil
}
}
extension SwipeGallerymainViewController: AVCaptureFileOutputRecordingDelegate {

    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        print(outputFileURL)
        let filemainurl = outputFileURL

        do {
            // Grab the first frame of the recording as a thumbnail.
            let asset = AVURLAsset(url: filemainurl, options: nil)
            let imgGenerator = AVAssetImageGenerator(asset: asset)
            imgGenerator.appliesPreferredTrackTransform = true
            let cgImage = try imgGenerator.copyCGImage(at: CMTimeMake(0, 1), actualTime: nil)
            let uiImage = UIImage(cgImage: cgImage)
            previewImage = uiImage
            userreponsethumbimageData = NSData(contentsOf: filemainurl)!
            print(userreponsethumbimageData.length)
            print(uiImage)
        } catch let error as NSError {
            print(error)
            return
        }

        // Build a temporary URL for the re-exported video; remove any stale file,
        // since the export session will not overwrite an existing one.
        let videoFileURL = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent("mergeVideo\(arc4random() % 1000)")
            .appendingPathExtension("mp4")
        if FileManager.default.fileExists(atPath: videoFileURL.path) {
            print("exist")
            try? FileManager.default.removeItem(at: videoFileURL)
        }

        let sourceAsset = AVURLAsset(url: filemainurl, options: nil)
        let assetExport = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPresetMediumQuality)!
        // The output file type must match the .mp4 extension chosen above.
        assetExport.outputFileType = AVFileType.mp4
        assetExport.outputURL = videoFileURL
        assetExport.exportAsynchronously { () -> Void in
            switch assetExport.status {
            case .completed:
                DispatchQueue.main.async {
                    do {
                        self.userreponsevideoData = try NSData(contentsOf: videoFileURL, options: NSData.ReadingOptions())
                        print("MB - \(self.userreponsevideoData.length) bytes")
                        self.isVideoLoad = true
                        self.performSegue(withIdentifier: "previewSegue", sender: self)
                    } catch {
                        print(error)
                    }
                }
            case .failed:
                print("failed \(String(describing: assetExport.error))")
            case .cancelled:
                print("cancelled \(String(describing: assetExport.error))")
            default:
                print("complete")
            }
        }
    }

    func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
        print(fileURL)
    }
}
    

【Discussion】:
