【Question Title】:How can I capture an image while AVPlayer is playing an m3u8 stream?
【Posted】:2015-04-02 02:28:49
【Question】:

I'm playing an m3u8 file with AVPlayer, and I'm trying to capture an image with this code:

AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:self.player.currentItem.asset];
gen.appliesPreferredTrackTransform = YES;
NSError *error = nil;
CMTime actualTime;
CMTime now = self.player.currentTime;
[gen setRequestedTimeToleranceAfter:kCMTimeZero];
[gen setRequestedTimeToleranceBefore:kCMTimeZero];
CGImageRef image = [gen copyCGImageAtTime:now actualTime:&actualTime error:&error];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
NSLog(@"%f , %f",CMTimeGetSeconds(now),CMTimeGetSeconds(actualTime));

NSLog(@"%@",error);
if (image) {
    CFRelease(image);
}

But it doesn't work. The error is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7fadf25f59f0 {NSUnderlyingError=0x7fadf25f1670 "The operation couldn’t be completed. (OSStatus error -12782.)", NSLocalizedFailureReason=An unknown error occurred (-12782), NSLocalizedDescription=The operation could not be completed}

How can I fix this?
Thanks a lot.

【Question Comments】:

    Tags: ios image screenshot avplayer m3u8


    【Solution 1】:

    AVAssetImageGenerator probably needs a local asset. You may have better luck adding an AVPlayerItemVideoOutput to your AVPlayer, seeking to the desired position, and calling copyPixelBufferForItemTime:itemTimeForDisplay: on the video output.
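
    A sketch of that suggestion in Swift (not from the original answer; it assumes `player` is your AVPlayer and its current item is the HLS stream):

```swift
import AVFoundation
import UIKit

// Create a video output that vends BGRA pixel buffers and attach it
// to the item that is playing the m3u8 stream.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes:
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
player.currentItem?.add(videoOutput)

// Later, at the moment you want a frame:
let time = player.currentTime()
if videoOutput.hasNewPixelBuffer(forItemTime: time),
   let buffer = videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) {
    let frame = UIImage(ciImage: CIImage(cvPixelBuffer: buffer))
    // `frame` now holds the current video frame
}
```

    Unlike AVAssetImageGenerator, this reads frames from the buffers the player is already decoding, so it works for streamed (non-local) content.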

    【Discussion】:

      【Solution 2】:

      I solved the same problem with the code below. You can use it as follows:

      Properties

      @property (strong, nonatomic) AVPlayer *player;
      @property (strong, nonatomic) AVPlayerItem *playerItem;
      @property (strong, nonatomic) AVPlayerItemVideoOutput *videoOutput;
      

      Initialization

      NSDictionary *attributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
      self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
      AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
      self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
      [self.playerItem addOutput:self.videoOutput]; // attach the output before playback starts
      self.player = [AVPlayer playerWithPlayerItem:_playerItem];
      

      Get the image

      CMTime currentTime = _player.currentItem.currentTime;
      CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
      if (buffer) {
          CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
          UIImage *image = [UIImage imageWithCIImage:ciImage];
          CVBufferRelease(buffer); // copyPixelBufferForItemTime: follows the Create rule, so release it
          //Use image^^
      }
      

      【讨论】:

      • How do you initialize videoOutput?
      【Solution 3】:

      Capture an image from an AVPlayer playing an HLS video:

      private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])
      
      private let jpegCompressionQuality: CGFloat = 0.7
      
      private func imageFromCurrentPlayerContext() {
          guard let player = player else { return }
          let currentTime: CMTime = player.currentTime()
      
          guard let buffer: CVPixelBuffer = videoOutput.copyPixelBuffer(forItemTime: currentTime, itemTimeForDisplay: nil) else { return }
          let ciImage: CIImage = CIImage(cvPixelBuffer: buffer)
          let context: CIContext = CIContext(options: nil)
      
          guard let cgImage: CGImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
          let image: UIImage = UIImage(cgImage: cgImage)
      
          guard let jpegImage: Data = UIImageJPEGRepresentation(image, jpegCompressionQuality) else { return }
          // be happy
      }
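
      Note that the snippet above assumes `videoOutput` has already been attached to the playing AVPlayerItem; `copyPixelBuffer(forItemTime:itemTimeForDisplay:)` returns nil otherwise. A minimal sketch of that wiring (the `url` name is a placeholder for your stream URL):

```swift
import AVFoundation

let item = AVPlayerItem(url: url)
item.add(videoOutput) // attach before requesting frames
let player = AVPlayer(playerItem: item)
player.play()
```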
      

      【讨论】:

      • Can you explain how the video output object knows which player or player item to use?