【Title】: Is there a way to capture a screenshot of the view within AVPlayerLayer?
【Posted】: 2020-02-04 19:55:51
【Question】:

I'm currently building a video collage app. I have a view that contains an AVPlayerLayer as a sublayer, and I need a screenshot of that view. When I capture it, I do get a screenshot, but the AVPlayerLayer (where the video is playing) is missing from it: just a black area. On the Simulator it works perfectly and the layer is rendered, but on a real device it's just black.

I've tried every solution I could find on Stack Overflow and in Apple's developer documentation, but none of them worked.

Some of the solutions I've tried:

swift: How to take screenshot of AVPlayerLayer()

Screenshot for AVPlayer and Video

https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer

As you can see in my code, it should capture an image of the view, but the AVPlayerLayer part doesn't come through.

- (UIImage *)imageFromView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0.0);

    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSString *fileExtension = @"png";
    NSData *data;
    BOOL isOutputJPEG = NO;

    if (isOutputJPEG) {
        data = UIImageJPEGRepresentation(image, 0.5);
        fileExtension = @"jpg";
    } else {
        data = UIImagePNGRepresentation(image);
    }

    UIImage *rasterizedView = [UIImage imageWithData:data];
    return rasterizedView;
}

// In the view controller:

UIImage *image = [self imageFromView:recordingView];

I'm getting a bit desperate now, because nothing seems to work for AVPlayerLayer. When I inspect the image generated on a real device, it only shows the view; on the Simulator it works as I expect.

【Comments】:

    Tags: ios objective-c uiimage avfoundation avplayerlayer


    【Solution 1】:

    There are many ways to achieve what you want to do. I've found that using an asset image generator always works.

    - (NSImage *)getImageFromAsset:(AVAsset *)myAsset width:(int)theWidth height:(int)theHeight {
        Float64 durationSeconds = CMTimeGetSeconds(myAsset.duration);

        /// Change the frametimetoget section to your specific needs ///
        CMTime frametimetoget;
        if (durationSeconds <= 20) {
            frametimetoget = CMTimeMakeWithSeconds(durationSeconds / 2, 600);
        } else {
            frametimetoget = CMTimeMakeWithSeconds(10, 600);
        }

        AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
        imageGenerator.appliesPreferredTrackTransform = YES;
        imageGenerator.maximumSize = CGSizeMake(theWidth, theHeight);
        imageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

        /// NSError not handled in this example; you would have to add code ///
        NSError *error = nil;
        CMTime actualTime;
        CGImageRef frameImage = [imageGenerator copyCGImageAtTime:frametimetoget actualTime:&actualTime error:&error];

        Float64 myImageWidth = CGImageGetWidth(frameImage);
        Float64 myImageHeight = CGImageGetHeight(frameImage);
        Float64 ratio = myImageWidth / theWidth;
        NSSize imageSize;
        imageSize.width = myImageWidth / ratio;
        imageSize.height = myImageHeight / ratio;

        /// You may choose to use CGImage and skip below
        /// Swap out NSImage (macOS) for the iOS equivalent (UIImage)
        NSImage *thumbNail = [[NSImage alloc] initWithCGImage:frameImage size:imageSize];

        /// CGImageRelease is a must to avoid memory leaks
        CGImageRelease(frameImage);
        return thumbNail;
    }
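    On iOS, a minimal sketch of the same idea might look like the following. This is an assumption-laden adaptation, not the answerer's code: `SnapshotOfCurrentFrame` is a hypothetical helper name, `player` is assumed to be the AVPlayer driving your AVPlayerLayer, and the snapshot is taken at `player.currentTime` so it approximates what the layer is showing.

    ```objectivec
    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    /// Hypothetical helper: grabs roughly the frame the player is showing.
    UIImage *SnapshotOfCurrentFrame(AVPlayer *player) {
        AVAsset *asset = player.currentItem.asset;
        AVAssetImageGenerator *generator =
            [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
        generator.appliesPreferredTrackTransform = YES;
        // Tight tolerances so the frame matches currentTime closely.
        generator.requestedTimeToleranceBefore = kCMTimeZero;
        generator.requestedTimeToleranceAfter = kCMTimeZero;

        NSError *error = nil;
        CMTime actualTime;
        CGImageRef cgImage = [generator copyCGImageAtTime:player.currentTime
                                               actualTime:&actualTime
                                                    error:&error];
        if (!cgImage) {
            NSLog(@"Snapshot failed: %@", error);
            return nil;
        }
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);  // a must, to avoid leaking the CGImage
        return image;
    }
    ```

    You could then composite the returned UIImage into your view snapshot where the AVPlayerLayer would otherwise render black. Note this works for file-based assets; it will not capture frames from live streams.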

    【Comments】:
