【Question】: AVAssetExportSession - The video could not be composed
【Posted】: 2014-11-22 21:22:34
【Description】:

I'm trying to do some basic video composition in Xamarin / MonoTouch. I've had some success, but I'm now stuck on what seems like a fairly simple task.

I record video from the camera in portrait, so I use AVAssetExportSession to rotate the video. I created a layer instruction to rotate the video, and that works well: I can successfully export the video in the correct orientation.

The problem:

As soon as I add the audio track to the export, I always get a failed response with the following error:

Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just in the wrong orientation. Any advice would be much appreciated. Here is my code:

            var composition = new AVMutableComposition();
            var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
            var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
            var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
            var index = 0;
            var renderSize = new SizeF(480, 480);
            var _startTime = CMTime.Zero;
            //AVUrlAsset asset;



            var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
            //var asset = AVAsset.FromUrl(new NSUrl(file, false));


            //create an avassetrack with our asset
            var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
            var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];

            //create a video composition and preset some settings

            NSError error;

            var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };

            compositionTrackAudio.InsertTimeRange(new CMTimeRange
            {
                Start = CMTime.Zero,
                Duration = asset.Duration,
            }, audioTrack, _startTime, out error);

            if (error != null) {
                Debug.WriteLine (error.Description);
            }

            compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);

            if (error != null) {
                Debug.WriteLine (error.Description);
            }

            //create a video instruction


            var transformer = new AVMutableVideoCompositionLayerInstruction
            {
                TrackID = videoTrack.TrackID,
            };

            var audioMix = new AVMutableAudioMix ();
            var mixParameters = new AVMutableAudioMixInputParameters{ 
                TrackID = audioTrack.TrackID
            };

            mixParameters.SetVolumeRamp (1.0f, 1.0f, new CMTimeRange {
                Start = CMTime.Zero,
                Duration = asset.Duration
            });


            audioMix.InputParameters = new [] { mixParameters };
            var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
            //Make sure the square is portrait
            var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
            var finalTransform = t2;

            transformer.SetTransform(finalTransform, CMTime.Zero);
            //add the transformer layer instructions, then add to video composition


            var instruction = new AVMutableVideoCompositionInstruction
            {
                TimeRange = assetTimeRange,
                LayerInstructions = new []{ transformer }
            };
            videoCompositionInstructions[index] = instruction;
            index++;
            _startTime = CMTime.Add(_startTime, asset.Duration);

            var videoComposition = new AVMutableVideoComposition();
            videoComposition.FrameDuration = new CMTime(1 , (int)videoTrack.NominalFrameRate);
            videoComposition.RenderScale = 1;
            videoComposition.Instructions = videoCompositionInstructions;
            videoComposition.RenderSize = renderSize;

            var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);

            var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";

            var outputLocation = new NSUrl(filePath, false);

            exportSession.OutputUrl = outputLocation;
            exportSession.OutputFileType = AVFileType.Mpeg4;
            exportSession.VideoComposition = videoComposition;
            exportSession.AudioMix = audioMix;
            exportSession.ShouldOptimizeForNetworkUse = true;
            exportSession.ExportAsynchronously(() =>
            {
                Debug.WriteLine(exportSession.Status);

                switch (exportSession.Status)
                {

                    case AVAssetExportSessionStatus.Failed:
                        {
                            Debug.WriteLine(exportSession.Error.Description);
                            Debug.WriteLine(exportSession.Error.DebugDescription);
                            break;
                        }
                    case AVAssetExportSessionStatus.Completed:
                        {
                            if (File.Exists(filePath))
                            {
                                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                                Task.Run(async () =>
                                {
                                    await _uploadService.UploadVideo(_videoData);
                                });
                            }
                            break;
                        }
                    case AVAssetExportSessionStatus.Unknown:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Exporting:
                        {
                            break;
                        }
                    case AVAssetExportSessionStatus.Cancelled:
                        {
                            break;
                        }

                }
            });

【Discussion】:

    Tags: ios xamarin.ios avassetexportsession


    【Solution 1】:

    So this was a really silly mistake: because the audio track was added before the video track, the instruction must have been trying to apply the transform to the audio track rather than my video track.
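
    A minimal sketch of the fix described above, keeping the rest of the setup from the question unchanged: create the composition's video track before the audio track.

    ```csharp
    var composition = new AVMutableComposition();

    // Add the VIDEO track first, then the audio track. With the audio
    // track added first, the layer instruction's transform ended up
    // targeting the wrong track and the export failed with
    // "The video could not be composed" (-11841).
    var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
    var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
    ```

    A related gotcha worth checking (not part of this answer): when exporting a composition, the layer instruction's TrackID should normally reference the composition's video track rather than the source asset's track, since the video composition is evaluated against the composition's tracks.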

    【Discussion】:

    • Hi, I'm running into exactly the same error. May I ask what you changed to fix this?
    • Hi @HongZhou, it's been a while, but it was caused by my assets being in the wrong order, as described above. I no longer have the code from when it was working.
    【Solution 2】:

    My problem was forgetting to set the timeRange. It should be like this:

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.layerInstructions = [layer]
    instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)
    

    Note that the end time of AVMutableVideoCompositionInstruction.timeRange must be valid. This is different from AVAssetExportSession.timeRange, whose documentation says:

    The time range to be exported from the source. The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the entire duration of the asset will be exported. You can observe this property using key-value observing.
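
    Translated to the Xamarin C# from the question (a sketch, using the names `transformer` and `composition` from the question's code), the same fix looks like this:

    ```csharp
    // The instruction's TimeRange must be a valid, finite range covering
    // the composed video; leaving it unset (or invalid) produces the
    // -11841 "The video could not be composed" error.
    var instruction = new AVMutableVideoCompositionInstruction
    {
        TimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = composition.Duration },
        LayerInstructions = new[] { transformer }
    };

    // By contrast, AVAssetExportSession.TimeRange defaults to
    // (kCMTimeZero, kCMTimePositiveInfinity) and can safely be left unset.
    ```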

    【Discussion】:
