This is a bit annoying, but if you don't want to lose quality I believe you need to recreate the video in an AVComposition. I'd love to know if there is another way, but this is what I came up with. Technically you can export the video via AVAssetExportSession, but using the passthrough preset yields the same video file, which won't be slow motion — you'd need to transcode it, which lowers the quality (AFAIK; see the solution in Issue playing slow-mo AVAsset in AVPlayer).
The first thing you need to do is grab the original time-mapping objects of the source media. You can do that like so:
let options = PHVideoRequestOptions()
options.version = PHVideoRequestOptionsVersion.current
options.deliveryMode = .highQualityFormat

PHImageManager().requestAVAsset(forVideo: phAsset, options: options, resultHandler: { (avAsset, mix, info) in
    guard let avAsset = avAsset else { return }
    // Each track segment carries a CMTimeMapping that describes how its
    // source time range is mapped onto the playback timeline.
    let originalTimeMaps = avAsset.tracks(withMediaType: AVMediaTypeVideo)
        .first?
        .segments
        .map { $0.timeMapping } ?? []
})
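To make the mapping concrete, here is a small self-contained sketch (plain Swift, no AVFoundation; the frame rates are made-up illustration numbers, not values from your asset) of the relationship a slow-motion CMTimeMapping encodes — a short source range stretched onto a longer target range:

```swift
// Hypothetical numbers: footage captured at 240 fps but played back at
// 30 fps stretches every source second onto 8 seconds of the timeline.
let captureFrameRate = 240.0
let playbackFrameRate = 30.0

// A CMTimeMapping pairs a source range with a target range; for slow motion,
// targetDuration = sourceDuration * (captureFrameRate / playbackFrameRate).
let sourceDurationSeconds = 1.0
let targetDurationSeconds = sourceDurationSeconds * (captureFrameRate / playbackFrameRate)

print(targetDurationSeconds) // 8.0
```

In the real API these two ranges are the `source` and `target` CMTimeRange values on each AVAssetTrackSegment's `timeMapping`.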
Once you have the timeMappings of the original media (the one sitting in your documents directory), you can pass in that media's URL and the original CMTimeMapping objects you want to recreate, and build a new AVComposition that is playable in an AVPlayer. You'll need a class similar to this:
class CompositionMapper {

    let url: URL
    let timeMappings: [CMTimeMapping]

    init(for url: URL, with timeMappings: [CMTimeMapping]) {
        self.url = url
        self.timeMappings = timeMappings
    }

    init(with asset: AVAsset, and timeMappings: [CMTimeMapping]) {
        guard let asset = asset as? AVURLAsset else {
            fatalError("cannot get a base URL from this asset.")
        }
        self.timeMappings = timeMappings
        self.url = asset.url
    }

    func compose() -> AVComposition {
        let composition = AVMutableComposition(urlAssetInitializationOptions: [AVURLAssetPreferPreciseDurationAndTimingKey: true])

        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)

        let asset = AVAsset(url: url)
        guard let videoAssetTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return composition }

        // Rebuild each original segment so its source time range is mapped
        // onto the same (stretched) target time range as in the source asset.
        var segments: [AVCompositionTrackSegment] = []
        for map in timeMappings {
            let segment = AVCompositionTrackSegment(url: url, trackID: kCMPersistentTrackID_Invalid, sourceTimeRange: map.source, targetTimeRange: map.target)
            segments.append(segment)
        }

        videoTrack.preferredTransform = videoAssetTrack.preferredTransform
        videoTrack.segments = segments

        // Only map the audio if the source actually has an audio track.
        if let _ = asset.tracks(withMediaType: AVMediaTypeAudio).first {
            audioTrack.segments = segments
        }

        return composition.copy() as! AVComposition
    }
}
You can then use the compose() function of your CompositionMapper class to give you an AVComposition ready to play in an AVPlayer, which should respect the CMTimeMapping objects you passed in:
let compositionMapper = CompositionMapper(for: someAVAssetURL, with: originalTimeMaps)
let mappedComposition = compositionMapper.compose()
let playerItem = AVPlayerItem(asset: mappedComposition)
let player = AVPlayer(playerItem: playerItem)
playerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed
Let me know if you need help converting this to Objective-C, but it should be relatively straightforward.