[Posted]: 2020-03-09 20:58:09
[Question]:
On iOS/Swift, I'm using AVAssetWriter together with ReplayKit to create a MOV or MP4 video of the user's screen plus microphone audio.
When I create the video, it plays back fine locally and the audio and video are in sync. However, when I use AWS MediaConvert to transcode the video to HLS (HTTP Live Streaming), the audio drifts out of sync with the video. Does anyone know what might be causing this? I've read about timecoding; perhaps I need to add a timecode to my video? Is there a simpler way to fix this, or has anyone run into a similar problem?
private func startRecordingVideo() {
    // Initialize the MOV output file for the screen recording
    let fileManager = FileManager.default
    let urls = fileManager.urls(for: .documentDirectory, in: .userDomainMask)
    guard let documentDirectory = urls.first else {
        fatalError("documentDir Error")
    }
    videoOutputURL = documentDirectory.appendingPathComponent("OutputVideo.mov")
    if fileManager.fileExists(atPath: videoOutputURL!.path) {
        do {
            try fileManager.removeItem(atPath: videoOutputURL!.path)
        } catch {
            fatalError("Unable to delete file: \(error) : \(#function).")
        }
    }

    // Initialize the asset writer that persists the video to the user's storage
    assetWriter = try! AVAssetWriter(outputURL: videoOutputURL!, fileType: .mov)

    // AVVideoWidthKey/AVVideoHeightKey expect pixel dimensions, so scale the
    // point-based screen bounds by the screen's scale factor
    let scale = UIScreen.main.scale
    let videoOutputSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: UIScreen.main.bounds.size.width * scale,
        AVVideoHeightKey: UIScreen.main.bounds.size.height * scale,
    ]
    let audioSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44100.0,
        AVEncoderBitRateKey: 96000,
    ]

    videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoOutputSettings)
    audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    videoInput?.expectsMediaDataInRealTime = true
    audioInput?.expectsMediaDataInRealTime = true
    assetWriter?.add(videoInput!)
    assetWriter?.add(audioInput!)

    let sharedRecorder = RPScreenRecorder.shared()
    sharedRecorder.isMicrophoneEnabled = true
    sharedRecorder.startCapture(handler: { [weak self] sample, bufferType, error in
        guard let self = self else { return }
        // Audio/video buffer data returned from the screen recorder
        guard CMSampleBufferDataIsReady(sample) else { return }

        // Start the asset writer if it has not yet started. This must happen
        // synchronously on the capture callback queue: hopping to the main
        // queue here lets buffers race ahead of startSession(atSourceTime:)
        if self.assetWriter?.status == .unknown {
            guard self.assetWriter?.startWriting() == true else { return }
            self.assetWriter?.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sample))
            self.startSession = true
        }

        // Handle writer failure
        if self.assetWriter?.status == .failed {
            print("Error occurred, status = \(String(describing: self.assetWriter?.status.rawValue)), \(self.assetWriter?.error?.localizedDescription ?? "unknown error")")
            return
        }

        switch bufferType {
        case .video:
            // Append the video buffer to the AVAssetWriter video input
            if self.videoInput?.isReadyForMoreMediaData == true, self.startSession {
                self.videoInput?.append(sample)
            }
        case .audioMic:
            // Append the microphone audio buffer to the AVAssetWriter audio input
            if self.audioInput?.isReadyForMoreMediaData == true, self.startSession {
                self.audioInput?.append(sample)
            }
        default:
            break
        }
    }, completionHandler: { error in
        print("COMP HANDLER ERROR", error?.localizedDescription ?? "none")
    })
}

private func stopRecordingVideo() {
    self.startSession = false
    RPScreenRecorder.shared().stopCapture { error in
        self.videoInput?.markAsFinished()
        self.audioInput?.markAsFinished()
        if error == nil {
            self.assetWriter?.finishWriting {
                print("FINISHED WRITING!")
                DispatchQueue.main.async {
                    self.setUpVideoPreview()
                }
            }
        } else {
            // DELETE DIRECTORY
        }
    }
}
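One avenue worth trying for the HLS drift is pinning explicit timescales on the writer and its video input before recording starts. `movieTimeScale` on `AVAssetWriter` and `mediaTimeScale` on `AVAssetWriterInput` are real AVFoundation properties; the value 600 is Apple's conventional video timescale, not something verified against MediaConvert, and the helper name is hypothetical. A minimal sketch:

```swift
import AVFoundation

// Hypothetical helper: apply a common, explicit timescale to the writer and
// its video input so a downstream transcoder sees unambiguous timing.
// Must be called before startWriting().
func applyTimescales(to writer: AVAssetWriter, videoInput: AVAssetWriterInput) {
    writer.movieTimeScale = 600        // timescale for the movie as a whole
    videoInput.mediaTimeScale = 600    // timescale for the video track
    // Note: do not set mediaTimeScale on the audio input; audio tracks
    // derive their timescale from the sample rate.
}
```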
[Comments]:
-
Hey @PatPatchPatrick! Did you find a working example? Doesn't ReplayKit introduce a delay because of the user-permission prompt? How did you need to synchronize this?
-
Hi. I ended up using AVCaptureSession to record the video/audio instead of ReplayKit. ReplayKit works fine for recording video/audio locally, but problems appear when converting the video to other formats. I believe these could be solved by timecoding the media or setting mediaTimeScale, as @derickito mentioned in his reply, but I haven't had a chance to try it myself.
-
Hi @PatPatchPatrick! Thanks a lot for the reply! I'm wondering if there's an example of capturing video and audio with AVCaptureSession? You're right; at the moment I'm trying to save audio and video locally with ReplayKit. However, the user-permission prompt keeps the recordings from syncing well. I'm trying to find another solution that might fix this.
-
If your problem is the user-permission prompt, then perhaps you can request permission earlier in your workflow, before recording starts. There are plenty of posts about AVCaptureSession and how to use it. Here's one example: stackoverflow.com/questions/39431390/…
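For reference, the AVCaptureSession route mentioned above can be sketched roughly as below. This records the camera and microphone rather than the screen, so it only replaces ReplayKit when the screen itself isn't what you need. All APIs shown are standard AVFoundation; the function name is illustrative and error handling is minimal.

```swift
import AVFoundation

// Minimal sketch: an AVCaptureSession wired to camera + microphone inputs
// and a movie file output, as an alternative to ReplayKit capture.
func makeCaptureSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.beginConfiguration()

    if let camera = AVCaptureDevice.default(for: .video) {
        let videoIn = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(videoIn) { session.addInput(videoIn) }
    }
    if let mic = AVCaptureDevice.default(for: .audio) {
        let audioIn = try AVCaptureDeviceInput(device: mic)
        if session.canAddInput(audioIn) { session.addInput(audioIn) }
    }

    // AVCaptureMovieFileOutput keeps audio and video timestamps aligned
    // without manual AVAssetWriter session management
    let movieOutput = AVCaptureMovieFileOutput()
    if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

    session.commitConfiguration()
    return session
}
```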
Tags: ios swift video video-streaming http-live-streaming