【Question Title】: Creating video from a set of images
【Posted】: 2012-02-17 17:11:19
【Question Description】:

I want to create a video from a set of images. Most of my code is based on this question:

How do I export UIImage array as a movie?

The code I wrote is:

NSError *error = nil;

NSLog(@"Gonna start writing");

writer = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@"ravi4"] fileType:AVFileTypeQuickTimeMovie error:&error];
NSParameterAssert(writer);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey, [NSNumber numberWithInt:640], AVVideoWidthKey, [NSNumber numberWithInt:480], AVVideoHeightKey, nil];

AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:nil];

NSParameterAssert(writerInput);
NSParameterAssert([writer canAddInput:writerInput]);
[writer addInput:writerInput];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

UIImage *image = [UIImage imageNamed:@"back.jpg"];
UIImage *image2 = [UIImage imageNamed:@"arthas.jpg"];

UIImage *resizedImage = [ImageToVideoViewController imageWithImage:image scaledToSize:CGSizeMake(640, 480)];
UIImage *resizedImage2 = [ImageToVideoViewController imageWithImage:image2 scaledToSize:CGSizeMake(640, 480)];

CGImageRef img = [resizedImage CGImage];
CGImageRef img2 = [resizedImage2 CGImage];  

CVPixelBufferRef buffer = [ImageToVideoViewController pixelBufferFromCGImage:img size:CGSizeMake(640, 480)];
CVPixelBufferRef buffer2 = [ImageToVideoViewController pixelBufferFromCGImage:img2 size:CGSizeMake(640, 480)];

[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(0, 1)];
while ((adaptor.assetWriterInput.readyForMoreMediaData)==NO ) {

}
[adaptor appendPixelBuffer:buffer2 withPresentationTime:CMTimeMake(3,1)];
while ((adaptor.assetWriterInput.readyForMoreMediaData)==NO ) {

}
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(6, 1)];
while ((adaptor.assetWriterInput.readyForMoreMediaData)==NO ) 
{

}
[adaptor appendPixelBuffer:buffer2 withPresentationTime:CMTimeMake(9, 1)];

while ((adaptor.assetWriterInput.readyForMoreMediaData)==NO ) 
{

}
[adaptor appendPixelBuffer:buffer2 withPresentationTime:CMTimeMake(10, 1)];
//[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(10, 2)];
while ((adaptor.assetWriterInput.readyForMoreMediaData)==NO ) 
{

}
CVBufferRelease(buffer);
CVBufferRelease(buffer2);

[writerInput markAsFinished];
//[writerInput finishWriting];
[writerInput release];

[writer endSessionAtSourceTime:CMTimeMake(10, 1)];
[writer finishWriting];
[writer release];

The code is messy because I was just experimenting to figure this out.

Everything in the video works fine except for the last image I try to display. The video freezes and stops as soon as it reaches the point where that image should appear, but it plays normally up to that point.

Thanks!

【Question Discussion】:

    Tags: iphone objective-c uiimage avfoundation


    【Solution 1】:

    The actual solution is that you need to append one extra frame at the end, so that the final image stays on screen for its intended duration.

    You effectively append the final image twice: first at the start-time offset where you want it to appear, and then again at the time offset where you want the movie to end.

    Otherwise the movie ends at the very moment the final image is displayed.
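    A minimal sketch of the fix, reusing the question's variables (`adaptor`, `buffer2`, `writerInput`, `writer`), and checking `readyForMoreMediaData` before each append rather than busy-waiting afterwards (an adjustment to the original code, not part of the answer itself):

    ```objectivec
    // Append the final image once, at the time it should first appear (9s).
    while (!adaptor.assetWriterInput.readyForMoreMediaData) { /* wait for the input */ }
    [adaptor appendPixelBuffer:buffer2 withPresentationTime:CMTimeMake(9, 1)];

    // Append the SAME buffer again, at the time the movie should end (10s),
    // so the final image stays on screen for the whole 9s-to-10s interval.
    while (!adaptor.assetWriterInput.readyForMoreMediaData) { /* wait for the input */ }
    [adaptor appendPixelBuffer:buffer2 withPresentationTime:CMTimeMake(10, 1)];

    // Close the session at the same timestamp as the duplicate final frame.
    [writerInput markAsFinished];
    [writer endSessionAtSourceTime:CMTimeMake(10, 1)];
    [writer finishWriting];
    ```

    The key point is that the duplicate append and `endSessionAtSourceTime:` use the same timestamp, so the movie's duration extends to cover the final image instead of cutting off when it is first shown.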

    【Discussion】:
