【Question title】: CGContextDrawImage camera app crash
【Posted】: 2012-08-31 15:18:42
【Question description】:

I am trying to grab images using AVCaptureSession. I followed this tutorial: http://www.benjaminloulier.com/posts/2-ios4-and-direct-access-to-the-camera. I create a UIImage from the image reference and then read the pixels out of that UIImage. But the app crashes after a while (less than 30 seconds). I tried profiling with Leaks, and that crashed too. Using logs I found that the app crashes just before the line CGContextDrawImage(context, rect, image1.CGImage);. Do you have any suggestions as to what I might be doing wrong? I also see memory-allocation errors a few seconds before the app crashes. Please help.

The code is posted below.

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    lock = @"YES";

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

    // Create a Quartz direct-access data provider that uses data we supply.
    NSData *data = [NSData dataWithBytes:baseAddress length:bufferSize];

    CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    CGImageRef quartzImage = CGImageCreate(width, height, 8, 32, bytesPerRow,
                  colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                  dataProvider, NULL, true, kCGRenderingIntentDefault);

    CGDataProviderRelease(dataProvider);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    baseAddress = nil;
    [data release];
    lock = @"NO";
    return image;
}

- (void)calculate
{
    @try {
        UIImage *image1 = [self stillImage];   // Capture an image from the camera.

        // Extract the pixels from the camera image.
        CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();

        size_t bytesPerRow = image1.size.width * 4;
        unsigned char *bitmapData = (unsigned char *)malloc(bytesPerRow * image1.size.height);

        CGContextRef context = CGBitmapContextCreate(bitmapData, image1.size.width, image1.size.height, 8, bytesPerRow, colourSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big);

        CGColorSpaceRelease(colourSpace);

        CGContextDrawImage(context, rect, image1.CGImage);

        unsigned char *pixels = (unsigned char *)CGBitmapContextGetData(context);

        totalLuminance = 0.0;
        for (int p = 0; p < image1.size.width * image1.size.height * 4; p += 4)
        {
            totalLuminance += pixels[p] * 0.3 + pixels[p+1] * 0.59 + pixels[p+2] * 0.11;
        }

        totalLuminance /= (image1.size.height * image1.size.width);

        pixels = nil;
        bitmapData = nil;

        [image1 release];

        CGContextRelease(context);
        //image1 = nil;

        //totalLuminance = [n floatValue];                   // Calculate the total luminance.
        float f = [del.camcont.slider value];
        float total = totalLuminance * f;
        NSString *ns = [NSString stringWithFormat:@"Lux : %0.2f", total];
        NSLog(@"slider = %f", f);
        NSLog(@"totalluminance = %f", totalLuminance);
        NSLog(@"%@", ns);
        //NSString *ns = [NSString initWithFormat:@"Lux : %0.2f", total];
        [del.camcont.lux setText:ns];   // Display the total luminance.

        self.stillImage = nil;
        //[self.stillImage release];
        ns = nil;
        //n = nil;
        //del = nil;
    }

    @catch (NSException *exception) {
        NSLog(@"main: Caught %@: %@", [exception name], [exception reason]);
    }
}

【Question comments】:

Tags: ios memory crash avcapturesession


【Solution 1】:

It's not clear to me why you are using a CMSampleBufferRef to create a CGImageRef, then a UIImage, then taking the UIImage's CGImageRef, then sucking the data out of that and pushing it into an unsigned char pointer (which, in essence, points at the same bytes the CMSampleBufferRef held in the first place).

You would simplify your life (and should find it much easier to debug) if you did something like this:

CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

// Copy the pixel data out so the buffer can be unlocked right away.
uint8_t *pixels = malloc(bytesPerRow * height);
memcpy(pixels, baseAddress, bytesPerRow * height);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

float totalLuminance = 0.0;
for (int r = 0; r < height; r++)
{
    for (int p = 0; p < width * 4; p += 4)
    {
        totalLuminance += pixels[p + (r * bytesPerRow)] * 0.3
                        + pixels[p + 1 + (r * bytesPerRow)] * 0.59
                        + pixels[p + 2 + (r * bytesPerRow)] * 0.11;
    }
}
free(pixels);
totalLuminance /= (width * height);

(The nested for loops are there to cope with the fact that bytesPerRow cannot be assumed to equal width*4, because of padding.)

【Comments】:

  • Thanks for your reply. That's true; I hadn't noticed the complexity I was creating unnecessarily. Thanks again, this will definitely help my app's performance. By the way, I can't upvote your answer because I don't have enough reputation. Sorry.
【Solution 2】:

The memory management looks OK at first glance. As a workaround, you could consider UIImage -imageWithData: in case the custom CGImageCreate code is the problem, since you are creating the UIImage from a CGImage anyway.

【Comments】:

  • Hi, thanks for the reply. But it doesn't work with imageWithData either; it keeps failing at the same line after a few seconds.
  • It was worth a try. Can you post the crash log? For instance, if you filed this incident with Apple DTS, that's the first thing they would ask for...
  • But there is no crash log in the Xcode Organizer. I added an NSLog after every line and found that each time the app simply shuts down at some point, with no crash log or any error message in the console.