[Title]: App Crashes while processing image in iOS 7
[Posted]: 2014-05-22 15:51:32
[Question]:

I am developing an image-processing app. An image can have many layers on it, and they can be saved to disk. Here is the code that does this (TransparentBG.png is a 3000x3000 transparent image):

CCSprite *blankImage = [CCSprite spriteWithFile:@"assetsFullSize/TransparentBG.png"];
//CCSprite *blankImage = [CCSprite spriteWithFile:@"assets/centerPaneBG.png"];
blankImage.tag=SAVE_IMAGE_BASE_TAG + [[AppManager instance] generateNextSaveImageIndex];
NSLog(@"   blankImage.tag = %i",blankImage.tag);
NSLog(@"   blankImage.size = %@",NSStringFromCGSize(blankImage.contentSize));
for(int i=1; i<[layers count]; i++)
{
    NSLog(@"   i = %i",i);

    ImageFeature *feature = [layers objectAtIndex:i];
    CCSprite *layer = (CCSprite *)[self getChildByTag:LAYER_INDEX_BASE + i];
    NSLog(@"   ********* layer.position = %@",NSStringFromCGPoint(layer.position));
    NSLog(@"   ********* feature.posX,posY = %i,%i",feature.posX,feature.posY);
    //        [layer removeFromParent];
    //        CCSprite *layerCopy = [layer copy];
    CCTexture2D *texture = [layer texture];
    CCSprite *layerCopy = [CCSprite spriteWithTexture:texture];
    layerCopy.anchorPoint = layer.anchorPoint;
    NSLog(@"   anchorPoint = %@",NSStringFromCGPoint(layer.anchorPoint));
    //        layerCopy.position = ccpAdd(layer.position,ccp(-LEFT_PANE_WIDTH,0));
    //        layerCopy.position = ccp([self getImageFeature_posX_fomSpritePosition:layer],[self getImageFeature_posY_fomSpritePosition:layer]);
    //        layerCopy.position = ccp((feature.posX/3000.0) * blankImage.contentSize.width,(feature.posY/3000.0) * blankImage.contentSize.height); // dead store (overwritten below); note that dividing by 3000 instead of 3000.0 would truncate to 0 via integer division
    layerCopy.position = IS_RETINA ? ccp(feature.posX / 2,feature.posY / 2) : ccp(feature.posX,feature.posY);
    NSLog(@"   ********* layerCopy.position = %@",NSStringFromCGPoint(layerCopy.position));
    layerCopy.color = layer.color;
    layerCopy.scaleX = layer.scaleX / VISUAL_SCALING_FACTOR;
    layerCopy.scaleY = layer.scaleY / VISUAL_SCALING_FACTOR;
    layerCopy.rotation = layer.rotation;
    layerCopy.opacity = layer.opacity;
    [blankImage addChild:layerCopy z:i tag:layer.tag];
    //        layer.anchorPoint = ccp(0.5,0.5);
    //        layer.position = ccpAdd(layer.position,ccp(-LEFT_PANE_WIDTH,0));
    //        [blankImage addChild:layer z:i tag:layer.tag];
}

//    CCSprite *attribution = [CCSprite spriteWithFile:@"assets/pikpark.png"];

//    CCSprite *attribution = [CCSprite spriteWithFile:@"assetsFullSize/pikpark.png"];
//    attribution.anchorPoint = ccp(0.5,0.5);
//    attribution.position = ccp(blankImage.contentSize.width-  (attribution.contentSize.width/2.0),attribution.contentSize.height/2.0);
//    attribution.opacity = 64;
//    [blankImage addChild:attribution z:8999 tag:ATTRIBUTION];

//    blankImage.scale = 300.0/blankImage.contentSize.height;
CGPoint p = blankImage.anchorPoint;
[blankImage setAnchorPoint:ccp(0,0)];

//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:300 height:300];
//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:1500 height:1500];
//  CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:3000 height:3000];
CCRenderTexture *renderer = [CCRenderTexture renderTextureWithWidth:blankImage.contentSize.width height:blankImage.contentSize.height];

[renderer begin];
[blankImage visit];
[renderer end];

[blankImage setAnchorPoint:p];

UIImage *thumbImage = [renderer getUIImage];

NSLog(@"   thumbImage.size = %@",NSStringFromCGSize([thumbImage size]));
NSString *key = [NSString stringWithFormat:@"%i",blankImage.tag];
NSLog(@"   key = %@",key);
CCSprite *renderedSprite = [CCSprite spriteWithCGImage:thumbImage.CGImage key:key];
NSLog(@"   width=%3f   height=%3f",renderedSprite.contentSize.width,renderedSprite.contentSize.height);

// And save to UserDocs
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask,YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *galleryDirectory = [documentsDirectory stringByAppendingPathComponent:@"gallery"];
NSLog(@"   galleryDirectory = %@",galleryDirectory);

NSString *saveFileName = [NSString stringWithFormat:@"image_%i.png",blankImage.tag];
NSLog(@"   saveFileName = %@",saveFileName);

NSString *galleryPath = [galleryDirectory stringByAppendingPathComponent:saveFileName];
NSLog(@"   galleryPath = %@",galleryPath);

NSError *error = nil;
NSData *imageData = UIImagePNGRepresentation(thumbImage);
// The "gallery" directory must already exist; create it with NSFileManager if needed.
if ([imageData writeToFile:galleryPath options:0 error:&error]) {
    NSLog(@"   GALLERY IMAGE SAVED!");
} else {
    NSLog(@"   GALLERY IMAGE SAVE FAILED: %@",error);
}

I have tested this on the simulator and it works fine. But when I test it on an iPad 2, it crashes with a fatal memory pressure exception.

Using breakpoints, I can see that the following line is the one that crashes the app with the memory pressure exception in the code above:

UIImage *thumbImage = [renderer getUIImage];

If I change the size of the CCRenderTexture *renderer to 300x300, the app stops crashing, but that severely degrades the quality and size of the saved image; 3000x3000 produces a high-quality image. I tried using dispatch_async, but with no success.

Is there any way I can work around the memory pressure issue? Please help.

[Comments]:

    Tags: ios iphone image-processing cocos2d-iphone


    [Answer 1]:

    A 3000x3000 texture consumes over 34 MB of texture memory (3000 × 3000 pixels × 4 bytes per RGBA8888 pixel).

    You create a texture from the image (x1). Then you create a render texture to render into (x2). Then you create a UIImage from the render texture (x3). Finally, you create an NSData via UIImagePNGRepresentation (x4).

    So at this point the same texture-sized data is held in various buffers in memory at least 4 times = 136 MB.

    I say "at least" because some texture-loading inefficiencies are known in cocos2d, and similar ones may exist in UIImage and friends. For example, creating a CCTexture2D may actually create two buffers of the same size, so actual memory use could be around 170 MB. Run Instruments to see how much memory is really being used.

    One thing you can try is to separate the individual steps, to give temporary buffers a chance to be released from memory. For example, after loading the CCTexture2D, don't create the sprite and render texture right away; instead schedule the next step with performSelector:withObject:afterDelay: (note: NSObject has no performSelectorInBackground:afterDelay:) and a short delay. You can do the same after creating the UIImage from the render texture, so the PNG encoding is deferred by another fraction of a second.
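A minimal sketch of that staged pipeline, using NSObject's performSelector:withObject:afterDelay:. The method and property names (renderLayersToTexture, convertRenderTextureToUIImage, writePNGToGallery, self.renderer, self.thumbImage, self.galleryPath) are hypothetical, and the 0.5-second delays are placeholders:

```objc
// Sketch: split the save pipeline into stages so each stage's temporary
// buffers can be released before the next large allocation happens.
- (void)saveComposedImage
{
    // Stage 1: build the sprite tree and render it offscreen (as in the question).
    [self renderLayersToTexture];
    // Return to the run loop before the next stage, so autoreleased buffers drain.
    [self performSelector:@selector(convertRenderTextureToUIImage)
               withObject:nil
               afterDelay:0.5];
}

- (void)convertRenderTextureToUIImage
{
    // Stage 2: pull the pixels out of the render texture...
    self.thumbImage = [self.renderer getUIImage];
    // ...then drop the render texture before PNG encoding allocates more.
    self.renderer = nil;
    [self performSelector:@selector(writePNGToGallery)
               withObject:nil
               afterDelay:0.5];
}

- (void)writePNGToGallery
{
    // Stage 3: encode and write, then release the UIImage copy as well.
    NSData *imageData = UIImagePNGRepresentation(self.thumbImage);
    [imageData writeToFile:self.galleryPath options:NSDataWritingAtomic error:NULL];
    self.thumbImage = nil;
}
```

Scheduling each stage with performSelector:withObject:afterDelay: returns control to the run loop in between, which is what allows autorelease pools to drain so each texture-sized buffer is actually freed before the next one is allocated.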

    [Discussion]:

    • You saved me from many moments of frustration. Thank you! :)