【Question Title】: How to get the pixel color on touch?
【Posted】: 2016-05-07 01:44:19
【Question】:

I know this is a common question, and there are plenty of answers to it. I have used several of them, although many are essentially the same. But sadly for me, none of them has worked. The following is the code I have been using so far.

-(void)getRGBAsFromImage:(UIImage*)image atX:(int)xx andY:(int)yy
{
    // First get the image into your data buffer
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now your rawData contains the image data in the RGBA8888 pixel format.
    NSUInteger byteIndex = (bytesPerRow * yy) + xx * bytesPerPixel;

    CGFloat red   = rawData[byteIndex]     / 255.0;
    CGFloat green = rawData[byteIndex + 1] / 255.0;
    CGFloat blue  = rawData[byteIndex + 2] / 255.0;
    CGFloat alpha = rawData[byteIndex + 3] / 255.0;
    NSLog(@"the red component of the RGBA value is %f", red);

    demoColor.tintColor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
    free(rawData);
}
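The byte arithmetic in the method above is the part that most often goes wrong. As a minimal, self-contained sketch (the helper names here are mine, not from the question), this is how an (x, y) position maps into an RGBA8888 buffer, with the bounds check the original code is missing:

```swift
import Foundation

// Byte offset of pixel (x, y) in an RGBA8888 buffer: 4 bytes per pixel,
// rows laid out top to bottom. Returns nil for out-of-bounds points,
// which the original method never checks (an out-of-range touch would
// read past the buffer and produce an arbitrary color).
func rgbaByteOffset(x: Int, y: Int, width: Int, height: Int) -> Int? {
    guard x >= 0, y >= 0, x < width, y < height else { return nil }
    let bytesPerPixel = 4
    let bytesPerRow = bytesPerPixel * width
    return bytesPerRow * y + bytesPerPixel * x
}

// Read one pixel and normalize its components to 0...1.
func pixelComponents(data: [UInt8], x: Int, y: Int, width: Int, height: Int)
        -> (r: Double, g: Double, b: Double, a: Double)? {
    guard let i = rgbaByteOffset(x: x, y: y, width: width, height: height),
          i + 3 < data.count else { return nil }
    return (Double(data[i]) / 255.0,
            Double(data[i + 1]) / 255.0,
            Double(data[i + 2]) / 255.0,
            Double(data[i + 3]) / 255.0)
}
```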

Here is another method I have used -

- (CGContextRef) createARGBBitmapContextFromImage:(CGImageRef) inImage {

    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    void *          bitmapData;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;

    // Get image width, height. We'll use the entire image.
    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);

    // Declare the number of bytes per row. Each pixel in the bitmap in this
    // example is represented by 4 bytes; 8 bits each of red, green, blue, and
    // alpha.
    bitmapBytesPerRow   = (pixelsWide * 4);
    bitmapByteCount     = (bitmapBytesPerRow * pixelsHigh);

    // Use the generic RGB color space.
    //colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
    {
        fprintf(stderr, "Error allocating color space\n");
        return NULL;
    }

    // Allocate memory for image data. This is the destination in memory
    // where any drawing to the bitmap context will be rendered.
    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL)
    {
        fprintf (stderr, "Memory not allocated!\n");
        CGColorSpaceRelease( colorSpace );
        return NULL;
    }

    // Create the bitmap context. We want pre-multiplied ARGB, 8-bits
    // per component. Regardless of what the source image format is
    // (CMYK, Grayscale, and so on) it will be converted over to the format
    // specified here by CGBitmapContextCreate.
    context = CGBitmapContextCreate (bitmapData,
                                     pixelsWide,
                                     pixelsHigh,
                                     8,      // bits per component
                                     bitmapBytesPerRow,
                                     colorSpace,
                                     kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!\n");
    }

    // Make sure and release colorspace before returning
    CGColorSpaceRelease( colorSpace );

    return context;
}


- (UIColor*) getPixelColorAtLocation:(CGPoint)point {
    UIColor* color = nil;
    //CGImageRef inImage = self.image.CGImage;
    CGImageRef inImage = [AppDelegate getInstance].capturedImage.CGImage;
    // Create an off-screen bitmap context to draw the image into. Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0,0},{w,h}}; 

    // Draw the image to the bitmap context. Once we draw, the memory
    // allocated for the context for rendering will then contain the
    // raw image data in the specified color space.
    CGContextDrawImage(cgctx, rect, inImage); 

    // Now we can get a pointer to the image data associated with the bitmap
    // context.
    unsigned char* data = CGBitmapContextGetData (cgctx);
    if (data != NULL) {
        //offset locates the pixel in the data from x,y.
        //4 for 4 bytes of data per pixel, w is width of one row of data.
        int offset = 4*((w*round(point.y))+round(point.x));
        int alpha =  data[offset];
        int red = data[offset+1];
        int green = data[offset+2];
        int blue = data[offset+3];
        NSLog(@"offset: %i colors: RGB A %i %i %i  %i",offset,red,green,blue,alpha);
        color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha/255.0f)];
    }

    // When finished, release the context
    CGContextRelease(cgctx);
    // Free image data memory for the context
    if (data) { free(data); }

    return color;
}

But none of these works for me. Please help me fix this. Is there something I am missing?

There are 2 UIImageViews in my UI. The one at the back contains the image from which I need to pick the color of the touched pixel. The other UIImageView is used to draw on the back image with the picked color.

Please help. Any help would be appreciated.
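One common cause of "wrong color" results with these buffer-based approaches is that the touch location is in view points while the buffer is indexed in image pixels; when the image's pixel size differs from the UIImageView's size, the same (x, y) lands on a different pixel. A minimal sketch of the conversion (a hypothetical helper, not from the question; it assumes the image fills the view, i.e. contentMode .scaleToFill, so aspect-fit/fill would need an extra offset):

```swift
import Foundation

// Map a touch location in view coordinates to a pixel coordinate in the
// image bitmap. Assumes the image is stretched to fill the view
// (.scaleToFill); other content modes would need offset/letterbox math.
func viewPointToPixel(_ point: CGPoint,
                      viewSize: CGSize,
                      pixelWidth: Int,
                      pixelHeight: Int) -> (x: Int, y: Int) {
    let sx = CGFloat(pixelWidth) / viewSize.width
    let sy = CGFloat(pixelHeight) / viewSize.height
    // Clamp so a touch on the trailing edge stays inside the bitmap.
    let x = min(max(Int(point.x * sx), 0), pixelWidth - 1)
    let y = min(max(Int(point.y * sy), 0), pixelHeight - 1)
    return (x, y)
}
```

The resulting (x, y) is what should be fed to a byte-offset computation such as the one in getRGBAsFromImage: above.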

【Comments】:

  • None of these work is not much of an explanation. What didn't work? Does it crash? Does it return the wrong pixel's color?
  • Sorry for not being specific. It does not crash; it returns the wrong pixel color. I cannot even tell which pixel's color it is actually returning.

Tags: iphone ios touch paint pixel


【Solution 1】:
import UIKit

class ViewController: UIViewController {
    
    let imageV = UIImageView(frame: CGRect(x: 0, y: 0, width: 223, height: 265))
    override func viewDidLoad() {
        super.viewDidLoad()
        imageV.center = view.center
        imageV.image = UIImage(named: "color_image")
        view.addSubview(imageV)
        // Do any additional setup after loading the view.
        
        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(imageTapped(tapGestureRecognizer:)))
        imageV.isUserInteractionEnabled = true
        imageV.addGestureRecognizer(tapGestureRecognizer)
    }
    
    
    @objc func imageTapped(tapGestureRecognizer: UITapGestureRecognizer)
    {
        
        let cgpoint = tapGestureRecognizer.location(in: view)
        let color : UIColor = colorOfPoint(point: cgpoint)
        print("Picked Color is:",color)
        let new = UIView(frame: CGRect(x: 10, y: 10, width: 50, height: 50))
        new.backgroundColor = color
        view.addSubview(new)
    }
    
    func colorOfPoint(point:CGPoint) -> UIColor
    {
        let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        
        var pixelData:[UInt8] = [0, 0, 0, 0]
        
        let context = CGContext(data: &pixelData, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        context!.translateBy(x: -point.x, y: -point.y);
        self.view.layer.render(in: context!)
        
        let red:CGFloat = CGFloat(pixelData[0])/CGFloat(255.0)
        let green:CGFloat = CGFloat(pixelData[1])/CGFloat(255.0)
        let blue:CGFloat = CGFloat(pixelData[2])/CGFloat(255.0)
        let alpha:CGFloat = CGFloat(pixelData[3])/CGFloat(255.0)
        
        let color:UIColor = UIColor(red: red, green: green, blue: blue, alpha: alpha)
        return color
    }
}

【Comments】:

  • While this code may solve the problem, including an explanation of how and why it does so would really help to improve the quality of your post and probably result in more up-votes. Remember that you are answering the question for future readers, not just the person asking now. Please edit your answer to add an explanation, and indicate what limitations and assumptions apply.
【Solution 2】:

First of all, I want to thank the author of this code; it helped a lot in my game project, as I had been looking for this function to build a pixel-perfect hit box (excluding the areas where alpha is 0). Here is a small update for Swift 5:

// Function that returns the RGBA values of a pixel in a view
func getPixelColor(atPosition:CGPoint) -> UIColor{

    var pixel:[CUnsignedChar] = [0, 0, 0, 0];
    let colorSpace = CGColorSpaceCreateDeviceRGB();
    let bitmapInfo = CGBitmapInfo(rawValue:    CGImageAlphaInfo.premultipliedLast.rawValue);
    let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue);

    context!.translateBy(x: -atPosition.x, y: -atPosition.y);
    layer.render(in: context!);
    let color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0,
                                green: CGFloat(pixel[1])/255.0,
                                blue: CGFloat(pixel[2])/255.0,
                                alpha: CGFloat(pixel[3])/255.0);

    return color;

}

I had some trouble with pixel.dealloc(4): in Swift 5 it seems you can no longer deallocate with a capacity argument. I removed the (4), but that showed some strange behavior (as if dealloc() was not freeing the whole array).

I did not make this an extension of UIView, because in my project I have my own subclass, but that would be easy to do.

How I implemented the code:

// Method that determines whether a "touch" is accepted by the object (by default, transparent areas and hidden objects are excluded). Override if necessary.
func isHit(atPosition position:CGPoint) -> Bool
{

    // If the object is not hidden (its isHidden property) and the touched area corresponds to a region that is actually drawn (not transparent), return true.
    if (!self.isHidden && self.getPixelColor(atPosition: position).cgColor.alpha != 0) {return true}
    else {return false}

}

I hope this helps.

【Comments】:

    【Solution 3】:

    Swift 4, Xcode 9 - with a UIImageView extension

    This is a combination of all the answers above, but in an extension

    extension UIImageView {
        func getPixelColorAt(point:CGPoint) -> UIColor{
            
            let pixel = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: 4)
            let colorSpace = CGColorSpaceCreateDeviceRGB()
            let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
            let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
            
            context!.translateBy(x: -point.x, y: -point.y)
            layer.render(in: context!)
            let color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0,
                                        green: CGFloat(pixel[1])/255.0,
                                        blue: CGFloat(pixel[2])/255.0,
                                        alpha: CGFloat(pixel[3])/255.0)
            
            pixel.deallocate()   // was deallocate(capacity: 4); the capacity argument was removed in Swift 4.1
            return color
        }
    }
    

    How to use it

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let touch = touches.first
        if let point = touch?.location(in: view) {
            let color = myUIImageView.getPixelColorAt(point: point)
            print(color)
        }
    }
    

    【Comments】:

    • What is layer.render(in: context!)? It says unresolved identifier layer
    • Wow, thanks! The code works. But I keep wondering: is there no simpler way? Or is there no easy way to access pixels because we are not supposed to do that?
    • @Illep, layer is a property of UIImageView. You get that error if you wrote "class" instead of "extension" at the beginning.
    • Note that you may want to call super.touchesBegan() in your override. See the discussion at developer.apple.com/documentation/uikit/uiresponder/…
    • You are the best! - @MarkMoeykens
    【Solution 4】:

    Thanks to @Aggressor for posting the code above

    Swift 2.1

    func getPixelColorAtPoint(point:CGPoint) -> UIColor{
    
       let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
       let colorSpace = CGColorSpaceCreateDeviceRGB()
       let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue)
       let context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, bitmapInfo.rawValue)
    
       CGContextTranslateCTM(context, -point.x, -point.y)
       view.layer.renderInContext(context!)
       let color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0, green: CGFloat(pixel[1])/255.0, blue: CGFloat(pixel[2])/255.0, alpha: CGFloat(pixel[3])/255.0)
    
       pixel.dealloc(4)
       return color
    }
    

    Swift 3, Xcode Version 8.2 (8C38) and Swift 4, Xcode Version 9.1 (9B55)

     func getPixelColorAtPoint(point:CGPoint, sourceView: UIView) -> UIColor{
    
        let pixel = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: 4)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        var color = UIColor.clear

        if let context = context {
            context.translateBy(x: -point.x, y: -point.y)
            sourceView.layer.render(in: context)

            color = UIColor(red: CGFloat(pixel[0])/255.0,
                            green: CGFloat(pixel[1])/255.0,
                            blue: CGFloat(pixel[2])/255.0,
                            alpha: CGFloat(pixel[3])/255.0)
        }
        // Free the buffer unconditionally (the original leaked it when the
        // context could not be created); deallocate(capacity:) before Swift 4.1.
        pixel.deallocate()
        return color
    }
    

    【Comments】:

    • Your code returns the correct pixel values on the simulator but wrong values on a device
    • @Raghuram Thanks for letting me know. I will fix it as soon as possible.
    • Hi, I just tested the code above and it seems correct both on the simulator and on a device. Here is my test link
    • You're welcome @Raghuram
    【Solution 5】:

    Great answer, rdelmar, this helped me a lot!

    Here is how I did the above in Swift:

    override func touchesBegan(touches: NSSet, withEvent event: UIEvent)
        {
            var touch:UITouch = event.allTouches()!.anyObject() as UITouch
            var loc = touch.locationInView(self)
            var color:UIColor = getPixelColorAtPoint(loc)
            println(color)
        }
    
        //returns the color data of the pixel at the currently selected point
        func getPixelColorAtPoint(point:CGPoint)->UIColor
        {
            let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
            var colorSpace = CGColorSpaceCreateDeviceRGB()
            let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
            let context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, bitmapInfo)
    
            CGContextTranslateCTM(context, -point.x, -point.y)
            layer.renderInContext(context)
            var color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0, green: CGFloat(pixel[1])/255.0, blue: CGFloat(pixel[2])/255.0, alpha: CGFloat(pixel[3])/255.0)
    
            pixel.dealloc(4)
            return color
        }
    

    【Comments】:

      【Solution 6】:

      This is what I used, and it seems simpler than the methods you have tried.

      In my custom view class I have this:

      - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
          UITouch *touch = [[event allTouches] anyObject];
          CGPoint loc = [touch locationInView:self];
          self.pickedColor = [self colorOfPoint:loc];
      }
      

      colorOfPoint is a method in a category on UIView, with the following code:

      #import "UIView+ColorOfPoint.h"
      #import <QuartzCore/QuartzCore.h>
      
      @implementation UIView (ColorOfPoint)
      
      -(UIColor *) colorOfPoint:(CGPoint)point
          {
          unsigned char pixel[4] = {0};
          CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
          CGContextRef context = CGBitmapContextCreate(pixel,
                  1, 1, 8, 4, colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
      
          CGContextTranslateCTM(context, -point.x, -point.y);
      
          [self.layer renderInContext:context];
      
          CGContextRelease(context);
          CGColorSpaceRelease(colorSpace);
          UIColor *color = [UIColor colorWithRed:pixel[0]/255.0
              green:pixel[1]/255.0 blue:pixel[2]/255.0
              alpha:pixel[3]/255.0];
          return color;
          }
      

      Don't forget to import the category header into your custom view class and add the QuartzCore framework.


      A small tip from 2013: cast the last parameter to (CGBitmapInfo) to avoid an implicit-conversion warning, as done here. Hope this helps.

      【Comments】:

      • This does not take alpha into account. If alpha ≠ 1.0, the color components are off because of premultiplication.
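To put numbers on that premultiplication caveat: with kCGImageAlphaPremultipliedLast (or ...First) each stored component has already been multiplied by the alpha, so recovering the straight color means dividing the alpha back out. A small sketch in plain arithmetic (no UIKit involved; the helper name is mine):

```swift
import Foundation

// Undo premultiplied alpha for one RGBA pixel (components in 0...255).
// A premultiplied buffer stores r*a, g*a, b*a; dividing by alpha
// recovers the straight (unpremultiplied) components in 0...1.
func unpremultiply(r: UInt8, g: UInt8, b: UInt8, a: UInt8)
        -> (r: Double, g: Double, b: Double, a: Double) {
    let alpha = Double(a) / 255.0
    guard alpha > 0 else { return (0, 0, 0, 0) }  // fully transparent pixel
    return (Double(r) / 255.0 / alpha,
            Double(g) / 255.0 / alpha,
            Double(b) / 255.0 / alpha,
            alpha)
}
```

For example, a half-transparent pure red is stored as roughly (128, 0, 0, 128); dividing by alpha recovers the straight red component of 1.0.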