【Title】: AVFoundation - Detect face and crop face area?
【Posted】: 2014-07-27 19:11:09
【Question】:

正如标题所说,我想检测面部,然后只裁剪面部区域。这是我目前所拥有的:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {

    for (AVMetadataObject *face in metadataObjects) {
        if ([face.type isEqualToString:AVMetadataObjectTypeFace]) {

            AVCaptureConnection *stillConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
            stillConnection.videoOrientation = [self videoOrientationFromCurrentDeviceOrientation];
            [_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                if (error) {
                    NSLog(@"There was a problem");
                    return;
                }

                NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *stillImage = [UIImage imageWithData:jpegData];

                CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:[CIContext contextWithOptions:nil] options:nil];
                CIImage *ciimage = [CIImage imageWithData:jpegData];

                NSArray *features = [faceDetector featuresInImage:ciimage];
                self.captureImageView.image = stillImage;

                for (CIFeature *feature in features) {
                    if ([feature isKindOfClass:[CIFaceFeature class]]) {
                        CIFaceFeature *faceFeature = (CIFaceFeature *)feature;

                        CGImageRef imageRef = CGImageCreateWithImageInRect([stillImage CGImage], faceFeature.bounds);
                        self.detectedFaceImageView.image = [UIImage imageWithCGImage:imageRef];
                        CGImageRelease(imageRef);
                    }
                }
                //[_session stopRunning];
            }];
        }
    }
}

This code partly works: it detects the face, but it does not crop out the face area. It always crops the wrong region; it crops something, just not the face. I've been searching Stack Overflow for an answer and have tried this and that, to no avail.

【Discussion】:

    Tags: ios ipad crop face-detection


    【Solution 1】:

    This is the answer:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        // when do we start face detection
        if (!_canStartDetection) return;

        CIImage *ciimage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
        NSArray *features = [_faceDetector featuresInImage:ciimage options:nil];

        // find a face feature
        for (CIFeature *feature in features) {

            // ignore anything that is not a face feature
            if (![feature isKindOfClass:[CIFaceFeature class]]) continue;

            // face detected
            _canStartDetection = NO;
            CIFaceFeature *faceFeature = (CIFaceFeature *)feature;

            // crop the detected face (stays in Core Image coordinates)
            CIVector *cropRect = [CIVector vectorWithCGRect:faceFeature.bounds];
            CIFilter *cropFilter = [CIFilter filterWithName:@"CICrop"];
            [cropFilter setValue:ciimage forKey:@"inputImage"];
            [cropFilter setValue:cropRect forKey:@"inputRectangle"];
            CIImage *croppedImage = [cropFilter valueForKey:@"outputImage"];
            UIImage *stillImage = [UIImage imageWithCIImage:croppedImage];
        }
    }
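
    The reason the question's original attempt crops the wrong region is most likely a coordinate-space mismatch: CIFaceFeature.bounds is reported in Core Image's bottom-left-origin coordinate space, while CGImageCreateWithImageInRect expects a top-left-origin rect (a JPEG still may also need adjusting for image scale and orientation). The CICrop approach above avoids the problem by staying entirely in Core Image coordinates. For a direct CGImage crop, the rect has to be flipped vertically first; a minimal, framework-free sketch of that flip (Rect and ci_to_cg are illustrative stand-ins for the CGRect math):

    ```c
    #include <stdio.h>

    typedef struct { double x, y, w, h; } Rect;

    /* Convert a bottom-left-origin (Core Image) rect to a top-left-origin
       (CGImage/UIKit) rect by flipping it within the image height. */
    Rect ci_to_cg(Rect r, double imageHeight) {
        Rect out = r;
        out.y = imageHeight - r.y - r.h;
        return out;
    }

    int main(void) {
        /* hypothetical face bounds: CI y = 100, height 200, image 640 px tall */
        Rect face = {50.0, 100.0, 150.0, 200.0};
        Rect flipped = ci_to_cg(face, 640.0);
        printf("%.0f\n", flipped.y); /* 640 - 100 - 200 = 340 */
        return 0;
    }
    ```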

    Note that this time I used AVCaptureVideoDataOutput; the setup code is as follows:

    // set output for face frames
    AVCaptureVideoDataOutput *output2 = [[AVCaptureVideoDataOutput alloc] init];
    [_session addOutput:output2];
    output2.videoSettings = @{(NSString*)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    output2.alwaysDiscardsLateVideoFrames = YES;
    dispatch_queue_t queue = dispatch_queue_create("com.myapp.faceDetectionQueueSerial", DISPATCH_QUEUE_SERIAL);
    [output2 setSampleBufferDelegate:self queue:queue];
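
    A face rect straight from the detector tends to crop very tightly. If some margin around the face is wanted, the CICrop inputRectangle can be expanded and clamped to the image extent before filtering; a framework-free sketch of that rect math (expand_and_clamp is a hypothetical helper, not a Core Image API):

    ```c
    #include <stdio.h>

    typedef struct { double x, y, w, h; } Rect;

    /* Grow a rect by `margin` on every side, clamped to a w x h image. */
    Rect expand_and_clamp(Rect r, double margin, double imgW, double imgH) {
        double x0 = r.x - margin;
        double y0 = r.y - margin;
        double x1 = r.x + r.w + margin;
        double y1 = r.y + r.h + margin;
        if (x0 < 0.0) x0 = 0.0;
        if (y0 < 0.0) y0 = 0.0;
        if (x1 > imgW) x1 = imgW;
        if (y1 > imgH) y1 = imgH;
        Rect out = { x0, y0, x1 - x0, y1 - y0 };
        return out;
    }

    int main(void) {
        /* a face near the corner: the margin is clipped by the image edge */
        Rect face = {10.0, 20.0, 100.0, 100.0};
        Rect crop = expand_and_clamp(face, 30.0, 640.0, 480.0);
        printf("%.0f %.0f %.0f %.0f\n", crop.x, crop.y, crop.w, crop.h); /* 0 0 140 150 */
        return 0;
    }
    ```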
    

    【Discussion】:

    • _faceDetector is an instance of CIDetector, which is easy to initialize, e.g. [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}]