【Title】: iOS: Capture image from front facing camera
【Posted】: 2012-04-17 20:55:51
【Question】:

I'm building an app where I want to capture an image from the front-facing camera without showing any kind of capture screen. I want to take the picture entirely in code, without any user interaction. How would I do this for the front-facing camera?

【Comments】:

  • Do you mean silently capturing an image without the user knowing?
  • Yes, I know it sounds bad, but it's entirely harmless. The app will get them to pull a funny face, and I want to capture it so they can see how silly they look.
  • Your implementation of such a feature may be harmless, but I can think of plenty of other cases where it wouldn't be (which may be why it isn't possible).

Tags: ios camera


【Solution 1】:

How to capture an image from the front-facing camera using AVFoundation:

Development notes: check the lines marked // CHECK FOR YOUR APP below; the preview frame and orientation depend on your app's interface orientation.

CameraViewController.h

// Frameworks
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

// Camera
@property (weak, nonatomic) IBOutlet UIImageView* cameraImageView;
@property (strong, nonatomic) AVCaptureDevice* device;
@property (strong, nonatomic) AVCaptureSession* captureSession;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* previewLayer;
@property (strong, nonatomic) UIImage* cameraImage;

@end

CameraViewController.m

#import "CameraViewController.h"

@implementation CameraViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    [self setupCamera];
    [self setupTimer];
}

- (void)setupCamera
{    
    NSArray* devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for(AVCaptureDevice *device in devices)
    {
        if([device position] == AVCaptureDevicePositionFront)
            self.device = device;
    }

    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
    AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    NSString* key = (NSString *) kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:input];
    [self.captureSession addOutput:output];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];

    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    // CHECK FOR YOUR APP
    self.previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.height, self.view.frame.size.width);
    self.previewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;
    // CHECK FOR YOUR APP

    [self.view.layer insertSublayer:self.previewLayer atIndex:0];   // Comment-out to hide preview layer

    [self.captureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    self.cameraImage = [UIImage imageWithCGImage:newImage scale:1.0f orientation:UIImageOrientationDownMirrored];

    CGImageRelease(newImage);

    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
}

- (void)setupTimer
{
    // No need to keep a reference; the run loop retains a repeating timer
    [NSTimer scheduledTimerWithTimeInterval:2.0 target:self selector:@selector(snapshot) userInfo:nil repeats:YES];
}

- (void)snapshot
{
    NSLog(@"SNAPSHOT");
    self.cameraImageView.image = self.cameraImage;  // Comment-out to hide snapshot
}

@end

Hook this up to a UIViewController with a UIImageView for the snapshot and it will work! Snapshots are taken programmatically at 2.0-second intervals without any user input. Comment out the marked lines to remove the preview layer and the snapshot feedback.
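
One detail not covered in the answer: since iOS 10, the app must declare NSCameraUsageDescription in its Info.plist or the capture session will fail, and it can help to request camera access explicitly before calling setupCamera. A minimal sketch in Swift (the Objective-C equivalent is AVCaptureDevice's requestAccessForMediaType:completionHandler:):

import AVFoundation

// Not part of the original answer: ask for camera permission up front.
// Requires an NSCameraUsageDescription entry in Info.plist on iOS 10+.
AVCaptureDevice.requestAccess(for: .video) { granted in
    DispatchQueue.main.async {
        if granted {
            // Safe to configure and start the capture session here.
        }
    }
}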

Any other questions/comments, let me know!

【Discussion】:

  • Very nice! I'd suggest accepting this answer over mine (assuming it works).
  • Is this Apple App Store friendly?
  • I'm not sure, it's the first time I've thought about an app like this. I guess you'd need to dig into the fine print and really let users/Apple know it isn't being used for anything sinister (as mentioned in the other posts here). Your app does sound fun and harmless, so maybe it'll be fine!
  • Yes, I agree, but the app review process is very strict and (sometimes) pointless. Many of the policies are arbitrary and make no accommodation for "harmless" apps. Thanks for the code!
  • Ricardo, thanks a million for the excellent answer. The "no preview" aspect aside, it's simply a great example of CoreMedia, AVFoundation, etc. Thanks again!
【Solution 2】:

The code above converted to Swift 4:

import UIKit
import AVFoundation

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var cameraImageView: UIImageView!

    var device: AVCaptureDevice?
    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var cameraImage: UIImage?

    override func viewDidLoad() {
        super.viewDidLoad()

        setupCamera()
        setupTimer()
    }

    func setupCamera() {
        let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                                mediaType: .video,
                                                                position: .front)
        device = discoverySession.devices.first

        let input: AVCaptureDeviceInput
        do {
            input = try AVCaptureDeviceInput(device: device!)
        } catch {
            return
        }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true

        // Sample buffers are delivered on this background queue
        let queue = DispatchQueue(label: "cameraQueue")
        output.setSampleBufferDelegate(self, queue: queue)
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

        captureSession = AVCaptureSession()
        captureSession?.addInput(input)
        captureSession?.addOutput(output)
        captureSession?.sessionPreset = .photo

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
        previewLayer?.videoGravity = .resizeAspectFill
        previewLayer?.frame = CGRect(x: 0.0, y: 0.0, width: view.frame.width, height: view.frame.height)

        view.layer.insertSublayer(previewLayer!, at: 0)   // Comment out to hide the preview layer

        captureSession?.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)

        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let newContext = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo:
            CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)

        if let newImage = newContext?.makeImage() {
            // Match the Objective-C version's front-camera orientation
            cameraImage = UIImage(cgImage: newImage, scale: 1.0, orientation: .downMirrored)
        }

        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    }

    func setupTimer() {
        _ = Timer.scheduledTimer(timeInterval: 2.0, target: self, selector: #selector(snapshot), userInfo: nil, repeats: true)
    }

    @objc func snapshot() {
        print("SNAPSHOT")
        cameraImageView.image = cameraImage   // Comment out to hide the snapshot
    }
}
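
A caveat on threading (my note, not part of the answer): the sample buffer delegate fires on the background cameraQueue, while the timer-driven snapshot() runs on the main thread. If you wanted to push frames to the image view directly from the delegate callback instead of polling, you would hop to the main queue, e.g.:

// Hypothetical variation: update the UI straight from captureOutput(_:didOutput:from:).
DispatchQueue.main.async { [weak self] in
    self?.cameraImageView.image = self?.cameraImage
}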

【Discussion】:

【Solution 3】:

You will probably want to use AVFoundation to capture the video stream/image without displaying it. Unlike UIImagePickerController, it doesn't work "out of the box". Take a look at Apple's AVCam sample code as an example.
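
For a more current take (my sketch, not from the original answer): on iOS 11+ the AVCam approach boils down to attaching an AVCapturePhotoOutput to the session and requesting a capture programmatically. A minimal sketch, assuming a session configured as in Solution 1 with an AVCapturePhotoOutput added to it; StillCapturer is an illustrative name:

import UIKit
import AVFoundation

// Hypothetical helper: captures a single still from an already-running
// AVCaptureSession that has this AVCapturePhotoOutput attached (iOS 11+).
class StillCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    private let photoOutput: AVCapturePhotoOutput
    private var completion: ((UIImage?) -> Void)?

    init(photoOutput: AVCapturePhotoOutput) {
        self.photoOutput = photoOutput
        super.init()
    }

    func capture(_ completion: @escaping (UIImage?) -> Void) {
        self.completion = completion
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        // fileDataRepresentation() returns the encoded (JPEG/HEIF) photo data.
        completion?(photo.fileDataRepresentation().flatMap { UIImage(data: $0) })
    }
}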

【Discussion】:

【Solution 4】:

In case anyone is still looking for a solution in 2017, here's the code above converted from Objective-C to Swift 3.

import UIKit
import AVFoundation

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var cameraImageView: UIImageView!

    var device: AVCaptureDevice?
    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var cameraImage: UIImage?

    override func viewDidLoad() {
        super.viewDidLoad()

        setupCamera()
        setupTimer()
    }

    func setupCamera() {
        let discoverySession = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera],
                                                               mediaType: AVMediaTypeVideo,
                                                               position: .front)
        device = discoverySession?.devices[0]

        let input: AVCaptureDeviceInput
        do {
            input = try AVCaptureDeviceInput(device: device)
        } catch {
            return
        }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true

        let queue = DispatchQueue(label: "cameraQueue")
        output.setSampleBufferDelegate(self, queue: queue)
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: kCVPixelFormatType_32BGRA]

        captureSession = AVCaptureSession()
        captureSession?.addInput(input)
        captureSession?.addOutput(output)
        captureSession?.sessionPreset = AVCaptureSessionPresetPhoto

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewLayer?.frame = CGRect(x: 0.0, y: 0.0, width: view.frame.width, height: view.frame.height)

        view.layer.insertSublayer(previewLayer!, at: 0)

        captureSession?.startRunning()
    }

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)

        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let newContext = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo:
            CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)

        if let newImage = newContext?.makeImage() {
            cameraImage = UIImage(cgImage: newImage)
        }

        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    }

    func setupTimer() {
        _ = Timer.scheduledTimer(timeInterval: 2.0, target: self, selector: #selector(snapshot), userInfo: nil, repeats: true)
    }

    func snapshot() {
        print("SNAPSHOT")
        cameraImageView.image = cameraImage
    }
}

I also found a shorter way to get an image from the CMSampleBuffer:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    let myPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let myCIimage = CIImage(cvPixelBuffer: myPixelBuffer!)
    let videoImage = UIImage(ciImage: myCIimage)
    cameraImage = videoImage
}
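
One caveat worth noting with the shorter approach (my addition, not the answer's): a UIImage created with init(ciImage:) is not backed by a CGImage, so some consumers may not handle it. If that bites, render it through a CIContext first; a sketch, with the context ideally created once and reused rather than per frame:

let context = CIContext()
// Render the CIImage into a CGImage so the UIImage is fully materialized.
if let cgImage = context.createCGImage(myCIimage, from: myCIimage.extent) {
    cameraImage = UIImage(cgImage: cgImage)
}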
      

【Discussion】:

  • No problem, glad it was useful. Not sure whether it still works in Swift 4 without warnings popping up..
  • Not just warnings, a few things need changing, but Fix-It mostly takes care of it.
【Solution 5】:

The documentation for the UIImagePickerController class has a method called takePicture. It says:

Use this method in conjunction with a custom overlay view to initiate the programmatic capture of a still image. This supports taking more than one picture without leaving the interface, but requires that you hide the default image picker controls.
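
A minimal sketch of that approach (my addition, not from the original answer), assuming the presenting view controller conforms to UIImagePickerControllerDelegate and UINavigationControllerDelegate:

import UIKit

// Hypothetical setup: hide the default controls, then trigger takePicture()
// programmatically once the picker is on screen.
let picker = UIImagePickerController()
picker.sourceType = .camera
picker.cameraDevice = .front
picker.showsCameraControls = false
// picker.delegate = self   // receive the result in imagePickerController(_:didFinishPickingMediaWithInfo:)
// present(picker, animated: false) {
//     picker.takePicture()
// }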

【Discussion】:
