【Question Title】: How to detect an eye blink using the Google Vision API in Android?
【Posted】: 2025-12-20 06:45:09
【Question Description】:

I am using the Vision API for face detection, and now I want to implement blink detection. The problem is that the API also detects "blinks" in a still image (a photo) of a person, not only on a live face.

In addition, I use a Tracker to follow the eye state over time and detect the sequence of events that indicates a blink of the left eye:

left eye open -> left eye closed -> left eye open

The GraphicFaceTracker class is defined as follows:

private class GraphicFaceTracker extends Tracker<Face> {
        private GraphicOverlay mOverlay;
        private FaceGraphic mFaceGraphic;
        private Context context ;

        GraphicFaceTracker(Context context, GraphicOverlay overlay) {
            mOverlay = overlay;
            this.context= context;
            mFaceGraphic = new FaceGraphic(overlay);
        }

        private final float OPEN_THRESHOLD = 0.85f;
        private final float CLOSE_THRESHOLD = 0.4f;

        private int state = 0;


        void blink(float value, final int eyeNo, String whichEye) {
            switch (state) {
                case 0:
                    if (value > OPEN_THRESHOLD) {
                        // Both eyes are initially open
                        state = 1;
                    }
                    break;

                case 1:
                    if (value < CLOSE_THRESHOLD ) {
                        // Both eyes become closed
                        state = 2;
                    }
                    break;

                case 2:
                    if (value > OPEN_THRESHOLD)  {
                        // Both eyes are open again
                        Log.i("BlinkTracker", "blink occurred!");

                        mCameraSource.takePicture(null, new CameraSource.PictureCallback() {
                            @Override
                            public void onPictureTaken(byte[] bytes) {
                                Bitmap bmp = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
                                Log.d("BITMAP", bmp.getWidth() + "x" + bmp.getHeight());
                                System.out.println(bmp.getWidth() + "x" + bmp.getHeight());
                            }
                        });
                        state = 0;
                    }
                    break;
            }


        }

        /**
         * Start tracking the detected face instance within the face overlay.
         */
        @Override
        public void onNewItem(int faceId, Face item) {
            mFaceGraphic.setId(faceId);
        }

        /**
         * Update the position/characteristics of the face within the overlay.
         */
        @Override
        public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
            mOverlay.add(mFaceGraphic);
            mFaceGraphic.updateFace(face);

            float left = face.getIsLeftEyeOpenProbability();
            float right = face.getIsRightEyeOpenProbability();
            if (left == Face.UNCOMPUTED_PROBABILITY)  {
                // At least one of the eyes was not detected.
                return;
            }
            blink(left,0,"left");

            if(right == Face.UNCOMPUTED_PROBABILITY ){
                return ;
            }
        }
}
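The open -> closed -> open sequence can be distilled into a small, framework-free state machine, which makes the logic easy to unit-test without the Android camera stack. This is a sketch using the same thresholds as the tracker code above; the class name BlinkStateMachine is mine:

```java
/** Minimal, framework-free sketch of the open -> closed -> open blink state machine. */
class BlinkStateMachine {
    private static final float OPEN_THRESHOLD = 0.85f;  // same values as the tracker above
    private static final float CLOSE_THRESHOLD = 0.4f;

    // 0 = waiting for the eye to be seen open, 1 = waiting for it to close,
    // 2 = waiting for it to re-open (which completes the blink)
    private int state = 0;

    /** Feed one eye-open probability per frame; returns true when a full blink completes. */
    public boolean onFrame(float openProbability) {
        switch (state) {
            case 0:
                if (openProbability > OPEN_THRESHOLD) state = 1; // eye seen open
                break;
            case 1:
                if (openProbability < CLOSE_THRESHOLD) state = 2; // eye closed
                break;
            case 2:
                if (openProbability > OPEN_THRESHOLD) { // eye re-opened: blink completed
                    state = 0;
                    return true;
                }
                break;
        }
        return false;
    }
}
```

Feeding one eye-open probability per frame, onFrame returns true exactly once per completed blink, which is the point where takePicture would be triggered.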

I have enabled "classifications" so that the detector reports whether the eyes are open or closed:

FaceDetector detector = new FaceDetector.Builder(context)
            .setProminentFaceOnly(true) // optimize for single, relatively large face
            .setTrackingEnabled(true) // enable face tracking
            .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
            .setMode(FaceDetector.FAST_MODE) // for one face this is OK
            .build();

The tracker is then set as the processor that receives face updates from the detector over time. For example, this configuration tracks whether the largest face in view blinks:

Tracker<Face> tracker = new GraphicFaceTracker(this,mGraphicOverlay);
detector.setProcessor(new LargestFaceFocusingProcessor.Builder(detector, tracker).build());

But the code above also detects a blink in a still image of a person, and a photo of a person cannot blink. How can I detect a blink only from a live person in front of the camera?

【Question Discussion】:

    Tags: android google-play-services google-vision android-vision


    【Solution 1】:

    From the Face object you can get the probabilities below. Note that Face.UNCOMPUTED_PROBABILITY means the eye state could not be computed (for example, the eye was not detected), so the open/closed decision should compare the probability against a threshold:

     float leftOpenScore = face.getIsLeftEyeOpenProbability();
     if (leftOpenScore == Face.UNCOMPUTED_PROBABILITY) {
         // left eye state could not be computed
     } else if (leftOpenScore > 0.85f) { // open threshold
         // left eye is open
     } else {
         // left eye is closed
     }

     float rightOpenScore = face.getIsRightEyeOpenProbability();
     if (rightOpenScore == Face.UNCOMPUTED_PROBABILITY) {
         // right eye state could not be computed
     } else if (rightOpenScore > 0.85f) { // open threshold
         // right eye is open
     } else {
         // right eye is closed
     }

    Now you can pass these values to wherever you want to use them.
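To make the threshold logic above concrete and testable without Android, here is a minimal framework-free sketch. The EyeState class name is mine, and the -1.0f sentinel mirrors the value of Face.UNCOMPUTED_PROBABILITY in the Mobile Vision library:

```java
/** Framework-free sketch of interpreting one eye-open probability from the Face API. */
class EyeState {
    // Mirrors Face.UNCOMPUTED_PROBABILITY in the Mobile Vision library.
    static final float UNCOMPUTED = -1.0f;
    static final float OPEN_THRESHOLD = 0.85f;

    /** Returns "unknown", "open", or "closed" for a given eye-open probability. */
    static String classify(float openProbability) {
        if (openProbability == UNCOMPUTED) {
            return "unknown"; // eye not detected, or classification not enabled
        }
        return openProbability > OPEN_THRESHOLD ? "open" : "closed";
    }
}
```

The "unknown" branch matters: treating an uncomputed probability as either open or closed (as the original snippet did) misreads frames where the eye simply was not found.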

    【Discussion】:

      【Solution 2】:

      You can pass the detector to the camera source and handle blink detection from the surface view.

      public class LivelinessScanFragment extends Fragment {
      
          SurfaceView cameraView;
          CameraSource cameraSource;
          final int RequestCameraPermissionID = 1001;
          FaceDetector detector;
      
             @Override
          public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
      
              switch (requestCode) {
                  case RequestCameraPermissionID: {
                      if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                          if (ActivityCompat.checkSelfPermission(getActivity(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                              return;
                          }
                          try {
                              cameraSource.start(cameraView.getHolder());
                          } catch (IOException e) {
                              e.printStackTrace();
                          }
                      }
                  }
              }
          }
      
      
          public LivelinessScanFragment() {
              // Required empty public constructor
          }
      
      
          @Override
          public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                   Bundle savedInstanceState) {
      
                  // Inflate the layout for this fragment
                  View rootView = inflater.inflate(R.layout.fragment_liveliness_scan, container, false);
      
      
      
                  cameraView = (SurfaceView)rootView.findViewById(R.id.surface_view);
      
                  detector = new FaceDetector.Builder(getActivity())
                      .setProminentFaceOnly(true) // optimize for single, relatively large face
                      .setTrackingEnabled(true) // enable face tracking
                      .setClassificationType(/* eyes open and smile */ FaceDetector.ALL_CLASSIFICATIONS)
                      .setMode(FaceDetector.FAST_MODE) // for one face this is OK
                      .build();
      
      
                  if (!detector.isOperational()) {
                      Log.w("MainActivity", "Detector Dependencies are not yet available");
                  } else {
                      cameraSource = new CameraSource.Builder(Application.getContext(), detector)
                              .setFacing(CameraSource.CAMERA_FACING_FRONT)
                              .setRequestedFps(2.0f)
                              .setRequestedPreviewSize(1280, 1024)
                              .setAutoFocusEnabled(true)
                              .build();
      
                      cameraView.getHolder().addCallback(new SurfaceHolder.Callback() {
                          @Override
                          public void surfaceCreated(SurfaceHolder surfaceHolder) {
                              try {
                                  if (ActivityCompat.checkSelfPermission(Application.getContext(), Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
      
                                      ActivityCompat.requestPermissions(getActivity(),
                                              new String[]{Manifest.permission.CAMERA}, RequestCameraPermissionID);
                                      return;
                                  }
                                  cameraSource.start(cameraView.getHolder());
                                  detector.setProcessor(
                                          new LargestFaceFocusingProcessor(detector, new GraphicFaceTracker()));
      
                              } catch (IOException e) {
                                  e.printStackTrace();
                              }
                          }
      
                          @Override
                          public void surfaceChanged(SurfaceHolder surfaceHolder, int i, int i1, int i2) {
      
                          }
      
                          @Override
                          public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
                              cameraSource.stop();
                          }
                      });
      
      
                  }
      
                  return rootView;
              }
      
          private class GraphicFaceTracker extends Tracker<Face> {
      
              private final float OPEN_THRESHOLD = 0.85f;
              private final float CLOSE_THRESHOLD = 0.4f;
      
              private int state = 0;
      
      
              void blink(float value) {
                  switch (state) {
                      case 0:
                          if (value > OPEN_THRESHOLD) {
                              // Both eyes are initially open
                              state = 1;
                          }
                          break;
      
                      case 1:
                          if (value < CLOSE_THRESHOLD ) {
                              // Both eyes become closed
                              state = 2;
                          }
                          break;
      
                      case 2:
                          if (value > OPEN_THRESHOLD)  {
                              // Both eyes are open again
                              Log.i("BlinkTracker", "blink occurred!");
                              state = 0;
      
                          }
                          break;
                  }
      
      
              }
      
              /**
               * Update the position/characteristics of the face within the overlay.
               */
              @Override
              public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
      
                  float left = face.getIsLeftEyeOpenProbability();
                  float right = face.getIsRightEyeOpenProbability();
                  if ((left == Face.UNCOMPUTED_PROBABILITY) ||
                          (right == Face.UNCOMPUTED_PROBABILITY)) {
                      // One of the eyes was not detected.
                      return;
                  }
      
                  float value = Math.min(left, right);
                  blink(value);
              }
          }
      
      
      }
      

      【Discussion】:

      • How do I handle blink detection from the surface view? Is there any code?
      • Read the code above carefully: the detector is passed as a parameter to the camera source, and cameraView is an instance of SurfaceView. It is complete, working code.
      • Show a Toast at the point where the blink is detected.
      • Yes, you can show a Toast there, or implement any other functionality.
      • @IccheGuri If your problem is solved, please accept the answer.
      【Solution 3】:

      Here is a GitHub project, open source eye blink detector for Android, which detects eye blinks in real time on Android. It is implemented on top of the FaceDetector API.

      【Discussion】:

      • Can I use this from my web application code? Is there a cloud API?
      【Solution 4】:

      I think this looks right. If you associate the detector with a running CameraSource instance, as in this example:

      https://developers.google.com/vision/android/face-tracker-tutorial

      this will track eye movement from the camera. I also think you can change the onUpdate code slightly to better determine the blink threshold:

          @Override
          public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
              mOverlay.add(mFaceGraphic);
              mFaceGraphic.updateFace(face);
      
              float left = face.getIsLeftEyeOpenProbability();
              float right = face.getIsRightEyeOpenProbability();
              if ((left == Face.UNCOMPUTED_PROBABILITY) ||
                  (right == Face.UNCOMPUTED_PROBABILITY)) {
                  // One of the eyes was not detected.
                  return;
              }
      
              float value = Math.min(left, right);
              blink(value);
          }
      

      【Discussion】:

      • What is the logic behind passing the minimum of left and right to the blink function?
      • Because the eye with the smaller open probability (larger closed probability) is the stronger evidence of a blink; during a blink, the eyes must close at some point.
      • If you are taking a picture, you probably want both eyes to have a relatively high open probability, since that makes a better picture. However, it also means that a wink, rather than a blink, would trigger taking the picture too.
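The trade-off mentioned in the last comment can be seen directly: taking the minimum of the two probabilities means a wink (one eye closed) produces the same "closed" signal as a real blink. A small sketch, with the helper name MinEyeSignal being mine and the threshold taken from the answers above:

```java
/** Sketch of the min(left, right) combination used in Solution 4's onUpdate. */
class MinEyeSignal {
    static final float CLOSE_THRESHOLD = 0.4f;

    /** True if the combined signal would count this frame as "eyes closed". */
    static boolean looksClosed(float leftOpenProbability, float rightOpenProbability) {
        // The smaller open probability dominates, so one closed eye is enough.
        return Math.min(leftOpenProbability, rightOpenProbability) < CLOSE_THRESHOLD;
    }
}
```

If winks must not trigger the blink handler, the left and right probabilities would need to be fed through separate per-eye state machines instead of being collapsed with min.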