【Question title】: Can I make a stream with the Flutter camera package?
【Posted】: 2021-04-01 20:35:32
【Question description】:

My goal is to stream the camera feed to a websocket.

First I need the camera data, so I use the startImageStream function, which returns images in YUV format. I found some information about rendering YUV data into an Image widget, but I could not convert it to base64.

The question is: how can I convert this YUV image data to base64? Or is this a bad way to build a stream in Flutter?
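For context, the intended pipeline can be sketched roughly as below. This is a sketch, not a complete solution: `convertFrameToJpeg` is a hypothetical helper standing in for a real YUV-to-JPEG conversion, and the socket side assumes the `web_socket_channel` package.

```dart
import 'dart:convert';

import 'package:camera/camera.dart';
import 'package:web_socket_channel/web_socket_channel.dart';

// Hypothetical helper: turns a CameraImage into JPEG bytes.
// A real implementation needs a YUV -> RGB conversion first.
List<int> convertFrameToJpeg(CameraImage image) => throw UnimplementedError();

void streamToWebSocket(CameraController controller, Uri endpoint) {
  final channel = WebSocketChannel.connect(endpoint);
  controller.startImageStream((CameraImage image) {
    // Encode each frame and push it down the socket as base64 text.
    final jpeg = convertFrameToJpeg(image);
    channel.sink.add(base64.encode(jpeg));
  });
}
```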

【Question discussion】:

    Tags: flutter android-camera


    【Solution 1】:
    controller.startImageStream((image) {
      // convert image here
    
      // assuming it's converted
      final imageBytes = <int>[];
    
      final encodedImage = base64.encode(imageBytes);
    });
    

    As for the YUV -> RGB conversion, it can be done like this (code copied from https://gist.github.com/Alby-o/fe87e35bc21d534c8220aed7df028e03):

    // imgLib -> Image package from https://pub.dartlang.org/packages/image
    import 'package:image/image.dart' as imglib;
    import 'package:camera/camera.dart';
    
    Future<List<int>> convertImagetoPng(CameraImage image) async {
      try {
        imglib.Image img;
        if (image.format.group == ImageFormatGroup.yuv420) {
          img = _convertYUV420(image);
        } else if (image.format.group == ImageFormatGroup.bgra8888) {
          img = _convertBGRA8888(image);
        }
    
        imglib.PngEncoder pngEncoder = new imglib.PngEncoder();
    
        // Convert to png
        List<int> png = pngEncoder.encodeImage(img);
        return png;
      } catch (e) {
        print(">>>>>>>>>>>> ERROR:" + e.toString());
      }
      return null;
    }
    
    // CameraImage BGRA8888 -> PNG
    // Color
    imglib.Image _convertBGRA8888(CameraImage image) {
      return imglib.Image.fromBytes(
        image.width,
        image.height,
        image.planes[0].bytes,
        format: imglib.Format.bgra,
      );
    }
    
    // CameraImage YUV420_888 -> PNG -> Image (compression: 0, filter: none)
    // Grayscale: only the Y (luminance) plane is copied, so the output has no color
    imglib.Image _convertYUV420(CameraImage image) {
      var img = imglib.Image(image.width, image.height); // Create Image buffer
    
      Plane plane = image.planes[0];
      const int shift = (0xFF << 24);
    
      // Fill image buffer with plane[0] from YUV420_888
      for (int x = 0; x < image.width; x++) {
        for (int planeOffset = 0;
            planeOffset < image.height * image.width;
            planeOffset += image.width) {
          final pixelColor = plane.bytes[planeOffset + x];
          // color: 0x FF  FF  FF  FF
          //           A   B   G   R
          // Calculate pixel color
          var newVal = shift | (pixelColor << 16) | (pixelColor << 8) | pixelColor;
    
          img.data[planeOffset + x] = newVal;
        }
      }
    
      return img;
    }
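For reference, the function above reads only plane 0 (the Y plane), which is why the result is grayscale. A full-color conversion that also samples the U and V planes might look like the following sketch. It targets the same old `image` 3.x API as the code above, and it assumes the Android plane layout (plane 0/1/2 = Y/U/V, with the UV strides taken from plane 1):

```dart
import 'package:camera/camera.dart';
import 'package:image/image.dart' as imglib;

imglib.Image _convertYUV420ToColor(CameraImage image) {
  final width = image.width;
  final height = image.height;
  final img = imglib.Image(width, height); // image 3.x buffer

  final yPlane = image.planes[0];
  final uPlane = image.planes[1];
  final vPlane = image.planes[2];
  final uvRowStride = uPlane.bytesPerRow;
  final uvPixelStride = uPlane.bytesPerPixel ?? 1;

  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      final yIndex = y * yPlane.bytesPerRow + x;
      // U and V are subsampled 2x2, hence the integer division by 2.
      final uvIndex = (y ~/ 2) * uvRowStride + (x ~/ 2) * uvPixelStride;

      final yp = yPlane.bytes[yIndex];
      final up = uPlane.bytes[uvIndex];
      final vp = vPlane.bytes[uvIndex];

      // Standard BT.601 YUV -> RGB, clamped to [0, 255].
      final r = (yp + 1.402 * (vp - 128)).round().clamp(0, 255).toInt();
      final g = (yp - 0.344136 * (up - 128) - 0.714136 * (vp - 128))
          .round()
          .clamp(0, 255)
          .toInt();
      final b = (yp + 1.772 * (up - 128)).round().clamp(0, 255).toInt();

      img.setPixelRgba(x, y, r, g, b);
    }
  }
  return img;
}
```

Doing this per pixel in Dart is slow; for real-time use the native approach linked below is likely necessary.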
    

    Some background:

    In the gist above you can find a comment linking to what looks like a more efficient implementation, in the YUV_2_RGB project: https://github.com/alexcohn/YUV_2_RGB/blob/master/lib/service/image_result_processor_service.dart

    【Discussion】:

    • Thanks for the answer. Is the startImageStream func fast enough to build a video-call app?
    • I've never done it, but looking at the numbers in the YUV_2_RGB repo, if you need full HD I'd say no, the conversion is too slow. See whether you can implement a native plugin with better performance.
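On the performance point: `startImageStream` can deliver frames faster than a Dart-side conversion consumes them. A common mitigation (a sketch, not part of the answer) is to drop incoming frames while the previous one is still being processed:

```dart
import 'package:camera/camera.dart';

bool _isProcessing = false;

void startThrottledStream(CameraController controller,
    Future<void> Function(CameraImage) handleFrame) {
  controller.startImageStream((CameraImage image) async {
    // Skip this frame if the previous one is still being converted/sent,
    // so slow conversions don't pile up and exhaust memory.
    if (_isProcessing) return;
    _isProcessing = true;
    try {
      await handleFrame(image);
    } finally {
      _isProcessing = false;
    }
  });
}
```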
    【Solution 2】:

    I solved it. Here is my answer, using a conversion function I found on the internet:

    // Requires dart:ffi and package:ffi (the old `allocate`/`free` API);
    // `conv` below is an FFI binding to a native convertImage function.
    imglib.Image img;
    
    
          if (Platform.isAndroid) {
            // Allocate memory for the 3 planes of the image
            Pointer<Uint8> p =
                allocate(count: _savedImage.planes[0].bytes.length);
            Pointer<Uint8> p1 =
                allocate(count: _savedImage.planes[1].bytes.length);
            Pointer<Uint8> p2 =
                allocate(count: _savedImage.planes[2].bytes.length);
    
            // Assign the planes data to the pointers of the image
            Uint8List pointerList =
                p.asTypedList(_savedImage.planes[0].bytes.length);
            Uint8List pointerList1 =
                p1.asTypedList(_savedImage.planes[1].bytes.length);
            Uint8List pointerList2 =
                p2.asTypedList(_savedImage.planes[2].bytes.length);
            pointerList.setRange(0, _savedImage.planes[0].bytes.length,
                _savedImage.planes[0].bytes);
            pointerList1.setRange(0, _savedImage.planes[1].bytes.length,
                _savedImage.planes[1].bytes);
            pointerList2.setRange(0, _savedImage.planes[2].bytes.length,
                _savedImage.planes[2].bytes);
    
            // Call the convertImage function and convert the YUV to RGB
            Pointer<Uint32> imgP = conv(
                p,
                p1,
                p2,
                _savedImage.planes[1].bytesPerRow,
                _savedImage.planes[1].bytesPerPixel,
                _savedImage.planes[0].bytesPerRow,
                _savedImage.height);
    
            // Get the pointer of the data returned from the function to a List
            List imgData = imgP.asTypedList(
                (_savedImage.planes[0].bytesPerRow * _savedImage.height));
            // Generate image from the converted data
            img = imglib.Image.fromBytes(_savedImage.height, _savedImage.planes[0].bytesPerRow, imgData);
    
            // Free the memory space allocated
            // from the planes and the converted data
            free(p);
            free(p1);
            free(p2);
            free(imgP);
          }
          else if (Platform.isIOS) {
            img = imglib.Image.fromBytes(
              _savedImage.planes[0].bytesPerRow,
              _savedImage.height,
              _savedImage.planes[0].bytes,
              format: imglib.Format.bgra,
            );
          }
    

    After this code gives you an imglib.Image object, I convert that image to JPG with this function:

          List<int> a = imglib.encodeJpg(img, quality: 100);
          var resp = await sendImage(base64.encode(a));
          print(resp.body);
    

    I sent it to an API to view the image, and it works.

    Without encoding I could not convert it to base64, so the key point is:

    List<int> a = imglib.encodeJpg(img, quality: 100);
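The `sendImage` helper is not shown in the answer. A hypothetical version posting the base64 string to an HTTP endpoint (the URL and field name are placeholders, not from the original) could look like:

```dart
import 'package:http/http.dart' as http;

// Hypothetical helper: POST the base64-encoded JPEG to an API endpoint.
Future<http.Response> sendImage(String base64Jpeg) {
  return http.post(
    Uri.parse('https://example.com/api/frames'),
    body: {'image': base64Jpeg},
  );
}
```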

    【Discussion】:
