【Question Title】: Can't obtain output buffer from iOS camera in Kotlin Multiplatform
【Posted】: 2020-11-06 12:05:32
【Question】:

I have a Kotlin Multiplatform multi-module project in which one module handles the camera. I have separated the Android and iOS source sets for this particular module. Using Kotlin/Native for iOS, I can successfully add the video input and video output and display the preview in a layer, but I can't get the sample buffer inside the AVCaptureVideoDataOutputSampleBufferDelegateProtocol delegate.

I get this error:

Uncaught Kotlin exception: kotlin.native.IncorrectDereferenceException: illegal attempt to access non-shared <object>@8121d008 from other thread
    at 0   MyFramework                      0x000000010118fd3c kfun:kotlin.Throwable#<init>(kotlin.String?){} + 92
    at 1   MyFramework                      0x0000000101188cf0 kfun:kotlin.Exception#<init>(kotlin.String?){} + 88
    at 2   MyFramework                      0x000000010118883c kfun:kotlin.RuntimeException#<init>(kotlin.String?){} + 88
    at 3   MyFramework                      0x00000001011b7074 kfun:kotlin.native.IncorrectDereferenceException#<init>(kotlin.String){} + 88
    at 4   MyFramework                      0x00000001011ba8cc ThrowIllegalObjectSharingException + 428
    at 5   MyFramework                      0x00000001015fe100 TryAddHeapRef + 0
    at 6   MyFramework                      0x00000001015fddc4 _ZN27BackRefFromAssociatedObject9tryAddRefEv + 108
    at 7   MyFramework                      0x00000001015ed7f4 _ZN12_GLOBAL__N_113_tryRetainImpEP11objc_objectP13objc_selector + 64
    at 8   libobjc.A.dylib                     0x00000002202f1f94 objc_loadWeakRetained + 348
    at 9   libobjc.A.dylib                     0x00000002202f2060 objc_loadWeak + 20
    at 10  AVFoundation                        0x00000002271f73b8 <redacted> + 32
    at 11  libdispatch.dylib                   0x00000001049e8de4 _dispatch_client_callout + 16
    at 12  libdispatch.dylib                   0x00000001049f7fe4 _dispatch_sync_invoke_and_complete + 124
    at 13  AVFoundation                        0x00000002271f72c8 <redacted> + 164
    at 14  AVFoundation                        0x00000002271f715c <redacted> + 36
    at 15  AVFoundation                        0x00000002271d7fa4 <redacted> + 124
    at 16  AVFoundation                        0x00000002271d7ce8 <redacted> + 100
    at 17  CoreMedia                           0x00000002246ffdfc <redacted> + 280
    at 18  CoreMedia                           0x000000022471d4cc <redacted> + 224
    at 19  libdispatch.dylib                   0x00000001049e8de4 _dispatch_client_callout + 16
    at 20  libdispatch.dylib                   0x00000001049ec1e0 _dispatch_continuation_pop + 528
    at 21  libdispatch.dylib                   0x00000001049fecac _dispatch_source_invoke + 1864
    at 22  libdispatch.dylib                   0x00000001049f0cd8 _dispatch_lane_serial_drain + 288
    at 23  libdispatch.dylib                   0x00000001049f1bb4 _dispatch_lane_invoke + 516
    at 24  libdispatch.dylib                   0x00000001049f0cd8 _dispatch_lane_serial_drain + 288
    at 25  libdispatch.dylib                   0x00000001049f1b7c _dispatch_lane_invoke + 460
    at 26  libdispatch.dylib                   0x00000001049fbc18 _dispatch_workloop_worker_thread + 1220
    at 27  libsystem_pthread.dylib             0x0000000220d260f0 _pthread_wqthread + 312
    at 28  libsystem_pthread.dylib             0x0000000220d28d00 start_wqthread + 4

I'm guessing this is related to a threading issue, perhaps when the frames are delivered to the delegate, but I'm not sure. I tried using @ThreadLocal without success.

Here is the code:

class MyClass(val listener: MyListener) {

    ...

    private val sampleDelegate: AVCaptureVideoDataOutputSampleBufferDelegateProtocol =
        object : NSObject(), AVCaptureVideoDataOutputSampleBufferDelegateProtocol {
            override fun captureOutput(output: AVCaptureOutput, didOutputSampleBuffer: CMSampleBufferRef?, fromConnection: AVCaptureConnection) {
                // Not called because of the error above
            }
        }

    ...

    fun addVideoOutput(captureDevicePosition: AVCaptureDevicePosition) {
        val videoOutput = AVCaptureVideoDataOutput()
        val myqueue = dispatch_queue_create("cameraQueue", null)
        videoOutput.setSampleBufferDelegate(sampleDelegate, myqueue)
        val format = kCVPixelFormatType_32BGRA
        videoOutput.videoSettings = NSDictionary.dictionaryWithObject(
            NSNumber.numberWithUnsignedInt(format),
            kCVPixelBufferPixelFormatTypeKey as NSCopyingProtocol
        )
        if (captureSession.canAddOutput(videoOutput)) {
            captureSession.addOutput(videoOutput)
        }
        val conn = videoOutput.connectionWithMediaType(AVMediaTypeVideo)
        if (conn!!.isVideoOrientationSupported()) {
            conn.videoOrientation = captureDevicePosition
        }
        videoOutput.alwaysDiscardsLateVideoFrames = true
        captureSession.sessionPreset = AVCaptureSessionPresetHigh
    }


【Comments】:

    标签: kotlin-multiplatform ios-camera kotlin-native


【Solution 1】:

    Using

    val myqueue = dispatch_get_main_queue()
    

    solved the problem.
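
    The IncorrectDereferenceException comes from Kotlin/Native's legacy (strict) memory model: the Kotlin delegate object is created on the main thread, but AVFoundation invokes it on the background dispatch queue, which counts as illegal cross-thread access to a non-shared object. Delivering frames on the main queue keeps the delegate on the thread that created it. A minimal sketch of the relevant change, assuming the same sampleDelegate property from the question (the rest of addVideoOutput stays as posted):

    ```kotlin
    import platform.AVFoundation.AVCaptureVideoDataOutput
    import platform.darwin.dispatch_get_main_queue

    fun addVideoOutput(/* ... */) {
        val videoOutput = AVCaptureVideoDataOutput()
        // Deliver sample buffers on the main queue: the Kotlin delegate is
        // then only ever touched from the thread that created it, so no
        // cross-thread sharing and no IncorrectDereferenceException.
        videoOutput.setSampleBufferDelegate(sampleDelegate, dispatch_get_main_queue())
        // ... remaining setup as in the question ...
    }
    ```

    Note the trade-off: processing frames on the main queue competes with UI work, so keeping alwaysDiscardsLateVideoFrames = true matters. Alternatives (not shown above) are freezing the delegate with kotlin.native.concurrent.freeze() under the legacy memory model, or upgrading to the new Kotlin/Native memory model (default since Kotlin 1.7.20), which allows sharing objects across threads and makes the background queue work as written.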

    【Discussion】:
