【Question Title】: ARKit - Projection of ARAnchor to 2D space
【Posted】: 2018-09-26 10:22:54
【Question Description】:

I'm trying to project an ARAnchor into 2D space, but I'm running into an orientation issue...

Below is my function, which projects the top-left, top-right, bottom-left, and bottom-right corner positions into 2D space:

/// Returns the projection of an `ARImageAnchor` from the 3D world space
/// detected by ARKit into the 2D space of a view rendering the scene.
///
/// - Parameter anchor: The `ARImageAnchor` to project.
/// - Returns: An optional `CGRect` corresponding to the `ARImageAnchor` projection.
internal func projection(from anchor: ARImageAnchor,
                         alignment: ARPlaneAnchor.Alignment,
                         debug: Bool = false) -> CGRect? {
    guard let camera = session.currentFrame?.camera else {
        return nil
    }

    let refImg = anchor.referenceImage
    let anchor3DPoint = anchor.transform.columns.3

    let size = view.bounds.size
    let width = Float(refImg.physicalSize.width / 2)
    let height = Float(refImg.physicalSize.height / 2)

    /// Upper left corner point
    let projection = ProjectionHelper.projection(from: anchor3DPoint,
                                                 width: width,
                                                 height: height,
                                                 focusAlignment: alignment)
    let topLeft = projection.0
    let topLeftProjected = camera.projectPoint(topLeft,
                                      orientation: .portrait,
                                      viewportSize: size)

    let topRight: simd_float3 = projection.1
    let topRightProjected = camera.projectPoint(topRight,
                                       orientation: .portrait,
                                       viewportSize: size)

    let bottomLeft = projection.2
    let bottomLeftProjected = camera.projectPoint(bottomLeft,
                                         orientation: .portrait,
                                         viewportSize: size)

    let bottomRight = projection.3
    let bottomRightProjected = camera.projectPoint(bottomRight,
                                          orientation: .portrait,
                                          viewportSize: size)

    let result = CGRect(origin: topLeftProjected,
                        size: CGSize(width: topRightProjected.distance(point: topLeftProjected),
                                     height: bottomRightProjected.distance(point: bottomLeftProjected)))

    return result
}
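Both this snippet and the answer below call a `distance(point:)` helper on `CGPoint` that isn't shown in the post and isn't part of Foundation; a minimal sketch, assuming plain Euclidean distance in view space:

```swift
import Foundation

// Hypothetical helper assumed by the snippet above: Euclidean distance
// between two points in the view's coordinate space.
extension CGPoint {
    func distance(point: CGPoint) -> CGFloat {
        let dx = point.x - x
        let dy = point.y - y
        return (dx * dx + dy * dy).squareRoot()
    }
}
```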

This function works well when I'm directly in front of the world origin. However, as soon as I move to the left or right, the computed corner points are no longer correct.
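A likely cause, assuming `ProjectionHelper.projection` simply adds the half-width/half-height offsets to the anchor's translation column: the corner offsets live in the anchor's local space, so they must go through the anchor's full transform, rotation included, not just its position. A platform-neutral sketch of the difference, using the standard library's `SIMD4<Float>` and a hypothetical `Matrix4` stand-in for `simd_float4x4` (column-major, the way ARKit stores transforms):

```swift
// Hypothetical stand-in for simd_float4x4 (Apple-only), storing columns
// column-major the way ARKit's anchor.transform does.
struct Matrix4 {
    var columns: (SIMD4<Float>, SIMD4<Float>, SIMD4<Float>, SIMD4<Float>)

    // Matrix * column vector under the column-major convention.
    func transformed(_ v: SIMD4<Float>) -> SIMD4<Float> {
        return columns.0 * v.x + columns.1 * v.y + columns.2 * v.z + columns.3 * v.w
    }
}

// An anchor rotated 90 degrees about Y and translated to (1, 0, 0).
let anchorTransform = Matrix4(columns: (
    SIMD4<Float>(0, 0, -1, 0),   // local +X now points along world -Z
    SIMD4<Float>(0, 1,  0, 0),
    SIMD4<Float>(1, 0,  0, 0),
    SIMD4<Float>(1, 0,  0, 1)    // translation (columns.3)
))

// A corner offset in the anchor's local space: (halfWidth, 0, -halfHeight, 1).
let localCorner = SIMD4<Float>(0.5, 0, -0.3, 1)

// Wrong: adding the offset straight to the translation ignores rotation.
let naive = SIMD4<Float>(1, 0, 0, 1) + SIMD4<Float>(0.5, 0, -0.3, 0)

// Right: push the local corner through the full transform.
let world = anchorTransform.transformed(localCorner)
// `world` lands at (0.7, 0, -0.5); the naive sum lands at (1.5, 0, -0.3).
```

As soon as the camera (and hence the detected anchor) is rotated relative to the world axes, the two results diverge, which matches the symptom of the corners drifting when moving left or right.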

【Question Discussion】:

  • Are you trying to draw a frame around the detected image?
  • @JoshRobbins I'm trying to get the corner points so that I can project them into 2D space.

Tags: ios swift scenekit arkit


【Solution 1】:

I found a solution that computes the corner 3D points of the `ARImageAnchor` from `anchor.transform` and projects them into 2D space:

    extension simd_float4 {
        var vector_float3: vector_float3 { return simd_float3([x, y, z]) }
    }

    /// Returns the projection of an `ARImageAnchor` from the 3D world space
    /// detected by ARKit into the 2D space of a view rendering the scene.
    ///
    /// - Parameter anchor: The `ARImageAnchor` to project.
    /// - Returns: An optional `CGRect` corresponding to the `ARImageAnchor` projection.
    internal func projection(from anchor: ARImageAnchor) -> CGRect? {
        guard let camera = session.currentFrame?.camera else {
            return nil
        }
        
        let refImg = anchor.referenceImage
        let transform = anchor.transform.transpose

        let size = view.bounds.size
        let width = Float(refImg.physicalSize.width / 2)
        let height = Float(refImg.physicalSize.height / 2)
        
        // Get corner 3D points
        let pointsWorldSpace = [
            matrix_multiply(simd_float4([width, 0, -height, 1]), transform).vector_float3, // top right
            matrix_multiply(simd_float4([width, 0, height, 1]), transform).vector_float3, // bottom right
            matrix_multiply(simd_float4([-width, 0, -height, 1]), transform).vector_float3, // bottom left
            matrix_multiply(simd_float4([-width, 0, height, 1]), transform).vector_float3 // top left
        ]
        
        // Project 3D point to 2D space
        let pointsViewportSpace = pointsWorldSpace.map { (point) -> CGPoint in
            return camera.projectPoint(
                point,
                orientation: .portrait,
                viewportSize: size
            )
        }
        
        // Create a rectangle shape of the projection
        // to calculate the Intersection Over Union of other `ARImageAnchor`
        let result = CGRect(
           origin: pointsViewportSpace[3],
           size: CGSize(
               width: pointsViewportSpace[0].distance(point: pointsViewportSpace[3]),
               height: pointsViewportSpace[1].distance(point: pointsViewportSpace[2])
           )
        )

        return result
    }
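The comment in the code mentions using the resulting rect to compute the Intersection over Union against other `ARImageAnchor` projections. That metric isn't shown in the post; a minimal sketch (hypothetical helper, not from the original answer) could look like:

```swift
import Foundation

/// Hypothetical helper: Intersection over Union of two projected rects.
/// Returns 0 when the rects don't overlap.
func intersectionOverUnion(_ a: CGRect, _ b: CGRect) -> CGFloat {
    // Overlap extents clamped at zero when the rects are disjoint.
    let overlapW = max(0, min(a.maxX, b.maxX) - max(a.minX, b.minX))
    let overlapH = max(0, min(a.maxY, b.maxY) - max(a.minY, b.minY))
    let interArea = overlapW * overlapH
    guard interArea > 0 else { return 0 }
    let unionArea = a.width * a.height + b.width * b.height - interArea
    return interArea / unionArea
}
```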

【Discussion】:

  • Hey, could you help me with my current problem? I tried to use your code sample and I'm getting this error: "Value of type 'simd_float4' (aka 'float4') has no member 'vector_float3'"
  • Hey @PiotrGawłowski, it's just a property that converts a simd_float4 to a vector_float3: extension simd_float4 { var vector_float3: vector_float3 { return simd_float3([x, y, z]) } }
  • What I'm really trying to figure out is how to get the corner positions in 3D space. I tried using pointsWorldSpace and placing a circular geometry node at those vectors. The result: the points are laid out correctly (the shape of the ImageRef) and have the correct size (width/height), but the position is wrong, completely outside the ImageRef's location in space. Any hints on how to handle this?
  • You can get the 3D position from the transform matrix: anchor.transform.columns.3.
  • Hi @YasinNazlıcan, you have to call the projection function when the marker is detected. Specifically, you should call it from the func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) delegate method. Hope this helps ;)