Tags: ios, swift, scenekit, arkit, arworldmap

Persist object orientations in ARKit ARWorldMap


I'm trying to persist a model in ARKit using ARWorldMap. I can save and load the models, but the orientation I apply to an object before saving is not persisted with it.

What I'm currently doing

Objects are saved and loaded:

  /// - Tag: GetWorldMap
  @objc func saveExperience(_ button: UIButton) {
    sceneView.session.getCurrentWorldMap { worldMap, error in
      guard let map = worldMap
        else { self.showAlert(title: "Can't get current world map", message: error!.localizedDescription); return }

      // Add a snapshot image indicating where the map was captured.
      guard let snapshotAnchor = SnapshotAnchor(capturing: self.sceneView) else {
        fatalError("Can't take snapshot")

      }
      map.anchors.append(snapshotAnchor)

      do {
        let data = try NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
        try data.write(to: self.mapSaveURL, options: [.atomic])
        DispatchQueue.main.async {
          self.loadExperienceButton.isHidden = false
          self.loadExperienceButton.isEnabled = true
        }
      } catch {
        fatalError("Can't save map: \(error.localizedDescription)")
      }
    }
  }

  /// - Tag: RunWithWorldMap
  @objc func loadExperience(_ button: UIButton) {

    /// - Tag: ReadWorldMap
    let worldMap: ARWorldMap = {
      guard let data = mapDataFromFile
        else { fatalError("Map data should already be verified to exist before Load button is enabled.") }
      do {
        guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data)
          else { fatalError("No ARWorldMap in archive.") }
        return worldMap
      } catch {
        fatalError("Can't unarchive ARWorldMap from file data: \(error)")
      }
    }()

    // Display the snapshot image stored in the world map to aid user in relocalizing.
    if let snapshotData = worldMap.snapshotAnchor?.imageData,
      let snapshot = UIImage(data: snapshotData) {
      self.snapshotThumbnail.image = snapshot
    } else {
      print("No snapshot image in world map")
    }
    // Remove the snapshot anchor from the world map since we do not need it in the scene.
    worldMap.anchors.removeAll(where: { $0 is SnapshotAnchor })

    let configuration = self.defaultConfiguration // this app's standard world tracking settings
    configuration.initialWorldMap = worldMap
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

    isRelocalizingMap = true
    virtualObjectAnchor = nil
  }

Rotation:

@objc func didRotate(_ gesture: UIRotationGestureRecognizer) {
    // Note: the rotation is applied to the scene's root node, not to the node
    // parented to the object's anchor.
    let objectRotation = sceneView.scene.rootNode.eulerAngles.y - Float(gesture.rotation)
    sceneView.scene.rootNode.eulerAngles.y = objectRotation
    gesture.rotation = 0
}

And then it's rendered:

  func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard anchor.name == virtualObjectAnchorName else {
      return
    }

    // save the reference to the virtual object anchor when the anchor is added from relocalizing
    if virtualObjectAnchor == nil {
      virtualObjectAnchor = anchor
    }
    node.addChildNode(virtualObject)
  }

How can I go about doing this? I have tried multiple solutions, but the orientation is never kept. The object loads at the correct position, but rotation and scaling are never kept, even if I apply them to the root node. The only option I can see is to also store the transform as a separate data object, then load and apply it, but it seems like it should be possible to store this data with the object.
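For reference, here is a minimal sketch of that separate-file fallback, assuming a `virtualObject` node and a hypothetical `transformSaveURL` next to `mapSaveURL` (the `SavedTransform` type is invented for this example):

  import Foundation
  import SceneKit

  /// Illustrative only: a small Codable value that captures a node's rotation and
  /// scale so it can be written to disk next to the world map file.
  struct SavedTransform: Codable {
    var eulerAngles: [Float]  // x, y, z in radians
    var scale: [Float]        // x, y, z

    init(node: SCNNode) {
      eulerAngles = [node.eulerAngles.x, node.eulerAngles.y, node.eulerAngles.z]
      scale = [node.scale.x, node.scale.y, node.scale.z]
    }

    func apply(to node: SCNNode) {
      node.eulerAngles = SCNVector3(x: eulerAngles[0], y: eulerAngles[1], z: eulerAngles[2])
      node.scale = SCNVector3(x: scale[0], y: scale[1], z: scale[2])
    }
  }

  // Save alongside the world map (e.g. at the end of saveExperience):
  //   let data = try JSONEncoder().encode(SavedTransform(node: virtualObject))
  //   try data.write(to: transformSaveURL, options: [.atomic])
  //
  // Re-apply after relocalization (e.g. in renderer(_:didAdd:for:)):
  //   if let data = try? Data(contentsOf: transformSaveURL),
  //      let saved = try? JSONDecoder().decode(SavedTransform.self, from: data) {
  //     saved.apply(to: virtualObject)
  //   }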


Solution

  • Apple's documentation for ARWorldMap shows that the properties of the `ARWorldMap` class are `anchors: [ARAnchor]`, `center: simd_float3`, and `extent: simd_float3`.

    When you archive a world map, this is the only information that gets saved. Any changes made to the nodes attached to those anchors during the session (e.g. a node's scale and orientation) are not saved along with the world map when it is archived.

    I remember watching a WWDC session in which Apple demoed SwiftShot, a multiplayer AR game where players knock objects over with balls. They provided the source code, and I noticed they use a custom ARAnchor subclass called BoardAnchor to store additional information in the anchor, such as the size of the game board. See: SwiftShot: Creating a Game for Augmented Reality.

    You can use the same approach to store, for example, the scale and orientation of a node, so that when you unarchive the world map and it gets relocalized, you can use ARSCNViewDelegate's renderer(_:didAdd:for:) to rotate and scale the node based on the information stored in your custom ARAnchor. A rough sketch of that idea follows below.
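    The subclass name `VirtualObjectAnchor` and its `yRotation`/`scale` properties below are assumptions for illustration, not part of the original project; the pattern itself mirrors SwiftShot's BoardAnchor: the anchor carries the extra values through secure coding, so they are archived and restored together with the world map.

      import ARKit
      import SceneKit

      /// Illustrative ARAnchor subclass (modeled on SwiftShot's BoardAnchor) that carries
      /// the object's rotation about the Y axis and its uniform scale, so both values are
      /// archived into the ARWorldMap and come back after relocalization.
      class VirtualObjectAnchor: ARAnchor {
        let yRotation: Float
        let scale: Float

        init(name: String, transform: simd_float4x4, yRotation: Float, scale: Float) {
          self.yRotation = yRotation
          self.scale = scale
          super.init(name: name, transform: transform)
        }

        // ARKit copies anchors internally (e.g. while relocalizing), so the extra
        // properties have to be carried over here as well.
        required init(anchor: ARAnchor) {
          let other = anchor as! VirtualObjectAnchor
          self.yRotation = other.yRotation
          self.scale = other.scale
          super.init(anchor: anchor)
        }

        // MARK: NSSecureCoding
        override class var supportsSecureCoding: Bool { return true }

        required init?(coder aDecoder: NSCoder) {
          self.yRotation = aDecoder.decodeFloat(forKey: "yRotation")
          self.scale = aDecoder.decodeFloat(forKey: "scale")
          super.init(coder: aDecoder)
        }

        override func encode(with aCoder: NSCoder) {
          super.encode(with: aCoder)
          aCoder.encode(yRotation, forKey: "yRotation")
          aCoder.encode(scale, forKey: "scale")
        }
      }

      // Re-apply the stored values when the relocalized anchor comes back, using the same
      // virtualObject / virtualObjectAnchor properties as in the question's delegate method:
      func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let objectAnchor = anchor as? VirtualObjectAnchor else { return }
        if virtualObjectAnchor == nil {
          virtualObjectAnchor = objectAnchor
        }
        virtualObject.eulerAngles.y = objectAnchor.yRotation
        virtualObject.scale = SCNVector3(x: objectAnchor.scale, y: objectAnchor.scale, z: objectAnchor.scale)
        node.addChildNode(virtualObject)
      }

    Because the properties above are immutable (and an anchor's transform can't be changed anyway), one way to keep them current is to remove the old anchor and add a fresh VirtualObjectAnchor with the latest rotation and scale after the user finishes a gesture, before calling getCurrentWorldMap.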