arkit · blender · realitykit · usdz

3D model made in Blender not appearing on screen where expected


I created an object (a face) in Blender, exported it as a .obj, then used Reality Converter to convert it to a .usdz file.

I imported the .usdz file into my Xcode project. This is how I've attempted to anchor the Blender object to the user's face:

let cat = try! Entity.loadAnchor(named: "alienCat")
uiView.scene.addAnchor(cat)

When I use the above code with a simple object I created in Reality Composer, it works as expected and is anchored perfectly on the user's face.

But when I try the same code with my custom-made .usdz file from Blender, the object doesn't appear on screen.

As I rotate the phone, though, I see a large, blown-up version of it floating above me. I've attached two screenshots:

  1. The camera facing my face: my face appears black (expected), but no object is attached to it (unexpected).


  2. When I rotate the phone, I see my object floating way above my head and much bigger than expected.

When I run the code, this is the error I get in the Xcode console:

Warning: in AppendProperty at line 858 of sdf/path.cpp -- Can only append a property 'preliminary:anchoring:type' to a prim path (/)
Warning: in AppendProperty at line 858 of sdf/path.cpp -- Can only append a property 'triggers' to a prim path (/)

I believe these errors originate from how I'm creating my .obj in Blender, and I'd like to fix them in Blender if possible, so that I can just import a .usdz and have it work as expected. There was an answer posted here: RealityKit – Getting runtime warning when placing a model in ARView by Andy Jazz, where he says: "If you setup a preliminary anchoring for USDZ model, then Xcode will not print such warnings."

Is there a way to set up this preliminary anchoring in Blender?

How do I get rid of these warnings and have my .usdz object placed on the face as expected, and at an appropriate size?


Solution

  • Scaling issue

    When exporting from Blender or Maya, the model's scale is 100:1 by default (because you're exporting the model in centimeter scale, not RealityKit's meter scale). When running facial tracking, the distance between the mask's pivot point and the mask's surface is always greater than zero, which means that at a 100:1 scale this distance is magnified a hundred times; the pivot point can even end up outside the model's bounding box. Also check which of the Blender model's axes points up: positive Y or positive Z. Scale the anchor down to compensate (a fuller sketch follows after this snippet):

    anchor.scale /= 100    // compensate for the 100:1 centimeter-to-meter export scale
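
For completeness, here is a minimal sketch of the whole setup, assuming the USDZ is loaded as a plain model and anchored to the face in code rather than relying on anchoring data baked into the file (which Blender does not write). The function name, the file name "alienCat", and the exact scale factor are assumptions you may need to adjust:

    import ARKit
    import RealityKit

    func attachFaceModel(to arView: ARView) {
        // Face tracking requires the front camera configuration.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        arView.session.run(ARFaceTrackingConfiguration())

        // Load the USDZ as a plain model (no embedded anchoring required).
        guard let model = try? Entity.loadModel(named: "alienCat") else { return }

        // Compensate for the 100:1 centimeter-to-meter export scale.
        model.scale /= 100

        // Anchor the model to the tracked face in code.
        let faceAnchor = AnchorEntity(.face)
        faceAnchor.addChild(model)
        arView.scene.addAnchor(faceAnchor)
    }

With this approach the anchoring is defined in RealityKit code instead of inside the USDZ, so the file itself does not need the preliminary anchoring setup.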