Entity Coordinates in Object Tracking

A second post on the same topic, as I feel I may have overcomplicated the earlier one.

I'm essentially performing object tracking inside Reality Composer Pro and attaching a digital entity to the tracked object. I now want to get the coordinates of this digital entity inside Xcode.

Secondly, can I track more than one object inside the same scene? For example, if I want to find a spanner and a screwdriver among a bunch of tools laid out on a table, spawn an arrow on top of the spanner and the screwdriver, and then get the coordinates of the arrows I spawn, how can I go about this?

Answered by Vision Pro Engineer in 799194022

Hi @adityach

To protect people's privacy, you need to obtain permission to read an entity's position when it's an AnchorEntity (in this case, one anchored to an object) or contained in an AnchorEntity.

Add an entry for NSWorldSensingUsageDescription to your app’s information property list to provide a usage description that explains how your app uses the position.
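For example, the raw Info.plist entry might look like the fragment below (the usage string is only a placeholder; write one that describes your app's actual use of the data):

```xml
<key>NSWorldSensingUsageDescription</key>
<string>This app tracks real-world objects so it can place guidance content on them.</string>
```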

Start a SpatialTrackingSession configured to track objects.

.task {
    let configuration = SpatialTrackingSession.Configuration(
        tracking: [.object])
    
    // Declare this on your view
    session = SpatialTrackingSession()
    await session.run(configuration)
}
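Putting the pieces together, here's a minimal sketch of how the session property and the task could sit in a SwiftUI view. The view name, the "Immersive" scene name, and the RealityKitContent bundle are assumptions for illustration, not from the original answer:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    // Keep the session alive for the lifetime of the view;
    // tracking stops if the session is deallocated.
    @State private var session = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            // "Immersive" is a placeholder for your Reality Composer Pro scene name.
            if let immersiveContentEntity = try? await Entity(
                named: "Immersive", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
            }
        }
        .task {
            // Requesting object tracking triggers the
            // NSWorldSensingUsageDescription permission prompt.
            let configuration = SpatialTrackingSession.Configuration(
                tracking: [.object])
            await session.run(configuration)
        }
    }
}
```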

Find the entity in your Reality Composer Pro scene, then ask for its position relative to a given entity (in this case I pass nil, so the coordinates are in world space). Here's code to do that. Note that until the SpatialTrackingSession is running, the AnchorEntity will report its position as [0, 0, 0].

// named: is the name you gave the entity in Reality Composer Pro.
if let objectEntity = immersiveContentEntity.findEntity(named: "MyObject") {
    // Passing nil returns the entity's position in world space.
    let position = objectEntity.position(relativeTo: nil)
}

First of all, I want to THANK YOU. I'm really surprised to get such a nice and complete reply!!! Amazing!!!

I've created a new project to test your code, and I've run into a simple but (for me) unsolvable problem: Xcode can't find SpatialTrackingSession: "Cannot find 'SpatialTrackingSession' in scope".

I've searched the web and haven't found how to get it. Could you help me, please?

Cheers and THANKS! Mathis

Hello Apple,

OK, understood: it needs visionOS 2 and the latest Xcode beta.

I have no words; your sample runs very well and I'm delighted. I can build many, many things from this.

"I love you"!!!

Thanks and thanks and thanks!!! Mathis
