Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Posts under Reality Composer tag

58 Posts

Is there a way to make an .objcap file from a .USDZ file
I have designed a 3D object and exported it as a USDZ, and I also 3D printed that object. I want to use the printed object as a 3D trigger for an AR experience I am building.

My question: is there a process that would let me take the .usdz file and convert it to a .arobject or a .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I used the "scan" option when setting up my scene, but the resolution/fidelity seems really low and the results I get are just mediocre. I would love to take the 3D USDZ I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, and if so, what is the process? I am able to take the 3D scan from Reality Composer (which is exported as a .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other direction: .usdz > .objcap or .arobject.

I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter, and I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that would start a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
Replies: 1 · Boosts: 0 · Views: 156 · Activity: 1w
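For the detection side of this workflow: a scanned .arobject is consumed by ARKit as an ARReferenceObject handed to a world-tracking session, and there is no documented path that converts a USDZ mesh into one. A minimal sketch of that detection setup, assuming a hypothetical bundled file named gray_print.arobject:

    import ARKit

    // Load a scanned reference object ("gray_print.arobject" is a
    // hypothetical bundle resource) and enable object detection.
    func makeDetectionConfiguration() throws -> ARWorldTrackingConfiguration {
        let url = Bundle.main.url(forResource: "gray_print", withExtension: "arobject")!
        let referenceObject = try ARReferenceObject(archiveURL: url)

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionObjects = [referenceObject]
        return configuration
    }

When the session reports an ARObjectAnchor, parenting the textured USDZ to that anchor gives the overlay-follows-print behavior described above, within the limits of how often ARKit refreshes object anchors.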
RealityKit scene with the Entity Component System
I'm following the WWDC session on interactive 3D content in Reality Composer Pro and Apple's documentation: https://developer.apple.com/wwdc24/10102 https://developer.apple.com/documentation/realitykit/implementing-systems-for-entities-in-a-scene#Retrieve-entities-with-an-entity-query

However, this simple code declaring a dummy Component and System produces a compile error:

    /Users/Workspaces/repository/Packages/RealityKitContent/Sources/RealityKitContent/RobotComponent.swift:18:24
    Static property 'query' is not concurrency-safe because non-'Sendable' type 'EntityQuery' may have shared mutable state

        // Define a query to return all entities with a MyComponent.
        private static let query = EntityQuery(where: .has(MyComponent.self))

        // Initializer is required. Use an empty implementation if there's no setup needed.
        required init(scene: Scene) { }

        // Iterate through all entities containing a MyComponent.
        func update(context: SceneUpdateContext) {
            for entity in context.entities(
                matching: Self.query,
                updatingSystemWhen: .rendering
            ) {
                // Make per-update changes to each entity here.
            }
        }
    }

I'm using Xcode beta 3 and the project targets visionOS 2.
Replies: 1 · Boosts: 0 · Views: 274 · Activity: Jul ’24
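One workaround that compiles under strict concurrency checking, sketched below with the post's placeholder names: store the query as an instance property rather than a static, so no shared mutable state crosses isolation domains (RealityKit creates one system instance per scene).

    import RealityKit

    struct MyComponent: Component {}

    class MySystem: System {
        // An instance property sidesteps the diagnostic that a
        // non-Sendable static stored property triggers under Swift 6
        // concurrency checking.
        private let query = EntityQuery(where: .has(MyComponent.self))

        required init(scene: Scene) {}

        func update(context: SceneUpdateContext) {
            for entity in context.entities(matching: query, updatingSystemWhen: .rendering) {
                // Make per-update changes to each entity here.
            }
        }
    }

Marking the original static property nonisolated(unsafe) is the other commonly cited workaround; the instance-property version avoids the unsafe opt-out.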
Can’t Figure Out How to Get My Earth Entity to Rotate on its Axis
I can't figure out how to get my Earth entity to rotate on its axis. This is a follow-up to a previous Apple Developer forum post. How would I have the Earth (parent) entity rotate CCW underneath the orbiting starship child? I tried adding the following code block to the RealityView, but it is not working:

    if let rotatingEarth = starshipEntity.findEntity(named: "Earth") {
        rotatingEarth.transform.rotation = simd_quatf.init(angle: 360, axis: SIMD3(x: 0, y: 1, z: 0))
        if let animation = try? AnimationResource.generate(with: rotatingEarth as! AnimationDefinition) {
            rotatingEarth.playAnimation(animation)
        }
    }

Any advice on getting the Earth to rotate? I tried reviewing the Hello World WWDC23 project code, but I was unable to understand its complexity and how that sample project got the Earth to rotate. I want to do this for visionOS 1.2. I realize there are new animation and other capabilities coming in visionOS 2.0, but I want to address this issue in the currently released visionOS version.
Replies: 5 · Boosts: 0 · Views: 368 · Activity: Jul ’24
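Two details in the snippet above work against it: simd_quatf takes radians rather than degrees, and assigning transform.rotation once snaps the entity to a pose instead of animating it (the as! AnimationDefinition cast will also crash, since an Entity is not an AnimationDefinition). One approach that works on visionOS 1.x, sketched with hypothetical SpinComponent/SpinSystem names: accumulate a small Y-axis rotation every frame in a custom system.

    import RealityKit

    // Hypothetical component carrying the spin speed in radians per second.
    struct SpinComponent: Component {
        var radiansPerSecond: Float = .pi / 8  // one CCW revolution every 16 s
    }

    struct SpinSystem: System {
        private let query = EntityQuery(where: .has(SpinComponent.self))

        init(scene: Scene) {}

        func update(context: SceneUpdateContext) {
            let dt = Float(context.deltaTime)
            for entity in context.entities(matching: query, updatingSystemWhen: .rendering) {
                guard let spin = entity.components[SpinComponent.self] else { continue }
                // Multiply in a small rotation about the local Y axis each
                // frame; renormalize so float error does not accumulate.
                let delta = simd_quatf(angle: spin.radiansPerSecond * dt, axis: [0, 1, 0])
                entity.transform.rotation = (entity.transform.rotation * delta).normalized
            }
        }
    }

After calling SpinComponent.registerComponent() and SpinSystem.registerSystem() at startup, rotatingEarth.components.set(SpinComponent()) starts the spin.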
Cloud Service for Apple Vision Pro App
For all the AVP devs out there: what cloud service are you using to load content into your app with extremely low latency? I tried CloudKit and it did not work well at all; latency was super bad :/ Firebase looks the most promising at this point?? I wish Apple would create an ultra-low-latency cloud service for streaming high-quality content such as USDZ files and scenes made in Reality Composer Pro.
Replies: 1 · Boosts: 0 · Views: 322 · Activity: Jul ’24
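Whichever backend wins out, RealityKit itself loads from local file URLs, so remote USDZ delivery generally means download-then-load. A minimal sketch with no caching, assuming the URL comes from whatever service is chosen:

    import Foundation
    import RealityKit

    // Download a USDZ over HTTPS, then load it. Entity loading wants a
    // local file URL with a recognizable .usdz extension.
    func loadRemoteUSDZ(from remoteURL: URL) async throws -> Entity {
        let (tempURL, _) = try await URLSession.shared.download(from: remoteURL)
        let localURL = tempURL.deletingPathExtension().appendingPathExtension("usdz")
        try FileManager.default.moveItem(at: tempURL, to: localURL)
        return try await Entity(contentsOf: localURL)
    }

Past that, the latency levers mostly live outside the app: CDN edge caching in front of whichever store holds the assets, plus an on-device cache so each file only downloads once.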
3D Object Capture not working on iphone 12 pro
The 3D Object Capture feature doesn't seem to work on my iPhone 12 Pro. The circle that is supposed to show up when you begin to move around the object doesn't appear, so object capture never even begins. It just says "more light…" or "move closer." This doesn't happen on my iPhone 14 Pro; it works perfectly fine there, even with the same lighting. How can this be fixed?
Replies: 1 · Boosts: 0 · Views: 343 · Activity: Jul ’24
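If this is the iOS 17 guided capture, a quick runtime check separates "device rejected outright" from "session failing for environmental reasons"; a minimal sketch using RealityKit's ObjectCaptureSession:

    import RealityKit

    // Guided Object Capture only runs on supported hardware; when this
    // returns false the capture UI never progresses past the hints.
    func logObjectCaptureSupport() {
        if ObjectCaptureSession.isSupported {
            print("Object Capture is supported on this device.")
        } else {
            print("Object Capture is NOT supported on this device.")
        }
    }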
Multi-platform app for visionOS and iOS: How to include 3D models for both?
I created an app for visionOS using Reality Composer Pro. Now I want to turn it into a multi-platform app that also runs on iOS. RCP files are not supported on iOS, however. So I tried to use the "old" Reality Composer instead, but that doesn't seem to work either: Xcode 15 no longer includes it, and I read online that files created with Xcode 14's Reality Composer cannot be included in Xcode 15 projects. Also, Xcode 14 does not run on my M3 Mac with Sonoma. That's a bummer.

What is the recommended way to include 3D content in apps that support visionOS AND iOS?! (I also read that a solution might be using USDZ for both. But what would that workflow look like? Are there samples out there that support both platforms? Please note that I want to set up the anchors myself, in code. I just need the composing tool to create the 3D content that will be placed on these anchors.)
Replies: 1 · Boosts: 0 · Views: 542 · Activity: Jun ’24
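One workflow that fits both platforms, sketched under the assumption that the asset ships as a plain USDZ in the app bundle (MyModel.usdz is a hypothetical name): load it with RealityKit's shared Entity API and anchor it in code.

    import RealityKit

    // Shared between iOS and visionOS: load a bundled USDZ and attach
    // it to an anchor created in code.
    func makeAnchoredModel() async throws -> AnchorEntity {
        let model = try await Entity(named: "MyModel")  // MyModel.usdz in the bundle
        let anchor = AnchorEntity(world: [0, 0, -0.5])  // half a meter in front of the world origin
        anchor.addChild(model)
        return anchor
    }

On visionOS the returned anchor goes into a RealityView's content; on iOS it goes into arView.scene.anchors. Only the view layer differs, not the content pipeline.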
Object Capture to USDZ Scaling
I'm trying to take an object capture and scale it. What I did so far: create a Reality Composer project, insert the .objcap file into the project, and scale it from 100% to 200%. I then exported it as a USDZ. Now it just won't show up in the Xcode preview, and I'm not sure why. Is there any way to fix this? I'm going crazy trying to get this to work.
Replies: 1 · Boosts: 0 · Views: 619 · Activity: Apr ’24
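One way to sidestep the Reality Composer round-trip entirely, sketched here: load the capture output directly and apply the scale in code ("capture" is a hypothetical bundled asset name).

    import RealityKit

    // Scale the capture at runtime instead of baking 200% into a
    // re-exported USDZ.
    func loadScaledCapture() async throws -> Entity {
        let entity = try await Entity(named: "capture")
        entity.scale = SIMD3<Float>(repeating: 2.0)  // 200%
        return entity
    }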
Video Player controllers aren't showing for the video in immersive view
I have an AVPlayer() that loads a video and places it on a screen ModelEntity in the immersive view using VideoMaterial. This also makes the video untappable, since it is a VideoMaterial. Here's the code:

    let screenModelEntity = model.garageScreenEntity as! ModelEntity
    let modelEntityMesh = screenModelEntity.model!.mesh

    let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)

    let player = AVPlayer()
    let material = VideoMaterial(avPlayer: player)
    screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])

    player.replaceCurrentItem(with: playerItem)
    return player

I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app. How can I add video player controls in this case?
Replies: 0 · Boosts: 0 · Views: 554 · Activity: Apr ’24
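VideoMaterial only renders frames onto a mesh; it carries no UI, so system transport controls have to come from AVKit. One common route, sketched here: present the same AVPlayer through AVPlayerViewController in a SwiftUI window, separate from the textured ModelEntity.

    import SwiftUI
    import AVKit

    // Wrap AVPlayerViewController so SwiftUI can show the system
    // transport controls for an existing AVPlayer.
    struct PlayerControlsView: UIViewControllerRepresentable {
        let player: AVPlayer

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let controller = AVPlayerViewController()
            controller.player = player
            return controller
        }

        func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
    }

Controls drawn over the mesh itself would instead mean SwiftUI attachments plus collision and input-target components, since a VideoMaterial surface is not tappable on its own.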
Adding 2D PNG to Reality Composer Pro
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other USDZ files in RCP (picture adding a couple of 2D video game characters to a simple 3D diorama). When I try to drag the PNGs into the workspace or the file tree… nothing happens. I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI," for those curious, as I can't link to Medium posts here) that makes it look like users could do this with a simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then? Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "import images as planes," but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
Replies: 1 · Boosts: 0 · Views: 511 · Activity: Jun ’24
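If the drag-and-drop stays broken, the same result is reachable in code: one unlit, alpha-blended plane per PNG. A minimal sketch ("character" is a hypothetical texture name); the exact blending setup (transparent blending vs. an opacityThreshold cutout) may need adjusting to how the PNG's alpha is authored, which is also where halo artifacts usually come from.

    import RealityKit

    // Build a 2D "sprite": an unlit plane textured with a bundled PNG.
    func makeSpritePlane() throws -> ModelEntity {
        let texture = try TextureResource.load(named: "character")
        var material = UnlitMaterial()
        material.color = .init(tint: .white, texture: .init(texture))
        material.blending = .transparent(opacity: 1.0)  // respect the PNG's alpha channel

        let mesh = MeshResource.generatePlane(width: 0.2, height: 0.2)  // meters
        return ModelEntity(mesh: mesh, materials: [material])
    }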
Custom material getting converted to PhysicallyBasedMaterial
I have a custom material in Reality Composer. When I attach it to a cube and load the scene in Xcode, the material cannot be cast to a ShaderGraphMaterial, because it has been changed to a PhysicallyBasedMaterial. The material was always a custom material; I did not change the type in Reality Composer. Does anyone know how to fix this?
Replies: 1 · Boosts: 0 · Views: 779 · Activity: Mar ’24
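Worth noting: ShaderGraphMaterial is the Reality Composer Pro material type, so assets authored in classic Reality Composer may simply arrive as PhysicallyBasedMaterial. A small diagnostic sketch for confirming what actually loads before casting:

    import RealityKit

    // Print the concrete material types on an entity instead of
    // force-casting to ShaderGraphMaterial.
    func dumpMaterials(of entity: Entity) {
        guard let model = entity.components[ModelComponent.self] else { return }
        for material in model.materials {
            print(type(of: material))
        }
    }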
"Meet Reality Composer Pro" - Spatial Audio Problem
I'm following the Meet Reality Composer Pro walkthrough and ran into something that didn't function as expected. When I got to the step where I add five "Bird_With_Audio.usda" references to the scene, I found they did not play audio. After some trial and error, I found that Preview > Resource in each of their Spatial Audio items was set to "None." If I click the dropdown menu, I see several "Bird_Calls" groups to pick from. I checked the original Bird_With_Audio.usda that I had created, and the "Bird_Calls" audio group was correctly assigned and worked. I tried dragging a sixth Bird_With_Audio into the scene and confirmed that its Spatial Audio item suddenly empties, rendering the bird silent. I was able to go through each of the five birds and set their Spatial Audio resource to Bird_Calls, and the group worked as the video demonstrates. While this fixed the issue, as a beginner I'd like to know why it happened. It doesn't seem right that I would build an item and then have to re-attach its sounds whenever I place it in the main scene. So… where did I mess up?
Replies: 0 · Boosts: 0 · Views: 462 · Activity: Mar ’24
Reality Composer Pro node previews?
I have been digging into learning shader graphs by watching Unity shader graph content, because lots of the same concepts apply. One thing I noticed is that in Unity, each node in the shader graph has a little preview. I don't think this exists in Reality Composer Pro, but is there any way to mimic it (like, can I hook up a node that lets me debug the graph at that point)? If not, I'm happy to just file a Feedback about it, but I thought I'd ask!
Replies: 2 · Boosts: 0 · Views: 670 · Activity: 1d