visionOS
https://developer.apple.com/forums/tags/visionOS
Discuss developing for spatial computing and Apple Vision Pro.

Video from https service on immersive view
https://developer.apple.com/forums/thread/762777
Hello, I'm trying to load a video in an immersive space. Using VideoMaterial applied to a plane surface, I'm able to load a video that I have locally. If I want to load something from an external link, such as YouTube or another service, how can I do that? Note that I'm loading it in an immersive space. Thanks ;)
Posted Sun, 25 Aug 2024 09:56:14 GMT by Mirco46

SwiftUI's alert window won't get automatically focus in visionOS
https://developer.apple.com/forums/thread/762781
I have three basic elements in this UI page: View, Alert, Toolbar. I put the Toolbar and Alert alongside the View; when I click a button on the Toolbar, my alert window shows up. Below is a simplified version of my code:

    @State private var showAlert = false

    HStack {
        // ...
    }
    .alert(Text("Quit the game?"), isPresented: $showAlert) {
        MyAlertWindow()
    } message: {
        Text("Description text about this alert")
    }
    .toolbar {
        ToolbarItem(placement: .bottomOrnament) {
            MyToolBarButton(showAlert: $showAlert)
        }
    }

And in MyToolBarButton I just toggle the bound showAlert variable to try ...
Posted Sun, 25 Aug 2024 06:46:58 GMT by milanowth

ModelEntity move duration visionOS 2 issue
https://developer.apple.com/forums/thread/762772
The following RealityView ModelEntity animated text works in visionOS 1.0. In visionOS 2.0, running the same code, the model entity's move duration does not seem to work. Are there changes to the way it works that I am missing? Thank you in advance.

    RealityView { content in
        let textEntity = generateMovingText()
        content.add(textEntity)
        _ = try? await arkitSession.run([worldTrackingProvider])
    } update: { content in
        guard let entity = content.entities.first(where: { $0.name == .textE...
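For the VideoMaterial question above (thread 762777), remote video can be shown in an immersive space by backing the material with an AVPlayer. A minimal sketch, assuming a direct HLS (.m3u8) or MP4 media URL; note that a YouTube page URL will not work this way, since YouTube does not expose direct media URLs:

```swift
import AVFoundation
import RealityKit

// Builds a plane whose VideoMaterial streams from a remote URL.
// The URL must point directly at playable media (HLS or MP4).
func makeRemoteVideoPlane(urlString: String) -> ModelEntity? {
    guard let url = URL(string: urlString) else { return nil }
    let player = AVPlayer(url: url)
    let material = VideoMaterial(avPlayer: player)
    let plane = ModelEntity(
        mesh: .generatePlane(width: 1.6, height: 0.9),
        materials: [material]
    )
    player.play()
    return plane
}
```

The returned entity can then be added to the RealityView content of the immersive space, exactly as with a local video.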
Posted Sat, 24 Aug 2024 22:07:15 GMT by jamesboo

Unable to detect collision
https://developer.apple.com/forums/thread/762763
In RealityView I have two entities that contain tracking components and collision components, which are used to follow the hand and detect collisions. In the Behaviors component of one of the entities, there is an instruction to execute an action through OnCollision. However, when I test, they cannot execute the action after collisions. Why is this?
Posted Sat, 24 Aug 2024 15:10:32 GMT by lijiaxu

Vision framework not working on Apple Vision Pro
https://developer.apple.com/forums/thread/762761
com.apple.Vision Code=9 "Could not build inference plan - ANECF error: failed to load ANE model file:///System/Library/Frameworks/Vision.framework/anodv4_drop6_fp16.H14G.espresso.hwx

This code raises the error:

    func imageToHeadBox(image: CVPixelBuffer) async throws -> [CGRect] {
        let request: DetectFaceRectanglesRequest = DetectFaceRectanglesRequest()
        let faceResult: [FaceObservation] = try await request.perform(on: image)
        let faceBoxs: [CGRect] = faceResult.map { face in
            let faceBoundingBox: CGRect = face.boundingBox.cgRect
            return faceBoundingBox
        }
        return faceBox...

Posted Sat, 24 Aug 2024 12:40:08 GMT by Shengjiang

Retrieve AnchorEntity (Hand Locations) Position through update(context: Scene) function
https://developer.apple.com/forums/thread/762722
Hello there, I'm currently working on a hand tracking system. I've already placed some spheres on some joint points on the left and right hand. Now I want to access the translation/position value of these entities in the update(context: Scene) function. My question is: is it possible to access them via .handAnchors(), and which types of .handSkeleton.joint(name) reference the same entity? (E.g., is AnchorEntity(.hand(.right, location: .indexFingerTip)) the same as handSkeleton.joint(.indexFingerTip)?) The goal would be to access the translation of the joints where a sphere has been plac...
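On the hand-joint question above (thread 762722): AnchorEntity(.hand(.right, location: .indexFingerTip)) and handSkeleton.joint(.indexFingerTip) target the same physical joint, but the former is a RealityKit entity while the latter is raw ARKit data. A sketch of reading a joint's world-space position directly from a HandAnchor delivered by HandTrackingProvider (the function name is illustrative):

```swift
import ARKit
import RealityKit

// Reads the index-finger-tip position in world space from a HandAnchor.
// Joint transforms are relative to the hand anchor, so they must be
// composed with the anchor's origin-from-anchor transform.
func indexFingerTipPosition(from anchor: HandAnchor) -> SIMD3<Float>? {
    guard let skeleton = anchor.handSkeleton else { return nil }
    let joint = skeleton.joint(.indexFingerTip)
    guard joint.isTracked else { return nil }
    let world = anchor.originFromAnchorTransform * joint.anchorFromJointTransform
    return SIMD3<Float>(world.columns.3.x,
                        world.columns.3.y,
                        world.columns.3.z)
}
```

Calling this from a per-frame update and writing the result into the sphere entities' positions is one way to keep the two representations in sync.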
Posted Fri, 23 Aug 2024 18:20:21 GMT by XWDev

Location of tap gesture in VisionOS
https://developer.apple.com/forums/thread/762729
Is it possible to get the location of where a user uses the tap gesture on the screen, like an x/y coordinate that can be used within my app? I know there is SpatialTapGesture, but from what I can tell that is only linked to content entities like a cube or something. Does this mean I can only get the x/y coordinate data by opening an immersive space and using the tap gesture in relation to some entity? TL;DR: Can I get the location of a tap gesture in visionOS in a regular app, without opening an immersive space?
Posted Fri, 23 Aug 2024 16:52:00 GMT by cosorio

Material Reference from Reality Composer Pro
https://developer.apple.com/forums/thread/762721
I have a model entity (from Reality Composer Pro) and want to change its material in Swift. The material is also imported in Reality Composer Pro. I am copying the USDZ file of the material into the same directory as the script. This is the code I am using to reference the material:

    do {
        // Load the file data
        if let materialURL = Bundle.main.url(forResource: "BlackABSPlastic", withExtension: "usdz") {
            let materialData = try Data(contentsOf: materialURL)
            // Check the first few bytes of the data to see if it matches expected types
            let ...

Posted Fri, 23 Aug 2024 11:06:43 GMT by snehdev

Rendering a windowed game in stereo - Build Failure
https://developer.apple.com/forums/thread/762594
The sample code project RealityKit-Stereo-Rendering found at the article "Rendering a windowed game in stereo" fails to compile with the following errors in Xcode 16 beta 6.
Posted Thu, 22 Aug 2024 06:50:35 GMT by VaiStardom

Creating tabletop games - Build Failure
https://developer.apple.com/forums/thread/762606
The sample code project TabletopKit Sample found at the article "Creating tabletop games" fails to compile with the following errors in Xcode 16 beta 6.
    error: [xrsimulator] Component Compatibility: Billboard not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
    error: Tool exited with code 1

Posted Thu, 22 Aug 2024 06:45:17 GMT by VaiStardom

Configuring the PencilKit tool picker - Build Failure
https://developer.apple.com/forums/thread/762585
The sample code project PencilKitCustomToolPicker found at the article "Configuring the PencilKit tool picker" fails to compile with the following errors in Xcode 16 beta 6.
Posted Thu, 22 Aug 2024 06:37:58 GMT by VaiStardom

Compose interactive 3D content in Reality Composer Pro -- Build Error
https://developer.apple.com/forums/thread/762591
Compilation of the project for the WWDC 2024 session titled "Compose interactive 3D content in Reality Composer Pro" fails. After applying the fix mentioned here (https://developer.apple.com/forums/thread/762030?login=true), the project still won't compile. Using Xcode 16 beta 7, I get these errors:

    error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error:...

Posted Thu, 22 Aug 2024 03:50:52 GMT by VaiStardom

SafariWebView in ImmersiveSpace with hand tracking
https://developer.apple.com/forums/thread/762577
Is it possible to show a SafariWebView in an ImmersiveSpace with hand tracking enabled? I have an app with an initial view that launches an immersive space and opens a SafariView. I noticed that hand tracking stops working when I open the SafariView, but not when I open the TestView (which is just an empty window).
Here's the Scene:

    var body: some Scene {
        WindowGroup(id: "control") {
            InitialView()
        }.windowResizability(.contentSize)
        WindowGroup(id: "test") {
            TestView()
        }.windowResizability(.contentSize)
        WindowGroup(id: "safari") {
            SafariView...

Posted Thu, 22 Aug 2024 02:44:20 GMT by waba

Model and Text Overlap Issue in Vision Pro App After GitHub Push
https://developer.apple.com/forums/thread/762521
I'm developing an app for Vision Pro and have encountered an issue related to the UI layout and model display. Here's a summary of the problem: I created an anchor window to display text and models in the hand menu UI. While testing on my Vision Pro, everything works as expected; the text and models do not overlap and appear correctly. However, after pushing the changes to GitHub and having my client test it, the text and models are overlapping. Details: I'm using Reality Composer Pro to load models and set them in the hand menu UI. All pins are attached to attachmentHandManu, and attachmentHa...
Posted Wed, 21 Aug 2024 12:24:59 GMT by SidAura

how do i extend the Full immersive boundary?
https://developer.apple.com/forums/thread/762497
Hello, as you know, the full immersive boundary range is 3m x 3m, to keep users safe. But I need the fully immersive space to cover a more extended range, because I have quite a large area and want more room to play. Is there any problem with that? Thank you!
Posted Wed, 21 Aug 2024 09:00:07 GMT by cue

how to hide window in vision pro?
https://developer.apple.com/forums/thread/762455
When I use openWindow to show a volume window, I want to hide the old window but keep it where it is; when I close the volume window, the old window should appear again. The logic is simple. I tried to use opacity to hide the old window, but the window bar is still there, which is very annoying. How could I solve this?
Posted Wed, 21 Aug 2024 03:00:05 GMT by windowresized

Write Permission to Camera Feed?
https://developer.apple.com/forums/thread/762415
I am working on a project for a university which wants to alter the passthrough camera feed more than the standard filters (saturation/contrast/etc.) that some headsets provide. I don't have access to the headset or the Enterprise SDK yet, as I'd like to nail down whether or not this is feasible before we purchase the hardware. In the API I see I can use CameraFrameProvider to access a CameraFrame and then grab a sample. The sample has a CVPixelBuffer. I have two questions regarding the pixelBuffer: I see that the buffer itself is read-only, but can I alter the bytes within this pixel buff...
Posted Tue, 20 Aug 2024 17:35:56 GMT by ickydime

Raw Camera, Sensor Data from Vision Pro
https://developer.apple.com/forums/thread/762396
Can we get the raw sensor data from the Apple Vision Pro?
Posted Tue, 20 Aug 2024 10:54:56 GMT by balasivasurya

RealityView Clone
https://developer.apple.com/forums/thread/762370
Is there any action that can clone an entity in RealityView as many times as I want? If there is, please let me know. Thank you!
Posted Tue, 20 Aug 2024 10:47:16 GMT by lijiaxu

Entity.applyTapForBehaviors() only works on Simulator, not device
https://developer.apple.com/forums/thread/762364
I created a simple Timeline animation with only a "Play Audio" action in Reality Composer Pro, plus a Behaviors component with an "OnTap" trigger that fires this Timeline animation. In my code, I simply call Entity.applyTapForBehaviors() when something happens. The audio plays normally on the simulator but cannot be played on the device. Is there a potential bug that leads to this behavior? Environment:
Simulator version: visionOS 2.0 (22N5286g)
Xcode version: Version 16.0 beta 4 (16A5211f)
Device version: visionOS 2.0 beta (latest)
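On the applyTapForBehaviors() question above (thread 762364): the usual wiring on visionOS 2 routes a SpatialTapGesture to the tapped entity, which then fires its Behaviors component's OnTap timeline. A sketch under that assumption; the view and scene names are illustrative, not from the thread:

```swift
import RealityKit
import SwiftUI

// Forwards spatial taps to the tapped entity's Behaviors component,
// which triggers its OnTap timeline (e.g. a "Play Audio" action built
// in Reality Composer Pro).
struct TapToPlayView: View {
    var body: some View {
        RealityView { content in
            // Load the Reality Composer Pro scene here; the tapped
            // entity needs InputTargetComponent and CollisionComponent
            // for the gesture to hit it.
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    value.entity.applyTapForBehaviors()
                }
        )
    }
}
```

If this path works on the simulator but the device stays silent, checking that the device build actually bundles the audio resource (and that the entity has collision shapes on device) is a reasonable first step.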