Hello, I'm setting up an ARView using scene anchors from Reality Composer. The scenes load perfectly fine the first time I enter the AR view. When I go back to the previous screen and re-enter the AR view, the app crashes before any of the scenes appear on screen. I've tried pausing and resuming the session and am still getting the following error:
validateFunctionArguments:3536: failed assertion `Fragment Function(fsRenderShadowReceiverPlane): incorrect type of texture (MTLTextureTypeCube) bound at texture binding at index 0 (expect MTLTextureType2D) for projectiveShadowMapTexture[0].'
Any help would be very much appreciated.
Thanks
Hi all,
I have an .obj model which I converted using Reality Converter.
When I try to load it in Safari Quick Look, it pops up an error: "Object requires a newer version of iOS".
I'm on an iPhone X with iOS 13.6.
Any thoughts?
Thanks
Are there any good tutorials or suggestions on creating models in Blender and exporting them with the associated materials and nodes? Specifically, I'm looking to see whether there is a way to export the translucency associated with an object (e.g. a glass bottle). I have created a simple cube with a Principled BSDF shader, but the transmission and IOR settings are not carrying over. Any tips or suggestions would be helpful.
Hello,
I'm working on an application that will try to take measurements of parts of a human body (hand, foot, ...).
Now that LiDAR has been integrated into the iPhone Pro and ARKit 4 is out, I would like to know whether it seems feasible to combine the library's features, such as model creation and geometry measurement, to get a precise measurement of a body part.
Thanks for your help,
Tom
Hi,
ARKit is a great tool. I have my small app doing things, and it's fun!
But I wanted to try to migrate from ARWorldTrackingConfiguration to ARGeoTrackingConfiguration - https://developer.apple.com/documentation/arkit/argeotrackingconfiguration
and then we can see that this configuration is limited to a handful of US cities.
I can't figure out why, and whether this will be expanded worldwide in the near future.
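For reference, a minimal sketch of how availability can be checked at runtime before committing to geo tracking; the coordinate below is a hypothetical example:

import ARKit
import CoreLocation

// A sketch: ask ARKit whether geo tracking is supported on this device,
// whether it is available at the current location, and whether it is
// available at an arbitrary coordinate.
func checkGeoTrackingAvailability() {
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, error in
        print("Geo tracking available here: \(available), error: \(String(describing: error))")
    }

    let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
    ARGeoTrackingConfiguration.checkAvailability(at: coordinate) { available, _ in
        print("Geo tracking available at coordinate: \(available)")
    }
}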
I am developing an app using ARKit 4's ARGeoTrackingConfiguration, following https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar.
I am outside of the US, so my location is not supported. I simulate a location, but the coaching state always stays at .initializing.
Is there a way to test geo-tracking apps outside of the US?
I want to know whether the depth map and the RGB image are perfectly aligned (do both have the same principal point)? If yes, how is the depth map created?
The depth map on the iPhone 12 has a 256x192 resolution, as opposed to the RGB image (1920x1440). I am interested in exact pixel-wise depth. Is it possible to get a raw depth map at 1920x1440 resolution?
How is the depth map created at 256x192 resolution? Behind the scenes, does the pipeline capture it at 1920x1440 resolution and then resize it to 256x192?
I have so many questions, as no intrinsic, extrinsic, or calibration data is given for the LiDAR.
I would greatly appreciate it if someone could explain the steps from a computer-vision perspective.
Many Thanks
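For reference, one common approach (a sketch under the assumption that the intrinsics are expressed for the full-resolution camera image, not a confirmed description of Apple's pipeline) is to scale the camera intrinsics down to the depth map's resolution before working pixel-wise with the depth:

import ARKit

// A sketch: frame.camera.intrinsics is calibrated for the full camera
// image (e.g. 1920x1440), so scale it to the sceneDepth resolution
// (e.g. 256x192) before unprojecting depth pixels.
func depthIntrinsics(for frame: ARFrame) -> simd_float3x3? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    let scale = Float(CVPixelBufferGetWidth(depthMap)) /
                Float(frame.camera.imageResolution.width)
    var K = frame.camera.intrinsics
    K[0][0] *= scale   // fx
    K[1][1] *= scale   // fy
    K[2][0] *= scale   // cx (simd matrices are column-major)
    K[2][1] *= scale   // cy
    return K
}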
I'm creating a custom scanning solution for iOS and using the RealityKit Object Capture PhotogrammetrySession API to build a 3D model. I'm finding that the depth data I'm sending is ignored and the model is not built to scale. The documentation is a little light on how to format the depth, so I'm wondering if someone could take a look at some example files I send to the PhotogrammetrySession. Would you be able to tell me what I'm not doing correctly?
https://drive.google.com/file/d/1-GoeR_KMhX_i7-y8M8ElDRrySasOdoHy/view?usp=sharing
Thank you!
I'm making an app that captures data using ARKit and will ultimately send the images + depth + gravity to an Object Capture photogrammetry agent. I need to use the depth data and produce a model with correct scale, so from what I understand I need to send the depth file and set the proper EXIF data in the image. Since I'm getting the images and depth from ARKit, I'll need to set the EXIF data manually before saving the images. Unfortunately, the documentation on this is a bit light, so would you be able to let me know what EXIF data needs to be set for the photogrammetry to create the model with proper scale?
If I try to set up my photogrammetry sample with manual metadata like this:
var sample = PhotogrammetrySample(id: id, image: image)
var dict: [String: Any] = [:]
dict["FocalLength"] = 23.551325
dict["PixelWidth"] = 1920
dict["PixelHeight"] = 1440
sample.metadata = dict
I get the following error in the output and depth is ignored:
[Photogrammetry] Can't use FocalLenIn35mmFilm to produce FocalLengthInPixel! Punting...
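For reference, a minimal sketch (an assumption on my part, not a confirmed recipe) of writing focal-length EXIF tags with ImageIO when saving captured images, so the photogrammetry pipeline has something to derive FocalLengthInPixel from; the numeric values, cgImage, and url are placeholders:

import ImageIO
import UniformTypeIdentifiers

// A sketch: embed EXIF focal-length tags when writing each image to disk.
func saveImage(_ cgImage: CGImage, to url: URL) {
    let exif: [CFString: Any] = [
        kCGImagePropertyExifFocalLength: 4.2,         // physical focal length in mm (placeholder)
        kCGImagePropertyExifFocalLenIn35mmFilm: 26    // 35mm-equivalent focal length (placeholder)
    ]
    let properties: [CFString: Any] = [
        kCGImagePropertyExifDictionary: exif,
        kCGImagePropertyPixelWidth: cgImage.width,
        kCGImagePropertyPixelHeight: cgImage.height
    ]
    guard let destination = CGImageDestinationCreateWithURL(
        url as CFURL, UTType.heic.identifier as CFString, 1, nil) else { return }
    CGImageDestinationAddImage(destination, cgImage, properties as CFDictionary)
    CGImageDestinationFinalize(destination)
}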
Hello!
I am having trouble calculating accurate distances in the real world using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5m from the phone. The mug is 8.5cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formula:
x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d
Where I get depth from the appropriate pixel in the depth map - I've verified that both depths were 0.5m. I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357, or 3.5 cm, instead of the 8.5cm I was expecting. What could be accounting for this discrepancy?
Thank you so much for your help!
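For reference, the computation described above as a minimal sketch; the pixel coordinates are hypothetical picks which, with the posted intrinsics and 0.5 m depth, would yield roughly the expected 8.5 cm:

import simd

// A sketch of the pinhole unprojection above; u/v must be in the same
// resolution the intrinsics were calibrated for, d is metric depth.
func unproject(u: Float, v: Float, d: Float,
               fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    SIMD3(d * (u - cx) / fx, d * (v - cy) / fy, d)
}

// Hypothetical pixel picks on the mug's left and right edges:
let p1 = unproject(u: 712, v: 686, d: 0.5, fx: 1464.9269, fy: 1464.9269, cx: 960.94916, cy: 686.3547)
let p2 = unproject(u: 961, v: 686, d: 0.5, fx: 1464.9269, fy: 1464.9269, cx: 960.94916, cy: 686.3547)
print(simd_distance(p1, p2))   // ~0.085 m for pixels ~249 apart at this depth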
Hi
Is it possible to have RoomCaptureSession and ARSession together, as we need feature points?
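For reference, a minimal sketch: RoomCaptureSession exposes its underlying ARSession, so ARKit data such as feature points can be read from the same session while scanning.

import RoomPlan
import ARKit

// A sketch: read ARKit feature points from the ARSession that backs
// a RoomPlan capture session.
let captureSession = RoomCaptureSession()
let arSession = captureSession.arSession
let featurePoints = arSession.currentFrame?.rawFeaturePoints?.points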
I have an .arobject file that's already been tested and works perfectly in Reality Composer as an anchor. But for whatever reason, when I try it as an AR Resource Group in my assets, or even load it directly from the URL, it always fails (returns nil). I've double-checked all the file/group names and they seem fine, but I couldn't find the error; it's just always nil.
This is my code :
var referenceObject: ARReferenceObject?

if let referenceObjects = ARReferenceObject.referenceObjects(inGroupNamed: "TestAR", bundle: Bundle.main) {
    referenceObject = referenceObjects[referenceObjects.startIndex]
}

if let referenceObject = referenceObject {
    delegate.didFinishScan(referenceObject, false)
} else {
    do {
        if let url = Bundle.main.url(forResource: "Dragonball1", withExtension: "arobject") {
            referenceObject = try ARReferenceObject(archiveURL: url)
        }
    } catch let myError {
        let error = myError as NSError
        print("try \(error.code)")
    }
}
Any idea? Thanks
I found that almost all third-party applications cannot focus when they ask me to take an ID card photo.
The same issue description from Twitter: https://twitter.com/mlinzner/status/1570759943812169729?s=20&t=n_Z5lTJac3L5QVRK7fbJZg
Who can fix it? Apple or third-party developers?
I'd like to capture the room with the materials obtained through the camera while scanning with RoomPlan.
Is there any way to capture the room's surface materials and render the object while capturing the room geometry using RoomPlan?
Hi
We're working on an app that uses RealityKit to present products in the customer's home.
We have a bug where every once in a while (roughly 1 in 100 runs), all entities are rendered using the green channel only (see image below).
It seems to occur in all entities in the ARView, regardless of model or material type.
Due to the flaky nature of this bug, I have found it really hard to debug, and I can't rule out an internal issue in RealityKit.
Did anyone run into similar issues or have any hints of where to look for the culprit?
I have a single FBX model with several animations (idle, walk, run, eat, ...), but after I convert it to USDZ format I can only play one animation. Where can I find the other animations and how can I play them, or does USDZ not support this yet?
Thank you.
Cyan
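For reference, a minimal sketch of enumerating and playing a USDZ file's animations in RealityKit, assuming the model loads as an Entity; the file name is a placeholder:

import RealityKit

// A sketch: availableAnimations lists every clip the USDZ carries;
// any of them can be iterated or played by index.
let model = try Entity.load(named: "character")      // placeholder file name
print("clips:", model.availableAnimations.count)     // how many animations survived conversion
if let clip = model.availableAnimations.first {
    model.playAnimation(clip.repeat(), transitionDuration: 0.3, startsPaused: false)
}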
Hi, I am not sure what is going on...
I have been working on this model for a while in Reality Composer and had no problem testing it that way; it always worked perfectly.
So I imported the file into a brand-new Xcode project: I created a new AR app, and used SwiftUI.
I actually did it twice...
And I tested the version Apple ships with the box. In Apple's version the app appears, but the whole part where it tries to detect planes didn't show up. So I am confused.
I found a question that mentions the error messages I am getting, but I am not sure how to get around it:
https://developer.apple.com/forums/thread/691882
//
//  ContentView.swift
//  AppToTest-02-14-23
//
//  Created by M on 2/14/23.
//

import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)

        // Load the first scene from the Reality file
        //let boxAnchor = try! Experience.loadBox()
        let anchor = try! MyAppToTest.loadFirstScene()

        // Add the anchor to the scene
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
This is what I get in the console:
2023-02-14 17:14:53.630477-0500 AppToTest-02-14-23[21446:1307215] Metal GPU Frame Capture Enabled
2023-02-14 17:14:53.631192-0500 AppToTest-02-14-23[21446:1307215] Metal API Validation Enabled
2023-02-14 17:14:54.531766-0500 AppToTest-02-14-23[21446:1307215] [AssetTypes] Registering library (/System/Library/PrivateFrameworks/CoreRE.framework/default.metallib) that already exists in shader manager. Library will be overwritten.
2023-02-14 17:14:54.716866-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.743580-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arKitPassthrough.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.744961-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/drPostAndComposition.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.745988-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arSegmentationComposite.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.747245-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute0.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.748750-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute1.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.749140-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute2.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761189-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute3.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761611-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute4.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.761983-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute5.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.762604-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute6.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.763575-0500 AppToTest-02-14-23[21446:1307215] [Assets] Resolving material name 'engine:BuiltinRenderGraphResources/AR/arInPlacePostProcessCombinedPermute7.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
2023-02-14 17:14:54.764859-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 18: Json Deserialization; unknown member 'EnableARProbes' - skipping.
2023-02-14 17:14:54.764902-0500 AppToTest-02-14-23[21446:1307215] [Foundation.Serialization] Json Parse Error line 20: Json Deserialization; unknown member 'EnableGuidedFilterOcclusion' - skipping.
2023-02-14 17:14:55.531748-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534559-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534633-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534680-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534733-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534777-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534825-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534871-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:55.534955-0500 AppToTest-02-14-23[21446:1307215] throwing -10878
2023-02-14 17:14:56.207438-0500 AppToTest-02-14-23[21446:1307383] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [2]
2023-02-14 17:17:15.741931-0500 AppToTest-02-14-23[21446:1307414] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
2023-02-14 17:22:07.075990-0500 AppToTest-02-14-23[21446:1308137] [Technique] ARWorldTrackingTechnique <0x1149cd900>: World tracking performance is being affected by resource constraints [1]
I'm looking for the ARKit standard face mesh with blend shapes to download. Is this available?
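For reference: as far as I know, Apple does not publish the mesh as a downloadable asset, but at runtime the canonical geometry and blend-shape coefficients are exposed through ARFaceAnchor; a minimal sketch using an ARSCNViewDelegate callback, assuming a running ARFaceTrackingConfiguration session:

import ARKit
import SceneKit

// A sketch: read the canonical face mesh and blend-shape coefficients
// from the face anchor on each frame update.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    let vertices = faceAnchor.geometry.vertices           // canonical mesh vertices
    let jawOpen = faceAnchor.blendShapes[.jawOpen] ?? 0   // 0...1 coefficient
    print("mesh vertices: \(vertices.count), jawOpen: \(jawOpen)")
}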
Hi,
I'm developing an AR app using Apple ARKit.
At the moment, in my AugmentedView, I'm using the boilerplate provided directly by Apple's AR App project template.
When I run my app in debug mode, I see this warning/advice in the console:
ARSession is being deallocated without being paused. Please pause running sessions explicitly.
Is it possible to pause an ARSession in SwiftUI? As far as I know, all of this is managed by default by the OS. Is that correct?
Note that I have two views: a parent view and, via a NavigationStack / NavigationLink, the child "subview".
Here is my code snippet for the parent View:
import SwiftUI

struct ArIntroView: View {
    var body: some View {
        NavigationStack {
            NavigationLink(destination: AugmentedView(), label: {
                HStack {
                    Text("Go to ARView")
                }
                .padding()
            })
        }
    }
}

struct ArIntroView_Previews: PreviewProvider {
    static var previews: some View {
        ArIntroView()
    }
}
Here is my code for the child View:
import SwiftUI
import RealityKit

struct AugmentedView: View {
    @State private var showingSheet = false
    // Add state to try explicitly ending AR
    // @State private var isExitTriggered: Bool = false

    var body: some View {
        ZStack {
            ARViewContainer()
                // Hide the toolbar in this view
                .toolbar(.hidden, for: .tabBar)
                .ignoresSafeArea(.all)
            VStack {
                Spacer()
                HStack {
                    Button {
                        // Toggle the bottom sheet
                        showingSheet.toggle()
                    } label: {
                        Text("Menu")
                            .frame(maxWidth: .infinity)
                            .padding()
                    }
                    .background()
                    .foregroundColor(.black)
                    .cornerRadius(10)
                    // Present the bottom sheet
                    .sheet(isPresented: $showingSheet) {
                        HStack {
                            Button {
                                // Dismiss the bottom sheet
                                showingSheet.toggle()
                            } label: {
                                Label("Exit", systemImage: "cross")
                            }
                        }
                        .presentationDetents([.medium])
                        .presentationDragIndicator(.visible)
                    }
                }
                .padding()
            }
            Spacer()
        }
    }
}

struct ARViewContainer: UIViewRepresentable {
    // @Binding var isExitTriggered: Bool
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        // Load the "Box" scene from the "Experience" Reality file
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    // This doesn't work: it removes the view and then re-creates it from makeUIView
    func updateUIView(_ uiView: ARView, context: Context) {
        // if isExitTriggered {
        //     uiView.session.pause()
        //     uiView.removeFromSuperview()
        // }
    }
}

#if DEBUG
struct Augmented_Previews: PreviewProvider {
    static var previews: some View {
        AugmentedView()
    }
}
#endif
In addition, when running the app and checking performance with Apple Instruments, I see memory leaks during the AR session, apparently related to the CoreRE library; here is the snapshot:
Any suggestion or critique will be welcome.
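For reference, a minimal sketch of one way to silence the deallocation warning, assuming it fires because the ARView is torn down while its session is still running: UIViewRepresentable's dismantleUIView(_:coordinator:) provides a hook to pause the session explicitly; the container name is a placeholder.

import SwiftUI
import RealityKit

// A sketch: pause the ARSession explicitly when SwiftUI tears the view down.
struct PausingARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        ARView(frame: .zero)
    }

    func updateUIView(_ uiView: ARView, context: Context) {}

    static func dismantleUIView(_ uiView: ARView, coordinator: ()) {
        uiView.session.pause()   // pause before deallocation
    }
}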
Is there a particle system in RealityKit? If so, can someone point me to the correct documentation/articles?
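For reference: earlier RealityKit releases had no built-in particle system, but here is a sketch assuming iOS 18 / visionOS 1 or later, where ParticleEmitterComponent was introduced; the preset and property names are from memory and worth verifying against the documentation.

import RealityKit

// A sketch: start from a built-in preset and attach the emitter to an entity.
var emitter = ParticleEmitterComponent.Presets.sparks
emitter.mainEmitter.birthRate = 500   // assumed property path; tune as needed

let sparksEntity = Entity()
sparksEntity.components.set(emitter)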