
Over the past few months, Apple experts have fielded questions about visionOS in Apple Vision Pro developer labs all over the world. Here are answers to some of the most frequent questions they’ve been asked, including insights on new concepts like entities, immersive spaces, collision shapes, and much more.
How can I interact with an entity using gestures?
There are three important pieces to enabling gesture-based entity interaction:
- The entity must have an InputTargetComponent. Otherwise, it won’t receive gesture input at all.
- The entity must have a CollisionComponent. The shapes of the collision component define the regions that gestures can actually hit, so make sure the collision shapes are specified appropriately for interaction with your entity.
- The gesture that you’re using must be targeted to the entity you’re trying to interact with (or to any entity). For example:
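A minimal sketch of a tap gesture targeted to any entity; the view and the printed name are just for illustration:

```swift
import SwiftUI
import RealityKit

struct TapExample: View {
    var tapGesture: some Gesture {
        TapGesture()
            .targetedToAnyEntity()
            .onEnded { value in
                // value.entity is the entity that received the tap.
                print(value.entity.name)
            }
    }

    var body: some View {
        RealityView { content in
            // Add entities here; each needs an InputTargetComponent
            // and a CollisionComponent to receive the tap.
        }
        .gesture(tapGesture)
    }
}
```

To restrict the gesture to a single entity, use targetedToEntity(_:) instead.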
It’s also a good idea to give an interactive entity a HoverEffectComponent, which enables the system to trigger a standard highlight effect when the user looks at the entity.
Should I use a window group, an immersive space, or both?
Consider the technical differences between windows, volumes, and immersive spaces when you decide which scene type to use for a particular feature in your app.
Here are some important technical differences that you should factor into your decision:
- Windows and volumes from other apps the user has open are hidden when an immersive space is open.
- Windows and volumes clip content that exceeds their bounds.
- Users have full control over the placement of windows and volumes. Apps have full control over the placement of content in an immersive space.
- Volumes have a fixed size; windows are resizable.
- ARKit only delivers data to your app if it has an open immersive space.
Explore the Hello World sample code to familiarize yourself with the behaviors of each scene type in visionOS.
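The three scene types can coexist in one app. A minimal sketch of declaring all three, with hypothetical view and identifier names:

```swift
import SwiftUI

@main
struct ExampleApp: App {
    var body: some Scene {
        // A resizable window for 2D content.
        WindowGroup(id: "main") {
            ContentView()
        }

        // A fixed-size volume for bounded 3D content.
        WindowGroup(id: "volume") {
            VolumeContentView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)

        // An immersive space for unbounded 3D content.
        ImmersiveSpace(id: "immersive") {
            ImmersiveView()
        }
    }
}
```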
How can I visualize collision shapes in my scene?
Use the Collision Shapes debug visualization in the Debug Visualizations menu, where you can find several other helpful debug visualizations as well. For information on debug visualizations, check out Diagnosing issues in the appearance of a running app.
Can I position SwiftUI views within an immersive space?
Yes! You can position SwiftUI views in an immersive space with the offset(x:y:) and offset(z:) methods. It’s important to remember that these offsets are specified in points, not meters. You can use PhysicalMetric to convert meters to points.
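A short sketch of the meters-to-points conversion; the view name and distance are assumptions:

```swift
import SwiftUI

struct OffsetView: View {
    // Converts 1 meter into points for the current context.
    @PhysicalMetric(from: .meters) private var oneMeter = 1.0

    var body: some View {
        Text("One meter back")
            // offset(z:) expects points, so pass the converted value.
            .offset(z: -oneMeter)
    }
}
```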
What if I want to position my SwiftUI views relative to an entity in a reality view?
Use the RealityView attachments API to create a SwiftUI view and make it accessible as a ViewAttachmentEntity. This entity can be positioned, oriented, and scaled just like any other entity.
RealityView { content, attachments in
    // Fetch the attachment entity using the unique identifier.
    let attachmentEntity = attachments.entity(for: "uniqueID")!
    // Add the attachment entity as RealityView content.
    content.add(attachmentEntity)
} attachments: {
    // Declare a view that attaches to an entity.
    Attachment(id: "uniqueID") {
        Text("My Attachment")
    }
}
Can I position windows programmatically?
There’s no API available to position windows, but we’d love to know about your use case. Please file an enhancement request. For more information on this topic, check out Positioning and sizing windows.
Is there any way to know what the user is looking at?
As noted in Adopting best practices for privacy and user preferences, the system handles camera and sensor inputs without passing the information to apps directly. There's no way to get precise eye movements or exact line of sight. Instead, create interface elements that people can interact with and let the system manage the interaction. If you have a use case that you can't get to work this way, and as long as it doesn't require explicit eye tracking, please file an enhancement request.
When are the onHover and onContinuousHover actions called on visionOS?
The onHover and onContinuousHover actions are called when a finger is hovering over the view, or when the pointer from a connected trackpad is hovering over the view.
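A sketch of reacting to those hover events; the view and colors are placeholders:

```swift
import SwiftUI

struct HoverableView: View {
    @State private var hoverLocation: CGPoint?

    var body: some View {
        Rectangle()
            .fill(hoverLocation == nil ? .blue : .orange)
            .onContinuousHover { phase in
                switch phase {
                case .active(let point):
                    // Called repeatedly while a finger or connected
                    // trackpad pointer hovers over the view.
                    hoverLocation = point
                case .ended:
                    hoverLocation = nil
                }
            }
    }
}
```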
Can I show my own immersive environment textures in my app?
If your app has an ImmersiveSpace open, you can create a large sphere with an UnlitMaterial and scale it to have inward-facing geometry:
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            do {
                // Create the sphere mesh.
                let mesh = MeshResource.generateSphere(radius: 10)
                // Create an UnlitMaterial.
                var material = UnlitMaterial(applyPostProcessToneMap: false)
                // Give the UnlitMaterial your equirectangular color texture.
                let textureResource = try await TextureResource(named: "example")
                material.color = .init(tint: .white, texture: .init(textureResource))
                // Create the model.
                let entity = ModelEntity(mesh: mesh, materials: [material])
                // Scale the model so that its mesh faces inward.
                entity.scale.x *= -1
                content.add(entity)
            } catch {
                // Handle the error.
            }
        }
    }
}
I have existing stereo videos. How can I convert them to MV-HEVC?
AVFoundation provides APIs to write videos in MV-HEVC format. For a full example, download the sample code project Converting side-by-side 3D video to multiview HEVC.
To convert your videos to MV-HEVC:
- Create an AVAsset for each of the left and right views.
- Use AVOutputSettingsAssistant to get output settings that work for MV-HEVC.
- Specify the horizontal disparity adjustment and field of view (this is asset specific). Here’s an example:
- Create an AVAssetWriterInputTaggedPixelBufferGroupAdaptor as the input for your AVAssetWriter.
- Create an AVAssetReader for each of the left and right video tracks.
- Read the left and right tracks, then append matching samples to the tagged pixel buffer group adaptor:
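A hedged sketch of the output-settings portion of the steps above. The preset name (AVOutputSettingsPreset.mvhevc960x960) and VideoToolbox keys are real, but the disparity and field-of-view values below are placeholders for asset-specific numbers:

```swift
import AVFoundation
import VideoToolbox

// Ask the assistant for baseline MV-HEVC output settings.
let assistant = AVOutputSettingsAssistant(preset: .mvhevc960x960)
var outputSettings = assistant?.videoSettings ?? [:]

// Add the asset-specific stereo properties (placeholder values).
var compressionProperties =
    (outputSettings[AVVideoCompressionPropertiesKey] as? [String: Any]) ?? [:]
compressionProperties[kVTCompressionPropertyKey_HorizontalDisparityAdjustment as String] = 200
compressionProperties[kVTCompressionPropertyKey_HorizontalFieldOfView as String] = 90_000 // millidegrees
outputSettings[AVVideoCompressionPropertiesKey] = compressionProperties
```

Matching left- and right-eye pixel buffers are then wrapped as tagged buffers and appended through the tagged pixel buffer group adaptor; see the sample code project for the complete reading and writing loop.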
How can I light my scene in RealityKit on visionOS?
You can light your scene in RealityKit on visionOS by:
- Using a system-provided automatic lighting environment that updates based on real-world surroundings.
- Providing your own image-based lighting via an ImageBasedLightComponent. To see an example, create a new visionOS app, select RealityKit as the Immersive Space Renderer, and select Full as the Immersive Space.
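A minimal sketch of the image-based lighting option; "Sunlight" is an assumed environment resource name in the app bundle:

```swift
import SwiftUI
import RealityKit

struct LitView: View {
    var body: some View {
        RealityView { content in
            guard let resource = try? await EnvironmentResource(named: "Sunlight") else { return }

            // An entity that provides the image-based light.
            let lightEntity = Entity()
            lightEntity.components.set(
                ImageBasedLightComponent(source: .single(resource)))

            // A model that opts in to receiving that light.
            let model = ModelEntity(mesh: .generateSphere(radius: 0.1),
                                    materials: [SimpleMaterial()])
            model.components.set(
                ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))

            content.add(lightEntity)
            content.add(model)
        }
    }
}
```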
I see that CustomMaterial isn’t supported on visionOS. Is there a way I can create materials with custom shading?
You can create materials with custom shading in Reality Composer Pro using the Shader Graph. A material created this way is accessible to your app as a ShaderGraphMaterial, so that you can dynamically change inputs to the shader in your code.
For a detailed introduction to the Shader Graph, watch Explore materials in Reality Composer Pro.
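A sketch of loading a Shader Graph material and changing one of its inputs; the material path, file name, and "Speed" parameter are assumptions about your Reality Composer Pro package:

```swift
import RealityKit

// Load the material authored in Reality Composer Pro.
var material = try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                             from: "Scene.usda",
                                             in: realityKitContentBundle)

// Dynamically change a shader input declared in the graph.
try material.setParameter(name: "Speed", value: .float(2.0))

// Apply it to a model entity.
modelEntity.model?.materials = [material]
```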
How can I position entities relative to the position of the device?
In an ImmersiveSpace, you can get the full transform of the device using the queryDeviceAnchor(atTimestamp:) method.
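A sketch of querying the device pose with ARKit; it assumes an open immersive space and an existing `entity` to reposition:

```swift
import ARKit
import QuartzCore
import RealityKit

// Start world tracking (requires an open immersive space).
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()
try await session.run([worldTracking])

// Query the device anchor for the current time.
if let deviceAnchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
    // originFromAnchorTransform is the device's transform in world space.
    entity.transform = Transform(matrix: deviceAnchor.originFromAnchorTransform)
}
```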
Learn more about building apps for visionOS

Q&A: Spatial design for visionOS
Spotlight on: Developing for visionOS
Spotlight on: Developer tools for visionOS
Sample code contained herein is provided under the Apple Sample Code License.