---
name: realitykit-visionos-developer
description: Build, debug, and optimize RealityKit scenes for visionOS, including entity/component setup, rendering, animation, physics, audio, input, attachments, and custom systems. Use when implementing RealityKit features or troubleshooting ECS behavior on visionOS.
---
# RealityKit visionOS Developer

## Description and Goals

This skill provides comprehensive guidance for implementing RealityKit-based spatial experiences on visionOS. RealityKit uses an Entity Component System (ECS) architecture where entities are lightweight containers, behavior comes from components, and systems drive per-frame updates.
### Goals
- Enable developers to build immersive 3D experiences on visionOS using RealityKit
- Provide clear guidance on when to use each component and system
- Help developers understand ECS patterns and best practices
- Support debugging and optimization of RealityKit scenes
- Ensure proper integration with SwiftUI via RealityView
## What This Skill Should Do

When implementing RealityKit features on visionOS, this skill should:

- **Guide component selection**: Help you choose the right components for rendering, interaction, physics, audio, and animation needs
- **Provide system implementation patterns**: Show how to create custom systems for continuous behavior
- **Offer code examples**: Demonstrate common patterns like async asset loading, interactive entities, and custom systems
- **Highlight best practices**: Emphasize proper async loading, component registration, and performance considerations
- **Warn about pitfalls**: Identify common mistakes like using ARView on visionOS or blocking the main actor
Load the appropriate component or system reference file from the tables below for detailed usage, code examples, and best practices.
## Core Concepts

### Entities and Components

- Entities are lightweight containers; behavior comes from components.
- Prefer composition over inheritance; define a custom `Component` (conforming to `Codable`) when you need per-entity state.
- Register custom components once with `Component.registerComponent()` before use, as in the sketch after this list.
- Keep entity transforms and component updates on the main actor.
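A minimal sketch of this pattern, using a hypothetical `HealthComponent` for per-entity state:

```swift
import RealityKit

// Hypothetical per-entity state, stored as a custom component.
struct HealthComponent: Component, Codable {
    var current: Float = 100
    var maximum: Float = 100
}

// Register once (for example, at app launch) before any entity uses it.
HealthComponent.registerComponent()

// Attach and read the component on an entity.
let enemy = Entity()
enemy.components.set(HealthComponent(current: 80, maximum: 100))
if let health = enemy.components[HealthComponent.self] {
    print("Health: \(health.current)/\(health.maximum)")
}
```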
### RealityView and Attachments

- Use `RealityView` to bridge SwiftUI and RealityKit.
- Load assets with `Entity(named:)` or `Entity(contentsOf:)` asynchronously and handle errors.
- Always use `ViewAttachmentComponent` for SwiftUI overlays in 3D and avoid the `RealityView` attachments closure; a hedged sketch follows this list.
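A hedged sketch of the `ViewAttachmentComponent` approach. The `rootView:` initializer label and its availability (newer visionOS releases) are assumptions here, so verify against the current API; the `"Scene"` asset name and the label offset are placeholders:

```swift
import SwiftUI
import RealityKit

struct LabeledModelView: View {
    var body: some View {
        RealityView { content in
            do {
                let model = try await Entity(named: "Scene")
                // Assumption: ViewAttachmentComponent(rootView:) embeds a SwiftUI view in the scene.
                let label = Entity()
                label.components.set(ViewAttachmentComponent(rootView: Text("Hello")))
                label.position = [0, 0.2, 0]   // 20 cm above the model's origin
                model.addChild(label)
                content.add(model)
            } catch {
                print("Failed to load entity: \(error)")
            }
        }
    }
}
```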
### Systems and Queries

- Use a custom `System` for continuous, per-frame behavior.
- Query entities with `EntityQuery` + `QueryPredicate` and process them in `update(context:)`.
- Use `SystemDependency` to control update order when multiple systems interact, as in the sketch after this list.
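A sketch of update ordering with `SystemDependency`, using two hypothetical systems (`SteeringSystem`, `MovementSystem`):

```swift
import RealityKit

struct SteeringSystem: System {
    init(scene: RealityKit.Scene) {}
    func update(context: SceneUpdateContext) { /* choose a direction */ }
}

struct MovementSystem: System {
    // Run after SteeringSystem each frame so movement uses fresh steering data.
    static var dependencies: [SystemDependency] { [.after(SteeringSystem.self)] }

    init(scene: RealityKit.Scene) {}
    func update(context: SceneUpdateContext) { /* apply movement */ }
}
```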
## Components Reference

Use these tables to decide which component reference file to load when implementing RealityKit features:
### Rendering and Appearance

| Component | When to Use |
| --- | --- |
| `ModelComponent` | When rendering 3D geometry with meshes and materials on entities. |
| `ModelSortGroupComponent` | When experiencing depth fighting (z-fighting) issues with overlapping geometry or need to control draw order. |
| `OpacityComponent` | When creating fade effects, making entities semi-transparent, or implementing visibility transitions. |
| `AdaptiveResolutionComponent` | When optimizing performance in large scenes by reducing render quality for distant objects. |
| `ModelDebugOptionsComponent` | When debugging rendering issues, visualizing model geometry, or inspecting bounding boxes during development. |
| `MeshInstancesComponent` | When rendering many copies of the same mesh efficiently (trees, crowds, particle-like objects). |
| `BlendShapeWeightsComponent` | When implementing facial animation, character expressions, or morphing mesh deformations. |
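As a small illustration of the rendering components above, a sketch combining `ModelComponent` and `OpacityComponent` (size, color, and opacity values are illustrative):

```swift
import RealityKit

// A box with a simple material, rendered at half opacity.
let box = Entity()
box.components.set(ModelComponent(
    mesh: .generateBox(size: 0.2),
    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
))
box.components.set(OpacityComponent(opacity: 0.5))
```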
### User Interaction

| Component | When to Use |
| --- | --- |
| `InputTargetComponent` | When making entities interactive (tappable, draggable) or handling user input events. |
| `ManipulationComponent` | When implementing built-in drag, rotate, and scale interactions with hand gestures or trackpad. |
| `GestureComponent` | When implementing custom gesture recognition beyond what ManipulationComponent provides. |
| `HoverEffectComponent` | When providing visual feedback when users look at or hover over interactive entities. |
| `AccessibilityComponent` | When making entities accessible to screen readers, VoiceOver, or other assistive technologies. |
| `BillboardComponent` | When creating 2D sprites, text labels, or UI elements that should always face the viewer. |
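As an illustration, a sketch of a tappable, hover-highlighted entity driven from SwiftUI; names and sizes are illustrative, and the collision/input setup mirrors the Implementation Patterns section below:

```swift
import SwiftUI
import RealityKit

struct TappableSphereView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))
            sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))
            sphere.components.set(InputTargetComponent())
            sphere.components.set(HoverEffectComponent())   // highlight when the user looks at it
            content.add(sphere)
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // value.entity is the tapped RealityKit entity.
                    print("Tapped \(value.entity.name)")
                }
        )
    }
}
```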
### Anchoring and Spatial

| Component | When to Use |
| --- | --- |
| `AnchoringComponent` | When anchoring virtual content to detected planes, tracked images, hand locations, or world targets. |
| `ARKitAnchorComponent` | When accessing the underlying ARKit anchor data for an anchored entity. |
| `SceneUnderstandingComponent` | When accessing scene understanding data like detected objects or room reconstruction. |
| `DockingRegionComponent` | When defining regions where content can automatically dock or snap into place. |
| `ReferenceComponent` | When implementing lazy loading of external entity assets or referencing entities in other files. |
| `AttachedTransformComponent` | When attaching an entity's transform to another entity for hierarchical positioning. |
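For example, a sketch that anchors content to a horizontal plane via `AnchoringComponent`; the classification and minimum bounds are illustrative:

```swift
import RealityKit

// Anchor a small model to any horizontal plane at least 20 cm square.
let anchored = Entity()
anchored.components.set(AnchoringComponent(
    .plane(.horizontal, classification: .any, minimumBounds: [0.2, 0.2])
))
anchored.addChild(ModelEntity(mesh: .generateBox(size: 0.1)))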
### Cameras

### Lighting and Shadows
### Audio

| Component | When to Use |
| --- | --- |
| `SpatialAudioComponent` | When playing 3D positioned audio that changes based on listener position and orientation. |
| `AmbientAudioComponent` | When playing non-directional ambient audio that doesn't change with listener position. |
| `ChannelAudioComponent` | When playing channel-based audio content (stereo, surround, etc.) without spatialization. |
| `AudioLibraryComponent` | When storing and managing multiple audio resources for reuse across entities. |
| `ReverbComponent` | When applying reverb effects to an entity's audio for spatial acoustic simulation. |
| `AudioMixGroupsComponent` | When grouping audio sources for centralized mixing control and volume management. |
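For example, a sketch that plays a spatialized one-shot sound from an entity; the asset name `chime.wav` and the gain value are placeholders:

```swift
import RealityKit

// Attach spatial audio to an entity and play a one-shot sound.
func playChime(on entity: Entity) async throws {
    entity.components.set(SpatialAudioComponent(gain: -6))          // slight attenuation, in dB
    let resource = try await AudioFileResource(named: "chime.wav")  // placeholder asset name
    _ = entity.playAudio(resource)  // returns an AudioPlaybackController you can pause or stop
}
```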
### Animation and Character
### Physics and Collision

| Component | When to Use |
| --- | --- |
| `CollisionComponent` | When defining collision shapes for hit testing, raycasting, or physics interactions. |
| `PhysicsBodyComponent` | When adding physical behavior (mass, gravity, forces) to entities for physics simulation. |
| `PhysicsMotionComponent` | When controlling linear and angular velocity of physics bodies programmatically. |
| `PhysicsSimulationComponent` | When configuring global physics simulation parameters like gravity or timestep. |
| `ParticleEmitterComponent` | When emitting particle effects (smoke, sparks, debris) from an entity position. |
| `ForceEffectComponent` | When applying force fields (gravity wells, explosions) that affect multiple physics bodies. |
| `PhysicsJointsComponent` | When creating joints (hinges, springs) between physics bodies for articulated structures. |
| `GeometricPinsComponent` | When defining geometric attachment points for connecting entities at specific locations. |
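For example, a sketch of a dynamic rigid body that participates in the physics simulation; the size and default materials are illustrative:

```swift
import RealityKit

// A dynamic box that falls under gravity once the simulation runs.
let crate = ModelEntity(mesh: .generateBox(size: 0.2))
crate.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
crate.components.set(PhysicsBodyComponent(
    massProperties: .default,
    material: .default,
    mode: .dynamic
))
```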
### Portals and Environments

| Component | When to Use |
| --- | --- |
| `PortalComponent` | When creating portals that render a separate world or scene through an opening. |
| `WorldComponent` | When designating an entity hierarchy as a separate renderable world for portal rendering. |
| `PortalCrossingComponent` | When controlling behavior (teleportation, scene switching) when entities cross portal boundaries. |
| `EnvironmentBlendingComponent` | When blending virtual content with the real environment for mixed reality experiences. |
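For example, a sketch of a portal opening into a separate world; the mesh sizes and the sphere placed inside the world are illustrative:

```swift
import RealityKit

// The separate world, rendered only through the portal.
let world = Entity()
world.components.set(WorldComponent())
world.addChild(ModelEntity(mesh: .generateSphere(radius: 0.5),
                           materials: [SimpleMaterial(color: .orange, isMetallic: false)]))

// A flat opening that shows the world's contents.
let portal = Entity()
portal.components.set(ModelComponent(mesh: .generatePlane(width: 0.6, height: 0.6),
                                     materials: [PortalMaterial()]))
portal.components.set(PortalComponent(target: world))
```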
### Presentation and UI

### Networking and Sync

| Component | When to Use |
| --- | --- |
| `SynchronizationComponent` | When synchronizing entity state, transforms, and components across networked multiplayer sessions. |
| `TransientComponent` | When marking entities as temporary, non-persistent, and excluded from network synchronization. |
## Systems Reference

Use this reference when implementing custom ECS behavior:

| System/API | When to Use |
| --- | --- |
| System and Component Creation | When creating custom systems for continuous, per-frame behavior or custom components for per-entity state. |
## Implementation Patterns

### RealityView Async Load

```swift
RealityView { content in
    do {
        let entity = try await Entity(named: "Scene")
        content.add(entity)
    } catch {
        print("Failed to load entity: \(error)")
    }
}
```
### Interactive Entity Setup

```swift
let entity = ModelEntity(mesh: .generateBox(size: 0.1))
entity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])]))
entity.components.set(InputTargetComponent())
entity.components.set(ManipulationComponent())
```
### Custom System Skeleton

```swift
import RealityKit

struct SpinComponent: Component, Codable {
    var speed: Float
}

struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            entity.transform.rotation *= simd_quatf(angle: spin.speed * Float(context.deltaTime), axis: [0, 1, 0])
        }
    }
}

// Register the component and system once, e.g. at app launch, before the scene uses them.
SpinComponent.registerComponent()
SpinSystem.registerSystem()
```
## Pitfalls and Checks

- Always load assets asynchronously; avoid blocking the main actor.
- Avoid `ARView` on visionOS; use `RealityView`.
- Add `CollisionComponent` + `InputTargetComponent` for draggable or tappable entities.
- Don't use the `RealityView` update closure for continuous behavior; use a custom `System` instead.
- Built-in mesh generation is limited to box, sphere, plane, cylinder, and cone (see the sketch after this list).
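For reference, the built-in primitive generators (parameter values are illustrative):

```swift
import RealityKit

let box      = MeshResource.generateBox(size: 0.1)
let sphere   = MeshResource.generateSphere(radius: 0.05)
let plane    = MeshResource.generatePlane(width: 0.2, depth: 0.2)
let cylinder = MeshResource.generateCylinder(height: 0.2, radius: 0.05)
let cone     = MeshResource.generateCone(height: 0.2, radius: 0.05)
```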