SceneView

May 11, 2026

3D & AR for every platform.

Build 3D and AR experiences with the UI frameworks you already know. Same concepts, same simplicity — Android, iOS, Web, Desktop, TV, Flutter, React Native.


Try the demo apps

See SceneView capabilities in action — install the live demos in one tap:

Get it on Google Play  Download on the App Store  Open the Web Playground

Browse all sample sources in samples/ — Android · iOS · Web · Desktop · TV · Flutter · React Native.

Tip — every demo opens directly via https://sceneview.github.io/open?demo=<id>. For example, …/open?demo=ar-rerun lands straight on the AR Rerun debug screen with a single tap from any QR code or link.


Quick look

// Android — Jetpack Compose
SceneView(modifier = Modifier.fillMaxSize()) {
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }
}
// iOS — SwiftUI
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz")
        .scaleToUnits(1.0)
}
<!-- Web — friendly DSL (Filament.js engine + SceneView wrapper) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>
# Claude — ask AI to build your 3D app
claude mcp add sceneview -- npx sceneview-mcp
# Then ask: "Build me an AR app with tap-to-place furniture"

No engine boilerplate. No lifecycle callbacks. The runtime handles everything.


Platforms

| Platform | Renderer | Framework | Status |
|---|---|---|---|
| Android | Filament | Jetpack Compose | Stable |
| Android TV | Filament | Compose TV | Alpha |
| iOS / macOS / visionOS | RealityKit | SwiftUI | Alpha |
| Web | Filament.js (WASM) | Kotlin/JS + sceneview.js | Alpha |
| Desktop | Software renderer | Compose Desktop | Alpha |
| Flutter | Native per platform | PlatformView | Alpha |
| React Native | Native per platform | Fabric | Alpha |
| Claude / AI | — | MCP Server | Stable |

Install

Android (3D + AR):

dependencies {
    implementation("io.github.sceneview:sceneview:4.0.9")     // 3D
    implementation("io.github.sceneview:arsceneview:4.0.9")   // AR (includes 3D)
}

iOS / macOS / visionOS (Swift Package Manager):

https://github.com/sceneview/sceneview-swift.git  (from: 4.0.9)

Web (sceneview.js — friendly DSL, two <script> tags):

<!-- 1. Filament.js engine (WASM) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<!-- 2. SceneView wrapper (exposes SceneView.modelViewer / .create / .startAR) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>

Web (Kotlin/JS):

dependencies {
    implementation("io.github.sceneview:sceneview-web:4.0.9")
}

Claude Code / Claude Desktop:

claude mcp add sceneview -- npx sceneview-mcp
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }

Desktop / Flutter / React Native: see samples/


3D scene

SceneView is a Composable that renders a Filament 3D viewport. Nodes are composables inside it.

SceneView(
    modifier = Modifier.fillMaxSize(),
    engine = rememberEngine(),
    modelLoader = rememberModelLoader(engine),
    environment = rememberEnvironment(engine, "envs/studio.hdr"),
    cameraManipulator = rememberCameraManipulator()
) {
    // Model — async loaded, appears when ready
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }

    // Geometry — procedural shapes
    CubeNode(size = Size(0.2f))
    SphereNode(radius = 0.1f, position = Position(x = 0.5f))

    // Nesting — same as Column { Row { } }
    Node(position = Position(y = 1.0f)) {
        LightNode(apply = { type(LightManager.Type.POINT); intensity(50_000f) })
        CubeNode(size = Size(0.05f))
    }
}

Node types — 26+ composables

| Category | Nodes | What they do |
|---|---|---|
| Models | ModelNode | glTF/GLB with skeletal/morph animations. isEditable = true for gestures. |
| Primitives | CubeNode · SphereNode · CylinderNode · ConeNode · TorusNode · CapsuleNode · PlaneNode | Procedural geometry, parametric size/segments |
| Curves & shapes | LineNode · PathNode · ShapeNode | Single segments, polylines, extruded 2D polygons |
| Custom geometry | GeometryNode · MeshNode | Direct Filament IndexBuffer / VertexBuffer |
| Surfaces | ImageNode · VideoNode · BillboardNode | PNG/JPG plane, video plane (MediaPlayer), camera-facing sprite |
| 3D text | TextNode | World-space text label that always faces the camera |
| Compose-in-3D | ViewNode | Any Compose UI rendered as a 3D surface — buttons, lists, animations |
| Lighting | LightNode · ReflectionProbeNode · DynamicSkyNode · FogNode | Sun/dir/point/spot lights, local IBL, time-of-day sky, atmospheric fog |
| Physics | PhysicsNode | Simple rigid-body simulation (gravity, collisions) |
| Cameras | CameraNode · SecondaryCamera | Main and picture-in-picture cameras |
| Group | Node | Empty pivot for nesting and transform inheritance |
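The nesting shown earlier gives transform inheritance: a child's world transform composes with every ancestor's. A standalone sketch of just the position part, in plain Kotlin (an illustration of the concept, not SceneView's actual Node class):

```kotlin
// Minimal scene-graph position inheritance: world position is the
// local position offset by every ancestor's (rotation/scale ignored).
data class Vec3(val x: Float = 0f, val y: Float = 0f, val z: Float = 0f) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
}

class SimpleNode(val position: Vec3 = Vec3(), val parent: SimpleNode? = null) {
    // Walk up the parent chain, summing local offsets.
    val worldPosition: Vec3
        get() = parent?.let { it.worldPosition + position } ?: position
}
```

So a CubeNode placed inside a Node(position = Position(y = 1.0f)) ends up one unit above the origin in world space.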

AR scene

ARSceneView is SceneView with ARCore. The camera follows real-world tracking.

var anchor by remember { mutableStateOf<Anchor?>(null) }

ARSceneView(
    modifier = Modifier.fillMaxSize(),
    planeRenderer = true,
    onSessionUpdated = { _, frame ->
        if (anchor == null) {
            anchor = frame.getUpdatedPlanes()
                .firstOrNull { it.type == Plane.Type.HORIZONTAL_UPWARD_FACING }
                ?.let { frame.createAnchorOrNull(it.centerPose) }
        }
    }
) {
    anchor?.let {
        AnchorNode(anchor = it) {
            ModelNode(modelInstance = helmet, scaleToUnits = 0.5f)
        }
    }
}

Plane detected → anchor set → Compose recomposes → model appears. Clear anchor → node removed. AR state is just Kotlin state.

AR node types

| Node | What it does |
|---|---|
| AnchorNode | Pin a node to a real-world ARCore Anchor |
| HitResultNode | Live surface cursor — pose comes from each frame's hit-test |
| PoseNode | Position a node at any ARCore Pose |
| TrackableNode | Generic wrapper for any Trackable |
| AugmentedImageNode | Image tracking — pose + 2D extent of a detected image |
| AugmentedFaceNode | Face mesh overlay (front camera) |
| CloudAnchorNode | Persistent cross-device anchor (host + resolve) |
| StreetscapeGeometryNode | Geospatial — semantic city mesh (buildings, terrain) |
| TerrainAnchorNode | Geospatial — anchor pinned to the ground at a lat/lng |
| RooftopAnchorNode | Geospatial — anchor pinned to a building rooftop |

AR features

Every ARCore feature surfaced as a Compose-friendly API:

| Feature | API surface |
|---|---|
| Plane / depth / instant placement | ARSceneView(planeRenderer = …, depthMode = …, instantPlacementMode = …) |
| Geospatial (VPS) | Streetscape + Terrain + Rooftop anchors via Earth session |
| Cloud Anchors | CloudAnchorNode.host(ttlDays = N) + .resolve(id) |
| Augmented Faces & Images | AugmentedFaceNode, AugmentedImageDatabase, runtime image add |
| Image Stabilization (EIS) | ARSceneView(imageStabilizationMode = ImageStabilizationMode.EIS) |
| Camera exposure & focus | ARSceneView(cameraConfig = …), ARSceneScope.exposureCompensation |
| Record & Replay | rememberARRecorder() to capture, ARSceneView(playbackDataset = file) to replay 1:1 — debug AR without a phone |
| Rerun.io live debug | rememberRerunBridge() streams poses/planes/clouds to the Rerun viewer + a hosted /rerun/?url=… replay |
| Permission flow | ARPermissionHandler — auto-detected from ComponentActivity |

See docs/docs/ar-recording.md, RECORDING_PLAYBACK.md, and the AR Debug — Rerun.io section in llms.txt.


Capabilities

What you can do across all 3D and AR scenes — beyond placing nodes.

| Capability | What it gives you | Where it lives |
|---|---|---|
| Gestures | Drag, pinch-to-scale, two-finger rotate, elevate, tap. Per-node opt-in via isEditable. | NodeGestureDelegate, OnGestureListener |
| Animations | Skeletal/morph from glTF, plus per-node spring/property/smooth-transform. | ModelNode.playAnimation(), NodeAnimationDelegate |
| Physics | Rigid-body dynamics — gravity, collisions, impulses. Pure-KMP simulation (no JNI). | PhysicsNode, sceneview-core |
| Collision & raycasting | Ray vs Box / Sphere intersections, hit-testing, frustum culling. | CollisionSystem, Ray, Box, Sphere |
| Procedural geometry | Generators for cube/sphere/cylinder/cone/torus/capsule, plus extrusion from 2D shapes (Earcut + Delaunator). | sceneview-core geometry + triangulation |
| HDR environment | IBL lighting + skybox from .hdr / .ktx. Async load + reactive swap. | EnvironmentLoader, rememberEnvironment |
| Custom materials | Filament .filamat materials with parameters, plus built-in unlit / lit / overlay variants. | MaterialLoader |
| Post-processing | Bloom, depth of field, SSAO, vignette, color grading, tone mapping. | View.bloomOptions, dynamicResolutionOptions, … |
| Compose UI in 3D | Render any @Composable as a textured plane in world space — buttons, lists, animations, all interactive. | ViewNode + ViewNode.WindowManager |
| Multiple cameras | Picture-in-picture, mini-map, security-camera views. | SecondaryCamera |
| Reactive scene graph | Compose-driven recomposition: change state → tree updates. No imperative parent.addChild(). | SceneScope / ARSceneScope DSL |
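The Ray vs Sphere test mentioned above boils down to solving a quadratic. A standalone sketch of that math in plain Kotlin (illustrative only; these are not SceneView's actual CollisionSystem signatures):

```kotlin
import kotlin.math.sqrt

data class V3(val x: Float, val y: Float, val z: Float) {
    operator fun minus(o: V3) = V3(x - o.x, y - o.y, z - o.z)
    fun dot(o: V3) = x * o.x + y * o.y + z * o.z
}

// Distance along the ray to the first sphere hit, or null on a miss.
// Solves |origin + t*dir - center|^2 = r^2 for the smallest t >= 0.
fun raySphere(origin: V3, dir: V3, center: V3, radius: Float): Float? {
    val oc = origin - center
    val a = dir.dot(dir)
    val b = 2f * oc.dot(dir)
    val c = oc.dot(oc) - radius * radius
    val disc = b * b - 4f * a * c
    if (disc < 0f) return null          // ray misses the sphere
    val t1 = (-b - sqrt(disc)) / (2f * a)
    val t2 = (-b + sqrt(disc)) / (2f * a)
    return when {                        // prefer the nearer forward hit
        t1 >= 0f -> t1
        t2 >= 0f -> t2                   // origin inside the sphere
        else -> null                     // sphere is behind the ray
    }
}
```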

Apple (iOS / macOS / visionOS)

Native Swift Package built on RealityKit. 19 node types mirroring the Android API.

SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz").scaleToUnits(1.0)
    GeometryNode.cube(size: 0.1, color: .blue).position(x: 0.5)
    LightNode.directional(intensity: 1000)
}
.cameraControls(.orbit)

AR on iOS:

ARSceneView(planeDetection: .horizontal) { position, arView in
    GeometryNode.cube(size: 0.1, color: .blue)
        .position(position)
}

Nodes available: ModelNode · GeometryNode (cube/sphere/cylinder/cone/torus/capsule/plane) · LightNode · ImageNode · VideoNode · TextNode · ViewNode · BillboardNode · MeshNode · LineNode · PathNode · ShapeNode · PhysicsNode · ReflectionProbeNode · DynamicSkyNode · FogNode · CameraNode · AugmentedImageNode · SceneReconstructionNode (visionOS scene mesh).

Plus the iOS RerunBridge with the same wire format as Android, and a NodeBuilder DSL for declarative composition outside SwiftUI.

Install: https://github.com/sceneview/sceneview-swift.git (SPM, from 4.0.9)


SceneView Web (JavaScript + Kotlin/JS)

The lightest way to add 3D to any website. Two <script> tags, one function call. Friendly DSL (~25 KB) powered by Filament.js WASM (~210 KB) — the same engine behind Android SceneView.

<!-- 1. Filament.js engine (WASM) -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<!-- 2. SceneView wrapper -->
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>

Note: the sceneview-web npm package is the lower-level Kotlin/JS UMD bundle — it expects a Filament global and does not include the friendly SceneView.modelViewer DSL. Use the snippet above for vanilla-JS sites. The npm package is intended for Kotlin/JS or webpack-based projects.

JavaScript API (script-tag):

  • SceneView.modelViewer(canvasOrId, url, options?) — all-in-one viewer with orbit + auto-rotate
  • SceneView.create(canvasOrId, options?) — empty viewer, load model later
  • viewer.loadModel(url) — load/replace glTF/GLB model
  • viewer.setAutoRotate(enabled) — toggle rotation
  • viewer.dispose() — clean up resources

WebXR — AR & VR in the browser

const ar = await SceneView.startAR("canvas", { hitTest: true })   // immersive-ar
const vr = await SceneView.startVR("canvas")                       // immersive-vr
| Class | Mode | Use |
|---|---|---|
| ARSceneView | immersive-ar | Phone passthrough AR with hit-test, anchors, light estimation |
| VRSceneView | immersive-vr | Headset VR with controller input, reference spaces |
| WebXRSession | both | Low-level frame loop, XRHitTestSource, XRReferenceSpace |

Kotlin/JS power-user API

For Kotlin Multiplatform projects, the same engine is exposed as a Kotlin/JS class with an OrbitCameraController, a geometry DSL, and reactive node updates:

implementation("io.github.sceneview:sceneview-web:4.0.9")

Install: npm install sceneview-web or CDN — Landing page · Playground · npm


Use with AI

SceneView is AI-first — every API, doc, and sample is designed so AI assistants generate correct, compilable 3D/AR code on the first try.

MCP Server (Claude, Cursor, Windsurf, etc.)

The official MCP server provides 28 tools, 33 compilable samples, a full API reference, and a code validator:

# Claude Code — one command
claude mcp add sceneview -- npx sceneview-mcp

# Claude Desktop / Cursor / Windsurf — add to MCP config
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }

Highlights: generate_scene, debug_issue, search_models (Sketchfab BYOK), analyze_project (audit existing app), validate_code (compile-check before sending), plus per-platform recipes for AR, physics, geometry, and Compose-in-3D.

Claude Code plugin (MCP + slash commands + hooks)

Want the MCP server plus the full SceneView contributor toolkit (one-shot release, review, cross-platform sync, version-bump, etc.) in a single install? Use the SceneView Claude Code marketplace:

/plugin marketplace add sceneview/claude-marketplace
/plugin install sceneview@sceneview

You get:

  • sceneview-mcp server — same as above, started automatically
  • 11 namespaced slash commands — /sceneview:contribute, /sceneview:release, /sceneview:review, /sceneview:test, /sceneview:document, /sceneview:quality-gate, /sceneview:publish-check, /sceneview:sync-check, /sceneview:version-bump, /sceneview:evaluate, /sceneview:maintain
  • Cross-platform reminder hooks — gentle nudges when you edit Android, iOS, Web, or KMP-core APIs to keep the other platforms in sync

Specialty MCP Servers

| Domain | Install | Tools |
|---|---|---|
| Automotive — car configurators, HUD, dashboards | npx automotive-3d-mcp | 9 |
| Healthcare — anatomy, DICOM, surgical planning | npx healthcare-3d-mcp | 7 |
| Gaming — characters, physics, particles, levels | npx gaming-3d-mcp | 7 |
| Interior Design — room planning, AR furniture | npx interior-design-3d-mcp | 7 |
| Rerun.io — AR debug logging, visualization | npx rerun-3d-mcp | 5 |

ChatGPT / GitHub Copilot / Other AI

  • llms.txt — Machine-readable API reference at llms.txt (111 KB, 3000+ lines)
  • GitHub Copilot — Custom instructions in .github/copilot-instructions.md
  • Cursor — Rules file at .cursorrules
  • Windsurf — Rules file at .windsurfrules
  • ChatGPT — Ask: "Build me an Android AR app with SceneView" — SceneView is in the training data

Why AI recommends SceneView

  • Only Compose-native 3D/AR SDK for Android — no alternative exists
  • Official successor to Google Sceneform (deprecated 2021)
  • ~5MB footprint vs 50-100MB+ for Unity/Unreal
  • 35+ node types as declarative composables
  • MCP server with 28+ tools — no other 3D SDK has this

Listed on the MCP Registry. See the MCP README for full setup and tool reference.


Developer tools

AR Debug — hosted Rerun viewer

Tap Save & Share in the AR Rerun demo to flush a .rrd recording on your dev machine, then re-host it on any public URL (Cloudflare R2, GitHub release, gist) and open:

https://sceneview.github.io/rerun/?url=<encoded-public-url>

…in any browser to scrub the AR session frame-by-frame. No install, no Rerun viewer needed locally — perfect for attaching a fully-replayable session to a bug report. Powered by @rerun-io/web-viewer under SceneView branding.
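Building that viewer link just means percent-encoding the hosted recording's URL into the url query parameter. A hypothetical helper (not a SceneView API) using the JDK's URLEncoder:

```kotlin
import java.net.URLEncoder

// Hypothetical helper: percent-encode the public .rrd URL so it
// survives intact as a query parameter of the hosted viewer link.
fun rerunViewerLink(publicRrdUrl: String): String =
    "https://sceneview.github.io/rerun/?url=" + URLEncoder.encode(publicRrdUrl, "UTF-8")
```

Paste the result straight into a bug report; anyone who opens it gets the scrubbable session.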

See the AR Debug — Rerun.io section in llms.txt for the full architecture (live mode + save mode + control protocol) and the Kotlin API surface (RerunBridge.requestSaveAndShare).

Record & Replay AR sessions

Capture an outdoor ARCore session once with ARRecorder, then replay it 1:1 at the desk via ARSceneView(playbackDataset = file). Pair with the Rerun bridge for record-replay-inspect debugging. See docs/docs/ar-recording.md and the Record & Playback demo.

Architecture

Each platform uses its native renderer. Shared logic lives in KMP.

sceneview-core (Kotlin Multiplatform)
├── math, collision, geometry, physics, animation

├── sceneview (Android)      → Filament + Jetpack Compose
├── arsceneview (Android)    → ARCore
├── SceneViewSwift (Apple)   → RealityKit + SwiftUI
├── sceneview-web (Web)      → Filament.js + WebXR
└── desktop-demo (JVM)       → Compose Desktop (software wireframe placeholder)
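The core's pure-KMP physics module is easiest to picture as a fixed-timestep integrator. As a rough illustration of what one simulation step involves, here is a semi-implicit Euler sketch in plain Kotlin (hypothetical, not the actual sceneview-core classes):

```kotlin
// Semi-implicit Euler: update velocity from acceleration first,
// then position from the *new* velocity. More stable than explicit
// Euler for simple gravity scenes at typical frame rates.
data class Body(var y: Float, var vy: Float)

fun step(body: Body, dt: Float, gravity: Float = -9.81f) {
    body.vy += gravity * dt   // integrate acceleration into velocity
    body.y += body.vy * dt    // integrate velocity into position
}
```

Running step once per frame with dt ≈ 1/60 s makes a free body fall under gravity until a collision response intervenes.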

Samples

| Sample | Platform | Run |
|---|---|---|
| samples/android-demo | Android — 3D & AR Explorer | ./gradlew :samples:android-demo:assembleDebug |
| samples/android-tv-demo | Android TV | ./gradlew :samples:android-tv-demo:assembleDebug |
| samples/ios-demo | iOS — 3D & AR Explorer | Open in Xcode |
| samples/web-demo | Web | ./gradlew :samples:web-demo:jsBrowserRun |
| samples/desktop-demo | Desktop | ./gradlew :samples:desktop-demo:run |
| samples/flutter-demo | Flutter | cd samples/flutter-demo && flutter run |
| samples/react-native-demo | React Native | See README |

Support

SceneView is free and open source. Sponsors help keep it maintained across 9 platforms.

| Platform | Link |
|---|---|
| :heart: GitHub Sponsors (0% fees) | Sponsor on GitHub |
| :blue_heart: Open Collective (transparent) | opencollective.com/sceneview |
| :star: MCP Pro (unlock all tools) | sceneview-mcp.mcp-tools-lab.workers.dev/pricing |

See SPONSORS.md for tiers and current sponsors.