Server data from the Official MCP Registry
3D & AR SDK for Android, iOS, Web — API docs, samples, validation, and code generation.
Valid MCP server (1 strong, 1 medium validity signals). 3 known CVEs in dependencies (0 critical, 3 high severity) ⚠️ Package registry links to a different repository than scanned source. Imported from the Official MCP Registry. 1 finding(s) downgraded by scanner intelligence.
14 files analyzed · 4 issues found
Security scores are indicators to help you make informed decisions, not guarantees. Always review permissions before connecting any MCP server.
Add this to your MCP configuration file:
{
  "mcpServers": {
    "io-github-thomasgorisse-sceneview": {
      "args": [
        "-y",
        "sceneview-mcp"
      ],
      "command": "npx"
    }
  }
}

From the project's GitHub README.
3D & AR for every platform.
Build 3D and AR experiences with the UI frameworks you already know. Same concepts, same simplicity — Android, iOS, Web, Desktop, TV, Flutter, React Native.
See SceneView capabilities in action — install the live demos in one tap:
Browse all sample sources in samples/ — Android · iOS · Web · Desktop · TV · Flutter · React Native.
Tip — every demo opens directly via https://sceneview.github.io/open?demo=<id>. For example, …/open?demo=ar-rerun lands straight on the AR Rerun debug screen with a single tap from any QR code or link.
// Android — Jetpack Compose
SceneView(modifier = Modifier.fillMaxSize()) {
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }
}
// iOS — SwiftUI
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz")
        .scaleToUnits(1.0)
}
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>
# Claude — ask AI to build your 3D app
claude mcp add sceneview -- npx sceneview-mcp
# Then ask: "Build me an AR app with tap-to-place furniture"
No engine boilerplate. No lifecycle callbacks. The runtime handles everything.
| Platform | Renderer | Framework | Status |
|---|---|---|---|
| Android | Filament | Jetpack Compose | Stable |
| Android TV | Filament | Compose TV | Alpha |
| iOS / macOS / visionOS | RealityKit | SwiftUI | Alpha |
| Web | Filament.js (WASM) | Kotlin/JS + sceneview.js | Alpha |
| Desktop | Software renderer | Compose Desktop | Alpha |
| Flutter | Native per platform | PlatformView | Alpha |
| React Native | Native per platform | Fabric | Alpha |
| Claude / AI | — | MCP Server | Stable |
Android (3D + AR):
dependencies {
    implementation("io.github.sceneview:sceneview:4.0.9")   // 3D
    implementation("io.github.sceneview:arsceneview:4.0.9") // AR (includes 3D)
}
iOS / macOS / visionOS (Swift Package Manager):
https://github.com/sceneview/sceneview-swift.git (from: 4.0.9)
Web (sceneview.js — friendly DSL, two <script> tags):
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
Web (Kotlin/JS):
dependencies {
    implementation("io.github.sceneview:sceneview-web:4.0.9")
}
Claude Code / Claude Desktop:
claude mcp add sceneview -- npx sceneview-mcp
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }
Desktop / Flutter / React Native: see samples/
SceneView is a Composable that renders a Filament 3D viewport. Nodes are composables inside it.
SceneView(
    modifier = Modifier.fillMaxSize(),
    engine = rememberEngine(),
    modelLoader = rememberModelLoader(engine),
    environment = rememberEnvironment(engine, "envs/studio.hdr"),
    cameraManipulator = rememberCameraManipulator()
) {
    // Model — async loaded, appears when ready
    rememberModelInstance(modelLoader, "models/helmet.glb")?.let {
        ModelNode(modelInstance = it, scaleToUnits = 1.0f, autoAnimate = true)
    }

    // Geometry — procedural shapes
    CubeNode(size = Size(0.2f))
    SphereNode(radius = 0.1f, position = Position(x = 0.5f))

    // Nesting — same as Column { Row { } }
    Node(position = Position(y = 1.0f)) {
        LightNode(apply = { type(LightManager.Type.POINT); intensity(50_000f) })
        CubeNode(size = Size(0.05f))
    }
}
| Category | Nodes | What they do |
|---|---|---|
| Models | ModelNode | glTF/GLB with skeletal/morph animations. isEditable = true for gestures. |
| Primitives | CubeNode · SphereNode · CylinderNode · ConeNode · TorusNode · CapsuleNode · PlaneNode | Procedural geometry, parametric size/segments |
| Curves & shapes | LineNode · PathNode · ShapeNode | Single segments, polylines, extruded 2D polygons |
| Custom geometry | GeometryNode · MeshNode | Direct Filament IndexBuffer / VertexBuffer |
| Surfaces | ImageNode · VideoNode · BillboardNode | PNG/JPG plane, video plane (MediaPlayer), camera-facing sprite |
| 3D text | TextNode | World-space text label that always faces the camera |
| Compose-in-3D | ViewNode | Any Compose UI rendered as a 3D surface — buttons, lists, animations |
| Lighting | LightNode · ReflectionProbeNode · DynamicSkyNode · FogNode | Sun/dir/point/spot lights, local IBL, time-of-day sky, atmospheric fog |
| Physics | PhysicsNode | Simple rigid-body simulation (gravity, collisions) |
| Cameras | CameraNode · SecondaryCamera | Main and picture-in-picture cameras |
| Group | Node | Empty pivot for nesting and transform inheritance |
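The transform inheritance that the Node row above describes (children compose their transforms with every ancestor's) can be illustrated with a minimal, framework-free sketch; SimpleNode and Vec3 here are hypothetical stand-ins, not SceneView API, and only translation is modeled:

```kotlin
// Hypothetical sketch of transform inheritance between nested nodes.
// Real SceneView nodes also inherit rotation and scale; this models
// translation only, to show how nesting composes positions.
data class Vec3(val x: Float = 0f, val y: Float = 0f, val z: Float = 0f) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
}

class SimpleNode(val position: Vec3 = Vec3(), val parent: SimpleNode? = null) {
    // A child's world position is its local position offset by every ancestor.
    val worldPosition: Vec3
        get() = parent?.worldPosition?.plus(position) ?: position
}

fun main() {
    // Mirrors Node(position = Position(y = 1.0f)) { CubeNode(...) }:
    val pivot = SimpleNode(position = Vec3(y = 1.0f))
    val cube = SimpleNode(position = Vec3(x = 0.5f), parent = pivot)
    println(cube.worldPosition) // Vec3(x=0.5, y=1.0, z=0.0)
}
```

Moving the pivot moves every descendant, which is why an empty Node works as a group handle.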
ARSceneView is SceneView with ARCore. The camera follows real-world tracking.
var anchor by remember { mutableStateOf<Anchor?>(null) }

ARSceneView(
    modifier = Modifier.fillMaxSize(),
    planeRenderer = true,
    onSessionUpdated = { _, frame ->
        if (anchor == null) {
            anchor = frame.getUpdatedPlanes()
                .firstOrNull { it.type == Plane.Type.HORIZONTAL_UPWARD_FACING }
                ?.let { frame.createAnchorOrNull(it.centerPose) }
        }
    }
) {
    anchor?.let {
        AnchorNode(anchor = it) {
            ModelNode(modelInstance = helmet, scaleToUnits = 0.5f)
        }
    }
}
Plane detected → anchor set → Compose recomposes → model appears. Clear anchor → node removed. AR state is just Kotlin state.
| Node | What it does |
|---|---|
| AnchorNode | Pin a node to a real-world ARCore Anchor |
| HitResultNode | Live surface cursor — pose comes from each frame's hit-test |
| PoseNode | Position a node at any ARCore Pose |
| TrackableNode | Generic wrapper for any Trackable |
| AugmentedImageNode | Image tracking — pose + 2D extent of a detected image |
| AugmentedFaceNode | Face mesh overlay (front camera) |
| CloudAnchorNode | Persistent cross-device anchor (host + resolve) |
| StreetscapeGeometryNode | Geospatial — semantic city mesh (buildings, terrain) |
| TerrainAnchorNode | Geospatial — anchor pinned to ground at a lat/lng |
| RooftopAnchorNode | Geospatial — anchor pinned to a building rooftop |
Every ARCore feature surfaced as a Compose-friendly API:
| Feature | API surface |
|---|---|
| Plane / depth / instant placement | ARSceneView(planeRenderer = …, depthMode = …, instantPlacementMode = …) |
| Geospatial (VPS) | Streetscape + Terrain + Rooftop anchors via Earth session |
| Cloud Anchors | CloudAnchorNode.host(ttlDays = N) + .resolve(id) |
| Augmented Faces & Images | AugmentedFaceNode, AugmentedImageDatabase, runtime image add |
| Image Stabilization (EIS) | ARSceneView(imageStabilizationMode = ImageStabilizationMode.EIS) |
| Camera exposure & focus | ARSceneView(cameraConfig = …), ARSceneScope.exposureCompensation |
| Record & Replay | rememberARRecorder() to capture, ARSceneView(playbackDataset = file) to replay 1:1 — debug AR without a phone |
| Rerun.io live debug | rememberRerunBridge() streams poses/planes/clouds to the Rerun viewer + a hosted /rerun/?url=… replay |
| Permission flow | ARPermissionHandler — auto-detected from ComponentActivity |
See docs/docs/ar-recording.md, RECORDING_PLAYBACK.md, and the AR Debug — Rerun.io section in llms.txt.
What you can do across all 3D and AR scenes — beyond placing nodes.
| Capability | What it gives you | Where it lives |
|---|---|---|
| Gestures | Drag, pinch-to-scale, two-finger rotate, elevate, tap. Per-node opt-in via isEditable. | NodeGestureDelegate, OnGestureListener |
| Animations | Skeletal/morph from glTF, plus per-node spring/property/smooth-transform. | ModelNode.playAnimation(), NodeAnimationDelegate |
| Physics | Rigid-body dynamics — gravity, collisions, impulses. Pure-KMP simulation (no JNI). | PhysicsNode, sceneview-core |
| Collision & raycasting | Ray vs Box / Sphere intersections, hit-testing, frustum culling. | CollisionSystem, Ray, Box, Sphere |
| Procedural geometry | Generators for cube/sphere/cylinder/cone/torus/capsule, plus extrusion from 2D shapes (Earcut + Delaunator). | sceneview-core geometry + triangulation |
| HDR environment | IBL lighting + skybox from .hdr / .ktx. Async load + reactive swap. | EnvironmentLoader, rememberEnvironment |
| Custom materials | Filament .filamat materials with parameters, plus built-in unlit / lit / overlay variants. | MaterialLoader |
| Post-processing | Bloom, depth of field, SSAO, vignette, color grading, tone mapping. | View.bloomOptions, dynamicResolutionOptions, … |
| Compose UI in 3D | Render any @Composable as a textured plane in world space — buttons, lists, animations, all interactive. | ViewNode + ViewNode.WindowManager |
| Multiple cameras | Picture-in-picture, mini-map, security-camera views. | SecondaryCamera |
| Reactive scene graph | Compose-driven recomposition: change state → tree updates. No imperative parent.addChild(). | SceneScope / ARSceneScope DSL |
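The collision and raycasting row above mentions Ray vs Sphere intersections. The underlying math can be sketched in a few lines of pure Kotlin; this is a hedged illustration of the standard quadratic ray-sphere test, not the actual CollisionSystem API (V3 and raySphere are hypothetical names):

```kotlin
import kotlin.math.sqrt

// Sketch of the classic ray-vs-sphere test that raycasting systems use.
// Assumes the ray direction is normalized; names are illustrative only.
data class V3(val x: Float, val y: Float, val z: Float) {
    operator fun minus(o: V3) = V3(x - o.x, y - o.y, z - o.z)
    fun dot(o: V3) = x * o.x + y * o.y + z * o.z
}

/** Returns the distance along the ray to the first hit, or null on miss. */
fun raySphere(origin: V3, dir: V3, center: V3, radius: Float): Float? {
    val oc = origin - center
    val b = oc.dot(dir)                 // half-b of the quadratic
    val c = oc.dot(oc) - radius * radius
    val disc = b * b - c                // discriminant
    if (disc < 0f) return null          // ray misses the sphere entirely
    val t = -b - sqrt(disc)             // nearest of the two roots
    return if (t >= 0f) t else null     // hit behind the origin counts as miss
}

fun main() {
    // Ray from the origin along +Z toward a unit sphere centered at z = 5:
    println(raySphere(V3(0f, 0f, 0f), V3(0f, 0f, 1f), V3(0f, 0f, 5f), 1f)) // 4.0
}
```

Hit-testing a scene is then a matter of running this per node (usually against bounding volumes first) and keeping the smallest t.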
Native Swift Package built on RealityKit. 19 node types mirroring the Android API.
SceneView(environment: .studio) {
    ModelNode(named: "helmet.usdz").scaleToUnits(1.0)
    GeometryNode.cube(size: 0.1, color: .blue).position(x: 0.5)
    LightNode.directional(intensity: 1000)
}
.cameraControls(.orbit)
AR on iOS:
ARSceneView(planeDetection: .horizontal) { position, arView in
    GeometryNode.cube(size: 0.1, color: .blue)
        .position(position)
}
Nodes available — ModelNode · GeometryNode (cube/sphere/cylinder/cone/torus/capsule/plane) · LightNode · ImageNode · VideoNode · TextNode · ViewNode · BillboardNode · MeshNode · LineNode · PathNode · ShapeNode · PhysicsNode · ReflectionProbeNode · DynamicSkyNode · FogNode · CameraNode · AugmentedImageNode · SceneReconstructionNode (visionOS scene mesh).
Plus the iOS RerunBridge with the same wire format as Android, and a NodeBuilder DSL for declarative composition outside SwiftUI.
Install: https://github.com/sceneview/sceneview-swift.git (SPM, from 4.0.9)
The lightest way to add 3D to any website. Two <script> tags, one function call.
Friendly DSL (~25 KB) powered by Filament.js WASM (~210 KB) — the same engine behind Android SceneView.
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/filament/filament.js"></script>
<script src="https://cdn.jsdelivr.net/gh/sceneview/sceneview@v4.0.9/website-static/js/sceneview.js"></script>
<script> SceneView.modelViewer("canvas", "model.glb") </script>
Note: the sceneview-web npm package is the lower-level Kotlin/JS UMD bundle — it expects a Filament global and does not include the friendly SceneView.modelViewer DSL. Use the snippet above for vanilla-JS sites. The npm package is intended for Kotlin/JS or webpack-based projects.
JavaScript API (script-tag):
- SceneView.modelViewer(canvasOrId, url, options?) — all-in-one viewer with orbit + auto-rotate
- SceneView.create(canvasOrId, options?) — empty viewer, load model later
- viewer.loadModel(url) — load/replace glTF/GLB model
- viewer.setAutoRotate(enabled) — toggle rotation
- viewer.dispose() — clean up resources

const ar = await SceneView.startAR("canvas", { hitTest: true }) // immersive-ar
const vr = await SceneView.startVR("canvas")                    // immersive-vr
| Class | Mode | Use |
|---|---|---|
| ARSceneView | immersive-ar | Phone passthrough AR with hit-test, anchors, light estimation |
| VRSceneView | immersive-vr | Headset VR with controller input, reference spaces |
| WebXRSession | both | Low-level frame loop, XRHitTestSource, XRReferenceSpace |
For Kotlin Multiplatform projects, the same engine is exposed as a Kotlin/JS class with an OrbitCameraController, a geometry DSL, and reactive node updates:
implementation("io.github.sceneview:sceneview-web:4.0.9")
Install: npm install sceneview-web or CDN — Landing page — Playground — npm
SceneView is AI-first — every API, doc, and sample is designed so AI assistants generate correct, compilable 3D/AR code on the first try.
The official MCP server provides 28 tools, 33 compilable samples, a full API reference, and a code validator:
# Claude Code — one command
claude mcp add sceneview -- npx sceneview-mcp
# Claude Desktop / Cursor / Windsurf — add to MCP config
{ "mcpServers": { "sceneview": { "command": "npx", "args": ["-y", "sceneview-mcp"] } } }
Highlights: generate_scene, debug_issue, search_models (Sketchfab BYOK), analyze_project (audit existing app), validate_code (compile-check before sending), plus per-platform recipes for AR, physics, geometry, and Compose-in-3D.
Want the MCP server plus the full SceneView contributor toolkit (one-shot release, review, cross-platform sync, version-bump, etc.) in a single install? Use the SceneView Claude Code marketplace:
/plugin marketplace add sceneview/claude-marketplace
/plugin install sceneview@sceneview
You get:
- sceneview-mcp server — same as above, started automatically
- Slash commands: /sceneview:contribute, /sceneview:release, /sceneview:review, /sceneview:test, /sceneview:document, /sceneview:quality-gate, /sceneview:publish-check, /sceneview:sync-check, /sceneview:version-bump, /sceneview:evaluate, /sceneview:maintain

| Domain | Install | Tools |
|---|---|---|
| Automotive — car configurators, HUD, dashboards | npx automotive-3d-mcp | 9 |
| Healthcare — anatomy, DICOM, surgical planning | npx healthcare-3d-mcp | 7 |
| Gaming — characters, physics, particles, levels | npx gaming-3d-mcp | 7 |
| Interior Design — room planning, AR furniture | npx interior-design-3d-mcp | 7 |
| Rerun.io — AR debug logging, visualization | npx rerun-3d-mcp | 5 |
- llms.txt (111 KB, 3000+ lines)
- .github/copilot-instructions.md
- .cursorrules
- .windsurfrules

Listed on the MCP Registry. See the MCP README for full setup and tool reference.
Tap Save & Share in the AR Rerun demo to flush a .rrd recording on
your dev machine, then re-host it on any public URL (Cloudflare R2,
GitHub release, gist) and open:
https://sceneview.github.io/rerun/?url=<encoded-public-url>
…in any browser to scrub the AR session frame-by-frame. No install, no
Rerun viewer needed locally — perfect for attaching a fully-replayable
session to a bug report. Powered by @rerun-io/web-viewer under SceneView branding.
See the AR Debug — Rerun.io section in llms.txt for the
full architecture (live mode + save mode + control protocol) and the
Kotlin API surface (RerunBridge.requestSaveAndShare).
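The replay link described above is just the recording's public URL percent-encoded into the url query parameter. A minimal sketch of building it, assuming only that the hosted viewer accepts a percent-encoded URL (rerunReplayLink is a hypothetical helper, not SceneView API):

```kotlin
import java.net.URLEncoder

// Hypothetical helper: build the hosted Rerun replay link from a .rrd URL.
// URLEncoder percent-encodes ':' and '/' so the recording URL survives as
// a single query-parameter value.
fun rerunReplayLink(recordingUrl: String): String =
    "https://sceneview.github.io/rerun/?url=" +
        URLEncoder.encode(recordingUrl, Charsets.UTF_8.name())

fun main() {
    println(rerunReplayLink("https://example.com/session.rrd"))
    // https://sceneview.github.io/rerun/?url=https%3A%2F%2Fexample.com%2Fsession.rrd
}
```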
Capture a real AR session with ARRecorder, then replay it 1:1 at the desk via ARSceneView(playbackDataset = file). Pair with the Rerun bridge for record-replay-inspect debugging. See docs/docs/ar-recording.md and the Record & Playback demo.

Each platform uses its native renderer. Shared logic lives in KMP.
sceneview-core (Kotlin Multiplatform)
├── math, collision, geometry, physics, animation
│
├── sceneview (Android) → Filament + Jetpack Compose
├── arsceneview (Android) → ARCore
├── SceneViewSwift (Apple) → RealityKit + SwiftUI
├── sceneview-web (Web) → Filament.js + WebXR
└── desktop-demo (JVM) → Compose Desktop (software wireframe placeholder)
| Sample | Platform | Run |
|---|---|---|
samples/android-demo | Android — 3D & AR Explorer | ./gradlew :samples:android-demo:assembleDebug |
samples/android-tv-demo | Android TV | ./gradlew :samples:android-tv-demo:assembleDebug |
samples/ios-demo | iOS — 3D & AR Explorer | Open in Xcode |
samples/web-demo | Web | ./gradlew :samples:web-demo:jsBrowserRun |
samples/desktop-demo | Desktop | ./gradlew :samples:desktop-demo:run |
samples/flutter-demo | Flutter | cd samples/flutter-demo && flutter run |
samples/react-native-demo | React Native | See README |
SceneView is free and open source. Sponsors help keep it maintained across 9 platforms.
| | Platform | Link |
|---|---|---|
| :heart: | GitHub Sponsors (0% fees) | Sponsor on GitHub |
| :blue_heart: | Open Collective (transparent) | opencollective.com/sceneview |
| :star: | MCP Pro (unlock all tools) | sceneview-mcp.mcp-tools-lab.workers.dev/pricing |
See SPONSORS.md for tiers and current sponsors.