Case Study
Archipelago – physics-first VR adventure
A solo-developed VR adventure set on a chain of tropical islands in a near-future dystopia. Built across roughly 18 months and ported from SteamVR to visionOS pre-launch, in collaboration with Apple at its Battersea Developer Lab.
Unity · C# · Swift · Metal · visionOS · SteamVR / OpenXR · Blender
Problem & Context
VR has a software problem: most users stop wearing their headsets within a few months of buying one, mostly because of content quality, accessibility, and comfort. Archipelago set out to push against that by treating VR as a first-class medium rather than a port target — a short, dense, physics-driven adventure built around presence and environmental storytelling.
The game is set around 2099 on a remote archipelago of three tropical islands. Even paradise is polluted: reefs are bleached, the oceans are emptying, and militias fight over ancient ruins hiding a star map. The player uncovers the star map and must choose what to do with it. The design intentionally uses VR's reputation as escapism against itself — drawing the player in with beauty, then confronting them with the slow decay underneath.
Research
- Academic grounding: Steinhaeusser et al. on designing emotion-inducing virtual environments, and Thoma et al. on VR as a tool for climate-change awareness — both pointed to presence and emotional coherence being more important than photorealism.
- Primary research with Valve: I reached out to Valve Software and was connected with Robin Walker, lead programmer on Half-Life: Alyx. His advice on iteration, cutting weak parts of a game, and making "this game" a vehicle for getting better at games in general became the core development philosophy of the project.
- School-wide target-audience survey: validated demand for the shooter and adventure genres among 13–18-year-olds, revealed how rarely people actually use the headsets they own, and pushed me to design for 30–50 minute sessions rather than 2-hour ones.
- Apple Battersea Developer Lab: on-site visit in January 2024 to port the game to visionOS on real hardware months before launch, working with Apple engineers on Metal, hand tracking, and spatial rendering.
- AWE Los Angeles: volunteered at a startup's stand at the Augmented World Expo, got direct feedback from engineers at Apple, Google, and Meta, and worked with a shader programmer on the stereoscopic ocean shader used in the final build.
Design
- Three-act structure, three islands: each island is a level and an act. Act I is familiar tropical colours, Act II introduces contrasting militia outposts, Act III is volcanic with a boss battle and a branching ending.
- Stylised realism: rather than chasing AAA realism (impractical in VR where frames need to render sub-12ms to avoid motion sickness), the art style pushes saturation, lighting, and texture to feel subtly heightened — a dream of paradise with darker secrets underneath.
- Diegetic, presence-first design: no floating UI markers; exploration is guided with environmental cues — light, signs, paths, sound — rather than waypoints. The weapon wheel is accessed via a menu rather than body holsters for accessibility and consistency.
- Spatial audio: diegetic soundscapes with attenuation and reverb simulated in 3D, designed for the visionOS audio-raytracing pipeline so sources feel anchored in the player's space.
Technical Highlights
- Fully physics-based player controller: unlike Half-Life: Alyx — which only simulates the hands — Archipelago simulates the entire player body with rigid bodies and a rolling-ball locomotion system. This makes swimming, jumping, and object interaction behave consistently without bespoke animation.
- Water interaction system: custom Unity scripts built on the Crest ocean package sample ocean height, normals, and surface velocity every FixedUpdate, triggering collision audio, haptics, and above/below-water state transitions when velocity thresholds are crossed.
- Quaternion-smoothed weapon handling: weapons follow the hand via a blend of Vector3.MoveTowards and Quaternion.Slerp with hand-tuned position/rotation constants, so aiming feels responsive without losing the sense of mass.
- Stereo-aware ocean shader: the original refraction shader sampled a single render texture; in VR this had to be adapted to sample separately for each eye, working within the constraints of foveated rendering on visionOS.
- Cross-platform input: the PC build uses motion controllers via HurricaneVR; the visionOS build replaces all controller input with a limited gesture set (pinch, grab), which drove a redesign of the weapon and inventory UX.
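The rolling-ball locomotion described above can be sketched as a minimal Unity component: the player body rests on a physics sphere driven by stick input, so slopes, momentum, and collisions fall out of the physics engine rather than bespoke animation. The force value and the `ReadStick` helper here are illustrative placeholders, not the shipped implementation.

```csharp
using UnityEngine;

// Sketch of rolling-ball locomotion under assumed tuning values.
[RequireComponent(typeof(Rigidbody))]
public class RollingBallLocomotion : MonoBehaviour
{
    public Transform head;          // HMD transform; defines "forward"
    public float moveForce = 30f;   // placeholder, not the game's tuned value

    Rigidbody ball;

    void Awake() { ball = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        Vector2 stick = ReadStick(); // hypothetical input read

        // Project head-relative input onto the ground plane so "push forward"
        // always means "where I'm looking", regardless of body tilt.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.Cross(Vector3.up, forward);
        Vector3 dir = forward * stick.y + right * stick.x;

        ball.AddForce(dir * moveForce, ForceMode.Acceleration);
    }

    // Placeholder: wire this to HurricaneVR / OpenXR thumbstick input.
    Vector2 ReadStick() { return Vector2.zero; }
}
```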
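The water-state logic can be illustrated with a small FixedUpdate state machine. `SampleOceanHeight` is a hypothetical stand-in for the Crest height query used in the real project, and the velocity threshold is an assumed value:

```csharp
using UnityEngine;

// Sketch of the above/below-water transition check described above.
// SampleOceanHeight is a hypothetical stand-in for the Crest query.
[RequireComponent(typeof(Rigidbody))]
public class WaterInteraction : MonoBehaviour
{
    public float splashVelocityThreshold = 1.5f; // m/s, illustrative value

    Rigidbody body;
    bool submerged;

    void Awake() { body = GetComponent<Rigidbody>(); }

    void FixedUpdate()
    {
        float waterHeight = SampleOceanHeight(transform.position);
        bool nowSubmerged = transform.position.y < waterHeight;

        // Fire splash feedback only on a fast crossing of the surface,
        // so gentle bobbing doesn't spam audio and haptics.
        if (nowSubmerged != submerged &&
            Mathf.Abs(body.velocity.y) > splashVelocityThreshold)
        {
            OnSurfaceCrossed(nowSubmerged);
        }
        submerged = nowSubmerged;
    }

    // Placeholder for the Crest ocean-height sample at this position.
    float SampleOceanHeight(Vector3 pos) { return 0f; }

    void OnSurfaceCrossed(bool entering) { /* trigger splash audio + haptics */ }
}
```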
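The MoveTowards/Slerp weapon blend is roughly the following shape; the speed constants and field names are illustrative, not the game's tuned values:

```csharp
using UnityEngine;

// Sketch of the weapon-follow blend: capped positional catch-up plus
// smoothed rotation, so the weapon lags just enough to imply mass.
public class WeaponFollow : MonoBehaviour
{
    public Transform hand;          // tracked hand target
    public float moveSpeed = 8f;    // m/s cap on positional catch-up (placeholder)
    public float rotateLerp = 12f;  // higher = snappier rotation (placeholder)

    void FixedUpdate()
    {
        // Cap positional speed rather than snapping to the hand.
        transform.position = Vector3.MoveTowards(
            transform.position, hand.position, moveSpeed * Time.fixedDeltaTime);

        // Frame-rate-independent slerp toward the hand's rotation.
        float t = 1f - Mathf.Exp(-rotateLerp * Time.fixedDeltaTime);
        transform.rotation = Quaternion.Slerp(transform.rotation, hand.rotation, t);
    }
}
```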
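One way to structure the controller/gesture split described above is a single input interface with a backend per platform. This is a hypothetical sketch of that seam, not the project's actual API:

```csharp
using UnityEngine;

// Hypothetical platform-input seam: gameplay code reads this interface,
// and each build supplies its own backend.
public interface IHandInput
{
    bool GrabHeld { get; }        // controller grip, or hand-tracked grab gesture
    bool PinchPressed { get; }    // trigger, or thumb-index pinch
    Vector3 HandPosition { get; }
}

// SteamVR build: implemented over HurricaneVR controller state.
// visionOS build: implemented over hand-tracking pinch/grab gestures,
// which is what forced the weapon and inventory UX redesign.
```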
Outcomes
Delivered a Minimum Viable Product of the game across both SteamVR and visionOS, with the visionOS version reviewed by Apple for App Store submission. The project was the artefact for my A-Level EPQ, resulted in an invite-only visit to Apple's Battersea Developer Lab, and taught me how to ship and iterate on a large C# codebase alone. It also made very concrete what "presence" means as a design goal — the thing worth optimising for above almost everything else in VR.