Orbital : Expanding the Audiobook Experience
A generative world that translates the rhythm of orbit and sound into spatial storytelling. Reimagining the audiobook as a living, responsive environment.
Concept
Orbital is an experimental world developed by Raik Studio as part of a research initiative exploring how sound, motion, and environment can extend narrative beyond the page.
Originally conceived in response to Orbital by author Samantha Harvey for Audible, the project investigates how the structure of an audiobook might unfold spatially, transforming literary rhythm into physical and visual motion.
Research Focus
Each chapter of Harvey’s book follows the orbit of astronauts circling Earth. Raik translated this structure into a generative system where gravitational rhythm, sound, and light are choreographed in real time.
Orbital 14, descending. Generative, audio-reactive visuals that capture one of the final chapters of the novel. Music by Nils Frahm.
Unique generative and traditional artwork (such as the planet texturing) adds a human dimension to the deep-space theme, drawing parallels with the narrative.
Methodology
Data-driven simulation of orbital motion and sound resonance.
Dynamic lighting system responding to narrative cadence and emotional intensity.
Spatial sound sketching to test audience orientation and immersion.
Prototyped in-studio using real-time rendering and multi-projection mapping.
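The choreography described above can be sketched as a simple parameter mapping: orbital position drives a day/night lighting curve, and a narrative term modulates it. This is a minimal illustrative sketch, not the studio's actual pipeline; the `narrative_intensity` control and the specific constants are assumptions for the example.

```python
import math

ORBIT_PERIOD_S = 90 * 60  # assumed ISS-like orbit: ~90 minutes per revolution

def orbital_phase(t: float) -> float:
    """Normalized position along the orbit, 0.0-1.0."""
    return (t % ORBIT_PERIOD_S) / ORBIT_PERIOD_S

def light_level(t: float, narrative_intensity: float = 0.5) -> float:
    """Blend the orbit's day/night cycle with a narrative cadence term.

    narrative_intensity (0-1) is a hypothetical per-chapter control that
    keeps night passes from going fully dark during intense passages.
    """
    # Cosine day/night curve: 1.0 at orbital "noon", 0.0 past the terminator.
    daylight = max(0.0, math.cos(2 * math.pi * orbital_phase(t)))
    # The narrative term lifts the floor of the lighting range.
    floor = 0.15 * narrative_intensity
    return floor + (1.0 - floor) * daylight
```

In a real-time environment this function would be evaluated per frame and fed to the lighting rig, with `t` driven by the audiobook's playback clock.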
Potential Applications
While developed as a self-initiated research piece, Orbital proposes a new format for literary and cultural experiences, one where listening becomes an embodied act.
The system is designed for adaptation as:
A museum or festival installation, integrating narrative and spatial choreography.
An immersive brand environment translating stories or product worlds into evolving, responsive spaces.
A performance framework for generative music or spoken word.
Status
Prototype phase complete, now open for co-commission, exhibition, or adaptation.
For the artwork on the pyramid, Raik created a custom system that pulls in generative AI and layers multiple images around a specific narrative.
Chapter 3, ascending. Dynamic audio-reactive artwork accompanied by excerpts read from the novel.
NY Tesseract : Public Installation (Research Prototype, 2024)
A responsive installation that treats New York’s collective memory like water: when touched, it ripples, revealing layered histories through motion, sound, and light.
Concept
NY Tesseract is a Raik Studio research project exploring how urban memory can be experienced as a living surface. Drawing on the poetic idea that water “holds” traces of what passes through it, the work turns interaction into revealed memory: gestures create ripples that expose fragments of the city’s people, places, and time.
Research Focus
How can touch, movement, and proximity become instruments that surface hidden narratives?
We study memory as a dynamic field, not a static archive, where audience input distorts, blends, and recomposes images and sound in real time, revealing memory through fluid motion, immersive sound, and dynamic visuals.
Experience & Interaction
Visitors shape the piece through touch, body movement, and mobile input.
Surfaces warp and refract like disturbed water.
Visual strata layer and shear, surfacing archival textures and newly captured city traces.
Spatial audio swells and thins in response to collective activity, creating a chorus of place.
Methodology
Interactive video & generative systems: TouchDesigner pipelines translate sensor data and mobile inputs into fluid distortion, layering, and reveal.
Real-time compositing: A shader stack mimics refraction, interference, and turbidity to “pull up” deeper layers.
Immersive sound design: Parameterized stems respond to gesture density and dwell time, reinforcing the metaphor of memory as an elastic medium.
Site-responsive build: Projection, screens, and speaker layout adapt to architectural conditions; content can be localized to each neighborhood/venue.
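The sound-design step above, where parameterized stems respond to gesture density and dwell time, can be sketched as a small smoothing loop. This is a hedged illustration, not the installation's actual code; the attack/release rates and the 120-second dwell normalization are assumed values that would be tuned per site.

```python
from dataclasses import dataclass

@dataclass
class StemState:
    """Gain for one parameterized audio stem (0.0-1.0)."""
    gain: float = 0.0

def update_stem(state: StemState, gesture_density: float, dwell_s: float,
                attack: float = 0.3, release: float = 0.05) -> StemState:
    """Move a stem's gain toward a target driven by collective activity.

    gesture_density: interaction events per second, pre-normalized to 0-1.
    dwell_s: average visitor dwell time; longer dwell deepens the mix.
    attack/release are hypothetical smoothing rates, tuned per venue.
    """
    # Target swells with activity; dwell adds a slow-building layer (capped).
    target = min(1.0, gesture_density + min(dwell_s / 120.0, 0.4))
    # Fast attack, slow release gives the "swell and thin" behavior.
    rate = attack if target > state.gain else release
    return StemState(gain=state.gain + rate * (target - state.gain))
```

Run once per control tick, this makes the audio swell quickly when a crowd engages and recede gradually as activity drops, matching the "chorus of place" described above.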
Applications & Formats
Museum / festival installation: A participatory archive where communities “play” the city’s memory.
Public space activation: Lobby/atrium or cultural corridor; content packs can be customized to the site.
Brand & civic collaborations: Launches, placemaking, or heritage programs where audience contribution becomes authored content.
Commissioning Notes
Prototype ready: real-time system tested in-studio; adaptable to single- or multi-projection.
Scalable footprint: 8–20m projection or LED wall; 8–16 channel audio; optional sensors (ToF/LiDAR/vision).
Production model: 2–3 week discovery (localize content + site plan) → 4–6 week prototype → install & calibration (3–7 days).
Accessibility: Designed for continuous flow; no instructions required; clear affordances via light and sound.
Status
Research prototype complete. Open for co-commission, exhibition, or site-specific adaptation.
Memory Machine: Translating Place into Identity
Research prototype, Maria Hernandez Park, Bushwick, NY
Concept
Memory Machine explores how a park can function as a communication system rather than a backdrop. Beginning with field recordings and photographs from Maria Hernandez Park, Raik Studio builds a generative framework where memory becomes material: audio textures, visual strata, and simple gestures cohere into an environment that expresses identity instead of merely signaling it.
Research Focus
Can everyday traces (audio, image, motion) become the raw material of an identity?
What happens when brand is defined by place + participation, not just marks and guidelines?
How might a living archive of neighborhood inputs evolve over time—reflecting change, not freezing it?
Experience
Visitors move through an immersive field of projection and sound.
Footsteps and proximity modulate rhythm and density.
Photographic textures abstract into patterns, revealing layers of civic memory.
Voices (collected with consent) ebb and return as shifting harmonics—identity as chorus rather than slogan.
The result is a worldbuilt identity: dynamic, participatory, rooted in the specific.
Custom generative artwork was created for each scene in the project.
Methodology
Field-based data capture: location audio, ambient rhythms, ambient light/time-of-day photos.
AI-assisted abstraction: trained on site-specific imagery to derive palettes, textures, and motifs.
Generative graphics + real-time audio: TouchDesigner pipelines, custom GLSL; optional Unreal for spatial previews.
Interaction layer: camera/ToF/LiDAR for crowd flow; mobile web controls for community prompts.
System thinking: identity components (motion, texture, tone) codified as a brand grammar that can be licensed and extended.
Metrics: dwell time, repeat visitation, sentiment capture, UGC volume.
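The AI-assisted abstraction step above, deriving palettes from site-specific imagery, can be illustrated with a deliberately simple stand-in: coarse quantization of raw pixel values. The real pipeline uses a trained model on site photography; this sketch only shows the shape of the input/output contract (pixels in, dominant colors out).

```python
from collections import Counter

def derive_palette(pixels, n_colors=3, bucket=32):
    """Quantize RGB pixels into coarse buckets and return dominant colors.

    pixels: iterable of (r, g, b) tuples, e.g. sampled from a site photo.
    bucket: quantization step; 32 collapses near-identical shades together.
    """
    def quantize(rgb):
        # Snap each channel down to the nearest bucket boundary.
        return tuple((c // bucket) * bucket for c in rgb)
    counts = Counter(quantize(p) for p in pixels)
    return [color for color, _ in counts.most_common(n_colors)]
```

The returned colors would seed the identity's texture and motion components, so the palette stays grounded in what was actually photographed on site.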
Ethics & Community
This work centers consent, attribution, and co-authorship. Data capture and voice participation are opt-in; prompts and credits surface contributors. The system avoids aestheticizing displacement—place-first, people-forward is the guiding constraint.
Status
Research prototype complete. Open for site-specific adaptation, co-commission, or civic partnership.
Credits / Tools
Raik Studio — concept, system, design.
TouchDesigner, custom GLSL, Unreal Engine preview, field recording kit, spatial audio toolchain.
(Music source material: original stems only; licensed works used only for internal tests.)
Beneath the Surface : Bio-Presence Installation
An interactive video wall that reveals the body’s hidden rhythms. Motion, breath, and micro-gesture translated into living fields of light and sound.
Concept
Beneath the Surface explores how we might see presence rather than depict it. Using real-time motion tracking and procedural systems inspired by anatomical structures, the installation converts movement into organic abstractions—vein-like flows, cellular swarms, neural lattices—suggesting the invisible processes that sustain life.
Research Focus
Can body movement and breath become inputs for a non-diagnostic, experiential visualization of internal rhythm?
What forms make embodiment legible without resorting to literal medical imagery?
How does a space respond when multiple bodies co-compose a living field?
Experience
As visitors walk alongside the wall, the environment anticipates, mirrors, and diverges from their gestures.
Proximity modulates density and scale of structures.
Velocity shears and braids flows; stillness invites micro-patterns to emerge.
Group motion creates interference—collective presence forming larger “organ” behaviors.
Paired spatial audio breathes with the visuals, producing a calm, physiological sense of drift and surge.
Methodology
Sensing: camera/ToF/LiDAR for silhouette, velocity, and dwell; optional breath-proxy via micro-motion.
Generative system: TouchDesigner + custom GLSL for fluid fields, particle advection, and lattice growth; optional Unreal for spatial previews.
Palette & form: Anatomy-inspired, non-diagnostic motifs derived from public-domain references and procedural rules (no PHI).
Adaptive behaviors: parameter sets for “pulse,” “diffusion,” “repair,” and “coherence” to script states over a show run.
Audio: responsive stems mapped to density, flow, and stillness; 8–16ch spatial layout.
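The adaptive-behavior step above, with parameter sets for "pulse," "diffusion," "repair," and "coherence," can be sketched as named states that a show controller crossfades between. The parameter names and values here are hypothetical placeholders; the real Behavior Library would define many more.

```python
# Hypothetical parameter sets for the four named behaviors; real values
# come from the installation's Behavior Library and are tuned per site.
BEHAVIORS = {
    "pulse":     {"flow_speed": 0.8, "density": 0.6, "lattice_growth": 0.2},
    "diffusion": {"flow_speed": 0.3, "density": 0.9, "lattice_growth": 0.1},
    "repair":    {"flow_speed": 0.2, "density": 0.4, "lattice_growth": 0.9},
    "coherence": {"flow_speed": 0.5, "density": 0.5, "lattice_growth": 0.5},
}

def blend_states(a: str, b: str, t: float) -> dict:
    """Linearly crossfade between two behavior states (t in 0-1)."""
    pa, pb = BEHAVIORS[a], BEHAVIORS[b]
    return {k: (1 - t) * pa[k] + t * pb[k] for k in pa}
```

Scripting a show run then reduces to a schedule of (state, duration) pairs, with `blend_states` handling the transitions so the field never cuts abruptly between behaviors.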
Applications & Formats
Museums & cultural venues: bio-presence as aesthetic research; performance scaffold for dancers/musicians.
Wellness & public space: restorative, body-aware environments for lobbies, hospitals (non-clinical), or meditative rooms.
R&D interface: a speculative, real-time body-driven UI for education and engagement.
Commissioning Notes
Prototype ready: studio-tested, scales from 8–24m LED/projection.
Footprint: 8–16ch audio, 2–6 sensors, blackout optional; works in daylight with LED.
Timeline: Discovery (2–3 wks: site plan + behavior grammar) → Prototype (4–6 wks) → Install (3–7 days).
Deliverables: live installation plus Behavior Library (states + parameters), Technical Rider, and an Operations Mode for daily run (open/idle/peak).
Partners: adaptable to local AV integrators; Raik to supervise calibration.
Ethics & Data
No medical diagnosis, no PHI, no storage of personally identifiable biometric data.
Sensor streams are processed ephemerally; optional anonymized metrics limited to dwell and flow.
Visual language is inspired by anatomy—not literal scans—unless site-specific licenses/permissions are provided.
Status
Research prototype complete; open for co-commission and site-specific adaptation (including Knockdown Center configuration).
Credits / Tools
Raik Studio — concept, system, design. TouchDesigner, custom GLSL, Unreal preview, spatial audio toolchain.
(Additional collaborators and venue partners TBC.)