
RB19 Shadow Box
Step inside the garage. A 3D diorama that reacts to your physical movement, allowing you to peek around the RB19 just by moving your head.
Client
Personal Project
Platform
Web App
Deliverables
- UI/UX Design
- Brand Identity
- Frontend Development
- Prompt Engineering
- Product Strategy
The Challenge
Most 3D websites are passive: users drag a mouse to rotate a model, breaking the immersion. I wanted to recreate the feeling of looking into a physical die-cast model display case, where the perspective shifts naturally as you move, not when you click a button.
The Solution
I built a "Digital Shadow Box."

The "Hologram" Effect: By integrating Google MediaPipe, the application hijacks the user's webcam (with permission) to track their eye position in real time.

Inverse Camera Rig: As the user leans left, the 3D camera pans right. This inverse mapping creates a convincing optical illusion of depth, making the screen look like a window into a garage rather than a flat surface.

Atmosphere: To sell the realism, I simulated a "Cold Start" sequence: flickering fluorescent lights and volumetric fog that react to the "engine start" button.
Face-Linked Parallax
The camera coordinate system is bound to the user's nose X/Y coordinates. If the user leans in, the camera zooms; if they lean side-to-side, the perspective shifts.
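One way to derive the lean-in zoom is from how large the face appears to the webcam: the farther apart the eyes are on screen, the closer the viewer. This sketch uses MediaPipe-style normalized landmark coordinates, but the thresholds and FOV range are invented for illustration, not the project's actual tuning:

```typescript
// Illustrative lean-in zoom. Landmarks are assumed normalized to
// [0, 1] screen space, as MediaPipe Face Mesh reports them.
type Landmark = { x: number; y: number };

function leanZoom(leftEye: Landmark, rightEye: Landmark, baseFov = 50): number {
  // Apparent inter-eye distance grows as the viewer leans toward the camera.
  const eyeDist = Math.hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y);
  // Map an assumed "far" distance (0.06) → 0 and "close" (0.16) → 1, clamped.
  const nearness = Math.min(Math.max((eyeDist - 0.06) / 0.1, 0), 1);
  // A narrower field of view reads as a zoom toward the car.
  return baseFov - nearness * 15;
}
```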
Volumetric Fog
Custom shaders that create "god rays" coming from the garage ceiling lights, adding thick atmosphere to the scene.
Ignition Sequence
A cinematic button interaction that triggers camera shake, audio revs, and a light flicker sequence synced to the engine RPM.
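As a rough illustration of syncing shake to RPM, here is a sketch with invented parameters (an ease-out rev curve and a linear shake mapping; the project's actual curve and values are not shown here):

```typescript
// Illustrative rev-up curve: RPM rises toward redline with an
// ease-out, so the rev sounds/visuals ramp fast then settle.
function revCurve(t: number, peakRpm = 9000, riseMs = 800): number {
  const k = Math.min(t / riseMs, 1); // normalized progress, clamped at 1
  return peakRpm * (1 - Math.pow(1 - k, 3)); // cubic ease-out
}

// Camera shake amplitude scales with the RPM fraction, so the shake,
// audio, and light flicker can all key off the same value.
function shakeAmplitude(rpm: number, peakRpm = 9000, maxShake = 0.05): number {
  return (rpm / peakRpm) * maxShake;
}
```

Each animation frame would sample `revCurve` with the elapsed time since the button press and feed the result to the shake, audio gain, and flicker systems.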
The Process
This section highlights the technical hurdles of mixing Computer Vision with 3D rendering.
1. The "Fourth Wall" Problem Standard 3D websites feel like toys: you click and drag a model. I wanted to create a "Digital Shadow Box" where the screen feels like a physical window. To do this, I needed to track the user, not the mouse.
2. Taming the Webcam (MediaPipe) I integrated Google’s MediaPipe Face Mesh to track the user’s nose coordinates. The raw data was incredibly jittery (micro-tremors), causing the 3D camera to shake. I implemented a Damped Spring System (using Linear Interpolation or "Lerp") to smooth out the face data, ensuring the camera movement felt heavy and cinematic, not twitchy.
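A minimal sketch of that smoothing: instead of snapping the camera to the raw nose position, it moves a fraction of the remaining distance each frame. The half-life variant is an assumed refinement for frame-rate independence, not necessarily the exact implementation:

```typescript
// Classic lerp: move `alpha` of the way from current to target.
function lerp(current: number, target: number, alpha: number): number {
  return current + (target - current) * alpha;
}

// Frame-rate-independent damping: derive alpha from a half-life so
// the camera closes half the remaining gap every `halfLifeMs`,
// regardless of whether frames arrive at 30fps or 144fps.
function damp(current: number, target: number, halfLifeMs: number, dtMs: number): number {
  const alpha = 1 - Math.pow(0.5, dtMs / halfLifeMs);
  return lerp(current, target, alpha);
}
```

In the render loop this would look like `cameraX = damp(cameraX, noseTargetX, 120, dt)`: jitter in the raw data is absorbed, and the camera eases toward the face with a heavy, cinematic lag.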
3. The Inverse Parallax Math The "hologram" illusion relies on a simple trick: Inverse Mapping.
- If you lean Left, the virtual camera pans Right.
- If you lean In, the virtual camera zooms In.

This mimics how our eyes perceive depth through a physical window frame, tricking the brain into seeing the screen as a 3D box.
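The inverse mapping above, in isolation (the interface names and strength factor are illustrative):

```typescript
// Head offsets are assumed normalized, with 0 meaning centred in
// front of the screen and z measuring how far the viewer leans in.
interface Head { x: number; y: number; z: number }
interface CameraPose { panX: number; panY: number; dolly: number }

function inverseParallax(head: Head, strength = 1.5): CameraPose {
  return {
    panX: -head.x * strength, // lean left → camera pans right
    panY: -head.y * strength, // vertical axis is inverted the same way
    dolly: head.z * strength, // lean in → camera moves in (same sign)
  };
}
```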
4. Performance Optimization Running Computer Vision (Face Tracking) and High-Fidelity 3D (WebGL) simultaneously is heavy. I moved the MediaPipe tracking logic to a separate Web Worker, ensuring the main thread remained free to render the 3D scene at a rock-solid 60fps.
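A sketch of that split, assuming a simple "latest sample wins" protocol between the worker and the render loop (the message shape and names are illustrative, not the project's actual wiring):

```typescript
// The tracking result is reduced to a tiny message so the structured
// clone across the worker boundary stays cheap.
type TrackMessage = { type: "nose"; x: number; y: number; t: number };

// Main-thread side: keep only the newest sample. If the worker posts
// faster than we render, stale frames are dropped rather than queued,
// so the camera never lags behind a backlog of old face positions.
function latestSample(
  pending: TrackMessage | null,
  incoming: TrackMessage
): TrackMessage {
  return pending === null || incoming.t >= pending.t ? incoming : pending;
}

// Browser wiring (not runnable outside a browser):
// const worker = new Worker(new URL("./track.worker.ts", import.meta.url));
// let pending: TrackMessage | null = null;
// worker.onmessage = (e: MessageEvent<TrackMessage>) => {
//   pending = latestSample(pending, e.data);
// };
// Each requestAnimationFrame tick consumes `pending` and resets it.
```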