LP3 Studio is a self-organized creative studio and portfolio platform where work spans 3D motion design and creative direction. The studio functions as both a production framework and an evolving archive, presenting experimental visuals, spatial narratives, and brand-driven systems that bridge digital and physical experiences.
B1.Ladybug, Bee, and Papillon
Brooklyn
2025 3D Animation 1080 x 1080px Built in Cinema 4D, contains AI-generated content
This project explores the transformation of insects into mechanical systems. Their bodies are deconstructed and sliced into layers, then transformed into interlocking gears that rotate, shift, and reassemble into rhythmic motion. The animation functions as a short teaser that moves from organic form to engineered movement, with the opening sequence generated using AI tools.
Click to Play ↗
B2.FOOL
Dumbo / Brooklyn
2025 Mixed Reality Film 1920 x 1080px Shot on iPhone, built in Cinema 4D
Inspired by graffiti’s raw, handwritten typography as found in the streets of Brooklyn, the piece translates the visual language of street writing into spatial form, using 3D modeling and a pyro system to simulate the organic drip and decay of sprayed paint. By bringing graffiti into motion and volume, the work explores how informal, ephemeral marks can be reinterpreted through digital tools while remaining grounded in the physical urban environment.
Click to Play ↗
B3.No. 1 / No. 2
Dumbo / Brooklyn
2025 AI Campaign 1920 x 1080px Images & videos generated using AI
This project imagines fragrance as image: an abstract system where temperature, texture, and light stand in for smell. Two speculative perfumes, No.1 and No.2, unfold through opposing atmospheres, cold clarity and glowing warmth, rendered in close-up surfaces, slow-moving gestures, and luminous material studies. Coffee bean, pear, okra, honey, and fig become visual metaphors rather than ingredients, composing a choreography of density and diffusion. All images and videos in this project were generated using artificial intelligence.
Click to Play ↗
B4.Spotted: Blue Horse Underground
Manhattan / New York City
2025 Mixed Reality Film 1920 x 1080px Shot on iPhone, Cinema 4D
Responding to 2026, the Year of the Horse, and set within a New York City subway station, the work stages the unexpected appearance of a blue horse beneath an existing poster, as if it had quietly emerged from the infrastructure of the underground.
Click to Play ↗
B5.NOVA
Navy Yard
/ Brooklyn
2025 Mixed Reality Film (collaborative) 4K LED Wall, motion capture, 3D scanning, Unreal Engine
A collaborative mixed-reality project that follows a central character traveling through time in search of a time capsule. Produced using an LED wall and Unreal Engine-driven virtual environments, the work merges physical staging with real-time 3D worlds to create a cinematic hybrid space. The project explores speculation, emergence, and temporal displacement through virtual production workflows, spatial choreography, and immersive environmental design.
Click to Play ↗
B6.Jazz 365
New York
2025 Album Cover Design
288 x 288mm Digital animation, silk screen
Jazz 365 is a year-long poster series created from a daily ritual of listening, responding, and designing. Each day, I select a favorite jazz classic or contemporary track and translate its mood, rhythm, and abstraction into a single visual composition. Treating sound as a design system, the project explores how improvisation, syncopation, and tonal contrast can be re-imagined through typography, color, and form, building an evolving archive of musical influence rendered through graphic experimentation.
Click to Play ↗
B7.Weaving
New York
2026 3D Animation
1920 x 1080 px Cinema 4D
This piece explores digital tactility through a simulated weaving process in Cinema 4D. Using procedural systems and physics-driven dynamics, strands interlace to form a dense textile surface, later developed into a soft, fuzzy fur texture.
B8.PicXooo
New York
2023 Web Tool 310 x 310mm
JavaScript
This creative coding project is an interactive image-manipulation tool that invites users to experiment with visual transformation at the pixel level. After uploading an image, users can control a set of custom sliders that algorithmically reshape color, density, and distortion—revealing how small computational changes accumulate into dramatic aesthetic shifts. A dedicated “Blue Mode” introduces a glitch-like chromatic field, isolating figures from their surroundings and generating artificial backdrops through spectral separation. The project foregrounds play, authorship, and real-time feedback, turning image editing into a tactile, exploratory experience driven by code.
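The per-pixel logic behind a mode like “Blue Mode” can be sketched in plain JavaScript. The function below is an illustrative assumption, not the project’s actual source: it operates on flat RGBA data (as returned by a canvas `getImageData().data` call), keeps the blue channel of bright regions, and replaces darker regions with an artificial flat-blue backdrop. The function name, threshold, and backdrop color are all hypothetical.

```javascript
// Hypothetical sketch of a "Blue Mode"-style transform.
// `pixels` is flat RGBA data, e.g. ctx.getImageData(...).data.
function applyBlueMode(pixels, threshold = 128) {
  const out = new Uint8ClampedArray(pixels.length);
  for (let i = 0; i < pixels.length; i += 4) {
    const r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    // Approximate perceptual luminance (Rec. 601 weights).
    const lum = 0.299 * r + 0.587 * g + 0.114 * b;
    if (lum >= threshold) {
      // Bright region: isolate the blue channel, drop red and green.
      out[i] = 0;
      out[i + 1] = 0;
      out[i + 2] = b;
    } else {
      // Dark region: paint an artificial flat-blue backdrop.
      out[i] = 30;
      out[i + 1] = 40;
      out[i + 2] = 200;
    }
    out[i + 3] = pixels[i + 3]; // preserve alpha
  }
  return out;
}
```

Wiring this to a slider is then just re-running the transform with a new `threshold` on each `input` event and writing the result back with `putImageData`.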