Pipeline
A modular, scalable approach using procedural workflows
and automation for asset creation and engine integration.
Developed for internal teams.
Trigger mesh decimation and texture baking from Microsoft Teams, powered by headless Houdini PDG running on a virtual machine.
I designed a Teams-based Power App that allows artists and engineers to submit asset optimization jobs directly from their everyday workspace. Users define LOD percentages, baking settings, and output preferences without opening Houdini or touching command-line tools.
Each request generates a structured job file consumed by a headless Houdini PDG pipeline running on a dedicated virtual machine. A queue and watchdog service manage execution, enforce resource limits, and keep stalled tasks from blocking the pipeline, ensuring stable and predictable processing.
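The job file and queue described above can be sketched in Python. This is a minimal illustration, not the production format: the field names, the queue cap, and the `JobQueue` class are all assumptions standing in for the real Power App payload and VM-side services.

```python
import json
from collections import deque

# Hypothetical job schema; field names are illustrative, not the production format.
def make_job(asset_path, lod_percent, bake_resolution=2048):
    """Build a structured job dict like the one the Power App would emit."""
    return {
        "asset": asset_path,
        "lod_percent": lod_percent,        # target decimation percentage
        "bake_resolution": bake_resolution,
        "status": "queued",
    }

class JobQueue:
    """FIFO queue with a cap, standing in for the VM-side queue service."""
    def __init__(self, max_pending=10):
        self.max_pending = max_pending
        self._jobs = deque()

    def submit(self, job):
        """Accept a job unless the queue is full; reject instead of blocking."""
        if len(self._jobs) >= self.max_pending:
            return False
        self._jobs.append(job)
        return True

    def next_job(self):
        """Hand the oldest pending job to the headless worker, or None."""
        return self._jobs.popleft() if self._jobs else None
```

In this sketch the queue rejects submissions past its cap rather than blocking, mirroring the enforced limits that keep processing predictable; serializing each job dict with `json.dumps` would yield the structured job file the pipeline consumes.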
Because the system runs remotely on a VM, jobs can be triggered from any device, whether a laptop, home setup, or studio machine, without requiring local installs or imposing hardware constraints.
No DCC dependency on user machines
Device-agnostic access through Teams
Centralized compute and controlled execution
Reduced friction between artists and engineering
Scalable batch processing without manual intervention
Process and optimize large-scale LiDAR point clouds into Unity-ready meshes with automated segmentation, decimation, UVs, and texture baking.
I designed a procedural pipeline that ingests high-density LiDAR point clouds and converts them into structured, optimized mesh assets for real-time use.
The system scans directories for raw datasets, processes and splits point clouds into manageable regions, remeshes geometry at a controlled density, and performs automated UV generation and texture baking. Each stage is handled through Houdini using a PDG-driven workflow, allowing scalable and repeatable batch processing across large environments.
The pipeline restructures unorganized scan data into segmented, Unity-ready assets with predictable topology, reduced polycount, and baked textures suitable for runtime performance.
Optimized mesh segments are exported with consistent naming conventions, clean hierarchy structure, and LOD-ready outputs for direct Unity integration.
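A consistent naming convention like the one mentioned above might be generated as follows. The specific pattern shown here is hypothetical, chosen only to illustrate how predictable names support direct Unity integration.

```python
def export_name(environment, region, lod):
    """Build a predictable export name such as 'plant_r003_LOD1'.

    The pattern (environment, zero-padded region index, LOD suffix) is an
    illustrative convention, not the pipeline's actual scheme.
    """
    return f"{environment}_r{region:03d}_LOD{lod}"
```

Zero-padded region indices keep file listings sorted, and a fixed LOD suffix lets Unity-side import scripts group LOD variants of the same segment without extra metadata.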
Converts heavy LiDAR scans into real-time ready assets
Automated segmentation of large environments
Controlled mesh density for performance targeting
Batch processing through procedural workflows
Clean export structure for Unity scenes and LOD systems
Procedural pipeline using Houdini PDG to prepare Gaussian splat data for real-time, level-of-detail streaming.
I developed a procedural pipeline using Houdini PDG to prepare Gaussian splat data for real-time streaming across multiple levels of detail. The system focuses on cleaning, structuring, and organizing large splat datasets so they can be efficiently streamed and managed at runtime.
Captured splat data is first processed to remove outliers and unstable points, improving visual consistency and runtime performance. Each level of detail provided by the capture process is then spatially aligned and divided into a volumetric grid, allowing the data to be handled in smaller, independent regions rather than as a single dataset.
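The cleanup and volumetric binning described above can be sketched in Python. This is a minimal illustration: the distance-based outlier filter and uniform cell size are assumptions, and the actual Houdini PDG stages are considerably more involved.

```python
import math

def remove_outliers(points, center, max_dist):
    """Drop splat points farther than max_dist from a reference center.

    A deliberately simple stand-in for the pipeline's cleanup of outliers
    and unstable points.
    """
    return [p for p in points if math.dist(p, center) <= max_dist]

def bin_to_grid(points, cell_size):
    """Assign each 3D point to a cube in a volumetric grid.

    Returns a dict mapping integer cell keys to the points they contain,
    so each region can be processed and streamed independently.
    """
    grid = {}
    for p in points:
        key = tuple(int(c // cell_size) for c in p)
        grid.setdefault(key, []).append(p)
    return grid
```

Binning every LOD with the same `cell_size` keeps the grids spatially aligned across detail levels, which is what lets the runtime swap a cell's representation without moving its boundaries.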
The volumetric grids are generated consistently across all levels of detail and are associated with simple descriptive metadata, such as spatial context and LOD classification. This information is used by the runtime system to support distance-based level-of-detail selection and dynamic loading in Unity, ensuring higher-density splats are shown near the user while lower-density representations are used farther away.
As the user moves or teleports through the environment, grid cubes are loaded and unloaded dynamically, maintaining visual quality where it matters most while keeping performance predictable in large-scale scenes.
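The runtime behavior above, distance-based LOD selection and loading cells around the user, can be sketched as follows. The thresholds, load radius, and two-LOD assumption are illustrative; the actual Unity implementation is in C# and driven by the per-cell metadata described earlier.

```python
import math

def select_lod(distance, thresholds=(10.0, 30.0)):
    """Pick an LOD index from camera distance: 0 is densest.

    Assumes ascending distance thresholds; anything beyond the last
    threshold falls to the coarsest level.
    """
    for lod, limit in enumerate(thresholds):
        if distance <= limit:
            return lod
    return len(thresholds)

def cells_to_load(user_pos, cell_size, radius):
    """Return the grid-cell keys within a load radius of the user.

    Uses a cube of cells for simplicity; cells outside this set would be
    unloaded as the user moves or teleports.
    """
    cx, cy, cz = (int(c // cell_size) for c in user_pos)
    r = int(math.ceil(radius / cell_size))
    return {
        (cx + dx, cy + dy, cz + dz)
        for dx in range(-r, r + 1)
        for dy in range(-r, r + 1)
        for dz in range(-r, r + 1)
    }
```

Diffing `cells_to_load` between frames yields the load and unload sets, so only the data around the user is ever resident, which is the core of keeping performance predictable in large scenes.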
This approach makes large Gaussian splat environments practical for real-time use by reducing visual noise, adapting detail based on distance, and streaming only the data needed around the user.