• Over the course of 100 consecutive days, I committed to daily piano practice and documentation—a journey that transformed not just my musical abilities, but my entire approach to skill development. Each day, I recorded my practice sessions and progress on Instagram, holding myself accountable to the discipline of showing up consistently, regardless of how I felt or how busy life became. What started as a personal challenge became a masterclass in the power of incremental improvement and deliberate practice.

    Throughout this challenge, I developed crucial skills that extended far beyond the piano itself. My sight-reading improved dramatically, my finger dexterity and independence strengthened, and my understanding of music theory deepened through practical application. I learned to identify my weaknesses quickly and target them with focused practice rather than avoiding difficult passages. More importantly, I discovered how to practice efficiently—breaking down complex pieces into manageable sections, using varied repetition techniques, and building muscle memory through consistent, mindful practice.

    The most valuable lesson, however, was understanding the compound effect of daily routine. Missing zero days taught me that consistency beats intensity, and that small, focused efforts accumulate into significant progress over time. This discipline of building and maintaining routines has directly translated to my development work—whether it’s daily coding practice, learning new VR frameworks, or iterating on Unity projects. The mental resilience required to practice piano for 100 straight days is the same resilience that drives me to debug complex VR interactions or master new 3D modeling techniques. Routine isn’t restrictive; it’s the foundation that makes ambitious goals achievable.

  • Beyond performance, I’ve developed a comprehensive music production workflow that mirrors the technical pipelines required in game development and VR content creation. My process involves transcribing and arranging piano pieces in MuseScore, editing and mixing audio in Logic Pro DAW, and creating visualizations using PianoVFX software on my Windows PC. This multi-platform, multi-software workflow has taught me invaluable lessons about technical integration, file format compatibility, and managing complex production pipelines—skills that directly translate to my VR development work.

    The transcription process in MuseScore sharpens my attention to detail and my understanding of musical structure. I analyze performances note-by-note, translating what I hear into accurate notation, adjusting dynamics, articulations, and tempo markings to capture the essence of each piece. Once transcribed, I export MIDI files—a process that requires understanding data formats and ensuring information is preserved across software transitions. In Logic Pro, I refine these MIDI performances, adjusting velocity curves, adding subtle timing variations for musicality, mixing tracks, and applying effects to achieve a polished final sound. The audio is then exported as high-quality WAV files, ready for the final stage.
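
    As a small illustration of the kind of format awareness this step demands, here is a minimal sketch that reads the header chunk of a Standard MIDI File like the ones MuseScore exports. It is only an example of the format, not my production tooling, and the file name is a placeholder.

      // Minimal Standard MIDI File header reader (illustrative sketch only).
      using System;
      using System.IO;

      static class MidiHeaderDump
      {
          // MIDI files store multi-byte numbers big-endian, so bytes are swapped manually.
          static ushort ReadUInt16BE(BinaryReader r)
          {
              byte[] b = r.ReadBytes(2);
              return (ushort)((b[0] << 8) | b[1]);
          }

          static uint ReadUInt32BE(BinaryReader r)
          {
              byte[] b = r.ReadBytes(4);
              return (uint)((b[0] << 24) | (b[1] << 16) | (b[2] << 8) | b[3]);
          }

          static void Main()
          {
              // Placeholder path: substitute any MIDI file exported from MuseScore.
              using var reader = new BinaryReader(File.OpenRead("export.mid"));

              string chunkId  = new string(reader.ReadChars(4)); // should be "MThd"
              ReadUInt32BE(reader);                              // header length field, always 6
              ushort format   = ReadUInt16BE(reader);            // 0, 1, or 2
              ushort tracks   = ReadUInt16BE(reader);            // number of track chunks that follow
              ushort division = ReadUInt16BE(reader);            // ticks per quarter note (when the high bit is 0)

              Console.WriteLine($"{chunkId}: format {format}, {tracks} tracks, {division} ticks per quarter note");
          }
      }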

    This cross-platform workflow—moving seamlessly between macOS (Logic Pro) and Windows (PianoVFX)—has made me comfortable navigating different operating systems and understanding their unique quirks and capabilities. Building the final PianoVFX videos requires synchronizing audio with visual elements, managing file paths, optimizing render settings, and troubleshooting compatibility issues when things don’t work as expected. The systematic problem-solving required here is identical to what I encounter in Unity development: integrating assets from Blender, managing audio files, scripting interactions, and building for different VR platforms. Every step in my music production pipeline reinforces the same principle that drives my development work—understanding how individual components integrate into a cohesive, functional whole.

  • Expanding my production pipeline further, I’ve begun creating custom piano visualizations in Blender using the MIDI PianoMotionPro add-on. This process bridges the gap between my MuseScore transcriptions and 3D animation, allowing me to transform the MIDI data I’ve meticulously crafted into dynamic visual representations. Working with MIDI files in a 3D environment has deepened my understanding of how data drives animation—each note becomes a keyframe, each velocity value influences visual intensity, and timing information controls the choreography of virtual piano keys. This direct relationship between musical data and 3D motion mirrors the data-driven systems I build in Unity, where user input, sensor data, or game state information drives character animations, UI updates, and interactive behaviors.
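
    To make that parallel concrete, here is a simplified Unity sketch of the same idea. Every name in it, such as NoteEvent and keyTransforms, is invented for illustration rather than taken from a real project: a list of timed note events drives virtual piano keys, with velocity mapped to how far each key is pressed.

      using System.Collections.Generic;
      using UnityEngine;

      // Illustrative sketch: timed note events (as parsed from a MIDI file) driving
      // the motion of virtual piano keys. Every name here is hypothetical.
      public class NoteDrivenKeys : MonoBehaviour
      {
          [System.Serializable]
          public struct NoteEvent
          {
              public float timeSeconds;  // when the note starts, in seconds from playback start
              public float duration;     // how long the note is held
              public int keyIndex;       // index of the key object this note maps to
              public float velocity;     // 0..1, scaled from the MIDI velocity range 0..127
          }

          public List<NoteEvent> notes = new List<NoteEvent>();
          public Transform[] keyTransforms;    // one transform per visual key
          public float maxPressDepth = 0.01f;  // metres a key travels when struck at full velocity

          void Update()
          {
              float now = Time.time;

              // Reset every key, then press the ones whose notes are currently sounding.
              foreach (Transform key in keyTransforms)
                  key.localPosition = new Vector3(key.localPosition.x, 0f, key.localPosition.z);

              foreach (NoteEvent note in notes)
              {
                  if (now < note.timeSeconds || now > note.timeSeconds + note.duration)
                      continue;

                  Transform key = keyTransforms[note.keyIndex];
                  float depth = note.velocity * maxPressDepth;  // velocity becomes visual intensity
                  key.localPosition = new Vector3(key.localPosition.x, -depth, key.localPosition.z);
              }
          }
      }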

    The technical challenges of this workflow have been incredibly educational. I’ve learned to troubleshoot issues with MIDI import settings, understand how the add-on parses note data into Blender’s animation curves, and adjust timing offsets to ensure perfect synchronization between audio and visual elements. Blender’s node-based material system for creating custom key colors and lighting effects has taught me about procedural generation and parameterized design—concepts that translate directly to shader programming in Unity. Managing the performance demands of rendering complex 3D scenes with hundreds of animated objects has reinforced the importance of optimization, a critical skill when developing VR applications where frame rate is non-negotiable.
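
    The Unity-side equivalent of that parameterized design looks roughly like the sketch below: a MaterialPropertyBlock lets hundreds of key objects share a single material asset while still receiving individual colors. The component name is made up, and "_Color" assumes the built-in Standard shader.

      using UnityEngine;

      // Sketch: give each key object its own color while every key shares one material
      // asset. "_Color" is the tint property of Unity's built-in Standard shader.
      public class KeyTint : MonoBehaviour
      {
          static MaterialPropertyBlock block;

          public void SetKeyColor(Color color)
          {
              if (block == null)
                  block = new MaterialPropertyBlock();

              Renderer rend = GetComponent<Renderer>();
              rend.GetPropertyBlock(block);     // start from this renderer's current overrides
              block.SetColor("_Color", color);
              rend.SetPropertyBlock(block);     // apply without duplicating the material asset
          }
      }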

    Most importantly, this process has solidified my understanding of asset pipelines and data flow between creative tools. The journey from musical performance to MuseScore notation to MIDI file to Blender animation to final rendered video is a complete production pipeline that requires careful file management, format compatibility awareness, and systematic problem-solving when any link in the chain breaks. This experience directly parallels VR development workflows where 3D models move from Blender to Unity, audio files are processed and implemented, and multiple systems must integrate seamlessly. Every visualization I create reinforces the interdisciplinary thinking required to build complex interactive experiences—understanding not just individual tools, but how they communicate and work together as a cohesive system.

  • For 100 consecutive days, I practiced table tennis exclusively in virtual reality using Eleven Table Tennis on the Meta Quest 3, documenting my progress daily across YouTube Shorts, Instagram, and TikTok. I tracked my ELO ranking from day one through day 100, meticulously recording the gradual improvement in my gameplay. The results weren’t just impressive numbers on a leaderboard—when I transitioned to playing physical table tennis, the skills I developed entirely in VR transferred directly and effectively to real-world performance. I reached a professional level of table tennis competency without ever touching a physical paddle during those 100 days. This wasn’t a gaming achievement; it was empirical proof that VR simulation is a legitimate training tool.

    The skills I developed in virtual reality were surprisingly comprehensive. My reaction time, spatial awareness, and hand-eye coordination improved measurably. I learned to read spin, anticipate ball trajectory, and execute strategic shot placement—all within a virtual environment. The muscle memory built through thousands of virtual swings translated seamlessly when I picked up a real paddle. My footwork, positioning, and understanding of angles carried over because the VR simulator accurately replicated the physics, timing, and spatial relationships of actual table tennis. What made this possible was the fidelity of the simulation: accurate ball physics, realistic paddle mechanics, proper court dimensions, and, critically, the embodied nature of VR that requires real physical movement and spatial reasoning rather than abstract button presses.

    This experience fundamentally shapes my career aspirations in VR simulation development. I don’t just believe VR training works—I’ve lived it. I understand firsthand what makes a simulation effective: accurate physics, intuitive interactions, proper haptic feedback, and the importance of replicating not just visuals but the physical and cognitive demands of the real task. This is why I’m passionate about building VR simulators for businesses, particularly for training scenarios involving heavy machinery, complex equipment operation, or dangerous procedures where mistakes in the real world carry significant consequences or costs. My table tennis journey proves that VR can prepare people for real-world performance in a safe, repeatable, cost-effective environment. Whether it’s training forklift operators, teaching surgical techniques, or preparing technicians for equipment maintenance, the principles remain the same: high-fidelity simulation plus deliberate practice equals real-world skill transfer. I’m not building games—I’m building tools that genuinely develop human capability.

  • My journey into 3D animation began with a clear purpose: to understand every stage of the 3D asset pipeline that powers VR development. Using Blender, I’ve created various animation projects utilizing QuickMagic workflows and imported .fbx models, deliberately exposing myself to the full spectrum of 3D production—from asset preparation and rigging to materials, lighting, and final rendering. These projects weren’t about becoming a professional animator; they were about gaining the technical literacy required to work effectively with 3D assets in Unity and understanding the constraints, possibilities, and optimization considerations that shape VR development.

    Working with .fbx models taught me the critical importance of proper asset preparation and file format compatibility. I learned to navigate common import issues: incorrect scale, missing textures, broken material assignments, and animation clip problems that emerge when moving assets between software packages. Understanding how UV mapping works, how texture files reference materials, and how different shader models interpret surface properties has been invaluable. In Blender, I experimented extensively with the shader node system—creating materials with proper albedo, roughness, metallic, and normal maps—directly mirroring the PBR (Physically Based Rendering) workflow used in Unity. This hands-on experience means I don’t just import assets blindly; I understand what’s happening under the hood and can troubleshoot when materials don’t look right in the VR headset.
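
    In Unity, many of those import fixes can also be automated with an editor script. The following is only a sketch of the idea, not my project code; the folder convention and settings are placeholders.

      using UnityEditor;

      // Editor-only sketch: normalize common .fbx import problems automatically.
      // The folder convention and settings below are placeholders, not project code.
      public class FbxImportFixes : AssetPostprocessor
      {
          void OnPreprocessModel()
          {
              if (!assetPath.Contains("/Imported/"))   // hypothetical folder for external models
                  return;

              ModelImporter importer = (ModelImporter)assetImporter;
              importer.globalScale = 1.0f;             // correct models authored in the wrong unit
              importer.importNormals = ModelImporterNormals.Import;
              importer.materialImportMode = ModelImporterMaterialImportMode.ImportStandard;
          }
      }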

    Beyond materials, my Blender work demystified lighting and camera systems that are fundamental to creating immersive VR experiences. I learned how different light types affect performance, how to bake lighting for optimization, and how shadows impact both visual quality and frame rate—considerations that become critical in VR where maintaining consistent 90fps is non-negotiable. Working with cameras taught me about field of view, depth of field, and composition principles that inform how I design VR environments and user perspectives. Every animation I rendered reinforced lessons about polygon count, draw calls, and the performance tradeoffs between visual fidelity and real-time rendering—lessons that directly translate to building optimized VR applications. Understanding the full 3D pipeline hasn’t just made me a better developer; it’s made me a more collaborative one, able to communicate effectively with 3D artists and make informed technical decisions about asset integration.

  • My flagship VR development project is a fully functional harmonica simulator built for the Meta Quest 3, deployed as a standalone .apk application for the Android-based Quest platform. This project synthesized everything I’ve learned across music, 3D modeling, and VR development into a cohesive, interactive experience. Users can hold a virtual harmonica, play notes through blow and draw actions mapped to head movement and controller input, and experience the tactile feedback of playing a real instrument in virtual space. Building this simulator required solving complex technical challenges across multiple domains: asset pipeline integration, real-time audio synthesis, physics-based interaction design, and Android deployment—all while maintaining the performance standards critical for comfortable VR experiences.

    The development process began with creating and optimizing 3D assets in Blender, then importing them into Unity while preserving materials, textures, and proper scale. I learned to navigate Unity’s component-based architecture, attaching scripts to GameObjects, managing parent-child hierarchies for the harmonica’s individual note holes, and configuring colliders for spatial interaction. The core technical challenge was implementing MIDI-based audio playback through C# scripting specifically optimized for Android devices. I had to manage real-time MIDI note triggering based on user input, handle polyphony when multiple notes play simultaneously, and ensure audio latency remained imperceptible—critical for maintaining the illusion of playing a real instrument. Each note hole required precise box collider placement and scripting to detect controller proximity and trigger appropriate audio responses, balancing realistic spatial constraints with forgiving interaction volumes that feel natural in VR.
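
    A stripped-down sketch of that per-hole interaction is shown below. The component name, the "Controller" tag, and the use of pre-rendered AudioClips in place of the project's MIDI-driven playback are all simplifications for illustration.

      using UnityEngine;

      // Sketch of one harmonica note hole: a trigger collider detects the controller and
      // starts or stops this hole's note. Names, tags, and the use of pre-rendered
      // AudioClips (instead of MIDI-triggered playback) are simplifications.
      [RequireComponent(typeof(AudioSource))]
      [RequireComponent(typeof(BoxCollider))]
      public class NoteHole : MonoBehaviour
      {
          public AudioClip noteClip;   // sample for this hole's pitch
          AudioSource source;

          void Awake()
          {
              source = GetComponent<AudioSource>();
              source.clip = noteClip;
              source.loop = true;           // sustain the note while the controller stays inside
              source.playOnAwake = false;
              GetComponent<BoxCollider>().isTrigger = true;
          }

          void OnTriggerEnter(Collider other)
          {
              if (other.CompareTag("Controller"))   // hypothetical tag on the controller object
                  source.Play();
          }

          void OnTriggerExit(Collider other)
          {
              if (other.CompareTag("Controller"))
                  source.Stop();
          }
      }

    One practical detail the sketch glosses over: Unity only sends trigger events when at least one of the two colliders has a Rigidbody attached, which is a reminder that physics configuration matters as much as the script itself.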

    Beyond the core functionality, I implemented shader systems for visual feedback—highlighting active note holes and providing visual cues for proper hand positioning. Optimization became paramount: reducing draw calls, managing audio resource loading, and ensuring consistent frame rates since any performance dip in VR causes discomfort. Once development was complete, I learned the entire Android deployment pipeline for Quest: building the .apk through Unity, sideloading via SideQuest, testing on actual hardware, and iterating based on real-world VR usage. I then marketed the application through SideQuest’s platform, writing compelling descriptions, creating promotional materials, and gathering user feedback to inform updates. This end-to-end experience—from concept to deployed, user-tested application—demonstrated that I can take a VR project through every stage of development, not just prototype in the editor but deliver a polished, distributable product that real users can experience.
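
    The highlighting idea can be sketched roughly as follows, assuming the built-in Standard shader's emission property; the component name and color are placeholders rather than the shipped implementation.

      using UnityEngine;

      // Sketch: highlight an active note hole by driving the Standard shader's emission
      // color on this renderer's own material instance. Names and colors are placeholders.
      public class HoleHighlight : MonoBehaviour
      {
          public Color activeColor = Color.cyan;
          Material mat;

          void Awake()
          {
              // .material returns a per-renderer instance, so other holes stay unaffected.
              mat = GetComponent<Renderer>().material;
              mat.EnableKeyword("_EMISSION");       // emission must be enabled on the material
          }

          public void SetHighlighted(bool on)
          {
              mat.SetColor("_EmissionColor", on ? activeColor : Color.black);
          }
      }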

  • I’m a computer science student specializing in VR development with Unity3D, driven by a singular goal: to build immersive training simulations that bridge the gap between virtual practice and real-world competency. My focus is on developing VR solutions for businesses—particularly simulations for heavy machinery operation, technical training, safety procedures, and complex skill development where traditional training methods are costly, dangerous, or logistically impractical.

    Why VR Simulation?

    I don’t just develop VR experiences—I prove they work through personal experimentation. Over 100 consecutive days, I documented my journey learning table tennis entirely through Eleven Table Tennis VR on the Meta Quest 3. I tracked my ELO ranking from day one through day 100, recording daily practice sessions across YouTube Shorts, Instagram, and TikTok. The results were remarkable: my in-game ranking improved significantly, and when I transitioned to playing physical table tennis, the skills transferred directly to real life. I reached a professional level of competency without touching a physical paddle during those 100 days.

    This wasn’t just a gaming achievement—it was proof of concept. VR training works. Muscle memory, reaction time, spatial awareness, and strategic thinking developed in virtual reality translated seamlessly to the physical world. This firsthand experience fundamentally shapes my development philosophy: VR simulations aren’t just immersive—they’re effective training tools that can prepare people for real-world scenarios without the risks, costs, or logistical challenges of traditional methods.

    My Approach: Mastering the Full Pipeline

    What sets me apart is my commitment to understanding how different tools and systems integrate into cohesive workflows. I don’t just use software—I learn how it communicates with other tools, where compatibility issues arise, and how to troubleshoot when pipelines break. My projects span multiple domains deliberately:

    VR Development: I’ve built a VR Harmonica Simulator for Meta Quest 3, navigating the entire pipeline from Blender asset creation to Unity development to Android .apk deployment on SideQuest. I learned to manage MIDI-based C# scripting, optimize performance for mobile VR, implement physics-based interactions with precise collider placement, and handle real-time audio synthesis—all while maintaining the 90fps standard required for comfortable VR experiences.

    3D Asset Pipeline: My Blender work goes beyond creating animations—it’s about understanding how models, textures, materials, and shaders move between software ecosystems. Working with .fbx imports, PBR shader workflows, UV mapping, and lighting systems has given me the technical literacy to troubleshoot asset integration issues and communicate effectively with 3D artists. I know what happens under the hood when a model doesn’t import correctly or materials break in Unity.

    Audio Production: Through MuseScore transcription, Logic Pro DAW editing, and PianoVFX visualization, I’ve built complete music production pipelines that move seamlessly between macOS and Windows environments. I understand MIDI data structures, file format compatibility, cross-platform workflows, and how to synchronize audio with visual elements—skills that directly translate to implementing spatial audio and interactive soundscapes in VR.

    Documentation & Iteration: My multiple 100-day challenges (piano practice, table tennis training) demonstrate not just technical skills but the discipline of consistent documentation, progress tracking, and incremental improvement. These same principles drive my development process: iterative testing, user feedback integration, and systematic problem-solving.

    What I Bring to VR Simulation Development

    Employers hiring for VR simulation roles need developers who understand more than just Unity scripting. They need people who:

    • Grasp the full 3D asset pipeline and can optimize for real-time performance
    • Understand how physics, audio, and interaction design create believable experiences
    • Can troubleshoot cross-platform compatibility and deployment issues
    • Recognize what makes training simulations effective from a user experience perspective
    • Have genuine passion for VR backed by hands-on experience as both developer and user

    I’m not building games—I’m building tools that develop real human capability. Whether it’s training forklift operators, teaching equipment maintenance, or simulating dangerous procedures, I understand that effective VR simulation requires technical excellence across multiple domains and a deep appreciation for how virtual experiences translate to real-world performance.

    This portfolio showcases my technical breadth, but more importantly, it demonstrates my ability to learn new tools quickly, integrate complex systems, and solve problems systematically—exactly what’s required to build professional VR training solutions that genuinely work.

  • Hi, I’m Coco!

    First Post!