Build Immersive Worlds That People Actually Want to Experience
Twelve months of hands-on work with Unity and Unreal. You'll wrestle with spatial computing, debug VR interactions until they feel natural, and ship projects that show what you can actually do.

How We Actually Teach This Stuff
Most courses throw theory at you. We start with broken scenes and ask you to fix them. Then we discuss why they broke in the first place.
Start with Real Engine Projects
Week one, you're already importing 3D assets and making them interactive. We use Unity for the first semester because its component system teaches you how modern game architecture works.
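If you haven't met the component pattern before, here's a rough, engine-agnostic C++ sketch of the idea (the class names are ours, not Unity's API): behaviour comes from which components you attach to an object, not from a deep inheritance tree.

```cpp
#include <cstdio>
#include <memory>
#include <vector>

// A game object is just a bag of components; each component adds one behaviour.
struct Component {
    virtual ~Component() = default;
    virtual void Update(float dt) = 0;
};

struct Spin : Component {           // rotates its owner every frame
    float degreesPerSecond = 90.0f;
    float angle = 0.0f;
    void Update(float dt) override { angle += degreesPerSecond * dt; }
};

struct Blink : Component {          // toggles visibility on a timer
    float period = 0.5f, elapsed = 0.0f;
    bool visible = true;
    void Update(float dt) override {
        elapsed += dt;
        if (elapsed >= period) { visible = !visible; elapsed = 0.0f; }
    }
};

struct GameObject {
    std::vector<std::unique_ptr<Component>> components;
    void Update(float dt) {
        for (auto& c : components) c->Update(dt);   // every attached behaviour ticks
    }
};

int main() {
    GameObject pickup;
    pickup.components.push_back(std::make_unique<Spin>());
    pickup.components.push_back(std::make_unique<Blink>());
    for (int frame = 0; frame < 3; ++frame) pickup.Update(1.0f / 90.0f);
    std::printf("Simulated 3 frames of a spinning, blinking pickup.\n");
    return 0;
}
```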
You'll build a simple VR room escape by month two. Nothing fancy, but it teaches collision detection, grab mechanics, and why framerate matters when someone's wearing a headset.
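The framerate point is just arithmetic, but it's worth seeing once: a headset's refresh rate is a hard per-frame budget for all of your CPU and GPU work. A quick sketch (the rates are typical headset values, not tied to any particular device):

```cpp
#include <cstdio>

// Per-frame time budget at common headset refresh rates.
// Miss the budget and the runtime drops or reprojects frames,
// which is what users feel as judder or nausea.
int main() {
    const double refresh_rates_hz[] = {72.0, 90.0, 120.0};
    for (double hz : refresh_rates_hz) {
        double budget_ms = 1000.0 / hz;  // total time for CPU + GPU work
        std::printf("%6.1f Hz -> %5.2f ms per frame\n", hz, budget_ms);
    }
    return 0;
}
```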

Then Break Everything and Rebuild It
Semester two switches to Unreal Engine. Same projects, different approach. You'll see why Blueprint visual scripting matters for rapid prototyping, and why C++ still runs the show for performance-critical code.
The transition is deliberately uncomfortable. That's when you realize you're learning patterns, not just memorizing one tool's quirks.

What You'll Actually Build
Eight projects over twelve months. Each one gets more complex. Each one goes in your portfolio.
VR Interaction Lab
Physics-based grabbing, button pressing, lever pulling. The foundational stuff that needs to feel right before anything else matters.
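As a taste of what "feel right" means in code, here's a rough, engine-agnostic C++ sketch of the core grab check: when the trigger closes, pick the nearest grabbable inside a small radius around the hand. In a real engine you'd use its physics overlap queries instead; the names and numbers here are illustrative.

```cpp
#include <cmath>
#include <cstdio>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Grabbable { const char* name; Vec3 position; };

// Returns the index of the closest grabbable within grabRadius, if any.
std::optional<size_t> FindGrabTarget(const Vec3& handPos,
                                     const std::vector<Grabbable>& objects,
                                     float grabRadius) {
    std::optional<size_t> best;
    float bestDist = grabRadius;
    for (size_t i = 0; i < objects.size(); ++i) {
        float d = Distance(handPos, objects[i].position);
        if (d <= bestDist) { bestDist = d; best = i; }
    }
    return best;
}

int main() {
    std::vector<Grabbable> scene = {
        {"key",    {0.10f, 1.00f, 0.30f}},
        {"bottle", {0.60f, 0.90f, 0.20f}},
    };
    Vec3 hand = {0.12f, 1.02f, 0.28f};
    if (auto hit = FindGrabTarget(hand, scene, 0.15f)) {
        std::printf("Grabbed: %s\n", scene[*hit].name);
    } else {
        std::printf("Nothing in reach\n");
    }
    return 0;
}
```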
Spatial Audio System
Position-based sound that actually helps users navigate your world. Most teams get this wrong and wonder why their VR feels disorienting.
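Two ingredients do most of the work: distance attenuation and panning toward the nearer ear. Here's a minimal, engine-agnostic sketch of both; engines layer HRTFs, occlusion, and reverb on top, and the formulas and values below are illustrative.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Length(Vec3 v) { return std::sqrt(Dot(v, v)); }

// Gain falls off with distance, clamped so nearby sources don't blow out.
float DistanceGain(float distance, float refDistance = 1.0f) {
    return refDistance / std::max(distance, refDistance);
}

// Pan in [-1, 1]: -1 = fully left, +1 = fully right,
// based on where the source sits relative to the listener's right axis.
float Pan(Vec3 listenerPos, Vec3 listenerRight, Vec3 sourcePos) {
    Vec3 toSource = Sub(sourcePos, listenerPos);
    float len = Length(toSource);
    if (len < 1e-5f) return 0.0f;            // source is on top of the listener
    return Dot(listenerRight, {toSource.x / len, toSource.y / len, toSource.z / len});
}

int main() {
    Vec3 listener = {0, 1.7f, 0};
    Vec3 right    = {1, 0, 0};
    Vec3 door     = {3, 1.0f, 4};            // a creaking door off to the right
    float dist = Length(Sub(door, listener));
    std::printf("distance %.2f m, gain %.2f, pan %+.2f\n",
                dist, DistanceGain(dist), Pan(listener, right, door));
    return 0;
}
```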
Locomotion Testing
Teleportation, smooth movement, room-scale boundaries. You'll implement five different methods and test which ones cause the least motion sickness.
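Teleportation, for example, is mostly projectile math: sample an arc from the controller and land where it crosses the floor. A rough sketch of that idea, with made-up numbers:

```cpp
#include <cstdio>
#include <optional>

struct Vec3 { float x, y, z; };

// Step a virtual projectile forward under gravity and return the first point
// where it drops to floor height. Engines visualise the same samples as the arc.
std::optional<Vec3> TeleportTarget(Vec3 origin, Vec3 direction, float speed,
                                   float floorY = 0.0f, float gravity = 9.81f,
                                   float dt = 0.02f, int maxSteps = 200) {
    Vec3 pos = origin;
    Vec3 vel = {direction.x * speed, direction.y * speed, direction.z * speed};
    for (int i = 0; i < maxSteps; ++i) {
        vel.y -= gravity * dt;                       // gravity pulls the arc down
        Vec3 next = {pos.x + vel.x * dt, pos.y + vel.y * dt, pos.z + vel.z * dt};
        if (next.y <= floorY && pos.y > floorY) {    // crossed the floor this step
            float t = (pos.y - floorY) / (pos.y - next.y);  // interpolate the hit
            return Vec3{pos.x + (next.x - pos.x) * t, floorY,
                        pos.z + (next.z - pos.z) * t};
        }
        pos = next;
    }
    return std::nullopt;                             // arc never reached the floor
}

int main() {
    // Controller at shoulder height, aimed slightly upward and forward.
    auto hit = TeleportTarget({0, 1.4f, 0}, {0, 0.3f, 0.95f}, 8.0f);
    if (hit) std::printf("Teleport to (%.2f, %.2f, %.2f)\n", hit->x, hit->y, hit->z);
    else     std::printf("No valid teleport target\n");
    return 0;
}
```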
Multiplayer Sync
Getting two people in the same virtual space without lag ruining the experience. Network replication is harder than it looks.
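One technique you'll lean on is snapshot interpolation: render the other player slightly in the past and blend between the two most recent updates, instead of snapping to every packet as it arrives. A rough sketch of the idea; the timing values are illustrative.

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Snapshot { double time; Vec3 position; };  // one received network update

Vec3 Lerp(Vec3 a, Vec3 b, float t) {
    return {a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t};
}

// Where to draw the remote avatar at renderTime, given snapshots sorted by time.
// Falls back to the newest snapshot if we've run out of data.
Vec3 InterpolatedPosition(const std::vector<Snapshot>& buffer, double renderTime) {
    if (buffer.empty()) return {0.0f, 0.0f, 0.0f};
    for (size_t i = 1; i < buffer.size(); ++i) {
        if (buffer[i].time >= renderTime) {
            const Snapshot& a = buffer[i - 1];
            const Snapshot& b = buffer[i];
            float t = static_cast<float>((renderTime - a.time) / (b.time - a.time));
            return Lerp(a.position, b.position, t);
        }
    }
    return buffer.back().position;
}

int main() {
    // Updates arrive every 50 ms; we render 100 ms behind the newest one.
    std::vector<Snapshot> buffer = {
        {0.00, {0.0f, 0.0f, 0.0f}},
        {0.05, {0.5f, 0.0f, 0.0f}},
        {0.10, {1.0f, 0.0f, 0.1f}},
        {0.15, {1.5f, 0.0f, 0.3f}},
    };
    double renderTime = 0.15 - 0.10;  // newest snapshot time minus interpolation delay
    Vec3 p = InterpolatedPosition(buffer, renderTime);
    std::printf("Draw remote player at (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    return 0;
}
```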
Performance Profiling
Your scene runs at 20 fps and users feel sick. Now what? Draw calls, batching, LOD systems. The unglamorous stuff that ships products.
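Distance-based LOD selection is a typical piece of that unglamorous work: far-away objects get cheaper meshes so the GPU budget goes where the user is actually looking. A rough sketch with made-up thresholds:

```cpp
#include <cstdio>
#include <vector>

struct LodLevel { float maxDistance; int triangleCount; };

// Pick the first LOD whose distance band covers the object; past the last
// band the object isn't drawn at all (return -1).
int SelectLod(const std::vector<LodLevel>& lods, float distanceToCamera) {
    for (size_t i = 0; i < lods.size(); ++i) {
        if (distanceToCamera <= lods[i].maxDistance) return static_cast<int>(i);
    }
    return -1;
}

int main() {
    std::vector<LodLevel> machineLods = {
        {5.0f,  40000},   // LOD0: full detail, only when the user is close
        {15.0f, 8000},    // LOD1: mid detail
        {40.0f, 1500},    // LOD2: silhouette only
    };
    const float distances[] = {2.0f, 10.0f, 30.0f, 80.0f};
    for (float d : distances) {
        int lod = SelectLod(machineLods, d);
        if (lod >= 0)
            std::printf("%5.1f m -> LOD%d (%d tris)\n", d, lod, machineLods[lod].triangleCount);
        else
            std::printf("%5.1f m -> culled\n", d);
    }
    return 0;
}
```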
AR World Anchoring
Making virtual objects stick to real surfaces. Plane detection, spatial mapping, and why lighting matters more in AR than VR.
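The core habit is storing virtual objects relative to a tracked anchor instead of in raw world coordinates, so they follow the tracker's refined estimate of the surface. A simplified sketch of that idea; real anchors carry rotation as well as position, and every value here is illustrative.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Anchor { Vec3 worldPosition; };   // pose kept up to date by the AR tracker

struct AnchoredObject {
    Vec3 offsetFromAnchor;               // fixed at placement time
    Vec3 WorldPosition(const Anchor& a) const {
        return {a.worldPosition.x + offsetFromAnchor.x,
                a.worldPosition.y + offsetFromAnchor.y,
                a.worldPosition.z + offsetFromAnchor.z};
    }
};

int main() {
    Anchor table = {{0.00f, 0.75f, 2.00f}};        // initial estimate of the table plane
    AnchoredObject mug = {{0.10f, 0.00f, -0.05f}}; // virtual mug placed relative to it

    Vec3 p = mug.WorldPosition(table);
    std::printf("Before refinement: (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);

    // Tracker refines its estimate of where the table actually is...
    table.worldPosition = {0.03f, 0.73f, 1.96f};

    // ...and the mug follows automatically because it's stored anchor-relative.
    p = mug.WorldPosition(table);
    std::printf("After refinement:  (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    return 0;
}
```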
Your Typical Week Here
Three evening sessions plus weekend lab access. We built the schedule for people who work during the day.
Monday
New concept introduction. Short lecture, then we break it by trying to apply it wrong.
Wednesday
Lab work on current project. Instructors circulate. Most learning happens when someone hits a weird bug.
Friday
Show your work. Even broken work. Especially broken work. That's where good questions come from.
Saturday
Open lab. Work on your project, experiment with ideas, or just test other people's builds and give feedback.

Lachlan Merrick
Lead Instructor
I spent six years building VR training simulations for manufacturing companies. Mostly boring stuff like forklift operation and safety protocols. But that work taught me what actually matters when someone puts on a headset.
Most of my job was fixing performance problems and making interactions intuitive enough that workers didn't need a manual. Those constraints made me a better developer.
I started teaching because I kept meeting talented programmers who wanted to work in XR but didn't know where to start. Turns out the gap between traditional game dev and spatial computing is wider than most tutorials admit.
Program Starts September 2025
Applications open June 1st. We're capping enrollment at fifteen students so everyone gets lab time with actual hardware.
You'll need basic programming knowledge. If you've built something in any C-family language, you'll manage.