Rushbon Initiated

//Booting Pulse..._

Welcome to Rushbon — a studio from the other side of rhythm. We design real-time reactive games that treat players like input signals, not avatars. Our worlds don’t wait for you. They surge, vibrate, distort, and evolve. If you don’t react, you get absorbed.

Founded by VJ engineers, audio hackers, and speedrun philosophers, Rushbon creates games that operate as living signal processors. We don’t build stories. We simulate frequency experiences.


You don’t play our games.
You perform them.

ABOUT

//We’re Not a Studio. We’re a Pulse Generator..._


Rushbon began in an abandoned rehearsal room during a sound-reactive projection test. Three artists synced their laptops to the same beat grid and accidentally created the first prototype of our first game: Refractoid. It wasn't designed — it emerged.

We believe gameplay is a neurological instrument. That’s why we experiment with:

  • Sensor over keyboard
  • Latency as a feature
  • Color correction as difficulty
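
One way to picture “color correction as difficulty”: as difficulty climbs, the palette is pulled toward gray so color cues get harder to read. The Python sketch below is purely illustrative; the luma weights and linear blend are assumptions, not shipped tuning.

  def corrected_color(rgb, difficulty):
      """Desaturate a 0..1 RGB color as difficulty (0..1) increases."""
      r, g, b = rgb
      gray = 0.299 * r + 0.587 * g + 0.114 * b   # standard luma weights
      k = min(max(difficulty, 0.0), 1.0)          # clamp difficulty to 0..1
      # Lerp each channel toward gray: k=0 leaves the color alone, k=1 is grayscale.
      return tuple(c + (gray - c) * k for c in (r, g, b))

  # A warning-red cue at easy vs. near-maximum difficulty.
  print(corrected_color((1.0, 0.1, 0.1), difficulty=0.0))   # full color
  print(corrected_color((1.0, 0.1, 0.1), difficulty=0.9))   # almost gray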

Our games require adaptation, not memorization. They’re fast. Raw. Beautifully unstable. Welcome to the interface of chaos.

We now operate at the edge of interaction, building:

  • Games that change with BPM
  • Enemies that attack with phase shifts
  • Visuals that respond to your breath (via mic input; sketched after this list)
  • Systems that teach players rhythm not through sound, but through shape
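
Of these, the breath-reactive visuals are the simplest to sketch in isolation: read the mic buffer’s RMS level and ease a visual parameter toward it. The Python sketch below assumes a mono float buffer already captured from the mic (the capture layer is omitted); the names and smoothing constant are illustrative, not engine API.

  import numpy as np

  def breath_level(samples, floor=1e-4):
      """Map a short mic buffer (mono float, -1..1) to a 0..1 breath level."""
      rms = float(np.sqrt(np.mean(samples ** 2)))
      # Log-compress so a whisper and a shout don't span five orders of magnitude.
      level = (np.log10(max(rms, floor)) - np.log10(floor)) / -np.log10(floor)
      return float(np.clip(level, 0.0, 1.0))

  class BreathDrivenParam:
      """Smooths the raw breath level into a visual parameter (e.g. bloom)."""

      def __init__(self, smoothing=0.15):
          self.smoothing = smoothing   # 0 = frozen, 1 = no smoothing
          self.value = 0.0

      def update(self, samples):
          target = breath_level(samples)
          self.value += self.smoothing * (target - self.value)
          return self.value

  # Synthetic noise stands in for one mic callback's worth of audio.
  fake_breath = 0.05 * np.random.randn(1024).astype(np.float32)
  param = BreathDrivenParam()
  print("bloom intensity:", param.update(fake_breath))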

PROJECTS

//Artifacts of Signal Play..._

Below are three of our main waveform artifacts — playable, shapeshifting, hostile.

Design Pulse

//We Don’t Design Games. We Compose Systems..._

Traditional design asks: How does the player win? We ask: How does the game react when they don’t? At Rushbon, we treat design as choreography — a dance between visual information, muscle memory, and unpredictable rhythm.

Our process is iterative and sensory-first. We prototype with audio before code. We sketch levels using waveform editors. We build interfaces based on discomfort, not convenience.
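
As a concrete picture of sketching a level in a waveform editor, a peak in the amplitude envelope can become an obstacle and a trough a rest. The Python sketch below is a hypothetical illustration of that mapping, not pipeline tooling; the threshold and frame rate are stand-in values.

  import numpy as np

  def envelope_to_beats(envelope, frame_rate, threshold=0.6):
      """Turn an amplitude envelope (one 0..1 value per frame) into obstacle timestamps."""
      beats = []
      for i in range(1, len(envelope) - 1):
          is_peak = envelope[i] >= envelope[i - 1] and envelope[i] > envelope[i + 1]
          if is_peak and envelope[i] >= threshold:
              beats.append(i / frame_rate)   # frame index -> seconds
      return beats

  # Fake envelope: four pulses over two seconds, 50 envelope frames per second.
  t = np.linspace(0, 2, 100)
  envelope = np.clip(np.sin(2 * np.pi * 2 * t), 0, None)
  print(envelope_to_beats(envelope, frame_rate=50))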


Our core design laws:

  • Latency is part of the loop.
    We intentionally introduce micro-delays in feedback to test player adaptation curves (sketched after this list).
  • Disruption is clarity.
    We use screen glitches, false inputs, and visual decay to sharpen the senses — not to punish, but to awaken.
  • Narrative ≠ story.
    Our narratives are kinetic: they exist in timing, velocity, heatmaps — not dialogue.
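
For the first law, here is a minimal sketch of how a micro-delay can sit between an input and its on-screen echo. The frame-based delay line is an illustration of the idea, not engine code; three frames is an arbitrary example value.

  from collections import deque

  class DelayedFeedback:
      """Echoes player input back to the renderer a fixed number of frames late."""

      def __init__(self, delay_frames=3):
          # Pre-fill with "no input" so the first few frames echo silence.
          self.buffer = deque([None] * delay_frames, maxlen=delay_frames + 1)

      def tick(self, current_input):
          """Push this frame's input, return the input to display this frame."""
          self.buffer.append(current_input)
          return self.buffer.popleft()

  # A tap on frame 0 is only echoed on frame 3.
  fb = DelayedFeedback(delay_frames=3)
  for frame, pressed in enumerate(["TAP", None, None, None, None]):
      print(frame, "->", fb.tick(pressed))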

We’re not interested in polish.
We’re interested in resonance.
If the player exits the game still vibrating, then the design worked.

TEAM

//The Coders of Collapse..._

STACK

//The Stack Doesn’t Sleep..._

Visual Stack

  • Custom Shader Engine – built atop Unity URP + glitch layer
  • Touch latency sandbox tools – for testing phase delays
  • Volumetric UI renderer – 3D input maps rendered over 2D UIs
  • Realtime waveform visualizer – used as both HUD and menu background
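
The waveform visualizer above reduces to one operation: fold the current audio buffer down to however many bars the HUD can fit. A rough Python sketch of that downsampling, with the bar count and buffer contents as stand-in values:

  import numpy as np

  def waveform_to_hud_bars(samples, num_bars=32):
      """Downsample a mono audio buffer into per-bar heights in 0..1."""
      usable = len(samples) - (len(samples) % num_bars)   # drop the ragged tail
      chunks = np.abs(samples[:usable]).reshape(num_bars, -1)
      bars = chunks.max(axis=1)            # peak per slice keeps transients visible
      peak = bars.max()
      return bars / peak if peak > 0 else bars

  # 2048 samples of a decaying tone, rendered as a crude ASCII HUD strip.
  t = np.linspace(0, 1, 2048)
  buffer = np.sin(2 * np.pi * 8 * t) * np.exp(-3 * t)
  for height in waveform_to_hud_bars(buffer, num_bars=16):
      print("#" * int(height * 20))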

Input Systems

  • Mic input stream handler – frequency + amplitude analysis for gameplay
  • Swipe parser with ghost trace buffer – used in Neuropunk Arena (see the sketch after this list)
  • MIDI trigger parser – allows players to plug MIDI pads into game as controller
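
The “ghost trace buffer” behind the swipe parser is, at its simplest, a short ring buffer of recent touch samples that outlives the finger, so a gesture can still be classified a few frames after release. The Python sketch below is a hypothetical reconstruction of that idea; the class and threshold names are ours for illustration, not Neuropunk Arena’s actual code.

  import math
  from collections import deque

  class GhostTraceSwipeParser:
      """Keeps a fading trail of touch points and classifies swipes from it."""

      def __init__(self, trace_len=16, min_speed=300.0):
          self.trace = deque(maxlen=trace_len)   # (time, x, y) samples
          self.min_speed = min_speed             # px/s needed to count as a swipe

      def feed(self, t, x, y):
          """Record one touch sample; old samples fall off the end (the ghost)."""
          self.trace.append((t, x, y))

      def classify(self):
          """Return 'left', 'right', 'up' or 'down' if the trace looks like a swipe, else None."""
          if len(self.trace) < 2:
              return None
          t0, x0, y0 = self.trace[0]
          t1, x1, y1 = self.trace[-1]
          dt = max(t1 - t0, 1e-6)
          dx, dy = x1 - x0, y1 - y0
          if math.hypot(dx, dy) / dt < self.min_speed:
              return None   # too slow: a drag or a rest, not a swipe
          if abs(dx) > abs(dy):
              return "right" if dx > 0 else "left"
          return "down" if dy > 0 else "up"

  # A fast rightward drag sampled over roughly 80 ms.
  parser = GhostTraceSwipeParser()
  for i in range(9):
      parser.feed(t=i * 0.01, x=40.0 * i, y=2.0 * i)
  print(parser.classify())   # -> "right"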

Backend & Dev

  • Unity + WebGL builds with custom WebAssembly bridge
  • GLSL experiments for visuals
  • Python scripts for level reshuffling logic (sketched after this list)
  • Audio engine: FMOD + custom OpenAL hacks
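
On the Python reshuffling scripts, a minimal sketch of the shape such a script could take: reorder level segments deterministically from a track’s BPM, so the same tempo always yields the same layout. The segment names and seeding scheme are illustrative assumptions, not our actual level data.

  import random

  def reshuffle_segments(segments, bpm, keep_first=True):
      """Deterministically reorder level segments using the track BPM as the seed."""
      rng = random.Random(int(bpm * 100))        # stable seed per tempo
      head = segments[:1] if keep_first else []  # pin the intro segment in place
      body = segments[1:] if keep_first else list(segments)
      rng.shuffle(body)
      return head + body

  # Hypothetical segment list for a Refractoid-style level.
  segments = ["intro", "corridor", "pulse_gate", "strobe_field", "drop", "outro"]
  print(reshuffle_segments(segments, bpm=128.0))
  print(reshuffle_segments(segments, bpm=128.0))   # same tempo, identical order
  print(reshuffle_segments(segments, bpm=140.0))   # different tempo, different order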

Signal Bridge

//Connect the Noise..._

We don’t work for clients.
We don’t take briefs.
We jam with other reality benders.

If you’re a:

  • Sensor developer
  • Audio-reactive sculptor
  • Mathematician who dreams in sound
  • Composer tired of loops
  • Physicist who codes generative poetry

We’re listening. And resonating.

Previous bridges:

CONTACT

//Transmit Packet to Node..._

Our inbox filters for glitches. No press kits. No PR fluff. Just pure signal.
Send raw data. Or don’t. We’ll feel it anyway.
