in active development
The workspace for parallel AI.
Run 10 agents at once. See which ones need you. Navigate with your hands. Control from your phone. gmux is the ambient layer your AI development is missing.
⚠ Write to auth.py? [y] once [a] always [n] deny ▌
Refactoring route handlers ▌ ✓ routes/auth.ts ✓ routes/users.ts › routes/api.ts
📷 · !1 · ◉2 · ●3 · ◉4 · ^!5 · tmux · live
gesture navigation ·
voice routing ·
phone control ·
real-time SSE state ·
tmux + opencode ·
sub-agent detection ·
session restore ·
MediaPipe hands ·
what's included
Everything your workflow has been missing
gmux wraps tmux. No tool replacement — just the visibility and navigation layer that makes parallel AI development possible.
🔍 Live state detection Subscribes to every opencode SSE stream. Knows the exact moment an agent switches state — not by polling, by listening.
✋ Gesture navigation MediaPipe hand tracking. Right hand navigates windows. Left hand commands. Swipe, point, three-finger agent jump.
🎤 Local voice routing faster-whisper STT on-device. Nav to tmux. AI queries to the focused pane. 400ms. No cloud. No keys.
📱 Phone as remote Mobile PWA, zero install. Agent carousel. Volume-key cycling. Push-to-talk to the focused agent.
🖥 Tauri native UI Real xterm.js terminal via PTY, live sidebar with todo counts, one-click permission approval, gesture calibration.
♻️ Session persistence Window names saved every 30s. Agents relaunch on resurrect. Custom names survive kills and reboots.
$ gmux status
✓ monitor.py running · SSE connected on 9 panes
✓ session_restore daemon · saving names every 30s
✓ cam-broker /dev/video2 · 30fps
· voice bridge offline · start with voice-toggle.sh
$
state system
Every agent, every state, at a glance
Permission beats waiting, waiting beats working. Each window badge shows the state of its most urgent pane.
!
permission
Main agent needs tool approval now
^!
sub-agent perm
Task-tool child needs approval (one level down)
●
waiting
Agent ready for your next input
◉
working
Running tools, streaming response
◆
done
Just completed a task
✗
error
Broke — needs your attention
○
idle
Plain shell, no AI running
gesture control
Two hands. Two roles.
Right hand navigates. Left hand commands. The camera broker keeps gesture tracking off your browser's camera.
◉ right hand — navigation
swipe right → next tmux window
swipe left ← previous tmux window
swipe up ↑ scroll up
swipe down ↓ scroll down
pinch select / configurable
◉ left hand — commands
point ☝ toggle voice listen
three fingers jump to next waiting agent
thumbs up 👍 accept / approve
thumbs down 👎 reject / deny
peace ✌ always-listen mode
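The two tables above amount to a routing layer from recognized gestures to tmux actions. A minimal sketch, assuming a subprocess-based dispatch (the real broker's mechanism is not shown on this page) and covering only a few of the listed gestures:

```python
import subprocess

# Hypothetical routing table built from the gesture lists above.
ACTIONS = {
    ("right", "swipe_right"): ["tmux", "next-window"],
    ("right", "swipe_left"):  ["tmux", "previous-window"],
    ("left",  "thumbs_up"):   ["tmux", "send-keys", "y"],  # approve prompt
    ("left",  "thumbs_down"): ["tmux", "send-keys", "n"],  # deny prompt
}

def route(hand: str, gesture: str, dry_run: bool = True):
    """Return the tmux command for a gesture; run it unless dry_run."""
    cmd = ACTIONS.get((hand, gesture))
    if cmd and not dry_run:
        subprocess.run(cmd, check=False)
    return cmd

print(route("right", "swipe_right"))  # ['tmux', 'next-window']
```

Keeping the two hands in separate namespaces means a navigation swipe can never be misread as an approval.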
early access
Be first to know
Terminal stack is working. Tauri UI is being built. Leave your email — we'll reach out when it's ready.
Notify me
✓ you're on the list