Project Deep Dive

GestureKit

GestureKit is a local Node.js application that listens to keyboard input and detects gesture patterns — single taps, double taps, long presses, and combinations — then executes configurable macro sequences with human-like timing randomization. Built for SWTOR but applicable to any keyboard-driven workflow.

Node.js · TypeScript · RobotJS · Vitest

The Problem

PC games and productivity apps often require complex, multi-step key sequences executed with precise timing. Doing this manually is slow, inconsistent, and exhausting over a long session. Existing macro tools either lack the nuance needed for gesture-based triggering or are too simplistic for per-key independent detection.

What GestureKit Does

GestureKit intercepts keyboard input globally (even when another window is focused), classifies each key's tap pattern into one of 12 gesture types, and executes a bound macro sequence in response. Each key operates independently, so you can have simultaneous gestures on different keys without interference.

Gesture Types

  • Single / Double / Triple / Quadruple — based on tap count
  • Short / Long / Super Long — based on how long the final tap is held
  • Combined: 4 counts × 3 durations = 12 gesture types per key
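The 4 × 3 matrix can be sketched as a small TypeScript enumeration. The identifier names and the `count-duration` naming scheme below are illustrative assumptions, not GestureKit's actual API:

```typescript
// The four tap-count classes and three hold-duration classes from the text.
const COUNTS = ["single", "double", "triple", "quadruple"] as const;
const DURATIONS = ["short", "long", "superLong"] as const;

type TapCount = (typeof COUNTS)[number];
type HoldDuration = (typeof DURATIONS)[number];
// Template-literal type: every valid combination, checked at compile time.
type Gesture = `${TapCount}-${HoldDuration}`;

// Enumerate all 4 × 3 = 12 gesture identifiers for one key.
const gestures: Gesture[] = COUNTS.flatMap((c) =>
  DURATIONS.map((d) => `${c}-${d}` as Gesture)
);

console.log(gestures.length); // 12
```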
Key Features

  • 22 input keys monitored simultaneously (WASD, 1-6, mouse buttons, and more)
  • Per-key isolation — each key's gesture detection runs independently
  • Human-like timing — randomized delays (configurable min/max) between each keypress in a sequence
  • Multiple backends — RobotJS (default), Interception Driver (kernel-level), or Mock (testing)
  • JSON profiles — define all macros in a simple config file, no code changes needed
How It Works

The input listener hooks into the OS-level keyboard event stream. When a key event arrives, it's dispatched to that key's dedicated GestureDetector instance. The detector tracks tap timing against configurable thresholds (long press: 80–145ms, super long: 146–265ms) and resolves the gesture type once the key sequence completes.
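Bucketing a hold duration against those thresholds might look like the sketch below. The function name and the handling of holds longer than 265ms are assumptions; only the threshold values come from the text:

```typescript
type HoldClass = "short" | "long" | "superLong" | "overlong";

// Classify the final tap's hold time using the thresholds stated above.
// Below 80ms → short; 80–145ms → long; 146–265ms → superLong.
// Treating anything longer as "overlong" is an illustrative choice,
// not necessarily what GestureKit does.
function classifyHold(holdMs: number): HoldClass {
  if (holdMs < 80) return "short";
  if (holdMs <= 145) return "long";
  if (holdMs <= 265) return "superLong";
  return "overlong";
}

console.log(classifyHold(60)); // "short"
console.log(classifyHold(100)); // "long"
console.log(classifyHold(200)); // "superLong"
```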

Once a gesture is resolved, GestureDetector emits the event and the executor fires the bound macro sequence — pressing each key in order with randomized delay between presses.
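A minimal sketch of that execution loop, assuming a uniform random delay within each step's min/max bounds. The `pressKey` callback stands in for whichever backend is active (RobotJS, Interception, or Mock); the names here are illustrative:

```typescript
interface Step {
  key: string;
  minDelay: number; // lower bound, ms
  maxDelay: number; // upper bound, ms
}

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Uniform random delay in [min, max] to mimic human timing.
const randomDelay = (min: number, max: number) =>
  min + Math.random() * (max - min);

// Press each key in order, pausing a randomized interval between presses.
async function runSequence(
  steps: Step[],
  pressKey: (key: string) => void
): Promise<void> {
  for (const step of steps) {
    pressKey(step.key);
    await sleep(randomDelay(step.minDelay, step.maxDelay));
  }
}
```

Injecting `pressKey` rather than hard-coding a backend keeps the loop testable — a test can pass a recorder function instead of a real key synthesizer.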

Profile Format

```json
{
  "trigger": { "key": "1", "gesture": "double" },
  "sequence": [
    { "key": "a", "minDelay": 25, "maxDelay": 30, "echoHits": 1 },
    { "key": "b", "minDelay": 30, "maxDelay": 40, "echoHits": 2 }
  ]
}
```
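The same shape expressed as TypeScript interfaces, which is one way a profile loader could validate the JSON. The interface names are assumptions; the field names mirror the example above, and the meaning of `echoHits` is not spelled out in this document:

```typescript
interface Trigger {
  key: string;
  gesture: string; // e.g. "double"
}

interface SequenceStep {
  key: string;
  minDelay: number; // lower bound for the randomized delay, ms
  maxDelay: number; // upper bound, ms
  echoHits: number; // present in the example; semantics not specified here
}

interface MacroProfile {
  trigger: Trigger;
  sequence: SequenceStep[];
}

// The JSON example above, typed.
const example: MacroProfile = {
  trigger: { key: "1", gesture: "double" },
  sequence: [
    { key: "a", minDelay: 25, maxDelay: 30, echoHits: 1 },
    { key: "b", minDelay: 30, maxDelay: 40, echoHits: 2 },
  ],
};
```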

Use Case

Originally built for SWTOR to handle ability rotations without requiring superhuman button timing, GestureKit is useful any time you want to bind complex sequences to simple gesture inputs — games, DAWs, video editing, or accessibility tools.

Engineering Highlights

The trickiest design challenge was per-key isolation with concurrent execution. If keys A and B are both mid-gesture simultaneously, they need completely independent state machines. I solved this with a Map of per-key detector instances that never share state, plus a traffic controller to prevent modifier key collisions when two macros run simultaneously.
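The Map-of-detectors pattern can be sketched as follows. GestureDetector's real interface isn't shown in this document, so the stub class and method names here are illustrative; the point is that each key lazily gets its own instance and no state is shared:

```typescript
// Illustrative stand-in for the real GestureDetector: each instance
// tracks state for exactly one key.
class GestureDetector {
  private taps = 0;
  constructor(readonly key: string) {}
  onKeyDown(): void {
    this.taps++;
  }
  get tapCount(): number {
    return this.taps;
  }
}

const detectors = new Map<string, GestureDetector>();

// Lazily create one detector per key; events for key A can never
// touch key B's state machine.
function detectorFor(key: string): GestureDetector {
  let d = detectors.get(key);
  if (!d) {
    d = new GestureDetector(key);
    detectors.set(key, d);
  }
  return d;
}

detectorFor("a").onKeyDown();
detectorFor("a").onKeyDown();
detectorFor("b").onKeyDown();
console.log(detectorFor("a").tapCount); // 2 (key "b" did not interfere)
```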