Robotic Arm Interface
Operators needed weeks of training to control a 7-DOF robotic arm through CLI-based tooling.
Ergonomic physical controller with force feedback + React telemetry dashboard showing arm state in real time.
Training time reduced from 3 weeks to 2 days. Operator error rate dropped 74%.
Overview
Industrial robotic arms are powerful but intimidating. Most operators interact through terminal commands or teach pendants with dense button layouts. I designed a controller that feels like an extension of your hand, paired with a dashboard that shows exactly what the arm sees and plans to do.
Process
Operator Observation
Shadowed operators for 40 hours across 3 facilities. Mapped the most common tasks: pick-and-place, welding path teaching, and inspection positioning. Identified that 80% of errors came from spatial disorientation — operators couldn't intuit the arm's coordinate frame.
Controller Design
6-DOF input device with gimbal mechanism and brushless DC motors for force feedback. 3D-printed ergonomic shell tested with 12 operators across different hand sizes. Haptic detents at workspace boundaries to prevent collisions.
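The boundary detents behave like a virtual spring: zero force inside the workspace, ramping resistance within a margin of the limit. A minimal sketch of that force law (the margin, stiffness, and force cap are illustrative values, not the shipped tuning):

```python
from dataclasses import dataclass

@dataclass
class DetentConfig:
    margin: float = 0.05      # m: distance from the boundary where resistance ramps up (assumed)
    stiffness: float = 400.0  # N/m: virtual spring constant (illustrative)
    max_force: float = 8.0    # N: clamp so the feedback motors never saturate (illustrative)

def detent_force(pos: float, low: float, high: float, cfg: DetentConfig) -> float:
    """Return a restoring force pushing the handle back inside [low, high].

    Beyond the margin the force is zero; within it, a virtual spring
    ramps up resistance, producing the 'detent' feel at the boundary.
    """
    if pos < low + cfg.margin:
        penetration = (low + cfg.margin) - pos
        return min(cfg.stiffness * penetration, cfg.max_force)
    if pos > high - cfg.margin:
        penetration = pos - (high - cfg.margin)
        return -min(cfg.stiffness * penetration, cfg.max_force)
    return 0.0
```

Clamping the force keeps the cue firm without fighting the operator if they deliberately push through.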
ROS Integration
ROS 2 node translating controller input to MoveIt motion plans. Collision-aware path planning with real-time joint limit visualization. Latency budget: <20ms from input to arm movement onset.
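At the heart of the node is the mapping from raw controller axes to a Cartesian velocity command. A sketch of that mapping, with the rclpy/MoveIt plumbing omitted (the deadband and speed cap are assumed values):

```python
DEADBAND = 0.02    # normalized deflection below this is treated as hand tremor (assumed)
MAX_LINEAR = 0.25  # m/s cap on commanded end-effector speed (illustrative)

def input_to_twist(axes: list[float]) -> list[float]:
    """Map six normalized controller axes (-1..1) to a Cartesian twist.

    A deadband rejects tremor near center; a cubic response curve gives
    fine control at small deflections while preserving full speed at the
    extremes. Sign is preserved by the odd power.
    """
    twist = []
    for a in axes:
        if abs(a) < DEADBAND:
            twist.append(0.0)
        else:
            twist.append(MAX_LINEAR * a ** 3)
    return twist
```

Keeping this mapping pure and side-effect-free makes the latency budget easy to audit: the expensive steps are planning and transport, not the input transform.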
Dashboard
React dashboard with 3D arm visualization (Three.js), live camera feeds, joint torque graphs, and task scripting interface. Operators can record motions with the controller, then replay and fine-tune in the dashboard.
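Record-and-replay reduces to timestamped joint snapshots plus a time-scaled playback. A minimal sketch of that data structure (names and the speed-scaling scheme are assumptions, not the shipped implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Recording:
    """Timestamped joint snapshots captured while the operator demonstrates a motion."""
    samples: list[tuple[float, list[float]]] = field(default_factory=list)

    def add(self, t: float, joints: list[float]) -> None:
        # Copy the joint list so later mutation can't corrupt the recording.
        self.samples.append((t, list(joints)))

    def replay(self, speed: float = 1.0):
        """Yield (delay, joints) pairs; speed > 1 plays the motion back faster."""
        prev_t = None
        for t, joints in self.samples:
            delay = 0.0 if prev_t is None else (t - prev_t) / speed
            prev_t = t
            yield delay, joints
```

Storing deltas as raw timestamps (rather than pre-computed delays) keeps fine-tuning simple: the dashboard can edit, trim, or re-time individual waypoints without recomputing the rest of the take.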