This project is my senior capstone at Cal Poly SLO, focused on developing a bipedal humanoid robot designed as an intelligent companion for space applications. The system combines real-time locomotion, embedded AI perception, and human–robot interaction.
The goal is to demonstrate:
Stable walking and balance in a humanoid form factor.
Real-time facial expression recognition for detecting human emotional states.
Task execution and command following (e.g., object handling, person-following).
Interactive “space companion” behavior — assisting with monitoring, reminders, and interaction in environments where human–robot teamwork is critical (e.g., spacecraft, research stations).
This project integrates embedded systems, control theory, AI inference, and robotic design, representing a cross-disciplinary effort at the intersection of electrical engineering and space robotics.
Processor / Compute: NVIDIA Jetson Orin Nano (AI inference + high-level control)
Microcontroller: STM32 (low-level motor control & sensor interfacing)
Actuators: High-torque servo motors for hip, knee, and ankle joints
Sensors:
9-DOF IMU (accelerometer, gyroscope, magnetometer)
Force/pressure sensors for ground contact detection
Control System:
Low-level PID loops for joint actuation (a control-loop sketch follows this spec list)
Sensor fusion via extended Kalman filter for orientation & balance
AI-based gait generation and stabilization policies (trained offline, deployed onboard)
Power: Li-Po battery system with regulated 5V/12V rails
Dimensions: Approx. 0.6 m tall, 4–5 kg
Interfaces: UART, I²C, SPI, USB, Wi-Fi for debugging/telemetry
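To make the low-level control concrete, here is a minimal sketch of a single-joint PID loop. It is written in Python purely for readability (the firmware itself is Embedded C on the STM32, per the software stack below), and the gains, joint, and loop rate are placeholder values rather than tuned numbers.

```python
# Illustrative PID loop for a single joint (placeholder gains, not tuned values).
# The real implementation runs as Embedded C on the STM32 at a fixed control rate.

class JointPID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_angle, measured_angle):
        """Return a servo command from the angle error."""
        error = target_angle - measured_angle
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: one knee joint running at 200 Hz (placeholder rate and setpoint).
knee = JointPID(kp=2.0, ki=0.1, kd=0.05, dt=0.005)
command = knee.update(target_angle=0.35, measured_angle=0.30)
```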
Mechanical Design: CAD-modeled humanoid frame, balancing strength and weight efficiency.
Electronics: Distributed system in which the STM32 handles deterministic, real-time actuation while the Jetson performs AI perception and planning (a sketch of one possible serial link follows this section).
Software Stack:
Embedded C for motor drivers and control loops
Python/C++ for AI models (facial recognition, task planning)
ROS2 middleware for modular communication (a minimal node sketch follows this list)
Architecture Philosophy: Real-time stability on microcontroller, adaptive intelligence and interaction on Jetson.
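As a sketch of how the ROS2 layer might look on the Jetson side, the following minimal rclpy node publishes joint targets for a downstream STM32 bridge. The topic name, message type, rate, and placeholder angles are assumptions for illustration, not the project's actual interface definitions.

```python
# Minimal ROS2 node sketch (Python/rclpy). Topic names, rates, and angle values
# are illustrative assumptions, not the project's actual interface definitions.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Float32MultiArray


class GaitCommandPublisher(Node):
    def __init__(self):
        super().__init__('gait_command_publisher')
        # Hypothetical topic carrying one target angle per leg joint.
        self.pub = self.create_publisher(Float32MultiArray, 'joint_targets', 10)
        self.timer = self.create_timer(0.02, self.publish_targets)  # 50 Hz

    def publish_targets(self):
        msg = Float32MultiArray()
        msg.data = [0.0, 0.35, -0.2, 0.0, 0.35, -0.2]  # hip, knee, ankle x2 (placeholders)
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = GaitCommandPublisher()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```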
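The Jetson-to-STM32 link itself could be framed over the UART interface listed in the specifications; the sketch below shows one hypothetical packet layout (start byte, joint count, fixed-point angles, additive checksum). This framing is an assumption for illustration, not the protocol actually running on the robot.

```python
# Hypothetical UART frame for sending joint targets from the Jetson to the STM32.
# Assumed layout: start byte | count | int16 angles (centidegrees) | checksum.
import struct

START_BYTE = 0xAA


def encode_joint_frame(angles_deg):
    """Pack joint angles into a framed byte string with a simple additive checksum."""
    payload = struct.pack('<B', len(angles_deg))
    payload += b''.join(struct.pack('<h', int(a * 100)) for a in angles_deg)
    checksum = sum(payload) & 0xFF
    return bytes([START_BYTE]) + payload + bytes([checksum])


def decode_joint_frame(frame):
    """Inverse of encode_joint_frame; raises ValueError on a corrupt frame."""
    if frame[0] != START_BYTE or (sum(frame[1:-1]) & 0xFF) != frame[-1]:
        raise ValueError('corrupt frame')
    count = frame[1]
    return [struct.unpack_from('<h', frame, 2 + 2 * i)[0] / 100.0 for i in range(count)]


# Round-trip example with placeholder hip/knee/ankle targets.
frame = encode_joint_frame([0.0, 35.0, -20.0])
assert decode_joint_frame(frame) == [0.0, 35.0, -20.0]
```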
Results (Expected)
Locomotion & Balance: Stable walking gait with real-time recovery from small disturbances.
Facial Expression Recognition: The onboard Jetson runs deep learning models in real time to interpret human emotional states (e.g., happy, sad, neutral); a sketch of the inference loop follows this list.
Task Handling: Executes predefined tasks (e.g., object pickup with gripper, environmental monitoring, or simple assistance routines).
Interactive Companion Behavior: Responds to verbal or gesture-based commands, adjusting behavior based on context.
Space Application Tie-In: Designed as a prototype for a “space companion” concept — capable of monitoring crew health cues (via expressions), providing reminders or alerts, and assisting with routine tasks in constrained environments like spacecraft.
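The following is a minimal sketch of what the onboard expression-recognition loop could look like, assuming an OpenCV face detector feeding an ONNX classifier exported from offline training. The model filename, input size, and label order are placeholders, not the project's actual model.

```python
# Sketch of an onboard expression-recognition loop (assumed pipeline: OpenCV face
# detection + an ONNX classifier). Model file, input size, and labels are placeholders.
import cv2
import numpy as np
import onnxruntime as ort

LABELS = ['happy', 'sad', 'neutral']  # placeholder label order
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
session = ort.InferenceSession('expression_classifier.onnx')  # hypothetical model file
input_name = session.get_inputs()[0].name


def classify_expressions(frame_bgr):
    """Detect faces in a camera frame and return one predicted label per face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype(np.float32) / 255.0
        logits = session.run(None, {input_name: face[None, None, :, :]})[0]
        results.append(LABELS[int(np.argmax(logits))])
    return results
```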
Lessons Learned (Ongoing)
Human-Robot Interaction: Training reliable facial expression recognition under varying lighting conditions and with limited computing resources is challenging.
AI + Embedded Integration: Balancing GPU-intensive inference on Jetson with real-time actuation on STM32 requires careful task scheduling and efficient communication.
Task Autonomy: Mapping high-level commands ("follow me", "pick this up") to low-level motor actuation requires layered control and state machines; a minimal sketch follows this list.
Space Context Constraints: Designing for potential use in spacecraft highlights the importance of compact design, energy efficiency, and robustness to disturbances.
Iterative Testing: Incremental validation (sensors → actuation → perception → interaction) is critical to avoid compounding errors during integration.
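As a minimal illustration of the layered-control point above, the sketch below shows a small command state machine. The states, command strings, and hand-offs are assumptions for illustration, not the robot's actual behavior set.

```python
# Minimal command-to-behavior state machine sketch. States, command strings, and
# the behaviors they dispatch to are illustrative assumptions.
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    FOLLOWING = auto()
    PICKING_UP = auto()


class CommandStateMachine:
    def __init__(self):
        self.state = State.IDLE

    def handle(self, command):
        """Map a recognized high-level command to the next behavior state."""
        if command == 'follow me':
            self.state = State.FOLLOWING   # hands off to a person-tracking gait
        elif command == 'pick this up':
            self.state = State.PICKING_UP  # hands off to a gripper routine
        elif command == 'stop':
            self.state = State.IDLE
        return self.state


# Example: a verbal command recognized on the Jetson drives the transition.
fsm = CommandStateMachine()
print(fsm.handle('follow me'))  # State.FOLLOWING
```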
Full Senior Project Report