Sesame Quadruped
1,619 stars on GitHub. Sesame is an affordable, open-source mini quadruped robot powered by an ESP32 microcontroller. Designed by Dorian Borian, Sesame uses 8 MG90S metal-gear servos (two per leg) for 8-DOF locomotion and features a 128×64 OLED display that serves as an expressive robot face.

Source: https://github.com/dorianborian/sesame-robot

All mechanical parts are fully 3D-printable on a standard FDM printer. The hardware folder contains both parametric STEP and Fusion 360 source models alongside the STL files, allowing full customization. The frame, internal structure, covers, and leg segments are all available as individual STLs.

The ESP32 firmware handles inverse kinematics, face animations on the OLED display, and a WiFi-based control interface accessible from any browser. A desktop companion app (Sesame Studio) is included for easy gait configuration and pose tuning without writing code.

Community documentation covers full assembly with detailed wiring diagrams and a comprehensive BOM. Hat variants (enclosed, open, cat ears) are available for personality customization, and Sesame has become a go-to beginner quadruped platform.

License: Apache 2.0.
Kame32
Kame32 is an open-source quadruped walking robot by JavierIH. Built around an ESP32 and 8 servo motors, it walks, runs, dances, and performs a rich library of quadruped gaits — all controlled wirelessly via a built-in web-based gamepad over Wi-Fi. All structural parts are 3D-printable. A custom PCB (KiCad gerbers included) centralizes servo wiring. Choose MG90S (higher torque) or SG90 servos — brackets for both variants are included.

Specifications

| Property | Value |
|----------|-------|
| Motors | 8 servos (MG90S or SG90) |
| DOF | 8 (2 per leg) |
| Controller | ESP32 Dev Kit |
| Control | Web gamepad via Wi-Fi |
| PCB | Custom KiCad design (gerbers included) |
| CAD | FreeCAD source file |
| Build time | 1–2 weekends |
| Skill level | Intermediate |

Gaits

Walk · Backward · Run · Omni Walk · Turn Left · Turn Right · Moonwalk · Dance · Up/Down · Push Up · Hello · Jump · Home

All gaits use the Octosnake oscillator library — sinusoidal servo control with per-axis phase offsets.

Hardware

The ESP32 drives 8 PWM servos at 50 Hz / 16-bit resolution on GPIO 5, 18, 19, 21, 25, 26, 32, 33. Per-servo calibration offsets are stored in ESP32 NVS.

Firmware

PlatformIO / Arduino. Build environments: calibration (tune offsets) and gamepad (web UI controller).

Attribution

Creator: JavierIH
Source: github.com/javierih/kame32
License: CC BY-SA 4.0 (hardware) · GPL-3.0 (code)
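Octosnake's core idea is that each servo tracks its own sine wave, and gaits emerge from the amplitudes, offsets, and phase differences between legs. A minimal Python sketch of that idea (the amplitudes, period, and anti-phase walk pattern below are illustrative placeholders, not Kame32's actual tuning):

```python
import math

def oscillator(t, amplitude, period, phase_deg, offset=0.0):
    """Octosnake-style servo target: A * sin(2*pi*t/T + phase) + offset (degrees)."""
    return amplitude * math.sin(2 * math.pi * t / period + math.radians(phase_deg)) + offset

def walk_frame(t, period=1.0):
    """Toy 8-servo walk frame: hips alternate in anti-phase, feet lead their hip by 90 deg."""
    hips = [oscillator(t, 20, period, 0 if i % 2 == 0 else 180, offset=90) for i in range(4)]
    feet = [oscillator(t, 15, period, (0 if i % 2 == 0 else 180) + 90, offset=90) for i in range(4)]
    return hips + feet  # 8 servo angles around the 90-degree neutral
```

Shifting a foot's phase 90° ahead of its hip lifts the foot just before the hip swings, which is the basic trick behind all of the listed gaits.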
Yertle Quadruped
Yertle — A 3D Printed Quadrupedal Robot for Locomotion Research

Yertle is a 12-DOF quadruped robot designed for locomotion research. It fuses the leg geometry of the Kangal quadruped with the body geometry of SpotMicro, making most parts cross-compatible with the SpotMicro ecosystem.

Source: https://github.com/Jerome-Graves/yertle
Creator: Jerome Alexander Graves
License: MIT
Status: Work-in-progress (functional; ROS2 integration pending)

This Program is a learning entry point. The original firmware is C++ on an ESP32 with a Python GUI master controller — the orobot Program here exposes a stubbed JavaScript interface so you can explore the gait/command surface in-browser. To run on real hardware, follow the upstream build and flash instructions.

Mechanical

4 legs × 3 DOF (hip yaw + hip pitch + knee) = 12 servos total
Leg extension: ~20 cm
Mass: ~1.8 kg
Frame: PLA or ABS, printable on a 150 × 150 mm bed (Ender 3 Pro tested)
Print time: ~2 weeks (5–10 h/day on a single Ender 3 Pro)

Electronics

| Role | Component |
|---|---|
| Microcontroller (servo/sensor master) | ESP32 |
| Optional onboard SBC (vision/ROS) | Raspberry Pi 4B |
| Servo driver | PCA9685 (16-channel PWM) |
| Servos × 12 | SPT5435LV-180W, 35 kg·cm waterproof digital (≥ 15 kg·cm required) |
| IMU (optional) | MPU9250 9-axis |
| Battery | 7.4 V 2S 5000 mAh LiPo (~30 min runtime) |
| Power | SBEC 6 V 20 A |

The robot can run headless from the ESP32 alone (UDP-over-WiFi to a Python GUI on a laptop/phone) or with a Pi 4 onboard for vision and ROS.

Software architecture

Master/slave over serial or UDP/WiFi:
Slave (ESP32, C++/Arduino): servo control, sensor reads, inverse kinematics, safety limits.
Master (Python 3 GUI): gait generation, sensor fusion, ROS2 (todo). Runs on anything with WiFi + screen + Python 3.
Simulation: Python-based built-in simulator; URDF available for Gazebo/Unity.

Build cost

~£250 total (servos dominate at ~£162). See the full BOM in the upstream Design/README.
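The master/slave split means driving Yertle amounts to sending small command packets from the Python master to the ESP32 slave. A sketch of that pattern (the JSON framing and field names here are hypothetical; Yertle's real UDP protocol is defined in the upstream firmware):

```python
import json
import socket

def encode_command(gait, vx, vy, yaw_rate):
    """Pack a command as JSON bytes. Field names are assumptions, not Yertle's wire format."""
    return json.dumps({"gait": gait, "vx": vx, "vy": vy, "wz": yaw_rate}).encode()

def send_command(addr, gait, vx, vy, yaw_rate):
    """Fire-and-forget UDP datagram to the ESP32 slave, e.g. addr=("192.168.4.1", 5000)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(encode_command(gait, vx, vy, yaw_rate), addr)
```

UDP is a natural fit for this loop: commands are idempotent state targets, so a dropped packet is simply superseded by the next one.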
3D printed parts (21 STLs)

Shell (4): Top, Bottom, Front, Back
Frame (9): Inner/Outer/Upper/Lower Shoulder Frame, Left/Right Servo Mount, Servo Mount Top Bracket, Side Body Beam, Electronics Mounting Plate
Legs (8): Inner/Outer Tibia, Femur, Femur Servo Connector, Left/Right Shoulder, Short Link, Long Link

All STLs: https://github.com/Jerome-Graves/yertle/tree/main/Design/STL

Hardware compatibility (BYOD)

Yertle is not an orobot-firmware-native build; it uses a custom ESP32 firmware. Running the orobot Program against real hardware requires bridging the orobot WebSocket protocol to Yertle's UDP master/slave protocol — a custom integration tracked under the orobot ESP32 BYOD effort.

Inspirations

Kangal (leg design)
SpotMicro (body geometry, parts compatibility)
Open Quadruped

---
Extracted from commit on 2026-04-27.
SO-101 Teleop Arm
SO-101 is the current-generation open-source 6-DOF teleoperated robotic arm from The Robot Studio, designed in collaboration with Hugging Face's LeRobot project. It is the successor to the SO-100 and one of the most widely built low-cost research arms in the world, used by hundreds of researchers and hobbyists as a platform for imitation learning, robot learning datasets, and end-to-end AI for manipulation.

The design is a leader/follower pair: the human operator back-drives the leader arm by hand, and the follower mirrors the motion to manipulate objects. Both arms share the same 6-DOF serial kinematic structure (base rotation, shoulder, elbow, wrist pitch, wrist roll, gripper) with one STS3215 smart servo per joint. The total bill of materials is under $120 per arm including the parallel-jaw gripper, making it dramatically more accessible than conventional research arms.

SO-101 improves on SO-100 with cleaner wiring channels, easier assembly (no gear disassembly required for installation), and updated motors on the leader arm for improved back-driveability. The design is fully parametric, with print-orientation guides provided for both Ender and Prusa workflows. All CAD, STLs, firmware, and assembly instructions are Apache 2.0 licensed.

The arm plugs directly into the Hugging Face LeRobot library for data collection, policy training (ACT, diffusion policy, VQ-BeT), and rollout — you can record teleop demonstrations, train a neural policy, and run autonomous manipulation on the same hardware. This program exposes cloud-side endpoints for homing, gripper control, recorded trajectory replay, and a mirror-leader hook for live teleop relay.

Credit to The Robot Studio (therobotstudio.com) and the Hugging Face LeRobot team — upstream repository at https://github.com/TheRobotStudio/SO-ARM100 under Apache 2.0. Assembly guide at https://huggingface.co/docs/lerobot/so101.
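At its core, leader/follower mirroring is a per-joint copy loop with rate limiting, so the follower can never slam toward a distant target after a dropout. A minimal Python sketch of one tick (the clamp value and flat joint-position lists are illustrative; the real LeRobot teleop loop works through its own robot classes):

```python
def mirror_step(leader_positions, follower_positions, max_delta=0.05):
    """One teleop tick: move each follower joint toward the leader's reading,
    clamping the per-tick change to max_delta (radians) as a rate limit."""
    out = []
    for target, current in zip(leader_positions, follower_positions):
        delta = max(-max_delta, min(max_delta, target - current))
        out.append(current + delta)
    return out
```

Run at, say, 50 Hz, a 0.05 rad clamp caps joint speed at 2.5 rad/s regardless of how far the leader has moved between reads.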
Zowi Biped
163 stars on GitHub. Zowi is the official open-source biped from bq (sponsored until 2016) — a compact 4-servo educational walking robot, descended from the Otto/BoB Thingiverse lineage (original concept by k120189, Thingiverse 43708). Designed entirely in FreeCAD so every part is editable and remixable. Licensed CC-BY-SA 4.0.

Source: https://github.com/JavierIH/zowi

Hardware:
4× Futaba 3003 (or compatible) servos
BQ ZUM BT328 or Arduino-compatible board
4× AAA battery holder
~38 M3 bolts and 22 M3 nuts

The repository ships the canonical body/chassis/leg/foot STLs plus a deep mods folder with remix variants: Forge, IronZowi, JIM, Kobuki, MicroRaider, Scopum, Zowarrior, Zowimanoid, Zowiquilator. Also includes Arduino code, schematics, and Spanish-language docs.

Resources:
Original Thingiverse concept: http://www.thingiverse.com/thing:43708
DIWO blog (Spanish): http://diwo.bq.com/zowi-introduccion-a-los-robots-bipedos/

License: CC-BY-SA 4.0.
SpotMicro ESP32
377 stars on GitHub. SpotMicroESP32 is Michael Kubina's redesign of the SpotMicro quadruped, derived from KDY0523's original Thingiverse design, optimized for support-free 3D printing and built around an ESP32-DevKitC. 12-DOF (3 servos per leg).

Source: https://github.com/michaelkubina/SpotMicroESP32

Hardware:
12 servos (3 per leg: shoulder yaw, upper, lower)
ESP32-DevKitC main controller
Optional ESP32-CAM for vision
LiPo battery with custom mounting brackets

Software ecosystem (community forks):
Maarten Weyn BLE/IK firmware: https://github.com/maartenweyn/SpotMicro_ESP32
Blacksheep Nitro Fork (PCB + walking gait + RC): https://github.com/Blacksheep909/SpotMicroESP32-Nitro-Fork
SpotMicro-Leika (FreeRTOS + 2 gaits): https://github.com/runeharlyk/SpotMicroESP32-Leika
SpotMicroAI Community: https://spotmicroai.readthedocs.io/

Resources:
Thingiverse: https://www.thingiverse.com/thing:4559827
Original SpotMicro by KDY0523: https://www.thingiverse.com/thing:3445283
mjbots Hoverbot
57 stars on GitHub. The mjbots Hoverbot is a compact wheeled robot built around surplus hoverboard hub motors, designed for outdoor and semi-rugged terrain navigation. Developed by Josh Pieper of mjbots, it demonstrates how powerful consumer-grade hub motors combined with custom motor-control electronics can produce a capable autonomous platform.

Source: https://github.com/mjbots/hoverbot

The robot uses two hoverboard hub motors driven by mjbots moteus-c1 brushless motor controllers — high-performance FOC controllers with integrated CAN-FD communication. An mjbots pi3hat provides IMU (gyro + accelerometer) fusion and CAN-FD bus aggregation. A Raspberry Pi 4 runs the control software, connected to the motor controllers over CAN-FD for real-time torque and velocity commands.

Power comes from a cordless drill battery mounted in a custom 3D-printed battery housing with proper mechanical retention rails. The 18V battery drives the 36V-rated hoverboard motors at a reduced voltage, limiting top speed to ~2 m/s — faster than walking pace but mechanically safe. The robot can operate for hours on a single charge and handles varied terrain including grass, gravel, and uneven pavement.

The entire chassis is 3D-printable in PETG, with M3 and M2.5 heat-set inserts providing structural joints. An optional GoPro mount allows first-person video capture. The design is fully open-source, with all STL files, source code, and configuration published under Apache 2.0.

Hardware includes: Raspberry Pi 4 (2GB), 2× mjbots moteus-c1 motor controllers, mjbots pi3hat (IMU + CAN-FD), mjbots power_dist module, 2× surplus hoverboard hub motors, 18V cordless drill battery, 3D-printed PETG chassis.
Otto DIY
Otto DIY — Bipedal Walker Robot

Otto is one of the most beloved open-source DIY robots: a small bipedal walker that anyone can build with a 3D printer, an Arduino Nano, and four micro servos. Originally created by the Otto DIY community, Otto can walk, turn, dance, sing, and emote with optional ultrasonic, sound, and LED matrix add-ons.

This Program is a learning interface for the Otto DIY platform. Full hardware control runs on the Arduino firmware in the source repo below. The orobot.io control sandbox lets you experiment with the command surface (walk, turn, jump, gestures, songs) before wiring it into your own Otto.

Specs

| Property | Value |
|----------|-------|
| Type | Bipedal walker |
| Servos | 4 × SG90 micro servo (LeftLeg, RightLeg, LeftFoot, RightFoot) |
| Controller | Arduino Nano (also Uno, Micro, Mega, ESP8266, ESP32 in dev) |
| Height | ~12 cm |
| Estimated cost | $30–50 USD |
| Estimated build time | 2–6 hours |
| Skill level | Beginner / Intermediate |

Bill of Materials (typical Otto build)

| Item | Qty | Notes |
|------|-----|-------|
| Arduino Nano | 1 | ATmega328P, USB Mini-B |
| SG90 micro servo (180°) | 4 | LeftLeg, RightLeg, LeftFoot, RightFoot |
| HC-SR04 ultrasonic sensor | 1 | optional, eyes / obstacle avoidance |
| Active piezo buzzer | 1 | sounds + songs |
| 4× AA battery holder + batteries | 1 | or 1S/2S LiPo with regulator |
| Jumper wires + perfboard / Otto shield | 1 set | |
| 3D-printed body parts (head, body, legs ×2, feet ×2) | 1 set | STLs at ottodiy.com |
| M2 / M3 self-tapping screws | ~10 | |

Source

Repo: https://github.com/OttoDIY/OttoDIYLib (canonical Arduino library, v13.0)
STLs + assembly guide: https://www.ottodiy.com/
Author: Otto DIY community
License: GPL v3 (code) + CC-BY-SA 4.0 (mechanical design)

Hardware integration status

Otto runs on Arduino Nano with the OttoDIYLib firmware. orobot-firmware does not yet have Arduino Nano support — this Program provides the learning interface and command surface.
To run a real Otto, flash OttoDIYLib onto your Arduino directly and use the bundled examples (, ).

Available commands

home — return to neutral standing position
walk(steps, time, dir) — bipedal walk forward () or backward ()
turn(steps, time, dir) — turn in place
jump(steps, time) — quick crouch + extend
moonwalk(steps, time, h, dir) — Otto's signature dance move
gesture(name) — Happy, Sad, Angry, Love, Confused, Wave, Magic, Fail, Sleeping, etc.
sing(song) — 19 built-in tunes (Shappy, Ssad, S_surprise, ...)

Credits

Massive thanks to @JavierIH, @Obijuan, @sfranzyshen, and the dozens of contributors who have built and maintained Otto DIY for nearly a decade. Otto is one of the projects that proved tiny, friendly, accessible robots could be a global open-hardware movement.
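The command surface above maps naturally onto a one-line-per-command text protocol. A Python helper for building such lines is sketched below; note that OttoDIYLib itself exposes these as C++ methods (e.g. Otto.walk(steps, T, dir)) on the Nano, so a serial bridge accepting these strings is an assumption, not part of the stock firmware:

```python
def otto_command(name, *args):
    """Encode a command such as walk(4, 1000, 1) as the line b"walk 4 1000 1\n".
    The text protocol itself is hypothetical; adapt to your own bridge sketch."""
    parts = [name] + [str(a) for a in args]
    return (" ".join(parts) + "\n").encode()

# A short routine as a list of frames one could stream over pyserial:
routine = [
    otto_command("home"),
    otto_command("walk", 4, 1000, 1),        # 4 steps, 1000 ms per step
    otto_command("moonwalk", 2, 900, 20, 1), # 2 cycles, height 20
]
```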
Modular Biped
467 stars on GitHub. The Modular Bipedal Robot is an open-source 3D-printable companion robot from MakerForge Tech / Aaron Mason, built around a Raspberry Pi + Arduino split. The repo includes the full STL set for the v2 body, head, neck, and legs, plus a modular Python/C++ software framework with drop-in modules for vision, speech, motion detection, GPT chat, NeoPixel LEDs, RTL-SDR, and more.

Source: https://github.com/makerforgetech/modular-biped

Architecture:
Raspberry Pi (Pi 4 or Pi 5) handles vision, speech, and orchestration
Arduino handles real-time servo control via serial
Custom PCBs for power and IO
IMX500 AI camera module supported

Software modules (drop-in):
Animation, Tracking, Vision (face/object), TTS / Braillespeak
ChatGPT / Translator (LLM-driven interaction)
Servos, PiServo, Neopixel, Buzzer, Motion detection
Serial bus, RTL-SDR radio, Viam integration

Build resources:
Wiki & build guide: https://github.com/makerforgetech/modular-biped/wiki
3D-print files: https://github.com/makerforgetech/modular-biped/tree/main/3d_prints/v2
Hardware list: https://github.com/makerforgetech/modular-biped/wiki/Hardware
Software architecture: see Software Architecture.drawio.svg in the repo

License: MIT.
Spot Micro Quadruped — Jetson Nano (ROS Melodic)
Spot Micro Quadruped — Jetson Nano / ROS Melodic Port

> Jetson-Nano-specific port of the SpotMicro design family. See also: SpotMicro (Pi) and SpotMicro ESP32 entries on orobot for other compute targets.

A 12-servo, 4-legged, 3D-printed open-source quadruped running ROS Melodic on an NVIDIA Jetson Nano. This fork () ports the original Raspberry Pi 3B + ROS Kinetic stack onto the Jetson Nano with ROS Melodic, unlocking the GPU for on-board SLAM, perception, and future learned-policy work.

The robot supports sit, stand, body angle, and walk control via two configurable gaits (8-phase stable gait by default; faster trot gait optional). A body-mounted RPLidar A1 enables real-time SLAM and 2D mapping. State is published over tf2 with open-loop calculated odometry.

Hardware

Compute: NVIDIA Jetson Nano (this port). Original target was Raspberry Pi 3B.
Frame: Thingiverse Spot Micro (KDY0523) — thing:3445283
Servos: 12× PDI-HV5523MG (or cls6336hv — print files compatible)
Servo control: PCA9685, I2C
Power: 2S 4000 mAh LiPo direct to servo board; HKU5 5V/5A UBEC for Jetson + peripherals
Sensing: RPLidar A1 (body-mounted)
Optional: 16×2 I2C LCD panel for state readout

Software stack

OS: Ubuntu 18.04 (for ROS Melodic on Jetson Nano). Original used Ubuntu 16.04 + ROS Kinetic.
Framework: ROS Melodic catkin workspace
Languages: C++ (motion control, kinematics) + Python (keyboard command, plot)
Key packages: , , (URDF), , ,

Build flow

1. Flash the Jetson Nano with Ubuntu 18.04 + ROS Melodic. Add a 1 GB SWAP partition (catkin will OOM without it).
2. Create a catkin workspace, clone this repo into , run .
3. .
4. .
5. Calibrate all 12 servos using the spreadsheet + workflow before powering the legs.
6. on the Jetson; from a remote machine.

Family cross-reference

This is one of three SpotMicro variants on orobot — pick the compute target that matches your build:
SpotMicro (Raspberry Pi) — original , Pi 3B + ROS Kinetic.
SpotMicro ESP32 — , microcontroller-only port without ROS.
SpotMicro Jetson Nano (this entry) — Jetson Nano + ROS Melodic, GPU-accelerated SLAM.

Source

Repo: https://github.com/0x49b/spotMicro-ROS-Melodic-Jetson-Nano
Commit: 8c027c8a357dceace856d586022954205bc247ed
License: MIT
Upstream: (this is a Jetson Nano fork)
XLeRobot Dual-Arm Mobile Home Robot
Low-cost dual-arm mobile robot for embodied AI and household manipulation
OpenBot
3,262 stars on GitHub. OpenBot turns your smartphone into the brain of a low-cost robot, democratizing autonomous robotics for anyone with a ~$50 budget. Developed at Intel Labs and the Technical University of Munich, OpenBot is an MIT-licensed platform designed to make AI-powered robotics accessible to researchers, students, and hobbyists worldwide.

Source: https://github.com/isl-org/OpenBot

The robot body is a 3D-printed differential-drive chassis that holds two gear motors, a speed controller, and a custom PCB — all controlled by an Arduino Nano. The Android or iOS smartphone docks on top, providing the camera, CPU, and network stack. The pairing eliminates the need for a dedicated compute board: your phone's neural engine runs person-following, autonomous navigation, and custom AI policies trained using the companion mobile app.

Four body variants are included: the standard regularbody (two-part top/bottom), the blockbody designed for a PCB stack, the gluebody for simpler assembly without screws, and the slimbody for narrower builds. A universal phonemount adapter fits any of the variants. The 12-part printable set covers bodybottom and bodytop for the regular variant, blockbodybottom and blockbodytop, gluebodybottomA/B and gluebodytopA/B with glue connector halves, slimbodybottom and slimbodytop, and the phonemountbottom/top. The project also supports a tank variant, an MTV off-road variant, and an RTR (ready-to-run) version based on an RC chassis.

License: MIT.

---

Install Notes

OpenBot's intelligence lives in the Android app — vision, navigation, data collection, and AI inference all run on the phone. The orobot device code only bridges the Arduino Nano motor controller layer (forward/backward/turn via serial JSON). Higher-level behaviors (person following, autopilot, data recording) require the OpenBot Android or iOS app connected to the Arduino via a USB OTG cable.
The orobot integration is useful for basic motor testing and manual drive, but does not replicate the full OpenBot feature set.
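The serial bridge described above boils down to mapping a drive command onto two wheel PWMs and framing them for the Nano. A Python sketch under assumed framing (the JSON keys "l"/"r" are hypothetical, not OpenBot's actual serial protocol):

```python
import json

def twist_to_wheels(linear, angular, max_pwm=255):
    """Map (linear, angular) in [-1, 1] to differential-drive left/right PWM,
    rescaling if either wheel would exceed max_pwm."""
    left = (linear - angular) * max_pwm
    right = (linear + angular) * max_pwm
    scale = max(1.0, abs(left) / max_pwm, abs(right) / max_pwm)
    return left / scale, right / scale

def drive_frame(left, right):
    """Hypothetical newline-terminated JSON frame, clamped to signed 8-bit PWM range."""
    clamp = lambda v: max(-255, min(255, int(v)))
    return (json.dumps({"l": clamp(left), "r": clamp(right)}) + "\n").encode()
```

A manual-drive loop would call twist_to_wheels() on joystick input and write drive_frame() to the serial port each tick.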
Stack-chan
1,382 stars on GitHub. Stack-chan is a palm-sized, open-source "super-kawaii" companion robot driven by an M5Stack microcontroller and JavaScript firmware. Created by Shinya Ishikawa, Stack-chan sits on your desk, turns its head to watch you, expresses emotions on its built-in display, and responds through speech — a personality in a box you can hold in one hand.

Source: https://github.com/meganetaaan/stack-chan

The firmware is written on the Moddable SDK, a JavaScript framework for embedded systems, making the robot programmable without leaving the web development ecosystem. Behaviors — called "mods" — are composable: face expressions (happy, angry, sad), servo-driven head tracking, speech synthesis, and M5Stack unit add-on support can all be mixed and layered.

The 3D-printable enclosure is modular and supports multiple servo configurations: the default case uses SG90 or MG90S servos for pan/tilt, while a second design (RS30X series) uses higher-torque TTL servos for smoother motion. The 46-part printable set covers the main shell, bracket, feet, spacer, accessories (hat, backpack variants, Lego adapter), and servo-specific geometry for SCS0009, SG90, MG90S, and RS30X actuators.

An official M5Stack commercial version (StackChan) is also available, with the community meganetaaan fork remaining the primary open-source hardware and firmware reference. License: Apache 2.0.
Open Duck Mini
2,642 stars on GitHub. Open Duck Mini — Bipedal BDX Droid Replica

Source: https://github.com/apirrone/OpenDuckMini

A miniature bipedal character robot inspired by Disney's BDX droid (the little star of the Galactic Starcruiser walkabout experience and several Disney+ trailers). Designed by Antoine Pirrone (apirrone), Open Duck Mini scales that silhouette down to about 35 cm tall and lets you build one for a few hundred dollars in parts.

What it is

A walking, head-tilting, antenna-twitching little biped with serious character. Under the cute exterior it's a legitimate RL-trained biped:
Legs trained in Isaac Gym, sim-to-real transferred to physical hardware
Standing-up-from-fall policy with demonstrated perturbation robustness
Sim2Sim pipeline through Mujoco for validation before hardware deployment
Head + antennas add expressiveness on top of the locomotion stack

Hardware

Servos: Dynamixel XC330-M288-T across all joints (upgraded from XL330 for torque headroom on landing impacts)
IMU: BNO055 9-axis for base orientation estimate
Compute: an onboard board (RPi-class or similar SBC) running the learned policy
Battery: small LiPo in the cage, hot-swappable

What you can do with it

Run the pretrained walking policy straight from the repo — step the robot around, send it forward/turn commands over the orobot cloud surface.
Collect your own episodes — use the remote surface here as the operator input for demos.
Research biped RL — swap in your own PPO / SAC training runs; the Mujoco and Isaac scenes are all published.
Character animation — antenna and head servos give it genuine personality for demos, events, or as a companion prop.
Printed parts (canonical set shipped with this program)

, — shell and torso
, — hip assemblies
— leg reinforcement
, — joint linkages
— contact surface
, , — internal electronics cage
, — head antennas

The full repo includes ~130 STLs covering wiring routing, battery variants, and community mods (see Jaime's v2 mods and Justin's "Park Head" variant in the upstream tree).

Attribution & license

Designer: Antoine Pirrone (apirrone)
Upstream: https://github.com/apirrone/OpenDuckMini
License: Apache 2.0
Related: Duck community mods + variants under in the upstream repo

Links

Upstream: https://github.com/apirrone/OpenDuckMini
Training code: https://github.com/apirrone/OpenDuckMiniRuntime
BDX inspiration: Disney's Galactic Starcruiser droid

---

Install Notes

The orobot action runs the MuJoCo simulation training script (), not the physical robot deployment. The actual walking firmware for the physical duck lives in a separate repository: apirrone/OpenDuckMiniRuntime. To run on hardware, clone the Runtime repo separately, deploy to a Raspberry Pi Zero 2W, and update the action in this Program's editor to point at the Runtime's main script.
Hexapod
Hexapod — 3D Printed Six-Legged Walking Robot

A fully 3D-printed hexapod robot with 18 servo motors (three per leg) providing lifelike, agile locomotion. Designed by rookidroid.com, this project uses either an ESP32 or Raspberry Pi Pico W/2W controller board with built-in WiFi for wireless remote control. The firmware supports over-the-air (OTA) updates so you can iterate on motion patterns without touching the hardware.

Hexapod v2 is the recommended build. The original v1 used MG90S servos, which are prone to failure; v2 upgrades to stronger 21G DS Power/Miuzei servos and is significantly more reliable.

Specifications

| Property | Value |
|----------|-------|
| Legs | 6 |
| Servos | 18 × 21G (3 per leg: hip, knee, ankle) |
| Controller | ESP32 or Raspberry Pi Pico W/2W |
| Communication | WiFi (UDP port 1234) + OTA updates |
| Power | 2 × 18650 Li-ion cells |
| Print time | ~40–60 hours total (no supports needed) |
| Skill level | Intermediate |

Motion Modes

The ESP32 firmware implements a pre-computed look-up-table gait system with 18 motion modes including: directional walking at 0, 45, 90, 135 degrees (left and right variants), 180 degrees; fast forward and backward; turn left and right; climb forward and backward; body rotations on X, Y, Z axes; and a twist mode.

Bill of Materials

| Item | Qty | Notes |
|------|-----|-------|
| 21G servo (DS Power or Miuzei) | 18 | Main actuators |
| Hexapod controller board (ESP32 or Pico version) | 1 | From rookidroid.com |
| 18650 Li-ion battery | 2 | |
| 18650 battery holder | 1 | |
| Rocker switch | 1 | |
| M2 × 6 mm screws | 36 | |
| M2 × 10 mm screws | 198 | |
| M2 nuts | 234 | |
| M4 × 6 mm pins (304 steel) | 18 | |
| MR74-2RS bearings (4×7×2.5 mm) | 18 | Leg joints |

3D-Printed Parts

All parts print without supports. The full set covers: body (9 unique parts), joints (3 types, 6 sets), legs (3 types, 6 sets), feet (4 types, 6 sets), and a cable-holder accessory — 20 unique STL files, all included here.
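A look-up-table gait trades runtime trigonometry for memory: each cycle is computed once and then simply indexed at a fixed tick rate. A simplified Python sketch of pre-computing one leg's cycle (step count, stride, and lift values are illustrative placeholders, not the firmware's real tables):

```python
import math

def build_gait_table(steps=64, lift_deg=10.0, stride_deg=20.0):
    """Pre-compute one leg's gait cycle as (swing_deg, lift_deg) pairs.

    The firmware would store tables like this for every leg and motion mode,
    then index them each control tick instead of recomputing trig."""
    table = []
    for i in range(steps):
        phase = 2 * math.pi * i / steps
        swing = stride_deg / 2 * math.sin(phase)       # fore/aft hip angle
        height = max(0.0, lift_deg * math.sin(phase))  # lift only during the swing half
        table.append((swing, height))
    return table
```

On a microcontroller, the per-leg tables for all 18 modes fit comfortably in flash, and the control loop reduces to table lookups plus servo writes.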
Attribution Creator: rookidroid.com Source: https://github.com/rookidroid/hexapod License: GNU GPL v3
Reachy Mini
Expressive open-source robot for hackers and AI builders
DIY SmartLock
DIY SmartLock

A battery-powered 3D-printed smart lock powered by an ESP32 and driven by an N20 geared motor. Designed for residential door locks, it replaces the manual key turn with touch-triggered automation — touching the metal door knob from outside wakes the ESP32 from deep sleep and checks MQTT for authorization before unlocking.

How It Works

The ESP32 spends nearly all of its time in deep sleep (as low as 5.2 µA on a bare ESP32), waking only when the capacitive touch sensor detects contact with the door knob. On wake it connects to WiFi, subscribes to an MQTT topic (door/auth), and drives the N20 motor clockwise or counter-clockwise to lock or unlock. The system integrates naturally with Node-RED, Home Assistant, or any MQTT-capable automation stack for presence-based auto-auth.

Specifications

| Property | Value |
|---|---|
| Controller | ESP32 (bare WROOM-32 or LOLIN D32) |
| Motor | N20 geared DC motor (9 V, ~40 mA no-load) |
| Motor Driver | TB6612FNG (direct GPIO control) |
| Power — Logic | 2× AA alkaline batteries (~3.2 V) |
| Power — Motor | 9 V block battery |
| Deep Sleep Current | ~5.2 µA (bare ESP32), 125 µA (LOLIN D32) |
| Wake Trigger | Capacitive touch sensor (ESP32 pin T2) |
| Connectivity | WiFi 802.11 b/g/n + MQTT |
| Printed Parts | 4 (base, gear-motor, cage-motor, gear-key-knob) |
| Build Difficulty | Intermediate |
| Firmware | Arduino / PlatformIO |

3D-Printed Parts (4 total)

All parts designed in Fusion 360. The full assembled model is also included for reference.
1. Base — mounts to door hardware, holds electronics
2. Gear Motor — gear interface between motor shaft and key
3. Cage Motor — retains N20 motor in position
4. Gear Key Knob — adapter that couples to existing door key

Bill of Materials (Estimated)

| Component | Notes |
|---|---|
| ESP32 bare / LOLIN D32 | LOLIN D32 recommended (125 µA deep sleep from battery) |
| N20 geared DC motor | Must operate at 9 V to actuate door trap reliably |
| TB6612FNG motor driver | Direct GPIO control; no I2C overhead |
| 2× AA battery holder | Powers ESP32 logic rail |
| 9 V battery + clip | Powers motor only |
| Jumper wires | 3 control pins: BIN1 (pin 27), BIN2 (pin 14), STBY (pin 26) |
| Metal screw or plate | Connected to touch pin T2 to sense door knob contact |

Attribution

Designer: Florian Vogler (@vogler on GitHub)
Source: https://github.com/vogler/SmartLock
License: Source-available (no explicit open-source license — check repo for usage terms)
3D Model: https://a360.co/4lLHHwa (Fusion 360, downloadable in multiple formats)
Build Album: https://photos.app.goo.gl/bewiZ1qH8sHnJjmg7
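On the automation side, presence-based auto-auth is just a small decision function that publishes to the door/auth topic. A Python sketch of that logic (device names and the "unlock"/"deny" payloads are assumptions, not the firmware's actual protocol):

```python
# Node-RED-style presence auth for the door/auth topic, sketched in Python.
ALLOWED = ("alice-phone", "bob-phone")  # hypothetical trusted devices

def auth_payload(devices_present, allowed=ALLOWED):
    """Return the MQTT payload to publish on door/auth for a presence snapshot:
    unlock if any trusted device is on the network, otherwise deny."""
    return "unlock" if any(d in allowed for d in devices_present) else "deny"

# Publishing side (requires paho-mqtt; shown as a comment to keep this self-contained):
#   import paho.mqtt.client as mqtt
#   client = mqtt.Client()
#   client.connect("broker.local")
#   client.publish("door/auth", auth_payload(scan_network()))
```

Because the ESP32 only polls door/auth on touch-wake, the automation stack can update the retained payload at leisure; the lock reads whatever answer is current when someone grabs the knob.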
PAROL6 Desktop Robot Arm
PAROL6 Desktop Robot Arm

A high-performance 6-DOF desktop robotic arm designed to mirror industrial robots in mechanical design, control software, and usability — but small enough to sit on your desk. Designed by Petar Crnjak (Source Robotics). Released under GPLv3. STL files, control software, and GUI are all open-source.

What you can do

Run kinematic demos from your browser
Practice pick-and-place with the included gripper attachments
Learn industrial-style arm control (home, jog, teach points)
Extend with your own gripper tooling (pneumatic, vacuum, 2-finger)

Specs

| | |
|---|---|
| Degrees of Freedom | 6 + gripper |
| Joints | J1 base, J2 shoulder, J3 elbow, J4/J5 forearm, J6 wrist |
| Payload | ~500 g (typical) |
| Reach | ~400 mm |
| Controller | Custom PAROL6 control board (STM32-based, PlatformIO) |
| Motors | NEMA 17 stepper motors on joints |
| License | GPLv3 (software + STLs) |

Build options

Two paths:
1. Buy a kit from Source Robotics — fully supported, pre-sourced parts
2. Source & print yourself — follow the BOM and Building instructions

STL files

This program includes a representative subset of 12 STLs covering BASE, SHOULDER, ELBOW, FOREARM, GRIPPER, and ESTOP groups. For the full 41-part canonical set (plus mounting plates and extras), see the STL directory on GitHub.

Resources

📖 Official Docs
🎥 YouTube demo
🐍 Python API
🎛 Commander software
🤖 ROS2 / MoveIt simulation
💬 Discord community

⚠️ Safety

PAROL6 involves lethal voltages and moving mechanical parts. Read the full SAFETY WARNING AND DISCLAIMER before assembling or operating.

Attribution

Source: PCrnjak/PAROL6-Desktop-robot-arm · License: GPLv3 · © Petar Crnjak / Source Robotics
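With stepper joints, position control largely reduces to unit conversion: degrees to microsteps through the step angle, microstepping factor, and reduction ratio. A sketch of that conversion (the 16× microstepping and 20:1 ratio are placeholders, not PAROL6's real drivetrain values):

```python
def joint_microsteps(delta_deg, full_steps_per_rev=200, microstepping=16, gear_ratio=20.0):
    """Convert a joint angle change (degrees) into stepper microsteps.

    NEMA 17 motors are typically 200 full steps/rev (1.8 deg/step); the
    microstepping and per-joint gear ratio here are illustrative assumptions."""
    return round(delta_deg / 360.0 * full_steps_per_rev * microstepping * gear_ratio)
```

Jog and teach-point moves then become signed microstep counts streamed to the STM32 board, with per-joint ratios looked up from the arm's configuration.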
LeKiwi Mobile Base
LeKiwi — Mobile Base for SO-ARM100

LeKiwi is a three-wheeled omnidirectional mobile base designed to carry the SO-ARM100 (or its sibling Koch arm) around a real environment. Add a 5-DOF teleop arm on top, put a camera on the base and the wrist, and you have a low-cost mobile manipulation platform compatible with the entire LeRobot stack.

Why "kiwi"?

Kiwi drive — three omniwheels at 120° — gives you true holonomic motion: the base can translate in any direction and rotate independently, no slewing required. That's huge for teleop demos where a human operator is driving with a joystick and expects the base to move "like a spaceship," not like a car.

What you get

Three-wheel kiwi drive with omniwheels on custom-printed hubs
Feetech or Dynamixel servo variants (pick your servo ecosystem — both hubs are included)
Raspberry Pi + optional Jetson Orin compute cage printed onto the base plate
Battery compartment sized for standard 3S/4S LiPo packs
Servo-controller mount and base-mounted webcam holder
Mounting pattern on the top plate that drops directly into an SO-ARM100 base

What you do with it

Mobile teleoperation: leader arm drives the follower arm, joystick drives the base. Stream both from a laptop anywhere on the network.
Mobile imitation learning: extend LeRobot's episode recorder to include base velocities. The existing Diffusion Policy and ACT models have been adapted by the community to condition on base state.
Multi-camera observation: base camera sees the environment, wrist camera sees the task. Both go into the policy.
Classroom mobile robotics: cheap, printable, and works with commodity servos.
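The holonomic claim ("translate in any direction and rotate independently") falls out of simple inverse kinematics: project the desired body twist onto each wheel's rolling direction. A Python sketch with placeholder geometry (wheel radius, base radius, and wheel angles are assumptions, not LeKiwi's measured values):

```python
import math

def kiwi_wheel_speeds(vx, vy, wz, wheel_radius=0.05, base_radius=0.12,
                      angles_deg=(0.0, 120.0, 240.0)):
    """Inverse kinematics of a three-omniwheel kiwi drive.

    For a wheel mounted at angle a on the base circle, rolling tangentially,
    its surface speed is -sin(a)*vx + cos(a)*vy + base_radius*wz.
    Returns the three wheel angular velocities in rad/s."""
    speeds = []
    for a_deg in angles_deg:
        a = math.radians(a_deg)
        v = -math.sin(a) * vx + math.cos(a) * vy + base_radius * wz
        speeds.append(v / wheel_radius)
    return speeds
```

Two sanity checks make the geometry intuitive: pure rotation spins all three wheels equally, and pure translation makes the three wheel speeds sum to zero.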
Parts

~12 canonical printed parts on the base:
/ — sandwich plate structure
— servo bracket, ×3 at 120°
— attaches the omniwheel to the servo horn, ×3
— LiPo tray
/ — Raspberry Pi 5 enclosure
— optional Jetson Orin bracket
— Dynamixel U2D2 / Waveshare bus driver holder
, — webcam rigs
— replacement SO-ARM100 base shim that lets the arm sit flush on the base plate

Attribution & license

Designer: SIGRobotics-UIUC (Student Interest Group on Robotics, University of Illinois Urbana-Champaign)
Upstream: https://github.com/SIGRobotics-UIUC/LeKiwi
License: Apache 2.0
Ecosystem: LeRobot

Links

Upstream repo: https://github.com/SIGRobotics-UIUC/LeKiwi
SO-ARM100 (arm to mount on top): https://github.com/TheRobotStudio/SO-ARM100
LeRobot stack: https://github.com/huggingface/lerobot
SO-ARM101 — Standard Open Arm
A 6-DOF open-source teleoperation arm system designed by The Robot Studio in collaboration with Hugging Face. SO-101 is the next-generation version of the SO-100, with improved wiring, easier assembly, and updated motors. Built to work seamlessly with the 🤗 LeRobot library for end-to-end AI robotics research.

What makes it interesting

- Leader + follower teleoperation pair — move the leader arm, the follower mimics in real time
- LeRobot-native — record demonstrations, train imitation-learning policies, deploy to the arm
- Low cost — ~$100 per arm in printed parts + motors
- Active community — dozens of vendors selling kits worldwide

Specs

| | |
|---|---|
| Degrees of Freedom | 6 per arm (leader + follower = 12 total) |
| Motors | Feetech STS3215 servos |
| Reach | ~420 mm |
| License | Apache 2.0 (code + hardware) |
| Ecosystem | Hugging Face LeRobot |

Build options

Two paths:

1. Buy a kit — dozens of vendors listed in the README (PartaBot US, Seeed Studio, WowRobo, etc.)
2. Source & print yourself — follow the 3DPRINT.md guide and the Hugging Face Assembly Guide

STL files

This program includes 12 representative SO-101 individual parts (base, motor holders, arm links, wrist roll/pitch, gripper jaw, handle). For the full SO-101 printable set plus the SO-100 legacy parts, optional accessories (cam mounts, bases, grippers), and the Mini variant, browse the STL directory on GitHub.

Resources

- 📖 SO-101 Assembly Guide (Hugging Face)
- 🤗 LeRobot library
- 💬 Discord community
- 🖨 Printing guide (3DPRINT.md)
- 🏭 The Robot Studio

Attribution

Source: TheRobotStudio/SO-ARM100 · License: Apache 2.0 · © The Robot Studio / Hugging Face contributors
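The leader/follower pairing boils down to reading the leader's joint positions every control cycle and writing them back as goal positions on the follower, after applying per-joint calibration. Here is a library-free sketch of that mapping step — the offsets, signs, and joint count are hypothetical illustration values, not taken from any SO-101 calibration file (the STS3215's 12-bit position range is real):

```python
# Sketch of the leader→follower mirroring step in a teleop loop.
# CALIB holds hypothetical per-joint (zero offset, direction) pairs;
# a real SO-101 setup derives these from its calibration procedure.

RESOLUTION = 4096  # STS3215 reports positions as 12-bit counts (0–4095)

CALIB = [  # (offset, sign) per joint — illustrative values only
    (2048, 1), (2048, -1), (2048, 1),
    (2048, 1), (2048, -1), (2048, 1),
]

def mirror_positions(leader_counts):
    """Map raw leader servo counts to follower goal counts."""
    goals = []
    for raw, (offset, sign) in zip(leader_counts, CALIB):
        centered = (raw - offset) * sign      # joint angle about its zero
        goals.append((centered + offset) % RESOLUTION)
    return goals
```

In a real loop this function sits between a bus read on the leader and a sync-write on the follower, running at the servo bus polling rate.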
Assembler 0
3D-printed robot arms that can 3D-print their own structural parts
Qubit — Dual-Arm Desktop Robot
Qubit: a dual-arm desktop robot with expressive 16×16 LED eyes
Poppy Humanoid
891 stars on GitHub. Poppy Humanoid is an open-source, 3D-printed, 25-DOF humanoid robot from the Poppy Project, originally developed at Inria's Flowers team for research in embodied cognition, human-robot interaction, and robot learning. Standing 84 cm tall and weighing about 3.5 kg, Poppy is designed with biomechanically motivated bent-thigh geometry and a deliberately "life-like" proportion philosophy: because the robot's morphology influences how humans perceive and interact with it, the researchers argued that a scientifically grounded body design is as important as the control algorithms that run on top of it.

Source: https://github.com/poppy-project/poppy-humanoid

The robot uses 25 Dynamixel smart servos (MX-28 for torso and hips, MX-64 for knees, AX-12 for arms and head) communicating on a daisy-chained TTL bus. An onboard Raspberry Pi provides high-level control via pypot (the Python Dynamixel library developed for this project), with real-time joint targets streamed from user code. Compliant mode lets you back-drive the robot by hand to teach poses and trajectories — a powerful pedagogical and research primitive. Poppy Humanoid has been used in dozens of published studies on bipedal walking, imitation learning, developmental robotics, and tutor-assisted programming education.

Poppy is a modular family: the Torso variant is a desk-mountable upper body, and Ergo Jr is a 6-DOF desk arm. Both share the Dynamixel bus stack and the pypot API, so behaviors developed for one Poppy robot often port directly to another with configuration-level changes rather than code rewrites.

This program exposes cloud-side endpoints for rest/sit postures, arm waving, compliant-mode toggle, dance primitive playback, and full demonstration record/replay. The onboard Raspberry Pi bridges the orobot cloud to the Dynamixel bus via pypot.
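The compliant-mode teach-and-replay workflow — back-drive the arm by hand, sample joint positions at a fixed rate, then play the trajectory back — can be sketched without any hardware. This is a plain-Python illustration of the idea, not pypot's actual recorder API; the joint name is hypothetical:

```python
# Library-free sketch of teach-and-replay: while the robot is compliant,
# sample (time, joint positions) frames; on replay, interpolate between
# recorded frames to get a smooth position target at any playback time.

class TrajectoryRecorder:
    def __init__(self):
        self.frames = []  # list of (t, {joint_name: position_deg})

    def sample(self, t, positions):
        """Store one timestamped snapshot of joint positions."""
        self.frames.append((t, dict(positions)))

    def position_at(self, t):
        """Linearly interpolate recorded joint positions at time t."""
        frames = self.frames
        if t <= frames[0][0]:
            return dict(frames[0][1])
        if t >= frames[-1][0]:
            return dict(frames[-1][1])
        for (t0, p0), (t1, p1) in zip(frames, frames[1:]):
            if t0 <= t <= t1:
                a = (t - t0) / (t1 - t0)
                return {j: p0[j] + a * (p1[j] - p0[j]) for j in p0}
```

On real hardware the `sample` calls would read present positions while the motors are compliant, and replay would stream `position_at(t)` back as goal positions with compliance switched off.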
Credit to the Poppy Project community and Inria Flowers team (poppy-project.org) — upstream repository at https://github.com/poppy-project/poppy-humanoid. Hardware under Creative Commons Attribution-ShareAlike 4.0 (hardware/LICENSE.md), software under GPLv3 (software/LICENSE). Share-alike requires derivative hardware designs to be published under the same CC-BY-SA-4.0 license.