Good vibes.
The technical documentation is below - this is how we got to the state documented there! It is an incomplete list because this was a massive effort and it's impossible to reproduce every step here. We're doing our best to capture the important parts!
The goal is to write the software for an icosphere that can move forward by taking multiple steps in a given direction (orientation via IMU, 20 servos to actuate the sides of the icosphere). It should also play audio.
Matti and Sun met in the evening to figure out bi-directional communication between a browser page, which acts as the brain of the project, and the ESP32 microcontroller. They opted for a UDP connection and built a first prototype that establishes a connection.
The project is initialized and we're rolling! Well, not yet, but we're getting there.
The whole team meets on the 5th floor and prepares. It's the first time we get access to a board with an IMU on it. Thanks EE team! We get familiar with the data coming out of the IMU by running a basic Adafruit example.
Sophia creates a first version of an HTML page that will act as a remote control for debugging later. Sun is building an API that will enable proper communication between the 2 pieces of the puzzle. We come up with an organizational structure of the code base that allows everyone to work on their own thing before integration will happen later.
We discuss the vector maths required to make the bot move correctly but also maintain a consistent forward axis in world space (despite its own tumbling). Yufeng has a first visualization of the IMU data.

We have a quick conversation about how this thing will move. We decide that a naive algorithm for getting it to "take a step" forward is to actuate two faces adjacent to the face which is on the ground, to tip it onto the third neighboring face. Eitan had 3D-printed 20-sided dice for all of us, which we decided to use as a shared reference for how to refer to the faces. Miranda, Saetbyeol, and Eitan encode the face adjacencies into a JSON file which maps each face number to its neighbors. We generate unit normal vectors for each face of an arbitrary icosahedron. From these, Miranda and Saetbyeol implement a quick visualization for how to compute which faces to actuate given the face which is on the ground. In the visualization below, the face on the ground is colored in red, the face most closely aligned with the movement vector (greatest dot product between the face normal and the movement vector) is colored in yellow, and the two faces to actuate (the other two neighbors) are colored in green.
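The selection step described above can be sketched in code. This is an illustrative sketch only: the `planStep` helper is hypothetical, and the neighbor IDs and unit normals used in any example are placeholders, not the real icosahedron data from the JSON file.

```cpp
#include <array>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Result of planning one step: the neighbor whose normal best aligns with
// the movement vector ("yellow"), and the two remaining neighbors to
// actuate ("green").
struct StepPlan {
    int yellow;
    std::array<int, 2> green;
};

// Given the ground face's three neighbor IDs, their unit normals, and the
// desired movement direction, pick "yellow" by greatest dot product between
// face normal and movement vector; the other two neighbors get actuated.
StepPlan planStep(const std::array<int, 3>& neighbors,
                  const std::array<Vec3, 3>& normals,
                  const Vec3& movement) {
    int best = 0;
    for (int i = 1; i < 3; ++i) {
        if (dot(normals[i], movement) > dot(normals[best], movement)) best = i;
    }
    StepPlan plan{neighbors[best], {}};
    int g = 0;
    for (int i = 0; i < 3; ++i) {
        if (i != best) plan.green[g++] = neighbors[i];
    }
    return plan;
}
```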

Matti and Miranda look at 3D vector maths in the morning and figure out a (theoretical) prototype that should do the trick. It's based on a calibration step that the user initiates once the IMU sufficiently settles. The calibration step grabs the current quaternion coming out of the IMU, inverts it, and stores it for later.
Saetbyeol tests the muxes and gets the servos to run! Miranda modifies an example to get the quaternion reading from the IMU.
Miranda refactors 3D maths code into a Python script instead of a Jupyter notebook. Matti writes a first version of the visualization code and the first in-browser implementation of the logic that figures out when to activate the triangles for a forward move.
Yufeng starts making a miniature model of the shape we're working with.

We cover the whiteboard in the fifth floor study room (which we have been affectionately referring to as the War Room) with diagrams and lists of all of the features we need. We have most of the functionality we need in bits and pieces, and we spend most of the day linking everything together and making interfaces for testing and debugging.
We discuss switching our remote control protocol from UDP to BLE.
Matti, Saetbyeol, and Sun got some failing servo code working.
In order to debug an issue where the BLE signal would sometimes brown out, we moved away from sending JSON strings over BLE and came up with a more compressed representation for sending commands. Our final command encoding sends 4 leading bits for the command followed by a 32-bit data field (20 of those bits address the faces). We discussed changing the data stream from the remote control to the ESP32, and Miranda implemented it. Miranda and Matti discussed the face -> [mux, channel] mapping with Ben and made a plan.

We successfully implemented servo movement to 270 degrees (up from 180) with automatic retraction (timed on the ESP32). We also changed the communication between the in-browser remote control and the ESP32 to a system that addresses the 32 mux channels directly (instead of triangle faces). This works great and will help with debugging on Tuesday. We also set the respective pins on the breakout mux dev boards to match the I2C addresses that will be used on the final PCB. That lets us keep our code static and avoid changing I2C addresses when switching between testing on the breadboards and the milled PCBs.
We implement a sequencer that allows pre-planning steps/instructions. We anticipate that it might be helpful with filming if we get to a point where we want to film the icosphere moving multiple steps.
Some of us meet in the morning to do some late software changes in anticipation of a long day full of debugging. We focus on getting a lot of components in place that allow us to trigger parts of the system in isolation so that we have an easier time once we get access to the fully assembled system.
During assembly there were many tasks we helped with on the software side. A short excerpt:
Tyler brought his new PCB and we helped verify that servos can be run off of all the servo ports he exposes. We also label the ports with him so we know which channels on the muxes are connected to these ports.
We have a setup that allows us to quickly zero out all the servos before assembling them. We need to make sure we know their current state before putting them into the physical system. During assembly we notice that our min/max values don't align with how the mechanical team has set up the mechanism for extending the triangles. We swap these values in our code so that extension becomes retraction and vice versa.
We have problems with one servo not extending and retracting correctly. We have to change the max value we send for extension (from 520 to 512) because the servo doesn't react to 520. Other servos do. We don't fully understand, but this works. Dimitar helped with doing the math to derive the 512 value.
We debug the audio sub-system because we encounter some problems on Tyler's amazing spherical PCB.
The CBA Machine is an interactive robotic sculpture featuring an icosahedron geometry with 20 servo-controlled faces and a 9-axis IMU sensor for orientation tracking. The system enables real-time motion sensing, servo actuation, and audio playback, all controlled wirelessly via Bluetooth Low Energy (BLE) from a web browser.
Key Features:
Use Cases:
The system consists of three main components:
┌─────────────────────────────────────────────┐
│            Web Browser (Client)             │
│  - Three.js 3D Visualization                │
│  - BLE Communication (Web Bluetooth API)    │
│  - Command Sequencer                        │
│  - User Interfaces (Operation & Debugging)  │
└──────────────────────┬──────────────────────┘
                       │ BLE
                       │ Commands: 5-byte binary protocol
                       │ Data:
                       │   - JSON (quaternion + accel)    -> To browser
                       │   - Bit Array (Actuation Cmds)   <- From browser
┌──────────────────────▼──────────────────────┐
│           ESP32-S3 Microcontroller          │
│  - BLE Server                               │
│  - IMU Data Processing (DMP Quaternion)     │
│  - Servo Control (32 channels)              │
│  - Audio Playback (SD Card + I2S)           │
└──────┬──────────┬──────────┬───────────┬────┘
       │          │          │           │
┌──────▼────┐ ┌───▼──────┐ ┌─▼───────┐ ┌─▼────────────┐
│ ICM-20948 │ │ 2x       │ │ SD Card │ │ MAX98357A    │
│ 9-DOF IMU │ │ PCA9685  │ │ Storage │ │ I2S Amplifier│
│ (I2C)     │ │ PWM      │ │ (SPI)   │ │              │
└───────────┘ │ (I2C)    │ └─────────┘ └──────┬───────┘
              └────┬─────┘                    │
                   │                          │
         ┌─────────▼────────┐        ┌───────▼───────┐
         │ 20 Servo Motors  │        │    Speaker    │
         │ (Icosahedron     │        └───────────────┘
         │  Faces 1-20)     │
         └──────────────────┘
Data Flow:
| Component | Model/Type | Interface | Purpose |
|---|---|---|---|
| Microcontroller | Seeed Studio XIAO ESP32S3 | - | Main control unit with BLE |
| IMU Sensor | ICM-20948 (9-DOF) | I2C (address 0x68/0x69) | Orientation tracking |
| PWM Controllers | 2x PCA9685 (16-channel each) | I2C (0x48, 0x60) | 20 servo channels (total) |
| Servo Motors | 20x SunFounder Digital Servo | PWM | Icosahedron face actuation |
| Audio Amplifier | MAX98357A | I2S | Audio playback |
| Storage | SD Card | SPI | Audio file storage |
| Speaker | Standard 8Ω | Analog | Audio output |
I2C Bus (IMU + PWM Controllers):
I2S Audio (MAX98357A):
SPI (SD Card):
Power:
The firmware is organized into modular .ino files, with controller.ino as the main entry point.
controller/
├── controller.ino # Main program entry point
├── config.h # System configuration
├── bluetooth-name.h # BLE device name (gitignored)
├── bluetooth-name.h.example # Template for BLE name
├── servo-params.h # Servo PWM parameters
├── lib-01-ble.ino # BLE communication
├── lib-02-ble-message-handler.ino # Command parser
├── lib-03-sensor.ino # IMU sensor reading
├── lib-04-handle-move-servo.ino # Servo control
├── lib-05-handle-reset.ino # Device reset handler
├── lib-06-handle-sensor-transmission.ino # (Possibly deprecated)
└── lib-07-audio.ino # Audio playback
File: lib-03-sensor.ino
The IMU subsystem uses the SparkFun ICM-20948 library to read orientation data from the 9-axis IMU sensor. The system leverages the sensor's Digital Motion Processor (DMP) to compute quaternion orientation in hardware, reducing CPU load on the ESP32.
DMP Setup (Required): The DMP feature must be manually enabled in the library source code:
~/Arduino/libraries/SparkFun_ICM-20948_ArduinoLibrary/src/util/ICM_20948_C.h

#define ICM_20948_USE_DMP

I2C Configuration:
struct Quaternion {
float w; // q0 (scalar component)
float x; // q1 (i component)
float y; // q2 (j component)
float z; // q3 (k component)
uint16_t accuracy; // Heading accuracy estimate
};
struct Vector3D {
float x; // Acceleration X (mg)
float y; // Acceleration Y (mg)
float z; // Acceleration Z (mg)
};
void setupSensor()
void updateSensorData()
String getSensorJSON()
{
"w": 0.9966,
"x": 0.0368,
"y": 0.0132,
"z": 0.0718,
"ax": 5.13,
"ay": 15.63,
"az": 997.31
}
bool hasMoreSensorData()
Returns true if the FIFO contains additional samples.

The DMP is configured with:
- INV_ICM20948_SENSOR_ORIENTATION (9-axis quaternion)
- ICM_20948_Status_e return codes for error detection

The IMU reports data in its local coordinate system:
Coordinate transformations are applied in the web interface (see brains.js).
Files: lib-04-handle-move-servo.ino, servo-params.h
The servo subsystem controls 20 independent servo motors arranged as faces of an icosahedron. Two PCA9685 PWM controllers (16 channels each) provide the necessary I/O expansion.
ESP32 I2C Bus
│
├── PCA9685 #1 (0x48) ─── Servos 0-15 (Faces 1-16)
│
└── PCA9685 #2 (0x60) ─── Servos 16-19 (Faces 17-20)
File: servo-params.h
#define SERVO_MIN 100 // Retracted position (PWM value)
#define SERVO_MAX 520 // Extended position (PWM value)
#define SERVO_FREQ 50 // PWM frequency (Hz)
Servo Model: SunFounder Digital Servo
PWM Range: 100-520 (corresponds to ~0-180° rotation)
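Assuming a simple linear mapping between the 100-520 counts and the ~0-180° range noted above, a conversion helper might look like the following. `angleToPwm` is a hypothetical illustration, not a function in the firmware:

```cpp
#define SERVO_MIN 100   // retracted position (PCA9685 12-bit count at 50 Hz)
#define SERVO_MAX 520   // extended position

// Linear map from an angle in degrees to a PCA9685 PWM count, assuming the
// 100-520 range spans roughly 0-180 degrees. Input is clamped to the range.
int angleToPwm(float degrees) {
    if (degrees < 0.0f) degrees = 0.0f;
    if (degrees > 180.0f) degrees = 180.0f;
    return SERVO_MIN + (int)((SERVO_MAX - SERVO_MIN) * degrees / 180.0f);
}
```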
void setupServo()
Calls Wire.begin() and initializes the PWM controllers.

void servo_write(int channel, int pwmVal)

- channel: 0-19 (servo channel)
- pwmVal: 100-520 (PWM pulse width)

void reset_servo_positions()
Resets all servos to the SERVO_MIN position.

Servo commands are received as 5-byte binary messages parsed in lib-02-ble-message-handler.ino:
Extend Servo:

- Command: 0b0001
- Extends the specified servos to SERVO_MAX

Retract Servo:

- Command: 0b0010
- Retracts the specified servos to SERVO_MIN

Example: To extend servos 1, 5, and 10:

- Bitmask: 0x00000211 (bits 0, 4, 9 set)

⚠️ IMPORTANT: Servo min/max values may need adjustment per servo. See Questions section.
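A sketch of how the receiver side might expand such a bitmask into servo channels. `channelsFromMask` is a hypothetical helper; the firmware's actual parser lives in lib-02-ble-message-handler.ino:

```cpp
#include <cstdint>
#include <vector>

// Decode a 32-bit servo bitmask (only the low 20 bits are used) into the
// list of channels to actuate; bit 0 corresponds to channel 0 (face 1).
std::vector<int> channelsFromMask(uint32_t mask) {
    std::vector<int> channels;
    for (int ch = 0; ch < 20; ++ch) {
        if (mask & (1u << ch)) channels.push_back(ch);
    }
    return channels;
}
```

For the example above, mask 0x00000211 yields channels 0, 4, and 9.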
Files: lib-01-ble.ino, lib-02-ble-message-handler.ino
The BLE subsystem implements a Nordic UART Service (NUS) for bidirectional communication with a web browser using the Web Bluetooth API.
Device Name:
- Default: "ESP32-Labubu"
- Configured in bluetooth-name.h (gitignored)

Service UUID:

- 6e400001-b5a3-f393-e0a9-e50e24dcca9e (Nordic UART Service)

Characteristics:

TX Characteristic (ESP32 → Browser):

- 6e400003-b5a3-f393-e0a9-e50e24dcca9e

RX Characteristic (Browser → ESP32):

- 6e400002-b5a3-f393-e0a9-e50e24dcca9e

Format: JSON strings sent via BLE notifications
{ "w": 0.875, "x": 0.0243, "y": 0.0393, "z": -0.482, "ax": 17.5781, "ay": 3.1738, "az": 1025.3906 }
Rate: 10-50 Hz (depending on FIFO availability)
Format: 5-byte binary encoding
Byte Layout:
┌─────────┬──────────┬──────────┬──────────┬──────────┐
│ Byte 0  │  Byte 1  │  Byte 2  │  Byte 3  │  Byte 4  │
├─────────┼──────────┼──────────┼──────────┼──────────┤
│ CCCC    │ DDDDDDDD │ DDDDDDDD │ DDDDDDDD │ DDDD0000 │
│ DDDD    │          │          │          │          │
└─────────┴──────────┴──────────┴──────────┴──────────┘

CCCC = Command bits [0:3]
DDDD = Data bits (bits [0:3] in Byte 0 through bits [28:31] in Byte 4)

Total: 4-bit command + 32-bit data
Command Encodings:
| Command | Binary | Data Field | Description |
|---|---|---|---|
| extend_servo | 0b0001 | 32-bit bitmask (20 bits used) | Extend specified servos |
| retract_servo | 0b0010 | 32-bit bitmask (20 bits used) | Retract specified servos |
| play_audio | 0b0100 | 20-bit audio ID | Play audio file |
| reset | 0b1000 | (unused) | Restart ESP32 |
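As a sketch of the sender side, here is one possible packing of the 5-byte frame. The exact bit placement is an assumption, chosen to reproduce the worked extend_servo example ([0x71, 0x00, 0x00, 0x00, 0x00] for data 0x00000007): the 4 command bits occupy the low nibble of byte 0, data bits 0-3 its high nibble, and byte 4 carries data bits 28-31 in its high nibble:

```cpp
#include <array>
#include <cstdint>

// Pack a 4-bit command and 32-bit data field into the 5-byte frame.
// Assumed layout (matches the [0x71, 0x00, 0x00, 0x00, 0x00] example):
//   byte 0: data bits 0-3 (high nibble) | command bits 0-3 (low nibble)
//   bytes 1-3: data bits 4-27
//   byte 4: data bits 28-31 (high nibble), low nibble zero
std::array<uint8_t, 5> encodeCommand(uint8_t cmd, uint32_t data) {
    std::array<uint8_t, 5> out{};
    out[0] = (uint8_t)(((data & 0xFu) << 4) | (cmd & 0xFu));
    out[1] = (uint8_t)((data >> 4) & 0xFFu);
    out[2] = (uint8_t)((data >> 12) & 0xFFu);
    out[3] = (uint8_t)((data >> 20) & 0xFFu);
    out[4] = (uint8_t)(((data >> 28) & 0xFu) << 4);
    return out;
}
```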
Example: Extend servos 1, 2, 3
- Command: 0b0001
- Data: 0x00000007 (bits 0, 1, 2 set)
- Encoded bytes: [0x71, 0x00, 0x00, 0x00, 0x00]

void setupBLE()
void sendBLEMessage(String message)
bool hasBLEMessage()
Returns true if a 5-byte message is available.

void getBLEMessage(uint8_t* data, size_t& len)
void handleBLEConnection()
File: lib-07-audio.ino
The audio subsystem streams 16-bit PCM WAV files from an SD card to a MAX98357A I2S amplifier for playback through a speaker.
| Component | Interface | Pins |
|---|---|---|
| SD Card Reader | SPI | CS: Pin 21 |
| MAX98357A Amplifier | I2S | BCLK: D1, LRCK: D0, DIN: D2 |
Audio files must be stored in /audio/ directory on SD card:
/audio/
├── audio_0000_filename.wav
├── audio_0001_another.wav
├── audio_0042_example.wav
└── ...
Naming Convention: audio_XXXX*.wav
- XXXX: Zero-padded 4-digit ID (0000-9999)

void setupAudio()
void playAudioById(uint32_t audioId)
Scans /audio/ for a file matching audio_{ID:04d}*.wav.

void playAudio()
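A hypothetical sketch of matching a filename against an audio ID under this naming convention. `matchesAudioId` is illustrative only; the firmware's actual scan logic may differ:

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Check whether a filename matches the audio_{ID:04d}* convention for a
// given ID, e.g. ID 42 matches "audio_0042_example.wav".
bool matchesAudioId(const char* filename, uint32_t audioId) {
    char prefix[16];
    snprintf(prefix, sizeof(prefix), "audio_%04u", (unsigned)audioId);
    return strncmp(filename, prefix, strlen(prefix)) == 0;
}
```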
bool initI2S(uint32_t sampleRate)
i2s_config_t config = {
.mode = I2S_MODE_MASTER | I2S_MODE_TX,
.sample_rate = <from WAV file>,
.bits_per_sample = I2S_BITS_PER_SAMPLE_16BIT,
.channel_format = I2S_CHANNEL_FMT_ONLY_RIGHT,
.communication_format = I2S_COMM_FORMAT_I2S_MSB,
.intr_alloc_flags = ESP_INTR_FLAG_LEVEL1,
.dma_buf_count = 8,
.dma_buf_len = 512,
.use_apll = true, // Precise clock
.tx_desc_auto_clear = true,
.fixed_mclk = 0
};
Current Implementation:
- Volume gain: 32.0f (hardcoded in playAudio())
- Applied as: sample = sample * 32.0

Modification:
To change volume, edit volumeGain variable in lib-07-audio.ino.
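For illustration, a gain stage with clamping might look like the following. This is a sketch, not the firmware's code: whether the firmware clamps after scaling is not shown here, but a fixed gain of 32.0f would overflow many 16-bit samples without a clamp:

```cpp
#include <cstdint>

// Apply a gain to a signed 16-bit PCM sample and clamp the result to the
// int16 range, so large gains saturate instead of wrapping around.
int16_t applyGain(int16_t sample, float gain) {
    float scaled = sample * gain;
    if (scaled > 32767.0f) scaled = 32767.0f;
    if (scaled < -32768.0f) scaled = -32768.0f;
    return (int16_t)scaled;
}
```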
Audio playback is triggered via BLE command:
- Command: play_audio (0b0100)

Example: Play audio file 42

- Command: play_audio
- Data: 0x00002A (42 in hex)
- Plays: /audio/audio_0042*.wav

File: controller.ino
void setup() {
Serial.begin(115200); // Debug output
setupBLE(); // Initialize BLE
setupSensor(); // Initialize IMU
setupServo(); // Initialize servos
setupAudio(); // Initialize audio
reset_servo_positions(); // Retract all servos
}
void loop() {
// 1. Handle BLE connection state
handleBLEConnection();
// 2. Read IMU sensor
updateSensorData();
// 3. Send sensor data if connected
sendSensorDataIfReady();
// 4. Process incoming commands
if (hasBLEMessage()) {
uint8_t data[5];
size_t len;
getBLEMessage(data, len);
if (len == 5) {
handleBLECommand(data, len);
}
}
// 5. Stream audio if playing
playAudio();
// 6. Delay only if no more sensor data
if (!hasMoreSensorData()) {
delay(10); // 100 Hz loop when idle
}
}
The loop uses adaptive timing:
Location: software/integration/server/
The web interface provides real-time 3D visualization, BLE control, and command sequencing capabilities using vanilla JavaScript and Three.js.
index.html
│
├── scheduler.js # Async task queue
├── bluetooth.js # BLE communication
├── threejs-vis.js # 3D scene setup
├── brains.js # IMU processing & face detection
├── calibration.js # IMU calibration
├── acceleration.js # Acceleration vector visualization
├── faceneighbors.js # Face topology data
├── cmd_sequencer.js # Command sequencer UI
├── utils.js # Utility functions
└── voice.js # Voice commands (optional)
Purpose: BLE communication layer
Key Functions:
BLE.connect(dataCallback); // Connect to ESP32
BLE.disconnect(); // Disconnect
BLE.send(cmd, args); // Send command (5-byte encoding)
BLE.sendCommand(cmd, args); // High-level command API
BLE.isConnected(); // Check connection status
Command Encoding:
function encodeCommand(cmd, args) {
// Returns Uint8Array(5) with binary encoding
// See "Networking/BLE Subsystem" for format
}
Features:
- Queues writes through the async task scheduler (AsyncTaskScheduler)

Purpose: 3D visualization of IMU orientation
Features:
Key Functions:
initThreeJS(); // Initialize scene
updateIMURotationData(x, y, z, w); // Apply quaternion rotation
Purpose: IMU data processing and face detection logic
Features:
Key Functions:
updateIMURotationData(x, y, z, w, ax, ay, az);
findDownTriangle(); // Detect lowest face
findMovementAlignedNeighbor(); // Find motion-aligned faces
getGreenNeighbors(); // Get neighbor face IDs
Face Detection Logic:
Purpose: Command sequence builder and executor
Features:
Command Types:
- extend_servo: Extend specified servos
- retract_servo: Retract specified servos
- move_servo: Extend → Wait → Retract
- reset: Restart ESP32

Sequence File Format:
# Comment line
move_servo 1,2,3 1000 # Command Args Delay(ms)
reset 500
move_servo 5 2000
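A hypothetical parser for one line of this format, written in C++ for illustration (the actual sequencer is implemented in the browser in cmd_sequencer.js):

```cpp
#include <sstream>
#include <string>
#include <vector>

// One parsed sequence line: command, optional comma-separated integer
// args, and a trailing delay in milliseconds.
struct SeqLine {
    std::string cmd;
    std::vector<int> args;
    int delayMs;
};

// Parse "<command> [args] <delay_ms>  # comment". Returns false for blank
// or comment-only lines.
bool parseSeqLine(const std::string& line, SeqLine& out) {
    std::string text = line.substr(0, line.find('#'));  // strip comment
    std::istringstream ss(text);
    std::vector<std::string> tokens;
    std::string tok;
    while (ss >> tok) tokens.push_back(tok);
    if (tokens.size() < 2) return false;   // need at least cmd + delay
    out.cmd = tokens[0];
    out.args.clear();
    if (tokens.size() == 3) {              // cmd, comma-separated args, delay
        std::istringstream args(tokens[1]);
        std::string n;
        while (std::getline(args, n, ',')) out.args.push_back(std::stoi(n));
    }
    out.delayMs = std::stoi(tokens.back());
    return true;
}
```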
Purpose: IMU orientation calibration
Features:
Usage:
Purpose: Async task queue for sequential BLE writes
Features:
Why Needed:
BLE writeValue() calls must be sequential. Concurrent writes cause failures.
Main Controls:
Command Sequencer Panel:
3D Viewport:
Local Testing:
cd software/integration/server
npx http-server -p 8080
# Open http://localhost:8080
Production (GitLab Pages):
https://classes.pages.cba.mit.edu/863.25/CBA/cba-machine/index.html
Requirements:
Installation: Follow Espressif Arduino-ESP32 installation guide
| Platform | Version | Path |
|---|---|---|
| esp32:esp32 | 3.3.1 | ~/.arduino15/packages/esp32/hardware/esp32/3.3.1 |
Board Selection: "XIAO_ESP32S3" (Seeed Studio XIAO ESP32S3)
| Library | Version | Installation Method | Path |
|---|---|---|---|
| SparkFun ICM-20948 | 1.3.2 | Arduino Library Manager | ~/Arduino/libraries/SparkFun_9DoF_IMU_Breakout_-_ICM_20948_-_Arduino_Library |
| Adafruit PWM Servo Driver | (Latest) | Arduino Library Manager | ~/Arduino/libraries/Adafruit_PWM_Servo_Driver_Library |
| BLE | 3.3.0 | Included with ESP32 platform | ~/.arduino15/packages/esp32/hardware/esp32/3.3.1/libraries/BLE |
| Wire | 3.3.0 | Included with ESP32 platform | ~/.arduino15/packages/esp32/hardware/esp32/3.3.1/libraries/Wire |
| SPI | 3.3.0 | Included with ESP32 platform | ~/.arduino15/packages/esp32/hardware/esp32/3.3.1/libraries/SPI |
| SD | (Included) | Part of ESP32 core | (Built-in) |
| FS | (Included) | Part of ESP32 core | (Built-in) |
Critical Setup Step for ICM-20948:
⚠️ MUST ENABLE DMP SUPPORT
Edit the library header file:
# Linux/Mac
nano ~/Arduino/libraries/SparkFun_9DoF_IMU_Breakout_-_ICM_20948_-_Arduino_Library/src/util/ICM_20948_C.h
# Windows
# Open: Documents\Arduino\libraries\SparkFun_9DoF_IMU_Breakout_-_ICM_20948_-_Arduino_Library\src\util\ICM_20948_C.h
Uncomment line 29:
#define ICM_20948_USE_DMP
Verification: After editing, recompile the controller sketch. If successful, you'll see quaternion data streaming.
File: software/integration/package.json
{
"dependencies": {
"ws": "^8.18.3" // WebSocket server (for optional features)
},
"devDependencies": {
"concurrently": "^9.2.1" // Parallel command execution
}
}
Installation:
cd software/integration
npm install
NPM Scripts:
npm run build:controller # Compile firmware
npm run upload:controller # Upload to ESP32 (/dev/ttyACM0)
npm run monitor # Serial monitor (115200 baud)
npm run dev:controller # Build + Upload + Monitor
npm run serve # Start web server
External Dependencies (CDN):
Loaded via HTML:
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r128/three.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/three@0.128.0/examples/js/controls/OrbitControls.js"></script>
No Build Process Required: All code is vanilla JavaScript.
Components Needed:
Wiring:
I2C Bus (shared):
I2S Audio:
SPI (SD Card):
Servos:
A. Install Arduino IDE and ESP32 Support
- Open File > Preferences and add the boards manager URL: https://espressif.github.io/arduino-esp32/package_esp32_index.json
- Install the ESP32 platform via Tools > Board > Boards Manager

B. Install Required Libraries

- Install the libraries listed above via Tools > Manage Libraries

C. Enable DMP Support (Critical!)
# Linux/Mac
nano ~/Arduino/libraries/SparkFun_9DoF_IMU_Breakout_-_ICM_20948_-_Arduino_Library/src/util/ICM_20948_C.h
# Uncomment line 29:
#define ICM_20948_USE_DMP
D. Configure Firmware
- Open software/integration/controller/controller.ino
- Create bluetooth-name.h:

cd software/integration/controller
cp bluetooth-name.h.example bluetooth-name.h
# Edit bluetooth-name.h to customize BLE name
E. Compile and Upload
- Select Tools > Board > esp32 > XIAO_ESP32S3
- Select Tools > Port > /dev/ttyACM0 (or your port)

Verify Success:
A. Local Development
cd software/integration/server
npx http-server -p 8080
Open browser to http://localhost:8080
B. Production Deployment (GitLab Pages)
Already deployed at:
https://classes.pages.cba.mit.edu/863.25/CBA/cba-machine/index.html
No additional setup needed for production use.
A. Prepare SD Card
Create an /audio/ directory on the card and copy WAV files into it:

/audio/audio_0000_intro.wav
/audio/audio_0001_beep.wav
/audio/audio_0042_melody.wav
Requirements:
B. Insert SD Card
Insert SD card into ESP32 SD reader (CS on Pin 21)
1. Power On System
2. Connect from Browser
3. Observe IMU Visualization
4. Calibrate Orientation
5. Control Servos
Manual Control:
- Entering 1,5,10 extends faces 1, 5, and 10

Command Sequencer:
6. Play Audio
- Send the play_audio command with an audio ID
- Example: 42 plays /audio/audio_0042*.wav

Purpose: Create repeatable demonstrations with precise timing
Steps:
Example Sequence:
move_servo 1,2,3 2000 # Move faces 1-3, wait 2s
move_servo 4,5,6 2000 # Move faces 4-6, wait 2s
reset 0 # Restart device
Import/Export:
- Sequences can be saved to and loaded from .txt files

File: voice.js
Some voice command functionality may be present. See code comments for details.
The system automatically detects:
Use Case: Programmatic selection of faces to actuate based on orientation and motion.
Problem: "ESP32 not found" during connection
Problem: IMU data not streaming
Problem: Servos not responding
Problem: Audio playback silent or distorted
- Check volumeGain in lib-07-audio.ino

Problem: BLE disconnects frequently
The following aspects require clarification for proper documentation:
Servo Power Supply Specifications
Physical Mounting
PCB Design
Servo Calibration
Face Numbering Convention
IMU Mounting Orientation
- Is the mounting orientation assumed in brains.js correct?

Movement Vector
Audio File Naming
Command Timing
- What is the right delay for the move_servo command? (Currently 2000ms)

Testing Procedures
Safety Features
Power Management
Deployment Environment
Version History
Original Design Intent
Known Issues
Future Plans
Face Neighbors Data
- face_neighbors.json?

Example Sequences
- example_sequence_17_18.txt?

The skeleton of this documentation was generated with the following prompt:
Carefully analyze the #file:integration folder to understand how the system works. Then produce a detailed documentation in #file:README.md. Make sure to organize the documentation based on system components:
- IMU Sensor
- Servo
- Networking
- Audio
- Main loop
Also, document the libraries we used (e.g. IMU, Servo, Audio, BLE).
The goal is to make sure the project can be reproduced by future engineers.
During documentation process, collect a list of questions that are unclear from the code artifacts. We will ask people to provide more information.