ArduinoVision v1.0

The first AI that can see your hardware.

Stop describing your wiring to a chatbot. Point a camera at your breadboard, tell the agent what you want, and watch it write, compile, and upload the code directly to your Arduino.

Example session:

User: "Make this LED blink once per second."
Agent: [analyzing video feed...]
Agent: "I see an Arduino Uno with an LED connected to pin 8. Writing sketch..."
Compiling sketch for arduino:avr:uno...
Uploading to /dev/ttyUSB0...
✓ Upload complete. LED is now blinking.
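A sketch along the lines of what the agent would generate for this request might look like the following (an illustrative sketch, not the agent's literal output; it assumes the LED is wired between pin 8 and GND through a current-limiting resistor, as identified in the session above):

```cpp
// Blink the LED on pin 8 once per second.
const int LED_PIN = 8;

void setup() {
  pinMode(LED_PIN, OUTPUT);   // drive the LED pin as an output
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(500);                 // on for half a second
  digitalWrite(LED_PIN, LOW);
  delay(500);                 // off for half a second -> one blink per second
}
```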

The copy-paste cycle is dead.

ArduinoVision bridges the gap between AI coding assistants and physical hardware by combining real-time video, voice, and direct CLI integration.

👁️

Visual Hardware Understanding

The agent reads your physical wiring through the camera. It identifies board types, pin connections, component placement, and breadboard layouts. No more guessing if you plugged into pin 8 or 9.

Zero-Touch Upload

Invokes arduino-cli directly to compile the sketch and flash it to the board, all within the same conversation.
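Under the hood, this step maps to standard arduino-cli commands, roughly like the following (the `Blink` sketch directory is a hypothetical name; the FQBN and serial port are taken from the example session above and will vary with your board and OS):

```shell
# Compile the sketch in ./Blink for an Arduino Uno
arduino-cli compile --fqbn arduino:avr:uno Blink

# Flash the compiled sketch to the board over its serial port
arduino-cli upload -p /dev/ttyUSB0 --fqbn arduino:avr:uno Blink
```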

🎙️

Real-time Voice

Talk to the agent hands-free while working with your hardware. Keep your hands on the jumper wires, not the keyboard.

🐛

Closed-Loop Debugging

If something doesn't work, describe the problem. The agent sees the board, reads the Serial.println() output, reasons about mismatches between the code's logic and the physical wiring, and pushes a fix instantly.
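For the loop to close, the generated sketch typically reports its state over the serial link so the agent can compare intended behavior against what the camera shows. A minimal, hypothetical instrumentation sketch:

```cpp
// Hypothetical debug instrumentation: report each state change
// over serial so the agent can read it back and compare the
// code's logic against the physical behavior it observes.
const int LED_PIN = 8;

void setup() {
  Serial.begin(9600);          // open the serial link the agent monitors
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  Serial.println("pin 8 HIGH");
  delay(500);
  digitalWrite(LED_PIN, LOW);
  Serial.println("pin 8 LOW");
  delay(500);
}
```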

From components to running code.

Three steps. Zero IDEs.

01

Wire your components

Connect your LEDs, sensors, or other components to the Arduino as you normally would. Physical connections are your only job.

02

Show the camera

Open the session interface and point your phone or webcam at your breadboard. Tell the agent what you want to build in plain English.

03

Watch it work

The agent identifies the wiring, writes the correct Arduino C++, compiles it, and uploads it directly to your board. Done.