
Quick Start

Get your first AI model trained and deployed to a ROS 2 robot in under 10 minutes.


This guide assumes you have an Artemis account and a ROS 2 environment available. If not, see the Introduction for setup steps.

1 Install the Artemis Agent

Run the following on your robot's host machine (Ubuntu 22.04 recommended):

bash
pip install kairo-agent

Verify the install:

bash
kairo --version
# kairo-agent 1.4.2

2 Authenticate

Generate an API key from Dashboard → Settings → API Keys, then authenticate:

bash
kairo auth login --key kai_live_xxxxxxxxxxxx

Your credentials are stored at ~/.kairo/credentials.
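
If you script against the API directly, you can read the stored key back out of the credentials file. The format below is an assumption (a simple INI-style file with a [default] section and an api_key entry); inspect your actual ~/.kairo/credentials file, as the real layout may differ:

```python
# Sketch: read the stored API key for use in your own scripts.
# ASSUMPTION: an INI-style file with a [default] section and an api_key
# entry -- check your actual credentials file, the real format may differ.
import configparser
import tempfile
from pathlib import Path

def load_kairo_key(path: str = "~/.kairo/credentials") -> str:
    cfg = configparser.ConfigParser()
    cfg.read(Path(path).expanduser())
    return cfg["default"]["api_key"]

# Demo against a throwaway file so the sketch is self-contained:
with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write("[default]\napi_key = kai_live_example\n")

key = load_kairo_key(f.name)
print(key)  # kai_live_example
```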

3 Register Your Robot

Register the current machine as a robot in your Artemis fleet:

bash
kairo robot register \
  --name "my-arm-01" \
  --type manipulator \
  --ros-distro humble

You'll receive a Robot ID — note it for later steps.

bash
# Output:
# ✔ Robot registered: rob_a1b2c3d4
#   Name:     my-arm-01
#   Type:     manipulator
#   Status:   online
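
If you register robots from a script, you can capture the Robot ID from the command's output. A minimal sketch, using the sample output above (the rob_ ID pattern is inferred from that sample; adjust the regex if your IDs differ):

```python
import re

# Sample output copied from the registration step above.
sample_output = """\
✔ Robot registered: rob_a1b2c3d4
  Name:     my-arm-01
  Type:     manipulator
  Status:   online
"""

def extract_robot_id(output: str) -> str:
    # Robot IDs in the sample look like rob_<alphanumeric>.
    match = re.search(r"\brob_[0-9a-z]+\b", output)
    if not match:
        raise ValueError("no robot ID found in output")
    return match.group(0)

robot_id = extract_robot_id(sample_output)
print(robot_id)  # rob_a1b2c3d4
```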

4 Upload Training Data

Upload a ROS bag file or CSV dataset. Artemis accepts most common robotics data formats:

bash
kairo data upload ./my_dataset.bag \
  --robot rob_a1b2c3d4 \
  --label "pick-and-place v1"

Larger datasets (>500 MB) are chunked and uploaded in parallel automatically. You can monitor upload progress in the dashboard under Data → Uploads. Note the dataset ID returned by the upload (e.g. ds_xyz123); you'll need it to start training.

5 Train a Model

Trigger a training run from the CLI or the dashboard. Artemis auto-selects the best architecture for your data type:

bash
kairo train start \
  --dataset ds_xyz123 \
  --robot rob_a1b2c3d4 \
  --task manipulation

Track training progress in real time, passing the run ID reported when training started:

bash
kairo train logs --run run_789abc --follow

# [00:01] Preprocessing dataset...
# [00:45] Training epoch 1/20 — loss: 0.412
# [01:30] Training epoch 5/20 — loss: 0.189
# [03:22] Training complete. Accuracy: 94.3%
# [03:25] Model packaged as kairo-model-v1.ros2
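
If you want loss values programmatically rather than eyeballing the stream, the log lines can be parsed with a small regex. A sketch built on the sample output above (the log format is taken from that sample, so adjust the pattern if your logs differ):

```python
import re

# Log lines copied from the sample training output above.
log = """\
[00:01] Preprocessing dataset...
[00:45] Training epoch 1/20 — loss: 0.412
[01:30] Training epoch 5/20 — loss: 0.189
[03:22] Training complete. Accuracy: 94.3%
"""

# Map epoch number -> loss for every epoch line in the stream.
pattern = re.compile(r"epoch (\d+)/\d+ — loss: ([0-9.]+)")
losses = {int(m.group(1)): float(m.group(2)) for m in pattern.finditer(log)}
print(losses)  # {1: 0.412, 5: 0.189}
```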

6 Deploy to Your Robot

Push the trained model to your robot as a ROS 2 node, referencing it by its model ID:

bash
kairo deploy push \
  --model mdl_abc123 \
  --robot rob_a1b2c3d4 \
  --topic /kairo/inference

Verify the deployment is running:

bash
kairo deploy status --robot rob_a1b2c3d4

# Robot:    my-arm-01
# Model:    kairo-model-v1
# Status:   ✔ running
# Uptime:   2m 14s
# Topic:    /kairo/inference

Your model is now publishing inference results on the /kairo/inference topic. Inspect it from the command line with ros2 topic echo /kairo/inference, or subscribe to it from any ROS 2 node.

What's Next?
