After assembling the robot and working through the debugging in Part 2, the moment finally came.

Power on.

Motors twitch.

Antennas move.

And then… a small cheerful “phew phew” whistle.

Reachy didn’t speak words.

But in that moment it absolutely felt like a greeting.

Kia ora.

The First Boot — Kia Ora Aotearoa

The first time Reachy powered up properly, it made a small whistle-like sound while its motors calibrated.

It isn’t speech. It’s just a sound effect the robot makes as it wakes up.

But when you hear it in New Zealand, it feels like a greeting.

A robot waking up and greeting the world with a little electronic whistle.

Close enough to “Kia ora” for me.


Reachy Learns to Dance

Once everything was working, I started exploring some of the behaviours.

Reachy Mini is an open-source robot designed for experimenting with human-robot interaction and AI behaviours. It includes a camera, microphones, a speaker, and expressive movement through its head and antennas.

You can read more about the robot here:
https://www.pollen-robotics.com/reachy-mini/

The movement is where the personality shows up.

Head tilts.

Body turns.

Antennas wiggle.

And occasionally… a little dance.

For something that mostly rotates and moves its head, the amount of personality it can express is surprisingly high.

Reachy’s head has six degrees of freedom of movement, plus animated antennas, which allows for expressive gestures and body language.
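To give a feel for how that expressiveness might be driven in code, here is a minimal, pure-Python sketch that generates keyframes for an antenna wiggle. This is my own illustrative example, not the actual Reachy SDK: the `antenna_wiggle` function, the degree units, and the 50 Hz frame rate are all assumptions, and the frames would still need to be streamed to the robot via the real API.

```python
import math

def antenna_wiggle(duration_s=2.0, rate_hz=50, freq_hz=2.0, amplitude_deg=30.0):
    """Generate (time, left_deg, right_deg) keyframes for a sinusoidal
    antenna wiggle. The antennas move in opposite phase, which tends to
    read as livelier than moving them together."""
    frames = []
    steps = int(duration_s * rate_hz)
    for i in range(steps):
        t = i / rate_hz
        angle = amplitude_deg * math.sin(2 * math.pi * freq_hz * t)
        frames.append((t, angle, -angle))  # opposite phase
    return frames

frames = antenna_wiggle()
# Each frame could then be sent to the motors at the chosen rate.
```

Even a trajectory this simple, played on physical antennas, is enough to make the robot look like it is reacting to you.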


The Moment It Starts Feeling Alive

The strangest moment wasn’t the whistle.

And it wasn’t the dance either.

It was this.

Something about the small movements — the pauses, the slight head tilts, the antenna wiggles — makes the robot feel less like a machine and more like a curious creature.

Not human.

Not animal.

Something in between.

A bit like a robotic bird that has just landed in the room and is trying to figure out where it is.

Movement alone creates personality.


My Thoughts on the Build Quality

Having assembled the kit myself, I'd say the build quality is surprisingly solid.

The robot is small (about 28 cm tall) but the parts feel well engineered rather than toy-like. The motors move smoothly, the head movements are precise, and the joints feel sturdy rather than fragile.

The assembly process also gives you a good understanding of how the robot works internally. Instead of a sealed gadget, you actually see how everything fits together.

Inside the robot there’s quite a bit going on:

  • wide-angle camera
  • microphone array
  • 5W speaker
  • articulated head with six degrees of freedom
  • animated antennas for expression
  • full body rotation

These sensors and actuators are what let you experiment with vision, sound and AI behaviours.

One thing I do wish it had, though: movement.

Right now Reachy rotates and moves its head beautifully, but it stays in one spot. I keep imagining how fun it would be if it had little wheels — or even legs — so it could roam around the house.

Maybe that’s Reachy Mini v2.


Apps and the Open Robot Ecosystem

Another interesting part of the system is the software.

Reachy is programmable in Python and connects to the wider AI ecosystem through Hugging Face, where developers share robot behaviours and applications.

That means you can run or build things like:

  • face tracking
  • AI conversation agents
  • emotional reactions
  • dancing behaviours
  • computer vision experiments

Instead of a closed gadget with fixed features, Reachy feels more like a tiny robotics platform for experimentation.
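The behaviours in that list tend to follow the same shape: perceive an event, pick a reaction, play it. Here is a toy sketch of that pattern in plain Python. To be clear, this is not the real Reachy SDK: `ToyReachy`, the event names, and the behaviour names are all hypothetical stand-ins I made up to show the structure.

```python
class ToyReachy:
    """Stand-in for a robot interface; records actions instead of moving motors."""
    def __init__(self):
        self.log = []

    def play(self, behaviour):
        self.log.append(behaviour)

# Map perceived events to named reactions (all names illustrative).
BEHAVIOURS = {
    "face_detected": "look_at_face",
    "loud_sound": "startle",
    "idle_timeout": "dance",
}

def react(robot, event):
    # Fall back to a gentle idle animation for anything unrecognised.
    robot.play(BEHAVIOURS.get(event, "idle_wiggle"))

robot = ToyReachy()
for event in ["face_detected", "loud_sound", "unknown"]:
    react(robot, event)
# robot.log is now ["look_at_face", "startle", "idle_wiggle"]
```

Swap the stub for the real robot interface and the dictionary for shared Hugging Face behaviours, and you have the skeleton of a Reachy app.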


Why This Matters

For years, AI has lived inside screens.

Chatbots. Apps. APIs.

But when AI has movement and physical presence, the experience changes.

A chatbot answers.

A robot tilts its head, whistles, and looks around the room.

And suddenly it feels like something is present, not just responding.

Even if the greeting is just a small electronic whistle.

Kia ora.


What I Want to Try Next

Now that Reachy is alive, the real experiments begin.

Things I’m curious about exploring next:

  • connecting local AI models to control behaviours
  • giving Reachy a persistent personality
  • experimenting with face tracking and reactions
  • building custom apps with the Python SDK

In other words, seeing how far we can push embodied AI here in Aotearoa.

Part 4 coming soon.


Written for KiwiGPT.co.nz — Generated, Published and Tinkered with AI by a Kiwi