Reachy's Awakening
Your first head tracking using arm kinematics
Introduction
By now, you’ve learned the basic behaviours available with reachy2_sdk, but how to use them to build your own program may still seem a little abstract.
That’s why we’ve created a series of practical tutorials to guide you step by step through thinking about and building different programs, helping you understand the mechanics of creating your own behaviours!
For the moment, three tutorials are available, ranging from the simplest to the most complex, to help you get to grips with different Reachy functions.
The first two use only the SDK Client, while the last one adds object detection with an AI model from pollen-vision.
You will find the GitHub repository there. To follow the third tutorial with pollen-vision, you need to install the library in your virtual environment:
pip install --user --no-warn-script-location "pollen-vision[vision] @ git+https://github.com/pollen-robotics/pollen-vision.git@develop"
pip install depthai
Now that you’re all set up, have fun!
Reachy’s Awakening
In this tutorial, we will make Reachy perform an awakening sequence: a series of movements that makes it look like it is waking up. It involves moving its head and arms, and can serve as a starting point for more complex sequences.
What you will learn:
- How to make Reachy move its head
- How to make Reachy move its arms
- How to synchronize head and arm movements
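
The last point, running head and arm motions at the same time and waiting for both to finish, can be sketched in plain Python with threads. This is only an illustrative sketch: `move_head` and `move_arm` are hypothetical stand-ins for the SDK's movement calls, not part of reachy2_sdk.

```python
import threading
import time

def move_head(log):
    # Hypothetical placeholder for a head movement command.
    time.sleep(0.1)  # stands in for the duration of the motion
    log.append("head done")

def move_arm(log):
    # Hypothetical placeholder for an arm movement command.
    time.sleep(0.1)
    log.append("arm done")

def awake_sequence():
    """Run head and arm movements concurrently, then wait for both."""
    log = []
    head = threading.Thread(target=move_head, args=(log,))
    arm = threading.Thread(target=move_arm, args=(log,))
    head.start()
    arm.start()
    # join() blocks until each motion has completed, so the sequence
    # only continues once head and arm are both done.
    head.join()
    arm.join()
    return log

print(sorted(awake_sequence()))
```

The same pattern applies with real SDK calls: start both motions, then block until each reports completion before moving on to the next step of the sequence.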
