With the Grab-it robot arm from JOY-IT, hobbyists, school pupils, and students can gain their first experience of Python programming and complete simple projects using the Raspberry Pi or Arduino.
The first part of this article focused on assembling the robot arm, calibrating the servo motors and programming the robot arm’s initial movements. This second part provides developers with short examples and ideas for extensions that allow them to make even better use of Grab-it.
Creative projects & applications
Required hardware:
- Grab-it robot arm kit from JOY-IT, including accessories such as Raspberry Pi and Moto Pi
Recommended additional accessories:
- Arduino UNO R3
- JOY-IT Motorino: Motor control for Arduino
- HDMI cable
- Raspberry Pi USB-C to USB cable with switch
- Cable set for breadboards
Creative project – stacking components
For the first creative project, program the robot arm to grasp the building blocks, which were manufactured using a 3D printer, and stack them in the provided box (figure 1).

To do this, the following steps must first be carried out, as shown in Part 1:
- Set up the Raspberry Pi with Moto Pi, or alternatively an Arduino Uno with Motorino, and connect it to the robot arm
- Install the Raspberry Pi OS
- Add the required libraries and calibrate the servo motors
The programme for stacking components can then be run on the Raspberry Pi.

Below is an excerpt from the code that makes Grab-it perform movements via the Raspberry Pi in conjunction with the Moto Pi.
"programs": {
"Wuerfel stapeln": [
{
"M1": 1.5,
"M2": 1.35,
"M3": 1.486,
"M4": 1.45,
"M5": 1.5,
"M6": 1.1
},
{
"M1": 2.126,
"M2": 1.425,
"M3": 0.942,
"M4": 1.39,
"M5": 2.1,
"M6": 1.1
},
{
"M1": 2.123,
"M2": 0.774,
"M3": 0.942,
"M4": 1.122,
"M5": 2.1,
"M6": 1.1
},
{
"M1": 2.112,
"M2": 0.786,
"M3": 1.05,
"M4": 1.082,
"M5": 2.1,
"M6": 1.1
},
{
"M1": 2.112,
"M2": 0.663,
"M3": 1.05,
"M4": 1.082,
"M5": 2.1,
"M6": 1.1
},
{
"M1": 2.112,
"M2": 0.663,
"M3": 1.05,
"M4": 1.082,
"M5": 2.1,
"M6": 0.872
},
{
"M1": 2.112,
"M2": 0.876,
"M3": 1.05,
"M4": 1.082,
"M5": 2.1,
"M6": 0.872
},
{
"M1": 2.112,
"M2": 1.524,
"M3": 0.71,
"M4": 1.082,
"M5": 1.125,
"M6": 0.872
},
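Such a stored sequence can be replayed with a simple loop that steps through the poses one by one. The sketch below is illustrative rather than the original program: play_program and the move_servo callback are assumed names, and the actual pulse output depends on the servo driver in use.

```python
import time

def play_program(programs, name, move_servo, delay=1.0):
    """Replay a stored movement program pose by pose.

    programs:   dict as in the JSON excerpt (the "programs" object)
    name:       program name, e.g. "Wuerfel stapeln"
    move_servo: callback taking (motor_name, pulse); hardware-specific
    delay:      pause in seconds between poses so the servos can settle
    """
    for pose in programs[name]:
        for motor, pulse in pose.items():
            move_servo(motor, pulse)
        time.sleep(delay)
```

With the Moto Pi, the move_servo callback would forward each pulse value to the corresponding servo channel; for testing, a simple print function works just as well.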
Grab-it goes IoT
Control via web interface and app
The next step is to prepare the Grab-it for the Internet of Things (IoT). As in a smart home, the robot arm will be controlled via a web interface: the popular Flask web framework acts as the bridge between the HTML front end and the motor control, so the arm can be operated conveniently from a browser.
Besides control from a PC, the HTML interface also works on a smartphone or in an app. When used with a Raspberry Pi, the Grab-it can easily be controlled from a smartphone. The idea behind this is as follows:
- The Raspberry Pi controls the Grab-it via the PCA9685 servo driver. The PCA9685 is a component that can control up to 16 servo motors simultaneously over just two wires (the I2C lines) from the Raspberry Pi, because it generates the required PWM signals itself, thus relieving the Pi.
- A Flask web server runs on the Raspberry Pi. This is a lightweight web framework for Python that can be used to quickly create simple web applications or interfaces that enable devices such as the Grab-it to be controlled via a browser.
- The arm can be conveniently controlled via a browser interface (on a PC or smartphone).
First, the system must be prepared for installation. Then, the servo control is implemented in Python, and the Flask web server and HTML interface are created.
1. Prepare the system:
- Activate I2C:
sudo raspi-config
sudo reboot
- Install the SMBus driver packages:
sudo apt update
sudo apt install python3-flask python3-smbus i2c-tools
- Check whether the PCA9685 has been found:
sudo i2cdetect -y 1
2. Prepare servo control:
- Create a file called pca9685_control.py. This file encapsulates the control of the Grab-it (see code snippet):
# pca9685_control.py
import smbus
import time
import math

PCA9685_ADDR = 0x40
bus = smbus.SMBus(1)

MODE1 = 0x00
PRESCALE = 0xFE
LED0_ON_L = 0x06

def set_pwm_freq(freq_hz=50):
    prescaleval = 25000000.0 / (4096.0 * freq_hz) - 1.0
    prescale = int(math.floor(prescaleval + 0.5))
    oldmode = bus.read_byte_data(PCA9685_ADDR, MODE1)
    newmode = (oldmode & 0x7F) | 0x10  # sleep mode: prescale can only be set while asleep
    bus.write_byte_data(PCA9685_ADDR, MODE1, newmode)
    bus.write_byte_data(PCA9685_ADDR, PRESCALE, prescale)
    bus.write_byte_data(PCA9685_ADDR, MODE1, oldmode)
    time.sleep(0.005)
    bus.write_byte_data(PCA9685_ADDR, MODE1, oldmode | 0xA1)  # restart with auto-increment

def set_pwm(channel, on, off):
    bus.write_byte_data(PCA9685_ADDR, LED0_ON_L + 4 * channel, on & 0xFF)
    bus.write_byte_data(PCA9685_ADDR, LED0_ON_L + 4 * channel + 1, on >> 8)
    bus.write_byte_data(PCA9685_ADDR, LED0_ON_L + 4 * channel + 2, off & 0xFF)
    bus.write_byte_data(PCA9685_ADDR, LED0_ON_L + 4 * channel + 3, off >> 8)

def angle_to_pwm(angle):
    pulse_min = 150  # adjust to your servos if necessary
    pulse_max = 600
    return int(pulse_min + (pulse_max - pulse_min) * angle / 180)

def set_servo_angle(channel, angle):
    angle = max(0, min(180, angle))
    pwm_val = angle_to_pwm(angle)
    set_pwm(channel, 0, pwm_val)

# Example channels (adjust if necessary)
BASE = 0
SHOULDER = 1
ELBOW = 2
WRIST = 3
GRIPPER = 4

# Home position (optional)
def home_position():
    set_servo_angle(BASE, 90)
    set_servo_angle(SHOULDER, 90)
    set_servo_angle(ELBOW, 90)
    set_servo_angle(WRIST, 90)
    set_servo_angle(GRIPPER, 60)

# Set the PWM frequency once on import
set_pwm_freq(50)
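The angle-to-tick conversion in angle_to_pwm can be sanity-checked with a little arithmetic: at 50 Hz one PWM period lasts 20 ms and is divided into 4096 PCA9685 ticks, so the default limits of 150 and 600 ticks correspond to pulses of roughly 0.73 ms and 2.93 ms. The sketch below reproduces the mapping for checking; ticks_to_ms is a helper added here purely for illustration.

```python
def angle_to_pwm(angle, pulse_min=150, pulse_max=600):
    # Linear map: 0 deg -> pulse_min ticks, 180 deg -> pulse_max ticks
    return int(pulse_min + (pulse_max - pulse_min) * angle / 180)

def ticks_to_ms(ticks, freq_hz=50):
    # One period at freq_hz lasts 1000/freq_hz ms, split into 4096 ticks
    return ticks * (1000.0 / freq_hz) / 4096

print(angle_to_pwm(90))                 # -> 375 (mid position)
print(round(ticks_to_ms(150), 2))       # -> 0.73 ms
print(round(ticks_to_ms(600), 2))       # -> 2.93 ms
```

If your servos expect the common 1.0 to 2.0 ms range, tighten pulse_min and pulse_max accordingly before driving the arm.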
3. Create Flask web server:
- The app.py file enables control from Flask (figure 3).

4. Create the HTML web interface:
- Create a folder called templates next to app.py and create the index.html file inside it (figure 4).

5. JavaScript is also stored:
- The JavaScript activates the button functions (figure 5).
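The listings from figures 3 to 5 are not reproduced here, so the following condensed sketch shows how the three pieces interact: the Flask routes of app.py, a minimal HTML page and the JavaScript button handler, folded into one file via render_template_string. The /move route and the set_servo_angle stub are assumptions of this sketch; on the Pi you would import the real driver from step 2 instead.

```python
# Minimal sketch of the Flask control server (not the article's full code).
from flask import Flask, request, jsonify, render_template_string

# On the Pi you would use the real driver from step 2:
#   from pca9685_control import set_servo_angle
# Stub so the sketch also runs without hardware:
def set_servo_angle(channel, angle):
    print(f"servo {channel} -> {angle} deg")

app = Flask(__name__)

PAGE = """
<!doctype html>
<title>Grab-it</title>
<button onclick="move(0, 45)">Base left</button>
<button onclick="move(0, 135)">Base right</button>
<script>
function move(servo, angle) {
  // The JavaScript side: each button fires a POST request to the Pi
  fetch('/move', {method: 'POST',
                  headers: {'Content-Type': 'application/json'},
                  body: JSON.stringify({servo: servo, angle: angle})});
}
</script>
"""

@app.route("/")
def index():
    return render_template_string(PAGE)

@app.route("/move", methods=["POST"])
def move():
    data = request.get_json()
    set_servo_angle(int(data["servo"]), int(data["angle"]))
    return jsonify(ok=True)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

In the article's layout the HTML lives in templates/index.html and the JavaScript in a separate file; merging them here just keeps the sketch self-contained.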

6. The server can now be started and controlled in the browser. To do this, simply execute this command in the folder with app.py:
python3 app.py
Then open the browser on a PC or smartphone connected to the same Wi-Fi network to access the Raspberry Pi:
e.g. http://<YOUR_PI_IP>:5000, i.e. http://192.168.0.23:5000
The Grab-it can then be easily and conveniently controlled via the web interface on the PC or smartphone, just like a smart home device (figure 6).

You can also choose whether to start the Flask app manually or automatically at boot. This is configured not in the code, but in the Raspberry Pi system.
1. Manual start:
cd /home/pi/grab-it-web
python3 app.py
2. Autostart:
Give app.py execute permissions: sudo chmod +x app.py
Create the service file: sudo nano /etc/systemd/system/roboterarm.service
Content:
[Unit]
Description=Robot arm control app
After=network.target
[Service]
ExecStart=/usr/bin/python3 /home/pi/V3/app.py
WorkingDirectory=/home/pi/V3/
User=root
# Set Restart to 'always' so the service restarts after a crash
Restart=always
# Optional: short delay before each restart
RestartSec=5
[Install]
WantedBy=multi-user.target
Reload the systemd daemon: sudo systemctl daemon-reload
Enable autostart: sudo systemctl enable roboterarm.service
Control via Xbox Controller
As well as controlling the robot arm via a smartphone or PC, it is also possible to operate it with a joystick or controller. An Xbox controller was connected to the Raspberry Pi via a USB interface. The arm can then be conveniently controlled via the controller using the Flask server that is already installed.
- Install the driver package for the Xbox controller: sudo apt install python3-evdev
- Install the Python script XboxController.py (figure 7).
This allows the Grab-it to be quickly and easily moved to any desired position using the controller.

The following code excerpt shows how to activate the Xbox controller:
import threading
import time
from evdev import InputDevice, categorize, ecodes, list_devices

# Import the robot arm control - new import method
try:
    import roboter_arm_steuerung as RAS
    ARM_CONTROLLER = RAS.ARM_CONTROLLER
    MIN_MAX_PULSES = RAS.MIN_MAX_PULSES
except ImportError:
    print("ERROR: roboter_arm_steuerung.py could not be imported.")
    ARM_CONTROLLER = None

# -------------------------------------------------
# Controller maps & settings
# -------------------------------------------------
SENSITIVITY_FACTOR = 0.005        # pulse change per tick for M1-M4
SENSITIVITY_FACTOR_M5_M6 = 0.025  # NEW: pulse change per tick for M5-M6 (5x faster)
DEADZONE = 0.15                   # stick drift filter

# Axes
ABS_X_LEFT = ecodes.ABS_X
ABS_Y_LEFT = ecodes.ABS_Y
ABS_X_RIGHT = ecodes.ABS_RX
ABS_Y_RIGHT = ecodes.ABS_RY
ABS_LT_TRIGGER = ecodes.ABS_Z
ABS_RT_TRIGGER = ecodes.ABS_RZ

# Buttons
BTN_START = ecodes.BTN_START
BTN_BACK = ecodes.BTN_SELECT
BTN_X = ecodes.BTN_SOUTH
BTN_Y = ecodes.BTN_NORTH
BTN_LB = ecodes.BTN_TL  # NEW
BTN_RB = ecodes.BTN_TR  # NEW

# Motor mapping
AXIS_MAPPING = {
    ABS_X_LEFT: 'Motor1',
    ABS_Y_LEFT: 'Motor2',
    ABS_Y_RIGHT: 'Motor3',
    ABS_X_RIGHT: 'Motor4',
    ABS_LT_TRIGGER: 'Motor6',  # gripper (open)
    ABS_RT_TRIGGER: 'Motor6',  # gripper (close)
}
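The DEADZONE and SENSITIVITY_FACTOR constants above suggest how the script turns stick input into pulse changes. The sketch below illustrates that logic; normalize_axis and step_pulse are illustrative helpers rather than functions from the article's script, and the 0.5 to 2.5 ms limits stand in for the MIN_MAX_PULSES values not shown here.

```python
def normalize_axis(raw, lo=-32768, hi=32767, deadzone=0.15):
    """Map a raw evdev stick value to -1..1 and filter out stick drift."""
    value = (2.0 * (raw - lo) / (hi - lo)) - 1.0
    return 0.0 if abs(value) < deadzone else value

def step_pulse(pulse, axis_value, factor=0.005, lo=0.5, hi=2.5):
    """Nudge a servo pulse (in ms) by the stick deflection, clamped to limits."""
    return max(lo, min(hi, pulse + axis_value * factor))
```

Called once per event-loop tick, a fully deflected stick changes the pulse by factor per tick, which is why the gripper motors get their own, five times larger sensitivity constant.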
Extensions & modifications
If you want to get even more out of your Grab-it, you can significantly improve the robot arm with targeted mechanical and software optimisations.
Mechanical upgrades for the Grab-it from JOY-IT
The Grab-it robot arm can be adapted and optimised with simple mechanical upgrades. Particularly effective upgrades include gripper attachments, extended arms and camera holders, which can be produced with a 3D printer.
Different gripper shapes, such as rubberised tips for round objects or wide jaws for flat parts, improve versatility and precision when gripping. Extended arm segments increase the working radius, but should be paired with more stable servos or lightweight materials such as carbon to prevent vibrations.
Mounting a camera above the gripper enables AI vision or colour recognition via OpenCV. To improve stability, replace wobbly plastic connections with metal joints or ball bearings. To connect a camera to the Grab-it, use this code to check that the camera is correctly integrated; it can then be used for colour or object recognition:
# camera_test.py
import cv2

# 0 = first camera (USB webcam or PiCam via /dev/video0)
cap = cv2.VideoCapture(0)

if not cap.isOpened():
    print("Camera could not be opened!")
    exit()

while True:
    ret, frame = cap.read()
    if not ret:
        print("No camera image received!")
        break

    # Show the live image
    cv2.imshow("Grab-it camera - live image", frame)

    # Press ESC to quit
    if cv2.waitKey(1) == 27:
        break

cap.release()
cv2.destroyAllWindows()
The servo mount can be reinforced or dampened to minimise vibrations. These modifications make the Grab-it more precise, stable and versatile, making it perfect for ambitious maker projects.
Software Upgrades for the Grab-it – More Intelligence with OpenCV
Targeted software upgrades make the Grab-it robot arm from JOY-IT significantly smarter and more autonomous. Integrating OpenCV is particularly effective for implementing colour, shape or object recognition using a USB or Raspberry Pi camera.
Simple OpenCV functions enable the robot to analyse live video data, filter colour ranges and determine the positions of objects. When combined with a servo driver, for instance, the robot can automatically approach and grasp recognised objects. Simple filters suffice for colour recognition, while contour recognition or AI-based models such as TensorFlow Lite deliver more precise results.
Example (object recognition with OpenCV):
import cv2

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Filter a blue colour range in HSV space
    mask = cv2.inRange(hsv, (100, 150, 0), (140, 255, 255))
    cv2.imshow("Detection", mask)
    if cv2.waitKey(1) == 27:  # ESC to quit
        break
cap.release()
cv2.destroyAllWindows()
Another upgrade is the addition of motion profiles that react flexibly based on camera data. Additionally, an API or Flask web interface can be used to accept commands from web or smartphone applications. These software enhancements make Grab-it an adaptive system that actively ‘perceives’ and reacts to tasks.
Community & resources
Although there are only a few dedicated projects on GitHub, the maker community surrounding the JOY-IT Grab-it robot arm provides valuable resources. GitHub has a small selection of repositories, including sample sketches for integrating OpenCV or Flask web interfaces.
It is also worth taking a look at platforms such as Thingiverse or MyMiniFactory for STL files for customised grippers or camera holders. Maker forums and DIY blogs, such as Reddit threads and Instagram stories, offer further insights and tips on topics such as improving the mechanics, fine-tuning the servos, and integrating a camera.
Although there is currently no large, central Grab-it community, you can find individual projects and tutorials by searching for keywords such as ‘JOY-IT Grab-it’. Use these resources as a starting point, adapt them to your setup and document your own development to become part of the growing community.
Grab-it in action and future prospects
The second part of the JOY-IT article on Grab-it shows how initial projects can be implemented using the robot arm. Grab-it has also been made ‘IoT-enabled’ and can be controlled via a controller, smartphone or PC.
As the community continues to grow, GitHub and other developer forums offer numerous projects, upgrades and other resources that can be used to expand Grab-it continuously, make it AI-capable and further develop it.
Images: Adobe Stock, reichelt elektronik
To the first part of the article:
Setup and Basics: Enter the world of robotics with Grab-it – Part 1
