Robohub.org
 

Hello Pepper: Getting started programming robots on Android


by Margaret Maynard-Reid
03 August 2016



The Future of Social Robotics, by Nicolas Rigaud. Source: Margaret Maynard-Reid/Medium

I didn’t get to meet Pepper the humanoid robot at Google I/O, but I watched the video afterwards: A new development frontier: Android + Pepper the interactive robot. I loved the robots’ dance! I was super excited to hear that Pepper will become available in the U.S. later this year, and that Android developers can now program robots!

A few weeks after I/O, I attended a Seattle Java User Group meetup. That evening I met Pepper in person, and I learned about the Future of Social Robotics from Nicolas Rigaud (slides here).

Equipped with microphones, cameras, sensors and the intelligence to perceive emotions, Pepper was designed for social interaction. He has an Android tablet on his chest, which is very convenient for displaying images, video and text. Pepper is 4 feet (1.2 meters) tall and can connect to the network over either Wi-Fi or Ethernet. Pepper is multilingual, speaking English, French, Japanese and Chinese.

Source: Margaret Maynard-Reid/Medium

I wanted to learn more about Pepper, so I followed the instructions from the Pepper SDK for Android and was quickly able to get set up and create my first robot application with Android Studio.

Since I already had the JDK and Android Studio installed, all I had to do was install the Pepper SDK plugin before creating my first “Hello World” robot application. You can follow the official Getting Started guide for detailed instructions; here is a brief summary of the steps I took:

  1. In Android Studio, install the Pepper SDK plugin; the Robot SDK Manager icon then appears on the Android Studio main toolbar.
  2. Click the Robot SDK Manager icon to get the Robot SDK and tools.
  3. Create your first robot application:
  • First create a regular Android application with minSdkVersion set to API 22 (Android 5.1, Lollipop).
  • Then click File > New > Robot Application, and notice these changes:

Two robot-development-related icons in Android Studio become enabled: Emulator and Connect/Disconnect. The fourth icon, Wakeup, is still grayed out since I don’t have a real robot to connect to.

Source: Margaret Maynard-Reid/Medium

The Android project structure gets updated to the Robot Project Structure, and dependencies are automatically added to the build.gradle file:

compile 'com.aldebaran:libqi-java-android:sdk-2016-05-16'
compile 'com.aldebaran:qisdk:0.7'
compile 'com.aldebaran:qichatplayer:1.0.1'
  • I set CPU/ABI to x86 and checked “Use Host GPU” under the AVD options in the Run/Debug Configuration. Make sure to launch the emulator by clicking the Robot Emulator icon, instead of using a virtual device from the Android Studio AVD Manager.
  • When the Robot Emulator launches, the Robot Viewer also gets launched.
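Putting the steps together, the module-level build.gradle ends up looking roughly like this. This is a sketch: the three dependency lines are the ones from the article, while the values in the android block (package name, SDK versions) are typical defaults I assumed for a minimal API 22 project:

```groovy
// Module-level build.gradle (sketch; android block values are assumed defaults)
apply plugin: 'com.android.application'

android {
    compileSdkVersion 22

    defaultConfig {
        applicationId "com.example.hellopepper"  // hypothetical package name
        minSdkVersion 22                         // API 22, Android 5.1 Lollipop
        targetSdkVersion 22
    }
}

dependencies {
    // Added automatically by File > New > Robot Application
    compile 'com.aldebaran:libqi-java-android:sdk-2016-05-16'
    compile 'com.aldebaran:qisdk:0.7'
    compile 'com.aldebaran:qichatplayer:1.0.1'
}
```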

Without an actual robot, I was still able to follow the first three tutorials: Say “Hello, world!”, Go one meter forward and Mimic the elephant.
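To give a flavor of what the “Hello, world!” tutorial looks like in code, here is a sketch of a minimal robot activity in the QiSDK style: the activity registers for robot lifecycle callbacks, and once it gains the robot’s focus it builds and runs a Say action so Pepper speaks. Note that this follows the API of a later QiSDK release; the class and builder names in the 2016 preview SDK (version 0.7 above) may differ, so treat it as illustrative rather than exact. It also only runs inside an Android project with the Pepper SDK, against the emulator or a real robot.

```java
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;

import com.aldebaran.qi.sdk.QiContext;
import com.aldebaran.qi.sdk.QiSDK;
import com.aldebaran.qi.sdk.RobotLifecycleCallbacks;
import com.aldebaran.qi.sdk.builder.SayBuilder;
import com.aldebaran.qi.sdk.object.conversation.Say;

// Sketch of a "Hello, world!" robot activity (QiSDK style; names may
// differ in the 2016 preview SDK).
public class MainActivity extends AppCompatActivity implements RobotLifecycleCallbacks {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Register this activity to receive robot lifecycle callbacks.
        QiSDK.register(this, this);
    }

    @Override
    public void onRobotFocusGained(QiContext qiContext) {
        // Called off the UI thread once the app has the robot's focus:
        // build a Say action and run it so Pepper says the text aloud.
        Say say = SayBuilder.with(qiContext)
                .withText("Hello, world!")
                .build();
        say.run();
    }

    @Override
    public void onRobotFocusLost() {
        // Release any robot resources held by this activity.
    }

    @Override
    public void onRobotFocusRefused(String reason) {
        // The robot focus could not be obtained (e.g. another app holds it).
    }

    @Override
    protected void onDestroy() {
        QiSDK.unregister(this, this);
        super.onDestroy();
    }
}
```

The other two tutorials follow the same pattern, swapping the Say action for a movement or animation action built from the QiContext.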

I will share more as I continue to learn about Pepper. Hopefully, in the near future, we can all make apps that work with Pepper the humanoid robot.





Margaret Maynard-Reid is an Android Developer currently building apps that showcase how we can leverage machine learning.



