
Hello Pepper: Getting started to program robots on Android


by Margaret Maynard-Reid | 03 August 2016



The Future of Social Robotics, by Nicolas Rigaud. Source: Margaret Maynard-Reid/Medium

I didn’t get to meet Pepper the humanoid robot at Google I/O, but I watched the video afterwards: A new development frontier: Android + Pepper the interactive robot. I loved the robot’s dance! I was super excited to hear that Pepper will become available in the U.S. later this year, and that Android developers can now program robots!

A few weeks after I/O, I attended a Seattle Java User Group meetup. That evening I met Pepper in person, and I learned about the Future of Social Robotics from Nicolas Rigaud (slides here).

Equipped with microphones, cameras, sensors, and the intelligence to perceive emotions, Pepper was designed for social interaction. He has an Android tablet on his chest, which is very convenient for displaying images, video, and text. Pepper is 4 feet (1.2 meters) tall and can connect to the network over either Wi-Fi or Ethernet. Pepper is multilingual, speaking English, French, Japanese, and Chinese.

Source: Margaret Maynard-Reid/Medium

I wanted to learn more about Pepper, so I followed the instructions from the Pepper SDK for Android and was able to quickly get set up and create my first robot application with Android Studio.

Since I already had the JDK and Android Studio installed, all I had to do was install the Pepper SDK plugin before creating my first “Hello World” robot application. You can follow the official Getting Started guide for detailed instructions; here is a brief summary of the steps I took:

  1. In Android Studio, install the Pepper SDK plugin; the Robot SDK Manager icon then appears on the Android Studio main toolbar.
  2. Click the Robot SDK Manager icon to download the Robot SDK and tools.
  3. Create the first robot application:
  • First create a regular Android application with minSdkVersion = API 22 (Android 5.1 Lollipop).
  • Then click File > New > Robot Application, and notice these changes:

Two robot-development-related icons in Android Studio become enabled: Emulator and Connect/Disconnect. The fourth icon, Wakeup, is still grayed out since I don’t have a real robot to connect to.

Source: Margaret Maynard-Reid/Medium

The Android project structure gets updated to the Robot Project Structure, and dependencies are automatically added to the build.gradle file:

compile 'com.aldebaran:libqi-java-android:sdk-2016-05-16'
compile 'com.aldebaran:qisdk:0.7'
compile 'com.aldebaran:qichatplayer:1.0.1'
  • I set the CPU/ABI to x86 and checked “Use Host GPU” under Run/Debug Configurations > AVD options. Make sure to launch the emulator by clicking the Robot Emulator icon, rather than using a virtual device from the Android Studio AVD Manager.
  • When the Robot Emulator launches, the Robot Viewer also gets launched.

Without an actual robot, I was still able to follow the first three tutorials: Say “Hello, world!”, Go one meter forward, and Mimic the elephant.
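For reference, the “Say Hello, world!” tutorial boils down to roughly the sketch below. It is based on the publicly documented QiSDK Java API (QiSDK.register, RobotLifecycleCallbacks, SayBuilder); the 0.7 preview SDK listed in the dependencies above may use different names, so treat this as an illustration of the pattern rather than guaranteed-working 0.7 code.

// Sketch of a minimal "Hello, world!" robot activity, assuming the
// QiSDK Java API (QiSDK, RobotLifecycleCallbacks, SayBuilder). Class
// and method names may differ in the 0.7 preview SDK.
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;

import com.aldebaran.qi.sdk.QiContext;
import com.aldebaran.qi.sdk.QiSDK;
import com.aldebaran.qi.sdk.RobotLifecycleCallbacks;
import com.aldebaran.qi.sdk.builder.SayBuilder;
import com.aldebaran.qi.sdk.object.conversation.Say;

public class HelloPepperActivity extends AppCompatActivity implements RobotLifecycleCallbacks {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Register to be notified when the activity gains or loses robot focus.
        QiSDK.register(this, this);
    }

    @Override
    protected void onDestroy() {
        QiSDK.unregister(this);
        super.onDestroy();
    }

    @Override
    public void onRobotFocusGained(QiContext qiContext) {
        // Build and run a Say action once the activity owns the robot focus.
        Say say = SayBuilder.with(qiContext)
                .withText("Hello, world!")
                .build();
        say.run(); // blocking call; QiSDK invokes this callback off the UI thread
    }

    @Override
    public void onRobotFocusLost() { }

    @Override
    public void onRobotFocusRefused(String reason) { }
}

On a real Pepper (or the emulator), launching such an activity would make the robot speak the text as soon as it gains focus.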

I will share more as I continue to learn about Pepper. Hopefully in the near future, we all can make apps that work with Pepper the humanoid robot.





Margaret Maynard-Reid is an Android Developer currently building apps that showcase how we can leverage machine learning.
