Haptics and Virtual Interactions

In this episode, Shihan Lu interviews Dr. Heather Culbertson, Assistant Professor in the Computer Science Department at the University of Southern California, about her work in haptics. Dr. Culbertson discusses data-driven modeling and rendering of realistic textures, haptic technologies for social touch, the combination of haptics and robotics, and her expectations for, and the obstacles facing, haptics in the next 5 years.


Fostering Creativity: RSS Pioneers and the YOLO Robot

In this episode, Lauren Klein interviews Human-Robot Interaction researcher Patrícia Alves-Oliveira. Alves-Oliveira tells us about the upcoming RSS Pioneers workshop at the 2020 Robotics: Science and Systems Conference; the workshop brings senior PhD students and postdoctoral researchers together to collaborate and discuss their work with distinguished members of the robotics field. She also describes her own research designing robots to encourage creativity in children.


Autonomous Bricklaying by FBR

In this episode, Ron Vanderkley interviews Mark Pivac, Chief Technical Officer and co-founder of FBR (formerly Fastbrick Robotics) about the world’s first end-to-end autonomous bricklaying robot, ‘Hadrian X’. Three years after his first interview, we catch up with Pivac to see how FBR has expanded its operation and chat about their latest commercial prototype, ‘Hadrian X’, as well as the future of the robotic construction industry.


Robot Operating System (ROS) & Gazebo

In this episode, Audrow Nash interviews Brian Gerkey, CEO of Open Robotics about the Robot Operating System (ROS) and Gazebo. Both ROS and Gazebo are open source and are widely used in the robotics community. ROS is a set of software libraries and tools, and Gazebo is a 3D robotics simulator. Gerkey explains ROS and Gazebo and talks about how they are used in robotics, as well as some of the design decisions of the second version of ROS, ROS2.


Halodi Robotics’ EVEr3: A Full-size Humanoid Robot

In this episode, Audrow Nash interviews Bernt Børnich, CEO, CTO, and Co-founder of Halodi Robotics, about Eve (EVEr3), a general purpose full-size humanoid robot capable of a wide variety of tasks. Børnich discusses how Eve can be used in research, how Eve’s motors have been designed to be safe around humans (including why they use a low gear ratio), how they do direct force control and the benefits of this approach, and how they use machine learning to reduce cogging in their motors. Børnich also discusses the long-term goal of Halodi Robotics and how they plan to support researchers using Eve.

News

Launch of ROBOTT-NET’s pilot projects

Irabia, Linak and Nissan, along with Trumpf, Maser, Piccolo, Weibel and Air Liquide, have been selected to team up on a real-world case study. Over the next 18 months ROBOTT-NET will take these eight voucher projects to full prototype installation – this includes third-party funding directly to the companies and additional professional support via robotics experts of the ROBOTT-NET consortium partners: DTI, Fraunhofer IPA, Tecnalia and the MTC.


DJI’s RoboMaster FPS Competition

In this episode, Audrow Nash interviews Shuo Yang about DJI’s RoboMaster first-person shooter (FPS) competition, a competition designed to get people excited about robotics. For the competition, university teams build and program a robot to go against DJI’s robots in a shooting battle. Each robot has a way of propelling marble-sized plastic balls and pressure sensors on their sides to register if they’ve been hit by an opponent’s projectile. Shuo speaks about the goals of the competition, the teams that are involved, what strategies the teams use, the difficulties the teams had in making their robots good competitors, the future of the challenge, and how people can get involved.


eSIM in Wearable Technology

Image from the South China Morning Post

In this episode, Audrow Nash speaks with Karl Weaver (魏卡爾), formerly the Original Equipment Manufacturer Business Development Director for Oasis Smart SIM. Weaver discusses how wearable technology is growing as a form of payment system in China. He speaks about wireless technology, including Near-Field Communications (NFC) and Embedded SIM cards (eSIM), in wearable technology and in other applications, such as bike rental.


ICRA 2017 Company Showcase

Image: ICRA 2017

In this episode, Audrow Nash interviews several companies at the International Conference on Robotics and Automation (ICRA). ICRA is the IEEE Robotics and Automation Society’s biggest conference and one of the leading international forums for robotics researchers to present their work.


CUJO – Smart Firewall for Cybersecurity

In this episode, MeiXing Dong talks with Leon Kuperman, CTO of CUJO, about cybersecurity threats and how to guard against them. They discuss how CUJO, a smart hardware firewall, helps protect the home against online threats.


New AI algorithm monitors sleep with radio waves

A new device uses radio signals and an AI algorithm to monitor sleep stages without sensors attached to the body. Image: Christine Daniloff/MIT

More than 50 million Americans suffer from sleep disorders, and diseases including Parkinson’s and Alzheimer’s can also disrupt sleep. Diagnosing and monitoring these conditions usually requires attaching electrodes and a variety of other sensors to patients, which can further disrupt their sleep.

To make it easier to diagnose and study sleep problems, researchers at MIT and Massachusetts General Hospital have devised a new way to monitor sleep stages without sensors attached to the body. Their device uses an advanced artificial intelligence algorithm to analyze the radio signals around the person and translate those measurements into sleep stages: light, deep, or rapid eye movement (REM).

“Imagine if your Wi-Fi router knows when you are dreaming, and can monitor whether you are having enough deep sleep, which is necessary for memory consolidation,” says Dina Katabi, the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science, who led the study. “Our vision is developing health sensors that will disappear into the background and capture physiological signals and important health metrics, without asking the user to change her behavior in any way.”

Katabi worked on the study with Matt Bianchi, chief of the division of sleep medicine at MGH, and Tommi Jaakkola, the Thomas Siebel Professor of Electrical Engineering and Computer Science and a member of the Institute for Data, Systems, and Society at MIT. Mingmin Zhao, an MIT graduate student, is the paper’s first author, and Shichao Yue, another MIT graduate student, is also a co-author.

The researchers will present their new sensor at the International Conference on Machine Learning on Aug. 9.


Remote sensing

Katabi and members of her group in MIT’s Computer Science and Artificial Intelligence Laboratory have previously developed radio-based sensors that enable them to remotely measure vital signs and behaviors that can be indicators of health. These sensors consist of a wireless device, about the size of a laptop computer, that emits low-power radio frequency (RF) signals. As the radio waves reflect off of the body, any slight movement of the body alters the frequency of the reflected waves. Analyzing those waves can reveal vital signs such as pulse and breathing rate.

“It’s a smart Wi-Fi-like box that sits in the home and analyzes these reflections and discovers all of these changes in the body, through a signature that the body leaves on the RF signal,” Katabi says.

Katabi and her students have also used this approach to create a sensor called WiGait that can measure walking speed using wireless signals, which could help doctors predict cognitive decline, falls, certain cardiac or pulmonary diseases, or other health problems.

After developing those sensors, Katabi thought that a similar approach could also be useful for monitoring sleep, which is currently done while patients spend the night in a sleep lab hooked up to monitors such as electroencephalography (EEG) machines.

“The opportunity is very big because we don’t understand sleep well, and a high fraction of the population has sleep problems,” says Zhao. “We have this technology that, if we can make it work, can move us from a world where we do sleep studies once every few months in the sleep lab to continuous sleep studies in the home.”

To achieve that, the researchers had to come up with a way to translate their measurements of pulse, breathing rate, and movement into sleep stages. Recent advances in artificial intelligence have made it possible to train computer algorithms known as deep neural networks to extract and analyze information from complex datasets, such as the radio signals obtained from the researchers’ sensor. However, these signals contain a great deal of information that is irrelevant to sleep and can be confusing to existing algorithms. The MIT researchers therefore developed a new AI algorithm based on deep neural networks that eliminates the irrelevant information.

“The surrounding conditions introduce a lot of unwanted variation in what you measure. The novelty lies in preserving the sleep signal while removing the rest,” says Jaakkola. Their algorithm can be used in different locations and with different people, without any calibration.

Using this approach in tests of 25 healthy volunteers, the researchers found that their technique was about 80 percent accurate, which is comparable to the accuracy of ratings determined by sleep specialists based on EEG measurements.

“Our device allows you not only to remove all of these sensors that you put on the person, and make it a much better experience that can be done at home, it also makes the job of the doctor and the sleep technologist much easier,” Katabi says. “They don’t have to go through the data and manually label it.”


Sleep deficiencies

Other researchers have tried to use radio signals to monitor sleep, but these systems are accurate only 65 percent of the time and mainly determine whether a person is awake or asleep, not what sleep stage they are in. Katabi and her colleagues were able to improve on that by training their algorithm to ignore wireless signals that bounce off of other objects in the room and include only data reflected from the sleeping person.

The researchers now plan to use this technology to study how Parkinson’s disease affects sleep.

“When you think about Parkinson’s, you think about it as a movement disorder, but the disease is also associated with very complex sleep deficiencies, which are not very well understood,” Katabi says.

The sensor could also be used to learn more about sleep changes produced by Alzheimer’s disease, as well as sleep disorders such as insomnia and sleep apnea. It may also be useful for studying epileptic seizures that happen during sleep, which are usually difficult to detect.


Somersault simulation for jumping robots

In recent years, engineers have been developing new technologies to enable robots and humans to move faster and jump higher. Soft, elastic materials store energy in these devices and, if released carefully, enable elegant dynamic motions: robots leap over obstacles and prosthetics empower sprinting. A fundamental challenge remains in developing these technologies. Scientists spend long hours building and testing prototypes that can reliably move in specific ways so that, for example, a robot lands right-side up after a jump.

A pair of new computational methods developed by a team of researchers from Massachusetts Institute of Technology (MIT), University of Toronto and Adobe Research takes first steps towards automating the design of the dynamic mechanisms behind these movements. Their methods generate simulations that match the real-world behaviors of flexible devices at rates 70 times faster than previously possible and provide critical improvements in the accuracy of simulated collisions and rebounds. These methods are thus both fast and accurate enough to be used to automate the design process used to create dynamic mechanisms for controlled jumping.

The team will present their methods and results from their paper, “Dynamics-Aware Numerical Coarsening for Fabrication Design,” at the SIGGRAPH 2017 conference in Los Angeles, 30 July to 3 August. SIGGRAPH spotlights the most innovative results in computer graphics research and interactive techniques worldwide.

“This research is pioneering work in applying computer graphics techniques to real physical objects with dynamic behavior and contact,” says lead author Desai Chen, a PhD candidate at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “The techniques we’ve developed open the door to automating the design of highly dynamic, fast-moving objects.”

Chen’s co-authors include David I.W. Levin, assistant professor at the University of Toronto; Wojciech Matusik, associate professor of electrical engineering and computer science at MIT; and Danny M. Kaufman, senior research scientist at Adobe Research.

Major advances in computational design, physical modeling and rapid manufacturing have enabled the fabrication of objects with customized physical properties, such as tailored sneakers, complex prosthetics, and soft robots, while computer graphics research has seen rapid improvements and efficiencies in creating compelling animations of physics for games, virtual reality and film. In this new work, the team aims to combine efficiency and accuracy to enable simulation for design fabrication, and to accurately simulate objects in motion.

“The goal is to bring the physical rules of virtual reality much closer to those of actual reality,” says Levin.

In the research, the team addresses the challenge of simulating elastic objects as they collide, making the simulations accurate enough to match reality and fast enough to automate the design process. Attempting to create such simulations in the presence of contact, impact or friction remains time-consuming and inaccurate.

“It is very important to get this part right, and, until now, our existing computer codes tend to break down here,” says Kaufman. “We realize that if we are doing design for the real world, we have to have code that correctly models things such as high-speed bouncing, collision and friction.”

The researchers demonstrate their new methods, Dynamics-Aware Coarsening (DAC) and Boundary Balanced Impact (BBI), by designing and fabricating mechanisms that flip, throw and jump over obstacles. Their methods perform simulations much faster than existing, state-of-the-art approaches and with greater accuracy when compared to real-world motions.

DAC works by reducing degrees of freedom (the number of values that encode motion) to speed up simulations while still capturing the motions that matter in dynamic scenarios. It finds the coarsest meshes that can correctly represent the key shapes taken on during the dynamics and matches the material properties of these meshes directly to recorded video experiments. BBI is a method for modeling the impact behavior of elastic objects. It uses material properties to smoothly project velocities near impact sites, modeling real-world impact situations such as the impact and rebound between a soft printed material and a table.

The team was inspired by the need for faster, more accurate design tools that can capture accurate simulations of elastic objects undergoing deformation and collision – especially at high-speeds. These new methods could, down the road, be applied to robotics design, developing robots as they increasingly take on human-like movements and characteristics.

“This project is really a first step for us in pushing methods for simulating reality,” says Kaufman. “We are focusing on pushing them for automatic design and exploring how to effectively use them in design. We can create beautiful images in computer graphics and in animation, let’s extend these methods to actual objects in the real world that are useful, beautiful and efficient.”


Reshaping computer-aided design

Adriana Schulz, an MIT PhD student in the Computer Science and Artificial Intelligence Laboratory, demonstrates the InstantCAD computer-aided-design-optimizing interface. Photo: Rachel Gordon/MIT CSAIL

Almost every object we use is developed with computer-aided design (CAD). Ironically, while CAD programs are good for creating designs, using them is actually very difficult and time-consuming if you’re trying to improve an existing design to make the most optimal product. Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Columbia University are trying to make the process faster and easier: In a new paper, they’ve developed InstantCAD, a tool that lets designers interactively edit, improve, and optimize CAD models using a more streamlined and intuitive workflow.


Midwest Speech and Language Days 2017 Posters

In this episode, MeiXing Dong conducts interviews at the 2017 Midwest Speech and Language Days workshop in Chicago. She talks with Michael White of Ohio State University about question interpretation in a dialogue system; Dmitriy Dligach of Loyola University Chicago about extracting patient timelines from doctor’s notes; and Denis Newman-Griffiths of Ohio State University about connecting words and phrases to relevant medical topics.


Using MATLAB for hardware-in-the-loop prototyping #1 : Message passing systems

MATLAB® is a programming language and environment designed for scientific computing. It is one of the best languages for developing robot control algorithms and is widely used in the research community. While it is often thought of as an offline programming language, there are several ways to interface with it to control robotic hardware ‘in the loop’. As part of our own development we surveyed a number of different projects that accomplish this by using a message passing system and we compared the approaches they took. This post focuses on bindings for the following message passing frameworks: LCM, ROS, DDS, and ZeroMQ.

The main motivation for using MATLAB to prototype directly on real hardware is to dramatically accelerate the development cycle by reducing the time it takes to find out whether an algorithm can withstand ubiquitous real-world problems like noisy and poorly-calibrated sensors, imperfect actuator controls, and unmodeled robot dynamics. Additionally, a workflow that requires researchers to port prototype code to another language before being able to test on real hardware can often lead to weeks or months being lost chasing down new technical bugs introduced by the port. Finally, programming in a language like C++ can pose a significant barrier to controls engineers, who often have a strong electro-mechanical background but are less versed in computer science or software engineering.

We have also noticed that over the past few years several other groups in the robotics community have run into these problems and have started to develop ways to control hardware directly from MATLAB.


The Need for External Languages

The main limitation when trying to use MATLAB to interface with hardware stems from the fact that its scripting language is fundamentally single-threaded. It was designed to let non-programmers do complex math operations without needing to worry about programming concepts like multi-threading or synchronization. However, this poses a problem for real-time control of hardware because all communication is forced to happen synchronously in the main thread. For example, if a control loop runs at 100Hz and a message round trip takes ~8ms, the main thread ends up wasting 80% of the available time budget waiting for a response without doing any actual work.
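To make the arithmetic concrete, here is a small back-of-the-envelope sketch in Java; the numbers (100Hz loop rate, 8ms round trip) come from the example above, and the class and method names are our own:

```java
// Sketch: how much of a control loop's time budget is lost to a
// synchronous communication round trip in the main thread.
public class TimeBudget {

    // Fraction of each cycle spent waiting for the round trip.
    public static double wastedFraction(double loopHz, double roundTripMs) {
        double periodMs = 1000.0 / loopHz; // 100 Hz -> 10 ms per cycle
        return roundTripMs / periodMs;     // 8 ms waiting -> 0.8
    }

    public static void main(String[] args) {
        System.out.println(wastedFraction(100, 8)); // prints 0.8
    }
}
```

At 1kHz the same 8ms round trip would exceed the 1ms budget entirely, which is why offloading communication matters more as loop rates increase.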

A second hurdle is that while MATLAB is very efficient in the execution of math operations, it is not particularly well suited for byte manipulation. This makes it difficult to develop code that can efficiently create and parse binary message formats that the target hardware can understand. Thus, after having the main thread spend its time waiting for and parsing the incoming data, there may not be any time left for performing interesting math operations.
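As an illustration of why encoding and decoding is usually pushed into an external language, the byte-level packing that is awkward in MATLAB takes only a few lines of Java. The wire format below (a sequence of little-endian doubles) is a hypothetical stand-in, not any framework's actual format:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: encoding/decoding a hypothetical binary message of doubles.
public class Codec {

    // Pack an array of doubles into little-endian bytes.
    public static byte[] encode(double[] values) {
        ByteBuffer buf = ByteBuffer.allocate(8 * values.length)
                                   .order(ByteOrder.LITTLE_ENDIAN);
        for (double v : values) buf.putDouble(v);
        return buf.array();
    }

    // Unpack little-endian bytes back into doubles.
    public static double[] decode(byte[] data) {
        ByteBuffer buf = ByteBuffer.wrap(data).order(ByteOrder.LITTLE_ENDIAN);
        double[] out = new double[data.length / 8];
        for (int i = 0; i < out.length; i++) out[i] = buf.getDouble();
        return out;
    }
}
```

Running this kind of code on a Java background thread leaves the MATLAB main thread free to work with already-decoded numeric arrays.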

Figure 1. Communications overhead in the main MATLAB thread

Pure MATLAB implementations can work for simple applications, such as interfacing with an Arduino to gather temperature data or blink an LED, but they are not feasible for controlling complex robotic systems (e.g. a humanoid) at high rates (e.g. 100Hz-1kHz). Fortunately, MATLAB does have the ability to interface with other programming languages that allow users to create background threads that can offload the communications aspect from the main thread.

Figure 2. Communications overhead offloaded to other threads

Out of the box, MATLAB provides two interfaces to other languages: MEX for calling C/C++ code, and the Java interface for calling Java code. There are some differences between the two, but in the end the choice effectively comes down to personal preference. Both provide enough capabilities for developing sophisticated interfaces, and both offer performance that exceeds what is required by orders of magnitude. Interfaces to further languages exist as well, but they require extra setup steps.


Message Passing Frameworks

Message passing frameworks such as Robot Operating System (ROS) and Lightweight Communication and Marshalling (LCM) have been widely adopted in the robotics research community. At the core they typically consist of two parts: a way to exchange data between processes (e.g. UDP/TCP), as well as a defined binary format for encoding and decoding the messages. They allow systems to be built with distributed components (e.g. processes) that run on different computers, different operating systems, and different programming languages.

The resulting systems are very extensible and provide convenient ways for prototyping. For example, a component communicating with a physical robot can be exchanged with a simulator without affecting the rest of the system. Similarly, a new walking controller could be implemented in MATLAB and communicate with external processes (e.g. robot comms) through the exchange of messages. With ROS and LCM in particular, their flexibility, wide-spread adoption, and support for different languages make them a nice starting point for a MATLAB-hardware interface.


Lightweight Communication and Marshalling (LCM)

LCM was developed in 2006 at MIT for their entry in DARPA’s Urban Challenge. In recent years it has become a popular alternative to ROS messaging, and it was, as far as we know, the first message passing framework for robotics that supported MATLAB as a core language.

The snippet below shows what the MATLAB code for sending a command message could look like. The code creates a struct-like message, sets the desired values, and publishes it on an appropriate channel.

%% MATLAB code for sending an LCM message
% Setup
lc = lcm.lcm.LCM.getSingleton();

% Fill message
cmd = types.command();
cmd.position = [1 2 3];
cmd.velocity = [1 2 3];

% Publish
lc.publish('COMMAND_CHANNEL', cmd);

Interestingly, the backing implementation of these bindings was done in pure Java and did not contain any actual MATLAB code. The exposed interface consisted of two Java classes as well as auto-generated message types.

  • The LCM class provides a way to publish messages and subscribe to channels.
  • The generated Java message types handle the binary encoding and expose fields that MATLAB can access.
  • The MessageAggregator class provides a way to receive messages on a background thread and queue them for MATLAB.

Thus, even though the snippet looks similar to MATLAB code, all variables are actually Java objects. For example, the struct-like command type is a Java object that exposes public fields as shown in the snippet below. Users can access them the same way as fields of a standard MATLAB struct (or class properties) resulting in nice syntax. The types are automatically converted according to the type mapping.

// Java class that behaves like a MATLAB struct
public final class command implements lcm.lcm.LCMEncodable {
    public double[] position;
    public double[] velocity;
    // etc. ...
}

Receiving messages is done by subscribing an aggregator to one or more channels. The aggregator receives messages from a background thread and stores them in a queue that MATLAB can access in a synchronous manner using aggregator.getNextMessage(). Each message contains the raw bytes as well as some metadata for selecting an appropriate type for decoding.

%% MATLAB code for receiving an LCM message
% Setup
lc = lcm.lcm.LCM.getSingleton();
aggregator = lcm.lcm.MessageAggregator();
lc.subscribe('FEEDBACK_CHANNEL', aggregator);

% Continuously check for new messages
timeoutMs = 1000;
while true

    % Receive raw message
    msg = aggregator.getNextMessage(timeoutMs);

    % Ignore timeouts
    if ~isempty(msg)

        % Select message type based on channel name
        if strcmp('FEEDBACK_CHANNEL', char(

            % Decode raw bytes to a usable type
            fbk = types.feedback(;

            % Use data
            position = fbk.position;
            velocity = fbk.velocity;

        end
    end
end

The snippet below shows a simplified version of the backing Java code for the aggregator class. Since Java methods are limited to a single return argument, the getNextMessage call returns a Java type that contains the received bytes as well as metadata to identify the type, i.e., the source channel name.

// Java class for receiving messages in the background
public class MessageAggregator implements LCMSubscriber {

    // Value type that combines multiple return arguments
    public static class Message {

        final public byte[] data; // raw bytes
        final public String channel; // source channel name

        public Message(String channel_, byte[] data_) {
            data = data_;
            channel = channel_;
        }
    }

    // Method that gets called from MATLAB to receive new messages
    public synchronized Message getNextMessage(long timeout_ms) {

        if (!messages.isEmpty()) {
            return messages.removeFirst();
        }

        if (timeout_ms == 0) { // non-blocking
            return null;
        }

        // Wait for new message until timeout ...
    }
}

Note that the getNextMessage method requires a timeout argument. In general it is important for blocking Java methods to have a timeout in order to prevent the main thread from getting stuck permanently. Being in a Java call prohibits users from aborting the execution (ctrl-c), so timeouts should be reasonably short, i.e., in the low seconds. Otherwise this could cause the UI to become unresponsive and users may be forced to close MATLAB without being able to save their workspace. Passing in a timeout of zero serves as a non-blocking interface that immediately returns empty if no messages are available. This is often useful for working with multiple aggregators or for integrating asynchronous messages with unknown timing, such as user input.
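The blocking-receive-with-timeout pattern described above generalizes beyond LCM. Here is a minimal, self-contained sketch of it; the class and method names are our own, not part of any framework:

```java
import java.util.ArrayDeque;

// Sketch of a message queue with timeout semantics: a background thread
// calls put(), and a single-threaded consumer (e.g. MATLAB) calls take().
public class TimedQueue<T> {
    private final ArrayDeque<T> messages = new ArrayDeque<>();

    public synchronized void put(T msg) {
        messages.addLast(msg);
        notifyAll(); // wake a consumer blocked in take()
    }

    // Returns null on timeout; a timeout of 0 means a non-blocking poll.
    public synchronized T take(long timeoutMs) throws InterruptedException {
        if (messages.isEmpty() && timeoutMs > 0) {
            long deadline = System.currentTimeMillis() + timeoutMs;
            long remaining = timeoutMs;
            while (messages.isEmpty() && remaining > 0) {
                wait(remaining); // releases the lock while waiting
                remaining = deadline - System.currentTimeMillis();
            }
        }
        return messages.isEmpty() ? null : messages.removeFirst();
    }
}
```

The deadline loop guards against spurious wakeups, and the bounded wait keeps the main thread from ever blocking indefinitely, which matters when Ctrl-C cannot interrupt a Java call.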

Overall, we thought that this was a well-thought-out API and a great example of a minimum viable interface that works well in practice. By receiving messages on a background thread and by moving the encoding and decoding steps to the Java language, the main thread is able to spend most of its time actually working with the data. Its minimalistic implementation is comparatively simple and we would recommend it as a starting point for developing similar interfaces.

Some minor points for improvement that we found were:

  • The decoding step fbk = types.feedback( forces two unnecessary translations due to being a byte[], which automatically gets converted to and from int8. This could result in a noticeable performance hit when receiving larger messages (e.g. images) and could be avoided by adding an overload that accepts a non-primitive type that does not get translated, e.g., a java.nio.ByteBuffer wrapping the raw bytes.
  • The Java classes did not implement Serializable, which could become bothersome when trying to save the workspace.
  • We would prefer to select the decoding type during the subscription step, e.g., lc.subscribe('FEEDBACK_CHANNEL', aggregator, 'types.feedback'), rather than requiring users to instantiate the type manually. This would clean up the parsing code a bit and allow for a less confusing error message if types are missing.


Robot Operating System (ROS)

ROS is by far the most widespread messaging framework in the robotics research community and has been officially supported by MathWorks’ Robotics System Toolbox since 2014. While the Simulink code generation uses ROS C++, the MATLAB implementation is built on the less common RosJava.

The API was designed such that each topic requires dedicated publishers and subscribers, which is different from LCM, where a single subscriber may listen to multiple channels/topics. While this may result in more subscriber objects, specifying the expected type at initialization removes much of the boilerplate code necessary for dealing with message types.

%% MATLAB code for publishing a ROS message
% Setup Publisher
chatpub = rospublisher('/chatter', 'std_msgs/String');

% Fill message
msg = rosmessage(chatpub);
msg.Data = 'Some test string';

% Publish
send(chatpub, msg);

Subscribers support three different styles to access messages: blocking calls, non-blocking calls, and callbacks.

%% MATLAB code for receiving a ROS message
% Setup Subscriber
laser = rossubscriber('/scan');

% (1) Blocking receive
scan = laser.receive(1); % timeout [s]

% (2) Non-blocking latest message (may not be new)
scan = laser.LatestMessage;

% (3) Callback
callback = @(msg) disp(msg);
subscriber = rossubscriber('/scan', callback);

In contrast to LCM, all objects that are visible to users are actual MATLAB classes. Even though the implementation uses Java underneath, all exposed functionality is wrapped in MATLAB classes that hide the Java calls. For example, each message type is associated with a generated wrapper class. The code below shows a simplified example of a wrapper for a message that has a Name property.

%% MATLAB code for wrapping a Java message type
classdef WrappedMessage

    properties (Access = protected)
        JavaMessage % The underlying Java message object (hidden from user)
    end

    properties (Dependent)
        Name
    end

    methods

        function name = get.Name(obj)
            % value = msg.Name;
            name = char(obj.JavaMessage.getName());
        end

        function set.Name(obj, name)
            % msg.Name = value;
            validateattributes(name, {'char'}, {}, 'WrappedMessage', 'Name');
            obj.JavaMessage.setName(name); % Forward to Java method
        end

        function out = doSomething(obj)
            % msg.doSomething() and doSomething(msg)
            try
                out = obj.JavaMessage.doSomething(); % Forward to Java method
            catch javaException
                throw(WrappedException(javaException)); % Hide Java exception
            end
        end

    end
end

Due to the implementation being closed-source, we were only able to look at the public toolbox files as well as the compiled Java bytecode. As far as we could tell they built a small Java library that wrapped RosJava functionality in order to provide an interface that is easier to call from MATLAB. Most of the actual logic seemed to be implemented in MATLAB code, but we also found several calls to various Java libraries for problems that would have been difficult to implement in pure MATLAB, e.g., listing networking interfaces or doing in-memory decompression of images.

Overall, we found that the ROS support toolbox looked very nice and was a great example of how seamlessly external languages can be integrated with MATLAB. We also really liked that they offered a way to load log files (rosbags).

One concern we had was that there did not seem to be a simple non-blocking way to check for new messages, e.g., a hasNewMessage() method or functionality equivalent to LCM's getNextMessage(0). We often found this useful for applications that combine data from multiple topics arriving at different rates (e.g. sensor feedback and joystick input events). We checked whether this behavior could be emulated by specifying a very small timeout in the receive method (shown in the snippet below), but any value below 0.1s never seemed to return successfully.
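The emulation we tried looked roughly like the following; note that the receive method throws an error if the timeout elapses without a new message.

```matlab
%% MATLAB code for emulating a non-blocking check via a short timeout
try
    msg = receive(subscriber, 0.1); % timeouts below 0.1s never seemed to return
catch
    msg = []; % receive errors out when the timeout elapses
end
```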


Data Distribution Service (DDS)

In 2014 MathWorks also added a support package for DDS, which is the messaging middleware that ROS 2.0 is based on. It supports MATLAB and Simulink, as well as code generation. Unfortunately, we did not have all the requirements to get it set up, and we could not find much information about the underlying implementation. After looking at some of the intro videos, we believe that the resulting code should look as follows.

%% MATLAB code for sending and receiving DDS messages
% Setup
dp = DDS.DomainParticipant;

% Create message
myTopic = ShapeType;
myTopic.x = int32(23);
myTopic.y = int32(35);

% Send message
dp.addWriter('ShapeType', 'Square');
dp.write(myTopic);

% Receive message
dp.addReader('ShapeType', 'Square');
readTopic = dp.read();



ZeroMQ

ZeroMQ is another asynchronous messaging library that is popular for building distributed systems. It only handles the messaging aspect, so users need to supply their own wire format. ZeroMQ-matlab is a MATLAB interface to ZeroMQ that was developed at UPenn between 2013 and 2015. We were not able to find much documentation, but as far as we could tell the resulting code should look similar to the following snippet.

%% MATLAB code for sending and receiving ZeroMQ data
% Setup
subscriber = zmq( 'subscribe', 'tcp', '', 43210 );
publisher = zmq( 'publish', 'tcp', 43210 );

% Publish data
bytes = uint8(rand(100,1));
nbytes = zmq( 'send', publisher, bytes );

% Receive data
receiver = zmq( 'poll', 1000 ); % polls for next message
[recv_data, has_more] = zmq( 'receive', receiver );


It is implemented as a single MEX function that selects the appropriate sub-function based on a string argument. State is maintained via socket IDs that the user passes in on every call. The code below shows a simplified snippet of the send action.

// Parsing the selected ZeroMQ action behind the MEX barrier
// Grab command String
if ( !(command = mxArrayToString(prhs[0])) )
	mexErrMsgTxt("Could not read command string. (1st argument)");

// Match command String with desired action (e.g. 'send')
if (strcasecmp(command, "send") == 0){
	// ... (argument validation)

	// retrieve arguments
	socket_id = *( (uint8_t*)mxGetData(prhs[1]) );
	size_t n_el = mxGetNumberOfElements(prhs[2]);
	size_t el_sz = mxGetElementSize(prhs[2]);
	size_t msglen = n_el*el_sz;

	// send data
	void* msg = (void*)mxGetData(prhs[2]);
	int nbytes = zmq_send( sockets[ socket_id ], msg, msglen, 0 );

	// ... check outcome and return
}
// ... other actions


Other Frameworks

Below is a list of APIs to other frameworks that we looked at but could not cover in more detail.

  • Simple Java wrapper for RabbitMQ with callbacks into MATLAB; seems to be deprecated


Final Notes

In contrast to the situation a few years ago, interfaces now exist for most of the common message-passing frameworks, allowing researchers to do at least basic hardware-in-the-loop prototyping directly from MATLAB. However, if none of the available options work for you and you are planning to develop your own, we recommend the following:

  • If there is no clear pre-existing preference between C++ and Java, we recommend starting with a Java implementation. MEX interfaces require a lot of conversion code that Java interfaces handle automatically.
  • We would recommend starting with a minimalistic LCM-like implementation and adding complexity only when necessary.
  • While interfaces that only expose MATLAB code can provide a better and more consistent user experience (e.g. help documentation), there is a significant cost associated with maintaining all of the involved layers. We would recommend holding off on creating MATLAB wrappers until the API is relatively stable.
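To illustrate the second point, the sketch below shows the kind of user-facing polling loop a minimalistic LCM-like interface could support. All class and method names here are hypothetical, not from any released library.

```matlab
%% MATLAB sketch of a minimalistic LCM-like user-facing API
% MessageChannel and its methods are made-up names for illustration
channel = MessageChannel('udp://239.255.76.67:7667'); % hypothetical constructor
channel.subscribe('POSE');

while true
    msg = channel.getNextMessage(0); % non-blocking; returns [] if queue is empty
    if ~isempty(msg)
        disp(msg); % handle the new message
    end
    pause(0.01); % do other work, e.g., poll a joystick channel
end
```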

Finally, even though message passing systems are very widespread in the robotics community, they do have drawbacks and are not appropriate for every application. Future posts in this series will focus on some of the alternatives.