Back to Robot Coding part 3: testing the EBB

by Alan Winfield
23 March 2021




In part 2 a few weeks ago I outlined a Python implementation of the ethical black box (EBB). I described the key data structure: a dictionary which serves both as a specification for the type of robot and as the structure used to deliver live data to the EBB. I also mentioned the other key piece of robot-specific code:

# Get data from the robot and store it in data structure spec
def getRobotData(spec):
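
Although the full dictionary was described in part 2, a minimal sketch of what spec might look like for the simulated e-puck is shown below. The field names match those used in getRobotData() later in this post, but the values are just placeholders.

# A sketch of the EBB spec dictionary for a simulated e-puck
# (illustrative only; field names match getRobotData() below)
spec = {
    "botTime":    [0.0],            # robot (simulation) time
    "jntDemands": [0.0, 0.0],       # motor demands, one per wheel
    "jntAngles":  [0.0, 0.0],       # wheel joint angles, in degrees
    "lfSensors":  [0.0, 0.0, 0.0],  # 3 line-following sensors
    "irSensors":  [0.0] * 8         # 8 infra-red proximity sensors
}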

Having reached this point I needed a robot – and a way of communicating with it – so that I could both write getRobotData(spec) and test the EBB. But how to do this? I’m working from home during lockdown, and my e-puck robots are all in the lab. Then I remembered that the excellent robot simulator V-REP (now called CoppeliaSim) has a pretty good e-puck model and some nice demo scenes. V-REP also offers multiple ways of communicating between simulated robots and external programs (see here). One of them – TCP/IP sockets – appeals to me as I’ve written sockets code many times, for both real-world and research applications. Then a stroke of luck: I found that a team at ENSTA Bretagne had written a simple demo which does more or less what I need – just not for the e-puck. So, first I got that demo running and figured out how it works, then used the same approach for a simulated e-puck and the EBB. Here is a video capture of the working demo.

So, what’s going on in the demo? The visible simulation views in the V-REP window show an e-puck robot following a black line which is blocked by both a potted plant and an obstacle constructed from 3 cylinders. The robot has two behaviours: line following and wall following. The EBB requests data from the e-puck robot once per second, and you can see those data in the Python shell window. Reading from left to right you will see first the EBB date and time stamp, then the robot time botT, then the 3 line-following sensors lfSe, followed by the 8 infra-red proximity sensors irSe. The final two fields show the joint (i.e. wheel) angles jntA, in degrees, then the motor commands jntD. By watching these values as the robot follows its line and negotiates the two obstacles you can see how the line and infra-red sensor values change, resulting in updated motor commands.

Here is the code – custom-written both for this robot and for the means of communicating with it – for requesting data from the robot.


import math
import socket
import struct

# address and port of the V-REP TCP server
# (values here are illustrative; use whatever the simulation scene listens on)
server_address_port = ('127.0.0.1', 20100)

# Get data from the robot and store it in spec[]
# while returning one of the following result codes
ROBOT_DATA_OK = 0
CANNOT_CONNECT = 1
SOCKET_ERROR = 2
BAD_DATA = 3

def getRobotData(spec):
    # This function connects, via TCP/IP, to an e-puck robot running in V-REP

    # create a TCP/IP socket and connect it to the simulated robot
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        sock.connect(server_address_port)
    except OSError:
        return CANNOT_CONNECT
    sock.settimeout(0.1) # set connection timeout

    # pack a dummy packet that will provoke data in response
    #   this is, in effect, a 'ping' to ask for a data record
    strSend = struct.pack('fff', 1.0, 1.0, 1.0)
    sock.sendall(strSend) # and send it to V-REP
    # wait for data back from V-REP
    #   expect a packet with 1 time, 2 motor demands, 2 joint angles,
    #   3 line sensors and 8 irSensors - all floats because V-REP
    #   total packet size = 16 x 4 = 64 bytes
    data = b''
    nch_rx = 64 # expect this many bytes from V-REP
    try:
        while len(data) < nch_rx:
            data += sock.recv(nch_rx)
    except OSError:
        sock.close()
        return SOCKET_ERROR
    # unpack the received data
    if len(data) == nch_rx:
        # V-REP packs and unpacks in floats only so...
        vrx = struct.unpack('ffffffffffffffff', data)
        # now move data from vrx[] into spec[], while rounding the floats
        spec["botTime"] = [ round(vrx[0],2) ]
        spec["jntDemands"] = [ round(vrx[1],2), round(vrx[2],2) ]
        spec["jntAngles"] = [ round(vrx[3]*180.0/math.pi,2),
                              round(vrx[4]*180.0/math.pi,2) ]
        spec["lfSensors"] = [ round(vrx[5],2), round(vrx[6],2), round(vrx[7],2) ]
        for i in range(8):
            spec["irSensors"][i] = round(vrx[8+i],3)
        result = ROBOT_DATA_OK
    else:
        result = BAD_DATA
    sock.close()
    return result

The structure of this function is very simple: first create a socket and connect it, then pack a dummy packet and send it to V-REP to request EBB data from the robot. Then, when a data packet arrives, unpack it into spec. The most complex part of the code is the data wrangling.
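
To show how the EBB might drive this function, here is a minimal sketch of a once-per-second polling loop like the one in the demo. The print format is illustrative only, and a real EBB would of course store each record rather than just printing it.

import time
from datetime import datetime

# poll the robot once per second, printing each record
# (an illustrative sketch; a real EBB would store the records)
while True:
    result = getRobotData(spec)
    if result == ROBOT_DATA_OK:
        stamp = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
        print(stamp, 'botT', spec["botTime"], 'lfSe', spec["lfSensors"],
              'irSe', spec["irSensors"], 'jntA', spec["jntAngles"],
              'jntD', spec["jntDemands"])
    else:
        print('robot data request failed with code', result)
    time.sleep(1.0)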

Would a real EBB collect data in this way? Well, if the EBB is embedded in the robot then probably not. Communication between the robot controller and the EBB might be via ROS messages or, even more directly, by allowing the EBB code to access a shared memory space which contains the robot’s sensor inputs, command outputs and decisions. But an external EBB, either running on a local server or in the cloud, would most likely use TCP/IP to communicate with the robot, so getRobotData() would look very much like the example here.
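
As a rough illustration of the embedded case, here is a minimal sketch of an EBB receiving robot data via ROS messages. It assumes ROS 1 with rospy, and the topic names and message types are hypothetical.

import rospy
from std_msgs.msg import Float32MultiArray

# hypothetical topics; a real robot controller would publish its own
def irCallback(msg):
    spec["irSensors"] = [round(v, 3) for v in msg.data]

def demandCallback(msg):
    spec["jntDemands"] = [round(v, 2) for v in msg.data]

rospy.init_node('ebb_listener')
rospy.Subscriber('/epuck/ir_sensors', Float32MultiArray, irCallback)
rospy.Subscriber('/epuck/motor_demands', Float32MultiArray, demandCallback)
rospy.spin() # callbacks update spec[]; a timer could snapshot it once per second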



Alan Winfield is Professor in robotics at UWE Bristol. He communicates about science on his personal blog.