Blog from March, 2018

Assume you would like to quickly find additive changes in a scene. For example, you have a static point cloud of an empty car park and would like to extract the parked cars from a stream of lidar data. If the extraction does not have to be perfect, a quick way of doing it is points-join --not-matching. A simple example:

> # make sample point clouds
> for i in {20..30}; do for j in {0..50}; do for k in {0..50}; do echo $i,$j,$k; done; done; done > minuend.csv
> for i in {0..50}; do for j in {20..30}; do for k in {20..30}; do echo $i,$j,$k; done; done; done > subtrahend.csv
> cat minuend.csv | points-join subtrahend.csv --radius 0.51 --not-matching | view-points "minuend.csv;colour=red;hide" "subtrahend.csv;colour=yellow;hide" "-;colour=white;title=difference"

The car park scenario described above would look like this:

> cat carpark-with-cars.csv | points-join --fields x,y,z "empty-carpark.csv;fields=x,y,z" --radius 0.1 --not-matching > cars-only.csv

The crude part is, of course, choosing the --radius value: it should be large enough that the spheres of the given radius around the subtrahend points sufficiently overlap to capture all the points belonging to the subtrahend cloud. But then the minuend points that are closer than the radius to the subtrahend point cloud will be filtered out, too. E.g. in the car park example above, the wheels of the cars will be chopped off 10cm above the ground. To avoid this problem, you could, for example, first erode the subtrahend point cloud by the radius.
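
What points-join --not-matching does here amounts to a radius-based set difference. A toy pure-Python sketch of the idea (not the actual points-join implementation): bucket the subtrahend points into a grid with cell size equal to the radius, then keep only the minuend points that have no subtrahend point within the radius.

```python
import math
from collections import defaultdict

def subtract(minuend, subtrahend, radius):
    """Keep minuend points with no subtrahend point within radius.

    Toy grid-hash sketch of `points-join --not-matching` semantics."""
    cell = radius  # with cell size == radius, all candidates lie in a 3x3x3 block
    grid = defaultdict(list)
    for p in subtrahend:
        key = tuple(int(math.floor(c / cell)) for c in p)
        grid[key].append(p)
    result = []
    for p in minuend:
        i, j, k = (int(math.floor(c / cell)) for c in p)
        matched = any(
            math.dist(p, q) <= radius
            for di in (-1, 0, 1) for dj in (-1, 0, 1) for dk in (-1, 0, 1)
            for q in grid.get((i + di, j + dj, k + dk), ())
        )
        if not matched:
            result.append(p)
    return result
```

For example, subtract([(0.0, 0.0, 0.0), (5.0, 0.0, 0.0)], [(0.0, 0.0, 0.3)], 0.51) keeps only (5.0, 0.0, 0.0), since the first minuend point is within 0.51 of a subtrahend point. For large clouds, points-join does this kind of spatial lookup for you and streams the result.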

The described approach may be crude, but it is quick and suitable for many practical purposes.

Of course, for more sophisticated change detection in point clouds, which is more accurate and takes into account viewpoints, occlusions, additions and deletions of objects in the scene, etc., you could use points-detect-change.

Assume that you know the coordinates of your sensor in some Cartesian coordinate system and want to derive the coordinates of your robot centre. From the robot configuration you know the offset of the sensor from the robot centre, but not the other way around. The solution:

get inverse offset
# assume $offset holds the offset of the sensor from the robot centre
# as x,y,z,roll,pitch,yaw

inversed_offset=$( echo "0,0,0,0,0,0" | points-frame --to="$offset" --fields="x,y,z,roll,pitch,yaw" --precision=16 )

Now inversed_offset is the position and orientation of the robot centre in the coordinate system of the sensor.
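
The one-liner above is just computing the inverse of a rigid transform. A minimal Python sketch, assuming the common convention that a pose (x,y,z,roll,pitch,yaw) rotates by Rz(yaw)·Ry(pitch)·Rx(roll) and then translates (check your toolchain's convention before relying on it):

```python
import math

def to_matrix(pose):
    """3x3 rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    x, y, z, roll, pitch, yaw = pose
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def apply(pose, point):
    """Transform a point from the pose's local frame into its parent frame."""
    R = to_matrix(pose)
    return tuple(sum(R[r][c] * point[c] for c in range(3)) + pose[r] for r in range(3))

def inverse(pose):
    """Inverse pose: if pose maps local->world, inverse(pose) maps world->local."""
    R = to_matrix(pose)
    t = pose[:3]
    # inverse rotation is the transpose; inverse translation is -R^T t
    ti = [-sum(R[r][c] * t[r] for r in range(3)) for c in range(3)]
    Rt = [[R[c][r] for c in range(3)] for r in range(3)]
    # recover Euler angles from the transposed rotation matrix
    roll = math.atan2(Rt[2][1], Rt[2][2])
    pitch = math.asin(max(-1.0, min(1.0, -Rt[2][0])))
    yaw = math.atan2(Rt[1][0], Rt[0][0])
    return (ti[0], ti[1], ti[2], roll, pitch, yaw)
```

Under this convention, pushing the zero pose 0,0,0,0,0,0 through points-frame --to="$offset" corresponds to inverse(offset): the origin of the outer frame expressed in the offset frame, which is exactly the robot centre seen from the sensor.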

Step by step demo:

  • Start with some coordinates (navigation data in the world frame; the specific coordinate system does not matter). A sample data file is attached to this page:

    get nav
    cat nav.bin | csv-from-bin t,6d | head -n 2

    The example uses binary, but this is up to you. The nav.bin file contains the trajectory of the robot centre (GPS unit) in the world frame.

  • Get the coordinates of the sensor in the world frame:

    get sensor trajectory
    cat nav.bin | csv-paste "-;binary=t,6d" "value=$offset;binary=6d" \
        | points-frame --from --fields=",frame,x,y,z,roll,pitch,yaw" --binary="t,6d,6d" \
        | csv-shuffle --fields="t,,,,,,,,,,,,,x,y,z,roll,pitch,yaw" --binary="t,6d,6d,6d" --output-fields="t,x,y,z,roll,pitch,yaw" > sensor.bin

    Now sensor.bin is the trajectory of the sensor in the world frame. We want to get the trajectory of the robot centre from these data.

  • Just do it:

    get centre coordinates back
    cat sensor.bin | csv-paste "-;binary=t,6d" "value=$inversed_offset;binary=6d" \
        | points-frame --from --fields=",frame,x,y,z,roll,pitch,yaw" --binary="t,6d,6d" \
        | csv-shuffle --fields="t,,,,,,,,,,,,,x,y,z,roll,pitch,yaw" --binary="t,6d,6d,6d" --output-fields="t,x,y,z,roll,pitch,yaw" > restored.bin

    Note the use of inversed_offset.

  • Verify by comparing to the original nav.bin:

    cat nav.bin \
        | csv-paste "-;binary=t,6d" "restored.bin;binary=t,6d" \
        | csv-eval --full-xpath --binary="t,6d,t,6d" --fields="f/t,f/x,f/y,f/z,f/roll,f/pitch,f/yaw,s/t,s/x,s/y,s/z,s/roll,s/pitch,s/yaw" \
            "dx = abs(f_x - s_x); dy = abs(f_y - s_y); dz = abs(f_z - s_z); droll = abs(f_roll - s_roll); dpitch = abs(f_pitch - s_pitch); dyaw = abs(f_yaw - s_yaw);" \
        | csv-shuffle --fields=",,,,,,,,,,,,,,dx,dy,dz,droll,dpitch,dyaw" --binary="t,6d,t,6d,6d" --output-fields="dx,dy,dz,droll,dpitch,dyaw" \
        | csv-calc --fields="dx,dy,dz,droll,dpitch,dyaw" --binary="6d" mean \
        | csv-from-bin 6d

    The output differences should be negligibly small. The precision is defined by the accuracy of the inversed_offset calculation above. If the --precision=16 option were not given, inversed_offset would be printed with fewer significant digits and the comparison would hold only to a correspondingly coarser precision.
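
The verification step can be mimicked in plain Python. The sketch below assumes each t,6d record is a little-endian 64-bit integer timestamp followed by six doubles (comma's actual on-disk time representation may differ), and computes the per-field mean absolute difference between two trajectories, like the csv-eval/csv-calc pipeline above.

```python
import struct

# assumed record layout for t,6d: int64 timestamp + six doubles
RECORD = struct.Struct('<q6d')

def read_records(path):
    """Yield (t, x, y, z, roll, pitch, yaw) tuples from a binary file."""
    with open(path, 'rb') as f:
        while chunk := f.read(RECORD.size):
            yield RECORD.unpack(chunk)

def mean_abs_diff(a_records, b_records):
    """Per-field mean absolute difference of x,y,z,roll,pitch,yaw."""
    sums = [0.0] * 6
    n = 0
    for a, b in zip(a_records, b_records):
        for i in range(6):
            sums[i] += abs(a[i + 1] - b[i + 1])  # skip the timestamp field
        n += 1
    return [s / n for s in sums]
```

Usage would be mean_abs_diff(read_records('nav.bin'), read_records('restored.bin')); with a correctly inverted offset, every component of the result should be vanishingly small.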

RabbitMQ

RabbitMQ is an open source message queue service. It implements the AMQP 0-9-1 messaging protocol. Programming tutorials are available on the RabbitMQ website.
For Ubuntu:


sudo apt-get install rabbitmq-server
# check service is installed and running
service rabbitmq-server status
# for python clients
sudo pip install pika


(for other platforms, see the installation instructions on the RabbitMQ website)


rabbit-cat is a lightweight RabbitMQ client in Python, available in comma.

See rabbit-cat -h for more examples.

example 1

For the receiver, run:


rabbit-cat listen localhost --queue="queue1"


For the sender, run in a separate terminal:


echo "hello world!" | rabbit-cat send localhost --queue="queue1" --routing-key="queue1"