Assume you would like to quickly find additive changes in a scene. For example, you have a static point cloud of an empty car park and would like to extract the parked cars from a stream of lidar data. If the extraction does not have to be perfect, a quick way of doing it is points-join --not-matching. A simple example:
For the car park scenario described above, it would look like:
The crude part is, of course, choosing the --radius value: it should be such that the spheres of the given radius around the subtrahend point cloud overlap sufficiently to capture all the points belonging to it. But then the points that are closer than the radius to the subtrahend point cloud will be filtered out, too. E.g. in the car park example above, the wheels of the cars will be chopped off 10cm above the ground. To avoid this problem, you could, for example, first erode the subtrahend point cloud by the radius.
The described approach may be crude, but it is quick and suitable for many practical purposes.
Of course, for more sophisticated change detection in point clouds, which is more accurate and takes into account viewpoints, occlusions, additions and deletions of objects in the scene, etc., you could use points-detect-change.
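The radius-based subtraction that points-join --not-matching performs can be sketched in plain Python. The helper below (subtract_cloud is an illustrative name, not part of comma) keeps only the points farther than the radius from every subtrahend point; a brute-force distance matrix is fine for small clouds, while a spatial index would be needed for real lidar data:

```python
import numpy as np

def subtract_cloud(points, subtrahend, radius):
    """Keep only the points farther than `radius` from every subtrahend point.

    Brute-force sketch of what a radius-based --not-matching join does;
    use a spatial index (e.g. a k-d tree) for large clouds.
    """
    # pairwise distances: (n, m) matrix of |p_i - s_j|
    d = np.linalg.norm(points[:, None, :] - subtrahend[None, :, :], axis=2)
    return points[d.min(axis=1) > radius]

# "empty car park": a flat patch of ground at z = 0
ground = np.array([[x, y, 0.0] for x in range(5) for y in range(5)])
# live scan: the same ground plus a "car" above it
car = np.array([[2.0, 2.0, 1.0], [2.0, 3.0, 1.2]])
scan = np.vstack([ground, car])

extracted = subtract_cloud(scan, ground, radius=0.5)
# every scan point within 0.5 of the ground is removed; the car points survive
```

With radius=1.5 the "car" points would be removed as well, which is exactly the chopped-off-wheels effect described above.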
Assume you know the coordinates of your sensor in some Cartesian coordinate system and want to derive the coordinates of your robot centre. From the robot configuration you know the offset of the sensor from the robot centre, but not the other way around. The solution:
inversed_offset is the pose (position and orientation) of the robot centre in the coordinate system associated with the sensor.
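The maths behind the inversion can be sketched in numpy (function names are illustrative, not part of comma; a planar rotation is used to keep the sketch short, but the same formula holds in 3D): if the sensor sits at translation t and rotation R in the robot frame, the robot centre sits at -Rᵀt with rotation Rᵀ in the sensor frame:

```python
import numpy as np

def rotation(yaw):
    # planar rotation matrix, enough to illustrate the idea
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s], [s, c]])

def invert_offset(t, yaw):
    """Invert a rigid transform given as translation t and yaw.

    If the sensor sits at offset (t, yaw) in the robot frame, the robot
    centre sits at (-R(yaw).T @ t, -yaw) in the sensor frame.
    """
    return -rotation(yaw).T @ np.asarray(t, dtype=float), -yaw

t, yaw = [1.0, 0.5], np.pi / 6          # sensor offset in the robot frame
t_inv, yaw_inv = invert_offset(t, yaw)  # robot centre in the sensor frame

# composing the offset with its inverse gives the identity transform
roundtrip = rotation(yaw) @ t_inv + np.asarray(t)
```

roundtrip comes out as the zero vector, confirming that applying the offset and then the inversed offset lands back at the origin.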
Step-by-step demo:
Start with some coordinates (navigation data in the world frame; the specific coordinate system does not matter). A sample data file is attached to this page:
The example uses binary data, but this is up to you. The nav.bin file contains the trajectory of the robot centre (GPS unit) in the world frame.
Get the coordinates of the sensor in the world frame:
sensor.bin is the trajectory of the sensor in the world frame. We want to get the trajectory of the robot centre from these data.
Just do it:
Note the use of
Verify by comparing to the original:
The output is tiny. Its precision is defined by the accuracy of the inversed_offset calculations above. If the --precision=16 option were not given, the comparison would be valid only up to the default output precision.
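The effect of output precision on such a comparison can be illustrated in plain Python. The sketch below (names are illustrative; it mimics what a --precision option controls for ASCII output) prints some coordinates with 12 and 16 significant digits, re-parses them, and measures the round-trip error:

```python
import numpy as np

rng = np.random.default_rng(0)
original = rng.uniform(-100, 100, size=(5, 3))  # stand-in for nav coordinates

def roundtrip(values, digits):
    """Serialize with `digits` significant digits, then parse back."""
    text = "\n".join(",".join(f"{v:.{digits}g}" for v in row) for row in values)
    return np.array([[float(v) for v in line.split(",")]
                     for line in text.splitlines()])

err12 = np.abs(roundtrip(original, 12) - original).max()
err16 = np.abs(roundtrip(original, 16) - original).max()
# err16 is several orders of magnitude smaller than err12
```

This is why the comparison above is only meaningful down to the precision at which the intermediate values were written out.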
RabbitMQ is an open-source message broker (https://www.rabbitmq.com/).
It implements AMQP 0-9-1 (https://www.rabbitmq.com/tutorials/amqp-concepts.html).
Programming tutorials: https://www.rabbitmq.com/getstarted.html
(for other platforms, see the installation instructions on the website)
rabbit-cat is a lightweight RabbitMQ client in Python, available in comma.
See rabbit-cat -h for examples.
For the receiver, run:
For the sender, run in a separate terminal: